1. Montgomery KB, Mellinger JD, Lindeman B. Entrustable Professional Activities in Surgery: A Review. JAMA Surg 2024; 159:571-577. [PMID: 38477902] [DOI: 10.1001/jamasurg.2023.8107]
Abstract
Importance: Entrustable professional activities (EPAs) constitute a competency-based education (CBE) assessment framework that has been increasingly adopted across medical specialties as a workplace-based assessment tool. EPAs focus on directly observed behaviors to determine the level of entrustment a trainee has earned for a given activity of that specialty. In this narrative review, we highlight the rationale for EPAs in general surgery, describe the current evidence supporting their use, and outline practical considerations for EPAs among residency programs, faculty, and trainees.

Observations: An expanding evidence base for EPAs in general surgery has provided moderate validity evidence for their use, as well as practical recommendations for implementation across residency programs. Challenges to EPA use include garnering buy-in from individual faculty and residents to complete EPA microassessments and to engage in timely, specific feedback after a case or clinical encounter. When successfully integrated into a program's workflow, EPAs can provide a more accurate picture of residents' competence in a fundamental surgical task or activity than other assessment methods.

Conclusions and Relevance: EPAs represent the next significant shift in the evaluation of general surgery residents as part of the overarching progression toward CBE among all US residency programs. While pragmatic challenges to EPA implementation remain, the best practices from the EPA and broader CBE assessment literature summarized in this review may assist individuals and programs in implementing EPAs. As EPAs become more widely used in general surgery resident training, further analysis of barriers and facilitators to successful, sustainable EPA implementation will be needed to continue to optimize and advance this new assessment framework.
2. Pradarelli AA, Park YS, Healy MG, Phitayakorn R, Petrusa E. National Profile of the ACGME Milestones 1.0 and 2.0 within General Surgery: A Seven-Year National Study from 2014 to 2021. J Surg Educ 2024; 81:626-638. [PMID: 38555246] [DOI: 10.1016/j.jsurg.2024.01.016]
Abstract
PURPOSE: The Accreditation Council for Graduate Medical Education (ACGME) introduced the General Surgery Milestones 1.0 in 2014 and Milestones 2.0 in 2020 as steps toward competency-based training. This study describes the distributions of and trends in Milestones 1.0 and 2.0 ratings and the proportion of residents not achieving the level 4.0 graduation target; the analysis is intended to inform residency programs on curriculum development, assessment, feedback, and faculty development.

METHODS: A deidentified dataset of milestone ratings for all ACGME-accredited General Surgery residency programs in the United States was used. Medians and interquartile ranges (IQRs) were reported for milestone ratings at each PGY level, and the percentage of PGY-5 residents receiving final-year ratings below 4.0 was calculated. Wilcoxon rank sum tests were used to compare 1.0 and 2.0 median ratings; Kruskal-Wallis tests with Bonferroni post hoc correction were used to compare median ratings across time periods and PGY levels; and chi-squared tests were used to compare the proportion of level 4.0 nonachievement under the two systems.

RESULTS: The Milestones 1.0 data comprised 13,866 residents and the Milestones 2.0 data 7,633 residents. Under both systems, median ratings in all competency domains were higher for each subsequent year of training. Milestones 2.0 had significantly higher median ratings at all PGY levels for all competency domains except Medical Knowledge. The percentage of PGY-5 residents not achieving the graduation target ranged from 27% to 42% under Milestones 1.0 and from 5% to 13% under 2.0. For Milestones 1.0, all subcompetencies showed an increasing number of residents achieving the graduation target from 2014 to 2019.

CONCLUSIONS: This study of General Surgery Milestones 1.0 and 2.0 data uncovered significant increases in average ratings and significantly fewer residents not achieving the graduation target under the 2.0 system. We hypothesize that these findings may reflect rating bias introduced by the change in rating scales rather than a true increase in resident ability.
Affiliation(s)
- Alyssa A Pradarelli: Medical Education Design Lab, Department of Surgery, University of Michigan, Ann Arbor, Michigan; Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Yoon Soo Park: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Michael G Healy: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Roy Phitayakorn: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Emil Petrusa: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
3. Chen XP, Harzman A, Go M, Arnold M, Ellison EC. Cumulative Sum Chart as Complement to Objective Assessment of Graduating Surgical Resident Competency: An Exploratory Study. J Am Coll Surg 2023; 237:894-901. [PMID: 37530413] [DOI: 10.1097/xcs.0000000000000812]
Abstract
BACKGROUND: Rater-based assessment and objective assessment both play important roles in evaluating residents' clinical competencies. We hypothesize that a cumulative sum (CUSUM) chart of operative time complements the assessment of chief general surgery residents' competencies with the ACGME Milestones, aiding residency programs' determination of graduating residents' practice readiness.

STUDY DESIGN: We extracted ACGME Milestone evaluations of performance of operations and procedures (POP) and 3 objective metrics (operative time, case type, and case complexity) for 3 procedures (cholecystectomy, colectomy, and inguinal hernia) performed by 3 cohorts of residents (N = 15) during their PGY4-5 years. CUSUM charts were computed for each resident on each procedure type. A learning plateau was defined as at least 4 cases consistently located around the centerline (target performance) at the end of a CUSUM chart with minimal deviations (range 0 to 1).

RESULTS: All residents reached the ACGME graduation targets for overall POP by the end of chief year. A total of 2,446 cases were included (cholecystectomy N = 1,234; colectomy N = 507; inguinal hernia N = 705), and 3 CUSUM chart patterns emerged: skewed, bimodal, and peaks-and-valleys distributions. Analysis of the CUSUM charts revealed that residents' developmental trajectories toward a learning plateau in the operating room vary, and only 46.7% of residents reached a learning plateau in all 3 procedures by graduation.

CONCLUSIONS: CUSUM charts of operative time complement the ACGME Milestones evaluations. Using both may enable residency programs to holistically determine graduating residents' practice readiness and provide recommendations for their upcoming career/practice transition.
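For readers unfamiliar with the technique, a CUSUM chart of operative time can be sketched in a few lines. This is an illustrative sketch, not the authors' code: the target time, tolerance band, and case series below are hypothetical, and the plateau rule is a simplified reading of the study's "at least 4 cases consistently located around the centerline" definition.

```python
# Hedged sketch of a CUSUM chart over operative times.
# Target time, tolerance, and case data are all hypothetical.

def cusum(times, target):
    """Cumulative sum of deviations of operative times from a target time."""
    total, path = 0.0, []
    for t in times:
        total += t - target
        path.append(total)
    return path

def reached_plateau(path, tol, min_run=4):
    """True if the last `min_run` CUSUM values stay within `tol` of the
    final level, i.e. deviations from the centerline have flattened out."""
    if len(path) < min_run:
        return False
    tail = path[-min_run:]
    center = tail[-1]
    return all(abs(v - center) <= tol for v in tail)

# Hypothetical case series: operative times (minutes) converging on a
# 60-minute target after an early learning phase.
times = [95, 85, 75, 62, 60, 59, 61, 60]
path = cusum(times, target=60)
print(reached_plateau(path, tol=5))  # flat tail -> plateau reached
```

A rising CUSUM path signals times consistently above target; a flat tail, as in this toy series, is what the plateau check looks for.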
4. Kendrick DE, Thelen AE, Chen X, Gupta T, Yamazaki K, Krumm AE, Bandeh-Ahmadi H, Clark M, Luckoscki J, Fan Z, Wnuk GM, Ryan AM, Mukherjee B, Hamstra SJ, Dimick JB, Holmboe ES, George BC. Association of Surgical Resident Competency Ratings With Patient Outcomes. Acad Med 2023; 98:813-820. [PMID: 36724304] [DOI: 10.1097/acm.0000000000005157]
Abstract
PURPOSE: Accurate assessment of clinical performance is essential to ensure that graduating residents are competent for unsupervised practice. The Accreditation Council for Graduate Medical Education milestones framework is the most widely used competency-based framework in the United States. However, the relationship between residents' milestones competency ratings and their subsequent early-career clinical outcomes has not been established. It is important to examine the association between the milestones competency ratings of U.S. general surgery residents and those surgeons' patient outcomes in early-career practice.

METHOD: A retrospective, cross-sectional study was conducted using a sample of national Medicare claims for 23 common, high-risk inpatient general surgical procedures performed between July 1, 2015, and November 30, 2018 (n = 12,400 cases) by non-fellowship-trained U.S. general surgeons. Milestone ratings collected during those surgeons' last year of residency (n = 701 residents) were compared with their risk-adjusted rates of mortality, any complication, and severe complication within 30 days of the index operation during their first 2 years of practice.

RESULTS: There were no associations between the mean milestone competency ratings of graduating general surgery residents and their subsequent early-career patient outcomes, including any complication (23% proficient vs 22% not yet proficient; relative risk [RR], 0.97 [95% CI, 0.88-1.08]), severe complication (9% vs 9%; RR, 1.01 [95% CI, 0.86-1.19]), and mortality (5% vs 5%; RR, 1.07 [95% CI, 0.88-1.30]). Secondary analyses yielded no associations between patient outcomes and milestone ratings specific to technical performance, or between patient outcomes and composites of operative performance, professionalism, or leadership milestone ratings (P ranged from .32 to .97).

CONCLUSIONS: Milestone ratings of graduating general surgery residents were not associated with those surgeons' patient outcomes when they performed common, higher-risk procedures in a Medicare population. Efforts to improve how milestone ratings are generated might strengthen their association with early-career outcomes.
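The relative risk (RR) figures reported above follow a standard epidemiologic construction. The study itself reports risk-adjusted RRs from Medicare claims; the sketch below shows only the crude (unadjusted) calculation with a Wald log-normal confidence interval, and the counts are hypothetical, chosen merely to mirror the 23% vs 22% complication rates. The function name `relative_risk` is illustrative, not from the paper.

```python
import math

def relative_risk(events_exposed, n_exposed, events_ref, n_ref, z=1.96):
    """Crude relative risk with a Wald (log-normal) confidence interval.
    All counts are hypothetical; the study reports risk-adjusted RRs."""
    p1 = events_exposed / n_exposed   # risk in the exposed group
    p0 = events_ref / n_ref           # risk in the reference group
    rr = p1 / p0
    # Standard error of ln(RR) for a 2x2 table of counts.
    se = math.sqrt(1/events_exposed - 1/n_exposed + 1/events_ref - 1/n_ref)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, (lo, hi)

# Hypothetical example: 23% vs 22% complication rates with made-up
# denominators of 1,000 surgeons' cases per group.
rr, ci = relative_risk(230, 1000, 220, 1000)
print(round(rr, 2), tuple(round(x, 2) for x in ci))
```

With these made-up counts the interval comfortably spans 1.0, the same "no association" pattern the abstract reports.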
Affiliation(s)
- Daniel E Kendrick: assistant professor, Department of Surgery, University of Minnesota, Minneapolis, Minnesota
- Angela E Thelen: research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Xilin Chen: research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Tanvi Gupta: research analyst, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Kenji Yamazaki: senior data analyst, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Andrew E Krumm: assistant professor, Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan
- Hoda Bandeh-Ahmadi: project manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Michael Clark: biostatistician, Consulting for Statistics, Computing, and Analytics Research, University of Michigan, Ann Arbor, Michigan
- John Luckoscki: research fellow, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Zhaohui Fan: research analyst, Center for Healthcare Outcomes and Policy, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Greg M Wnuk: program manager, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Andrew M Ryan: professor, Department of Health Management and Policy, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Bhramar Mukherjee: professor and chair, Division of Biostatistics, School of Public Health, University of Michigan, Ann Arbor, Michigan
- Stanley J Hamstra: professor, Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- Justin B Dimick: professor and chair, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Eric S Holmboe: chief research, milestone development, and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Brian C George: director, Center for Surgical Training and Research, and assistant professor, Department of Surgery, University of Michigan, Ann Arbor, Michigan
5. Carroll MA, McKenzie A, Tracy-Bee M. Movement System Theory and Anatomical Competence: Threshold Concepts for Physical Therapist Anatomy Education. Anat Sci Educ 2022; 15:420-430. [PMID: 33825338] [DOI: 10.1002/ase.2083]
Abstract
This viewpoint proposes eight anatomy threshold concepts for physical therapist education, considering both movement system theory and anatomical competence. Movement system theory provides classifications and terminology that succinctly identify and describe physical therapy practice within a theoretical and philosophical framework. The cardiovascular, pulmonary, endocrine, integumentary, nervous, and musculoskeletal systems are all included within this schema, as movement system theory encompasses all body systems interacting to create movement across the lifespan. Implementing movement system theory requires the ability to apply human anatomy in physical therapist education and practice: understanding the human body is a mandatory prerequisite for effective diagnosis, assessment, treatment, and patient evaluation. Anatomical competence refers to the ability to apply anatomic knowledge within the appropriate professional and clinical contexts. Exploring the anatomical concepts required for competent entry-level physical therapist education and clinical practice is therefore warranted. The recommended threshold concepts (fluency, dimensionality, adaptability, connectivity, complexity, stability or homeostasis, progression or development, and humanity) could serve as an integral and long-awaited tool for guiding anatomy educators in physical therapy education.
Affiliation(s)
- Melissa A Carroll: Division of Healthcare Professions, Doctor of Physical Therapy Program, DeSales University, Center Valley, Pennsylvania
- Alison McKenzie: Department of Physical Therapy, Chapman University, Irvine, California; Department of Neurology, University of California, Irvine, California
- Mary Tracy-Bee: Biology Department, University of Detroit Mercy, Detroit, Michigan; Department of Movement Science, Physical Therapy Program, Oakland University, Rochester, Michigan
6. Mikhaeil-Demo Y, Holmboe E, Gerard EE, Wayne DB, Cohen ER, Yamazaki K, Templer JW, Bega D, Culler GW, Bhatt AB, Shafi N, Barsuk JH. Simulation-Based Assessments and Graduating Neurology Residents' Milestones: Status Epilepticus Milestones. J Grad Med Educ 2021; 13:223-230. [PMID: 33897956] [PMCID: PMC8054597] [DOI: 10.4300/jgme-d-20-00832.1]
Abstract
BACKGROUND: The American Board of Psychiatry and Neurology and the Accreditation Council for Graduate Medical Education (ACGME) developed Milestones that provide a framework for resident assessment. However, the Milestones do not describe how programs should perform those assessments.

OBJECTIVES: We evaluated graduating residents' status epilepticus (SE) identification and management skills and how they correlate with the ACGME Milestones for epilepsy and management/treatment reported by each program's clinical competency committee (CCC).

METHODS: We performed a cohort study of graduating neurology residents from 3 academic medical centers in Chicago in 2018. We evaluated residents' skills in identifying and managing SE using a simulation-based assessment (a 26-item checklist). Simulation-based assessment scores were compared with experience (the number of SE cases each resident reported identifying and managing during residency), self-confidence in identifying and managing these cases, and the end-of-residency Milestones assigned by a CCC based on end-of-rotation evaluations.

RESULTS: Sixteen of 21 (76%) eligible residents participated in the study. The average SE checklist score was 15.6 of 26 items correct (60%, SD 12.2%). There were no significant correlations between resident checklist performance and experience or self-confidence. The average participant's Milestone levels for epilepsy and management/treatment were high, at 4.3 of 5 (SD 0.4) and 4.4 of 5 (SD 0.4), respectively, and there were no significant associations between checklist performance and the Milestone level assigned.

CONCLUSIONS: Simulated SE skills performance of graduating neurology residents was poor. Our study suggests that end-of-rotation evaluations alone are inadequate for assigning Milestones for high-stakes clinical skills such as identification and management of SE.
Affiliation(s)
- Yara Mikhaeil-Demo, MD: Assistant Professor, Department of Neurology, Northwestern University, Feinberg School of Medicine
- Eric Holmboe, MD, MACP, FRCP: Chief Research, Milestone Development, and Evaluation Officer, Accreditation Council for Graduate Medical Education (ACGME)
- Elizabeth E. Gerard, MD: Director, Clinical Neurophysiology Fellowship, and Associate Professor, Department of Neurology, Northwestern University, Feinberg School of Medicine
- Diane B. Wayne, MD: Vice Dean for Education, Chair, Department of Medical Education, and Professor of Medicine and Medical Education, Northwestern University, Feinberg School of Medicine
- Elaine R. Cohen, MEd: Research Associate, Department of Medicine, Northwestern University, Feinberg School of Medicine
- Kenji Yamazaki, PhD: Senior Analyst, Milestones Research and Evaluation, ACGME
- Jessica W. Templer, MD: Director, Epilepsy Fellowship, and Assistant Professor, Department of Neurology, Northwestern University, Feinberg School of Medicine
- Danny Bega, MD: Director, Neurology Residency Program, and Assistant Professor, Department of Neurology, Northwestern University, Feinberg School of Medicine
- George W. Culler, MD: Epilepsy Fellow, Department of Neurology, Northwestern University, Feinberg School of Medicine
- Amar B. Bhatt, MD: Assistant Professor, Department of Neurological Sciences, Rush University
- Neelofer Shafi, MD: Director, Students and Faculty Development, and Assistant Professor, Department of Neurology and Rehabilitation, University of Illinois Chicago
- Jeffrey H. Barsuk, MD, MS: Director, Simulation and Patient Safety, and Professor of Medicine and Medical Education, Northwestern University, Feinberg School of Medicine
7. H'ng MWC, Kassim NA, Wong DES. Workplace-based assessments in the Singapore radiology residency programme - aiming for the next milestone. Singapore Med J 2021; 62:149-152. [PMID: 33846755] [DOI: 10.11622/smedj.2021028]
8. Greenberg L. Can the Recruitment of Senior Transitioning Clinician Educators Enhance the Number and Quality of Resident Observations? Thinking Outside the Box. Teach Learn Med 2020; 32:569-574. [PMID: 32841577] [DOI: 10.1080/10401334.2020.1801442]
Abstract
Issue: The Accreditation Council for Graduate Medical Education's Next Accreditation System has fundamentally changed the way faculty evaluate residents, fellows, and medical students, mandating direct observation of trainee performance by faculty.

Evidence: The literature suggests that institutional culture does not support trainee observation and that faculty perceive they have limited time to observe trainees efficiently and effectively. These factors contribute to an inadequate number of trainee observations, limiting faculty's ability to assess trainees' achievement of competency. Hiring more faculty to increase observations has not been feasible or a priority, nor have faculty development programs been universally effective in recruiting faculty to increase observations.

Implications: To alleviate this problem, the author proposes recruiting senior clinician educators who are transitioning to retirement: faculty who played a major teaching role in their full-time careers and might be interested in continuing their relationship with the academic health center. The number of such physicians is increasing, so a larger pool will be seeking opportunities to continue their commitment to education. Recruiting them could significantly increase the number and quality of resident observations, addressing a previously insoluble problem at a significant return on investment to the academic health center.
Affiliation(s)
- Larrie Greenberg: Children's National Medical Center, The George Washington University School of Medicine and Health Sciences, Potomac, Maryland, USA
9. Ahle SL, Gielissen K, Keene DE, Blasberg JD. Understanding Entrustment Decision-Making by Surgical Program Directors. J Surg Res 2020; 249:74-81. [DOI: 10.1016/j.jss.2019.12.001]
10. Sturm EC, Mellinger JD, Koehler JL, Wall JCH. An Appreciative Inquiry Approach to the Core Competencies: Taking it From Theory to Practice. J Surg Educ 2020; 77:380-389. [PMID: 31831306] [DOI: 10.1016/j.jsurg.2019.11.002]
Abstract
OBJECTIVE: To operationalize the surgical core competencies by using a qualitative inquiry strategy to explore how surgical competence is behaviorally demonstrated by faculty.

DESIGN: Categorical general and vascular surgery residents completed a survey soliciting opinions on which faculty were deemed most representative of each core competency. The surveys served as a theoretical sample: the surgeons selected were then interviewed, and the interviews were transcribed. A qualitative research approach using grounded theory coding methods was used for transcript analysis; iterative coding was performed, and emergent themes were extracted.

SETTING: Southern Illinois University School of Medicine, Department of Surgery, in Springfield, Illinois, a tertiary academic center.

PARTICIPANTS: Fourteen of 19 residents completed the survey (74% response rate). Two surgeons were selected for each competency; a total of 7 interviews were performed, with 4 surgeons chosen for 2 competencies.

RESULTS: Emergent themes revealed that competent surgeons shared qualities that drove their development and execution of each competency: self-awareness, a selfless character, responsibility and ownership, context awareness, reliance on relationships and community, and a pattern of habit formation and discipline. Additionally, the competencies were noted to be pursued in an interrelated and interdependent fashion.

CONCLUSIONS: Surgeons deemed competent in any core domain shared common qualities, and further study exploring how each of these is identified, developed, and taught is warranted. The competencies form an interrelated matrix whose development and execution correlate with foundational personal disciplines.
Affiliation(s)
- Emily C Sturm: Southern Illinois University School of Medicine, Department of Surgery, Springfield, Illinois
- John D Mellinger: Southern Illinois University School of Medicine, Department of Surgery, Springfield, Illinois
- Jeanne L Koehler: Southern Illinois University School of Medicine, Department of Medical Education, Springfield, Illinois
- Jarrod C H Wall: Southern Illinois University School of Medicine, Department of Surgery, Springfield, Illinois
11. Ahle SL, Schuller M, Clark MJ, Williams RG, Wnuk G, Fryer JP, George BC. Do End-of-Rotation Evaluations Adequately Assess Readiness to Operate? Acad Med 2019; 94:1946-1952. [PMID: 31397708] [DOI: 10.1097/acm.0000000000002936]
Abstract
PURPOSE: Medical educators have no standard way to assess the operative performance of surgical residents; most residency programs use end-of-rotation (EOR) evaluations for this purpose. Recently, some programs have implemented workplace-based "microassessment" tools that faculty use to rate observed operative performance immediately. The authors sought to determine (1) the degree to which EOR evaluations correspond to workplace-based microassessments and (2) which factors most influence EOR evaluations and directly observed workplace-based performance ratings, and how the influence of those factors differs between the two assessment methods.

METHOD: In 2017, the authors retrospectively analyzed EOR evaluations and immediate postoperative assessment ratings of surgical trainees from a university-based training program for the 2015-2016 academic year. A Bayesian multivariate mixed model was constructed to predict operative performance ratings for each type of assessment.

RESULTS: Ratings of operative performance from EOR evaluations and workplace-based microassessment ratings had a Pearson correlation of 0.55. Postgraduate year (PGY) of training was the most important predictor of operative performance ratings on EOR evaluations: model estimates ranged from 0.62 to 1.75 and increased with PGY. For workplace-based assessment, the operative autonomy rating was the most important predictor of operative performance (coefficient = 0.74).

CONCLUSIONS: EOR evaluations are perhaps most useful for assessing a resident's progress toward becoming a surgeon relative to other trainees in the same PGY of training. Workplace-based microassessments may be better for assessing a trainee's ability to perform specific procedures autonomously, perhaps providing more insight into a trainee's true readiness for operative independence.
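The Pearson correlation of 0.55 reported above is a standard paired statistic. As a minimal sketch of how it is computed, the snippet below correlates hypothetical paired ratings (not the study's data) for the same residents under the two assessment methods; the rating values and the function name `pearson_r` are illustrative only.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length rating lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired ratings (1-5 scale): end-of-rotation evaluation vs
# mean workplace-based microassessment for the same six residents.
eor =   [3.0, 3.5, 4.0, 4.0, 4.5, 5.0]
micro = [2.5, 4.0, 3.0, 4.5, 3.5, 4.5]
print(round(pearson_r(eor, micro), 2))
```

A value near 0.55, as in the study, indicates moderate but far from perfect agreement between the two assessment methods.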
Affiliation(s)
- Samantha L Ahle: general surgery resident, Yale School of Medicine, New Haven, Connecticut
- M. Schuller: manager, Surgical Education, Department of Surgery, Northwestern University, Chicago, Illinois
- M.J. Clark: lead statistician, Consulting for Statistics, Computing and Analytics Research, University of Michigan, Ann Arbor, Michigan
- R.G. Williams: adjunct research professor (at the time of the research), Indiana University School of Medicine, Indianapolis, Indiana
- G. Wnuk: program manager, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
- J.P. Fryer: vice chair of education, Department of Surgery, Northwestern University, Chicago, Illinois
- B.C. George: assistant professor and director, Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan (ORCID: https://orcid.org/0000-0002-9404-5255)
12. Henry D, West DC. The Clinical Learning Environment and Workplace-Based Assessment: Frameworks, Strategies, and Implementation. Pediatr Clin North Am 2019; 66:839-854. [PMID: 31230626] [DOI: 10.1016/j.pcl.2019.03.010]
Abstract
This article provides an overview of the role the clinical learning environment plays in providing opportunities to assess trainee performance and of how those assessments can guide learning. It reviews the importance of competency models as frameworks that help learners and supervisors build a shared mental model of what is to be learned. In addition, it discusses how assessment can be used to drive mastery learning, as well as the components necessary for a program of assessment.
Affiliation(s)
- Duncan Henry: Department of Pediatrics, University of California San Francisco, 550 16th Street, 5th floor, Box 0110, San Francisco, CA 94143-0110, USA
- Daniel C West: Department of Pediatrics, University of California San Francisco, 550 16th Street, 4th floor, Box 0110, San Francisco, CA 94143-0110, USA
13. Who moved my fellow: changes to Accreditation Council for Graduate Medical Education fellowships in pediatric surgery and what may be yet to come. Curr Opin Pediatr 2019; 31:409-413. [PMID: 31090584] [DOI: 10.1097/mop.0000000000000762]
Abstract
PURPOSE OF REVIEW: Over the past 15 years, the Accreditation Council for Graduate Medical Education (ACGME) has significantly altered the regulatory framework governing fellowship training in pediatric surgery. The daily experiences of pediatric surgical trainees have been affected by these changes, but training program directors and faculty have not developed a consistent approach to managing this shift. This review highlights the changes that have occurred, analyzes the current state of fellowship training, and proposes potential management strategies.

RECENT FINDINGS: The implementation of work hour restrictions, increased supervision requirements, the milestone evaluation program, and, most recently, enforcement of the required critical care experience have caused significant changes in the curriculum. Pediatric surgical trainees record more total cases, and more minimally invasive surgical (MIS) cases in particular, than ever before; a subset of this increase may result from trainees performing cases previously assigned to general surgery residents. Teaching cases performed by fellows have decreased. Although the relationship between these shifts in training experience and the didactic curriculum is not clear, the Pediatric Surgery Certifying Examination failure rate has also increased, approaching 20% in recent years.

SUMMARY: It is unclear whether the changes to pediatric surgery training programs have been effective or have instead led to unintended consequences. Paradigm shifts in the training model may be required to address the changes in surgical education and skill acquisition, so that safe, competent, and skillful pediatric surgeons continue to enter the workforce.
|
14
|
Dauphinee WD, Boulet JR, Norcini JJ. Considerations that will determine if competency-based assessment is a sustainable innovation. Adv Health Sci Educ Theory Pract 2019; 24:413-421. [PMID: 29777463 DOI: 10.1007/s10459-018-9833-2] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Received: 09/22/2017] [Accepted: 05/11/2018] [Indexed: 06/08/2023]
Abstract
Educational assessment for the health professions has seen a major attempt to introduce competency-based frameworks. As high-level policy developments, the changes were intended to improve outcomes by supporting learning and skills development. However, we argue that previous experiences with major innovations in assessment offer an important road map for developing and refining assessment innovations, including careful piloting and analysis of their measurement qualities and impacts. Based on the literature, numerous assessment workshops, personal interactions with potential users, and our 40 years of experience in implementing assessment change, we lament the lack of a coordinated approach to clarifying and improving the measurement qualities and functionality of competency-based assessment (CBA). To address this worrisome situation, we offer two road maps to guide CBA's further development. First, reframe and address CBA as a measurement development opportunity. Second, using a road map adapted from the management literature on sustainable innovation, the medical assessment community should initiate an integrated plan to implement CBA as a sustainable innovation within existing educational programs and self-regulatory enterprises. Further examples of downstream opportunities to refocus CBA at the implementation level, within faculties and within the regulatory framework of the profession, are offered. In closing, we challenge the broader assessment community in medicine to step forward and own the challenge and opportunity of reframing CBA as an innovation to improve the quality of the clinical educational experience. The goal is to optimize assessment in health education and ultimately improve the public's health.
Affiliation(s)
- W Dale Dauphinee
- Foundation for the Advancement of International Medical Education and Research, 3624 Market Street, Fourth Floor, Philadelphia, PA, 19104, USA.
- McGill University, 1140 Pine Avenue West, Montreal, QC, H3A 1A3, Canada.
- Saint Andrews, NB, Canada.
- John R Boulet
- Foundation for the Advancement of International Medical Education and Research, 3624 Market Street, Fourth Floor, Philadelphia, PA, 19104, USA
- John J Norcini
- Foundation for the Advancement of International Medical Education and Research, 3624 Market Street, Fourth Floor, Philadelphia, PA, 19104, USA
|
15
|
LeBeau L, Morgan C, Heath D, Pazdernik VK. Assessing Competency in Family Medicine Residents Using the Osteopathic Manipulative Medicine Mini-Clinical Evaluation Exercise. J Osteopath Med 2019; 119:2721358. [PMID: 30644522 DOI: 10.7556/jaoa.2019.013] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/24/2022]
Abstract
CONTEXT The Mini-Clinical Evaluation Exercise (Mini-CEX) is one example of a direct observation tool used for workplace-based skills assessment. The Mini-CEX has been validated as a useful formative evaluation tool in graduate medical education. However, no Mini-CEX has been reported in the literature that specifically assesses the osteopathic manipulative medicine (OMM) skills of family medicine residents. Therefore, the authors created and studied an OMM Mini-CEX to fill this skills assessment gap. OBJECTIVE To determine whether the OMM Mini-CEX is perceived as an effective evaluation tool for assessing the OMM core competencies of family medicine residents. METHODS Faculty and residents of The Wright Center for Graduate Medical Education National Family Medicine Residency program participated in the study. Each resident was evaluated at least once using the OMM Mini-CEX. Surveys were used to assess faculty and resident perceptions of the usefulness and effectiveness of the OMM Mini-CEX for assessing OMM competencies. RESULTS Eighty-one responses were received during 2 survey cycles within a 7-month period. The internal consistency of the survey responses had a high reliability (Cronbach α=0.93). Considering respondents who agreed that they had a clear understanding of the general purpose of a Mini-CEX, the perceived effectiveness score for the OMM Mini-CEX was higher among those who agreed that a Mini-CEX was a useful part of training than among those who disagreed or were unsure of its usefulness (median score, 4.0 vs 3.4, respectively; P=.047). CONCLUSIONS The results suggest the OMM Mini-CEX can be a useful direct observation evaluation tool to assess OMM core competencies in family medicine residents. Additional research is needed to determine its perceived effectiveness in other clinical specialties and in undergraduate medical education.
|
16
|
Binkley J, Bukoski AD, Doty J, Crane M, Barnes SL, Quick JA. Surgical Simulation: Markers of Proficiency. J Surg Educ 2019; 76:234-241. [PMID: 29983346 DOI: 10.1016/j.jsurg.2018.05.018] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Received: 01/29/2018] [Revised: 04/30/2018] [Accepted: 05/27/2018] [Indexed: 06/08/2023]
Abstract
OBJECTIVE Surgical simulation has become an integral component of surgical training, yet simulation proficiency has traditionally been determined by time to completion of various simulated tasks. We aimed to identify objective markers of proficiency in surgical simulation by comparing novel assessments with conventional evaluations of technical skill. DESIGN Categorical general surgery residents completed 10 laparoscopic cholecystectomy modules using a high-fidelity simulator. We recorded and analyzed simulation task times, as well as number of hand movements, instrument path length, instrument acceleration, and participant affective engagement during each simulation. Comparisons were made with Objective Structured Assessment of Technical Skill (OSATS) scores and Accreditation Council for Graduate Medical Education (ACGME) Milestones, as well as with previous laparoscopic experience, duration of laparoscopic cholecystectomies performed by participants, postgraduate year, and Fundamentals of Laparoscopic Surgery (FLS) task times. Spearman's rho was used for comparisons, with correlations above 0.50 considered meaningful. SETTING University of Missouri, Columbia, Missouri, an academic tertiary care facility. PARTICIPANTS Fourteen categorical general surgery residents (postgraduate years 1-5) were prospectively enrolled. RESULTS One hundred forty simulations were included. The number of hand movements and instrument path lengths strongly correlated with simulation task times (ρ 0.62-0.87, p < 0.0001), FLS task completion times (ρ 0.50-0.53, p < 0.0001), and prior real-world laparoscopic cholecystectomy experience (ρ -0.51 to -0.53, p < 0.0001). No significant correlations were identified between any of the studied markers and ACGME Milestones, OSATS evaluations, total previous laparoscopic experience, or postgraduate year level. Neither instrument acceleration nor participant engagement correlated significantly with any of the conventional markers of real-world or simulation skill proficiency. CONCLUSIONS Simulation proficiency, measured by instrument and hand motion, is more representative of simulation skill than simulation task time, instrument acceleration, or participant engagement.
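For readers unfamiliar with the statistic, the motion-metric comparisons above rely on Spearman's rank correlation, which is simply the Pearson correlation of the rank vectors. A minimal pure-Python sketch follows; the path-length and task-time data are hypothetical illustrations, not the study's data:

```python
from statistics import mean

def _ranks(xs):
    """Assign 1-based ranks; tied values share the average of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend j over any run of tied values (adjacent in sorted order)
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: longer instrument path lengths paired with longer task times
path_lengths = [4.1, 2.3, 5.0, 3.2, 2.9]   # meters
task_times = [310, 180, 400, 260, 240]      # seconds
print(round(spearman_rho(path_lengths, task_times), 2))  # -> 1.0 (perfectly monotone)
```

Because rho depends only on ranks, it captures the monotone association the study reports without assuming linearity in the raw motion metrics.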
Affiliation(s)
- Jana Binkley
- School of Medicine, University of Missouri, Columbia, Missouri
- Alex D Bukoski
- Department of Veterinary Medicine and Surgery, University of Missouri, Columbia, Missouri
- Jennifer Doty
- Department of Surgery, University of Missouri, Columbia, Missouri
- Megan Crane
- Department of Surgery, University of Missouri, Columbia, Missouri
- Stephen L Barnes
- Department of Surgery, University of Missouri, Columbia, Missouri
- Jacob A Quick
- Department of Surgery, University of Missouri, Columbia, Missouri.
|
17
|
Reddy ST. Preliminary Validity Evidence for a Milestones-Based Rating Scale for Chart-Stimulated Recall. J Grad Med Educ 2018; 10:269-275. [PMID: 29946382 PMCID: PMC6008016 DOI: 10.4300/jgme-d-17-00435.1] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Received: 06/21/2017] [Revised: 11/03/2017] [Accepted: 03/05/2018] [Indexed: 11/06/2022]
Abstract
BACKGROUND Minimally anchored standard rating scales (SRSs), which are widely used in medical education, are hampered by suboptimal interrater reliability. Expert-derived frameworks, such as the Accreditation Council for Graduate Medical Education (ACGME) Milestones, may be helpful in defining level-specific anchors for rating scales. OBJECTIVE We examined validity evidence for a Milestones-based rating scale (MBRS) for scoring chart-stimulated recall (CSR). METHODS Two 11-item scoring forms, one with an MBRS and one with an SRS, were developed. Items and anchors for the MBRS were adapted from the ACGME Internal Medicine Milestones. Six standardized CSR videos were developed. Clinical faculty scored the videos using either the MBRS or the SRS, following a randomized crossover design. Reliability of the MBRS versus the SRS was compared using intraclass correlation. RESULTS Twenty-two faculty were recruited for instrument testing. Some participants did not complete scoring, leaving a response rate of 15 faculty (7 in the MBRS group and 8 in the SRS group). A total of 529 ratings (number of items × number of scores) using the SRS and 540 using the MBRS were available. Percent agreement was higher for the MBRS on only 2 of 11 items, use of consultants (92 versus 75, P = .019) and unique characteristics of patients (96 versus 79, P = .011), and on the overall score (89 versus 82, P < .001). Interrater agreement was 0.61 for the MBRS and 0.51 for the SRS. CONCLUSIONS Adding milestones to our rating form resulted in a significant, but not substantial, improvement in the intraclass correlation coefficient, and the improvement was inconsistent across items.
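The percent-agreement figures above reduce to a simple proportion of items on which two raters gave identical scores. A minimal sketch, using hypothetical ratings rather than the study's data:

```python
def percent_agreement(ratings_a, ratings_b):
    """Share of items on which two raters gave identical scores, as a percentage."""
    if len(ratings_a) != len(ratings_b):
        raise ValueError("rating lists must be the same length")
    matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return 100.0 * matches / len(ratings_a)

# Hypothetical item scores from two faculty raters on the same encounter
rater_1 = [5, 6, 4, 7, 5, 6, 3, 5]
rater_2 = [5, 6, 5, 7, 5, 4, 3, 5]
print(percent_agreement(rater_1, rater_2))  # 6 of 8 items match -> 75.0
```

Percent agreement is easy to interpret but, unlike the intraclass correlation the study also reports, it does not correct for agreement expected by chance.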
|
18
|
Benson NM, Puckett JA, Chaukos DC, Gerken AT, Baker JT, Smith FA, Beach SR. Curriculum Overhaul in Psychiatric Residency: An Innovative Approach to Revising the Didactic Lecture Series. Acad Psychiatry 2018; 42:258-261. [PMID: 28493218 DOI: 10.1007/s40596-017-0717-1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/21/2016] [Accepted: 04/27/2017] [Indexed: 06/07/2023]
|
19
|
Linsenmeyer M, Wimsatt L, Speicher M, Powers J, Miller S, Katsaros E. Assessment Considerations for Core Entrustable Professional Activities for Entering Residency. J Osteopath Med 2018; 118:243-251. [DOI: 10.7556/jaoa.2018.049] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Indexed: 11/24/2022]
Abstract
Context
In the process of analyzing entrustable professional activities (EPAs) for use in medical education, ten Cate and others identified challenges, including the need for valid and reliable EPA assessment strategies.
Objective
To provide osteopathic medical schools with a database of assessment tools compiled from the literature to assist them with the development and implementation of robust, evidence-based assessment methods.
Methods
MEDLINE, ERIC, PubMed, and other relevant databases were searched using MeSH keywords for articles outlining robust, evidence-based assessment tools that could be used in designing assessments for EPAs 1 through 6.
Results
A total of 55 publications were included in the content analysis and reporting. All but 2 of the assessment studies were conducted in an undergraduate or graduate medical education setting. The majority of the 55 articles related to assessment of competencies associated with EPA 2 (16 articles) and EPA 4 (15 articles); only 4 articles focused on EPA 3.
Conclusion
Osteopathic medical schools can use this database of assessment tools to support the development of EPA-specific assessment plans that match the unique context and needs of their institution.
|
20
|
Ginsburg S, van der Vleuten CPM, Eva KW. The Hidden Value of Narrative Comments for Assessment: A Quantitative Reliability Analysis of Qualitative Data. Acad Med 2017; 92:1617-1621. [PMID: 28403004 DOI: 10.1097/acm.0000000000001669] [Citation(s) in RCA: 72] [Impact Index Per Article: 10.3] [Indexed: 05/12/2023]
Abstract
PURPOSE In-training evaluation reports (ITERs) are ubiquitous in internal medicine (IM) residency. Written comments can provide a rich data source, yet are often overlooked. This study determined the reliability of using variable amounts of commentary to discriminate between residents. METHOD ITER comments from two cohorts of PGY-1s in IM at the University of Toronto (graduating 2010 and 2011; n = 46-48) were put into sets containing 15 to 16 residents. Parallel sets were created: one with comments from the full year and one with comments from only the first three assessments. Each set was rank-ordered by four internists external to the program between April 2014 and May 2015 (n = 24). Generalizability analyses and a decision study were performed. RESULTS For the full year of comments, reliability coefficients averaged across four rankers were G = 0.85 and G = 0.91 for the two cohorts. For a single ranker, G = 0.60 and G = 0.73. Using only the first three assessments, reliabilities remained high at G = 0.66 and G = 0.60 for a single ranker. In a decision study, if two internists ranked the first three assessments, reliability would be G = 0.80 and G = 0.75 for the two cohorts. CONCLUSIONS Using written comments to discriminate between residents can be extremely reliable even after only several reports are collected. This suggests a way to identify residents early on who may require attention. These findings contribute evidence to support the validity argument for using qualitative data for assessment.
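The decision study's projection from single-ranker reliability to two rankers follows the standard Spearman-Brown prophecy formula, G_k = k·G₁ / (1 + (k − 1)·G₁). A short sketch (our illustration of the calculation, not the study's code) reproduces the numbers reported in the abstract:

```python
def spearman_brown(g_single, n_raters):
    """Project the reliability of a composite of n parallel raters
    from the reliability of a single rater."""
    return n_raters * g_single / (1 + (n_raters - 1) * g_single)

# Single-ranker G coefficients for the first three assessments, per the abstract
print(round(spearman_brown(0.66, 2), 2))  # -> 0.8, matching the reported G = 0.80
print(round(spearman_brown(0.60, 2), 2))  # -> 0.75, matching the reported G = 0.75
```

The same formula explains why averaging four rankers over a full year of comments pushed reliability into the 0.85-0.91 range from single-ranker values of 0.60-0.73.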
Affiliation(s)
- Shiphra Ginsburg: professor, Department of Medicine, and scientist, Wilson Centre for Research in Education, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada
- C.P.M. van der Vleuten: professor of education, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, the Netherlands
- K.W. Eva: associate director and senior scientist, Centre for Health Education Scholarship, and professor and director of educational research and scholarship, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
|
21
|
Wood TJ, Chan J, Humphrey-Murto S, Pugh D, Touchie C. The influence of first impressions on subsequent ratings within an OSCE station. Adv Health Sci Educ Theory Pract 2017; 22:969-983. [PMID: 27848171 DOI: 10.1007/s10459-016-9736-z] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Received: 11/12/2015] [Accepted: 11/03/2016] [Indexed: 05/25/2023]
Abstract
Competency-based assessment places increasing emphasis on the direct observation of learners. For this process to produce valid results, raters must provide accurate, high-quality judgments. Unfortunately, the quality of these judgments is variable, and the factors that influence their accuracy are not clearly understood. One such factor is first impressions: judgments about people we do not know, made quickly and based on very little information. This study explores the influence of first impressions in an OSCE; specifically, its purpose is to examine the accuracy of a first impression and its influence on subsequent ratings. We created six videotapes of history-taking performance, each scripted from a real performance by one of six examinee residents within a single OSCE station. Each performance was re-enacted and videotaped, with six different actors playing the examinees and one actor playing the patient. A total of 23 raters (physician examiners) reviewed each video and were asked to make a global judgment of the examinee's clinical abilities after 60 s on a six-point global rating scale (First Impression GR) and then to rate their confidence in the accuracy of that judgment on a five-point scale (Confidence GR). Raters then watched the remainder of the examinee's performance and made another global rating of performance (Final GR) before moving on to the next video. First impression ratings of ability varied across examinees and were moderately correlated with expert ratings (r = .59, 95% CI [-.13, .90]). There were significant differences in mean ratings for three examinees. Correlations ranged from .05 to .56 but were significant for only three examinees. Rater confidence in their first impression was not related to the likelihood of a rater changing their rating between the first impression and a subsequent rating. The findings suggest that first impressions could play a role in explaining variability in judgments, but their importance was determined by the videotaped performance of the examinees. More work is needed to clarify the conditions that support or discourage the use of first impressions.
Affiliation(s)
- Timothy J Wood
- Department of Innovation in Medical Education, Faculty of Medicine, University of Ottawa, RGN 2206, 451 Smyth Road, Ottawa, ON, K1H 8M5, Canada.
- James Chan
- Department of Medicine, University of Ottawa, 501 Smyth Road, Ottawa, ON, K1H 8L6, Canada
- Susan Humphrey-Murto
- Department of Medicine, University of Ottawa, 501 Smyth Road, Ottawa, ON, K1H 8L6, Canada
- Debra Pugh
- Department of Medicine, University of Ottawa, 501 Smyth Road, Ottawa, ON, K1H 8L6, Canada
- Claire Touchie
- Department of Medicine, University of Ottawa, 501 Smyth Road, Ottawa, ON, K1H 8L6, Canada
|
22
|
Bruce AN, Kumar A, Malekzadeh S. Procedural Skills of the Entrustable Professional Activities: Are Graduating US Medical Students Prepared to Perform Procedures in Residency? J Surg Educ 2017; 74:589-595. [PMID: 28126380 DOI: 10.1016/j.jsurg.2017.01.002] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Received: 04/18/2016] [Revised: 11/17/2016] [Accepted: 01/01/2017] [Indexed: 06/06/2023]
Abstract
PURPOSE Competency-based medical education has been successfully instituted in graduate medical education through the development of the Milestones. Consequently, the Association of American Medical Colleges implemented the core entrustable professional activities initiative to complement this framework in undergraduate medical education. We sought to determine its efficacy by examining the experiences and confidence of recent medical school graduates with general procedural skills (entrustable professional activity 12). METHOD We administered an electronic survey to the MedStar Georgetown University Hospital intern class assessing their experiences with learning and evaluation, as well as their confidence with procedural skills training, during medical school. Simple linear regression was used to compare respondent confidence with the presence of formal evaluation in medical school. RESULTS We received 28 complete responses, a 33% response rate. Although most respondents indicated that basic cardiopulmonary resuscitation, bag/mask ventilation, and universal precautions were important to and evaluated by their medical school, this emphasis was not present for venipuncture, intravenous catheter placement, and arterial puncture. Mean summed confidence scores for each skill indicated a statistically significant association between confidence and evaluation of universal precaution skills. CONCLUSIONS More advanced procedural skills are considered less important for graduating medical students and are less likely to be taught and formally evaluated before graduation. Formal evaluation of some procedural skills is associated with increased learner confidence.
Affiliation(s)
- Adrienne N Bruce
- Department of Student Research, Georgetown University School of Medicine, Washington, DC.
- Anagha Kumar
- Department of Biostatistics, MedStar Health Research Institute, Hyattsville, Maryland
- Sonya Malekzadeh
- Department of Otolaryngology-Head and Neck Surgery, Georgetown University School of Medicine, Washington, DC
|
23
|
McCloskey CB, Domen RE, Conran RM, Hoffman RD, Post MD, Brissette MD, Gratzinger DA, Raciti PM, Cohen DA, Roberts CA, Rojiani AM, Kong CS, Peterson JEG, Johnson K, Plath S, Powell SZE. Entrustable Professional Activities for Pathology: Recommendations From the College of American Pathologists Graduate Medical Education Committee. Acad Pathol 2017; 4:2374289517714283. [PMID: 28725792 PMCID: PMC5496684 DOI: 10.1177/2374289517714283] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Received: 03/21/2017] [Revised: 05/01/2017] [Accepted: 05/03/2017] [Indexed: 01/10/2023]
Abstract
Competency-based medical education has evolved over the past decades to include the Accreditation Council for Graduate Medical Education (ACGME) Next Accreditation System of resident evaluation based on the Milestones project. Entrustable professional activities (EPAs) represent another means to determine learner proficiency and evaluate educational outcomes in the workplace and training environment. The objective of this project was to develop EPAs for pathology graduate medical education encompassing primary anatomic and clinical pathology residency training. The Graduate Medical Education Committee of the College of American Pathologists met over the course of 2 years to identify and define EPAs for pathology graduate medical education. Nineteen EPAs were developed, including 7 for anatomic pathology, 4 for clinical pathology, and 8 that apply to both disciplines, 5 of which concern laboratory management. The content defined for each EPA includes the EPA title, a description of the knowledge and skills required for competent performance, mapping to relevant ACGME Milestone subcompetencies, and general assessment methods. Many critical activities that define the practice of pathology fit well within the EPA model. The EPAs outlined by the Graduate Medical Education Committee are meant to provide an initial framework for the development of EPA-related assessment and curricular tools for pathology residency training.
Affiliation(s)
- Cindy B McCloskey
- Department of Pathology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, USA
- Ronald E Domen
- Department of Pathology, Penn State Hershey Medical Center and College of Medicine, Hershey, PA, USA
- Richard M Conran
- Department of Pathology and Anatomy, Eastern Virginia Medical School, Norfolk, VA, USA
- Robert D Hoffman
- Department of Pathology, Microbiology, and Immunology, Vanderbilt University School of Medicine, Nashville, TN, USA
- Miriam D Post
- Department of Pathology, University of Colorado-Anschutz Medical Campus, Aurora, CO, USA
- Dita A Gratzinger
- Department of Pathology, School of Medicine, Stanford University, Stanford, CA, USA
- Patricia M Raciti
- Department of Pathology and Cell Biology, Columbia University Medical Center, New York Presbyterian Hospital, New York, NY, USA
- David A Cohen
- Department of Pathology and Genomic Medicine, Houston Methodist Hospital, Houston, TX, USA
- Amyn M Rojiani
- Department of Pathology, Medical College of Georgia, Augusta University, Augusta, GA, USA
- Christina S Kong
- Department of Pathology, School of Medicine, Stanford University, Stanford, CA, USA
- Jo Elle G Peterson
- Department of Pathology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, USA
- Sue Plath
- College of American Pathologists, Northfield, IL, USA
|
24
|
Peabody MR, O'Neill TR, Peterson LE. Examining the Functioning and Reliability of the Family Medicine Milestones. J Grad Med Educ 2017; 9:46-53. [PMID: 28261393 PMCID: PMC5319627 DOI: 10.4300/jgme-d-16-00172.1] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Indexed: 11/06/2022]
Abstract
BACKGROUND The Family Medicine (FM) Milestones are a framework designed to assess residents' development in key dimensions of physician competency. Residency programs use the milestones in semiannual reviews of resident performance from entry to graduation. OBJECTIVE To examine the functioning and reliability of the FM Milestones and to determine whether they measure the amount of a latent trait (eg, knowledge or ability) possessed by a resident or simply indicate where a resident falls along the training sequence. METHODS This study used the Rasch partial credit model to examine academic year 2014-2015 ratings for 10 563 residents from 476 residency programs (postgraduate year [PGY] 1 = 3639; PGY-2 = 3562; PGY-3 = 3351; PGY-4 = 11). RESULTS Reliability was exceptionally high at 0.99. Mean scores were 3.2 (SD = 1.3) for PGY-1; 5.0 (SD = 1.3) for PGY-2; 6.7 (SD = 1.2) for PGY-3; and 7.4 (SD = 1.0) for PGY-4. Keyform analysis showed that a rating on one item was likely to be similar to the ratings on all other items. CONCLUSIONS Our findings suggest that the FM Milestones largely function as intended. The lack of spread in item difficulty and lack of variation in category probabilities show that the FM Milestones do not measure the amount of a latent trait possessed by a resident but rather describe where a resident falls along the training sequence. The high reliability indicates that residents are rated in a stable manner as they progress through residency, and individual residents who deviate from this rating structure warrant consideration by program leaders.
Affiliation(s)
- Michael R. Peabody
- Corresponding author: Michael R. Peabody, PhD, American Board of Family Medicine, 1648 McGrathiana Parkway, Suite 550, Lexington, KY 40511, 859.269.5626, ext 1226, fax 859.335.7501,
|
25
|
Goldman RH, Tuomala RE, Bengtson JM, Stagg AR. How Effective Are New Milestones Assessments at Demonstrating Resident Growth? 1 Year of Data. J Surg Educ 2017; 74:68-73. [PMID: 27395399 DOI: 10.1016/j.jsurg.2016.06.009] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Received: 03/28/2016] [Revised: 05/09/2016] [Accepted: 06/07/2016] [Indexed: 06/06/2023]
Abstract
OBJECTIVE Assessment tools that accrue data for the Accreditation Council for Graduate Medical Education Milestones must evaluate residents across multiple dimensions, including medical knowledge, procedural skills, teaching, and professionalism. Our objectives were to: (1) develop an assessment tool to evaluate resident performance in accordance with the Milestones and (2) review trends in resident achievements during the inaugural year of Milestone implementation. DESIGN A novel venue- and postgraduate year (PGY)-specific assessment tool was built, tested, and implemented for both the operating room and labor and delivery venues. Residents' development of competence and independence was captured over time. To account for variable rotation schedules, the year was divided into thirds and compared using a two-tailed Fisher's exact test. SETTING Brigham and Women's Hospital and Massachusetts General Hospital, Boston, MA. PARTICIPANTS Faculty evaluators and obstetrics and gynecology residents. RESULTS A total of 822 assessments of 44 residents were completed between 9/2014 and 6/2015. The percentage of labor and delivery tasks completed "independently" increased monotonically across the start of all years: 8.4% for PGY-1, 60.3% for PGY-2, 73.7% for PGY-3, and 87.5% for PGY-4. Assessments of PGY-1 residents demonstrated a significant shift toward "with minimal supervision" and "independent" for the management of normal labor (p = 0.03). PGY-3 residents demonstrated an increase in "able to be primary surgeon" in the operating room, from 36% of the time in the first 2/3 of the year to 62.3% in the last 1/3 (p < 0.01). CONCLUSION Assessment tools developed to assist with Milestone assignments capture the growth of residents over time and demonstrate quantifiable differences in achievements between PGY classes. These tools will allow targeted teaching opportunities for both individual residents and residency programs.
Affiliation(s)
- Randi H Goldman
- Department of Obstetrics, Gynecology and Reproductive Biology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts.
- Ruth E Tuomala
- Department of Obstetrics, Gynecology and Reproductive Biology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts; Vincent Department of Obstetrics and Gynecology, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts
- Joan M Bengtson
- Department of Obstetrics, Gynecology and Reproductive Biology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Amy R Stagg
- Vincent Department of Obstetrics and Gynecology, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts
|
26
|
Ponton-Carss A, Kortbeek JB, Ma IW. Assessment of technical and nontechnical skills in surgical residents. Am J Surg 2016; 212:1011-1019. [DOI: 10.1016/j.amjsurg.2016.03.005] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Received: 10/05/2015] [Revised: 02/24/2016] [Accepted: 03/22/2016] [Indexed: 01/03/2023]
|
27
|
Meier AH, Gruessner A, Cooney RN. Using the ACGME Milestones for Resident Self-Evaluation and Faculty Engagement. J Surg Educ 2016; 73:e150-e157. [PMID: 27886973 DOI: 10.1016/j.jsurg.2016.09.001] [Citation(s) in RCA: 47] [Impact Index Per Article: 5.9] [Received: 05/19/2016] [Revised: 09/01/2016] [Accepted: 09/04/2016] [Indexed: 06/06/2023]
Abstract
BACKGROUND Since July 2014, general surgery residency programs have been required to use the Accreditation Council for Graduate Medical Education milestones twice annually to assess the progress of their trainees. We felt this change was a great opportunity to use this new evaluation tool for resident self-assessment and to further engage the faculty in the educational efforts of the program. METHODS We piloted the milestones with postgraduate year (PGY) II and IV residents during the 2013/2014 academic year to get faculty and residents acquainted with the instrument. In July 2014, we implemented the same protocol for all residents. Residents meet with their advisers quarterly; two of these meetings are used for milestones assessment. The resident performs an independent self-evaluation, and the adviser grades the resident independently. They discuss the evaluations, focusing mainly on areas of greatest disagreement. The faculty member then presents the resident to the clinical competency committee (CCC), which decides on the final scores and submits them to the Accreditation Council for Graduate Medical Education website. We stored all records anonymously in a MySQL database. We used ANOVA with Tukey post hoc analysis to evaluate differences between groups, and intraclass correlation coefficients (ICC) and Krippendorff's α to assess interrater reliability. RESULTS We analyzed evaluations for 44 residents. We created scale scores across all Likert items for each evaluation and compared score differences by PGY level and by rater (self, adviser, and CCC). We found highly significant increases in scores between most PGY levels (p < 0.05). There were no significant score differences per PGY level between the raters. The interrater reliability for the total score and 6 competency domains was very high (ICC: 0.87-0.98 and α: 0.84-0.97). Even though this milestone evaluation process added work for residents and faculty, we had very good participation (93.9% by residents and 92.9% by faculty) and feedback was generally positive. CONCLUSION Even though implementation of the milestones has added work for general surgery residency programs, it has also opened opportunities to further engage residents in reflection and self-evaluation and to create additional venues for faculty to get involved with the educational process within the residency program. Using the adviser as the initial rater seems to correlate closely with the final CCC assessment. Self-evaluation by the resident is a requirement of the RRC, and the milestones seem to be a good instrument for this purpose. Our early assessment suggests the milestones provide a useful instrument to track trainee progression through residency.
Affiliation(s)
- Andreas H Meier
- Department of Surgery, Education Office, Upstate Medical University, Syracuse, New York
- Angelika Gruessner
- Department of Surgery, Education Office, Upstate Medical University, Syracuse, New York
- Robert N Cooney
- Department of Surgery, Education Office, Upstate Medical University, Syracuse, New York

28
Genovese B, Yin S, Sareh S, Devirgilio M, Mukdad L, Davis J, Santos VJ, Benharash P. Surgical Hand Tracking in Open Surgery Using a Versatile Motion Sensing System: Are We There Yet? Am Surg 2016; 82:872-875. [DOI: 10.1177/000313481608201002] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
With changes in work hour limitations, there is an increasing need for objective determination of technical proficiency. Electromagnetic hand-motion analysis has previously shown only time to completion and number of movements to correlate with expertise. The present study was undertaken to evaluate the efficacy of hand-motion-tracking analysis in determining surgical skill proficiency. A nine-degree-of-freedom sensor was mounted on the superior aspect of a needle driver. Four Novices, four Trainees, and three Experts performed a large vessel patch anastomosis on a phantom tissue. Path length, total number of movements, absolute velocity, and total time were analyzed between groups; a one-way analysis of variance and Welch's t test were performed to evaluate significance between subjects. Compared with the Novices, Expert subjects exhibited significantly decreased total number of movements, decreased instrument path length, and decreased total time to complete tasks. There were no significant differences in absolute velocity between groups. In this pilot study, we identified significant differences in patterns of motion between Novice and Expert subjects. These data warrant further analysis of their predictive value in larger cohorts at different levels of training and may prove useful in competence-based training paradigms in the future.
Affiliation(s)
- Bradley Genovese
- Division of Cardiac Surgery, University of California at Los Angeles, Los Angeles, California
- Center for Advanced Surgical and Interventional Technology, University of California at Los Angeles, Los Angeles, California
- Steven Yin
- Electrical Engineering Department, University of California at Los Angeles, Los Angeles, California
- Sohail Sareh
- Division of Cardiac Surgery, University of California at Los Angeles, Los Angeles, California
- Michael Devirgilio
- Division of Cardiac Surgery, University of California at Los Angeles, Los Angeles, California
- Laith Mukdad
- Division of Cardiac Surgery, University of California at Los Angeles, Los Angeles, California
- Jessica Davis
- Division of Cardiac Surgery, University of California at Los Angeles, Los Angeles, California
- Veronica J. Santos
- Center for Advanced Surgical and Interventional Technology, University of California at Los Angeles, Los Angeles, California
- Mechanical and Aerospace Engineering Department, University of California at Los Angeles, Los Angeles, California
- Peyman Benharash
- Division of Cardiac Surgery, University of California at Los Angeles, Los Angeles, California
- Center for Advanced Surgical and Interventional Technology, University of California at Los Angeles, Los Angeles, California

29
Klamen DL, Williams RG, Roberts N, Cianciolo AT. Competencies, milestones, and EPAs - Are those who ignore the past condemned to repeat it? MEDICAL TEACHER 2016; 38:904-910. [PMID: 26805785 DOI: 10.3109/0142159x.2015.1132831] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
BACKGROUND The idea of competency-based education sounds great on paper. Who wouldn't argue for a standardized set of performance-based assessments to assure competency in graduating students and residents? Even so, conceptual concerns have already been raised about this new system, and no evidence has yet refuted them. AIMS We argue that practical concerns deserve equal consideration, and present evidence strongly suggesting these concerns should be taken seriously. METHOD Specifically, we share two historical examples that illustrate what happened in two disparate contexts (K-12 education and the Department of Defense [DOD]) when competency-based (or outcomes-based) assessment frameworks were implemented. We then examine the current state of observation and assessment of clinical performance in medical schools and residencies, since these methodologies will be challenged to a greater degree by expansive lists of competencies and milestones. RESULTS/CONCLUSIONS We conclude with suggestions for a way forward, because the assessment of competency and the ability to guarantee that graduates are ready for medical careers are clearly of utmost importance. Hopefully the headlong rush to competencies, milestones, and core entrustable professional activities can be tempered before even more time, effort, frustration, and resources are invested in an endeavor that history suggests will collapse under its own weight.
30
Abstract
The Accreditation Council for Graduate Medical Education requires that residency programs teach and assess trainees in six core competencies. Assessments are imperative to determine trainee competence and to ensure that excellent care is provided to all patients. A structured, direct observation program is feasible for assessing nontechnical core competencies and providing trainees with immediate constructive feedback. Direct observation of residents in the outpatient setting by trained faculty allows assessment of each core competency. Checklists are used to document residents' basic communication skills, clinical reasoning, physical examination methods, and medical record keeping. Faculty concerns regarding residents' professionalism, medical knowledge, fatigue, or ability to self-assess are tracked. Serial observations allow for the reinforcement and/or monitoring of skills and attitudes identified as needing improvement. Residents who require additional coaching are identified early in training. Progress in educational milestones is recorded, allowing an individualized educational program that ensures that future orthopaedic surgeons excel across all domains of medical and surgical competence.
31
Abstract
The current graduate medical education environment requires plastic surgery to accept those contemporary pressures that cannot be substantially modified and to address those that can be successfully met. Doing so implies an examination of conference didactics, intraoperative teaching, and a valid assessment of resident performance.
32
Wali E, Pinto JM, Cappaert M, Lambrix M, Blood AD, Blair EA, Small SD. Teaching professionalism in graduate medical education: What is the role of simulation? Surgery 2016; 160:552-64. [PMID: 27206333 DOI: 10.1016/j.surg.2016.03.026] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2016] [Accepted: 03/11/2016] [Indexed: 11/28/2022]
Abstract
BACKGROUND We systematically reviewed the literature concerning simulation-based teaching and assessment of the Accreditation Council for Graduate Medical Education professionalism competencies to elucidate best practices and facilitate further research. METHODS A systematic review of English literature for "professionalism" and "simulation(s)" yielded 697 abstracts. Two independent raters chose abstracts that (1) focused on graduate medical education, (2) described the simulation method, and (3) used simulation to train or assess professionalism. Fifty abstracts met the criteria, and seven were excluded for lack of relevant information. The raters, 6 professionals with medical education, simulation, and clinical experience, discussed 5 of these articles as a group; they calibrated coding and applied further refinements, resulting in a final, iteratively developed evaluation form. The raters then divided into 2 teams to read and assess the remaining articles. Overall, 15 articles were eliminated, and 28 articles underwent final analysis. RESULTS Papers addressed a heterogeneous range of professionalism content via multiple methods. Common specialties represented were surgery (46.4%), pediatrics (17.9%), and emergency medicine (14.3%). Sixteen articles (57%) referenced a professionalism framework; 14 (50%) incorporated an assessment tool; and 17 (60.7%) reported debriefing participants, though in limited detail. Twenty-three (82.1%) articles evaluated programs, mostly using subjective trainee reports. CONCLUSION Despite early innovation, reporting of simulation-based professionalism training and assessment is nonstandardized in methods and terminology and lacks the details required for replication. We offer minimum standards for reporting of future professionalism-focused simulation training and assessment as well as a basic framework for better mapping proper simulation methods to the targeted domain of professionalism.
Affiliation(s)
- Eisha Wali
- The University of Chicago, Chicago, IL; Case Western Reserve University, Cleveland, OH
- Angela D Blood
- The University of Chicago, Chicago, IL; Rush University, Chicago, IL

33
Discussion: Competency, the NAS, and the Milestones: Sea Change or Deluge? Plast Reconstr Surg 2016; 137:1334-1336. [PMID: 27018689 DOI: 10.1097/prs.0000000000002033] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
34
Touchie C, ten Cate O. The promise, perils, problems and progress of competency-based medical education. MEDICAL EDUCATION 2016; 50:93-100. [PMID: 26695469 DOI: 10.1111/medu.12839] [Citation(s) in RCA: 112] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/17/2015] [Revised: 04/22/2015] [Accepted: 07/24/2015] [Indexed: 05/11/2023]
Abstract
CONTEXT Competency-based medical education (CBME) is being adopted wholeheartedly by organisations worldwide in the hope of meeting today's expectations for training a competent doctor. But are we, as medical educators, fulfilling this promise? METHODS The authors explore, through a personal viewpoint, the problems identified with CBME and the progress made through the development of milestones and entrustable professional activities (EPAs). RESULTS Proponents of CBME have strong reasons to keep developing and supporting this broad movement in medical education. Critics, however, have legitimate reservations. The authors observe that the recent increase in use of milestones and EPAs can strengthen the purpose of CBME and counter some of the concerns voiced, if properly implemented. CONCLUSIONS The authors conclude with suggestions for the future and how using EPAs could lead us one step closer to the goals of not only competency-based medical education but also competency-based medical practice.
Affiliation(s)
- Claire Touchie
- Medical Council of Canada, Ottawa, Ontario, Canada
- University of Ottawa, Ottawa, Ontario, Canada
- Olle ten Cate
- Center for Research and Development of Education, University Medical Center, Utrecht, the Netherlands

35
Holmboe ES. Realizing the promise of competency-based medical education. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2015; 90:411-3. [PMID: 25295967 DOI: 10.1097/acm.0000000000000515] [Citation(s) in RCA: 123] [Impact Index Per Article: 13.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/12/2023]
Abstract
Competency-based medical education (CBME) places a premium on both educational and clinical outcomes. The Milestones component of the Next Accreditation System represents a fundamental change in medical education in the United States and is part of the drive to realize the full promise of CBME. The Milestones framework provides a descriptive blueprint in each specialty to guide curriculum development and assessment practices. From the beginning of the Outcomes project in 1999, the Accreditation Council for Graduate Medical Education and the larger medical education community recognized the importance of improving their approach to assessment. Work-based assessments, which rely heavily on the observations and judgments of clinical faculty, are central to a competency-based approach. The direct observation of learners and the provision of robust feedback have always been recognized as critical components of medical education, but CBME systems further elevate their importance. Without effective and frequent direct observation, coaching, and feedback, the full potential of CBME and the Milestones cannot be achieved. Furthermore, simply using the Milestones as end-of-rotation evaluations to "check the box" to meet requirements undermines the intent of an outcomes-based accreditation system. In this Commentary, the author explores these challenges, addressing the concerns raised by Williams and colleagues in their Commentary. Meeting the assessment challenges of the Milestones will require a renewed commitment from institutions to meet the profession's "special obligations" to patients and learners. All stakeholders in graduate medical education must commit to a professional system of self-regulation to prepare highly competent physicians to fulfill this social contract.
Affiliation(s)
- Eric S Holmboe
- Dr. Holmboe is senior vice president, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois

36
Sklar DP. Competencies, milestones, and entrustable professional activities: what they are, what they could be. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2015; 90:395-7. [PMID: 25803411 DOI: 10.1097/acm.0000000000000659] [Citation(s) in RCA: 30] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
37