1
Alaish SM, Arca MJ, Bucher BT, Cooney C, Diesen DL, Ehrlich PF, Gaines BA, Griggs CL, Javid PJ, Krishnaswami S, Middlesworth W, Wong CL, Edgar L. Pediatric surgery milestones 2.0: A primer. J Pediatr Surg 2022; 57:845-851. [PMID: 35649748] [DOI: 10.1016/j.jpedsurg.2022.04.021]
Abstract
More than twenty years ago, the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties began converting graduate medical education from a structure- and process-based model to a competency-based framework. The educational outcomes assessment tool known as the Milestones was introduced in 2013 for seven specialties and by 2015 for the remaining specialties, including pediatric surgery. Designed as an iterative process, with improvements over time based on feedback and evidence-based literature, the Milestones began the evolution from 1.0 to 2.0 in 2016. Work on Pediatric Surgery Milestones 2.0 began in 2019 and was finalized in 2021 for implementation in the 2022-2023 academic year. Milestones 2.0 contains fewer milestones, stated in more straightforward language, and incorporates the harmonized milestones: subcompetencies outside patient care and medical knowledge that are consistent across all medical and surgical specialties. A new Supplemental Guide lists examples, references, and links to other assessment tools and resources for each subcompetency. Milestones 2.0 represents a continuous process of feedback, literature review, and revision, with the goals of improving patient care and maintaining public trust in graduate medical education's ability to self-regulate. LEVEL OF EVIDENCE: V.
Affiliation(s)
- Samuel M Alaish
- Department of Surgery, Johns Hopkins University School of Medicine, 1800 Orleans Street, Room 7337, Baltimore, MD 21287, United States
- Marjorie J Arca
- Department of Surgery, University of Rochester School of Medicine and Dentistry, Rochester, NY, United States
- Brian T Bucher
- Department of Surgery, University of Utah School of Medicine, Salt Lake City, UT, United States
- Cathi Cooney
- Cooper University Hospital, Camden, NJ, United States
- Diana L Diesen
- Department of Surgery, University of Texas Southwestern, Dallas, TX, United States
- Peter F Ehrlich
- Department of Surgery, University of Michigan, Ann Arbor, MI, United States
- Barbara A Gaines
- Department of Surgery, University of Pittsburgh, Pittsburgh, PA, United States
- Patrick J Javid
- Department of Surgery, University of Washington, Seattle, WA, United States
- Sanjay Krishnaswami
- Department of Surgery, Oregon Health and Science University, Portland, OR, United States
- Cynthia L Wong
- Department of Dentistry, University of Rochester School of Medicine and Dentistry, Rochester, NY, United States
- Laura Edgar
- Milestones Development, Accreditation Council for Graduate Medical Education, United States
2
Boland RJ, Dingle AD, Travis MJ, Osborne LM, Shapiro MA, Madaan V, Ahmed I. Using the Psychiatry Resident-In-Training Examination (PRITE) to Assess the Psychiatry Medical Knowledge Milestones in Psychiatry. Acad Psychiatry 2022; 46:331-337. [PMID: 34623622] [DOI: 10.1007/s40596-021-01537-5]
Abstract
OBJECTIVE The introduction of the Milestone Project underscored the need for objective assessments of resident progress across the competencies. The authors therefore examined the utility of the Psychiatry Resident-In-Training Examination (PRITE) for measuring improvements in medical knowledge (MK). METHODS The authors compared the mean performance on each MK subcompetency by resident year for all residents taking the PRITE from 2015 to 2017 (18,175 examination administrations). In addition, they surveyed psychiatry residency program directors about how well they thought they taught these subcompetencies. RESULTS Increases in MK subcompetencies by resident year were significant for Psychopathology (p < 0.003), Psychotherapy (p < 0.002), and Somatic Therapies (p < 0.001). Development, Clinical Neuroscience, and Practice of Psychiatry did not show statistically significant differences between postgraduate years. Eighty psychiatry program directors responded to the survey and felt optimistic about their ability to teach the Psychopathology, Psychotherapy, Somatic Therapies, and Practice of Psychiatry subcompetencies. CONCLUSIONS The PRITE measured significant improvements in medical knowledge for several of the core subcompetencies. The program directors' responses suggest that the lack of statistically significant differences for Development and Clinical Neuroscience reflects areas in need of curricular development. The disparity between PRITE performance and program director perception for the Practice of Psychiatry subcompetency may reflect difficulties in defining the scope of this subcompetency. Overall, this suggests that structured examinations help measure improvement in certain subcompetencies and may also help identify curricular needs, although the definitions of some subcompetencies may be problematic.
Affiliation(s)
- Robert J Boland
- Baylor College of Medicine and the Menninger Clinic, Houston, TX, USA
- Arden D Dingle
- University of Nevada, Reno School of Medicine, Reno, NV, USA
- Michael J Travis
- University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
- Vishal Madaan
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Iqbal Ahmed
- Tripler Army Medical Center Psychiatry Residency Program, Honolulu, HI, USA
3
Hauer KE, Edgar L, Hogan SO, Kinnear B, Warm E. The Science of Effective Group Process: Lessons for Clinical Competency Committees. J Grad Med Educ 2021; 13:59-64. [PMID: 33936534] [PMCID: PMC8078081] [DOI: 10.4300/jgme-d-20-00827.1]
Affiliation(s)
- Karen E. Hauer
- Karen E. Hauer, MD, PhD, is Associate Dean, Competency Assessment and Professional Standards, and Professor of Medicine, University of California, San Francisco
- Laura Edgar
- Laura Edgar, EdD, CAE, is Vice President, Milestones Development, Accreditation Council for Graduate Medical Education (ACGME)
- Sean O. Hogan
- Sean O. Hogan, PhD, is Director, Outcomes Research and Evaluation, ACGME
- Benjamin Kinnear
- Benjamin Kinnear, MD, MEd, is Associate Professor of Internal Medicine and Pediatrics, Department of Pediatrics, University of Cincinnati College of Medicine
- Eric Warm
- Eric Warm, MD, is Program Director, Internal Medicine, Department of Medicine, University of Cincinnati College of Medicine
4
St-Onge C, Vachon Lachiver É, Langevin S, Boileau E, Bernier F, Thomas A. Lessons from the implementation of developmental progress assessment: A scoping review. Med Educ 2020; 54:878-887. [PMID: 32083743] [DOI: 10.1111/medu.14136]
Abstract
OBJECTIVES Educators and researchers recently implemented developmental progress assessment (DPA) in the context of competency-based education. To reap its anticipated benefits, much remains to be understood about its implementation. In this study, we aimed to determine the nature and extent of the current evidence on DPA, in an effort to broaden our understanding of its major goals and intended outcomes as well as the lessons learned from how it has been executed in, or applied across, educational contexts. METHODS We conducted a scoping study based on the methodology of Arksey and O'Malley. Our search strategy yielded 2494 articles. These articles were screened for inclusion and exclusion (90% agreement), and numerical and qualitative data were extracted from 56 articles based on a pre-defined set of charting categories. The thematic analysis of the qualitative data was completed with iterative consultations and discussions until consensus was achieved on the interpretation of the results. RESULTS Tools used to document DPA include scales, milestones and portfolios. Performances were observed in clinical or standardised contexts. We identified seven major themes in our qualitative thematic analysis: (a) underlying aims of DPA; (b) sources of information; (c) barriers; (d) contextual factors that can act as barriers or facilitators to the implementation of DPA; (e) facilitators; (f) observed outcomes, and (g) documented validity evidence. CONCLUSIONS Developmental progress assessment seems to fill a need in the training of future competent health professionals. However, in moving forward with widespread implementation of DPA, factors such as lack of access to user-friendly technology and limited time to observe performance may render its operationalisation burdensome in the context of competency-based medical education.
Affiliation(s)
- Christina St-Onge
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Élise Vachon Lachiver
- Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Serge Langevin
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Elisabeth Boileau
- Department of Family and Emergency Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Frédéric Bernier
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Research Center - Sherbrooke University Hospital Center (CHUS), Integrated Health and Social Service Centers (CISSS) and Integrated University Health and Social Service Centres (CIUSSS), Sherbrooke, Québec, Canada
- Aliki Thomas
- School of Physical and Occupational Therapy, McGill University, Montreal, Québec, Canada
5
Krueger CA, Rivera JC, Bhullar PS, Osborn PM. Developing a Novel Scoring System to Objectively Track Orthopaedic Resident Educational Performance and Progression. J Surg Educ 2020; 77:454-460. [PMID: 31889688] [DOI: 10.1016/j.jsurg.2019.09.009]
Abstract
OBJECTIVE Objectively determining orthopedic resident competence remains difficult and lacks standardization across residency programs. We sought to develop a scoring system that measures resident educational activity, stratifies participation and performance in particular aspects of training, and relates these measures to board certification. DESIGN A weighted scoring system (Average Resident Score, ARS) was developed using the number of logged cases, clinic notes dictated, OITE PGY percentile, case minimums met, and scholarly activity completed each academic year (AY), with clinical activity weighted most heavily. The Resident Effectiveness Score (RES), a z-score expressing the number of standard deviations from the mean, was derived from the ARS. The effect of the RES on Accreditation Council for Graduate Medical Education (ACGME) Milestones and American Board of Orthopaedic Surgery (ABOS) Part 1 percentile scores was assessed with a Spearman correlation. SETTING Large academic orthopedic residency. PARTICIPANTS Thirty-one orthopedic residents graduating between 2011 and 2016 were included. RESULTS The RES did not differ between classes in the same AY, nor did it change significantly for individual residents during their training. Milestone z-scores increased as residents progressed in their education. The RES correlated with each Milestone competency subscore. The PGY5 OITE score and achieving ACGME minimums correlated with passing ABOS Part 1 (28/31 first-time pass), but the RES did not predict passing the board examination. CONCLUSIONS This study demonstrates a scoring system encompassing multiple facets of resident education to track resident activity and progress. The RES can be tailored to an individual program's goals and can help program directors identify residents who are not maximizing educational opportunities compared with their peers; monitoring this score may allow educational efforts to be tailored to individual resident needs. The RES may also allow residents to measure their own performance and educational accomplishments and adjust their focus to attain competence and board certification.
Affiliation(s)
- Chad A Krueger
- San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
- Jessica C Rivera
- San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
- Preetinder S Bhullar
- San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
- Patrick M Osborn
- San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
6
Santen SA, Yamazaki K, Holmboe ES, Yarris LM, Hamstra SJ. Comparison of Male and Female Resident Milestone Assessments During Emergency Medicine Residency Training: A National Study. Acad Med 2020; 95:263-268. [PMID: 31517688] [PMCID: PMC7004441] [DOI: 10.1097/acm.0000000000002988]
Abstract
PURPOSE A previous study found that milestone ratings at the end of training were higher for male than for female residents in emergency medicine (EM). However, that study was restricted to a sample of 8 EM residency programs and used individual faculty ratings from milestone reporting forms that were designed for use by the program's Clinical Competency Committee (CCC). The objective of this study was to investigate whether similar results would be found when examining the entire national cohort of EM milestone ratings reported by programs after CCC consensus review. METHOD This study examined longitudinal milestone ratings for all EM residents (n = 1,363; 125 programs) reported to the Accreditation Council for Graduate Medical Education every 6 months from 2014 to 2017. A multilevel linear regression model was used to estimate differences in slope for all subcompetencies, and predicted marginal means between genders were compared at time of graduation. RESULTS There were small but statistically significant differences between males' and females' increase in ratings from initial rating to graduation on 6 of the 22 subcompetencies. Marginal mean comparisons at time of graduation demonstrated gender effects for 4 patient care subcompetencies. For these subcompetencies, males were rated as performing better than females; differences ranged from 0.048 to 0.074 milestone ratings. CONCLUSIONS In this national dataset of EM resident milestone assessments by CCCs, males and females were rated similarly at the end of their training for the majority of subcompetencies. Statistically significant but small absolute differences were noted in 4 patient care subcompetencies.
Affiliation(s)
- Sally A. Santen
- S.A. Santen is professor and senior associate dean, Virginia Commonwealth University School of Medicine, Richmond, Virginia; ORCID: https://orcid.org/0000-0002-8327-8002
- Kenji Yamazaki
- K. Yamazaki is senior analyst, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Eric S. Holmboe
- E.S. Holmboe is chief research, milestones development and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- Lalena M. Yarris
- L.M. Yarris is professor and residency director, Department of Emergency Medicine, Oregon Health & Science University, Portland, Oregon
- Stanley J. Hamstra
- S.J. Hamstra is vice president, Department of Research, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; adjunct professor, Faculty of Education, University of Ottawa, Ottawa, Ontario, Canada; and adjunct professor, Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-0680-366X
7
Kinnear B, Warm EJ, Hauer KE. Twelve tips to maximize the value of a clinical competency committee in postgraduate medical education. Med Teach 2018; 40:1110-1115. [PMID: 29944025] [DOI: 10.1080/0142159x.2018.1474191]
Abstract
Medical education has shifted to a competency-based paradigm, leading to calls for improved learner assessment methods and validity evidence for how assessment data are interpreted. Clinical competency committees (CCCs) use the collective input of multiple people to improve the validity and reliability of decisions made and actions taken based on assessment data. Significant heterogeneity in CCC structure and function exists across postgraduate medical education programs and specialties, and while there is no "one-size-fits-all" approach, there are ways to maximize value for learners and programs. This paper collates available evidence and the authors' experiences to provide practical tips on CCC purpose, membership, processes, and outputs. These tips can benefit programs looking to start a CCC and those that are improving their current CCC processes.
Affiliation(s)
- Benjamin Kinnear
- Internal Medicine and Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric J Warm
- Richard W. Vilter Professor of Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Karen E Hauer
- Medicine, University of California, San Francisco School of Medicine, San Francisco, CA, USA
8
Ekpenyong A, Baker E, Harris I, Tekian A, Abrams R, Reddy S, Park YS. How do clinical competency committees use different sources of data to assess residents' performance on the internal medicine milestones? A mixed methods pilot study. Med Teach 2017; 39:1074-1083. [PMID: 28738746] [DOI: 10.1080/0142159x.2017.1353070]
Abstract
PURPOSE This study examines how Clinical Competency Committees (CCCs) synthesize assessment data to make judgments about residents' clinical performance. METHODS Between 2014 and 2015, after four six-month reporting periods to the Accreditation Council for Graduate Medical Education (ACGME), 7 of 16 CCC faculty at Rush University Medical Center completed questionnaires about their perspectives on rating residents' achievement of the milestones and participated in a focus group. Qualitative data were analyzed using grounded theory. Milestone ratings for two six-month ACGME reporting cycles (n = 100 categorical residents) were also analyzed. RESULTS In making judgments about learners' milestone levels, CCC members weighted resident rotation ratings highest (weight = 37%), followed by faculty rotation comments (weight = 27%) and personal experience with residents (weight = 14%). Three assessment issues were identified from the qualitative analyses: (1) "design issues" (e.g. problems with available data, or lack thereof); (2) "synthesis issues" (e.g. factors influencing ratings and decision-making processes); and (3) "impact issues" (e.g. how CCC-generated milestone ratings are used). CONCLUSIONS Identifying factors that affect assessment at all stages of the CCC process can contribute to improving assessment systems, including support for faculty development for CCCs. Recognizing the challenges of synthesizing first- and second-hand assessment data is an important step in understanding the CCC decision-making process.
Affiliation(s)
- Andem Ekpenyong
- Internal Medicine, Rush University Medical Center, Chicago, IL, USA
- Elizabeth Baker
- Internal Medicine, Rush University Medical Center, Chicago, IL, USA
- Ilene Harris
- Medical Education, University of Illinois at Chicago, Chicago, IL, USA
- Ara Tekian
- Medical Education, University of Illinois at Chicago, Chicago, IL, USA
- Richard Abrams
- Internal Medicine, Rush University Medical Center, Chicago, IL, USA
- Shalini Reddy
- Internal Medicine, University of Chicago Pritzker School of Medicine, Chicago, IL, USA
- Yoon Soo Park
- Medical Education, University of Illinois at Chicago, Chicago, IL, USA