1
Cheung WJ, Wagner N, Frank JR, Oswald A, Van Melle E, Skutovich A, Dalseg TR, Cooke LJ, Hall AK. Implementation of competence committees during the transition to CBME in Canada: A national fidelity-focused evaluation. Med Teach 2022; 44:781-789. [PMID: 35199617] [DOI: 10.1080/0142159X.2022.2041191]
Abstract
PURPOSE This study evaluated the fidelity of competence committee (CC) implementation in Canadian postgraduate specialist training programs during the transition to competency-based medical education (CBME). METHODS A national survey of CC chairs was distributed to all CBME training programs in November 2019. Survey questions were derived from guiding documents published by the Royal College of Physicians and Surgeons of Canada reflecting intended processes and design. RESULTS Response rate was 39% (113/293) with representation from all eligible disciplines. Committee size ranged from 3 to 20 members, 42% of programs included external members, and 20% included a resident representative. Most programs (72%) reported that a primary review and synthesis of resident assessment data occurs prior to the meeting, with some data reviewed collectively during meetings. When determining entrustable professional activity (EPA) achievement, most programs followed the national specialty guidelines closely with some exceptions (53%). Documented concerns about professionalism, EPA narrative comments, and EPA entrustment scores were most highly weighted when determining resident progress decisions. CONCLUSIONS Heterogeneity in CC implementation likely reflects local adaptations, but may also explain some of the variable challenges faced by programs during the transition to CBME. Our results offer educational leaders important fidelity data that can help inform the larger evaluation and transformation of CBME.
Affiliation(s)
- Warren J Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Natalie Wagner
- Office of Professional Development & Educational Scholarship and Department of Biomedical & Molecular Sciences, Queen's University, Kingston, Canada
- Jason R Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, University of Alberta, Edmonton, Canada
- Elaine Van Melle
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Family Medicine, Queen's University, Kingston, Canada
- Timothy R Dalseg
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine, Division of Emergency Medicine, University of Toronto, Toronto, Canada
- Lara J Cooke
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Clinical Neurosciences, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Andrew K Hall
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
2
Reimagining the Clinical Competency Committee to Enhance Education and Prepare for Competency-Based Time-Variable Advancement. J Gen Intern Med 2022; 37:2280-2290. [PMID: 35445932] [PMCID: PMC9021365] [DOI: 10.1007/s11606-022-07515-3]
Abstract
Assessing residents and clinical fellows is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Assessment has historically been underemphasized, although medical education oversight organizations have strengthened requirements in recent years. Growing acceptance of competency-based medical education and its logical extension to competency-based time-variable (CB-TV) graduate medical education (GME) further highlights the importance of implementing effective evidence-based approaches to assessment. The Clinical Competency Committee (CCC) has emerged as a key programmatic structure in graduate medical education. In the context of launching a multi-specialty pilot of CB-TV GME in our health system, we have examined several programs' CCC processes and reviewed the relevant literature to propose enhancements to CCCs. We recommend that all CCCs fulfill three core goals, regularly applied to every GME trainee: (1) discern and describe the resident's developmental status to individualize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We integrate the literature and observations from GME program CCCs in our institutions to evaluate how current CCC processes support or undermine these goals. Obstacles and key enablers are identified. Finally, we recommend ways to achieve the stated goals, including the following: (1) assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; (2) strengthen CCC assessment processes to determine trainee readiness for independent practice; and (3) promote trainee reflection and informed self-assessment. The importance of coaching for competency, robust workplace-based assessments, feedback, and co-production of individualized learning plans is emphasized. Individual programs and their CCCs must strengthen assessment tools and frameworks to realize the potential of competency-oriented education.
3
Acai A, Cupido N, Weavers A, Saperson K, Ladhani M, Cameron S, Sonnadara RR. Competence committees: The steep climb from concept to implementation. Med Educ 2021; 55:1067-1077. [PMID: 34152027] [DOI: 10.1111/medu.14585]
Abstract
INTRODUCTION Competence committees (CCs) are groups of educators tasked with reviewing resident progress throughout their training, making decisions regarding the achievement of Entrustable Professional Activities and recommendations regarding promotion and remediation. CCs have been mandated as part of competency-based medical education programmes worldwide; however, there has yet to be a thorough examination of the implementation challenges they face and how this impacts their functioning and decision-making processes. This study examined CC implementation at a Canadian institution, documenting the shared and unique challenges that CCs faced and overcame over a 3-year period. METHODS This study consisted of three phases, which were conceptually and analytically linked using Moran-Ellis and colleagues' notion of 'following a thread.' Phase 1 examined the early perceptions and experiences of 30 key informants using a survey and semi-structured interviews. Phase 2 provided insight into CCs' operations through a survey sent to 35 CC chairs 1-year post-implementation. Phase 3 invited 20 CC members to participate in semi-structured interviews to follow up on initial themes 2 years post-implementation. Detailed observation notes from 16 CC meetings across nine disciplines were used to corroborate the findings from each phase. RESULTS Response rates in each phase were 83% (n = 25), 43% (n = 15) and 60% (n = 12), respectively. Despite the high degree of support for CCs among faculty and resident members, several ongoing challenges were highlighted: adapting to programme size, optimising membership, engaging residents, maintaining capacity among members, sharing and aggregating data and developing a clear mandate. DISCUSSION Findings of this study reinforce the importance of resident engagement and information sharing between disciplines. Challenges faced by CCs are discussed in relation to the existing literature to inform a better understanding of group decision-making processes in medical education. Future research could compare implementation practices across sites and explore which adaptations lead to better or worse decision-making outcomes.
Affiliation(s)
- Anita Acai
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Office of Education Science, Department of Surgery, McMaster University, Hamilton, ON, Canada
- Department of Psychiatry and Behavioural Neurosciences, McMaster University, Hamilton, ON, Canada
- Nathan Cupido
- Office of Education Science, Department of Surgery, McMaster University, Hamilton, ON, Canada
- Aliana Weavers
- Office of Education Science, Department of Surgery, McMaster University, Hamilton, ON, Canada
- Karen Saperson
- Department of Psychiatry and Behavioural Neurosciences, McMaster University, Hamilton, ON, Canada
- Moyez Ladhani
- McMaster Postgraduate Medical Education Office, McMaster University, Hamilton, ON, Canada
- Department of Pediatrics, McMaster University, Hamilton, ON, Canada
- Sharon Cameron
- McMaster Postgraduate Medical Education Office, McMaster University, Hamilton, ON, Canada
- Ranil R Sonnadara
- Office of Education Science, Department of Surgery, McMaster University, Hamilton, ON, Canada
- Department of Surgery, University of Toronto, Toronto, ON, Canada
4
Abstract
There are myriad types of problem learners in surgical residency and most have difficulty in more than 1 competency. Programs that use a standard curriculum of study and assessment are most successful in identifying struggling learners early. Many problem learners lack appropriate systems for study; a multidisciplinary educational team that is separate from the team that evaluates the success of remediation is critical. Struggling residents who require formal remediation benefit from performance improvement plans that clearly outline the issues of concern, describe the steps required for remediation, define success of remediation, and outline consequences for failure to remediate appropriately.
Affiliation(s)
- Lilah F Morris-Wiseman
- University of Arizona, Department of Surgery, Division of Surgical Oncology, 1501 N. Campbell Avenue, PO Box 245058, Tucson, AZ 85724-5058, USA
- Valentine N Nfonsam
- University of Arizona, Department of Surgery, Division of Surgical Oncology, 1501 N. Campbell Avenue, PO Box 245058, Tucson, AZ 85724-5058, USA
5
Abdel-Razig S, Ling JOE, Harhara T, Smitasin N, Lum LHW, Ibrahim H. Challenges and Solutions in Running Effective Clinical Competency Committees in the International Context. J Grad Med Educ 2021; 13:70-74. [PMID: 33936536] [PMCID: PMC8078082] [DOI: 10.4300/jgme-d-20-00844.1]
Affiliation(s)
- Sawsan Abdel-Razig
- Sawsan Abdel-Razig, MD, MEHP, is Chair of Medical Education, Office of Academics, Cleveland Clinic Abu Dhabi, Abu Dhabi, UAE, and Clinical Associate Professor of Medicine, Cleveland Clinic Lerner College of Medicine, Case Western Reserve University
- Jolene Oon Ee Ling
- Jolene Oon Ee Ling, MBBCh BAO, is Consultant, Division of Infectious Disease, Program Director, Infectious Diseases Senior Residency Program, National University Hospital, Singapore, and Assistant Professor, Yong Loo Lin School of Medicine, National University of Singapore
- Thana Harhara
- Thana Harhara, MBBS, MSc, is Internal Medicine Residency Program Director, Sheikh Khalifa Medical City, Abu Dhabi, UAE
- Nares Smitasin
- Nares Smitasin, MD, is Senior Consultant, Division of Infectious Disease, Core Faculty, Infectious Diseases Senior Residency Program, National University Hospital, Singapore, and Assistant Professor, Yong Loo Lin School of Medicine, National University of Singapore
- Lionel HW Lum
- Lionel HW Lum, MBBS, MRCP, is Consultant, Division of Infectious Diseases, Core Faculty, Infectious Diseases Senior Residency Program, National University Hospital, Singapore, and Assistant Professor, Yong Loo Lin School of Medicine, National University of Singapore
- Halah Ibrahim
- Halah Ibrahim, MD, MEHP, is Consultant, Department of Medicine, Sheikh Khalifa Medical City, Abu Dhabi, UAE, and Adjunct Assistant Professor, Department of Medicine, Johns Hopkins University School of Medicine
6
Hauer KE, Edgar L, Hogan SO, Kinnear B, Warm E. The Science of Effective Group Process: Lessons for Clinical Competency Committees. J Grad Med Educ 2021; 13:59-64. [PMID: 33936534] [PMCID: PMC8078081] [DOI: 10.4300/jgme-d-20-00827.1]
Affiliation(s)
- Karen E. Hauer
- Karen E. Hauer, MD, PhD, is Associate Dean, Competency Assessment and Professional Standards, and Professor of Medicine, University of California, San Francisco
- Laura Edgar
- Laura Edgar, EdD, CAE, is Vice President, Milestones Development, Accreditation Council for Graduate Medical Education (ACGME)
- Sean O. Hogan
- Sean O. Hogan, PhD, is Director, Outcomes Research and Evaluation, ACGME
- Benjamin Kinnear
- Benjamin Kinnear, MD, MEd, is Associate Professor of Internal Medicine and Pediatrics, Department of Pediatrics, University of Cincinnati College of Medicine
- Eric Warm
- Eric Warm, MD, is Program Director, Internal Medicine, Department of Medicine, University of Cincinnati College of Medicine
7
Schumacher DJ, Martini A, Sobolewski B, Carraccio C, Holmboe E, Busari J, Poynter S, van der Vleuten C, Lingard L. Use of Resident-Sensitive Quality Measure Data in Entrustment Decision Making: A Qualitative Study of Clinical Competency Committee Members at One Pediatric Residency. Acad Med 2020; 95:1726-1735. [PMID: 32324637] [DOI: 10.1097/ACM.0000000000003435]
Abstract
PURPOSE Resident-sensitive quality measures (RSQMs) are quality measures that are likely performed by an individual resident and are important to care quality for a given illness of interest. This study sought to explore how individual clinical competency committee (CCC) members interpret, use, and prioritize RSQMs alongside traditional assessment data when making a summative entrustment decision. METHOD In this constructivist grounded theory study, 19 members of the pediatric residency CCC at Cincinnati Children's Hospital Medical Center were purposively and theoretically sampled between February and July 2019. Participants were provided a deidentified resident assessment portfolio with traditional assessment data (milestone and/or entrustable professional activity ratings as well as narrative comments from 5 rotations) and RSQM performance data for 3 acute, common diagnoses in the pediatric emergency department (asthma, bronchiolitis, and closed head injury) from the emergency medicine rotation. Data collection consisted of 2 phases: (1) observation and think out loud while participants reviewed the portfolio and (2) semistructured interviews to probe participants' reviews. Analysis moved from close readings to coding and theme development, followed by the creation of a model illustrating theme interaction. Data collection and analysis were iterative. RESULTS Five dimensions for how participants interpret, use, and prioritize RSQMs were identified: (1) ability to orient to RSQMs: confusing to self-explanatory, (2) propensity to use RSQMs: reluctant to enthusiastic, (3) RSQM interpretation: requires contextualization to self-evident, (4) RSQMs for assessment decisions: not sticky to sticky, and (5) expectations for residents: potentially unfair to fair to use RSQMs. The interactions among these dimensions generated 3 RSQM data user profiles: eager incorporation, willing incorporation, and disinclined incorporation. CONCLUSIONS Participants used RSQMs to varying extents in their review of resident data and found such data helpful to varying degrees, supporting the inclusion of RSQMs as resident assessment data for CCC review.
Affiliation(s)
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio
- Abigail Martini
- A. Martini is a clinical research coordinator, Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio
- Brad Sobolewski
- B. Sobolewski is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio
- Carol Carraccio
- C. Carraccio is vice president of competency-based assessment, American Board of Pediatrics, Chapel Hill, North Carolina
- Eric Holmboe
- E. Holmboe is senior vice president for milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Jamiu Busari
- J. Busari is associate professor of medical education, Maastricht University, Maastricht, The Netherlands
- Sue Poynter
- S. Poynter is professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio
- Cees van der Vleuten
- C. van der Vleuten is professor of education, Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, and scientific director, School of Health Professions Education, Maastricht University, Maastricht, The Netherlands
- Lorelei Lingard
- L. Lingard is professor and scientist, Department of Medicine, and director, Center for Education Research & Innovation, Schulich School of Medicine and Dentistry at Western University, London, Ontario, Canada
8
St-Onge C, Vachon Lachiver É, Langevin S, Boileau E, Bernier F, Thomas A. Lessons from the implementation of developmental progress assessment: A scoping review. Med Educ 2020; 54:878-887. [PMID: 32083743] [DOI: 10.1111/medu.14136]
Abstract
OBJECTIVES Educators and researchers recently implemented developmental progress assessment (DPA) in the context of competency-based education. To reap its anticipated benefits, much still remains to be understood about its implementation. In this study, we aimed to determine the nature and extent of the current evidence on DPA, in an effort to broaden our understanding of the major goals and intended outcomes of DPA as well as the lessons learned from how it has been executed in, or applied across, educational contexts. METHODS We conducted a scoping study based on the methodology of Arksey and O'Malley. Our search strategy yielded 2494 articles. These articles were screened for inclusion and exclusion (90% agreement), and numerical and qualitative data were extracted from 56 articles based on a pre-defined set of charting categories. The thematic analysis of the qualitative data was completed with iterative consultations and discussions until consensus was achieved for the interpretation of the results. RESULTS Tools used to document DPA include scales, milestones and portfolios. Performances were observed in clinical or standardised contexts. We identified seven major themes in our qualitative thematic analysis: (a) underlying aims of DPA; (b) sources of information; (c) barriers; (d) contextual factors that can act as barriers or facilitators to the implementation of DPA; (e) facilitators; (f) observed outcomes, and (g) documented validity evidences. CONCLUSIONS Developmental progress assessment seems to fill a need in the training of future competent health professionals. However, moving forward with a widespread implementation of DPA, factors such as lack of access to user-friendly technology and time to observe performance may render its operationalisation burdensome in the context of competency-based medical education.
Affiliation(s)
- Christina St-Onge
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Élise Vachon Lachiver
- Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Serge Langevin
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Elisabeth Boileau
- Department of Family and Emergency Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Frédéric Bernier
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Research Center - Sherbrooke University Hospital Center (CHUS), Integrated Health and Social Service Centers (CISSS) and Integrated University Health and Social Service Centres (CIUSSS), Sherbrooke, Québec, Canada
- Aliki Thomas
- School of Physical and Occupational Therapy, McGill University, Montreal, Québec, Canada
9
Pirie J, St. Amant L, Glover Takahashi S. Managing residents in difficulty within CBME residency educational systems: a scoping review. BMC Med Educ 2020; 20:235. [PMID: 32703231] [PMCID: PMC7376876] [DOI: 10.1186/s12909-020-02150-0]
Abstract
BACKGROUND Best practices in managing residents in difficulty (RID) in the era of competency-based medical education (CBME) are not well described. This scoping review aimed to inventory the current literature and identify major themes in the articles that address or employ CBME as part of the identification and remediation of residents in difficulty. METHODS Articles published between 2011 to 2017 were included if they were about postgraduate medical education, RID, and offered information to inform the structure and/or processes of CBME. All three reviewers performed a primary screening, followed by a secondary screening of abstracts of the chosen articles, and then a final comprehensive sub-analysis of the 11 articles identified as using a CBME framework. RESULTS Of 165 articles initially identified, 92 qualified for secondary screening; the 63 remaining articles underwent full-text abstracting. Ten themes were identified from the content analysis with "identification of RID" (41%) and "defining and classifying deficiencies" (30%) being the most frequent. In the CBME article sub-analysis, the most frequent themes were: need to identify RID (64%), improving assessment tools (45%), and roles and responsibilities of players involved in remediation (27%). Almost half of the CBME articles were published in 2016-2017. CONCLUSIONS Although CBME programs have been implemented for many years, articles have only recently begun specifically addressing RID within a competency framework. Much work is needed to describe the sequenced progression, tailored learning experiences, and competency-focused instruction. Finally, future research should focus on the outcomes of remediation in CBME programs.
Affiliation(s)
- Jonathan Pirie
- Department of Pediatrics, Faculty of Medicine, University of Toronto, Toronto, Canada
- Paediatric Emergency Medicine, The Hospital for Sick Children, Toronto, Canada
- Lisa St. Amant
- Postgraduate Medical Education, Faculty of Medicine, University of Toronto, Toronto, Canada
- Susan Glover Takahashi
- Department of Family and Community Medicine, Faculty of Medicine, Integrated Senior Scholar – Centre for Faculty Development and Postgraduate Medical Education, University of Toronto, Toronto, Canada
10
Soleas E, Dagnone D, Stockley D, Garton K, van Wylick R. Developing Academic Advisors and Competence Committees members: A community approach to developing CBME faculty leaders. Can Med Educ J 2020; 11:e46-e56. [PMID: 32215142] [PMCID: PMC7082482] [DOI: 10.36834/cmej.68181]
Abstract
INTRODUCTION Implementing competency-based medical education (CBME) at the institutional level poses many challenges, including having to rapidly enable faculty to be facilitators and champions of a new curriculum which utilizes feedback, coaching, and models of programmatic assessment. This study presents the necessary competencies required for Academic Advisors (AA) and Competence Committee (CC) members, as identified in the literature and as perceived by faculty members at Queen's University. METHODS This study integrated a review of available literature (n=26) yielding competencies that were reviewed by the authors, followed by an external review consisting of CBME experts (n=5). These approved competencies were used in a cross-sectional community consultation survey distributed one year before (n=83) and one year after transitioning to CBME (n=144). FINDINGS Our newly identified competencies are a useful template for other institutions. Academic Advisor competencies focused on mentoring and coaching, whereas Competence Committee members' competencies focused on integrating assessments and institutional policies. Competency discrepancies between stakeholder groups existing before the transition had disappeared in the post-implementation sample. CONCLUSIONS We found value in taking an active community-based approach to developing and validating faculty leader competencies sooner rather than later when transitioning to CBME. The evolution of Competence Committee members and Academic Advisors requires the investment of specialized professional development and the sustained engagement of a collaborative community with shared concerns.
Affiliation(s)
- Eleftherios Soleas
- Queen's University, Ontario, Canada
- Correspondence: Eleftherios Soleas, 511 Union Street, Kingston, Ontario; phone: 343-364-4007
11
Gonzalo JD, Wolpaw DR, Krok KL, Pfeiffer MP, McCall-Hosenfeld JS. A Developmental Approach to Internal Medicine Residency Education: Lessons Learned from the Design and Implementation of a Novel Longitudinal Coaching Program. Med Educ Online 2019; 24:1591256. [PMID: 30924404] [PMCID: PMC6442085] [DOI: 10.1080/10872981.2019.1591256]
Abstract
BACKGROUND Resident physicians' achievement of professional competencies requires reflective practice skills and faculty coaching. Graduate medical education programs, however, struggle to operationalize these activities. OBJECTIVE To (1) describe the process and strategies for implementing an Internal Medicine (IM) resident coaching program that evolved in response to challenges, (2) characterize residents' professional learning plans (PLPs) and their alignment with EPAs, and (3) examine key lessons learned. DESIGN The program began in 2013 and involved all postgraduate years (PGY) residents (n = 60, 100%), and 20 faculty coaches who were all IM trained and practicing in an IM-related specialty. One coach was linked with 3-4 residents for three years. Through 1:1 meetings, resident-coach pairs identified professional challenges ('disorienting dilemmas' or 'worst days'), reviewed successes ('best days'), and co-created professional learning plans. Typed summaries were requested following meetings. Coaches met monthly for professional development and to discuss program challenges/successes, which informed programmatic improvements; additionally, a survey was distributed after three program years. Data were analyzed using quantitative and qualitative methodologies. RESULTS Disorienting dilemmas and professional learning plans mapped to all 16 EPAs and four additional themes: work-life balance, career planning, teaching skills, and research/scholarship. The most frequently mapped topics included: PGY1 - leading and working within interprofessional care teams (EPA 10), research and scholarship, and work-life balance; PGY2 - improving quality of care (EPA 13), demonstrating personal habits of lifelong learning (EPA 15), and research and scholarship; PGY3 - lifelong learning (EPA 15); career planning was common across all years. CONCLUSIONS Lessons learned included challenges in coordination of observations, identifying disorienting dilemmas, and creating a shared mental model between residents, faculty, and program leadership. The coaching program resulted in professional learning plans aligned with IM EPAs, in addition to other professional development topics. Operationalization of aspects of these results can inform the development of similar programs in residency education.
Affiliation(s)
- Jed D. Gonzalo
- Medicine and Public Health Sciences and Health Systems Education, Penn State College of Medicine, Hershey, PA, USA
- Daniel R. Wolpaw
- Medicine and Humanities, Penn State College of Medicine, Hershey, PA, USA
- Karen L. Krok
- Medicine, Penn State College of Medicine, Hershey, PA, USA
- Michael P. Pfeiffer
- Medicine, Penn State Hershey Heart and Vascular Institute, Penn State College of Medicine, Hershey, PA, USA
12
Duitsman ME. Group Assessment of Resident Performance: Valuable for Program Director Judgment? J Grad Med Educ 2019; 11:118-124. [PMID: 31428268] [PMCID: PMC6697275] [DOI: 10.4300/jgme-d-18-01069]
Abstract
BACKGROUND Group discussion of resident performance is an emerging assessment approach in postgraduate medical education. However, groups do not necessarily make better decisions than individuals. OBJECTIVE This study examined how group meetings concerning the assessment of residents take place, what information is shared during the meetings, and how this influences program directors' judgment of resident performance. METHODS In 2017, the researchers observed 10 faculty group meetings where resident performance was discussed and interviewed the program directors within a month after the meetings. We used a thematic framework analysis to identify themes from the transcribed meetings and interviews. RESULTS The information shared by group members during the meetings had 2 aims: (1) forming a judgment about the residents, and (2) faculty development. Most group members shared information without written notes, most discussions were not structured by the program director, the major focus of discussions was on residents with performance concerns, and there was a lack of a shared mental model of resident performance. The program directors who benefited most from the meetings were those who thought group members were engaged and summarized the information after every discussion. CONCLUSIONS Unstructured discussions and a lack of a shared mental model among group members impede effective information sharing about resident performance with a developmental approach. Structured discussions with an equal amount of discussion time for every resident and creating a shared mental model about the purpose of the discussions and the assessment approach could enhance use of a developmental approach to assessing resident performance.
13
Schumacher DJ, Martini A, Bartlett KW, King B, Calaman S, Garfunkel LC, Elliott SP, Frohna JG, Schwartz A, Michelson CD. Key Factors in Clinical Competency Committee Members' Decisions Regarding Residents' Readiness to Serve as Supervisors: A National Study. Academic Medicine 2019; 94:251-258. [PMID: 30256253] [DOI: 10.1097/acm.0000000000002469]
Abstract
PURPOSE Entrustment has become a popular assessment framework in recent years. Most research in this area has focused on how frontline assessors determine when a learner can be entrusted. However, less work has focused on how these entrustment decisions are made. The authors sought to understand the key factors that pediatric residency program clinical competency committee (CCC) members consider when recommending residents to a supervisory role. METHOD CCC members at 14 pediatric residency programs recommended residents to one of five progressive supervisory roles (from not serving as a supervisory resident to serving as a supervisory resident in all settings). They then responded to a free-text prompt, describing the key factors that led them to that decision. The authors analyzed these responses, by role recommendation, using a thematic analysis. RESULTS Of the 155 CCC members at the participating programs, 84 completed 769 supervisory role recommendations during the 2015-2016 academic year. Four themes emerged from the thematic analysis: (1) Determining supervisory ability follows from demonstrated trustworthiness; (2) demonstrated performance matters, but so does experience; (3) ability to lead a team is considered; and (4) contextual considerations external to the resident are at play. CONCLUSIONS CCC members considered resident and environmental factors in their summative entrustment decision making. The interplay between these factors should be considered as CCC processes are optimized and studied further.
Affiliation(s)
- Daniel J Schumacher
- D.J. Schumacher is associate professor, Department of Pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio. A. Martini is clinical research coordinator, Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio. K.W. Bartlett is associate professor and associate program director, Department of Pediatrics, Duke University School of Medicine, Durham, North Carolina. B. King is research project manager, Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network, McLean, Virginia. S. Calaman is associate professor and pediatric program director, Department of Pediatrics, Drexel University College of Medicine and St. Christopher's Hospital for Children, Philadelphia, Pennsylvania. L.C. Garfunkel is professor and associate program director, Department of Pediatrics, University of Rochester School of Medicine and Dentistry, Rochester, New York. S.P. Elliott is professor, associate chair, and program director, Department of Pediatrics, and interim associate dean, University of Arizona College of Medicine, Tucson, Arizona. J.G. Frohna is professor, Departments of Pediatrics and Internal Medicine, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin. A. Schwartz is Michael Reese Endowed Professor of Medical Education and associate head, Department of Medical Education, and research professor, Department of Pediatrics, University of Illinois College of Medicine, Chicago, Illinois, and director, Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network, McLean, Virginia. C.D. Michelson is assistant professor and pediatric program director, Department of Pediatrics, Boston University School of Medicine, Boston, Massachusetts
14
Duitsman ME, Fluit CRMG, van Alfen-van der Velden JAEM, de Visser M, Ten Kate-Booij M, Dolmans DHJM, Jaarsma DADC, de Graaf J. Design and evaluation of a clinical competency committee. Perspectives on Medical Education 2019; 8:1-8. [PMID: 30656533] [PMCID: PMC6382624] [DOI: 10.1007/s40037-018-0490-1]
Abstract
INTRODUCTION In postgraduate medical education, group decision-making has emerged as an essential tool to evaluate the clinical progress of residents. Clinical competency committees (CCCs) have been set up to ensure informed decision-making and provide feedback regarding performance of residents. Despite this important task, it remains unclear how CCCs actually function in practice and how their performance should be evaluated. METHODS In the prototyping phase of a design-based approach, a CCC meeting was developed, using three theoretical design principles: (1) data from multiple assessment tools and multiple perspectives, (2) a shared mental model and (3) structured discussions. The meetings were held in a university children's hospital and evaluated using observations, interviews with CCC members and an open-ended questionnaire among residents. RESULTS The structured discussions during the meetings provided a broad outline of resident performance, including identification of problematic and excellent residents. A shared mental model about the assessment criteria had developed over time. Residents were not always satisfied with the feedback they received after the meeting. Feedback that had been provided to a resident after the first CCC meeting was not addressed in the second meeting. DISCUSSION The principles that were used to design the CCC meeting were feasible in practice. Structured discussions, based on data from multiple assessment tools and multiple perspectives, provided a broad outline of resident performance. Residency programs that wish to implement CCCs can build on our design principles and adjust the prototype to their particular context. When running a CCC, it is important to consider feedback that has been provided to a resident after the previous meeting and to evaluate whether it has improved the resident's performance.
15
Kinnear B, Warm EJ, Hauer KE. Twelve tips to maximize the value of a clinical competency committee in postgraduate medical education. Medical Teacher 2018; 40:1110-1115. [PMID: 29944025] [DOI: 10.1080/0142159x.2018.1474191]
Abstract
Medical education has shifted to a competency-based paradigm, leading to calls for improved learner assessment methods and validity evidence for how assessment data are interpreted. Clinical competency committees (CCCs) use the collective input of multiple people to improve the validity and reliability of decisions made and actions taken based on assessment data. Significant heterogeneity in CCC structure and function exists across postgraduate medical education programs and specialties, and while there is no "one-size-fits-all" approach, there are ways to maximize value for learners and programs. This paper collates available evidence and the authors' experiences to provide practical tips on CCC purpose, membership, processes, and outputs. These tips can benefit programs looking to start a CCC and those that are improving their current CCC processes.
Affiliation(s)
- Benjamin Kinnear
- Internal Medicine and Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric J Warm
- Richard W. Vilter Professor of Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Karen E Hauer
- Medicine, University of California, San Francisco School of Medicine, San Francisco, CA, USA
16
Schumacher DJ. Influence of Clinical Competency Committee Review Process on Summative Resident Assessment Decisions. J Grad Med Educ 2018; 10:429-437. [PMID: 30154975] [PMCID: PMC6108376] [DOI: 10.4300/jgme-d-17-00762.1]
Abstract
BACKGROUND Clinical Competency Committees (CCCs) are charged with making summative assessment decisions about residents. OBJECTIVE We explored how the review processes CCC members use influence their decisions regarding residents' milestone levels and supervisory roles. METHODS We conducted a multisite longitudinal prospective observational cohort study at 14 pediatrics residency programs during academic year 2015-2016. Individual CCC members biannually reported characteristics of their review process, the Accreditation Council for Graduate Medical Education milestone levels they assigned to residents, and the supervisory role categorizations they recommended. Relationships among characteristics of CCC member reviews, mean milestone levels, and supervisory role categorizations were analyzed using mixed-effects linear regression, reported as mean differences with 95% confidence intervals (CIs), and Bayesian mixed-effects ordinal regression, reported as odds ratios (ORs) and 95% credible intervals (CrIs). RESULTS A total of 155 CCC members participated. Members who provided milestones or other professional development feedback after CCC meetings assigned significantly lower mean milestone levels (mean difference -1.4 points; 95% CI -2.2 to -0.6; P < .001) and were significantly less likely to recommend supervisory responsibility in any setting (OR = 0.23, 95% CrI 0.05-0.83) compared with CCC members who did not. Members recommended less supervisory responsibility when they reviewed more residents (OR = 0.96, 95% CrI 0.94-0.99) and participated in more review cycles (OR = 0.22, 95% CrI 0.07-0.63). CONCLUSIONS This study explored the association between characteristics of individual CCC member reviews and their summative assessment decisions about residents. Further study is needed to gain a deeper understanding of the factors that influence CCC members' summative assessment decisions.
17
Day KM, Zoog ES, Kluemper CT, Scott JK, Steffen CM, Kennedy JW, Jemison DM, Rehm JP, Brzezienski MA. Progressive Surgical Autonomy Observed in a Hand Surgery Resident Clinic Model. Journal of Surgical Education 2018; 75:450-457. [PMID: 28967577] [DOI: 10.1016/j.jsurg.2017.07.022]
Abstract
OBJECTIVE Resident clinics (RCs) are intended to catalyze the achievement of educational milestones through progressively autonomous patient care. However, few studies quantify their effect on competency-based surgical education, and no previous publications focus on hand surgery RCs (HRCs). We demonstrate the achievement of progressive surgical autonomy in an HRC model. DESIGN A retrospective review of all patients seen in a weekly half-day HRC from October 2010 to October 2015 was conducted. Investigators compiled data on patient demographics, provider encounters, operational statistics, operative details, and dictated surgical autonomy on an ascending 5-point scoring system. SETTING A tertiary hand surgery referral center. RESULTS A total of 2295 HRC patients were evaluated during the study period in 5173 clinic visits. There was an average of 22.6 patients per clinic, including 9.0 new patients with 6.5 emergency room referrals. In total, 825 operations were performed by 39 residents. Trainee autonomy averaged 2.1/5 (standard deviation [SD] = 1.2), 3.4/5 (SD = 1.3), 2.1/5 (SD = 1.3), 3.4/5 (SD = 1.2), 3.2/5 (SD = 1.5), 3.5/5 (SD = 1.5), 4.0/5 (SD = 1.2), and 4.1/5 (SD = 1.2) in postgraduate years 1 to 8, respectively. Linear mixed model analysis demonstrated that training level significantly affected operative autonomy (p = 0.0001). Continuity of care was maintained in 79.3% of cases, and patients were followed for an average of 3.9 clinic encounters over 12.4 weeks. CONCLUSIONS Our HRC appears to enable surgical trainees to practice supervised autonomous surgical care and provides a forum in which to observe progressive operative competency achievement during hand surgery training. Future studies comparing HRC models to non-RC models will be required to further define quality-of-care delivery within RCs.
Affiliation(s)
- Kristopher M Day
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Plastic Surgery, University of Tennessee College of Medicine, Hayes Hand Center, Chattanooga, Tennessee.
- Evon S Zoog
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of General Surgery, Hayes Hand Center, Chattanooga, Tennessee
- Chase T Kluemper
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Orthopedic Surgery, Hayes Hand Center, Chattanooga, Tennessee
- Jillian K Scott
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee
- Caleb M Steffen
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Plastic Surgery, University of Tennessee College of Medicine, Hayes Hand Center, Chattanooga, Tennessee
- James Woodfin Kennedy
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Plastic Surgery, University of Tennessee College of Medicine, Hayes Hand Center, Chattanooga, Tennessee; Department of Orthopedic Surgery, Hayes Hand Center, Chattanooga, Tennessee
- David Marshall Jemison
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Plastic Surgery, University of Tennessee College of Medicine, Hayes Hand Center, Chattanooga, Tennessee; Department of Orthopedic Surgery, Hayes Hand Center, Chattanooga, Tennessee
- Jason P Rehm
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Plastic Surgery, University of Tennessee College of Medicine, Hayes Hand Center, Chattanooga, Tennessee; Department of Orthopedic Surgery, Hayes Hand Center, Chattanooga, Tennessee
- Mark A Brzezienski
- University of Tennessee, Chattanooga College of Medicine, Chattanooga, Tennessee; Department of Plastic Surgery, University of Tennessee College of Medicine, Hayes Hand Center, Chattanooga, Tennessee; Department of Orthopedic Surgery, Hayes Hand Center, Chattanooga, Tennessee
18
Schumacher DJ, Michelson C, Poynter S, Barnes MM, Li STT, Burman N, Sklansky DJ, Thoreson L, Calaman S, King B, Schwartz A, Elliott S, Sharma T, Gonzalez Del Rey J, Bartlett K, Scott-Vernaglia SE, Gibbs K, McGreevy JF, Garfunkel LC, Gellin C, Frohna JG. Thresholds and interpretations: How clinical competency committees identify pediatric residents with performance concerns. Medical Teacher 2018; 40:70-79. [PMID: 29345207] [DOI: 10.1080/0142159x.2017.1394576]
Abstract
BACKGROUND Clinical competency committee (CCC) identification of residents with performance concerns is critical for early intervention. METHODS Program directors and 94 CCC members at 14 pediatric residency programs responded to a written survey prompt asking them to describe how they identify residents with performance concerns. Data were analyzed using thematic analysis. RESULTS Six themes emerged from analysis and were grouped into two domains. The first domain included four themes, each describing a path through which residents could meet or exceed a concern threshold: (1) written comments from rotation assessments are foundational in identifying residents with performance concerns, (2) concerning performance extremes stand out, (3) isolated data points may accumulate to raise concern, and (4) developmental trajectory matters. The second domain focused on how CCC members and program directors interpret data to make decisions about residents with concerns and contained two themes: (1) using norm- and/or criterion-referenced interpretation, and (2) assessing the quality of the data that are reviewed. CONCLUSIONS Identifying residents with performance concerns is important for their education and the care they provide. This study delineates strategies used by CCC members across several programs for identifying these residents, which other CCCs may find helpful to consider in their own efforts.
Affiliation(s)
- Daniel J Schumacher
- Department of Pediatrics, Cincinnati Children's Hospital Medical Center, University of Cincinnati, Cincinnati, OH, USA
- Catherine Michelson
- Department of Pediatrics, Boston Medical Center, Boston University School of Medicine, Boston, MA, USA
- Sue Poynter
- Department of Pediatrics, Cincinnati Children's Hospital Medical Center, University of Cincinnati, Cincinnati, OH, USA
- Michelle M Barnes
- Department of Pediatrics, University of Illinois at Chicago, Chicago, IL, USA
- Su-Ting T Li
- Department of Pediatrics, University of California Davis, Sacramento, CA, USA
- Natalie Burman
- Department of Pediatrics, Naval Medical Center San Diego, San Diego, CA, USA
- Daniel J Sklansky
- Department of Pediatrics, University of Wisconsin School of Medicine and Public Health, Madison, WI, USA
- Lynn Thoreson
- Department of Pediatrics, The University of Texas at Austin, Austin, TX, USA
- Sharon Calaman
- Department of Pediatrics, St. Christopher's Hospital for Children, Drexel University College of Medicine, Philadelphia, PA, USA
- Beth King
- Association of Pediatric Program Directors (APPD) Longitudinal Educational Assessment Research Network (LEARN), McLean, VA, USA
- Alan Schwartz
- Department of Pediatrics, University of Illinois at Chicago, Chicago, IL, USA
- Association of Pediatric Program Directors (APPD) Longitudinal Educational Assessment Research Network (LEARN), McLean, VA, USA
19
Ekpenyong A, Baker E, Harris I, Tekian A, Abrams R, Reddy S, Park YS. How do clinical competency committees use different sources of data to assess residents' performance on the internal medicine milestones? A mixed methods pilot study. Medical Teacher 2017; 39:1074-1083. [PMID: 28738746] [DOI: 10.1080/0142159x.2017.1353070]
Abstract
PURPOSE This study examines how Clinical Competency Committees (CCCs) synthesize assessment data to make judgments about residents' clinical performance. METHODS Between 2014 and 2015, after four six-month reporting periods to the Accreditation Council for Graduate Medical Education (ACGME), 7 of 16 CCC faculty at Rush University Medical Center completed questionnaires focused on their perspectives about rating residents on their achievement of the milestones and participated in a focus group. Qualitative data were analyzed using grounded theory. Milestone ratings for two six-month ACGME reporting cycles (n = 100 categorical residents) were also analyzed. RESULTS When making judgments about learners' milestone levels, CCC members weighted resident rotation ratings highest (weight = 37%), followed by faculty rotation comments (weight = 27%) and personal experience with residents (weight = 14%). Three assessment issues were identified from the qualitative analyses: (1) "design issues" (e.g. problems with available data or lack thereof); (2) "synthesis issues" (e.g. factors influencing ratings and decision-making processes); and (3) "impact issues" (e.g. how CCC-generated milestone ratings are used). CONCLUSIONS Identifying factors that affect assessment at all stages of the CCC process can contribute to improving assessment systems, including support for faculty development for CCCs. Recognizing the challenges in synthesizing first- and second-hand assessment data is an important step in understanding the CCC decision-making process.
Affiliation(s)
- Andem Ekpenyong
- Internal Medicine, Rush University Medical Center, Chicago, IL, USA
- Elizabeth Baker
- Internal Medicine, Rush University Medical Center, Chicago, IL, USA
- Ilene Harris
- Medical Education, University of Illinois at Chicago, Chicago, IL, USA
- Ara Tekian
- Medical Education, University of Illinois at Chicago, Chicago, IL, USA
- Richard Abrams
- Internal Medicine, Rush University Medical Center, Chicago, IL, USA
- Shalini Reddy
- Internal Medicine, University of Chicago Pritzker School of Medicine, Chicago, IL, USA
- Yoon Soo Park
- Medical Education, University of Illinois at Chicago, Chicago, IL, USA
20
Yao A, Massenburg BB, Silver L, Taub PJ. Initial Comparison of Resident and Attending Milestones Evaluations in Plastic Surgery. Journal of Surgical Education 2017; 74:773-779. [PMID: 28259488] [DOI: 10.1016/j.jsurg.2017.02.001]
Abstract
BACKGROUND Graduate medical education has recently undergone a major paradigm shift toward competency-based evaluation of residents' performance. The implementation of the Milestones program by the Accreditation Council for Graduate Medical Education (ACGME) is a core component of this shift, designed to ensure uniformity in measuring residency knowledge using a series of specialty-specific achievements. This study evaluates the correlation between residents' self-evaluations and program directors' assessments of their performance. METHODS The study population comprised 12 plastic surgery residents, ranging from postgraduate year 1 to postgraduate year 6, enrolled in an integrated residency program at a single institution; all 12 eligible residents (100%) participated. RESULTS Overall, average attending scores were lower than average resident scores at all levels except postgraduate year 6. Correlation between resident and attending evaluations ranged from 0.417 to 0.957, with the correlations of average scores on the Patient Care (0.854) and Medical Knowledge (0.816) Milestones significantly higher than those for professional skillsets (0.581). "Patient care, facial esthetics" was the Milestone with the lowest average scores from both groups. Residents scored themselves notably higher than their attendings' evaluations in Practice-based Learning and Improvement categories (+0.958) and notably lower in Medical Knowledge categories such as "Cosmetic Surgery, Trunk and Lower Extremities" (-0.375) and "Non-trauma hand" (-0.208). CONCLUSIONS The wide range of correlations suggests that expectations for performance standards may vary widely between residents and program directors. Understanding gaps between expectations and performance is vital to inform current and future residents as the restructuring of the accreditation process continues.
Affiliation(s)
- Amy Yao
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
- Benjamin B Massenburg
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
- Lester Silver
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
- Peter J Taub
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
21
Chahine S, Cristancho S, Padgett J, Lingard L. How do small groups make decisions? A theoretical framework to inform the implementation and study of clinical competency committees. Perspectives on Medical Education 2017; 6:192-198. [PMID: 28534277] [PMCID: PMC5466572] [DOI: 10.1007/s40037-017-0357-x]
Abstract
In the competency-based medical education (CBME) approach, clinical competency committees are responsible for making decisions about trainees' competence. However, we currently lack a theoretical model for group decision-making to inform this emerging assessment phenomenon. This paper proposes an organizing framework to study and guide the decision-making processes of clinical competency committees. This is an explanatory, non-exhaustive review, tailored to identify relevant theoretical and evidence-based papers related to small group decision-making. The search was conducted using Google Scholar, Web of Science, MEDLINE, ERIC, and PsycINFO for relevant literature. Using a thematic analysis, two researchers (SC & JP) met four times between April-June 2016 to consolidate the literature included in this review. Three theoretical orientations towards group decision-making emerged from the review: schema, constructivist, and social influence. Schema orientations focus on how groups use algorithms for decision-making. Constructivist orientations focus on how groups construct their shared understanding. Social influence orientations focus on how individual members influence the group's perspective on a decision. Moderators of decision-making relevant to all orientations include: guidelines, stressors, authority, and leadership. Clinical competency committees are the mechanisms by which groups of clinicians will be in charge of interpreting multiple assessment data points and coming to a shared decision about trainee competence. The way in which these committees make decisions can have huge implications for trainee progression and, ultimately, patient care. Therefore, there is a pressing need to build the science of how such group decision-making works in practice. This synthesis suggests a preliminary organizing framework that can be used in the implementation and study of clinical competency committees.
Affiliation(s)
- Saad Chahine
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada.
- Sayra Cristancho
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Jessica Padgett
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lorelei Lingard
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
22
Meier AH, Gruessner A, Cooney RN. Using the ACGME Milestones for Resident Self-Evaluation and Faculty Engagement. Journal of Surgical Education 2016; 73:e150-e157. [PMID: 27886973] [DOI: 10.1016/j.jsurg.2016.09.001]
Abstract
BACKGROUND Since July 2014, General Surgery residency programs have been required to use the Accreditation Council for Graduate Medical Education (ACGME) milestones twice annually to assess the progress of their trainees. We felt this change was an opportunity to use the new evaluation tool for resident self-assessment and to further engage the faculty in the educational efforts of the program. METHODS We piloted the milestones with postgraduate year (PGY) II and IV residents during the 2013/2014 academic year to acquaint faculty and residents with the instrument. In July 2014, we implemented the same protocol for all residents. Residents meet with their advisers quarterly, and two of these meetings are used for milestone assessment. The resident performs an independent self-evaluation, and the adviser grades the resident independently. They then discuss the evaluations, focusing mainly on areas of greatest disagreement. The faculty member presents the resident to the clinical competency committee (CCC), and the committee decides on the final scores and submits them to the ACGME website. We stored all records anonymously in a MySQL database, used ANOVA with Tukey post hoc analysis to evaluate differences between groups, and used intraclass correlation coefficients (ICCs) and Krippendorff's α to assess interrater reliability. RESULTS We analyzed evaluations for 44 residents. We created scale scores across all Likert items for each evaluation and compared score differences by PGY level and rater (self, adviser, and CCC). We found highly significant increases in scores between most PGY levels (p < 0.05). There were no significant score differences per PGY level between the raters. The interrater reliability for the total score and the 6 competency domains was very high (ICC: 0.87-0.98; α: 0.84-0.97). Even though this milestone evaluation process added work for residents and faculty, participation was very good (93.9% by residents and 92.9% by faculty) and feedback was generally positive. CONCLUSION Although implementation of the milestones has added work for general surgery residency programs, it has also opened opportunities to further engage residents in reflection and self-evaluation and to create additional venues for faculty involvement in the educational process within the residency program. The adviser's initial rating appears to correlate closely with the final CCC assessment. Resident self-evaluation is a requirement of the Residency Review Committee (RRC), and the milestones appear to be a good instrument for this purpose. Our early assessment suggests the milestones provide a useful instrument for tracking trainee progression through residency.
Affiliation(s)
- Andreas H Meier
- Department of Surgery, Education Office, Upstate Medical University, Syracuse, New York.
- Angelika Gruessner
- Department of Surgery, Education Office, Upstate Medical University, Syracuse, New York
- Robert N Cooney
- Department of Surgery, Education Office, Upstate Medical University, Syracuse, New York
23
Abstract
This article is the sixth in a 7-part series that aims to comprehensively describe the current state and future directions of pediatric emergency medicine (PEM) fellowship training, from the essential requirements, to considerations for successfully administering and managing a program, to the careers that may be anticipated upon program completion. This article provides a broad overview of administering and supervising a PEM fellowship program. It explores 3 topics: the principles of program administration, committee management, and recommendations for the minimum time allocated for PEM fellowship program directors to administer their programs.
24
Dietl CA, Russell JC. Effect of Process Changes in Surgical Training on Quantitative Outcomes From Surgery Residency Programs. Journal of Surgical Education 2016; 73:807-818. [PMID: 27156139] [DOI: 10.1016/j.jsurg.2016.03.015]
Abstract
OBJECTIVES The purpose of this article is to review the literature on process changes in surgical training programs and to evaluate their effect on the Accreditation Council for Graduate Medical Education (ACGME) Core Competencies, American Board of Surgery In-Training Examination (ABSITE) scores, and American Board of Surgery (ABS) certification. DESIGN A literature search was performed in MEDLINE via PubMed.gov, ScienceDirect.com, and Google Scholar for all peer-reviewed studies published since 2003, using the following search queries: surgery residency training, surgical education, competency-based surgical education, ACGME core competencies, ABSITE scores, and ABS pass rate. RESULTS Our initial search list included 990 articles on surgery residency training models, 539 on competency-based surgical education, 78 on ABSITE scores, and 33 on ABS pass rate. Overall, 31 articles met inclusion criteria based on their effect on ACGME Core Competencies, ABSITE scores, and ABS certification. Systematic review showed that 5/31, 19/31, and 6/31 articles on process changes in surgical training programs had a positive effect on patient care, medical knowledge, and ABSITE scores, respectively. ABS certification was not analyzed. The other ACGME core competencies were addressed in only 6 studies. CONCLUSIONS Several publications on process changes in surgical training programs have shown a positive effect on patient care, medical knowledge, and ABSITE scores. However, the effect on ABS certification, and on other quantitative outcomes from residency programs, has not been addressed. Studies on education strategies showing evidence that residency program objectives are being achieved are still needed. This article addresses the 6 ACGME Core Competencies.
Affiliation(s)
- Charles A Dietl, Division of Cardiothoracic Surgery, Department of Surgery, University of New Mexico Health Sciences Center, Albuquerque, New Mexico
- John C Russell, Department of Surgery, University of New Mexico Health Sciences Center, Albuquerque, New Mexico

25
Doty CI, Roppolo LP, Asher S, Seamon JP, Bhat R, Taft S, Graham A, Willis J. How Do Emergency Medicine Residency Programs Structure Their Clinical Competency Committees? A Survey. Acad Emerg Med 2015; 22:1351-4. [PMID: 26473693 DOI: 10.1111/acem.12804] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2015] [Revised: 05/04/2015] [Accepted: 06/10/2015] [Indexed: 11/30/2022]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education (ACGME) recently has mandated the formation of a clinical competency committee (CCC) to evaluate residents across the newly defined milestone continuum. The ACGME has been nonprescriptive about how these CCCs are to be structured, in order to provide flexibility to the programs. OBJECTIVES No best practices for the formation of CCCs currently exist. We seek to determine common structures of CCCs recently formed in the Council of Emergency Medicine Residency Directors (CORD) member programs and to identify unique structures that have been developed. METHODS In this descriptive study, an 18-question survey was distributed via the CORD listserv in the late fall of 2013. Each member program was asked questions about the structure of its CCC. These responses were analyzed with simple descriptive statistics. RESULTS A total of 116 of the 160 programs responded, giving a 73% response rate. Most CCCs (71.6%) are chaired by the associate or assistant program director, while a small number (14.7%) are chaired by a core faculty member. Program directors (PDs) chair 12.1% of CCCs. Most CCCs are attended by the PD (85.3%) and selected core faculty members (78.5%), leaving the remaining committees attended by any core faculty. Voting membership consists of the residency leadership either with (53.9%) or without (36.5%) the PD as a voting member. CCCs have an average attendance of 7.4 members, with a range of three to 15 members. Quarterly meetings are held by 53.1% of CCCs, while 37% meet monthly. The majority of programs (76.4%) report a system to match residents with a faculty mentor or advisor, and 36% include the resident's faculty mentor or advisor in the discussion of that resident. Milestone summaries (determination of level for each milestone) are the primary focus of discussion (93.8%), utilizing multiple sources of information.
CONCLUSIONS The substantial variability and diversity found in our CORD survey of CCC structure and function suggest that there are myriad strategies that residency programs can use to match individual program needs and resources to requirements of the ACGME. Identifying a single protocol for CCC structure and development may prove challenging.
Affiliation(s)
- Lynn P. Roppolo, Department of Emergency Medicine, University of Texas Southwestern Medical Center, Dallas, TX
- Shellie Asher, Department of Emergency Medicine, Albany Medical Center, Albany, NY
- Jason P. Seamon, Department of Emergency Medicine, Grand Rapids Medical Education Partners/Michigan State University, Grand Rapids, MI
- Rahul Bhat, Department of Emergency Medicine, Georgetown University, Washington, DC
- Stephanie Taft, Department of Emergency Medicine, East Carolina University, Greenville, NC
- Autumn Graham, Department of Emergency Medicine, Georgetown University, Washington, DC
- James Willis, Department of Emergency Medicine, SUNY Downstate Medical Center, Brooklyn, NY

26
The New Accreditation Council for Graduate Medical Education Next Accreditation System Milestones Evaluation System. Plast Reconstr Surg 2015; 136:181-187. [DOI: 10.1097/prs.0000000000001368] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
27
Wagner JP, Chen DC, Donahue TR, Quach C, Hines OJ, Hiatt JR, Tillou A. Assessment of resident operative performance using a real-time mobile Web system: preparing for the milestone age. JOURNAL OF SURGICAL EDUCATION 2014; 71:e41-e46. [PMID: 25037504 DOI: 10.1016/j.jsurg.2014.06.008] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/28/2014] [Revised: 06/08/2014] [Accepted: 06/12/2014] [Indexed: 06/03/2023]
Abstract
OBJECTIVE To satisfy trainees' operative competency requirements while improving feedback validity and timeliness using a mobile Web-based platform. DESIGN The Southern Illinois University Operative Performance Rating Scale (OPRS) was embedded into a website formatted for mobile devices. From March 2013 to February 2014, faculty members were instructed to complete the OPRS form while providing verbal feedback to the operating resident at the conclusion of each procedure. Submitted data were compiled automatically within a secure Web-based spreadsheet. Conventional end-of-rotation performance (CERP) evaluations filed 2006 to 2013 and OPRS performance scores were compared by year of training using serial and independent-samples t tests. The mean CERP scores and OPRS overall resident operative performance scores were directly compared using a linear regression model. OPRS mobile site analytics were reviewed using a Web-based reporting program. SETTING Large university-based general surgery residency program. PARTICIPANTS General Surgery faculty used the mobile Web OPRS system to rate resident performance. Residents and the program director reviewed evaluations semiannually. RESULTS Over the study period, 18 faculty members and 37 residents logged 176 operations using the mobile OPRS system. There were 334 total OPRS website visits. Median time to complete an evaluation was 45 minutes from the end of the operation, and faculty spent an average of 134 seconds on the site to enter 1 assessment. In the 38,506 CERP evaluations reviewed, mean performance scores showed a positive linear trend of 2% change per year of training (p = 0.001). OPRS overall resident operative performance scores showed a significant linear (p = 0.001), quadratic (p = 0.001), and cubic (p = 0.003) trend of change per year of clinical training, reflecting the resident operative experience in our training program. 
Differences between postgraduate year-1 and postgraduate year-5 overall performance scores were greater with the OPRS (mean = 0.96, CI: 0.55-1.38) than with CERP measures (mean = 0.37, CI: 0.34-0.41). Additionally, there were consistent increases in each of the OPRS subcategories. CONCLUSIONS In contrast to CERPs, the OPRS fully satisfies the Accreditation Council for Graduate Medical Education and American Board of Surgery operative assessment requirements. The mobile Web platform provides a convenient interface, broad accessibility, automatic data compilation, and compatibility with common database and statistical software. Our mobile OPRS system encourages candid feedback dialog and generates a comprehensive review of individual and group-wide operative proficiency in real time.
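The abstract above reports linear, quadratic, and cubic trends in operative performance scores across years of training. As an illustrative sketch only (synthetic data, not the study's dataset), polynomial trend fitting of this kind can be expressed with NumPy: fit nested models of increasing degree and compare how much residual variance each explains.

```python
import numpy as np

# Synthetic performance scores by postgraduate year (PGY 1-5); 20 ratings per
# level. The true generating trend mixes a linear and a cubic component, echoing
# the shape of trend reported in the abstract. Values are invented for illustration.
rng = np.random.default_rng(0)
pgy = np.repeat(np.arange(1, 6), 20)
scores = 2.5 + 0.4 * pgy + 0.05 * (pgy - 3) ** 3
scores = scores + rng.normal(0.0, 0.2, size=scores.size)  # rating noise

# Fit a linear and a cubic polynomial trend by least squares.
lin_coef = np.polyfit(pgy, scores, 1)
cub_coef = np.polyfit(pgy, scores, 3)

# Residual sums of squares: the cubic (nested) model can only fit as well or better.
rss_lin = np.sum((scores - np.polyval(lin_coef, pgy)) ** 2)
rss_cub = np.sum((scores - np.polyval(cub_coef, pgy)) ** 2)

print(f"linear slope per PGY year: {lin_coef[0]:.2f}")
print(f"RSS linear={rss_lin:.2f}, cubic={rss_cub:.2f}")
```

In a formal analysis the drop in residual variance from the linear to the cubic model would be tested (e.g. with an F test) before concluding that the higher-order trend terms are significant, as the study does for its OPRS scores.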
Affiliation(s)
- Justin P Wagner, David Geffen School of Medicine, University of California, Los Angeles, California
- David C Chen, David Geffen School of Medicine, University of California, Los Angeles, California
- Timothy R Donahue, David Geffen School of Medicine, University of California, Los Angeles, California
- Chi Quach, David Geffen School of Medicine, University of California, Los Angeles, California
- O Joe Hines, David Geffen School of Medicine, University of California, Los Angeles, California
- Jonathan R Hiatt, David Geffen School of Medicine, University of California, Los Angeles, California
- Areti Tillou, David Geffen School of Medicine, University of California, Los Angeles, California

28
Cogbill TH, Malangoni MA, Potts JR, Valentine RJ. The General Surgery Milestones Project. J Am Coll Surg 2014; 218:1056-62. [DOI: 10.1016/j.jamcollsurg.2014.02.016] [Citation(s) in RCA: 46] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/06/2014] [Accepted: 02/10/2014] [Indexed: 11/28/2022]
|