1. Pearce J, Reid K. When I say … assessment burden. Med Educ 2025. [PMID: 40257010] [DOI: 10.1111/medu.15708]
Affiliation(s)
- Jacob Pearce
- Specialist & Professional Education & Assessment Research, Australian Council for Educational Research, Camberwell, Australia
- Katharine Reid
- Specialist & Professional Education & Assessment Research, Australian Council for Educational Research, Camberwell, Australia
2. Gin BC, O'Sullivan PS, Hauer KE, Abdulnour RE, Mackenzie M, Ten Cate O, Boscardin CK. Entrustment and EPAs for Artificial Intelligence (AI): A Framework to Safeguard the Use of AI in Health Professions Education. Acad Med 2025; 100:264-272. [PMID: 39761533] [DOI: 10.1097/acm.0000000000005930]
Abstract
In this article, the authors propose a repurposing of the concept of entrustment to help guide the use of artificial intelligence (AI) in health professions education (HPE). Entrustment can help identify and mitigate the risks of incorporating generative AI tools with limited transparency about their accuracy, source material, and disclosure of bias into HPE practice. With AI's growing role in education-related activities, like automated medical school application screening and feedback quality and content appraisal, there is a critical need for a trust-based approach to ensure these technologies are beneficial and safe. Drawing parallels with HPE's entrustment concept, which assesses a trainee's readiness to perform clinical tasks (entrustable professional activities), the authors propose assessing the trustworthiness of AI tools to perform an HPE-related task across 3 characteristics: ability (competence to perform tasks accurately), integrity (transparency and honesty), and benevolence (alignment with ethical principles). The authors draw on existing theories of entrustment decision-making to envision a structured way to decide on AI's role and level of engagement in HPE-related tasks, including proposing an AI-specific entrustment scale. Identifying tasks that AI could be entrusted with provides a focus around which considerations of trustworthiness and entrustment decision-making may be synthesized, making explicit the risks associated with AI use and identifying strategies to mitigate these risks. Responsible, trustworthy, and ethical use of AI requires health professions educators to develop safeguards for using it in teaching, learning, and practice: guardrails that can be operationalized by applying the entrustment concept to AI. Without such safeguards, HPE practice stands to be shaped by the oncoming wave of AI innovations tied to commercial motivations, rather than modeled after HPE principles, principles rooted in the trust and transparency that are built together with trainees and patients.
3. Li M, Kurahashi AM, Kawaguchi S, Siemens I, Sirianni G, Myers J. When words are your scalpel, what and how information is exchanged may be differently salient to assessors. Med Educ 2024; 58:1324-1332. [PMID: 38850193] [DOI: 10.1111/medu.15458]
Abstract
PURPOSE: Variable assessments of learner performances can occur when different assessors determine different elements to be differently important or salient. How assessors determine the importance of performance elements has historically been thought to occur idiosyncratically and thus be amenable to assessor training interventions. More recently, a main source of variation found among assessors was two underlying factors that were differently emphasised: medical expertise and interpersonal skills. This gave legitimacy to the theory that different interpretations of the same performance may represent multiple truths. A faculty development activity introducing assessors to entrustable professional activities, in which they estimated a learner's level of readiness for entrustment, provided an opportunity to qualitatively explore assessor variation in the context of an interaction and in a setting in which interpersonal skills are highly valued.
METHODS: Using a constructivist grounded theory approach, we explored variation in assessment processes among a group of palliative medicine assessors who completed a simulated direct observation and assessment of the same learner interaction.
RESULTS: Despite identifying similar learner strengths and areas for improvement, the estimated level of readiness for entrustment varied substantially among assessors. Those who estimated the learner as not yet ready for entrustment seemed to prioritise what information was exchanged and viewed missed information as performance gaps. Those who estimated the learner as ready for entrustment seemed to prioritise how information was exchanged and viewed the same missed information as personal style differences or appropriate clinical judgement. When presented with a summary, assessors expressed surprise and concern about the variation.
CONCLUSION: A main source of variation among our assessors was the differential salience of performance elements that align with medical expertise and interpersonal skills. These data support the theory that when assessing an interaction, differential salience for these two factors may be an important and perhaps inevitable source of assessor variation.
Affiliation(s)
- Melissa Li
- Division of Palliative Care, Department of Family and Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
- Sarah Kawaguchi
- Division of Palliative Care, Department of Family and Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
- Isaac Siemens
- Division of Palliative Care, Department of Family and Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
- Giovanna Sirianni
- Division of Palliative Care, Department of Family and Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
- Jeff Myers
- Division of Palliative Care, Department of Family and Community Medicine, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
4. Cifra N, Pitts S, Mink R, Schwartz A, Herman B, Turner DA, Yussman S. Analysis of fellowship program director opinions of entrustable professional activities in adolescent medicine fellowship. Int J Adolesc Med Health 2024; 36:237-242. [PMID: 38522004] [DOI: 10.1515/ijamh-2023-0154]
Abstract
OBJECTIVES: This study aimed to explore the minimum entrustable professional activity (EPA) supervision levels at which pediatric fellowship program directors (FPDs) would be willing to graduate fellows, and the levels deemed necessary for safe and effective practice, for each of the common pediatric subspecialty EPAs and the four adolescent medicine-specific EPAs.
METHODS: This cross-sectional study used survey data from pediatric FPDs in 2017. FPDs indicated the minimum level of supervision (LOS) for fellows at graduation and for safe and effective practice.
RESULTS: 82% (23/28) of adolescent medicine FPDs completed the survey. For each EPA, there were differences (p<0.05) between the LOS expected for graduation and for safe and effective practice. There was also variability in the level at which FPDs would graduate fellows.
CONCLUSIONS: This study summarizes pediatric FPD opinions regarding the minimum levels of supervision required for fellows at graduation and the levels deemed necessary for safe and effective practice. The gap between the minimum LOS at which FPDs would graduate a fellow and that deemed appropriate for safe and effective practice, along with the variability in minimum LOS for graduation, highlights the need for clearer standards for fellowship graduation, better-defined training outcomes, and more structured early-career support, including potential post-graduation supervision in clinical practice.
Affiliation(s)
- Nicole Cifra
- Department of Pediatrics, University of Pennsylvania Perelman School of Medicine, Philadelphia, USA
- Sarah Pitts
- Department of Pediatrics, Boston Children's Hospital, Boston, USA
- Alan Schwartz
- Department of Pediatrics, University of Utah School of Medicine, Salt Lake City, USA
- Susan Yussman
- Department of Pediatrics, University of Rochester School of Medicine and Dentistry, Rochester, USA
5. Ramaswamy V, Danciu T, Kennedy EN, Romito L, Stewart D, Gul G, Marucha P, Quinonez RB. American Dental Education Association Compendium Entrustable Professional Activities Workgroup report. J Dent Educ 2024; 88:639-653. [PMID: 38693898] [DOI: 10.1002/jdd.13542]
Abstract
PURPOSE: Entrustable professional activities (EPAs) are discrete clinical tasks that can be evaluated to help define readiness for independent practice in the health professions and are intended to increase trust in the dental graduate. EPAs provide a framework that bridges competencies to clinical practice. This report describes the work of the American Dental Education Association (ADEA) Compendium EPA Workgroup to develop a list of EPAs for dental education along with supportive resources, including specifications and a glossary.
METHODS: Preliminary work, including a literature and resource review, mapping of existing competencies, and review of other health professions' EPAs, informed the development of our EPA list. Workgroup members achieved consensus using a modified Delphi process. A Qualtrics survey using a validated rubric for the assessment of EPAs, as described in the peer-reviewed literature, was used. Dental educators, including academic deans, were surveyed for feedback on the content and format of the EPAs.
RESULTS: Based on the literature analysis of existing EPAs and competencies in the health professions, a list of EPAs was developed along with a description of specifications. The EPA workgroup (nine members from multiple institutions) used the Delphi process to incorporate feedback from various experts. A list of 11 core EPAs was vetted by dental educators including academic deans (n = ∼23), and the development process was reviewed by EPA experts outside dental education. A glossary was developed to align language.
CONCLUSION: These EPAs define the scope of dental practice. This report represents Phase 1 of the EPA framework development and vetting process. Future directions include broader vetting of the EPA list, faculty development, and national standardized technology that supports this work to optimize implementation.
Affiliation(s)
- Vidya Ramaswamy
- Director for Curriculum Evaluation and Promotion of Teaching and Learning at the University of Michigan, School of Dentistry, Ann Arbor, Michigan, USA
- Theodora Danciu
- Clinical Professor and Director of Engaged Learning and Assessment at the University of Michigan, School of Dentistry, Ann Arbor, Michigan, USA
- Erinne N Kennedy
- Assistant Professor and Assistant Dean for Curriculum and Integrated Learning at Kansas City University College of Dental Medicine, Joplin, Missouri, USA
- Laura Romito
- Professor and Associate Dean of Education and Academic Affairs at the Indiana University School of Dentistry, Indianapolis, Indiana, USA
- Denice Stewart
- Adjunct Professor at the University of North Carolina at Chapel Hill Adams School of Dentistry, Chapel Hill, North Carolina, USA
- Gulsun Gul
- Chief of Innovation, Clinical Education & Public Health at the American Dental Education Association, Washington, District of Columbia, USA
- Phillip Marucha
- Co-Chair, ADEA EPA group; Professor, Oregon Health & Science University School of Dentistry, Portland, Oregon, USA
- Rocio B Quinonez
- Co-Chair, ADEA EPA group; Professor and Associate Dean for Curriculum, University of North Carolina at Chapel Hill Adams School of Dentistry, Chapel Hill, North Carolina, USA
6. Caretta-Weyer HA, Smirnova A, Barone MA, Frank JR, Hernandez-Boussard T, Levinson D, Lombarts KMJMH, Lomis KD, Martini A, Schumacher DJ, Turner DA, Schuh A. The Next Era of Assessment: Building a Trustworthy Assessment System. Perspect Med Educ 2024; 13:12-23. [PMID: 38274558] [PMCID: PMC10809864] [DOI: 10.5334/pme.1110]
Abstract
Assessment in medical education has evolved through a sequence of eras each centering on distinct views and values. These eras include measurement (e.g., knowledge exams, objective structured clinical examinations), then judgments (e.g., workplace-based assessments, entrustable professional activities), and most recently systems or programmatic assessment, where over time multiple types and sources of data are collected and combined by competency committees to ensure individual learners are ready to progress to the next stage in their training. Significantly less attention has been paid to the social context of assessment, which has led to an overall erosion of trust in assessment by a variety of stakeholders including learners and frontline assessors. To meaningfully move forward, the authors assert that the reestablishment of trust should be foundational to the next era of assessment. In our actions and interventions, it is imperative that medical education leaders address and build trust in assessment at a systems level. To that end, the authors first review tenets on the social contextualization of assessment and its linkage to trust and discuss consequences should the current state of low trust continue. The authors then posit that trusting and trustworthy relationships can exist at individual as well as organizational and systems levels. Finally, the authors propose a framework to build trust at multiple levels in a future assessment system; one that invites and supports professional and human growth and has the potential to position assessment as a fundamental component of renegotiating the social contract between medical education and the health of the public.
Affiliation(s)
- Holly A. Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California, USA
- Alina Smirnova
- Department of Family Medicine, University of Calgary, Calgary, Alberta, Canada
- Kern Institute for the Transformation of Medical Education, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
- Michael A. Barone
- NBME, Philadelphia, Pennsylvania, USA
- Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Jason R. Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Dana Levinson
- Josiah Macy Jr Foundation, Philadelphia, Pennsylvania, USA
- Kiki M. J. M. H. Lombarts
- Department of Medical Psychology, Amsterdam University Medical Centers, University of Amsterdam, Netherlands
- Amsterdam Public Health research institute, Amsterdam, Netherlands
- Kimberly D. Lomis
- Undergraduate Medical Education Innovations, American Medical Association, Chicago, Illinois, USA
- Abigail Martini
- Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio, USA
- Daniel J. Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- David A. Turner
- American Board of Pediatrics, Chapel Hill, North Carolina, USA
- Abigail Schuh
- Division of Emergency Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
7. Osborne KC, Barbagallo C, Aldaoud A, Guille J, Campbell A, Anderson M, Pearce J. Reforming Medical Physics and Radiopharmaceutical Science Training Through a Programmatic Approach to Assessment. J Med Educ Curric Dev 2024; 11:23821205241271539. [PMID: 39246600] [PMCID: PMC11378185] [DOI: 10.1177/23821205241271539]
Abstract
OBJECTIVES: Programmatic assessment approaches can be extended to the design of allied health professions training to enhance the learning of trainees. The Australasian College of Physical Scientists and Engineers in Medicine worked with assessment specialists at the Australian Council for Educational Research and Amplexa Consulting to revise their medical physics and radiopharmaceutical science training programs. A central aim of the revisions was a training program that provides standardized support to registrars throughout the 3 years and better supports them to complete the program within that time frame by providing timely and constructive feedback on their progression.
METHODS: We used the principles of programmatic assessment to revise the assessment methods and progression decisions in the three training programs.
RESULTS: We revised the 3-year training programs for diagnostic imaging medical physics, radiation oncology medical physics, and radiopharmaceutical science in Australia and New Zealand, incorporating clear stages of training and associated progression points.
CONCLUSIONS: We discuss the advantages and difficulties that have arisen with this implementation. We found five key elements necessary for implementing programmatic assessment in these specialized contexts: embracing blurred boundaries between assessment of and for learning, adapting the approach to each specialized context, change management, engaging subject matter experts, and clear communication to registrars/trainees.
Affiliation(s)
- Kristy C Osborne
- Education Research, Policy and Development Division, Australian Council for Educational Research, Camberwell, VIC, Australia
- Cathryn Barbagallo
- National Training Education and Assessment Program, Australasian College of Physical Scientists and Engineers in Medicine, Mascot, NSW, Australia
- Ammar Aldaoud
- Assessment and Psychometric Research Division, Australian Council for Educational Research, Camberwell, VIC, Australia
- Jennifer Guille
- National Training Education and Assessment Program, Australasian College of Physical Scientists and Engineers in Medicine, Mascot, NSW, Australia
- Andrew Campbell
- National Training Education and Assessment Program, Australasian College of Physical Scientists and Engineers in Medicine, Mascot, NSW, Australia
- Jacob Pearce
- Education Research, Policy and Development Division, Australian Council for Educational Research, Camberwell, VIC, Australia
8. Szulewski A, Braund H, Dagnone DJ, McEwen L, Dalgarno N, Schultz KW, Hall AK. The Assessment Burden in Competency-Based Medical Education: How Programs Are Adapting. Acad Med 2023; 98:1261-1267. [PMID: 37343164] [DOI: 10.1097/acm.0000000000005305]
Abstract
Residents and faculty have described a burden of assessment related to the implementation of competency-based medical education (CBME), which may undermine its benefits. Although this concerning signal has been identified, little has been done to identify adaptations to address this problem. Grounded in an analysis of an early Canadian pan-institutional CBME adopter's experience, this article describes postgraduate programs' adaptations related to the challenges of assessment in CBME. From June 2019 to September 2022, 8 residency programs underwent a standardized Rapid Evaluation guided by the Core Components Framework (CCF). Sixty interviews and 18 focus groups were held with invested partners. Transcripts were analyzed abductively using CCF, and ideal implementation was compared with enacted implementation. These findings were then shared back with program leaders, adaptations were subsequently developed, and technical reports were generated for each program. Researchers reviewed the technical reports to identify themes related to the burden of assessment, with a subsequent focus on identifying adaptations across programs. Three themes were identified: (1) disparate mental models of assessment processes in CBME, (2) challenges in workplace-based assessment processes, and (3) challenges in performance review and decision making. Theme 1 included entrustment interpretation and lack of a shared mindset for performance standards; adaptations included revising entrustment scales, faculty development, and formalizing resident membership. Theme 2 involved direct observation, timeliness of assessment completion, and feedback quality; adaptations included alternative assessment strategies beyond entrustable professional activity forms and proactive assessment planning. Theme 3 related to resident data monitoring and competence committee decision making; adaptations included adding resident representatives to the competence committee and assessment platform enhancements. These adaptations respond to the significant burden of assessment being experienced broadly within CBME. The authors hope other programs may learn from their institution's experience and navigate the CBME-related assessment burden their invested partners may be facing.
Affiliation(s)
- Adam Szulewski
- A. Szulewski is associate professor, Departments of Emergency Medicine and Psychology, and educational scholarship lead, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-3076-6221
- Heather Braund
- H. Braund is associate director of scholarship and simulation education, Office of Professional Development and Educational Scholarship, and assistant (adjunct) professor, Department of Biomedical and Molecular Sciences and School of Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0002-9749-7193
- Damon J Dagnone
- D.J. Dagnone is associate professor, Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-6963-7948
- Laura McEwen
- L. McEwen is director of assessment and evaluation of postgraduate medical education and assistant professor, Department of Pediatrics, Postgraduate Medical Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-2457-5311
- Nancy Dalgarno
- N. Dalgarno is director of education scholarship, Office of Professional Development and Educational Scholarship, and assistant professor (adjunct), Department of Biomedical and Molecular Sciences and Master of Health Professions Education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0001-7932-9949
- Karen W Schultz
- K.W. Schultz is professor, Department of Family Medicine, and associate dean of postgraduate medical education, Queen's University, Kingston, Ontario, Canada; ORCID: https://orcid.org/0000-0003-0208-3981
- Andrew K Hall
- A.K. Hall is associate professor and vice chair of education, Department of Emergency Medicine, University of Ottawa, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-1227-5397
9. Chin M, Pack R, Cristancho S. "A whole other competence story": exploring faculty perspectives on the process of workplace-based assessment of entrustable professional activities. Adv Health Sci Educ 2023; 28:369-385. [PMID: 35997910] [DOI: 10.1007/s10459-022-10156-0]
Abstract
The centrality of entrustable professional activities (EPAs) in competency-based medical education (CBME) is predicated on the assumption that low-stakes, high-frequency workplace-based assessments used in a programmatic approach will result in accurate and defensible judgments of competence. While there have been conversations in the literature regarding the potential of this approach, only recently has the conversation begun to explore the actual experiences of clinical faculty in this process. The purpose of this qualitative study was to explore the process of EPA assessment for faculty in everyday practice. We conducted 18 semi-structured interviews with Anesthesia faculty at a Canadian academic center. Participants were asked to describe how they engage in EPA assessment in daily practice and the factors they considered. Interviews were audio-recorded, transcribed, and analysed using the constant comparative method of grounded theory. Participants in this study perceived two sources of tension in the EPA assessment process that influenced their scoring on official forms: the potential constraints of the assessment forms and the potential consequences of their assessment outcome. This was particularly salient in circumstances of uncertainty regarding the learner's level of competence. Ultimately, EPA assessment in CBME may be experienced as higher-stakes by faculty than officially recognized due to these tensions, suggesting a layer of discomfort and burden in the process that may potentially interfere with the goal of assessment for learning. Acknowledging and understanding the nature of this burden and identifying strategies to mitigate it are critical to achieving the assessment goals of CBME.
Affiliation(s)
- Melissa Chin
- Department of Anesthesia and Perioperative Medicine, London Health Sciences Centre, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada.
- Rachael Pack
- Center for Education Research and Innovation, University of Western Ontario, London, ON, Canada
- Sayra Cristancho
- Center for Education Research and Innovation, University of Western Ontario, London, ON, Canada
10. Kim ME, Tretter J, Wilmot I, Hahn E, Redington A, McMahon CJ. Entrustable Professional Activities and Their Relevance to Pediatric Cardiology Training. Pediatr Cardiol 2022; 44:757-768. [PMID: 36576524] [PMCID: PMC9795145] [DOI: 10.1007/s00246-022-03067-9]
Abstract
Entrustable professional activities (EPAs) have become a popular framework for medical trainee assessment and a supplemental component for milestone and competency assessment. EPAs were developed to facilitate assessment of competencies and furthermore to facilitate translation into clinical practice. In this review, we explore the rationale for the introduction of EPAs, examine whether they fulfill the promise expected of them, and contemplate further developments in their application with specific reference to training in pediatric cardiology.
Affiliation(s)
- Michael E. Kim
- Department of Pediatrics, College of Medicine, Heart Institute, Cincinnati Children’s Hospital Medical Center, University of Cincinnati, Cincinnati, OH USA
- Justin Tretter
- Department of Pediatric Cardiology, Pediatric Institute, Cleveland Clinic Children's, and The Heart, Vascular, and Thoracic Institute, Cleveland Clinic, 9500 Euclid Avenue, M-41, Cleveland, OH 44195, USA
- Ivan Wilmot
- Department of Pediatrics, College of Medicine, Heart Institute, Cincinnati Children's Hospital Medical Center, University of Cincinnati, Cincinnati, OH, USA
- Eunice Hahn
- Department of Pediatrics, College of Medicine, Heart Institute, Cincinnati Children's Hospital Medical Center, University of Cincinnati, Cincinnati, OH, USA
- Andrew Redington
- Department of Pediatrics, College of Medicine, Heart Institute, Cincinnati Children's Hospital Medical Center, University of Cincinnati, Cincinnati, OH, USA
- Colin J. McMahon
- Department of Paediatric Cardiology, Children's Health Ireland at Crumlin, Crumlin, Dublin, Ireland; School of Medicine, University College Dublin, Belfield, Dublin 4, Ireland; School of Health Professions Education, Maastricht University, Maastricht, Netherlands
11. Early Outcomes from a Pediatric Education Research Unit. J Pediatr 2022; 249:3-5.e1. [DOI: 10.1016/j.jpeds.2022.02.044]
12. Pearce J. What do student experiences of programmatic assessment tell us about scoring programmatic assessment data? Med Educ 2022; 56:872-875. [PMID: 35698736] [DOI: 10.1111/medu.14852]
Affiliation(s)
- Jacob Pearce
- Australian Council for Educational Research - Tertiary Education (Assessment), Camberwell, Victoria, Australia
13. Reimagining the Clinical Competency Committee to Enhance Education and Prepare for Competency-Based Time-Variable Advancement. J Gen Intern Med 2022; 37:2280-2290. [PMID: 35445932] [PMCID: PMC9021365] [DOI: 10.1007/s11606-022-07515-3]
Abstract
Assessing residents and clinical fellows is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Assessment has historically been underemphasized, although medical education oversight organizations have strengthened requirements in recent years. Growing acceptance of competency-based medical education and its logical extension to competency-based time-variable (CB-TV) graduate medical education (GME) further highlights the importance of implementing effective evidence-based approaches to assessment. The Clinical Competency Committee (CCC) has emerged as a key programmatic structure in graduate medical education. In the context of launching a multi-specialty pilot of CB-TV GME in our health system, we have examined several programs' CCC processes and reviewed the relevant literature to propose enhancements to CCCs. We recommend that all CCCs fulfill three core goals, regularly applied to every GME trainee: (1) discern and describe the resident's developmental status to individualize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We integrate the literature and observations from GME program CCCs in our institutions to evaluate how current CCC processes support or undermine these goals. Obstacles and key enablers are identified. Finally, we recommend ways to achieve the stated goals, including the following: (1) assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; (2) strengthen CCC assessment processes to determine trainee readiness for independent practice; and (3) promote trainee reflection and informed self-assessment. The importance of coaching for competency, robust workplace-based assessments, feedback, and co-production of individualized learning plans is emphasized. Individual programs and their CCCs must strengthen assessment tools and frameworks to realize the potential of competency-oriented education.
|
14
|
Gin BC, Ten Cate O, O'Sullivan PS, Hauer KE, Boscardin C. Exploring how feedback reflects entrustment decisions using artificial intelligence. MEDICAL EDUCATION 2022; 56:303-311. [PMID: 34773415 DOI: 10.1111/medu.14696] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Received: 05/27/2021] [Revised: 11/02/2021] [Accepted: 11/05/2021] [Indexed: 06/13/2023]
Abstract
CONTEXT Clinical supervisors make judgements about how much to trust learners with critical activities in patient care. Such decisions mediate trainees' opportunities for learning and competency development and thus are a critical component of education. As educators apply entrustment frameworks to assessment, it is important to determine how narrative feedback reflecting entrustment may also address learners' educational needs. METHODS In this study, we used artificial intelligence (AI) and natural language processing (NLP) to identify characteristics of feedback tied to supervisors' entrustment decisions during direct observation encounters of clerkship medical students (3328 unique observations). Supervisors conducted observations of students and collaborated with them to complete an entrustment-based assessment in which they documented narrative feedback and assigned an entrustment rating. We trained a deep neural network (DNN) to predict entrustment levels from the narrative data and developed an explainable AI protocol to uncover the latent thematic features the DNN used to make its prediction. RESULTS We found that entrustment levels were associated with level of detail (specific steps for performing clinical tasks), feedback type (constructive versus reinforcing) and task type (procedural versus cognitive). In justifying both high and low levels of entrustment, supervisors detailed concrete steps that trainees performed (or did not yet perform) competently. CONCLUSIONS Framing our results in the factors previously identified as influencing entrustment, we find a focus on performance details related to trainees' clinical competency as opposed to nonspecific feedback on trainee qualities. The entrustment framework reflected in feedback appeared to guide specific goal-setting, combined with details necessary to reach those goals. 
Our NLP methodology can also serve as a starting point for future work on entrustment and feedback as similar assessment datasets accumulate.
Affiliation(s)
- Brian C Gin
- Department of Pediatrics, University of California San Francisco, San Francisco, CA, USA
| | - Olle Ten Cate
- Utrecht Center for Research and Development of Health Professions Education, University Medical Center, Utrecht, The Netherlands
- Department of Medicine, University of California San Francisco, San Francisco, CA, USA
| | - Patricia S O'Sullivan
- Department of Medicine, University of California San Francisco, San Francisco, CA, USA
- Department of Surgery, University of California San Francisco, San Francisco, CA, USA
| | - Karen E Hauer
- Department of Medicine, University of California San Francisco, San Francisco, CA, USA
| | - Christy Boscardin
- Department of Medicine, University of California San Francisco, San Francisco, CA, USA
- Department of Anesthesia, University of California San Francisco, San Francisco, CA, USA
| |
|
15
|
Schumacher DJ, Turner DA. Entrustable Professional Activities: Reflecting on Where We Are to Define a Path for the Next Decade. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:S1-S5. [PMID: 34183594 DOI: 10.1097/acm.0000000000004097] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Indexed: 06/13/2023]
Affiliation(s)
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio
| | - David A Turner
- D.A. Turner is vice president for competency-based medical education, American Board of Pediatrics, Chapel Hill, North Carolina
| |
|