1
Lin YH, Yang YY, Chen CH, Huang CC, Li CP, Shen HC, Yeh HY, Liang JF, Chu SY, Lirng JF, Wang SJ. Can routine EPA-based assessments predict OSCE performances of undergraduate medical students? Medical Teacher 2024:1-10. [PMID: 39400116] [DOI: 10.1080/0142159x.2024.2413024]
Abstract
BACKGROUND The objective structured clinical examination (OSCE) is used worldwide. This study aims to explore potential alternatives to the OSCE by using entrustable professional activities (EPA)-based assessments in the workplace. METHODS This study enrolled 265 six-year undergraduate medical students (UGY) from 2021 to 2023. During their rotations, students were assessed using 13 EPAs, with the grading methods modified to facilitate application. Before graduation, they participated in two mock OSCEs and a National OSCE. We used generalized estimating equations to analyze the associations between the EPA assessments and the OSCE scores, adjusting for age and sex, and developed a prediction model. EPA8 and EPA9, which represent advanced abilities that were not significant in the regression models, were removed from the prediction model. RESULTS Most EPAs were significantly correlated with OSCE scores across the three cohorts. The prediction model for forecasting passing in the three OSCEs demonstrated fair predictive capacity (area under the curve = 0.82, 0.66, and 0.71 for students who graduated in 2021, 2022, and 2023, respectively; all p < 0.05). CONCLUSIONS The workplace-based assessments (EPA) showed a high correlation with competency-based assessments in simulated settings (OSCE). EPAs may serve as alternative tools to the formal OSCE for medical students.
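For readers who want to see what this kind of analysis looks like in practice, the sketch below (not the authors' code) fits a generalized estimating equations model relating workplace EPA ratings to repeated OSCE scores and then scores a simple pass/fail prediction model with the area under the ROC curve. The simulated data and variable names (epa_mean, osce_score, passed) are hypothetical placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_students, n_exams = 200, 3                                     # e.g. two mock OSCEs + one national OSCE
student = np.repeat(np.arange(n_students), n_exams)
epa_mean = np.repeat(rng.normal(3.5, 0.5, n_students), n_exams)  # average workplace EPA rating per student
age = np.repeat(rng.integers(23, 28, n_students), n_exams)
sex = np.repeat(rng.integers(0, 2, n_students), n_exams)
osce = 60 + 8 * epa_mean + rng.normal(0, 5, n_students * n_exams)
df = pd.DataFrame({"student_id": student, "epa_mean": epa_mean, "age": age,
                   "sex": sex, "osce_score": osce, "passed": (osce > 85).astype(int)})

# GEE with an exchangeable working correlation, clustering the repeated OSCE scores per student
gee = smf.gee("osce_score ~ epa_mean + age + sex", groups="student_id",
              data=df, cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.summary())

# A plain logistic model for passing, summarised with the area under the ROC curve
logit = smf.logit("passed ~ epa_mean + age + sex", data=df).fit(disp=0)
print("AUC:", round(roc_auc_score(df["passed"], logit.predict(df)), 2))
```

The exchangeable correlation structure is one simple choice for repeated exams nested within students; other working correlations would fit the same pattern.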
Affiliation(s)
- Yi-Hsuan Lin
- Division of Clinical Skills Training, Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Family Medicine, Taipei Veterans General Hospital, Taipei, Taiwan
- Ying-Ying Yang
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- Chen-Huan Chen
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- Chia-Chang Huang
- Division of Clinical Skills Training, Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Chung-Pin Li
- Division of Clinical Skills Training, Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Hsiao-Chin Shen
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- Hsiao-Yun Yeh
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Family Medicine, Taipei Veterans General Hospital, Taipei, Taiwan
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
- Jen-Feng Liang
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Center of Evidence Based Medicine, Taipei Veterans General Hospital, Taipei, Taiwan
- Shao-Yin Chu
- School of Medicine, Tzu Chi University, Hualien, Taiwan
- Department of Medical Education and Pediatrics, Hualien Tzu Chi Hospital, Buddhist Tzu Chi Medical Foundation, Hualien, Taiwan
- Jiing-Feng Lirng
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Shuu-Jiun Wang
- School of Medicine, National Yang Ming Chiao Tung University, Taipei, Taiwan
- Department of Medical Education, Taipei Veterans General Hospital, Taipei, Taiwan
2
Moore J, Chan T, Doucette J, Lipps T, Slager D. Defining Nurse Practitioner Core Entrustable Professional Activities: Essential Step Toward Competency-Based Education. Nurse Educ 2024; 49:235-240. [PMID: 38857420] [DOI: 10.1097/nne.0000000000001673]
Abstract
BACKGROUND Gaps between educational preparation and clinical practice readiness have led to innovative approaches to competence assessment. Entrustable professional activities (EPAs) show promise as a competence assessment framework in graduate nursing education. PURPOSE This study sought to develop and validate a set of EPAs that reflect the core activities performed by all nurse practitioners (NPs). METHODS Eight EPAs were developed and then validated through a Delphi approach by NP practice experts located across the United States and representing most NP populations. RESULTS Consensus was reached after 2 Delphi rounds. CONCLUSIONS The EPAs developed and validated in this study map multiple advanced-level NP competencies to workplace expectations and provide a shared framework for competency-based workplace assessment among NP preceptors from varied health care professions.
Affiliation(s)
- Jeanne Moore
- Conway School of Nursing, The Catholic University of America, Washington, District of Columbia (Dr Moore); College of Nursing and Health, Madonna University, Livonia, Michigan (Dr Chan); College of Nursing, Rush University, Chicago, Illinois (Dr Doucette); School of Nursing, Beal University, Bangor, Maine (Dr Lipps); and Kirkhof College of Nursing, Grand Valley State University, Allendale, Michigan (Dr Slager)
3
Ryan MS, Gielissen KA, Shin D, Perera RA, Gusic M, Ferenchick G, Ownby A, Cutrer WB, Obeso V, Santen SA. How well do workplace-based assessments support summative entrustment decisions? A multi-institutional generalisability study. Medical Education 2024; 58:825-837. [PMID: 38167833] [DOI: 10.1111/medu.15291]
Abstract
BACKGROUND Assessment of the Core Entrustable Professional Activities for Entering Residency requires direct observation through workplace-based assessments (WBAs). Single-institution studies have demonstrated mixed findings regarding the reliability of WBAs developed to measure student progression towards entrustment. Factors such as faculty development, rater engagement and scale selection have been suggested to improve reliability. The purpose of this investigation was to conduct a multi-institutional generalisability study to determine the influence of specific factors on reliability of WBAs. METHODS The authors analysed WBA data obtained for clerkship-level students across seven institutions from 2018 to 2020. Institutions implemented a variety of strategies including selection of designated assessors (DAs), altered scales and different EPAs. Data were aggregated by these factors. Generalisability theory was then used to examine the internal structure validity evidence of the data. An unbalanced cross-classified random-effects model was used to decompose variance components. A phi coefficient of >0.7 was used as the threshold for acceptable reliability. RESULTS Data from 53 565 WBAs were analysed, and a total of 77 generalisability studies were performed. Most data came from EPAs 1 (n = 17 118, 32%), 2 (n = 10 237, 19.1%), and 6 (n = 6000, 18.5%). Low variance attributed to the learner (<10%) was found for most (59/77, 76%) analyses, resulting in a relatively large number of observations required for reasonable reliability (range = 3 to >560, median = 60). Factors such as DA, scale or EPA were not consistently associated with improved reliability. CONCLUSION The results from this study describe relatively low reliability in the WBAs obtained across seven sites. Generalisability for these instruments may be less dependent on factors such as faculty development, rater engagement or scale selection. When used for formative feedback, data from these instruments may be useful. However, such instruments do not consistently provide reasonable reliability to justify their use in high-stakes summative entrustment decisions.
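The dependability (phi) calculation at the heart of such a generalisability analysis is simple once variance components are in hand. The sketch below is illustrative only, with made-up variance components, and shows why low learner variance drives up the number of observations needed to reach phi > 0.7.

```python
# Dependability (phi) from a G-study: learner variance over learner variance plus
# all remaining variance averaged over the number of observations per learner.
def phi(var_learner: float, var_other: float, n_obs: int) -> float:
    return var_learner / (var_learner + var_other / n_obs)

def n_for_phi(var_learner: float, var_other: float, target: float = 0.7) -> int:
    """Smallest number of observations per learner giving phi >= target."""
    n = 1
    while phi(var_learner, var_other, n) < target and n < 10_000:
        n += 1
    return n

# Illustrative components: the learner explains only 8% of total score variance
print(round(phi(0.08, 0.92, 60), 2))   # dependability with 60 observations (~0.84)
print(n_for_phi(0.08, 0.92))           # observations needed to reach 0.7 (27)
```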
Affiliation(s)
- Michael S Ryan
- Department of Pediatrics, University of Virginia, Charlottesville, Virginia, USA
- School of Health Professions Education, Maastricht University, Maastricht, The Netherlands
- Katherine A Gielissen
- Departments of Medicine and Pediatrics, Emory University School of Medicine, Atlanta, Georgia, USA
- Dongho Shin
- Department of Biostatistics, Virginia Commonwealth University School of Medicine, Richmond, Virginia, USA
- Robert A Perera
- Department of Biostatistics, Virginia Commonwealth University School of Medicine, Richmond, Virginia, USA
- Maryellen Gusic
- Departments of Pediatrics, Biomedical Education and Data Science, Lewis Katz School of Medicine, Philadelphia, Pennsylvania, USA
- Gary Ferenchick
- Department of Medicine, College of Human Medicine, Michigan State University, East Lansing, Michigan, USA
- Allison Ownby
- McGovern Medical School at UTHealth Houston, Houston, Texas, USA
- William B Cutrer
- Department of Pediatrics, Vanderbilt University School of Medicine, Nashville, Tennessee, USA
- Vivian Obeso
- Department of Medical Education, University of Miami Miller School of Medicine, Miami, Florida, USA
- Sally A Santen
- Virginia Commonwealth University School of Medicine, Richmond, Virginia, USA
- Emergency Medicine and Medical Education at University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
4
Mukurunge E, Nyoni CN, Hugo L. Assessment approaches in undergraduate health professions education: towards the development of feasible assessment approaches for low-resource settings. BMC Medical Education 2024; 24:318. [PMID: 38509579] [PMCID: PMC10956342] [DOI: 10.1186/s12909-024-05264-x]
Abstract
BACKGROUND Feasible and effective assessment approaches to measuring competency in health sciences are vital in competency-based education. Educational programmes for health professions in low- and middle-income countries are increasingly adopting competency-based education as a strategy for training health professionals. Importantly, the organisation of assessments and assessment approaches must align with the available resources and still result in the fidelity of implementation. A review of existing assessment approaches, frameworks, models, and methods is essential for the development of feasible and effective assessment approaches in low-resource settings. METHODS Published literature was sourced from 13 electronic databases. The inclusion criteria were literature published in English between 2000 and 2022 about assessment approaches to measuring competency in health science professions. Specific data relating to the aims of each study, its location, population, research design, assessment approaches (including the outcome of implementing such approaches), frameworks, models, and methods were extracted from the included literature. The data were analysed through a multi-step process that integrated quantitative and qualitative approaches. RESULTS Many articles were from the United States and Australia and reported on the development of assessment models. Most of the articles included undergraduate medical or nursing students. A variety of models, theories, and frameworks were reported, including the Ideal model, the Predictive Learning Assessment model, the Amalgamated Student Assessment in Practice (ASAP) model, the Leadership Outcome Assessment (LOA) model, the Reporter-Interpreter-Manager-Educator (RIME) framework, the Quarter model, and the TEMM model, which incorporates four assessment methods: a Triple Jump Test, an Essay incorporating critical thinking questions, a Multistation Integrated Practical Examination, and Multiple Choice Questions. Additional models and frameworks that were used include the Entrustable Professional Activities framework, the System of Assessment framework, the Clinical Reasoning framework (which is embedded in the ASAP model), Earl's Model of Learning, an assessment framework based on the Bayer-Fetzer Kalamazoo Consensus Statement, Bloom's taxonomy, the Canadian Medical Education Directions for Specialists (CanMEDS) framework, the Accreditation Council for Graduate Medical Education (ACGME) framework, the Dreyfus Developmental Framework, and Miller's Pyramid. CONCLUSION An analysis of the assessment approaches, frameworks, models, and methods applied in health professions education lays the foundation for the development of feasible and effective assessment approaches in low-resource settings that integrate competency-based education. TRIAL REGISTRATION This study did not involve any clinical intervention; therefore, trial registration was not required.
Affiliation(s)
- Eva Mukurunge
- School of Nursing, Faculty of Health Sciences, University of the Free State, P.O. Box 339, Bloemfontein, 9300, South Africa.
- Champion N Nyoni
- School of Nursing, Faculty of Health Sciences, University of the Free State, P.O. Box 339, Bloemfontein, 9300, South Africa
- Lizemari Hugo
- School of Nursing, Faculty of Health Sciences, University of the Free State, P.O. Box 339, Bloemfontein, 9300, South Africa
5
Holleran C, Konrad J, Norton B, Burlis T, Ambler S. Use of learner-driven, formative, ad-hoc, prospective assessment of competence in physical therapist clinical education in the United States: a prospective cohort study. Journal of Educational Evaluation for Health Professions 2023; 20:36. [PMID: 38081728] [PMCID: PMC10823263] [DOI: 10.3352/jeehp.2023.20.36]
Abstract
PURPOSE The purpose of this project was to implement a process for learner-driven, formative, prospective, ad-hoc, entrustment assessment in Doctor of Physical Therapy clinical education. Our goals were to develop an innovative entrustment assessment tool, and then explore whether the tool detected (1) differences between learners at different stages of development and (2) differences within learners across the course of a clinical education experience. We also investigated whether there was a relationship between the number of assessments and change in performance. METHODS A prospective, observational, cohort of clinical instructors (CIs) was recruited to perform learner-driven, formative, ad-hoc, prospective, entrustment assessments. Two entrustable professional activities (EPAs) were used: (1) gather a history and perform an examination and (2) implement and modify the plan of care, as needed. CIs provided a rating on the entrustment scale and provided narrative support for their rating. RESULTS Forty-nine learners participated across 4 clinical experiences (CEs), resulting in 453 EPA learner-driven assessments. For both EPAs, statistically significant changes were detected both between learners at different stages of development and within learners across the course of a CE. Improvement within each CE was significantly related to the number of feedback opportunities. CONCLUSION The results of this pilot study provide preliminary support for the use of learner-driven, formative, ad-hoc assessments of competence based on EPAs with a novel entrustment scale. The number of formative assessments requested correlated with change on the EPA scale, suggesting that formative feedback may augment performance improvement.
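As a rough illustration of the final analysis step described above (not the study's code), a rank correlation between the number of learner-driven assessments and the change in entrustment rating could be computed as follows; the data values are hypothetical.

```python
from scipy.stats import spearmanr

n_assessments = [3, 5, 8, 2, 10, 6, 4, 9, 7, 12]                         # feedback opportunities requested
entrustment_change = [0.5, 1.0, 1.5, 0.0, 2.0, 1.0, 0.5, 1.5, 1.0, 2.5]  # end-of-CE minus start-of-CE rating

rho, p = spearmanr(n_assessments, entrustment_change)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```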
Affiliation(s)
- Carey Holleran
- Program in Physical Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Department of Neurology, Washington University School of Medicine, St. Louis, MO, USA
- Jeffrey Konrad
- Program in Physical Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Barbara Norton
- Program in Physical Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Department of Neurology, Washington University School of Medicine, St. Louis, MO, USA
- Tamara Burlis
- Program in Physical Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Department of Internal Medicine, Washington University School of Medicine, St. Louis, MO, USA
- Steven Ambler
- Program in Physical Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Department of Orthopedic Surgery, Washington University School of Medicine, St. Louis, MO, USA
6
Colbert-Getz JM, Lappe K, Gerstenberger J, Milne CK, Raaum S. Capturing growth curves of medical students' clinical skills performance. Clinical Teacher 2023; 20:e13623. [PMID: 37605795] [DOI: 10.1111/tct.13623]
Abstract
INTRODUCTION A benefit of a milestone or Entrustable Professional Activity (EPA) assessment framework is the ability to capture longitudinal performance with growth curves using multi-level modelling (MLM). Growth curves can inform curriculum design and individualised learning. Residency programmes have found growth curves to vary by resident and by milestone. Only one study has analysed medical students' growth curves for EPAs. Analysis of EPA growth curves is critical because no change in performance raises concerns for EPAs as an assessment framework. METHODS Spencer Fox Eccles School of Medicine at the University of Utah students' workplace-based assessment ratings for 7 EPAs were captured at 3 time points in years 3-4 of AY2017-2018 to AY2020-2021. MLM was used to capture EPA growth curves and determine if variation in growth curves was explained by internal medicine (IM) clerkship order. FINDINGS A curvilinear slope significantly captured 256 students' average ratings over time for EPA1a-history-taking, EPA2-clinical reasoning, EPA3-diagnostics, EPA5-documentation and EPA6-presentation, and a linear slope significantly captured EPA9-teamwork ratings, p ≤ 0.001. Growth curves were steepest for EPA2-clinical reasoning and EPA3-diagnostics. Growth curves varied by student, p < 0.05 for all EPA ratings, but IM clerkship rotation order did not significantly explain the variance, p > 0.05. DISCUSSION The increase in ratings from Year 3 to Year 4 provides validity evidence for the use of EPAs in an assessment framework. Students may benefit from more curriculum/skills practice for EPA2-clinical reasoning and EPA3-diagnostics prior to year 3. Variation in students' growth curves is important for coaching and skill development; a one-size-fits-all approach may not suffice.
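The growth-curve approach described above can be sketched with a mixed-effects (multi-level) model that includes linear and quadratic time terms and a random intercept and slope per student. The code below is an illustrative reconstruction with simulated data, not the study's model or dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_students, times = 250, np.array([0, 1, 2])             # three assessment time points
sid = np.repeat(np.arange(n_students), len(times))
t = np.tile(times, n_students)
intercept = rng.normal(2.5, 0.4, n_students)[sid]        # student-specific starting level
slope = rng.normal(0.6, 0.15, n_students)[sid]           # student-specific growth
rating = intercept + slope * t - 0.1 * t**2 + rng.normal(0, 0.2, len(t))
df = pd.DataFrame({"student_id": sid, "time": t, "rating": rating})

# Fixed linear + quadratic time effects; random intercept and random linear slope per student
fit = smf.mixedlm("rating ~ time + I(time ** 2)", data=df,
                  groups="student_id", re_formula="~time").fit()
print(fit.summary())
```

The random-slope variance is what lets growth curves differ by student; whether clerkship order explains that variance would be tested by adding it as a fixed covariate.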
Affiliation(s)
- Jorie M Colbert-Getz
- Department of Internal Medicine, University of Utah School of Medicine, Salt Lake City, Utah, USA
- Katie Lappe
- Department of Internal Medicine, University of Utah School of Medicine, Salt Lake City, Utah, USA
- John Gerstenberger
- Department of Internal Medicine, University of Utah School of Medicine, Salt Lake City, Utah, USA
- Caroline K Milne
- Department of Internal Medicine, University of Utah School of Medicine, Salt Lake City, Utah, USA
- Sonja Raaum
- Department of Internal Medicine, University of Utah School of Medicine, Salt Lake City, Utah, USA
7
Huynh A, Nguyen A, Beyer RS, Harris MH, Hatter MJ, Brown NJ, de Virgilio C, Nahmias J. Fixing a Broken Clerkship Assessment Process: Reflections on Objectivity and Equity Following the USMLE Step 1 Change to Pass/Fail. Academic Medicine 2023; 98:769-774. [PMID: 36780667] [DOI: 10.1097/acm.0000000000005168]
Abstract
Clerkship grading is a core feature of evaluation for medical students' skills as physicians and is considered by most residency program directors to be an indicator of future performance and success. With the transition of the U.S. Medical Licensing Examination Step 1 score to pass/fail, there will likely be even greater reliance on clerkship grades, which raises several important issues that need to be urgently addressed. This article details the current landscape of clerkship grading and the systemic discrepancies in assessment and allocation of honors. The authors examine not only objectivity and fairness in clerkship grading but also the reliability of clerkship grading in predicting residency performance and the potential benefits and drawbacks of adopting a pass/fail clinical clerkship grading system. To promote a fairer and more equitable residency selection process, grading systems must be standardized, with consideration of explicit grading criteria, grading committees, and/or structured education of evaluators and assessors regarding implicit bias. In addition, greater adherence to and enforcement of transparency in grade distributions in the Medical Student Performance Evaluation is needed. These changes have the potential to level the playing field, foster equitable comparisons, and ultimately add more fairness to the residency selection process.
Affiliation(s)
- Ashley Huynh
- A. Huynh is a first-year medical student, University of California, Irvine, School of Medicine, Irvine, California; ORCID: https://orcid.org/0000-0002-4413-6829
- Andrew Nguyen
- A. Nguyen is a first-year medical student, University of Florida College of Medicine, Gainesville, Florida; ORCID: https://orcid.org/0000-0002-8131-150X
- Ryan S Beyer
- R.S. Beyer is a second-year medical student, University of California, Irvine, School of Medicine, Irvine, California; ORCID: https://orcid.org/0000-0002-0283-3749
- Mark H Harris
- M.H. Harris is a second-year medical student, University of California, Irvine, School of Medicine, Irvine, California; ORCID: https://orcid.org/0000-0002-1598-225X
- Matthew J Hatter
- M.J. Hatter is a second-year medical student, University of California, Irvine, School of Medicine, Irvine, California; ORCID: https://orcid.org/0000-0003-2922-6196
- Nolan J Brown
- N.J. Brown is a fourth-year medical student, University of California, Irvine, School of Medicine, Irvine, California; ORCID: https://orcid.org/0000-0002-6025-346X
- Christian de Virgilio
- C. de Virgilio is professor of surgery, Harbor-UCLA Medical Center, Torrance, California
- Jeffry Nahmias
- J. Nahmias is professor of trauma, burns, surgical critical care, and acute care surgery, University of California, Irvine, School of Medicine, Irvine, California; ORCID: https://orcid.org/0000-0003-0094-571X
8
Beckett RD, Gratz MA, Marwitz KK, Hanson KM, Isch J, Robison HD. Development, Validation, and Reliability of a P1 Objective Structured Clinical Examination Assessing the National EPAs. American Journal of Pharmaceutical Education 2023; 87:100054. [PMID: 37316140] [DOI: 10.1016/j.ajpe.2023.100054]
Abstract
OBJECTIVE To document the performance of first-year pharmacy students on a revised objective structured clinical examination (OSCE) based on national entrustable professional activities, identify risk factors for poor performance, and assess its validity and reliability. METHODS A working group developed the OSCE to verify students' progress toward readiness for advanced pharmacy practice experiences at the L1 level of entrustment (ready for thoughtful observation) on the national entrustable professional activities, with stations cross-mapped to the Accreditation Council for Pharmacy Education educational outcomes. Baseline characteristics and academic performance were used to investigate risk factors for poor performance and validity, respectively, by comparing students who were successful on the first attempt with those who were not. Reliability was evaluated using re-grading by a blinded, independent grader and analyzed using Cohen's kappa. RESULTS A total of 65 students completed the OSCE. Of these, 33 (50.8%) successfully completed all stations on the first attempt, and 32 (49.2%) had to re-attempt at least 1 station. Successful students had higher Health Sciences Reasoning Test scores (mean difference 5, 95% CI 2-9). First professional year grade point average was higher for students who passed all stations on the first attempt (mean difference 0.4 on a 4-point scale, 95% CI 0.1-0.7). When evaluated in a multiple logistic regression, no differences between groups were statistically significant. Most kappa values were above 0.4 (range 0.404-0.708), suggesting moderate to substantial reliability. CONCLUSION Although predictors of poor performance were not identified when accounting for covariates, the OSCE was found to have good validity and reliability.
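The reliability check described above rests on Cohen's kappa between the original grader and the blinded regrader; a minimal, self-contained sketch (with hypothetical pass/fail calls, not study data) is shown below.

```python
from sklearn.metrics import cohen_kappa_score

grader_original = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1]   # 1 = pass, 0 = fail
grader_blinded  = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1]

kappa = cohen_kappa_score(grader_original, grader_blinded)
print(f"Cohen's kappa: {kappa:.3f}")   # ~0.62 here; values above 0.4 are commonly read as at least moderate agreement
```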
Affiliation(s)
- Melissa A Gratz
- Manchester University College of Health Sciences and Pharmacy, Fort Wayne, IN, USA; Lutheran Hospital, Fort Wayne, IN, USA
- Kathryn K Marwitz
- Manchester University College of Health Sciences and Pharmacy, Fort Wayne, IN, USA
- Kierstan M Hanson
- Manchester University College of Health Sciences and Pharmacy, Fort Wayne, IN, USA
- Jason Isch
- Manchester University College of Health Sciences and Pharmacy, Fort Wayne, IN, USA; Saint Joseph Health System, Mishawaka, IN, USA
9
Hanna K, Gupta S, Hurst R, McKeon BA, DeWaay D. Specialty-Specific Entrustable Professional Activities: A Bridge to Internship. Cureus 2023; 15:e35547. [PMID: 37007399] [PMCID: PMC10057663] [DOI: 10.7759/cureus.35547]
Abstract
Background Undergraduate medical education aims to prepare learners to become capable residents. New interns are expected to perform clinical tasks under distant supervision on the strength of having acquired a medical degree. However, there are limited data comparing the entrustment that residency programs grant with what medical schools believe they have trained their graduates to perform. At our institution, we sought to foster an alliance between undergraduate medical education (UME) and graduate medical education (GME) toward specialty-specific entrustable professional activities (SSEPAs). These SSEPAs create a bridge to residency and help students structure the final year of medical school while striving for entrustability for day one of residency. This paper describes the SSEPA curriculum development process and student self-assessment of competence. Methodology We piloted an SSEPA program with the departments of Family Medicine, Internal Medicine, Neurology, and Obstetrics & Gynecology. Utilizing Kern's curriculum development framework, each specialty designed a longitudinal curriculum with a post-match capstone course. Students participated in pre-course and post-course self-assessments utilizing the Chen scale for each entrustable professional activity (EPA). Results A total of 42 students successfully completed the SSEPA curriculum in these four specialties. Students' self-assessed competence levels rose from 2.61 to 3.65 in Internal Medicine; 3.23 to 4.12 in Obstetrics and Gynecology; 3.62 to 4.13 in Neurology; and 3.65 to 3.79 in Family Medicine. Students across all specialties noted an increase in confidence: from 3.45 to 4.38 in Internal Medicine; 3.3 to 4.6 in Obstetrics and Gynecology; 3.25 to 4.25 in Neurology; and 4.33 to 4.67 in Family Medicine. Conclusions A specialty-specific curriculum utilizing a competency-based framework for learners traversing the UME to GME journey in the final year of medical school improves learner confidence in their clinical abilities and may lead to an improved educational handoff between UME and GME.
10
Edmiston N, Hu W, Tobin S, Bailey J, Joyce C, Reed K, Mogensen L. "You're actually part of the team": a qualitative study of a novel transitional role from medical student to doctor. BMC Medical Education 2023; 23:112. [PMID: 36793053] [PMCID: PMC9930018] [DOI: 10.1186/s12909-023-04084-9]
Abstract
BACKGROUND Optimizing the transition from the final year of medical school into the first postgraduate year has important implications for students, patients and the health care system. Student experiences during novel transitional roles can provide insights into potential opportunities for final year curricula. We explored the experiences of medical students in a novel transitional role and their ability to continue learning whilst working as part of a medical team. METHODS Novel transitional roles for final-year medical students were created in partnership between medical schools and state health departments in 2020 in response to the COVID-19 pandemic and the need for a medical surge workforce. Final year medical students from an undergraduate entry medical school were employed as Assistants in Medicine (AiMs) in urban and regional hospitals. A qualitative study with semi-structured interviews at two time points was used to obtain experiences of the role from 26 AiMs. Transcripts were analyzed using deductive thematic analysis with Activity theory as a conceptual lens. RESULTS This unique role was defined by the objective of supporting the hospital team. Experiential learning opportunities in patient management were optimized when AiMs had opportunities to contribute meaningfully. Team structure and access to the key instrument, the electronic medical record, enabled participants to contribute meaningfully, whilst contractual arrangements and payments formalized the obligations to contribute. CONCLUSIONS The experiential nature of the role was facilitated by organizational factors. Structuring teams to involve a dedicated medical assistant position with specific duties and access to the electronic medical record sufficient to complete duties are key to successful transitional roles. Both should be considered when designing transitional roles as placements for final year medical students.
Affiliation(s)
- Natalie Edmiston
- School of Medicine, Western Sydney University, Sydney, Australia.
- University Centre for Rural Health, Lismore, Australia.
- Wendy Hu
- School of Medicine, Western Sydney University, Sydney, Australia
- Stephen Tobin
- School of Medicine, Western Sydney University, Sydney, Australia
- Jannine Bailey
- School of Medicine, Western Sydney University, Sydney, Australia
- Caroline Joyce
- School of Medicine, Western Sydney University, Sydney, Australia
- Krista Reed
- School of Medicine, Western Sydney University, Sydney, Australia
- Lise Mogensen
- School of Medicine, Western Sydney University, Sydney, Australia
11
Kuehl SE, Spicer JO. Using entrustable professional activities to better prepare students for their postgraduate medical training: A medical student's perspective. Perspectives on Medical Education 2022; 11:359-364. [PMID: 36441351] [PMCID: PMC9743878] [DOI: 10.1007/s40037-022-00731-x]
Abstract
THE PROBLEM Medical students graduate underprepared for postgraduate medical training despite years of classroom and clinical training. In this article, a medical student shares her personal perspectives on three factors contributing to this problem in undergraduate medical education: students' peripheral roles in the clinical environment impede learning, students receive inadequate feedback, and assessments do not measure desired learning outcomes. A SOLUTION The authors describe how using entrustable professional activities (EPAs) could address these issues and promote students' clinical engagement by clarifying their roles, providing them with frequent and actionable feedback, and aligning their assessments with authentic work. These factors combined with grading schemes rewarding improvement could contribute to a growth mindset that reprioritizes clinical skill acquisition. The authors explore how medical schools have begun implementing the EPA framework, highlight insights from these efforts, and describe barriers that must be addressed. THE FUTURE Incorporating EPAs into medical school curricula could better prepare students for postgraduate training while also alleviating issues that contribute to student burnout by defining students' roles, improving feedback, and aligning assessments with desired learning outcomes.
Affiliation(s)
- Sarah E Kuehl
- Emory University School of Medicine and Goizueta Business School, Atlanta, GA, USA.
- Jennifer O Spicer
- J. Willis Hurst Internal Medicine Residency Program, Division of Infectious Diseases, Department of Medicine, Emory University School of Medicine, Atlanta, GA, USA
12
Read EK, Maxey C, Hecker KG. Longitudinal assessment of competency development at The Ohio State University using the competency-based veterinary education (CBVE) model. Front Vet Sci 2022; 9:1019305. [PMID: 36387400] [PMCID: PMC9642912] [DOI: 10.3389/fvets.2022.1019305]
Abstract
With the development of the American Association of Veterinary Medical Colleges' Competency-Based Veterinary Education (CBVE) model, veterinary schools are reorganizing curricula and assessment guidelines, especially within the clinical rotation training elements. Specifically, programs are utilizing both competencies and entrustable professional activities (EPAs) as opportunities for gathering information about student development within and across clinical rotations. However, what evidence exists that use of the central tenets of the CBVE model (competency framework, milestones and EPAs) improves our assessment practices and captures reliable and valid data to track competency development of students as they progress through their clinical year? Here, we report on validity evidence to support the use of scores from in-training evaluation report forms (ITERs) and workplace-based assessments of EPAs to evaluate competency progression within and across domains described in the CBVE, during the final-year clinical training period of The Ohio State University's College of Veterinary Medicine (OSU-CVM) program. The ITER, used at the conclusion of each rotation, was modified to include the CBVE competencies that were assessed by identifying the stage of student development on a series of descriptive milestones (from pre-novice to competent). Workplace-based assessments containing entrustment scales were used to assess EPAs from the CBVE model within each clinical rotation. Competency progression and entrustment scores were evaluated on each of the 31 rotations offered, and high-stakes decisions regarding student performance were determined by a collective review of all the ITERs and EPAs recorded for each learner across each semester and the entire year. Results from the class of 2021, collected on approximately 190 students across 31 rotations, are reported, comprising more than 55 299 competency assessments with milestone placement and 2799 completed EPA assessments. Approximately 10% of the class was identified for remediation and received additional coaching support. Data collected longitudinally through the ITER on milestones provide initial validity evidence to support using the scores in higher-stakes contexts, such as identifying students for remediation and determining whether students have met the necessary requirements to successfully complete the program. Data collected on entrustment scores did not, however, support such decision making. Implications are discussed.
Affiliation(s)
- Emma K. Read
- College of Veterinary Medicine, The Ohio State University, Columbus, OH, United States
- Connor Maxey
- Faculty of Veterinary Medicine, University of Calgary, Calgary, AB, Canada
- Kent G. Hecker
- Faculty of Veterinary Medicine, University of Calgary, Calgary, AB, Canada
- International Council for Veterinary Assessment, Bismarck, ND, United States
13
Dzioba C, LaManna J, Perry CK, Toerber-Clark J, Boehning A, O'Rourke J, Rutledge C. Telehealth Competencies: Leveled for Continuous Advanced Practice Nurse Development. Nurse Educ 2022; 47:293-297. [PMID: 35404870] [DOI: 10.1097/nne.0000000000001196]
Abstract
BACKGROUND The COVID-19 pandemic spurred a rapid uptake of telehealth utilization, with advanced practice registered nurses (APRNs) at the forefront of telehealth care delivery. To advance training of nurse practitioners and support curricular development, essential APRN student competencies in telehealth were developed. PROBLEM Although telehealth competencies have been developed, little is understood about their evaluation across the curricula. Moving to competency-based nursing education involves leveling broad competencies into subcompetencies, including those for telehealth. Subcompetencies support frequent, multimodal evaluation of student progress across APRN curricula. APPROACH Adapting Benner's Novice to Expert Theory, faculty experts in telehealth and graduate nursing education used an iterative process to develop and level subcompetencies aligned with the Four Ps of Telehealth framework. OUTCOMES Telehealth subcompetencies were leveled for preclinical and clinical rotations and for readiness for practice. CONCLUSIONS The leveled subcompetencies, aligned with the Four Ps of Telehealth framework, will support APRN faculty in diverse programs as they implement competency-based education in telehealth.
Affiliation(s)
- Christina Dzioba
- Assistant Professor (Dr Dzioba), School of Nursing, Florida Gulf Coast University, Fort Myers; Associate Professor (Dr LaManna) and Program Chair, School of Nursing, University of Central Florida, Orlando; Professor (Dr Perry), School of Nursing, Oregon Health & Science University, Portland; Assistant Professor (Dr Toerber-Clark), School of Nursing, Washburn University, Topeka, Kansas; Assistant Professor (Dr Boehning), Department of Nursing CSU Bakersfield, Bakersfield, California; Assistant Professor Tenure Track (Dr O'Rourke), School of Nursing, Loyola University, Chicago, Illinois; Professor (Dr Rutledge), Associate Chair, School of Nursing, and Co-Director of Center for Telehealth Innovation, Education, and Research (C-TIER), Old Dominion University, Virginia Beach
14
Porter S, Prendiville E, Allen BFS, Booth G, Boublik J, Burnett GW, Elkassabany N, Hausman J, Klesius L, Le-Wendling L, Machi AT, Maniker R, Parra M, Rosenquist R, Spofford CM, Suresh S, Tedore T, Wilson EH, Zhou JY, Woodworth G. Development of entrustable professional activities for regional anesthesia and pain medicine fellowship training. Reg Anesth Pain Med 2022; 47:rapm-2022-103854. [PMID: 35878963] [DOI: 10.1136/rapm-2022-103854]
Abstract
INTRODUCTION The Accreditation Council for Graduate Medical Education (ACGME) offers descriptions of competencies and milestones but does not provide standardized assessments to track trainee competency. Entrustable professional activities (EPAs) and special assessments (SAs) are emerging methods to assess the level of competency obtained by regional anesthesiology and acute pain medicine (RAAPM) fellows. METHODS A panel of RAAPM physicians with experience in education and competency assessment and one medical student were recruited to participate in a modified Delphi method with iterative rounds to reach consensus on: a list of EPAs, SAs, and procedural skills; detailed definitions for each EPA and SA; a mapping of the EPAs and SAs to the ACGME milestones; and a target level of entrustment for graduating US RAAPM fellows for each EPA and procedural skill. A gap analysis was performed and a heat map was created to cross-check the EPAs and SAs to the ACGME milestones. RESULTS Participants in EPA and SA development included 19 physicians and 1 medical student from 18 different programs. The Delphi rounds yielded a final list of 23 EPAs, a defined entrustment scale, mapping of the EPAs to ACGME milestones, and graduation targets. A list of 73 procedural skills and 7 SAs were similarly developed. DISCUSSION A list of 23 RAAPM EPAs, 73 procedural skills, and 7 SAs were created using a rigorous methodology to reach consensus. This framework can be utilized to help assess RAAPM fellows in the USA for competency and allow for meaningful performance feedback.
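Consensus in a modified Delphi process of this kind is typically operationalised as a per-item agreement threshold; the sketch below (with a hypothetical 80% threshold and toy votes, not the panel's actual criteria) shows the basic bookkeeping for one round.

```python
import pandas as pd

votes = pd.DataFrame({
    "epa":       ["EPA1", "EPA1", "EPA1", "EPA2", "EPA2", "EPA2", "EPA3", "EPA3", "EPA3"],
    "panellist": ["p1", "p2", "p3", "p1", "p2", "p3", "p1", "p2", "p3"],
    "essential": [1, 1, 1, 1, 0, 1, 0, 0, 1],   # 1 = rated essential/keep, 0 = not
})

agreement = votes.groupby("epa")["essential"].mean()     # proportion of panellists in favour
consensus = agreement[agreement >= 0.8]                  # hypothetical 80% consensus threshold
print(agreement)
print("Items reaching consensus this round:", list(consensus.index))
```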
Affiliation(s)
- Steven Porter
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic Florida, Jacksonville, Florida, USA
- Elaine Prendiville
- Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, Oregon, USA
- Gregory Booth
- Department of Anesthesiology and Pain Medicine, Naval Medical Center Portsmouth, Portsmouth, Virginia, USA
- Jan Boublik
- Anesthesiology, Stanford Hospital and Clinics, Stanford, California, USA
- Garrett W Burnett
- Anesthesiology, Perioperative & Pain Medicine, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Nabil Elkassabany
- Department of Anesthesiology and Critical Care, University of Pennsylvania Health System, Philadelphia, Pennsylvania, USA
- Jonathan Hausman
- Department of Anesthesiology, Cedars-Sinai Medical Center, Los Angeles, California, USA
- Lisa Klesius
- Department of Anesthesiology, University of Wisconsin System, Madison, Wisconsin, USA
- Anthony T Machi
- Anesthesiology and Pain Management, University of Texas Southwestern Medical Center at Dallas, Dallas, Texas, USA
- Robert Maniker
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, California, USA
- Department of Anesthesiology, Columbia University Medical Center, New York, New York, USA
- Christina M Spofford
- Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
- Santhanam Suresh
- Pediatric Anesthesiology, Northwestern Medicine, Chicago, Illinois, USA
- Tiffany Tedore
- Anesthesiology, NewYork-Presbyterian Hospital/Weill Cornell Medical Center, New York, New York, USA
- Elizabeth H Wilson
- Department of Anesthesiology, University of Wisconsin-Madison School of Medicine and Public Health, Madison, Wisconsin, USA
- Jon Yan Zhou
- Anesthesiology and Pain Medicine, University of California Davis Health System, Sacramento, California, USA
- Glenn Woodworth
- Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, Oregon, USA
15
Hobday PM, Borman-Shoap E, Cullen MJ, Englander R, Murray KE. The Minnesota Method: A Learner-Driven, Entrustable Professional Activity-Based Comprehensive Program of Assessment for Medical Students. Academic Medicine 2021; 96:S50-S55. [PMID: 34183602] [DOI: 10.1097/acm.0000000000004101]
Abstract
PROBLEM Assessment has been the Achilles heel of competency-based medical education. It requires a program of assessment in which outcomes are clearly defined and students know where they are in developing the competencies and what the next steps are to attain them. Achieving this goal in a feasible manner has been elusive with traditional assessment methods alone. The Education in Pediatrics Across the Continuum (EPAC) program at the University of Minnesota developed a robust program of assessment that has utility and recognizes when students are ready for the undergraduate-to-graduate medical education transition. APPROACH The authors developed a learner-driven program of assessment in the foundational clinical training of medical students in the EPAC program based on the Core Entrustable Professional Activities for Entering Residency (Core EPAs). Frequent workplace-based assessments, coupled with summative assessments, informed a quarterly clinical competency committee and individualized learning plans. The data were displayed on real-time dashboards for the students to review. OUTCOMES Over 4 cohorts from 2015 to 2019, students (n = 13) averaged approximately 200 discrete Core EPA workplace-based assessments during their foundational clinical training year. Assessments were completed by an average of 9 different preceptors each month across 8 different specialties. The data were displayed in a way that allowed students and faculty to monitor development and informed a clinical competency committee's determination of readiness to transition to advanced clinical rotations and residency. NEXT STEPS The next steps include continuing to scale the program of assessment to a larger cohort of students.
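A program of assessment like this ultimately reduces to aggregating many workplace-based assessment records per student and EPA for committee review; the pandas sketch below (hypothetical column names and data, not the EPAC system) illustrates the kind of summary a dashboard might display.

```python
import pandas as pd

wba = pd.DataFrame({
    "student":   ["A", "A", "A", "B", "B", "B"],
    "epa":       ["EPA1", "EPA1", "EPA2", "EPA1", "EPA2", "EPA2"],
    "preceptor": ["P1", "P2", "P3", "P1", "P4", "P5"],
    "specialty": ["IM", "Peds", "IM", "FM", "Peds", "Surg"],
    "rating":    [2, 3, 3, 2, 2, 4],                 # supervision/entrustment scale rating
})

# Per-student, per-EPA summary a clinical competency committee might review
summary = (wba.groupby(["student", "epa"])
              .agg(n_assessments=("rating", "size"),
                   mean_rating=("rating", "mean"),
                   n_preceptors=("preceptor", "nunique"),
                   n_specialties=("specialty", "nunique"))
              .reset_index())
print(summary)
```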
Affiliation(s)
- Patricia M Hobday
- P.M. Hobday is assistant professor, course director, Education in Pediatrics Across the Continuum (EPAC), and associate pediatric residency program director, Department of Pediatrics, University of Minnesota Medical School, Minneapolis, Minnesota
- Emily Borman-Shoap
- E. Borman-Shoap is vice chair of education, pediatric residency program director, and associate professor, Department of Pediatrics, University of Minnesota Medical School, Minneapolis, Minnesota
- Michael J Cullen
- M.J. Cullen is director of evaluation for graduate medical education, University of Minnesota Medical School, Minneapolis, Minnesota
- Robert Englander
- R. Englander is associate dean for undergraduate medical education and professor, Department of Pediatrics, University of Minnesota Medical School, Minneapolis, Minnesota
- Katherine E Murray
- K.E. Murray is assistant dean for curriculum and associate professor, Department of Pediatrics, University of Minnesota Medical School, Minneapolis, Minnesota
16
Schumacher DJ, Turner DA. Entrustable Professional Activities: Reflecting on Where We Are to Define a Path for the Next Decade. Academic Medicine 2021; 96:S1-S5. [PMID: 34183594] [DOI: 10.1097/acm.0000000000004097]
Affiliation(s)
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio
- David A Turner
- D.A. Turner is vice president for competency-based medical education, American Board of Pediatrics, Chapel Hill, North Carolina