1. Stuempfig ND, Cueva J, MacConaghy L, Alexeeva M, Moffet P. Are First-Year Emergency Medicine Residents Still Behind on Level 1 Care-Based Milestones? Cureus 2023;15:e49842. [PMID: 38164295; PMCID: PMC10758298; DOI: 10.7759/cureus.49842]
Abstract
Background The Accreditation Council for Graduate Medical Education defines Level 1 as "the resident demonstrates milestones expected of an incoming resident," yet a previous study of emergency medicine (EM) interns showed most were not meeting Level 1 milestones. In addition, previous research indicates that residents often provide more favorable self-assessments than faculty do. Our study, performed in July 2022, aims to determine whether incoming EM residents remain behind on Level 1 care-based milestones and whether resident self-assessments are consistent with faculty assessments. Methodology This is an observational study involving five distinct EM residency programs. Incoming interns were directly assessed by faculty for behaviors associated with the care-based milestones for EM using a standardized survey. Interns were asked to complete this same survey regarding their own performance. Results Faculty completed a total of 101 assessments on 49 residents. Of the 49 residents, 39 completed self-evaluations (80%). Achievement of Level 1 ranged from 25% to 82%. Residents had significantly higher self-assessments than faculty assessments on PC-1, PC-5a, and PC-6a. Faculty assessments were significantly higher than resident self-assessments on PC-6b. Conclusions More than 75% of incoming interns were able to meet Level 1 in only three of seven care-based milestones; however, there is a generalized trend toward overall improvement when compared with previous studies. Residents continued to rate themselves higher than faculty did on three care-based milestones, and faculty rated residents significantly higher on one care-based milestone. This is consistent with previous studies.
Affiliation(s)
- Julie Cueva
- Emergency Medicine, Maimonides Medical Center, Brooklyn, USA
- Peter Moffet
- Emergency Medicine, Virginia Commonwealth University, Richmond, USA

2. Beeson MS, Barton MA, Reisdorff EJ, Carter WA, Gausche‐Hill M, Gorgas DL, Joldersma KB, Santen SA. Comparison of performance data between emergency medicine 1-3 and 1-4 program formats. J Am Coll Emerg Physicians Open 2023;4:e12991. [PMID: 37304857; PMCID: PMC10257037; DOI: 10.1002/emp2.12991]
Abstract
Objective This study compares performance data from physicians completing 3-year versus 4-year emergency medicine residency training programs. Currently, there are 2 training formats, and little is known about objective performance differences. Methods This was a retrospective cross-sectional analysis of emergency medicine residents and physicians. Multiple analyses compared physicians' performances, including Accreditation Council for Graduate Medical Education Milestones and American Board of Emergency Medicine In-training Examination (ITE), Qualifying Examination (QE), and Oral Certification Examination (OCE) results, as well as program extensions, from 3-year and 4-year residency programs. Some confounding variables were not, or could not be, considered, such as the rationale for medical students to choose one format over another, as well as application and final match rates. Results Milestone scores are higher for emergency medicine 3 residents in 1-3 programs (3.51) versus emergency medicine 3 residents in 1-4 programs (3.07; P < 0.001, d = 1.47) and highest for emergency medicine 4 residents (3.67). There was no significant difference in program extension rates (emergency medicine 1-3, 8.1%; emergency medicine 1-4, 9.6%; P = 0.05, ω = 0.02). ITE scores were higher for emergency medicine 1, 2, and 3 residents from 1-3 programs, and emergency medicine 4 residents from 1-4 programs scored highest. Mean QE score was slightly higher for emergency medicine 1-3 physicians (83.55 vs 83.00; P < 0.01, d = 0.10). QE pass rate was higher for emergency medicine 1-3 physicians (93.1% vs 90.8%; P < 0.001, ω = 0.08). Mean OCE score was slightly higher for emergency medicine 1-4 physicians (5.67 vs 5.65; P = 0.03, d = -0.07) but did not reach a priori statistical significance (α < 0.01). OCE pass rate was also slightly higher for emergency medicine 1-4 physicians (96.9% vs 95.5%; P = 0.06, ω = -0.07) but also non-significant.
Conclusions These results suggest that although performance measures demonstrate small differences between physicians from emergency medicine 1-3 and 1-4 programs, these differences cannot support causal claims about performance on the basis of program format alone.
Affiliation(s)
- Wallace A. Carter
- Department of Emergency Medicine, Weill Cornell Medicine, New York, New York, USA
- Marianne Gausche‐Hill
- Department of Emergency Medicine, Harbor‐University of California Los Angeles, Los Angeles County Emergency Medical Services Agency, Los Angeles, California, USA
- Diane L. Gorgas
- Department of Emergency Medicine, The Ohio State Wexner Medical Center, Columbus, Ohio, USA
- Sally A. Santen
- Department of Emergency Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio, and Department of Emergency Medicine, Virginia Commonwealth University, Richmond, Virginia, USA

3. Meguerdichian DA, Huancahuari N, Pozner CN, Eyre A, Schuur J, Yule S. Evaluating Nontechnical Skills in US Emergency Departments Using Simulation: Validating and Contextualizing a UK Assessment Tool. Simul Healthc 2022;17:104-111. [PMID: 34009906; DOI: 10.1097/sih.0000000000000567]
Abstract
INTRODUCTION Nontechnical skills (NTS) in medicine are the "cognitive, social, and personal resource skills that complement technical skills contributing to safe and efficient care." We aimed to (1) evaluate the validity and reliability of a 12-element United Kingdom emergency medicine (EM) NTS assessment tool in the context of United States (US) EM practice and (2) identify behaviors unique to US clinical practice. METHODS This was a mixed-methods study conducted in 2 phases, following Kane's validity framework. The intended use of the NTS tool is to provide formative assessment of US EM physicians (EPs) from a video of simulated clinical encounters. In phase I, a focus group assessed the appropriateness of each aspect of the tool in the context of US EM practice by reviewing and identifying the NTS of an EP in a simulated clinical scenario. In phase II, EPs (N = 208) attending a national EM conference evaluated an EP's behaviors in 1 of 2 video simulations. Reliability in the form of internal consistency was calculated using Cronbach α. All participants suggested exemplar behaviors for the 12 elements in the context of their own clinical practice and generated new assessment elements. RESULTS Internal consistency was acceptable (α > 0.7) for all categories, except teamwork and cooperation. Participants proposed 4 novel behavioral elements and suggested US exemplar behaviors for all 12 original elements. CONCLUSIONS This tool can be used to assess US EPs' NTS for the purpose of formative assessment. Refinement of exemplar behaviors and inclusion of novel US-specific elements may optimize usability.
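The reliability statistic this validation study leans on, Cronbach's α, reduces to a short calculation over a raters-by-items score matrix. The sketch below is illustrative only: the 6×4 rating matrix is hypothetical, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a raters-by-items score matrix."""
    k = items.shape[1]                         # number of items in the category
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each rater's total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 6 raters scoring a 4-element NTS category on a 1-5 scale.
ratings = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 3, 2],
    [4, 5, 4, 4],
    [3, 3, 3, 3],
])
alpha = cronbach_alpha(ratings)
print(f"Cronbach's α = {alpha:.2f}")  # above the 0.7 threshold the study uses
```

The α > 0.7 cutoff the authors apply is a common rule of thumb for acceptable internal consistency, not a property of the formula itself.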
Affiliation(s)
- David A Meguerdichian
- From the Neil and Elise Wallace STRATUS Center for Medical Simulation (D.A.M., C.N.P., A.E., S.Y.), and Department of Emergency Medicine (D.A.M., N.H., C.N.P., A.E., J.S.), Brigham and Women's Hospital, Harvard Medical School, Boston, MA; Department of Emergency Medicine, Rhode Island Hospital, Alpert Medical School at Brown University, Providence, RI (J.S.); Department of Surgery (S.Y.), Brigham and Women's Hospital, Harvard Medical School, Boston, MA; and Department of Clinical Surgery (S.Y.), The University of Edinburgh, Edinburgh, United Kingdom

4. Santen SA, Ryan MS, Coates WC. What Can a Pandemic Teach Us About Competency-based Medical Education? AEM Educ Train 2020;4:301-305. [PMID: 32704603; PMCID: PMC7369495; DOI: 10.1002/aet2.10473]
Affiliation(s)
- Sally A. Santen
- Virginia Commonwealth University School of Medicine, Richmond, VA
- Michael S. Ryan
- Virginia Commonwealth University School of Medicine, Richmond, VA
- Wendy C. Coates
- Harbor‐UCLA Medical Center, Torrance, CA
- David Geffen School of Medicine at University of California, Los Angeles, Los Angeles, CA

5. Santen SA, Yamazaki K, Holmboe ES, Yarris LM, Hamstra SJ. Comparison of Male and Female Resident Milestone Assessments During Emergency Medicine Residency Training: A National Study. Acad Med 2020;95:263-268. [PMID: 31517688; PMCID: PMC7004441; DOI: 10.1097/acm.0000000000002988]
Abstract
PURPOSE A previous study found that milestone ratings at the end of training were higher for male than for female residents in emergency medicine (EM). However, that study was restricted to a sample of 8 EM residency programs and used individual faculty ratings from milestone reporting forms that were designed for use by the program's Clinical Competency Committee (CCC). The objective of this study was to investigate whether similar results would be found when examining the entire national cohort of EM milestone ratings reported by programs after CCC consensus review. METHOD This study examined longitudinal milestone ratings for all EM residents (n = 1,363; 125 programs) reported to the Accreditation Council for Graduate Medical Education every 6 months from 2014 to 2017. A multilevel linear regression model was used to estimate differences in slope for all subcompetencies, and predicted marginal means between genders were compared at time of graduation. RESULTS There were small but statistically significant differences between males' and females' increase in ratings from initial rating to graduation on 6 of the 22 subcompetencies. Marginal mean comparisons at time of graduation demonstrated gender effects for 4 patient care subcompetencies. For these subcompetencies, males were rated as performing better than females; differences ranged from 0.048 to 0.074 milestone ratings. CONCLUSIONS In this national dataset of EM resident milestone assessments by CCCs, males and females were rated similarly at the end of their training for the majority of subcompetencies. Statistically significant but small absolute differences were noted in 4 patient care subcompetencies.
Affiliation(s)
- Sally A. Santen
- S.A. Santen is professor and senior associate dean, Virginia Commonwealth University School of Medicine, Richmond, Virginia; ORCID: https://orcid.org/0000-0002-8327-8002
- Kenji Yamazaki
- K. Yamazaki is senior analyst, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Eric S. Holmboe
- E.S. Holmboe is chief research, milestones development and evaluation officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
- Lalena M. Yarris
- L.M. Yarris is professor and residency director, Department of Emergency Medicine, Oregon Health & Science University, Portland, Oregon
- Stanley J. Hamstra
- S.J. Hamstra is vice president, Department of Research, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois, adjunct professor, Faculty of Education, University of Ottawa, Ottawa, Ontario, Canada, and adjunct professor, Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-0680-366X

6.
Abstract
PURPOSE OF REVIEW One of the major functions of the Accreditation Council for Graduate Medical Education (ACGME) is to accredit all approved residency programs. This accreditation system is based on both common and program-specific requirements that form the foundation of all ACGME-accredited training programs. Embedded within the program requirements are the essential elements of the Competencies and Milestones. In this review article, we hope to provide the reader with an overview of the current Milestones and a preview of what lies ahead. RECENT FINDINGS Milestones for resident education were implemented approximately 7 years ago. The milestones were intended to create a logical trajectory of professional growth that could be measured and tracked for each sub-specialty. However, substantial variability in both content and developmental progression was seen in many specialties. The ACGME has been actively reviewing the Milestones to ensure harmony across all specialties. Much has been learned about the milestones since their implementation. As educators, we need to provide a robust and reproducible system for all to use. The future of resident education, Milestones 2.0, will provide the necessary groundwork for a more user-friendly system that will allow adequate evaluation of our trainees.
Affiliation(s)
- Karim J Hamawy
- Lahey Hospital and Medical Center, 41 Mall Rd., Burlington, MA, 01805, USA
- Laura Edgar
- ACGME, 401 North Michigan Avenue, Suite 2000, Chicago, IL, 60611, USA

7. Hamstra SJ, Yamazaki K, Barton MA, Santen SA, Beeson MS, Holmboe ES. A National Study of Longitudinal Consistency in ACGME Milestone Ratings by Clinical Competency Committees: Exploring an Aspect of Validity in the Assessment of Residents' Competence. Acad Med 2019;94:1522-1531. [PMID: 31169540; PMCID: PMC6760653; DOI: 10.1097/acm.0000000000002820]
Abstract
PURPOSE To investigate whether clinical competency committees (CCCs) were consistent in applying milestone ratings for first-year residents over time or whether ratings increased or decreased. METHOD Beginning in December 2013, the Accreditation Council for Graduate Medical Education (ACGME) initiated a phased-in requirement for reporting milestones; emergency medicine (EM), diagnostic radiology (DR), and urology (UR) were among the earliest reporting specialties. The authors analyzed CCC milestone ratings of first-year residents from 2013 to 2016 from all ACGME-accredited EM, DR, and UR programs for which they had data. The number of first-year residents in these programs ranged from 2,838 to 2,928 over this time period. The program-level average milestone rating for each subcompetency was regressed onto the time of observation using a random coefficient multilevel regression model. RESULTS National average program-level milestone ratings of first-year residents decreased significantly over the observed time period for 32 of the 56 subcompetencies examined. None of the other subcompetencies showed a significant change. National average in-training examination scores for each of the specialties remained essentially unchanged over the time period, suggesting that differences between the cohorts were not likely an explanatory factor. CONCLUSIONS The findings indicate that CCCs tend to become more stringent or maintain consistency in their ratings of beginning residents over time. One explanation for these results is that CCCs may become increasingly comfortable in assigning lower ratings when appropriate. This finding is consistent with an increase in confidence with the milestone rating process and the quality of feedback it provides.
Affiliation(s)
- Stanley J. Hamstra
- S.J. Hamstra is vice president, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois, adjunct professor, Faculty of Education, University of Ottawa, Ottawa, Ontario, Canada, and adjunct professor, Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-0680-366X
- Kenji Yamazaki
- K. Yamazaki is senior analyst, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois
- Melissa A. Barton
- M.A. Barton is director of medical affairs, American Board of Emergency Medicine, East Lansing, Michigan
- Sally A. Santen
- S.A. Santen is professor and senior associate dean, Virginia Commonwealth University School of Medicine, Richmond, Virginia
- Michael S. Beeson
- M.S. Beeson is director, American Board of Emergency Medicine, East Lansing, Michigan, professor, Department of Emergency Medicine, Northeast Ohio Medical University, Rootstown, Ohio, and program director, Department of Emergency Medicine, Summa Health, Akron, Ohio
- Eric S. Holmboe
- E.S. Holmboe is senior vice president, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois

8. Chan TM, Van Dewark K, Sherbino J, Lineberry M. Coaching for Chaos: A Qualitative Study of Instructional Methods for Multipatient Management in the Emergency Department. AEM Educ Train 2019;3:145-155. [PMID: 31008426; PMCID: PMC6457384; DOI: 10.1002/aet2.10312]
Abstract
BACKGROUND Busy environments, like the emergency department (ED), require teachers to develop instructional strategies for coaching trainees to function within these same environments. Few studies have documented the strategies used by emergency physician (EP)-teachers within these busy, chaotic environments, instead emphasizing teaching in more predictable environments such as the outpatient clinic, hospital wards, or operating room. The authors sought to discover what strategies EP-teachers were using and what trainees recalled experiencing when learning to handle these unpredictable, overcrowded, complex, multipatient environments. METHOD An interpretive description study was conducted at multiple teaching hospitals affiliated with McMaster University from July 2014 to May 2015. Participants (10 EP-teachers and 10 junior residents) were asked to recall teaching strategies related to handling ED patient flow. Participants were asked to describe techniques that they used, observed, or experienced as trainees. Two independent coders read through interview transcripts, analyzing these documents inductively and iteratively. RESULTS Two main types of strategies to teach ED management were discovered: 1) workplace-based methods, including both observation and in situ instruction; and 2) principle-based advice. The most often described techniques were workplace-based methods, which included a variety of in situ techniques ranging from conversations to managerial coaching (e.g., collaborative problem-solving of real-life administrative dilemmas). CONCLUSIONS A mix of strategies is used to teach and coach trainees to handle multipatient environments. Further research is required to determine how to optimize the use of these techniques and innovate new strategies to support the learning of these crucial skills.
Affiliation(s)
- Teresa M. Chan
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON
- McMaster program for Education Research, Innovation, and Theory (MERIT), Hamilton, ON
- Kenneth Van Dewark
- Department of Emergency Medicine, University of British Columbia, Vancouver, BC
- Jonathan Sherbino
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON
- McMaster program for Education Research, Innovation, and Theory (MERIT), Hamilton, ON
- Matthew Lineberry
- Simulation Research, Assessment, and Outcomes, Zamierowski Institute for Experiential Learning, and Department of Health Policy and Management, University of Kansas Medical Center, Kansas City, KS

9. A Multicenter Collaboration for Simulation-Based Assessment of ACGME Milestones in Emergency Medicine. Simul Healthc 2019;13:348-355. [PMID: 29620703; DOI: 10.1097/sih.0000000000000291]
Abstract
STATEMENT In 2014, the six allopathic emergency medicine (EM) residency programs in Chicago established an annual, citywide, simulation-based assessment of all postgraduate year 2 EM residents. The cases and corresponding assessment tools were designed by the simulation directors from each of the participating sites. All assessment tools include critical actions that map directly to numerous EM milestones in 11 different subcompetencies. The 2-hour assessments provide opportunities for residents to lead resuscitations of critically ill patients and demonstrate procedural skills, using mannequins and task trainers respectively. More than 80 residents participate annually and their assessment experiences are essentially identical across testing sites. The assessments are completed electronically and comparative performance data are immediately available to program directors.

10. Goyal N, Folt J, Jaskulka B, Baliga S, Slezak M, Schultz LR, Vallee P. Assessment methods and resource requirements for milestone reporting by an emergency medicine clinical competency committee. Med Educ Online 2018;23:1538925. [PMID: 30376785; PMCID: PMC6211216; DOI: 10.1080/10872981.2018.1538925]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education (ACGME) introduced milestones for Emergency Medicine (EM) in 2012. Clinical Competency Committees (CCC) are tasked with assessing residents on milestones and reporting them to the ACGME. Appropriate workflows for CCCs are not well defined. OBJECTIVE Our objective was to compare different approaches to milestone assessment by a CCC, quantify resource requirements for each, and identify the most efficient workflow. DESIGN Three distinct processes for rendering milestone assessments were compared: full milestone assessments (FMA), utilizing all available resident assessment data; ad-hoc milestone assessments (AMA), created by multiple expert educators using their personal assessment of resident performance; and self-assessments (SMA), completed by residents. FMA were selected as the theoretical gold standard. Intraclass correlation coefficients were used to analyze agreement between the different assessment methods. Kendall's coefficient was used to assess inter-rater agreement for the AMA. RESULTS All 13 second-year residents and 7 educational faculty of an urban EM residency program participated in the study in 2013. Substantial or better agreement between FMA and AMA was seen for 8 of the 23 total subcompetencies (PC4, PC8, PC9, PC11, MK, PROF2, ICS2, SBP2), and for 1 subcompetency (SBP1) between FMA and SMA. Multiple AMA for individual residents demonstrated substantial or better interobserver agreement in 3 subcompetencies (PC1, PC2, and PROF2). FMA took longer to complete compared to AMA (80.9 vs. 5.3 min, p < 0.001). CONCLUSIONS Using AMA to evaluate residents on the milestones takes significantly less time than FMA. However, AMA and SMA agree with FMA on only 8 and 1 subcompetencies, respectively. An estimated 23.5 h of faculty time are required each month to fulfill the requirement for semiannual reporting for a residency with 42 trainees.
Affiliation(s)
- Nikhil Goyal
- Department of Emergency Medicine, Henry Ford Health System, Detroit, MI, USA
- Department of Internal Medicine, Henry Ford Health System, Detroit, MI, USA
- Jason Folt
- Department of Emergency Medicine, Henry Ford Health System, Detroit, MI, USA
- Bradley Jaskulka
- Department of Emergency Medicine, Henry Ford Health System, Detroit, MI, USA
- Sudhir Baliga
- Department of Emergency Medicine, Henry Ford Health System, Detroit, MI, USA
- Michelle Slezak
- Department of Emergency Medicine, Henry Ford Health System, Detroit, MI, USA
- Lonni R. Schultz
- Department of Public Health Sciences, Henry Ford Health System, Detroit, MI, USA
- Phyllis Vallee
- Department of Emergency Medicine, Henry Ford Health System, Detroit, MI, USA

11. Miranda FBG, Mazzo A, Alves Pereira-Junior G. Construction and validation of competency frameworks for the training of nurses in emergencies. Rev Lat Am Enfermagem 2018;26:e3061. [PMID: 30379246; PMCID: PMC6206820; DOI: 10.1590/1518-8345.2631-3061]
Abstract
Objective: to build and validate competency frameworks to be developed in the training
of nurses for the care of adult patients in situations of emergency with a
focus on airway, breathing and circulation approach. Method: this is a descriptive and methodological study that took place in three
phases: the first phase consisted in a literature review and a workshop
involving seven experts for the creation of the competency frameworks; in
the second phase, 15 experts selected through the Snowball Technique and
Delphi Technique participated in the face and content validation, with
analysis of the content of the suggestions and calculation of the Content
Validation Index to assess the agreement on the representativeness of each
item; in the third phase, 13 experts participated in the final agreement of
the presented material. Results: the majority of the experts were nurses, with graduation and professional
experience in the theme of the study. Competency frameworks were developed
and validated for the training of nurses in the airway, breathing and
circulation approach. Conclusion: the study made it possible to build and validate competency frameworks. We
highlight its originality and potentialities to guide teachers and
researchers in an efficient and objective way in the practical development
of skills involved in the subject approached.
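The Content Validation Index used in the second phase reduces to a simple proportion. The sketch below is illustrative only, with invented expert ratings, and follows the common convention that scores of 3 or 4 on a 4-point relevance scale count as agreement.

```python
def item_cvi(ratings: list[int]) -> float:
    """Item-level CVI: share of experts rating the item relevant (3 or 4 of 4)."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

# Hypothetical relevance ratings from 15 experts for one framework item.
expert_ratings = [4, 4, 3, 4, 3, 4, 4, 2, 4, 3, 4, 4, 3, 4, 4]
print(f"I-CVI = {item_cvi(expert_ratings):.2f}")  # 14 of 15 experts agreed
```

In validation studies of this kind, items falling below a preset I-CVI threshold (often 0.78 or 0.80 for panels of this size) are typically revised or dropped before the next round.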
Affiliation(s)
- Fernanda Berchelli Girão Miranda
- Universidade de São Paulo, Escola de Enfermagem de Ribeirão Preto, PAHO/WHO Collaborating Centre for Nursing Research Development, Ribeirão Preto, SP, Brazil
- Alessandra Mazzo
- Universidade de São Paulo, Escola de Enfermagem de Ribeirão Preto, PAHO/WHO Collaborating Centre for Nursing Research Development, Ribeirão Preto, SP, Brazil

12. Edgar L, Roberts S, Yaghmour NA, Leep Hunderfund A, Hamstra SJ, Conforti L, Holmboe ES. Competency Crosswalk: A Multispecialty Review of the Accreditation Council for Graduate Medical Education Milestones Across Four Competency Domains. Acad Med 2018;93:1035-1041. [PMID: 29166350; DOI: 10.1097/acm.0000000000002059]
Abstract
PURPOSE To identify common and overlapping themes among the interpersonal and communication skills (ICS), practice-based learning and improvement (PBLI), professionalism (PROF), and systems-based practice (SBP) milestones of the transitional year and 26 specialties. METHOD In May 2017, milestones were accessed from the Accreditation Council for Graduate Medical Education specialties website. A thematic analysis of the ICS, PBLI, PROF, and SBP milestones was performed to determine unique and common themes across these competencies and across specialties. Keywords from the common program requirements were initially applied as codes to the milestones. Codes were then grouped into common themes. RESULTS Twenty-two themes were identified: 15 (68%) were unique to a given competency (3 related to ICS, 4 related to PBLI, 5 related to PROF, and 3 related to SBP), and 7 (32%) appeared in the milestones of more than one core competency. Eleven themes (50%) were used by 20 or more specialties, and 6 themes (27%) by 10 or fewer specialties. No theme was present across all specialties. CONCLUSIONS The ICS, PBLI, PROF, and SBP milestones contain multiple themes with areas of overlap among these four competencies and substantial variability across specialties. This variability may create differential expectations of residents across specialties, complicate faculty development, and make sharing assessment tools difficult. The thematic analysis provides important insights into how individual specialties interpret and operationalize the ICS, PBLI, PROF, and SBP competency domains and can inform future revisions of milestones to enable harmonization and shared understanding of these competencies across specialties where appropriate.
Affiliation(s)
- Laura Edgar
- L. Edgar is executive director for milestone development, Accreditation Council for Graduate Medical Education, Chicago, Illinois. S. Roberts is milestones project manager, Accreditation Council for Graduate Medical Education, Chicago, Illinois. N.A. Yaghmour is research associate for milestones evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois. A. Leep Hunderfund is assistant professor of neurology, Mayo Clinic, Rochester, Minnesota. S.J. Hamstra is vice president for milestone research and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois. L. Conforti is research associate for milestones evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois. E.S. Holmboe is senior vice president for milestone development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois

13. Chan T, Sebok‐Syer S, Thoma B, Wise A, Sherbino J, Pusic M. Learning Analytics in Medical Education Assessment: The Past, the Present, and the Future. AEM Educ Train 2018;2:178-187. [PMID: 30051086; PMCID: PMC6001721; DOI: 10.1002/aet2.10087]
Abstract
With the implementation of competency-based medical education (CBME) in emergency medicine, residency programs will amass substantial amounts of qualitative and quantitative data about trainees' performances. This increased volume of data will challenge traditional processes for assessing trainees and remediating training deficiencies. At the intersection of trainee performance data and statistical modeling lies the field of medical learning analytics. At a local training program level, learning analytics has the potential to assist program directors and competency committees with interpreting assessment data to inform decision making. On a broader level, learning analytics can be used to explore system questions and identify problems that may impact our educational programs. Scholars outside of health professions education have been exploring the use of learning analytics for years and their theories and applications have the potential to inform our implementation of CBME. The purpose of this review is to characterize the methodologies of learning analytics and explore their potential to guide new forms of assessment within medical education.
Affiliation(s)
- Teresa Chan
- McMaster program for Education Research, Innovation, and Theory (MERIT), Hamilton, Ontario, Canada
- Stefanie Sebok‐Syer
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Saskatoon, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Alyssa Wise
- Steinhardt School of Culture, Education, and Human Development, New York University, New York, NY
- Jonathan Sherbino
- Faculty of Health Science, Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- McMaster program for Education Research, Innovation, and Theory (MERIT), Hamilton, Ontario, Canada
- Martin Pusic
- Department of Emergency Medicine, NYU School of Medicine, New York, NY
14
Shoenberger J, Lamba S, Goett R, DeSandre P, Aberger K, Bigelow S, Brandtman T, Chan GK, Zalenski R, Wang D, Rosenberg M, Jubanyik K. Development of Hospice and Palliative Medicine Knowledge and Skills for Emergency Medicine Residents: Using the Accreditation Council for Graduate Medical Education Milestone Framework. AEM Education and Training 2018; 2:130-145. [PMID: 30051080 PMCID: PMC6001832 DOI: 10.1002/aet2.10088] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/16/2017] [Revised: 01/25/2018] [Accepted: 01/28/2018] [Indexed: 06/08/2023]
Abstract
OBJECTIVES Emergency medicine (EM) physicians commonly care for patients with serious life-limiting illness. Hospice and palliative medicine (HPM) is a subspecialty pathway of EM. Although a subspecialty level of practice requires additional training, primary-level skills of HPM such as effective communication and symptom management are part of routine clinical care and expected of EM residents. However, unlike EM residency curricula in disciplines like trauma and ultrasound, there is no nationally defined HPM curriculum for EM resident training. An expert consensus group was convened with the aim of defining content areas and competencies for HPM primary-level practice in the ED setting. Our overall objective was to develop HPM milestones within a competency framework that is relevant to the practice of EM. METHODS The American College of Emergency Physicians Palliative Medicine Section assembled a committee that included academic EM faculty, community EM physicians, EM residents, and nurses, all with interest and expertise in curricular design and palliative medicine. RESULTS The committee peer reviewed and assessed HPM content for validity and importance to EM residency training. A topic list was developed with three domains: provider skill set, clinical recognition of HPM needs, and logistic understanding related to HPM in the ED. The group also developed milestones in HPM-EM to identify relevant knowledge, skills, and behaviors using the framework modeled after the Accreditation Council for Graduate Medical Education (ACGME) EM milestones. This framework was chosen to make the product as user-friendly and familiar as possible to facilitate use by EM educators. CONCLUSIONS Educators in EM residency programs now have access to HPM content areas and milestones relevant to EM practice that can be used for curriculum development in EM residency programs. 
The HPM-EM skills/competencies presented herein are structured in a familiar milestone framework that is modeled after the widely accepted ACGME EM milestones.
Affiliation(s)
- Jan Shoenberger
- Keck School of Medicine of the University of Southern CaliforniaLos AngelesCA
15
Festekjian A, Mody AP, Chang TP, Ziv N, Nager AL. Novel Transfer of Care Sign-out Assessment Tool in a Pediatric Emergency Department. Acad Pediatr 2018; 18:86-93. [PMID: 28843485 DOI: 10.1016/j.acap.2017.08.009] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/16/2016] [Revised: 08/10/2017] [Accepted: 08/15/2017] [Indexed: 10/19/2022]
Abstract
OBJECTIVE Transfer of care sign-outs (TOCS) for admissions from a pediatric emergency department have unique challenges. Standardized and reliable assessment tools for TOCS remain elusive. We describe the development, reliability, and validity of a TOCS assessment tool. METHODS Video recordings of resident TOCS were assessed to capture 4 domains: completeness, synopsis, foresight, and professionalism. In phase 1, 56 TOCS were used to modify the tool and improve reliability. In phase 2, 91 TOCS were used to examine validity. Analyses included Cronbach's alpha for internal structure, intraclass correlation and Cohen's kappa for interrater reliability, Pearson's correlation for relationships between variables, and 95% confidence interval of the mean for resident group comparisons. RESULTS Cronbach's alpha was 0.52 for internal structure of the tool's subjective rating scale. Intraclass correlation for the subjective rating scale items ranged from 0.70 to 0.80. Cohen's kappa for most objective checklist items ranged from 0.43 to 1. Content completeness was significantly correlated with synopsis, foresight, and professionalism (Pearson's r ranged from 0.36 to 0.62, P values were <0.001). House staff senior residents scored higher (on average) than interns and rotating senior residents in synopsis and foresight. Also, house staff interns scored higher (on average) than rotating senior residents in professionalism. House staff senior residents scored higher (on average) than rotating senior residents in content completeness. CONCLUSIONS We provide validity evidence to support using scores from the TOCS tool to assess higher-level transfer of care comprehension and communication by pediatric emergency department residents and to test interventions to improve TOCS.
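The chance-corrected agreement statistic used above (Cohen's kappa for the objective checklist items) can be illustrated with a small sketch; the paired ratings below are invented for illustration, not study data:

```python
# Illustrative only: Cohen's kappa for two raters scoring the same
# hypothetical checklist item (labels are invented, not study data).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over paired labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence: product of marginal proportions.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

a = ["done", "done", "missed", "done", "missed", "done", "done", "missed"]
b = ["done", "missed", "missed", "done", "missed", "done", "done", "done"]
print(round(cohens_kappa(a, b), 3))
```

Values of 0.43 to 1, as reported for the checklist items, correspond to moderate through perfect agreement on common benchmark scales.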
Affiliation(s)
- Ara Festekjian
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif.
- Ameer P Mody
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
- Todd P Chang
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
- Nurit Ziv
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif
- Alan L Nager
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
16
Beeson MS, Hamstra SJ, Barton MA, Yamazaki K, Counselman FL, Shayne PH, Holmboe ES, Muelleman RL, Reisdorff EJ. Straight Line Scoring by Clinical Competency Committees Using Emergency Medicine Milestones. J Grad Med Educ 2017; 9:716-720. [PMID: 29270260 PMCID: PMC5734325 DOI: 10.4300/jgme-d-17-00304.1] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/25/2017] [Revised: 07/25/2017] [Accepted: 08/07/2017] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND In 2013, milestone ratings became a reporting requirement for emergency medicine (EM) residency programs. Programs rate each resident in the fall and spring on 23 milestone subcompetencies. OBJECTIVE This study examined the incidence of straight line scoring (SLS) for EM Milestone ratings, defined as a resident being assessed the same score across the milestone subcompetencies. METHODS This descriptive analysis measured the frequencies of SLS for all Accreditation Council for Graduate Medical Education (ACGME)-accredited EM programs during the 2015-2016 academic year. Outcomes were the frequency of SLS in the fall and spring milestone assessments, changes in the number of SLS reports, and reporting trends. Chi-square analysis compared nominal variables. RESULTS There were 6257 residents in the fall and 6588 in the spring. Milestone scores were reported for 6173 EM residents in the fall (99% of 6257) and spring (94% of 6588). In the fall, 93% (5753 residents) did not receive SLS ratings and 420 (7%) did, with no significant difference compared with the spring (5776 [94%] versus 397 [6%]). Subgroup analysis showed higher SLS results for residents' first ratings (183 of 2136 versus 237 of 4220, P < .0001) and for their final ratings (200 of 2019 versus 197 of 4354, P < .0001). Twenty percent of programs submitted 10% or more SLS ratings, and a small percentage submitted more than 50% of ratings as SLS. CONCLUSIONS Most programs did not submit SLS ratings. Because of the statistical improbability of SLS, any SLS ratings reduce the validity assertions of the milestone assessments.
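The subgroup comparisons above are standard 2x2 chi-square tests on proportions; a sketch using the published first-rating counts (183 of 2136 straight-line ratings versus 237 of 4220 otherwise), assuming SciPy is available, could look like this:

```python
# Sketch of the subgroup comparison reported above, using the published
# counts in a standard 2x2 chi-square test of independence.
from scipy.stats import chi2_contingency

sls_first, n_first = 183, 2136
sls_other, n_other = 237, 4220
table = [
    [sls_first, n_first - sls_first],   # first ratings: SLS vs. non-SLS
    [sls_other, n_other - sls_other],   # other ratings
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p={p:.2g}")   # p < .0001, matching the abstract
```

Note that SciPy applies Yates' continuity correction by default for 2x2 tables; the study's exact procedure may have differed.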
17
Chan TM. Nuance and Noise: Lessons Learned From Longitudinal Aggregated Assessment Data. J Grad Med Educ 2017; 9:724-729. [PMID: 29270262 PMCID: PMC5734327 DOI: 10.4300/jgme-d-17-00086.1] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/02/2017] [Revised: 07/04/2017] [Accepted: 08/22/2017] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Competency-based medical education requires frequent assessment to tailor learning experiences to the needs of trainees. In 2012, we implemented the McMaster Modular Assessment Program, which captures shift-based assessments of resident global performance. OBJECTIVE We described patterns (ie, trends and sources of variance) in aggregated workplace-based assessment data. METHODS Emergency medicine residents and faculty members from 3 Canadian university-affiliated, urban, tertiary care teaching hospitals participated in this study. During each shift, supervising physicians rated residents' performance using a behaviorally anchored scale that hinged on endorsements for progression. We used a multilevel regression model to examine the relationship between global rating scores and time, adjusting for data clustering by resident and rater. RESULTS We analyzed data from 23 second-year residents between July 2012 and June 2015, which yielded 1498 unique ratings (65 ± 18.5 per resident) from 82 raters. The model estimated an average score of 5.7 ± 0.6 at baseline, with an increase of 0.005 ± 0.01 for each additional assessment. There was significant variation among residents' starting score (y-intercept) and trajectory (slope). CONCLUSIONS Our model suggests that residents begin at different points and progress at different rates. Meta-raters such as program directors and Clinical Competency Committee members should bear in mind that progression may take time and learning trajectories will be nuanced. Individuals involved in ratings should be aware of sources of noise in the system, including the raters themselves.
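The finding that residents begin at different points and progress at different rates can be sketched with entirely synthetic data; note this toy uses simple per-resident linear fits rather than the multilevel model (adjusting for clustering by resident and rater) that the study actually used:

```python
# Synthetic illustration of per-resident learning trajectories: each
# resident gets a random baseline (intercept) and growth rate (slope),
# and we recover both with a per-resident linear fit.
import numpy as np

rng = np.random.default_rng(0)
n_residents, n_ratings = 23, 65          # sizes mirror the study's scale
fits = []
for _ in range(n_residents):
    intercept = rng.normal(5.7, 0.6)     # baseline (abstract: 5.7 +/- 0.6)
    slope = rng.normal(0.005, 0.01)      # gain per additional assessment
    x = np.arange(n_ratings)
    scores = intercept + slope * x + rng.normal(0, 0.3, n_ratings)
    b, a = np.polyfit(x, scores, 1)      # b ~ slope, a ~ intercept
    fits.append((a, b))

intercepts = np.array([f[0] for f in fits])
slopes = np.array([f[1] for f in fits])
print(intercepts.mean(), slopes.mean())
```

The spread of the fitted intercepts and slopes is the "nuance" the abstract describes; the residual noise term stands in for rater variability.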
18
Gaeta T, Mahalingam G, Pyle M, Dam A, Visconti A. Using an alumni survey to target improvements in an emergency medicine training programme. Emerg Med J 2017; 35:189-191. [PMID: 29055891 DOI: 10.1136/emermed-2017-206692] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/22/2017] [Revised: 08/13/2017] [Accepted: 10/08/2017] [Indexed: 11/04/2022]
Abstract
INTRODUCTION The Accreditation Council for Graduate Medical Education (ACGME) is the governing body responsible for accrediting graduate medical training programmes in the USA. The Emergency Medicine Milestones (EM-Milestones) were developed by the ACGME and American Board of Emergency Medicine as a guide and monitoring tool for the knowledge, skills, abilities and experiences to be acquired during training. Alumni surveys have been reported as a valuable resource for training programmes to identify areas for improvement; however, there are few studies regarding programme improvement in emergency medicine. We aimed to use the EM-Milestones, adapted as an alumni self-assessment survey, to identify areas for training programme improvement. METHODS This study was conducted at an urban, academically affiliated community hospital in New York City with an emergency medicine training programme consisting of 30 residents over 3 years. Alumni of our emergency medicine training programme were sent an EM-Milestones-based self-assessment survey. Participants evaluated their ability in each EM-Milestones subcompetency on a Likert scale. Data were analysed using descriptive statistics. RESULTS Response rate was 74% (69/93). Alumni reported achieving the target performance in 5/6 general competencies, with Systems-Based Practice falling below the target performance. The survey further identified 6/23 subcompetencies (Pharmacotherapy, Ultrasound, Wound Management, Patient Safety, Systems-Based Management and Technology) falling below the target performance level. DISCUSSION Alumni self-evaluation of competence using the EM-Milestones provides valuable information concerning confidence to practice independently; these data, coupled with regular milestone evaluation of existing trainees, can identify problem areas and provide a blueprint for targeted programme improvement.
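The descriptive analysis described above, flagging subcompetencies whose mean self-rating falls below a target level, might be sketched as follows (the target value and all responses are invented for illustration):

```python
# Sketch: flag subcompetencies whose mean alumni self-rating falls below
# a target performance level, as in the survey analysis (data invented).
TARGET = 4.0  # hypothetical target level on the Likert scale

responses = {
    "Pharmacotherapy": [3, 4, 3, 4, 3],
    "Ultrasound": [3, 3, 4, 3, 3],
    "Patient Care": [4, 5, 4, 4, 5],
}
below_target = {
    name: sum(scores) / len(scores)
    for name, scores in responses.items()
    if sum(scores) / len(scores) < TARGET
}
print(below_target)
```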
Affiliation(s)
- Theodore Gaeta
- Department of Emergency Medicine, New York-Presbyterian Brooklyn Methodist Hospital, Brooklyn, New York, USA
- Gowtham Mahalingam
- Department of Emergency Medicine, New York-Presbyterian Brooklyn Methodist Hospital, Brooklyn, New York, USA
- Matthew Pyle
- Department of Emergency Medicine, New York-Presbyterian Brooklyn Methodist Hospital, Brooklyn, New York, USA; Department of Emergency Medicine, George Washington University Hospital, Washington, DC, USA
- Aaron Dam
- Department of Emergency Medicine, New York-Presbyterian Brooklyn Methodist Hospital, Brooklyn, New York, USA
- Annette Visconti
- Department of Emergency Medicine, New York-Presbyterian Brooklyn Methodist Hospital, Brooklyn, New York, USA
19
Chang TP, Schrager SM, Rake AJ, Chan MW, Pham PK, Christman G. The effect of multimedia replacing text in resident clinical decision-making assessment. Advances in Health Sciences Education: Theory and Practice 2017; 22:901-914. [PMID: 27752842 DOI: 10.1007/s10459-016-9719-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/04/2016] [Accepted: 10/06/2016] [Indexed: 06/06/2023]
Abstract
Multimedia in assessing clinical decision-making skills (CDMS) has been poorly studied, particularly in comparison to traditional text-based assessments. The literature suggests multimedia is more difficult for trainees. We hypothesize that pediatric residents score lower in diagnostic skill when clinical vignettes use multimedia rather than text for patient findings. A standardized method was developed to write text-based questions from 60 high-resolution, quality multimedia; a series of expert panels selected 40 questions with both a multimedia and text-based counterpart, and two online tests were developed. Each test featured 40 identical questions with reciprocal and alternating modality (multimedia vs. text). Pediatric residents and rising 4th year medical students (MS-IV) at a single residency were randomized to complete either test stratified by postgraduate training year (PGY). A mixed between-within subjects ANOVA analyzed differences in score due to modality and PGY. Secondary analyses ascertained modality effect in dermatology and respiratory questions using Mann-Whitney U tests, and correlations on test performance to In-service Training Exam (ITE) scores using Spearman rank. Eighty-eight residents and rising interns completed the study. Overall multimedia scores were lower than text-based scores (p = 0.047, ηp² = 0.04), with highest disparity in rising interns (MS-IV); however, PGY had a greater effect on scores (p = 0.001, ηp² = 0.16). Respiratory questions were not significantly lower with multimedia (n = 9, median 0.71 vs 0.86, p = 0.09) nor dermatology questions (n = 13, p = 0.41). ITEs correlated significantly with text-based scores (ρ = 0.23-0.25, p = 0.04-0.06) but not with multimedia scores. In physician trainees with less clinical experience, multimedia-based case vignettes are associated with significantly lower scores.
These results help shed light on the role of multimedia versus text-based information in CDMS, particularly in less experienced clinicians.
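The nonparametric item-subgroup comparison (Mann-Whitney U) could be sketched like this, assuming SciPy is available; the per-item score values are invented, not the study's:

```python
# Toy re-creation of the modality comparison: per-item proportion-correct
# scores under multimedia vs. text presentation (invented numbers),
# compared with the nonparametric test the study used for item subgroups.
from scipy.stats import mannwhitneyu

multimedia = [0.55, 0.60, 0.65, 0.70, 0.62, 0.58, 0.66, 0.71]
text       = [0.70, 0.75, 0.72, 0.80, 0.68, 0.74, 0.78, 0.82]
stat, p = mannwhitneyu(multimedia, text, alternative="two-sided")
print(f"U={stat}, p={p:.3f}")
```

The Mann-Whitney U test compares rank distributions and so, like the study's analysis, makes no normality assumption about the item scores.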
Affiliation(s)
- Todd P Chang
- Division of Emergency Medicine and Transport, Children's Hospital Los Angeles and University of Southern California Keck School of Medicine, Los Angeles, CA, USA.
- Sheree M Schrager
- Division of Hospital Medicine, Children's Hospital Los Angeles and University of Southern California Keck School of Medicine, Los Angeles, CA, USA
- Alyssa J Rake
- Department of Critical Care and Anesthesiology, Children's Hospital Los Angeles and University of Southern California Keck School of Medicine, Los Angeles, CA, USA
- Michael W Chan
- Division of Emergency Medicine, Ann and Robert H. Lurie Children's Hospital of Chicago and Northwestern University Feinberg School of Medicine, Chicago, IL, USA
- Phung K Pham
- Division of Emergency Medicine and Transport, Children's Hospital Los Angeles and University of Southern California Keck School of Medicine, Los Angeles, CA, USA
- Division of Behavioral and Organizational Sciences, Claremont Graduate University, Claremont, CA, USA
- Grant Christman
- Division of Hospital Medicine, Children's Hospital Los Angeles and University of Southern California Keck School of Medicine, Los Angeles, CA, USA
20
Practices and attitudes towards radiation risk disclosure for computed tomography: survey of emergency medicine residency program directors. Emerg Radiol 2017; 24:479-486. [DOI: 10.1007/s10140-017-1493-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2017] [Accepted: 03/02/2017] [Indexed: 10/19/2022]
21
Yao A, Massenburg BB, Silver L, Taub PJ. Initial Comparison of Resident and Attending Milestones Evaluations in Plastic Surgery. Journal of Surgical Education 2017; 74:773-779. [PMID: 28259488 DOI: 10.1016/j.jsurg.2017.02.001] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/27/2016] [Revised: 01/05/2017] [Accepted: 02/02/2017] [Indexed: 06/06/2023]
Abstract
BACKGROUND Graduate medical education has recently undergone a major archetypal shift toward competency-based evaluations of residents' performance. The implementation of the Milestones program by the Accreditation Council for Graduate Medical Education (ACGME) is a core component of the shift, designed to ensure uniformity in measuring residency knowledge using a series of specialty-specific achievements. This study evaluates the correlation between residents' self-evaluations and program directors' assessments of their performance. METHODS The study population comprised 12 plastic surgery residents, ranging from postgraduate year 1 to postgraduate year 6, enrolled in an integrated residency program at a single institution; all 12 eligible residents (100%) participated. RESULTS Overall, average attending scores were lower than average resident scores at all levels except postgraduate year 6. Correlation between resident and attending evaluations ranged from 0.417 to 0.957, with the correlation of average scores of Patient Care (0.854) and Medical Knowledge (0.816) Milestones significantly higher than those of professional skillsets (0.581). "Patient care, facial esthetics" was the Milestone with the lowest average scores from both groups. Residents scored themselves notably higher than their attendings' evaluations in Practice-based Learning and Improvement categories (+0.958) and notably lower in Medical Knowledge categories such as "Cosmetic Surgery, Trunk and Lower Extremities" (-0.375) and "Non-trauma hand" (-0.208). CONCLUSIONS The remarkable range of correlations suggests that expectations for performance standards may vary widely between residents and program directors. Understanding gaps between expectations and performance is vital to inform current and future residents as the restructuring of the accreditation process continues.
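The resident-versus-attending correlation analysis can be illustrated with a hypothetical set of paired milestone ratings (all values invented; the study's scale and data may differ):

```python
# Hypothetical paired milestone ratings for one resident, as self-scored
# and as scored by an attending; values are invented to illustrate the
# correlation analysis, not taken from the study.
from scipy.stats import pearsonr

resident_self = [4.0, 4.5, 5.0, 3.5, 6.0, 5.5, 4.5, 5.0]
attending     = [3.5, 4.0, 4.5, 3.0, 5.5, 5.0, 4.0, 5.0]
r, p = pearsonr(resident_self, attending)
print(round(r, 3))
```

Note that a high correlation can coexist with a systematic offset (here the self-ratings run about half a point above the attending's), which is exactly the resident-overrating pattern these studies report.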
Affiliation(s)
- Amy Yao
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
- Benjamin B Massenburg
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
- Lester Silver
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
- Peter J Taub
- Division of Plastic and Reconstructive Surgery, Icahn School of Medicine, New York, New York
22
Warrington S, Beeson M, Bradford A. Inter-rater Agreement of End-of-shift Evaluations Based on a Single Encounter. West J Emerg Med 2017; 18:518-524. [PMID: 28435505 PMCID: PMC5391904 DOI: 10.5811/westjem.2016.12.32014] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2016] [Revised: 12/14/2016] [Accepted: 12/30/2016] [Indexed: 11/11/2022] Open
Abstract
INTRODUCTION End-of-shift evaluation (ESE) forms, also known as daily encounter cards, represent a subset of encounter-based assessment forms. Encounter cards have become prevalent for formative evaluation, with some suggesting a potential for summative evaluation. Our objective was to evaluate the inter-rater agreement of ESE forms using a single scripted encounter at a conference of emergency medicine (EM) educators. METHODS Following institutional review board exemption, we created a scripted video simulating an encounter between an intern and a patient with an ankle injury. That video was shown during a lecture at the Council of EM Residency Directors' Academic Assembly, with attendees asked to evaluate the "resident" using one of eight possible ESE forms randomly distributed. Descriptive statistics were used to analyze the results, with Fleiss' kappa to evaluate inter-rater agreement. RESULTS Most of the 324 respondents were leadership in residency programs (66%), with a range of 29-47 responses per evaluation form. Few individuals (5%) felt they were experts in assessing residents based on EM milestones. Fleiss' kappa ranged from 0.157 to 0.308 and did not perform much better in two post-hoc subgroup analyses. CONCLUSION The kappa ranges found show only slight to fair inter-rater agreement and raise concerns about the use of ESE forms in assessment of EM residents. Despite the limitations present in this study, these results and a lack of other studies on inter-rater agreement of encounter cards should prompt further evaluation of such assessment methods. Additionally, EM educators should focus research on methods to improve the inter-rater agreement of ESE forms or on evaluating other methods of assessing EM residents.
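Fleiss' kappa generalizes chance-corrected agreement to many raters; a minimal sketch on toy data (not the conference responses) follows:

```python
# Minimal Fleiss' kappa for a subject-by-category count table (each row:
# how many raters placed that subject in each category). Toy data only.
def fleiss_kappa(table):
    n_subjects = len(table)
    n_raters = sum(table[0])               # raters per subject (constant)
    total = n_subjects * n_raters
    # Category proportions p_j and per-subject agreement P_i.
    p_j = [sum(row[j] for row in table) / total for j in range(len(table[0]))]
    P_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ]
    P_bar = sum(P_i) / n_subjects
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)

# Five hypothetical "residents" rated by 29 evaluators into 3 score bands.
ratings = [
    [20, 7, 2],
    [10, 15, 4],
    [8, 14, 7],
    [22, 5, 2],
    [6, 18, 5],
]
print(round(fleiss_kappa(ratings), 3))
```

On the Landis-Koch benchmarks, values below about 0.2 are "slight" and 0.21-0.40 "fair" agreement, which is why the 0.157-0.308 range above is concerning for assessment use.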
Affiliation(s)
- Steven Warrington
- Kaweah Delta Medical Center, Department of Emergency Medicine, Visalia, California
- Michael Beeson
- Akron General Medical Center, Department of Emergency Medicine, Akron, Ohio
- Amber Bradford
- Akron General Medical Center, Department of Emergency Medicine, Akron, Ohio
23
Beeson MS. The Emergency Medicine Milestones: With Experience Comes Suggestions to Improve. Acad Emerg Med 2016; 23:1434-1436. [PMID: 27428572 DOI: 10.1111/acem.13055] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Affiliation(s)
- Michael S. Beeson
- Department of Emergency Medicine, Cleveland Clinic Foundation Akron General Medical Center, Akron, OH
24
Ketterer AR, Salzman DH, Branzetti JB, Gisondi MA. Supplemental Milestones for Emergency Medicine Residency Programs: A Validation Study. West J Emerg Med 2016; 18:69-75. [PMID: 28116011 PMCID: PMC5226766 DOI: 10.5811/westjem.2016.10.31499] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2016] [Revised: 09/22/2016] [Accepted: 10/10/2016] [Indexed: 12/03/2022] Open
Abstract
Introduction Emergency medicine (EM) residency programs may be 36 or 48 months in length. The Residency Review Committee for EM requires that 48-month programs provide educational justification for the additional 12 months. We developed additional milestones that EM training programs might use to assess outcomes in domains that meet this accreditation requirement. This study aims to assess for content validity of these supplemental milestones using a similar methodology to that of the original EM Milestones validation study. Methods A panel of EM program directors (PD) and content experts at two institutions identified domains of additional training not covered by the existing EM Milestones. This led to the development of six novel subcompetencies: “Operations and Administration,” “Critical Care,” “Leadership and Management,” “Research,” “Teaching and Learning,” and “Career Development.” Subject-matter experts at other 48-month EM residency programs refined the milestones for these subcompetencies. PDs of all 48-month EM programs were then asked to order the proposed milestones using the Dreyfus model of skill acquisition for each subcompetency. Data analysis mirrored that used in the original EM Milestones validation study, leading to the final version of our supplemental milestones. Results Twenty of 33 subjects (58.8%) completed the study. No subcompetency or individual milestone met deletion criteria. Of the 97 proposed milestones, 67 (69.1%) required no further editing and remained at the same level as proposed by the study authors. Thirty milestones underwent level changes: 15 (15.5%) were moved one level up and 13 (13.4%) were moved one level down. One milestone (1.0%) in “Leadership and Management” was moved two levels up, and one milestone in “Operations and Administration” was moved two levels down. 
One milestone in “Research” was ranked by the survey respondents at one level higher than that proposed by the authors; however, this milestone was kept at its original level assignment. Conclusion Six additional subcompetencies were generated and assessed for content validity using the same methodology as was used to validate the current EM Milestones. These optional milestones may serve as an additional set of assessment tools that will allow EM residency programs to report these additional educational outcomes using a familiar milestone rubric.
Affiliation(s)
- Andrew R Ketterer
- Northwestern University Feinberg School of Medicine, Department of Emergency Medicine, Chicago, Illinois; Feinberg Academy of Medical Educators, Department of Medical Education, Chicago, Illinois
- David H Salzman
- Northwestern University Feinberg School of Medicine, Department of Emergency Medicine, Chicago, Illinois; Feinberg Academy of Medical Educators, Department of Medical Education, Chicago, Illinois
- Jeremy B Branzetti
- University of Washington School of Medicine, Division of Emergency Medicine, Seattle, Washington
- Michael A Gisondi
- Northwestern University Feinberg School of Medicine, Department of Emergency Medicine, Chicago, Illinois; Feinberg Academy of Medical Educators, Department of Medical Education, Chicago, Illinois
25
Nelson M, Abdi A, Adhikari S, Boniface M, Bramante RM, Egan DJ, Matthew Fields J, Leo MM, Liteplo AS, Liu R, Nomura JT, Pigott DC, Raio CC, Ruskis J, Strony R, Thom C, Lewiss RE. Goal-directed Focused Ultrasound Milestones Revised: A Multiorganizational Consensus. Acad Emerg Med 2016; 23:1274-1279. [PMID: 27520068 DOI: 10.1111/acem.13069] [Citation(s) in RCA: 29] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2016] [Revised: 07/21/2016] [Accepted: 07/27/2016] [Indexed: 11/29/2022]
Abstract
In 2012 the Accreditation Council for Graduate Medical Education and the American Board of Emergency Medicine released the emergency medicine milestones. The Patient Care 12 (PC12) subcompetency delineates staged and progressive accomplishment in emergency ultrasound. While valuable as an initial framework for ultrasound resident education, there are limitations to PC12. This consensus paper provides a revised description of criteria to define the subcompetency. A multiorganizational task force was formed between the American College of Emergency Physicians Ultrasound Section, the Council of Emergency Medicine Residency Directors, and the Academy of Emergency Ultrasound of the Society for Academic Emergency Medicine. Representatives from each organization created this consensus document and revision.
Affiliation(s)
- Mathew Nelson
- Department of Emergency Medicine, North Shore University Hospital, Plainview, NY
- Amin Abdi
- Department of Emergency Medicine, Los Angeles County University of Southern California Medical Center, Los Angeles, CA
- Srikar Adhikari
- Department of Emergency Medicine, University of Arizona, Tucson, AZ
- Michael Boniface
- Department of Emergency Medicine, University of Florida, Gainesville, FL
- Robert M. Bramante
- Department of Emergency Medicine, Good Samaritan Hospital Medical Center, Islip, NY
- Daniel J. Egan
- Department of Emergency Medicine, Mt. Sinai St. Luke's Roosevelt, New York, NY
- J. Matthew Fields
- Department of Emergency Medicine, Thomas Jefferson Hospital, Philadelphia, PA
- Megan M. Leo
- Department of Emergency Medicine, Boston University School of Medicine, Boston, MA
- Andrew S. Liteplo
- Department of Emergency Medicine, Massachusetts General Hospital, Boston, MA
- Rachel Liu
- Department of Emergency Medicine, Yale University School of Medicine, New Haven, CT
- Jason T. Nomura
- Department of Emergency Medicine, Christiana Care Health System, Newark, DE
- David C. Pigott
- Department of Emergency Medicine, University of Alabama at Birmingham, Birmingham, AL
- Christopher C. Raio
- Department of Emergency Medicine, Good Samaritan Hospital Medical Center, Islip, NY
- Jennifer Ruskis
- Department of Emergency Medicine, Cook County Hospital, Chicago, IL
- Robert Strony
- Department of Emergency Medicine, Geisinger Medical Center, Danville, PA
- Chris Thom
- Department of Emergency Medicine, University of Virginia Health System, Charlottesville, VA
- Resa E. Lewiss
- Department of Emergency Medicine, University of Colorado School of Medicine, Aurora, CO
Collapse
|
26
|
Abstract
This article is the sixth in a 7-part series that aims to comprehensively describe the current state and future directions of pediatric emergency medicine (PEM) fellowship training, from the essential requirements, to considerations for successfully administering and managing a program, to the careers that may be anticipated upon program completion. It provides a broad overview of administering and supervising a PEM fellowship program, exploring 3 topics: the principles of program administration, committee management, and recommendations for the minimum time allocated for PEM fellowship program directors to administer their programs.
27
ASDS Cosmetic Dermatologic Surgery Fellowship Milestones. Dermatol Surg 2016; 42:1164-73. [PMID: 27661429] [DOI: 10.1097/dss.0000000000000860]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education, which oversees much of postgraduate medical education in the United States, has championed the concept of "milestones," standard levels of achievement keyed to particular time points, to assess trainee performance during residency. OBJECTIVE To develop a milestones document for the American Society for Dermatologic Surgery (ASDS) Cosmetic Dermatologic Surgery (CDS) fellowship program. METHODS An ad hoc milestone drafting committee was convened that included members of the ASDS Accreditation Work Group and program directors of ASDS-approved CDS fellowship training programs. Draft milestones were circulated by email in multiple rounds until consensus was achieved. RESULTS Thirteen milestones were developed in the 6 Accreditation Council for Graduate Medical Education (ACGME) competency areas, with 8 of these being patient-care milestones. Additional instructions for milestone administration more specific to the CDS fellowship than the general ACGME instructions were also approved. Implementation of semiannual milestones was scheduled for the fellowship class entering in July 2018. CONCLUSION Milestones are now available for CDS fellowship directors to implement in combination with other tools for fellow evaluation.
28
Casting early light on anesthesiology milestone validation. J Clin Anesth 2016; 33:117-8. [DOI: 10.1016/j.jclinane.2016.02.033]
29
Quinn SM, Worrilow CC, Jayant DA, Bailey B, Eustice E, Kohlhepp J, Rogers R, Kane BG. Using Milestones as Evaluation Metrics During an Emergency Medicine Clerkship. J Emerg Med 2016; 51:426-431. [PMID: 27473442] [DOI: 10.1016/j.jemermed.2016.06.014]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education's (ACGME) Milestones presume that graduating medical students will enter residency proficient at Milestone level 1 for 23 skills. The Next Accreditation System now includes Milestones for each postgraduate specialty, and it is unlikely that schools will document every emergency medicine (EM) applicant's EM-specific skills in their performance evaluation. OBJECTIVES The goals of this research were to determine whether assessment of the Milestones was feasible during a medical student clerkship and to examine the proportion of medical students performing at Milestone level 1. METHODS This study was conducted at a center with Liaison Committee on Medical Education-approved medical training and a 4-year EM residency. Using the traditional clerkship, we studied the feasibility of an ACGME EM Milestones-based clerkship assessment. These data led to a redesign of the clerkship and its evaluation process, including revision of all level 1 anchors to add "occasionally" (>60%), "usually" (>80%), and "always" (100%) on a Likert scale on on-shift assessment forms. RESULTS During the feasibility phase (2013-14), 75 students rotated through the clerkship; 55 evaluations were issued and 50 contained the Milestone summary. Eight deficiencies were noted in Milestone 12 and three in Milestone 14. After these changes, 49 students rotated under the new evaluation rubric. Of 575 completed on-shift evaluations, 16 Milestone deficiencies were noted. Of 41 institutional evaluations issued, only one student had deficiencies noted, all of which pertained to patient care. All evaluations in this second cohort contained each student's Milestone proficiency. CONCLUSIONS Assessment of the Milestones is feasible. Communication of ACGME EM Milestone proficiency may identify students who require early observation or remediation. The majority of students meet the anchors for the Milestones, suggesting that clerkship assessment with the ACGME EM Milestones does not adequately differentiate students.
30
Smalley CM, Dorey A, Thiessen M, Kendall JL. A Survey of Ultrasound Milestone Incorporation Into Emergency Medicine Training Programs. J Ultrasound Med 2016; 35:1517-1521. [PMID: 27268999] [DOI: 10.7863/ultra.15.09012]
Abstract
OBJECTIVES With the introduction of the Emergency Medicine Milestone Project in 2013, residencies now assess emergency ultrasound (US) skills at regular intervals. However, it is unclear how programs are implementing the emergency US milestones and assessing competency. With the use of the milestone tool, a survey was distributed to emergency US educators to determine when programs are providing emergency US education, when residents are expected to attain competency, and whether the milestones reflect their expectations of trainees. METHODS We conducted a prospective cross-sectional survey study distributed electronically to designated emergency US experts at 169 programs. Participants were queried on education and competency evaluation within the context of the milestones by designating a postgraduate year when the 5 milestone levels were taught and competency was expected. Survey findings were reported as percentages of total respondents from descriptive statistics. RESULTS Responses were received from 53% of programs, and 99% were familiar with the milestones. Most programs provide level 1 (88%) and 2 (85%) instruction during postgraduate year 1. Most programs expect level 1 competency before residency (61%) and expect mastery of level 2 by the end of postgraduate year 1 (60%). Sixty-two percent believe the milestones do not accurately reflect their expectations, citing insufficient minimum scan numbers, lack of specificity, and unattainable level 5 requirements. CONCLUSIONS There is substantial variability in the frequency and methods of competency evaluation using the emergency US milestones. However, most responders agree that residents should obtain level 2 competency by postgraduate year 1. Variation exists regarding what year and what skills define level 3 or greater competency.
31
Tichter AM, Mulcare MR, Carter WA. Interrater agreement of emergency medicine milestone levels: resident self-evaluation vs clinical competency committee consensus. Am J Emerg Med 2016; 34:1677-9. [PMID: 27236856] [DOI: 10.1016/j.ajem.2016.04.055]
32
Weizberg M, Bond MC, Cassara M, Doty C, Seamon J. Have First-Year Emergency Medicine Residents Achieved Level 1 on Care-Based Milestones? J Grad Med Educ 2015; 7:589-94. [PMID: 26692971] [PMCID: PMC4675416] [DOI: 10.4300/jgme-d-14-00590.1]
Abstract
BACKGROUND Residents in Accreditation Council for Graduate Medical Education (ACGME)-accredited emergency medicine (EM) residencies were assessed on 23 educational milestones to capture their progression from the medical student level (Level 1) to that of an EM attending physician (Level 5). Level 1 was conceptualized to be at the level of an incoming postgraduate year (PGY)-1 resident; however, this has not been confirmed. OBJECTIVES Our primary objective was to assess incoming PGY-1 residents to determine what percentage achieved Level 1 for the 8 emergency department (ED) patient care-based milestones (PC 1-8), as assessed by faculty. Secondary objectives were to assess what percentage of residents had achieved Level 1 by self-assessment and to calculate the absolute differences between self- and faculty assessments. METHODS Incoming PGY-1 residents at 4 EM residencies were assessed by faculty and by themselves during their first month of residency. Performance anchors were adapted from the ACGME milestones. RESULTS Forty-one residents from 4 programs were included. The percentage of residents who achieved Level 1 for each subcompetency on faculty assessment ranged from 20% to 73%, and on self-assessment from 34% to 92%. The majority did not achieve Level 1 on faculty assessment of milestones PC-2, PC-3, PC-5a, and PC-6, and on self-assessment of PC-3 and PC-5a. Self-assessment was higher than faculty assessment for PC-2, PC-5b, and PC-6. CONCLUSIONS Less than 75% of PGY-1 residents achieved Level 1 for ED care-based milestones. The majority did not achieve Level 1 on 4 milestones. Self-assessments were higher than faculty assessments for several milestones.
33
Goldflam K, Bod J, Della-Giustina D, Tsyrulnik A. Emergency Medicine Residents Consistently Rate Themselves Higher than Attending Assessments on ACGME Milestones. West J Emerg Med 2015; 16:931-5. [PMID: 26594293] [PMCID: PMC4651597] [DOI: 10.5811/westjem.2015.8.27247]
Abstract
Introduction In 2012 the Accreditation Council for Graduate Medical Education (ACGME) introduced the Next Accreditation System (NAS), which implemented milestones to assess the competency of residents and fellows. While attending evaluation and feedback are crucial for resident development, perhaps equally important is a resident’s self-assessment. If a resident does not accurately self-assess, clinical and professional progress may be compromised. The objective of our study was to compare emergency medicine (EM) resident milestone evaluation by EM faculty with the same resident’s self-assessment. Methods This was an observational, cross-sectional study performed at an academic, four-year EM residency program. Twenty-five randomly chosen residents completed milestone self-assessment using eight ACGME sub-competencies deemed by residency leadership as representative of core EM principles. These residents were also evaluated by 20 faculty members. The milestone levels were evaluated on a nine-point scale. We calculated the average difference between resident self-ratings and faculty ratings, and used sample t-tests to determine the statistical significance of the difference in scores. Results Eighteen residents evaluated themselves. Each resident was assessed by an average of 16 attendings (min=10, max=20). Residents gave themselves significantly higher milestone ratings than attendings did for each sub-competency examined (p<0.0001). Conclusion Residents over-estimated their abilities in every sub-competency assessed. This underscores the importance of feedback and assessment transparency. More attention needs to be paid to methods by which residency leadership can make residents’ self-perception of their clinical ability more congruent with that of their teachers and evaluators. The major limitation of our study is the small sample size of both residents and attendings.
34
Bentley S, Mudan G, Strother C, Wong N. Are Live Ultrasound Models Replaceable? Traditional versus Simulated Education Module for FAST Exam. West J Emerg Med 2015; 16:818-22. [PMID: 26594272] [PMCID: PMC4651576] [DOI: 10.5811/westjem.2015.9.27276]
Abstract
Introduction The focused assessment with sonography for trauma (FAST) is a commonly used and life-saving tool in the initial assessment of trauma patients. The recommended emergency medicine (EM) curriculum includes ultrasound, and studies show the additional utility of ultrasound training for medical students. EM clerkships vary and often do not contain formal ultrasound instruction. Time constraints for facilitating lectures and hands-on learning of ultrasound are challenging. Limitations on didactics call for the development and inclusion of novel educational strategies, such as simulation. The objective of this study was to compare the test scores, survey responses, and ultrasound performance of medical students trained on an ultrasound simulator with those of students trained via the traditional, hands-on patient format. Methods This was a prospective, blinded, controlled educational study focused on EM clerkship medical students. After all received a standardized lecture with pictorial demonstration of image acquisition, students were randomized into two groups: a control group receiving the traditional training method via practice on a human model and an intervention group training via practice on an ultrasound simulator. Participants were tested and surveyed on the indications and interpretation of FAST and on their training and confidence with image interpretation and acquisition before and after this educational activity. Evaluation of FAST skills was performed on a human model to emulate patient care, and practical skills were scored via an objective structured clinical examination (OSCE) with a critical action checklist. Results There was no significant difference between the control group (N=54) and the intervention group (N=39) in pretest scores, prior ultrasound training/education, or ultrasound comfort level in general or on FAST. All students (N=93) showed significant improvement from pre- to post-test scores and significant improvement in comfort level using ultrasound in general and on FAST (p<0.001). There was no significant difference between groups in OSCE scores of FAST on a live model. Overall, no differences were demonstrated between groups trained on human models versus the simulator. Discussion There was no difference between groups in knowledge-based ultrasound test scores, survey of comfort levels with ultrasound, or students’ abilities to perform and interpret FAST on human models. Conclusion These findings suggest that an ultrasound simulator is a suitable alternative method for ultrasound education. Additional uses of ultrasound simulation should be explored in the future.
35
Beeson MS, Holmboe ES, Korte RC, Nasca TJ, Brigham T, Russ CM, Whitley CT, Reisdorff EJ. Initial Validity Analysis of the Emergency Medicine Milestones. Acad Emerg Med 2015; 22:838-44. [PMID: 26112031] [DOI: 10.1111/acem.12697]
Abstract
OBJECTIVES The Accreditation Council for Graduate Medical Education (ACGME) Milestones describe behavioral markers for the progressive acquisition of competencies during residency. As a key component of the Next Accreditation System, all residents are evaluated for the acquisition of specialty-specific Milestones. The objective was to determine the validity and reliability of the emergency medicine (EM) Milestones. METHODS The ACGME and the American Board of Emergency Medicine performed this single-event observational study. The data included the initial EM Milestones performance ratings of all categorical EM residents submitted to the ACGME from October 31, 2013, to January 6, 2014. Mean performance ratings were determined for all 23 subcompetencies for every year of residency training. The internal consistency (reliability) of the Milestones was determined using a standardized Cronbach's alpha coefficient. Exploratory factor analysis was conducted to determine how the subcompetencies were interrelated. RESULTS EM Milestone performance ratings were obtained on 100% of EM residents (n = 5,805) from 162 residency programs. The mean performance ratings of the aggregate and individual subcompetency scores showed discrimination between residency years, and the factor structure further supported the validity of the EM Milestones. The reliability was α = 0.96 within each year of training. CONCLUSIONS The EM Milestones demonstrated validity and reliability as an assessment instrument for competency acquisition. EM residents can be assured that this evaluation process has demonstrated validity and reliability; faculty can be confident that the Milestones are psychometrically sound; and stakeholders can know that the Milestones are a nationally standardized, objective measure of specialty-specific competency acquisition.
36
Love JN, Yarris LM, Ankel FK. Emergency Medicine Milestones: The Next Step. Acad Emerg Med 2015; 22:847-8. [PMID: 26112362] [DOI: 10.1111/acem.12700]
37
Dehon E, Jones J, Puskarich M, Sandifer JP, Sikes K. Use of Emergency Medicine Milestones as Items on End-of-Shift Evaluations Results in Overestimates of Residents' Proficiency Level. J Grad Med Educ 2015. [PMID: 26221433] [PMCID: PMC4512788] [DOI: 10.4300/jgme-d-14-00438.1]
Abstract
BACKGROUND The emergency medicine milestones were developed to provide more objective resident assessment than current methods. However, little is known about the best practices for applying the milestones in resident assessment. OBJECTIVE We examined the utility of end-of-shift evaluations (ESEs) constructed using the milestones in resident assessment. METHODS We developed 14 daily ESEs, each of which included 9 or 10 emergency medicine milestones. Postgraduate year (PGY)-1 and PGY-2 residents were assessed on milestone levels 1 through 3; PGY-3 and PGY-4 residents were assessed on levels 3 through 5. Each milestone was rated on a nominal scale (yes, no, or not applicable). The Clinical Competency Committee combined the ESE data with data from other assessments to determine each resident's proficiency level for the emergency medicine subcompetencies. We used descriptive statistics to summarize resident ESEs and milestone levels. We analyzed differences in ESE score across PGY levels using t tests and analyses of variance. RESULTS Faculty completed 763 ESEs on 33 residents with a range of 2 to 54 (median=22) ESEs per resident. Faculty rarely (8%, 372 of 4633) rated a resident as not achieving a milestone on the ESEs. Analyses of variance revealed that ESE scores on level 3 milestones did not differ significantly by PGY level. There was poor agreement between ESE scores and Clinical Competency Committee ratings. CONCLUSIONS The ESEs constructed using the milestones resulted in grade or milestone inflation. Our results do not support using milestones as a stand-alone assessment tool.
38
Gardner AK, Scott DJ, Hebert JC, Mellinger JD, Frey-Vogel A, Ten Eyck RP, Davis BR, Sillin LF, Sachdeva AK. Gearing up for milestones in surgery: Will simulation play a role? Surgery 2015; 158:1421-7. [PMID: 26013987] [DOI: 10.1016/j.surg.2015.03.039]
Abstract
BACKGROUND The Consortium of American College of Surgeons-Accredited Education Institutes was created to promote patient safety through the use of simulation, develop new education and technologies, identify best practices, and encourage research and collaboration. METHODS During the 7th Annual Meeting of the Consortium, leaders from a variety of specialties discussed how simulation is playing a role in the assessment of resident performance within the context of the Milestones of the Accreditation Council for Graduate Medical Education as part of the Next Accreditation System. CONCLUSION This report presents experiences from several viewpoints and supports the utility of simulation for this purpose.
39
Yarris LM, Jones D, Kornegay JG, Hansen M. The Milestones Passport: A Learner-Centered Application of the Milestone Framework to Prompt Real-Time Feedback in the Emergency Department. J Grad Med Educ 2014; 6:555-60. [PMID: 26279784] [PMCID: PMC4535223] [DOI: 10.4300/jgme-d-13-00409.1]
Abstract
BACKGROUND In July 2013, emergency medicine residency programs implemented the Milestone assessment as part of the Next Accreditation System. OBJECTIVE We hypothesized that applying the Milestone framework to real-time feedback in the emergency department (ED) could affect current feedback processes and culture. We describe the development and implementation of a Milestone-based, learner-centered intervention designed to prompt real-time feedback in the ED. METHODS We developed and implemented the Milestones Passport, a feedback intervention incorporating subcompetencies, in our residency program in July 2013. Our primary outcomes were feasibility, including faculty and staff time and costs, number of documented feedback encounters in the first 2 months of implementation, and user-reported time required to complete the intervention. We also assessed learner and faculty acceptability. RESULTS Development and implementation of the Milestones Passport required 10 hours of program coordinator time, 120 hours of software developer time, and 20 hours of faculty time. Twenty-eight residents and 34 faculty members generated 257 Milestones Passport feedback encounters. Most residents and faculty reported that the encounters required fewer than 5 minutes to complete, and 48% (12 of 25) of the residents and 68% (19 of 28) of faculty reported satisfaction with the Milestones Passport intervention. Faculty satisfaction with overall feedback in the ED improved after the intervention (93% versus 54%, P = .003), whereas resident satisfaction with feedback did not change significantly. CONCLUSIONS The Milestones Passport feedback intervention was feasible and acceptable to users; however, learner satisfaction with the Milestone assessment in the ED was modest.
40
Santen SA, Peterson WJ, Khandelwal S, House JB, Manthey DE, Sozener CB. Medical student milestones in emergency medicine. Acad Emerg Med 2014; 21:905-11. [PMID: 25155021] [DOI: 10.1111/acem.12443]
Abstract
OBJECTIVES Medical education is a continuum from medical school through residency to unsupervised clinical practice. There has been a movement toward competency-based medical education prompted by the Accreditation Council for Graduate Medical Education (ACGME) using milestones to assess competence. While implementation of milestones for residents sets specific standards for transition to internship, there exists a need for the development of competency-based instruments to assess medical students as they progress toward internship. The objective of this study was to develop competency-based milestones for fourth-year medical students completing their emergency medicine (EM) clerkships (regardless of whether the students were planning on entering EM) using a rigorous method to attain validity evidence. METHODS A literature review was performed to develop a list of potential milestones. An expert panel, which included a medical student and 23 faculty members (four program directors, 16 clerkship directors, and five assistant deans) from 19 different institutions, came to consensus on these milestones through two rounds of a modified Delphi protocol. The Delphi technique builds content validity and is an accepted method to develop consensus by eliciting expert opinions through multiple rounds of questionnaires. RESULTS Of the initial 39 milestones, 12 were removed at the end of round 1 due to low agreement on importance of the milestone or because of redundancy with other milestones. An additional 12 milestones were revised to improve clarity or eliminate redundancy, and one was added based on expert panelists' suggestions. Of the 28 milestones moving to round 2, consensus with a high level of agreement was achieved for 24. These were mapped to the ACGME EM residency milestone competency domains, as well as the Association of American Medical Colleges (AAMC) core entrustable professional activities for entering residency to improve content validity. CONCLUSIONS This study found consensus support by experts for a list of 24 milestones relevant to the assessment of fourth-year medical student performance by the completion of their EM clerkships. The findings are useful for development of a valid method for assessing medical student performance as students approach residency.
41
Ankel F (Department of Emergency Medicine, Regions Hospital, St. Paul, MN), Franzen D (Department of Emergency Medicine, Virginia Commonwealth University, Richmond, VA), Frank J (Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada).