1. Kakara Anderson HL, Abdulla L, Cabral P, Govaerts M, Balmer DF, Busari JO. Abrasion: a phenomenological study of inequity in workplace-based assessment in pediatrics. Soc Sci Med 2025; 375:118092. PMID: 40253979. DOI: 10.1016/j.socscimed.2025.118092.
Abstract
Despite the centrality of workplace-based assessments in medical education and practice, there is troubling evidence that workplace-based assessments are inequitable. This study investigated the experience of inequity in workplace-based assessment via a phenomenological study with learners (resident physicians) and assessors (physician supervisors) in one specialty in the United States, general pediatrics, from August 2023 to June 2024. The authors used critical phenomenology to prompt and iteratively analyze participants' experiences with inequity in workplace-based assessment and their lifeworlds. To understand participants' lifeworlds, the authors applied Collins' domains of power framework to examine participants' varied and unique locations within intersecting power relations. Participants described the phenomenon of inequity in workplace-based assessment as a type of abrasion, that is, an injury caused by friction that occurred when a workplace-based assessment excoriated their sense of self. Abrasion had three dimensions: physical, affective, and temporal. These findings suggest that inequity in workplace-based assessment cannot be tracked solely by measuring disparities in numbers and grades, differences in the narrative language used in comments, and other common measures of inequity; rather, it can be characterized as an experienced, felt phenomenon with essential dimensions. These findings have major implications for how inequity is conceptualized and intervened upon in medical education.
Affiliation(s)
- Hannah L Kakara Anderson
- University of Pennsylvania Perelman School of Medicine and PhD Candidate, Maastricht University School of Health Professions Education, United States
- Layla Abdulla
- Department of Pediatrics, The Children's Hospital of Philadelphia, United States
- Pricilla Cabral
- Department of Pediatrics, The Children's Hospital of Philadelphia, United States
- Marjan Govaerts
- Maastricht University, School of Health Professions Education, the Netherlands
- Dorene F Balmer
- Perelman School of Medicine, University of Pennsylvania, and Co-Director of Research on Pediatric Education, The Children's Hospital of Philadelphia, United States
- Jamiu O Busari
- Horacio Oduber Hospital, Aruba; Disability and Rehabilitation Research, Ontario Tech University; Medical Education, Maastricht University School of Health Professions Education, the Netherlands

2. Kakara Anderson HL, Govaerts M, Abdulla L, Balmer DF, Busari JO, West DC. Clarifying and expanding equity in assessment by considering three orientations: Fairness, inclusion and justice. Med Educ 2025; 59:494-502. PMID: 39279355. PMCID: PMC11976206. DOI: 10.1111/medu.15534.
Abstract
CONTEXT Despite increasing discussion and scholarship, equity in assessment is rarely defined and distinguished in a way that allows for shared understanding in medical education. This paper seeks to clarify and expand the conversation about equity in assessment by critically reviewing three distinct and evolving orientations toward equity in assessment. 'Orientations' here refers to the positions, attitudes, interests or priorities individuals can hold toward equity in assessment. The three orientations include fairness-oriented assessment, assessment for inclusion and justice-oriented assessment. While fairness-oriented assessment is a prevailing orientation in medical education, assessment for inclusion and justice-oriented assessment, originally developed in other fields of education, deserve careful consideration. METHODS In this paper, the authors explore the assumptions underpinning each orientation by critically examining its foundational literature. They reflect on the unique perspectives each orientation provides, including the actions one might take and the advantages and disadvantages that might result from looking at equity in assessment from any one orientation. CONCLUSIONS Informed by this reflection, the authors propose that to more effectively advance equity in assessment in medical education, those working in the field should clearly identify their respective orientations, intentionally choose methods, tools and measures aligned with their orientations and expand their work by exploring alternative orientations.
Affiliation(s)
- Hannah L. Kakara Anderson
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, USA
- School of Health Professions Education (SHE), Maastricht University, Maastricht, The Netherlands
- Marjan Govaerts
- Department of Educational Development, School of Health Professions Education (SHE), Maastricht University, Maastricht, The Netherlands
- Layla Abdulla
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Dorene F. Balmer
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Department of Pediatrics, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, USA
- Jamiu O. Busari
- Health Professions Education, Horacio Oduber Hospital, Oranjestad, Aruba
- Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Daniel C. West
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania, USA

3. Moreci R, Spencer B, Fallon B, Harbaugh C, Zendejas B, Gray BW, Alaish S, Diaz-Miron J, Ehrlich P, Gadepalli S, Newman E, Sandhu G, Hirschl R. Implementation of a Competency-Based Training Program in Pediatric Surgery: A Pilot Study. J Pediatr Surg 2025; 60:162257. PMID: 40120395. DOI: 10.1016/j.jpedsurg.2025.162257.
Abstract
INTRODUCTION Current surgical trainees experience less autonomy in their training than prior generations. Intentionally integrating graduated autonomy into the educational development of fellows would enhance trainee competence while providing the safety net of fellowship programming. The objective of this study was to pilot a competency-based training (CBT) approach for pediatric surgery fellowship programs. METHODS A CBT approach was developed at a single pediatric surgery fellowship program. Three goals were identified for the development of the CBT structure: 1) demonstrate flexibility for trainees to gain knowledge and skills at individualized paces, 2) readily identify and address knowledge gaps, and 3) facilitate increased autonomy for fellows while in training, in areas where competence has been successfully assessed. The CBT program consisted of five components: operative evaluations, a communication assessment tool (CAT), entrustable professional activities (EPAs), oral exams, and video reviews. Once a fellow completed the minimum requirements of each component, they proceeded to the final review at the clinical competency committee (CCC) meeting. For this pilot, we tracked the changes that were made to the program over the first few years as well as progression of the first six fellows and their participation in the various components. RESULTS Between 2017 and 2023, six pediatric surgery fellows were evaluated by fourteen pediatric surgery faculty. Three fellows progressed through the pilot program with three procedures (Laparoscopic Appendectomy, Laparoscopic Inguinal Hernia Repair, and Laparoscopic Pyloromyotomy), while the other three fellows progressed through the program with these three initial and two additional procedures (Laparoscopic Gastrostomy Tube Placement and Open Malrotation Repair). The median number of each component completed was: 5 CATs, 101 operative evaluations, 4 oral exams, and 3 video reviews. Fellows achieved competency for 1-4 procedures. From semi-structured interviews, fellows appeared overall satisfied with their experience of the CBT program, stating that these additional sets of tasks and requirements were a helpful thought exercise, provided an extra layer of structure to the curriculum, created a culture of independence and autonomy, and gave added purpose to the first year of training. The principal strength of the program was the structured format leading to graduated autonomy. All fellows reported feeling very prepared to move into a faculty role following graduation from fellowship training. CONCLUSION Competency-based training may allow for identification of specific gaps in knowledge, skills or abilities and support progressive autonomy among trainees prior to fellowship graduation. Generating further validity evidence and engaging in implementation research are necessary for quality improvement and dissemination of the program.
Affiliation(s)
- Rebecca Moreci
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA
- Brianna Spencer
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA; Department of Surgery, Michigan Medicine, Ann Arbor, MI, USA
- Brian Fallon
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA; Department of Pediatric Surgery, Boston Children's Hospital, Harvard Medical School, Boston, MA, USA
- Benjamin Zendejas
- Department of Pediatric Surgery, Boston Children's Hospital, Harvard Medical School, Boston, MA, USA
- Brian W Gray
- Division of Pediatric General Surgery, Department of Surgery, Indiana University School of Medicine, Riley Hospital for Children, Indianapolis, IN, USA
- Samuel Alaish
- Division of Pediatric Surgery, Department of Surgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Jose Diaz-Miron
- Department of Pediatric Surgery, University of Colorado School of Medicine, Children's Hospital Colorado, Aurora, CO, USA
- Peter Ehrlich
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA
- Samir Gadepalli
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA
- Erika Newman
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA
- Gurjit Sandhu
- Department of Surgery, Michigan Medicine, Ann Arbor, MI, USA
- Ronald Hirschl
- Section of Pediatric Surgery, Department of Surgery, University of Michigan Medical School, C.S. Mott Children's Hospital, Ann Arbor, MI, USA

4. Anderson HK, Cabral P, Gerstenzang E, Liverpool C, Gonzalez M, Weiss A, Cullen D, Balmer D, Govaerts M, West DC, Busari J. Co-Designing a Justice-Oriented Assessment System in a Pediatric Residency Program: Report from the Designing for Equity in Medical Education Project. Perspect Med Educ 2025; 14:141-148. PMID: 40226578. PMCID: PMC11987879. DOI: 10.5334/pme.1541.
Abstract
Background and Need for Innovation: There is a large body of evidence that assessment systems in medical education are inequitable for many groups of learners. A common approach to improving equity has been the use of organizational strategies, in which training program leaders work from their own perspective to develop and implement improvements to existing assessment systems. However, emerging assessment approaches, such as justice-oriented assessment, argue that assessment systems must be made more equitable through critique and rebuilding via co-design with learners, assessors, and other key users. Little is known about how to apply these methods to workplace-based assessment in medical education. Goal of Innovation: To fill the knowledge gap about how to co-design a more equitable, justice-oriented, workplace-based assessment system in pediatric post-graduate medical education. Steps Taken for Development and Implementation of Innovation: Using the Design Justice framework, the authors completed 4 of the 5 phases of Design Thinking to co-design a workplace-based assessment system with learners and other users in their institution's pediatric residency program. Evaluation of Innovation: To understand whether and how Design Justice principles were present and operationalized in the process of co-designing the assessment system, the authors evaluated the design activities in each phase of the Design Thinking process, the outputs of the design process, and the experiences of participating users. Critical Reflection: Evidence of Design Justice principles included participants' feelings of being heard, affirmed, and empowered, as well as the design teams' iterative, critical reflection on making the project accessible, accountable, sustainable, and collaborative. This project offers a practical example of co-designing a justice-oriented assessment system, whose process and principles can inform efforts to advance equity in assessment.
Affiliation(s)
- Anna Weiss
- The Children's Hospital of Philadelphia, US
- Marjan Govaerts
- Maastricht University School of Health Professions Education, NL
- Jamiu Busari
- Maastricht University School of Health Professions Education, NL

5. Anderson HLK, West DC, Schwartz AJ, Weiss AK, Marcus C, Hanson E, Turner DA, Schumacher DJ. Inequity in Assessment Among Pediatric Residents. JAMA Netw Open 2025; 8:e255594. PMID: 40244589. PMCID: PMC12006863. DOI: 10.1001/jamanetworkopen.2025.5594.
Abstract
This survey study evaluates self-reported experiences of biased verbal and written assessments in medical education.
Affiliation(s)
- Hannah L. Kakara Anderson
- The Children's Hospital of Philadelphia (CHOP), Philadelphia, Pennsylvania
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia
- Maastricht University School of Health Professions Education, Maastricht, the Netherlands
- CHOP Education Collaboratory, Philadelphia, Pennsylvania
- Daniel C. West
- The Children's Hospital of Philadelphia (CHOP), Philadelphia, Pennsylvania
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia
- CHOP Education Collaboratory, Philadelphia, Pennsylvania
- Alan J. Schwartz
- Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network, McLean, Virginia
- Anna K. Weiss
- The Children's Hospital of Philadelphia (CHOP), Philadelphia, Pennsylvania
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia
- CHOP Education Collaboratory, Philadelphia, Pennsylvania
- Elizabeth Hanson
- Joe R. and Teresa Lozano Long School of Medicine, University of Texas Health San Antonio
- Daniel J. Schumacher
- Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio
- University of Cincinnati College of Medicine, Cincinnati, Ohio

6. Pillay TS, Jafri L, Punchoo R. Formulation of workplace-based assessments (WBAs) and entrustable professional activities (EPAs) for postgraduate medical trainees in clinical biochemistry. J Clin Pathol 2025; 78:240-250. PMID: 39667849. DOI: 10.1136/jcp-2024-209796.
Affiliation(s)
- Tahir S Pillay
- Department of Chemical Pathology, University of Pretoria & National Health Laboratory Service, Pretoria, South Africa
- Division of Chemical Pathology, University of Cape Town, Rondebosch, South Africa
- Lena Jafri
- The Aga Khan University, Karachi, Pakistan
- Rivak Punchoo
- Division of Chemical Pathology, University of Cape Town Faculty of Health Sciences, Observatory, Western Cape, South Africa

7. Khan SB, Maart R. Clinical assessment strategies for competency-based education in prosthetic dentistry. J Dent Educ 2025; 89:375-382. PMID: 39436275. PMCID: PMC11903901. DOI: 10.1002/jdd.13746.
Abstract
Reflective practice is viewed as a theoretical and pedagogical concept in higher education with several diverse approaches and interpretations. Its most important aspect is that it is a necessary element of quality assurance in higher education, one that should occur recurrently and at different stages of a program. It usually entails an evaluation of the instruction already delivered, which has become the norm in educational settings, in order to improve learning outcomes. Reflective practice must therefore be seen as a tool that allows continuous improvement, modification, and change of educational approaches, including theoretical and clinical assessment strategies. Academics in prosthetic dentistry at a research-led university reflected on the assessment strategies currently used in the senior undergraduate dental program, as part of a quality assurance process and a review of their global comparability. This paper aims to share and explain the importance of reviewing assessment strategies in higher education, especially in such a clinical program, using reflective practice as a framework. Different assessment strategies used over a 5-year period are explored, and their different structures, expectations, and appropriateness for a clinical program are reported against the literature. The concerns were addressed in a cyclical manner within this framework, and Bloom's taxonomy and blueprinting were implemented where appropriate. We conclude that, without a validated definition of and framework for regular reflective practice, and without guidelines for modifying the included assessment strategies, quality assurance within a competency-based dental program may be compromised.
Affiliation(s)
- Saadika B. Khan
- Department of Prosthodontics, Faculty of Dentistry, University of the Western Cape, Cape Town, South Africa
- Ronel Maart
- Department of Prosthodontics, Faculty of Dentistry, University of the Western Cape, Cape Town, South Africa

8. Sandars J, Cecilio-Fernandes D, Patel R. "Showing the Best Version of Yourself": The Importance of Dynamic Assessment for Trainees Undergoing Workplace-Based Assessments in Postgraduate Training. Br J Hosp Med (Lond) 2024; 85:1-12. PMID: 39831500. DOI: 10.12968/hmed.2024.0489.
Abstract
Workplace-based assessments (WPBAs) in postgraduate training may not always provide an accurate representation of a trainee's capability to perform a given task, or a true measure of a trainee's overall competence in clinical practice settings. This article describes how trainers can use a theory-driven and evidence-based intervention called dynamic assessment to give an individual the best opportunity to demonstrate a more accurate representation of their performance, and ultimately present the best version of themselves when undergoing an observed WPBA, such as a Direct Observation of Procedural Skills (DOPS) or Mini-Clinical Evaluation Exercise (mini-CEX). Dynamic assessment combines educational support with assessment as the trainee undergoes the WPBA, using focussed questions as prompts to facilitate the trainee's coordination of their motivational and thinking processes, which is often challenged during assessments. In addition, responses to the prompts can provide trainers with information to inform specific feedback for future professional development.
Affiliation(s)
- John Sandars
- Edge Hill University Medical School, Edge Hill University, Ormskirk, UK
- Dario Cecilio-Fernandes
- Institute of Medical Education Research Rotterdam, Erasmus MC University Medical Centre Rotterdam, Rotterdam, Netherlands
- Rakesh Patel
- Barts and the London Faculty of Medicine and Dentistry, Queen Mary University London, London, UK

9. Kassam A, de Vries I, Zabar S, Durning SJ, Holmboe E, Hodges B, Boscardin C, Kalet A. The Next Era of Assessment Within Medical Education: Exploring Intersections of Context and Implementation. Perspect Med Educ 2024; 13:496-506. PMID: 39399409. PMCID: PMC11469546. DOI: 10.5334/pme.1128.
Abstract
In competency-based medical education (CBME), which is being embraced globally, the patient-learner-educator encounter occurs in a highly complex context that contributes to a wide range of assessment outcomes. Current and historical barriers to considering context in assessment include the existing post-positivist epistemological stance that values objectivity and validity evidence over the variability introduced by context. This is most evident in standardized testing. While the impact of context on assessment has always been critical to medical education, it is becoming more pronounced as many aspects of training diversify. This diversity includes an expanding interest beyond individual trainee competence to include the interdependency and collective nature of clinical competence and the growing awareness that medical education needs to be co-produced among a wider group of stakeholders. In this Eye Opener, we wish to consider: 1) How might we best account for the influence of context in the clinical competence assessment of individuals in medical education? and, by doing so, 2) How could we usher in the next era of assessment that improves our ability to meet the dynamic needs of society and all its stakeholders? The purpose of this Eye Opener is thus two-fold. First, we conceptualize, from a variety of viewpoints, how we might address context in the assessment of competence at the level of the individual learner. Second, we present recommendations that address how to approach the implementation of a more contextualized competence assessment.
Affiliation(s)
- Aliya Kassam
- Department of Community Health Sciences and Director of Scholarship in the Office of Postgraduate Medical Education at the Cumming School of Medicine, University of Calgary, Alberta, Canada
- Ingrid de Vries
- Faculty of Education at Queen's University, Kingston, Canada
- Sondra Zabar
- Division of General Internal Medicine and Clinical Innovation at the NYU Grossman School of Medicine, New York, New York, USA
- Steven J. Durning
- Center for Health Professions Education at the Uniformed Services University of the Health Sciences in Bethesda, Maryland, USA
- Brian Hodges
- Temerty Faculty of Medicine at University of Toronto, Canada
- Royal College of Physicians and Surgeons of Canada, Canada
- Christy Boscardin
- Department of Medicine and Department of Anesthesia and Perioperative Care, and the Faculty Director of Assessment in the School of Medicine at the University of California, San Francisco, California, USA
- Adina Kalet
- Department of Medicine, Center for the Advancement of Population Health at the Medical College of Wisconsin, Wisconsin, USA

10. Hsu PJ, Wnuk G, Leininger L, Peterson S, Hughes DT, Sandhu G, Zwischenberger JB, George BC, Aubry S. When the first try fails: re-implementation of SIMPL in a general surgery residency. BMC Surg 2024; 24:257. PMID: 39261888. PMCID: PMC11389305. DOI: 10.1186/s12893-024-02557-2.
Abstract
BACKGROUND Workplace-based assessment (WBA) can facilitate evaluation of operative performance; however, implementation of WBA is sometimes unsuccessful. The American Board of Surgery Entrustable Professional Activities WBA project was launched in July 2023. Some programs will face the challenge of re-implementation of a WBA following previous failures. It is unknown what interventions are most effective for WBA re-implementation. Our goal is to identify barriers and facilitators to re-implementing SIMPL, an operative performance WBA. METHODS The System for Improving and Measuring Procedural Learning (SIMPL) was implemented at our residency in 2018, but usage rates were low. We interviewed residents and faculty to identify barriers to usage and opportunities for improvement. Residents reported that SIMPL usage declined because of several factors, including a low faculty response rate, while some faculty reported not responding because they were unable to login to the app and because usage was not mandated. We then re-implemented SIMPL using a plan based on Kotter's Model of Change. To evaluate impact, we analyzed rates of SIMPL usage when it was first implemented, as well as before and after the date of re-implementation. RESULTS In September 2022, we re-implemented SIMPL at our program with measures addressing the identified barriers. We found that, in the six months after re-implementation, an average of 145.8 evaluations were submitted by residents per month, compared with 47 evaluations per month at the start of the original implementation and 5.8 evaluations per month just prior to re-implementation. Faculty completed 60.6% of evaluations and dictated feedback for 59.1% of these evaluations, compared with 69.1% at implementation (44% dictated) and 43% prior to re-implementation (53% dictated). CONCLUSIONS After identifying barriers to implementation of a WBA, we re-implemented it with significantly higher usage by faculty and residents. Future opportunities exist to implement or re-implement assessment tools within general surgery programs. These opportunities may have a significant impact in the setting of national standardization of workplace-based assessment among general surgery residencies.
Affiliation(s)
- Phillip J Hsu
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA
- Gregory Wnuk
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA
- Lisa Leininger
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA
- David T Hughes
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA
- Gurjit Sandhu
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA
- Brian C George
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA
- Staci Aubry
- Department of Surgery, University of Michigan, Ann Arbor, MI, USA

11. Costich M, Friedman S, Robinson V, Catallozzi M. Implementation and faculty perception of outpatient medical student workplace-based assessments. Clin Teach 2024; 21:e13751. PMID: 38433555. DOI: 10.1111/tct.13751.
Abstract
BACKGROUND There is growing interest in use of entrustable professional activity (EPA)-grounded workplace-based assessments (WBAs) to assess medical students through direct observation in the clinical setting. However, there has been very little reflection on how these tools are received by the faculty using them to deliver feedback. Faculty acceptance of WBAs is fundamentally important to sustained utilisation in the clinical setting, and understanding faculty perceptions of the WBA as an adjunct for giving targeted feedback is necessary to guide future faculty development in this area. APPROACH Use of a formative EPA-grounded WBA was implemented in the ambulatory setting during the paediatrics clerkship following performance-driven training and frame-of-reference training with faculty. Surveys and semi-structured interviews with faculty members explored how faculty perceived the tool and its impact on feedback delivery. EVALUATION Faculty reported providing more specific, task-oriented feedback following implementation of the WBA, as well as greater timeliness of feedback and greater satisfaction with opportunities to provide feedback, although these latter two findings did not reach statistical significance. Themes from the interviews reflected the benefits of WBAs, persistent barriers to the provision of feedback and suggestions for improvement of the WBA. IMPLICATIONS EPA-grounded WBAs are feasible to implement in the outpatient primary care setting and improve feedback delivery around core EPAs. The WBAs positively impacted the way faculty conceptualise feedback and provide learners with more actionable, behaviour-based feedback. Findings will inform modifications to the WBA and future faculty development and training to allow for sustainable WBA utilisation in the core clerkship.
Affiliation(s)
- Marguerite Costich
- Division of Child and Adolescent Health, Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Suzanne Friedman
- Division of Child and Adolescent Health, Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Victoria Robinson
- Division of Child and Adolescent Health, Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Marina Catallozzi
- Division of Child and Adolescent Health, Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Department of Pediatrics, Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian, New York, New York, USA
- Department of Population and Family Health, Mailman School of Public Health, Columbia University Irving Medical Center, New York, New York, USA

12. Heath JK, Kogan JR, Holmboe ES, Conforti L, Park YS, Dine CJ. Gender Differences in Work-Based Assessment Scores and Narrative Comments After Direct Observation. J Gen Intern Med 2024; 39:1795-1802. PMID: 38289461. PMCID: PMC11282012. DOI: 10.1007/s11606-024-08645-6.
Abstract
BACKGROUND While some prior studies of work-based assessment (WBA) numeric ratings have not shown gender differences, they have been unable to account for the true performance of the resident or explore narrative differences by gender. OBJECTIVE To explore gender differences in WBA ratings as well as narrative comments (when scripted performance was known). DESIGN Secondary analysis of WBAs obtained from a randomized controlled trial of a longitudinal rater training intervention in 2018-2019. Participating faculty (n = 77) observed standardized resident-patient encounters and subsequently completed rater assessment forms (RAFs). SUBJECTS Participating faculty in longitudinal rater training. MAIN MEASURES Gender differences in mean entrustment ratings (4-point scale) were assessed with multivariable regression (adjusted for scripted performance, rater and resident demographics, and the interaction between study arm and time period [pre- versus post-intervention]). Using pre-specified natural language processing categories (masculine, feminine, agentic, and communal words), multivariable linear regression was used to determine associations of word use in the narrative comments with resident gender, race, and skill level, faculty demographics, and the interaction between study arm and time period (pre- versus post-intervention). KEY RESULTS Across 1527 RAFs, there were significant differences in entrustment ratings between women and men standardized residents (2.29 versus 2.54, respectively, p < 0.001) after correction for resident skill level. Compared with men, feminine terms were more common in comments about what the resident did poorly for women residents (β = 0.45, CI 0.12-0.78, p = 0.01). This persisted despite adjusting for the faculty's entrustment ratings. There were no other significant linguistic differences by gender. CONCLUSIONS In contrast to prior studies, we found entrustment rating differences in a simulated WBA that persisted after adjusting for the residents' scripted performance. There were also linguistic differences by gender after adjusting for entrustment ratings, with feminine terms used more frequently in comments about women in some, but not all, narrative comments.
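As a rough illustration of the kind of linguistic analysis described in this abstract, the sketch below counts category words in narrative comments. The word lists, comments, and per-100-word rates are invented placeholders, not the validated dictionaries or data used in the study, and the study's multivariable regression step is omitted.

```python
import re
from collections import Counter

# Illustrative word lists only -- NOT the validated categories used in the study.
COMMUNAL = {"caring", "helpful", "warm", "supportive", "kind"}
AGENTIC = {"confident", "assertive", "independent", "decisive", "driven"}

def category_rates(comment: str) -> dict:
    """Return the rate (per 100 words) of each word category in a narrative comment."""
    tokens = re.findall(r"[a-z']+", comment.lower())
    counts = Counter(tokens)
    total = max(len(tokens), 1)
    return {
        "communal": 100 * sum(counts[w] for w in COMMUNAL) / total,
        "agentic": 100 * sum(counts[w] for w in AGENTIC) / total,
    }

# Hypothetical rater assessment form (RAF) comments keyed by resident gender.
comments = [
    ("woman", "She was warm and supportive with the patient but hesitant about the plan."),
    ("man", "He was confident and decisive, though he interrupted the patient twice."),
]

for gender, text in comments:
    print(gender, category_rates(text))
```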
Affiliation(s)
- Janae K Heath
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Jennifer R Kogan
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Eric S Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Lisa Conforti
- Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Yoon Soo Park
- University of Illinois College of Medicine, Chicago, IL, USA
- C Jessica Dine
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA

13. Lim A, Krishnan S, Singh H, Furletti S, Sarkar M, Stewart D, Malone D. Linking assessment to real life practice - comparing work based assessments and objective structured clinical examinations using mystery shopping. Adv Health Sci Educ Theory Pract 2024; 29:859-878. PMID: 37728720. PMCID: PMC11208193. DOI: 10.1007/s10459-023-10284-1.
Abstract
Objective Structured Clinical Examinations (OSCEs) and Work Based Assessments (WBAs) are the mainstays of assessing clinical competency in health professions' education. Underpinned by the extrapolation inference in Kane's Validity Framework, the purpose of this study is to determine whether OSCEs translate to real-life performance by comparing students' OSCE performance to their performance in real life (as a WBA) using the same clinical scenario, and to understand factors that affect students' performance. A sequential explanatory mixed methods approach was used, in which students' OSCE grades were compared with their WBA grades. Students were third year pharmacy undergraduates on placement at a community pharmacy in 2022. The WBA was conducted by a simulated patient, unbeknownst to students and indistinguishable from a genuine patient, visiting the pharmacy asking for health advice. The simulated patient is referred to as a 'mystery shopper' and the process as 'mystery shopping' in this manuscript. Community pharmacy is an ideal setting for real-time observation and mystery shopping as staff can be accessed without an appointment. The students' provision of care and clinical knowledge were assessed by the mystery shopper using the same clinical checklist the student was assessed against in the OSCE. Students who had the WBA conducted were then invited to participate in semi-structured interviews to discuss their experiences in both settings. Overall, 92 mystery shopper (WBA) visits with students were conducted and 36 follow-up interviews were completed. The median WBA score (41.7% [IQR 28.3]) was significantly lower than the median OSCE score (80.9% [IQR 19.0]) across all participants (p < 0.001). Interviews revealed that students knew they did not perform as well in the WBA as in their OSCE, but they reflected that they still need OSCEs to prepare them to manage real-life patients. Many students related their performance to how they perceived their role in OSCEs versus WBAs, noting that OSCEs allowed them more autonomy to manage the patient compared with an unfamiliar workplace. As suggested by activity theory, students' performance can be driven by their motivation, which differed in the two contexts.
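A minimal sketch of the score comparison reported here is shown below; the paired scores are invented, and the Wilcoxon signed-rank test is an assumption, since the abstract does not name the statistical test used.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired scores (%) for the same students -- not the study's data.
osce = np.array([82, 78, 91, 75, 85, 80, 88, 79])
wba = np.array([45, 38, 60, 30, 52, 41, 47, 35])

def median_iqr(scores):
    """Return the median and interquartile range of a set of scores."""
    q1, q2, q3 = np.percentile(scores, [25, 50, 75])
    return q2, q3 - q1

for name, scores in [("OSCE", osce), ("WBA", wba)]:
    med, iqr = median_iqr(scores)
    print(f"{name}: median {med:.1f}% [IQR {iqr:.1f}]")

# Paired nonparametric comparison of the two assessment settings.
stat, p = wilcoxon(osce, wba)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.4f}")
```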
Affiliation(s)
- Angelina Lim
- Faculty of Pharmacy and Pharmaceutical Sciences, Monash University, Parkville, VIC 3052, Australia
- Sunanthiny Krishnan
- Department of Cardiovascular Sciences, University of Leicester, Glenfield Hospital, Leicester LE3 9QP, UK
- Harjit Singh
- Faculty of Pharmacy and Pharmaceutical Sciences, Monash University, Parkville, VIC 3052, Australia
- Simon Furletti
- Faculty of Pharmacy and Pharmaceutical Sciences, Monash University, Parkville, VIC 3052, Australia
- Mahbub Sarkar
- Monash Centre for Scholarship in Health Education, Faculty of Medicine and Nursing, Monash University, Clayton, VIC 3806, Australia
- Daniel Malone
- Faculty of Pharmacy and Pharmaceutical Sciences, Monash University, Parkville, VIC 3052, Australia

14. Baboolal SO, Singaram VS. Implementation and Impact of an Adapted Digital Perioperative Competency-Building Tool to Enhance Teaching, Learning and Feedback in Postgraduate Competency-Based Medical Education. J Surg Educ 2024; 81:722-740. PMID: 38492984. DOI: 10.1016/j.jsurg.2024.01.015.
Abstract
OBJECTIVES The purpose of this educational intervention was to introduce, iteratively adapt, and implement a digital formative assessment tool in a surgical speciality. The study also evaluated the intervention's impact on perioperative teaching, learning, feedback, and surgical competency. DESIGN A participatory action research model with a mixed methods approach. SETTING This study was performed over 10 months in an institutional hospital in South Africa with a general surgery department. PARTICIPANTS Twelve supervising surgical trainers/faculty and 12 surgical trainees/residents consented to participate in the intervention. RESULTS The first 4 months of the intervention focused on relationship building, a multi-stakeholder contextual needs assessment and training sessions to support a shared mindset and shift in the teaching and learning culture. The final adapted perioperative competency-building tool comprised a 23-item assessment with four open-text answers (Table 1). Over the following 6-month period, 48 workplace-based competency-building perioperative evaluations were completed. Most trainees took less than 5 minutes to self-assess (67%) before most trainers (67%) took less than 5 minutes to give oral feedback to the trainee after the perioperative supervised learning encounter. On average, the digital tool took 6 minutes to complete during the bidirectional perioperative teaching and learning encounter with no negative impact on the operational flow. All trainers and trainees reported the training and implementation of the digital tool to be beneficial to teaching, learning, feedback, and the development of surgical competency. Analysis of the completed tools revealed several trainees showing evidence of progression in surgical competency for index procedures within the speciality. The focus groups and interviews also showed a change in the teaching and learning culture: more positively framed, frequent, structured, and specific feedback, improved accountability, and trainee-trainer perioperative readiness for teaching. Highlighted changes included the usefulness of trainee self-assessment before perioperative trainer feedback and the tool's value in improving competency to Kirkpatrick Level 4. CONCLUSION Implementing an adapted digital Workplace-Based Assessment (WBA) tool using a participatory action research model has proven successful in enhancing the effectiveness of supervised perioperative teaching and learning encounters. This approach has improved teaching and feedback practices, facilitated the development of surgical competency, and ultimately impacted the overall culture to Kirkpatrick level 4. Importantly, it has positively influenced the trainee-trainer relationship dynamic. Based on these positive outcomes, we recommend using this effective method and our relationship-centred framework for implementing formative competency-building tools in future studies. By doing so, larger-scale and successful implementation of Competency-Based Medical Education (CBME) could be achieved in various contexts. This approach can potentially enhance teaching and learning encounters, promote competency development, and improve the overall educational experience for surgical trainees and trainers.
Affiliation(s)
- Sandika O Baboolal
- School of Clinical Medicine, College of Health Sciences, University of KwaZulu-Natal, 719 Umbilo Road, Umbilo, Durban 4001, South Africa; Ophthalmology Department, Division of Surgery, James Paget University Hospital NHS Foundation Trust, United Kingdom
- Veena S Singaram
- School of Clinical Medicine, College of Health Sciences, University of KwaZulu-Natal, 719 Umbilo Road, Umbilo, Durban 4001, South Africa

15. Moreci R, Gates RS, Marcotte KM, George BC, Krumm AE. Right Case, Right Time: Which Procedures Best Differentiate General Surgery Trainees' Operative Performance? J Surg Educ 2023; 80:1493-1502. PMID: 37349156. DOI: 10.1016/j.jsurg.2023.05.009.
Abstract
OBJECTIVE Assessing surgical trainee operative performance is time- and resource-intensive. To maximize the utility of each assessment, it is important to understand which assessment activities provide the most information about a trainee's performance. The objective of this study is to identify the procedures that best differentiate performance for each general surgery postgraduate year (PGY)-level, leading to recommendations for targeted assessment. DESIGN The Society for Improving Medical Professional Learning (SIMPL) operative performance ratings were modeled using a multilevel Rasch model which identified the highest and lowest performing trainees for each PGY-level. For each procedure within each PGY-level, a procedural performance discrimination index was calculated by subtracting the proportion of "practice-ready" ratings of the lowest performing trainees from the proportion of "practice-ready" ratings of the highest performing trainees. Four-quadrant plots were created using the median procedure volume and median discrimination index for each PGY-level. All procedures within the upper right quadrant were considered "highly differentiating, high volume" procedures. SETTING This study was conducted across 70 general surgical residency programs who are members of the SIMPL collaborative. PARTICIPANTS A total of 54,790 operative performance evaluations of categorical general surgery trainees were collected between 2015 and 2021. Trainees who had at least 1 procedure in common were included. Procedures with less than 25 evaluations per training year were excluded. RESULTS The total number of evaluations per procedure ranged from 25 to 2,131. Discrimination values were generated for 51 (PGY1), 54 (PGY2), 92 (PGY3), 105 (PGY4), and 103 (PGY5) procedures. Using the above criteria, a total of 12 (PGY1), 15 (PGY2), 22 (PGY3), 21 (PGY4), and 28 (PGY5) procedures were identified as highly differentiating, high volume procedures. CONCLUSIONS Our study draws on national data to identify procedures which are most useful in differentiating trainee operative performance at each PGY-level. This list of procedures can be used to guide targeted assessment and improve assessment efficiency.
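The discrimination index and quadrant rule described in this abstract reduce to simple arithmetic once the highest- and lowest-performing trainee groups are known. The sketch below illustrates the calculation with invented procedure names, proportions, and volumes, and it omits the multilevel Rasch modelling the study used to identify those groups.

```python
# Minimal sketch of the per-procedure discrimination index and quadrant rule,
# assuming the high- and low-performing trainee groups have already been
# identified (the study used a multilevel Rasch model for that step).
from statistics import median

# procedure -> (practice-ready proportion among highest performers,
#               practice-ready proportion among lowest performers,
#               number of evaluations)  -- invented numbers for illustration.
procedures = {
    "Laparoscopic appendectomy": (0.90, 0.55, 400),
    "Laparoscopic cholecystectomy": (0.85, 0.60, 650),
    "Inguinal hernia repair": (0.70, 0.62, 300),
    "Partial colectomy": (0.65, 0.30, 120),
}

# Discrimination index = proportion "practice-ready" among the highest
# performers minus the same proportion among the lowest performers.
discrimination = {name: hi - lo for name, (hi, lo, _n) in procedures.items()}
volume = {name: n for name, (_hi, _lo, n) in procedures.items()}

med_disc = median(discrimination.values())
med_vol = median(volume.values())

# "Highly differentiating, high volume" = upper-right quadrant:
# above the median on both discrimination index and evaluation volume.
targets = [
    name for name in procedures
    if discrimination[name] > med_disc and volume[name] > med_vol
]
print(targets)
```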
Affiliation(s)
- Rebecca Moreci
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
- Rebecca S Gates
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
- Kayla M Marcotte
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan; Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan
- Brian C George
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan
- Andrew E Krumm
- Department of Surgery, Center for Surgical Training and Research, University of Michigan, Ann Arbor, Michigan; Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan; School of Information, University of Michigan, Ann Arbor, Michigan

16. Baboolal SO, Singaram VS. The Use, Effectiveness, and Impact of Workplace-Based Assessments on Teaching, Supervision and Feedback Across Surgical Specialties. J Surg Educ 2023; 80:1158-1171. PMID: 37407351. DOI: 10.1016/j.jsurg.2023.05.012.
Abstract
OBJECTIVE To investigate the use and effectiveness of Workplace-based assessments (WBAs) and their impact on training, feedback, and perioperative teaching in surgical training programs. DESIGN A mixed methods cross-sectional, national electronic survey was conducted with surgical trainees and consultant trainers. SETTINGS The trainees and supervising faculty were from all 8 major surgical training universities across 11 surgical disciplines in South Africa. PARTICIPANTS A total of 108 surgical trainees and 41 supervising consultant trainers from 11 surgical disciplines across 8 surgical training universities responded to the survey. RESULTS The most significant educational gap identified by both the surgical trainees and trainers across all surgical disciplines was inadequate perioperative feedback. A third of the respondents were currently using workplace-based assessments. The WBA users (both trainees and trainers) had a higher rating for the general quality of surgical feedback than WBA nonusers (p = 0.02). WBA users also had a higher rating for the general quality of feedback given to trainees on their skills and competence (p = 0.04) and a higher rating for trainee supervision (p = 0.01) and the specialist training program overall (p = 0.01). The WBA users also had a higher rating for the assessment of competencies such as the trainee as an effective communicator (p < 0.01) and collaborator (p = 0.04). CONCLUSION This study found that the use of WBAs enhances the quality and effectiveness of feedback in surgical training programs. We also found that the use of WBAs enhances perioperative teaching and learning and improves the assessment of relational competencies. This was also associated with high ratings for the quality of trainee supervision. Faculty and trainee development, strengthening the trainee-trainer relationship, and integrating iterative stakeholder feedback could help realize the full potential of WBAs to augment surgical training across disciplines.
Affiliation(s)
- Sandika O Baboolal
- School of Clinical Medicine, College of Health Sciences, University of KwaZulu-Natal, Durban, South Africa
- Veena S Singaram
- School of Clinical Medicine, College of Health Sciences, University of KwaZulu-Natal, Durban, South Africa

17. Blanchette P, Poitras ME, St-Onge C. Assessing trainee's performance using reported observations: Perceptions of nurse meta-assessors. Nurse Educ Today 2023; 126:105836. PMID: 37167832. DOI: 10.1016/j.nedt.2023.105836.
Abstract
BACKGROUND Educational and health care organizations that prepare meta-assessors to fulfill their role in the assessment of trainees' performance based on reported observations have little literature to rely on. While the assessment of trainees' performance based on reported observations has been operationalized, we have yet to fully understand the elements that can affect its quality. Closing this gap in the literature will provide valuable insight that could inform the implementation and quality monitoring of the assessment of trainees' performance based on reported observations. OBJECTIVES The purpose of this study was to explore the elements to consider in the assessment of trainees' performance based on reported observations from the perspectives of meta-assessors. METHODS (Design, Settings, Participants, Data Collection and Analysis) The authors adopted Sandelowski's qualitative descriptive approach to interview nurse meta-assessors from two nursing programs. A semi-structured interview guide was used to document the elements to consider in the assessment of nursing trainees' performance based on reported observations, and a survey was used to collect sociodemographic data. The authors conducted a thematic analysis of the interview transcripts. RESULTS Thirteen meta-assessors participated in the study. Three core themes were identified: (1) meta-assessors' appropriation of their perceived assessment roles and activities, (2) team climate of information sharing, and (3) challenges associated with the assessment of trainees' performance based on reported observations. Each theme comprises several subthemes. CONCLUSIONS To optimize the quality of the assessment of trainees' performance based on reported observations and ratings, health professions education (HPE) programs might consider how to better clarify the meta-assessor's roles and activities, as well as how interventions could be created to promote a climate of information sharing and to address the challenges identified. This work will guide educational and health care organizations in better preparing and supporting meta-assessors and preceptors.
Affiliation(s)
- Marie-Eve Poitras
- Department of Family Medicine and Emergency Medicine, University of Sherbrooke, Sherbrooke, Quebec, Canada
- Christina St-Onge
- Department of Medicine, University of Sherbrooke, Sherbrooke, Quebec, Canada

18. Andreou V, Peters S, Eggermont J, Embo M, Michels NR, Schoenmakers B. Fitness-for-purpose of the CanMEDS competencies for workplace-based assessment in General Practitioner's Training: a Delphi study. BMC Med Educ 2023; 23:204. PMID: 37005633. PMCID: PMC10067520. DOI: 10.1186/s12909-023-04207-2.
Abstract
BACKGROUND In view of the exponentially growing use of the CanMEDS framework and the lack of rigorous evidence about its applicability in workplace-based medical training, further exploration is necessary before accepting the framework as an accurate and reliable set of competency outcomes for postgraduate medical training. Therefore, this study investigated whether the CanMEDS key competencies could be used, first, as outcome measures for assessing trainees' competence in the workplace, and second, as consistent outcome measures across different training settings and phases in a postgraduate General Practitioner's (GP) Training. METHODS In a three-round web-based Delphi study, a panel of experts (n = 25-43) was asked to rate on a 5-point Likert scale whether the CanMEDS key competencies were feasible for workplace-based assessment, and whether they could be consistently assessed across different training settings and phases. Comments on each CanMEDS key competency were encouraged. Descriptive statistics of the ratings were calculated, while content analysis was used to analyse panellists' comments. RESULTS Out of twenty-seven CanMEDS key competencies, consensus was not reached on six competencies for feasibility of assessment in the workplace, and on eleven for consistency of assessment across training settings and phases. Regarding feasibility, three out of four key competencies under the role "Leader", one out of two competencies under the role "Health Advocate", one out of four competencies under the role "Scholar", and one out of four competencies under the role "Professional" were deemed not feasible for assessment in a workplace setting. Regarding consistency, consensus was not achieved for one out of five competencies under "Medical Expert", two out of five competencies under "Communicator", one out of three competencies under "Collaborator", one out of two under "Health Advocate", one out of four competencies under "Scholar", and one out of four competencies under "Professional". No competency under the role "Leader" was deemed to be consistently assessable across training settings and phases. CONCLUSIONS The findings indicate a mismatch between the initial intent of the CanMEDS framework and its applicability in the context of workplace-based assessment. Although the CanMEDS framework could offer starting points, further contextualization is required before implementing it in workplace-based postgraduate medical training.
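A consensus check for a Delphi round like the one described here can be sketched as follows; the 70% agreement threshold, the ratings, and the competency labels are assumptions for illustration only, since the abstract does not state the study's consensus criterion.

```python
# Sketch of a consensus check for one Delphi round on 5-point Likert ratings.
# The 70% agreement threshold is an assumed, commonly used cut-off; the study's
# actual consensus criterion is not stated in the abstract.
from typing import List

def has_consensus(ratings: List[int], threshold: float = 0.70) -> bool:
    """Consensus if at least `threshold` of panellists rate the item 4 or 5."""
    agree = sum(1 for r in ratings if r >= 4)
    return agree / len(ratings) >= threshold

# Hypothetical panel ratings for two CanMEDS key competencies (labels invented).
feasible_in_workplace = {
    "Medical Expert 1.1": [5, 4, 4, 5, 3, 4, 4, 5, 4, 4],
    "Leader 2.1": [3, 2, 4, 3, 2, 3, 4, 2, 3, 3],
}

for item, ratings in feasible_in_workplace.items():
    print(item, "consensus" if has_consensus(ratings) else "no consensus")
```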
Collapse
Affiliation(s)
- Vasiliki Andreou: Academic Centre for General Practice, Department of Public Health and Primary Care, KU Leuven, Box 7001, Kapucijnenvoer 7, Leuven, 3000, Belgium.
- Sanne Peters: Academic Centre for General Practice, Department of Public Health and Primary Care, KU Leuven, Box 7001, Kapucijnenvoer 7, Leuven, 3000, Belgium; School of Health Sciences, Faculty of Medicine, Dentistry and Health Sciences, University of Melbourne, Melbourne, Australia.
- Jan Eggermont: Department of Cellular and Molecular Medicine, KU Leuven, Leuven, Belgium.
- Mieke Embo: Department of Educational Studies, Faculty of Psychology and Educational Sciences, University of Ghent, Ghent, Belgium; Health and Care Research, Artevelde University of Applied Sciences, Ghent, Belgium.
- Nele R Michels: Center for General Practice, Department of Family Medicine and Population Health, University of Antwerp, Antwerp, Belgium.
- Birgitte Schoenmakers: Academic Centre for General Practice, Department of Public Health and Primary Care, KU Leuven, Box 7001, Kapucijnenvoer 7, Leuven, 3000, Belgium.
19
Moeller J, Salas RME. Neurology Education in 2035: The Neurology Future Forecasting Series. Neurology 2023; 100:579-586. [PMID: 36564205 PMCID: PMC10033166 DOI: 10.1212/wnl.0000000000201669] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 07/14/2022] [Accepted: 10/24/2022] [Indexed: 12/24/2022]
Abstract
In the past decade, there have been dramatic changes in all aspects of neurologic care, and along with this, neurology education has transformed. These changes have affected all aspects of education across the educational continuum, including learners, teachers, educators, content, delivery methods, assessments, and outcomes. Health systems science, health humanities, diversity, equity, and inclusion and health disparities are becoming core components of neurology curricula, and, in the future, will be integrated into every aspect of our educational mission. The ways in which material is taught and learned have been influenced by technologic innovations and a growing understanding of the science of learning. We forecast that this trend will continue, with learners choosing from an array of electronic resources to engage with fundamental topics, allowing front-line clinical teachers to spend more time supporting critical reasoning and teaching students how to learn. There has been a growing differentiation of educational roles (i.e., teachers, educators, and scholars). We forecast that these roles will become more distinct, each with an individualized pattern of support and expectations. Assessment has become more aligned with the work of the learners, and there are growing calls to focus more on the impact of educational programs on patient care. We forecast that there will be an increased emphasis on educational outcomes and public accountability for training programs. In this article, we reflect on the history of medical education in neurology and explore the current state to forecast the future of neurology education and discuss ways in which we can prepare.
Affiliation(s)
- Jeremy Moeller: Department of Neurology, Yale University, New Haven, CT.
- Rachel Marie E Salas: Department of Neurology and Neurosurgery, Johns Hopkins School of Medicine, Baltimore, MD.
20
Eltayar AN, Aref SR, Khalifa HM, Hammad AS. Do entrustment scales make a difference in the inter-rater reliability of the workplace-based assessment? Medical Education Online 2022; 27:2053401. [PMID: 35311494 PMCID: PMC8942514 DOI: 10.1080/10872981.2022.2053401] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Received: 11/15/2021] [Revised: 03/08/2022] [Accepted: 03/11/2022] [Indexed: 06/14/2023]
Abstract
BACKGROUND Workplace-based assessment (WBA) is used to assess learners' competencies in their workplaces. Many workplace assessment tools are available and have been validated to assess various constructs, and implementing WBA requires proper training of staff. OBJECTIVE This study aimed to explore the impact of staff training on WBA practices and to evaluate the inter-rater reliability of these practices when using entrustment scales, performance descriptors, and personal judgment. DESIGN In a quasi-experimental study, staff members of the orthopedic department were invited to participate in a training program on the use of entrustment scales and assessment descriptors within WBA tools. In response to the training, subjective judgment was replaced by entrustment scales and performance descriptors in a trauma course offered by the department. The inter-rater reliability of the WBA was then evaluated across these rating approaches. RESULTS Inter-rater reliability was higher with entrustment scales than with performance descriptors or personal judgment. CONCLUSION Inter-rater reliability was highest when entrustment scales were used for WBAs, suggesting that entrustment scales have good psychometric properties with respect to consistency among different raters. They thereby reduce the confounding effect of differences between assessors and may give a clearer picture of learners' actual academic level.
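For readers who want to see what such a comparison can look like in practice, the sketch below computes quadratic-weighted Cohen's kappa for two raters under each rating format using scikit-learn. The scores, the two-rater setup, and the choice of weighted kappa are assumptions for illustration only; the abstract does not state which reliability coefficient or data layout the authors used.

# Minimal sketch comparing inter-rater agreement for two rating formats
# with quadratic-weighted Cohen's kappa (two raters, ordinal 5-level scales).
# All scores below are made up; the study's actual data and statistic are
# not given in the abstract.
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores from two assessors rating the same ten trainees.
entrustment_rater_a = [3, 4, 2, 5, 4, 3, 4, 2, 5, 3]
entrustment_rater_b = [3, 4, 2, 5, 3, 3, 4, 2, 5, 4]
descriptor_rater_a = [3, 5, 2, 4, 4, 2, 4, 3, 5, 3]
descriptor_rater_b = [2, 4, 3, 5, 3, 3, 5, 2, 4, 4]

kappa_entrustment = cohen_kappa_score(entrustment_rater_a, entrustment_rater_b,
                                      weights="quadratic")
kappa_descriptor = cohen_kappa_score(descriptor_rater_a, descriptor_rater_b,
                                     weights="quadratic")
print(f"Entrustment scale kappa:      {kappa_entrustment:.2f}")
print(f"Performance descriptor kappa: {kappa_descriptor:.2f}")

A higher kappa for the entrustment-scale columns would correspond to the pattern reported in the Results; with more than two raters per trainee, an intraclass correlation coefficient would be a natural alternative.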
Affiliation(s)
- Ayat Nabil Eltayar: Medical Education Department, Faculty of Medicine, Alexandria University, Egypt.
- Soha Rashed Aref: Community Medicine and Public Health Department, Faculty of Medicine, Alexandria University, Egypt.
- Hoda Mahmoud Khalifa: Histology and Cell Biology Department, Faculty of Medicine, Alexandria University, Egypt.
- Abdullah Said Hammad: Orthopaedic and Traumatology Department, Faculty of Medicine, Alexandria University, Egypt.
21
Phinney LB, Fluet A, O'Brien BC, Seligman L, Hauer KE. Beyond Checking Boxes: Exploring Tensions With Use of a Workplace-Based Assessment Tool for Formative Assessment in Clerkships. Academic Medicine 2022; 97:1511-1520. [PMID: 35703235 DOI: 10.1097/acm.0000000000004774] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Indexed: 06/15/2023]
Abstract
PURPOSE To understand the role of a workplace-based assessment (WBA) tool in facilitating feedback for medical students, this study explored changes and tensions in a clerkship feedback activity system through the lens of cultural historical activity theory (CHAT) over 2 years of tool implementation. METHOD This qualitative study uses CHAT to explore WBA use in core clerkships by identifying feedback activity system elements (e.g., community, tools, rules, objects) and tensions among these elements. University of California, San Francisco core clerkship students were invited to participate in semistructured interviews eliciting experience with a WBA tool intended to enhance direct observation and feedback in year 1 (2019) and year 2 (2020) of implementation. In year 1, the WBA tool required supervisor completion in the school's evaluation system on a computer. In year 2, both students and supervisors had WBA completion abilities and could access the form via a smartphone separate from the school's evaluation system. RESULTS Thirty-five students participated in interviews. The authors identified tensions that shifted with time and tool iterations. Year 1 students described tensions related to cumbersome tool design, fear of burdening supervisors, confusion over WBA purpose, WBA as checking boxes, and WBA usefulness depending on clerkship context and culture. Students perceived dissatisfaction with the year 1 tool version among peers and supervisors. The year 2 mobile-based tool and student completion capabilities helped to reduce many of the tensions noted in year 1. Students expressed wider WBA acceptance among peers and supervisors in year 2 and reported understanding WBA to be for low-stakes feedback, thereby supporting formative assessment for learning. CONCLUSIONS Using CHAT to explore changes in a feedback activity system with WBA tool iterations revealed elements important to WBA implementation, including designing technology for tool efficiency and affording students autonomy to document feedback with WBAs.
Affiliation(s)
- Lauren B Phinney: first-year internal medicine resident, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California.
- Angelina Fluet: fourth-year medical student, University of California, San Francisco School of Medicine, San Francisco, California.
- Bridget C O'Brien: professor of medicine and education scientist, Department of Medicine and Center for Faculty Educators, University of California, San Francisco School of Medicine, San Francisco, California.
- Lee Seligman: second-year internal medicine resident, Department of Medicine, New York-Presbyterian Hospital, Columbia University Irving Medical Center, New York, New York.
- Karen E Hauer: associate dean for competency assessment and professional standards and professor, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California.
22
Torre D, Schuwirth L, Van der Vleuten C, Heeneman S. An international study on the implementation of programmatic assessment: Understanding challenges and exploring solutions. Medical Teacher 2022; 44:928-937. [PMID: 35701165 DOI: 10.1080/0142159x.2022.2083487] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Indexed: 06/15/2023]
Abstract
INTRODUCTION Programmatic assessment is an approach to assessment aimed at optimizing both the learning and the decision-making functions of assessment. It involves a set of key principles and ground rules that are important for its design and implementation. However, despite its intuitive appeal, its implementation remains a challenge. The purpose of this paper is to gain a better understanding of the factors that affect the implementation process of programmatic assessment and of how specific implementation challenges are managed across different programs. METHODS An explanatory multiple-case (collective) approach was used for this study. We identified six medical programs that had implemented programmatic assessment, with variation in health professions discipline, level of education, and geographic location. We conducted interviews with a key faculty member from each program and analyzed the data using inductive thematic analysis. RESULTS We identified two major factors in managing the challenges and complexity of the implementation process: knowledge brokers and a strategic opportunistic approach. Knowledge brokers were the people who drove and designed the implementation process, translating evidence into practice and allowing for real-time management of the complex processes of implementation. These knowledge brokers used a 'strategic opportunistic', or agile, approach to recognize new opportunities, secure leadership support, adapt to the context, and take advantage of the unexpected. Engaging in an overall curriculum reform process was a critical factor for successful implementation of programmatic assessment. DISCUSSION The study contributes to the understanding of the intricacies of the implementation process of programmatic assessment across different institutions. Managing opportunities, adaptive planning, and awareness of context were all critical aspects of thinking strategically and opportunistically about the implementation of programmatic assessment. Future research is needed to provide a more in-depth understanding of the values and beliefs that underpin the assessment culture of an organization and of how such values may affect implementation.
Affiliation(s)
- Dario Torre: Director of Assessment and Professor of Medicine, University of Central Florida College of Medicine, Orlando, FL, USA.
- Lambert Schuwirth: College of Medicine and Public Health, Flinders University, Adelaide, Australia.
- Cees Van der Vleuten: Department of Educational Development and Research, School of Health Professions Education, Maastricht University, Maastricht, The Netherlands.
- Sylvia Heeneman: Department of Pathology, School of Health Professions Education, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht, The Netherlands.
23
Sukhera J, Fung CC, Kulasegaram K. Disruption and Dissonance: Exploring Constructive Tensions Within Research in Medical Education. Academic Medicine 2021; 96:S1-S5. [PMID: 34348377 DOI: 10.1097/acm.0000000000004326] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Indexed: 06/13/2023]
Abstract
The academic medicine community has experienced an unprecedented level of disruption in recent years. In this context, the authors consider how these disruptions have affected the state of research in medical education (RIME). The articles in this year's RIME supplement reflect several constructive tensions that provide insight into the future of the field. In this commentary, the authors discuss these themes and propose a framework for the future. Recommendations include normalizing help-seeking during times of disruption and uncertainty, contextualizing the application of complex approaches to assessment, advancing and problematizing innovation, and recognizing the deeply embedded and systemic nature of inequities.
Affiliation(s)
- Javeed Sukhera: associate professor, Departments of Psychiatry and Paediatrics, and scientist, Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada; ORCID: https://orcid.org/0000-0001-8146-4947
- Cha-Chi Fung: associate professor, Department of Medical Education, and assistant dean for research and scholarship, Keck School of Medicine of USC, University of Southern California, Los Angeles, California.
- Kulamakan Kulasegaram: associate professor, Department of Family & Community Medicine; scientist, Wilson Centre; and Temerty Chair in Learner Assessment and Program Evaluation, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada.