1. Zagaar M, Boedeker PJ, Love SJ. PharmaCORE: Optimizing medical pharmacology education with an innovative instructional dashboard. Medical Teacher. 2024:1-3. PMID: 38648549. DOI: 10.1080/0142159x.2024.2342540.
Abstract
WHAT WAS THE EDUCATIONAL CHALLENGE? Diminishing emphasis on pharmacology education in medical schools has resulted in a concerning lack of prescribing knowledge among physician graduates. These concerns mirror our graduates' expressed dissatisfaction with the structure and quality of pharmacology educational experiences over the past 5 years.
WHAT WAS THE SOLUTION? PharmaCORE, a web-based instructional dashboard, was developed as an interactive faculty development tool to enhance the integration and instruction of pharmacology content in the pre-clinical curriculum at a US medical school.
HOW WAS THE SOLUTION IMPLEMENTED? PharmaCORE was introduced in Spring 2022 for instructors teaching pharmacology in the pre-clinical curriculum. Instructors used the dashboard to assess coverage of specific drug topics throughout the curriculum and to apply tailored, learner-centered teaching strategies to optimize learner engagement and comprehension.
WHAT LESSONS WERE LEARNED THAT ARE RELEVANT TO A WIDER GLOBAL AUDIENCE? The initial assessment indicated that the dashboard was user-friendly and positively influenced instructor awareness of pharmacology content and learner-centered teaching. This faculty development approach underscores the importance of skill-based mapping and maintaining learner-centered teaching standards to address other integrated subjects and broader curricular challenges.
WHAT ARE THE NEXT STEPS? This study lays the foundation for the broader applicability of instructional dashboards in tracking and addressing curricular challenges across pharmacology and other science subjects. Future steps include more personalized feedback for instructors, creating a student-accessible version, and ongoing monitoring of maintenance measures such as milestone exams.
Affiliation(s)
- Munder Zagaar
- Department of Education Innovation and Technology, Baylor College of Medicine, Houston, TX, USA
- Peter J Boedeker
- Department of Education Innovation and Technology, Baylor College of Medicine, Houston, TX, USA
- Sherita J Love
- Department of Education Innovation and Technology, Baylor College of Medicine, Houston, TX, USA
2. Dalseg TR, Thoma B, Wycliffe-Jones K, Frank JR, Taber S. Enabling Implementation of Competency Based Medical Education through an Outcomes-Focused Accreditation System. Perspectives on Medical Education. 2024;13:75-84. PMID: 38343559. PMCID: PMC10854411. DOI: 10.5334/pme.963.
Abstract
Competency based medical education is being adopted around the world. Accreditation plays a vital role as an enabler in the adoption and implementation of competency based medical education, but little has been published about how the design of an accreditation system facilitates this transformation. The Canadian postgraduate medical education environment has recently transitioned to an outcomes-based accreditation system in parallel with the adoption of competency based medical education. Using the Canadian example, we characterize four features of an accreditation system that can facilitate the implementation of competency based medical education: theoretical underpinning, quality focus, accreditation standards, and accreditation processes. Alignment of the underlying educational theories within the accreditation system and educational paradigm drives change in a consistent and desired direction. An accreditation system that prioritizes quality improvement over quality assurance promotes educational system development and progressive change. Accreditation standards that achieve the difficult balance of being sufficiently detailed yet flexible foster a high fidelity of implementation without stifling innovation. Finally, accreditation processes that recognize the change process, encourage program development, and are not overly punitive all enable the implementation of competency based medical education. We also discuss the ways in which accreditation can simultaneously hinder the implementation of this approach. As education bodies adopt competency based medical education, particular attention should be paid to the role that accreditation plays in successful implementation.
Affiliation(s)
- Timothy R. Dalseg
- Department of Medicine, Division of Emergency Medicine, University of Toronto, Toronto, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Toronto General Hospital, 200 Elizabeth Street, R. Fraser Elliott Building, Ground Floor, Room 480, Toronto, ON M5G 2C4, Canada; (416) 833-0121
- Brent Thoma
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, SK, Canada
- Keith Wycliffe-Jones
- Department of Family Medicine, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada
- Jason R. Frank
- Centre for Innovation in Medical Education, University of Ottawa, Ottawa, ON, Canada
- Sarah Taber
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
3. Choo EK, Woods R, Walker ME, O’Brien JM, Chan TM. The Quality of Assessment for Learning score for evaluating written feedback in anesthesiology postgraduate medical education: a generalizability and decision study. Canadian Medical Education Journal. 2023;14:78-85. PMID: 38226296. PMCID: PMC10787859. DOI: 10.36834/cmej.75876.
Abstract
Background: Competency based residency programs depend on high quality feedback from the assessment of entrustable professional activities (EPAs). The Quality of Assessment for Learning (QuAL) score is a tool developed to rate the quality of narrative comments in workplace-based assessments; it has validity evidence for scoring the quality of narrative feedback provided to emergency medicine residents, but it is unknown whether the QuAL score is reliable in the assessment of narrative feedback in other postgraduate programs.
Methods: Fifty sets of EPA narratives from a single academic year at our competency based medical education postgraduate anesthesia program were selected by stratified sampling within defined parameters [e.g., resident gender and stage of training, assessor gender, Competence by Design training level, and word count (≥17 or <17 words)]. Two competency committee members and two medical students rated the quality of narrative feedback using a utility score and the QuAL score. We used Kendall's tau-b coefficient to compare the perceived utility of the written feedback to the quality assessed with the QuAL score. The authors used generalizability and decision studies to estimate the reliability and generalizability coefficients.
Results: Both the faculty's utility scores and QuAL scores (r = 0.646, p < 0.001) and the trainees' utility scores and QuAL scores (r = 0.667, p < 0.001) were moderately correlated. Results from the generalizability studies showed that utility scores were reliable with two raters for both faculty (Epsilon = 0.87, Phi = 0.86) and trainees (Epsilon = 0.88, Phi = 0.88).
Conclusions: The QuAL score is correlated with faculty- and trainee-rated utility of anesthesia EPA feedback. Both faculty and trainees can reliably apply the QuAL score to anesthesia EPA narrative feedback. This tool has the potential to be used for faculty development and program evaluation in competency based medical education. Other programs could consider replicating our study in their specialty.
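For readers who want to run a similar analysis on their own EPA narrative ratings, the sketch below illustrates the two procedures named in the Methods: Kendall's tau-b between utility and QuAL scores, and a simple persons-crossed-with-raters generalizability study yielding relative (Epsilon) and absolute (Phi) coefficients. The data are simulated and the variable names are assumptions for illustration; this is not the authors' analysis code.

```python
# Minimal sketch (simulated data, hypothetical variable names) of the analyses
# described above: Kendall's tau-b and a single-facet (persons x raters) G study.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
n_p, n_r = 50, 2                                    # 50 narrative sets, 2 raters

# Simulate a latent "quality" per narrative set, then noisy ratings of it.
truth = rng.uniform(0, 5, size=n_p)
qual = np.clip(truth[:, None] + rng.normal(0, 0.7, size=(n_p, n_r)), 0, 5)  # QuAL (0-5)
utility = np.clip(truth + rng.normal(0, 1.0, size=n_p), 1, 5)               # utility (1-5)

# (1) Kendall's tau-b between utility scores and mean QuAL scores (ties handled).
tau, p = kendalltau(utility, qual.mean(axis=1))
print(f"tau-b = {tau:.3f}, p = {p:.3g}")

# (2) G study for a fully crossed persons (p) x raters (r) design.
grand = qual.mean()
ms_p = n_r * ((qual.mean(axis=1) - grand) ** 2).sum() / (n_p - 1)   # persons mean square
ms_r = n_p * ((qual.mean(axis=0) - grand) ** 2).sum() / (n_r - 1)   # raters mean square
resid = qual - qual.mean(axis=1, keepdims=True) - qual.mean(axis=0) + grand
ms_pr = (resid ** 2).sum() / ((n_p - 1) * (n_r - 1))                # residual mean square

var_p = max((ms_p - ms_pr) / n_r, 0.0)   # person (true-score) variance component
var_r = max((ms_r - ms_pr) / n_p, 0.0)   # rater variance component
var_pr = ms_pr                           # interaction/error variance component

# Reliability for a design with n_r raters.
epsilon = var_p / (var_p + var_pr / n_r)              # relative (Epsilon)
phi = var_p / (var_p + (var_r + var_pr) / n_r)        # absolute (Phi)
print(f"Epsilon = {epsilon:.2f}, Phi = {phi:.2f}")
```

A decision study then follows directly from the same variance components: re-evaluate Epsilon and Phi with the planned number of raters substituted for n_r.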
Affiliation(s)
- Eugene K Choo
- Department of Anesthesiology, College of Medicine, University of Saskatchewan, Saskatchewan, Canada
- Rob Woods
- Department of Emergency Medicine, College of Medicine, University of Saskatchewan, Saskatchewan, Canada
- Mary Ellen Walker
- Department of Anesthesiology, College of Medicine, University of Saskatchewan, Saskatchewan, Canada
- Jennifer M O’Brien
- Department of Anesthesiology, College of Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan
- Department of Medicine (Division of Emergency Medicine; Division of Education & Innovation), Michael G. DeGroote School of Medicine, Faculty of Health Sciences, McMaster University; Office of Continuing Professional Development and McMaster Education Research, Innovation, and Theory (MERIT) Program, Faculty of Health Sciences, McMaster University, Ontario, Canada
4. Schultz KW, Kolomitro K, Koppula S, Bethune CH. Competency-based faculty development: applying transformations from lessons learned in competency-based medical education. Canadian Medical Education Journal. 2023;14:95-102. PMID: 38045069. PMCID: PMC10689999. DOI: 10.36834/cmej.75768.
Abstract
Faculty development in medical education is often delivered in an ad hoc manner instead of being a deliberately sequenced program matched to data-informed individual needs. In this article, the authors, all with extensive experience in faculty development (FD), present a competency-based faculty development (CBFD) framework envisioned to enhance the impact of FD. Steps and principles in the CBFD framework reflect the lessons learned from competency-based medical education (CBME), with its foundational goal to better train physicians to meet societal needs. The authors see CBFD as a similar framework, this one to better train faculty to meet educational needs. CBFD core elements include articulated competencies for the varied educational roles faculty fulfill, deliberately designed curricula structured to build those competencies, and an assessment program and process to support individualized faculty learning and professional growth. The framework incorporates ideas about where and how CBFD should be delivered, the use of coaching to promote reflection and identity formation, and the creation of communities of learning. As with CBME, the CBFD framework includes the important considerations of change management, including broad stakeholder engagement, continuous quality improvement, and scholarship. The authors have provided examples from the literature as well as challenges and considerations for each step.
Affiliation(s)
- Sudha Koppula
- Department of Family Medicine, University of Alberta, Edmonton, Alberta, Canada
- Cheri H Bethune
- Northern Ontario School of Medicine, Ontario, Canada
- College of Family Physicians of Canada, Ontario, Canada
5. Sebok-Syer SS, Dukelow AM, Sedran R, Shepherd L, McConnell A, Shaw JM, Lingard L. Developing and authenticating an electronic health record-based report card for assessing residents' clinical performance. AEM Education and Training. 2023;7:e10851. PMID: 37008653. PMCID: PMC10061574. DOI: 10.1002/aet2.10851.
Abstract
Purpose: The electronic health record (EHR) is frequently identified as a source of assessment data regarding residents' clinical performance. To better understand how to harness EHR data for education purposes, the authors developed and authenticated a prototype resident report card. This report card used EHR data exclusively and was authenticated with various stakeholders to understand individuals' reactions to and interpretations of EHR data when presented in this way.
Methods: Using principles derived from participatory action research and participatory evaluation, this study brought together residents, faculty, a program director, and medical education researchers (n = 19) to develop and authenticate a prototype report card for residents. From February to September 2019, participants were invited to take part in a semistructured interview that explored their reactions to the prototype and provided insights about how they interpreted the EHR data.
Results: Our results highlighted three themes: data representation, data value, and data literacy. Participants varied in their views on the best way to present the various EHR metrics and felt pertinent contextual information should be included. All participants agreed that the EHR data presented were valuable, but most had concerns about using them for assessment. Finally, participants had difficulties interpreting the data, suggesting that these data could be presented more intuitively and that residents and faculty may require additional training to fully appreciate these EHR data.
Conclusions: This work demonstrated how EHR data could be used to assess residents' clinical performance, but it also identified areas that warrant further consideration, especially pertaining to data representation and subsequent interpretation. Providing residents and faculty with EHR data in a resident report card was viewed as most valuable when used to guide feedback and coaching conversations.
Affiliation(s)
- Stefanie S. Sebok-Syer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, California, USA
- Adam M. Dukelow
- Division of Emergency Medicine at Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Robert Sedran
- Division of Emergency Medicine at Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lisa Shepherd
- Division of Emergency Medicine at Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Allison McConnell
- Division of Emergency Medicine at Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Jennifer M. Shaw
- Centre for Education, Research, and Innovation at Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lorelei Lingard
- Department of Medicine and Faculty of Education, and Senior Scientist at the Centre for Education, Research, and Innovation at Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
6. Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, Mondal D, Thoma B. Developing a dashboard for program evaluation in competency-based training programs: a design-based research project. Canadian Medical Education Journal. 2022;13:14-27. PMID: 36310899. PMCID: PMC9588183. DOI: 10.36834/cmej.73554.
Abstract
BACKGROUND: Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires the assessment of entrustable professional activities (EPAs). Dashboards could be used to track the completion of EPAs to support program evaluation.
METHODS: Using a design-based research process, we identified program evaluation needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. We interviewed leaders from the emergency medicine program and postgraduate medical education office at the University of Saskatchewan. Two investigators thematically analyzed interview transcripts to identify program evaluation needs, which were audited by two additional investigators. Identified needs were described using quotes, analytics, and visualizations.
RESULTS: Between July 1, 2019 and April 6, 2021, we conducted 17 interviews with six participants (two program leaders and four institutional leaders). Four needs emerged as themes: tracking changes in overall assessment metrics, comparing metrics to the assessment plan, evaluating rotation performance, and engagement with the assessment metrics. We addressed these needs by presenting analytics and visualizations within a dashboard.
CONCLUSIONS: We identified program evaluation needs related to EPA assessments and designed dashboard elements to meet them. This work will inform the development of other CBME assessment dashboards designed to support program evaluation.
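To make the first two needs concrete (tracking overall assessment metrics and comparing them to the assessment plan), a dashboard element might aggregate EPA observations per resident against the program's required counts, roughly as in the sketch below. The data model and column names are assumptions for illustration only, not the study's implementation.

```python
# Minimal sketch (hypothetical data model) of one dashboard element: comparing
# observed EPA assessments per resident against the program's assessment plan.
import pandas as pd

# Hypothetical EPA assessment records and required observation counts per EPA.
assessments = pd.DataFrame({
    "resident": ["A", "A", "A", "B", "B"],
    "epa": ["F1", "F2", "F1", "F1", "F2"],
    "entrustment": [3, 4, 3, 2, 4],      # e.g., an O-SCORE-style entrustment rating
})
plan = pd.Series({"F1": 3, "F2": 2}, name="required")

# Observed counts per resident and EPA, compared against the plan.
observed = (
    assessments.groupby(["resident", "epa"]).size().rename("observed").reset_index()
)
observed["required"] = observed["epa"].map(plan)
observed["pct_of_plan"] = (observed["observed"] / observed["required"] * 100).round(1)
print(observed)
```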
Affiliation(s)
- Yusuf Yilmaz
- Continuing Professional Development Office and McMaster Program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Department of Medical Education, Ege University, Turkey
- Robert Carey
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Teresa M Chan
- Continuing Professional Development Office and McMaster Program for Education Research, Innovation, and Theory (MERIT), McMaster University, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine, McMaster University
- Venkat Bandi
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Shisong Wang
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Robert A Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatchewan, Canada
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan, Saskatchewan, Canada
- Royal College of Physicians and Surgeons of Canada, Ontario, Canada
7. Chan TM, Dowling S, Tastad K, Chin A, Thoma B. Integrating training, practice, and reflection within a new model for Canadian medical licensure: a concept paper prepared for the Medical Council of Canada. Canadian Medical Education Journal. 2022;13:68-81. PMID: 36091730. PMCID: PMC9441128. DOI: 10.36834/cmej.73717.
Abstract
In 2020, the Medical Council of Canada created a task force to make recommendations on the modernization of its practices for granting licensure to medical trainees. This task force solicited papers on this topic from subject matter experts. As outlined within this concept paper, our proposal would shift licensure away from the traditional focus on high-stakes summative exams in a way that integrates training, clinical practice, and reflection. Specifically, we propose a model of graduated licensure with three stages: a trainee license for trainees who have demonstrated adequate medical knowledge to begin training as a closely supervised resident; a transition-to-practice license for trainees who have compiled a reflective educational portfolio demonstrating the clinical competence required to begin independent practice with limitations and support; and a fully independent license for unsupervised practice for attendings who have demonstrated competence through a reflective portfolio of clinical analytics. This proposal was reviewed by a diverse group of 30 trainees, practitioners, and administrators in medical education. Their feedback was analyzed and summarized to provide an overview of the likely reception that this proposal would receive from the medical education community.
Affiliation(s)
- Teresa M Chan
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University
- Division of Education & Innovation, Department of Medicine, Faculty of Health Sciences, McMaster University
- McMaster Education Research, Innovation, and Theory (MERIT) Program, Faculty of Health Sciences, McMaster University
- Office of Continuing Professional Development, Faculty of Health Sciences, McMaster University
- Shawn Dowling
- Department of Emergency Medicine, Cumming School of Medicine, University of Calgary
- Kara Tastad
- Royal College Emergency Medicine Training Program, University of Toronto
- Alvin Chin
- Division of Emergency Medicine, Department of Medicine, Faculty of Health Sciences, McMaster University
- Brent Thoma
- Department of Emergency Medicine, University of Saskatchewan
8. Chan TM, Sebok-Syer SS, Yilmaz Y, Monteiro S. The Impact of Electronic Data to Capture Qualitative Comments in a Competency-Based Assessment System. Cureus. 2022;14:e23480. PMID: 35494923. PMCID: PMC9038604. DOI: 10.7759/cureus.23480.
Abstract
Introduction: Digitalizing workplace-based assessments (WBAs) holds the potential for facilitating feedback and performance review, wherein we can easily record, store, and analyze data in real time. When digitizing assessment systems, however, it is unclear what is gained and lost in the message as a result of the change in medium. This study evaluates the quality of comments generated in paper vs. electronic media and the influence of an assessor's seniority.
Methods: Using a realist evaluation framework, a retrospective database review was conducted with paper-based and electronic-medium comments. A sample of assessments was examined to determine any influence of the medium on the word count and the Quality of Assessment for Learning (QuAL) score. A correlation analysis evaluated the relationship between word count and QuAL score. Separate univariate analyses of variance (ANOVAs) were used to examine the influence of the assessor's seniority and medium on word count, QuAL score, and WBA scores.
Results: The analysis included a total of 1,825 records. The average word count for the electronic comments (M = 16) was significantly higher than for the paper version (M = 12; p = 0.01). Longer comments were positively correlated with QuAL score (r = 0.2). Paper-based comments received lower QuAL scores (0.41) compared to electronic comments (0.51; p < 0.01). Years in practice was negatively correlated with both QuAL score (r = -0.08; p < 0.001) and word count (r = -0.2; p < 0.001).
Conclusion: Digitization of WBAs increased the length of comments and did not appear to jeopardize the quality of WBAs; these results indicate higher-quality assessment data. True digital transformation may be possible by harnessing trainee data repositories and repurposing them to analyze for faculty-relevant metrics.
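The comparisons reported above (word count by medium, QuAL score by medium, and the word count/QuAL correlation) can be reproduced on any WBA export with a few lines of analysis code. The sketch below uses toy records and hypothetical column names; it illustrates the analytic steps rather than the authors' pipeline.

```python
# Minimal sketch (toy records, hypothetical column names) of the analyses above:
# word count vs. QuAL correlation and a one-way ANOVA of word count by medium.
import pandas as pd
from scipy.stats import pearsonr, f_oneway

wba = pd.DataFrame({
    "medium": ["paper", "paper", "electronic", "electronic", "electronic", "paper"],
    "comment": [
        "good job", "fine", "managed the airway independently and debriefed the team",
        "strong communication with family; consider earlier analgesia",
        "clear plan, needs prompting on disposition", "solid shift",
    ],
    "qual_score": [1, 0, 4, 3, 2, 1],     # QuAL score (0-5)
})
wba["word_count"] = wba["comment"].str.split().str.len()

# Correlation between comment length and QuAL score.
r, p = pearsonr(wba["word_count"], wba["qual_score"])
print(f"word count vs. QuAL: r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA: does word count differ between paper and electronic comments?
groups = [g["word_count"].to_numpy() for _, g in wba.groupby("medium")]
F, p_anova = f_oneway(*groups)
print(f"word count by medium: F = {F:.2f}, p = {p_anova:.3f}")
```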
9. Yilmaz Y, Papanagnou D, Fornari A, Chan TM. The Learning Loop: Conceptualizing Just-in-Time Faculty Development. AEM Education and Training. 2022;6:e10722. PMID: 35224408. PMCID: PMC8848258. DOI: 10.1002/aet2.10722.
Abstract
BACKGROUND: As technology advances, the gap between learning and doing continues to close, especially for frontline academic faculty and clinician educators. For busy clinician faculty members, it can be difficult to find time to engage in skills and professional development. Competing interests between clinical care and various forms of academic work (e.g., research, administration, education) all create challenges for traditional group-based and/or didactic faculty development.
METHODS: The authors engaged in a synthetic narrative review of literature from several unrelated fields: learning technologies, medical education/health professions education, and general/higher education. The aim of this review was to synthesize this pre-existing literature to propose a new conceptual model.
RESULTS: The authors propose a new conceptual model, the Just-In-Time Learning Loop, to guide the development of online faculty development for just-in-time delivery.
CONCLUSIONS: The Just-In-Time Learning Loop is a new conceptual framework that may be of use to those engaging in online, digital learning design. Faculty developers, especially in emergency medicine, can integrate leading concepts from the technology-enhanced learning field (e.g., microlearning, micro-credentialing, badging) to create new types of learning experiences for their end users.
Affiliation(s)
- Yusuf Yilmaz
- McMaster Education Research, Innovation, and Theory (MERIT) Program, Hamilton, Ontario, Canada
- McMaster University Office of Continuing Professional Development, Hamilton, Ontario, Canada
- Department of Medical Education, Faculty of Medicine, Ege University, Izmir, Turkey
- Program for Faculty Development, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Dimitrios Papanagnou
- Department of Emergency Medicine, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, Pennsylvania, USA
- Alice Fornari
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, USA
- Teresa M. Chan
- McMaster Education Research, Innovation, and Theory (MERIT) Program, Hamilton, Ontario, Canada
- McMaster University Office of Continuing Professional Development, Hamilton, Ontario, Canada
- Program for Faculty Development, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Division of Education & Innovation, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
10. Slavin S, D’Eon MF. Overcrowded curriculum is an impediment to change (Part A). Canadian Medical Education Journal. 2021;12:1-6. PMID: 34567300. PMCID: PMC8463236. DOI: 10.36834/cmej.73532.
Affiliation(s)
- Stuart Slavin
- Accreditation Council for Graduate Medical Education, Michigan, USA
- Marcel F D’Eon
- Educational Innovation Institute of the Medical College of Georgia, Augusta University, Georgia, USA