1
Fielding A, Mundy B, Tapley A, Gani S, Ali R, Bentley M, Boland R, Zbaidi L, Holliday E, Ball J, van Driel M, Klein L, Magin P. Educational utility of observational workplace-based assessment modalities in Australian vocational general practice training: a cross-sectional study. BMC Med Educ 2025; 25:762. [PMID: 40410727] [PMCID: PMC12102905] [DOI: 10.1186/s12909-025-07328-y] [Received: 11/11/2024] [Accepted: 05/12/2025] [Indexed: 05/25/2025]
Abstract
BACKGROUND Direct-observation workplace-based assessments (WBAs) are a fundamental component of competency-based postgraduate medical education. In Australian general practice vocational training, external clinical teaching visits (ECTVs) are key observation-based WBAs. Traditionally, ECTVs are conducted face-to-face, but the COVID-19 pandemic saw the development and implementation of remote ECTV modalities. It remains unknown whether the perceived educational utility of remote ECTVs differs from that of traditional face-to-face ECTVs. This study explored the educational utility of ECTVs, including face-to-face and remote formats. METHODS General practice trainees ('registrars') and external clinical teaching visitors ('ECT visitors', who are independent experienced GP observers) each completed a cross-sectional questionnaire following individual ECTVs undertaken in 2020. Outcomes included overall educational utility of the ECTV as perceived by registrars, registrar ratings of likelihood to change their clinical practice as a result of the ECTV, registrar ratings of likelihood to change their approach to learning/training as a result of the ECTV, and overall educational utility of the ECTV as perceived by the ECT visitor. Educational utility ratings (5-point scales) were analysed descriptively. Univariable and multivariable logistic regression were employed to examine factors associated with dichotomised educational utility ratings. RESULTS Response rates were 41% (n = 801) for registrars and 39% (n = 742) for ECT visitors. Most registrars (64.1%) rated ECTV overall educational utility as 'very useful'; 58.5% and 47.9% of registrars rated their likelihood to change practice and approach to learning/training, respectively, as 'very likely'. No statistically significant differences in perceived educational utility ratings were identified between face-to-face and remote video/phone ECTVs (multivariable p-value range: .07-.96).
Receiving feedback that was focused/specific/easy to translate into action was consistently associated with registrars rating overall educational utility as 'very useful' (odds ratio (OR): 12.8, 95% confidence interval (CI): 8.26 to 19.9), rating likelihood to change practice as 'very likely' (OR: 2.5, 95% CI: 1.59 to 3.94), and rating likelihood to change learning/training approach as 'very likely' (OR: 3.19, 95% CI: 1.97 to 5.17). CONCLUSIONS ECTVs are perceived by registrars and ECT visitors to be educationally useful across different delivery modalities and formats. The quality and features of the feedback provided appear most important in ECTVs as an assessment for learning.
Affiliation(s)
- Alison Fielding
  - School of Medicine & Public Health, University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
  - General Practice Training Research, Royal Australian College of General Practitioners, Level 1, 20 McIntosh Drive, Mayfield West, NSW, 2304, Australia
- Benjamin Mundy
  - General Practice Training Research, Royal Australian College of General Practitioners, Level 1, 20 McIntosh Drive, Mayfield West, NSW, 2304, Australia
- Amanda Tapley
  - School of Medicine & Public Health, University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
  - General Practice Training Research, Royal Australian College of General Practitioners, Level 1, 20 McIntosh Drive, Mayfield West, NSW, 2304, Australia
- Sarah Gani
  - General Practice Training Medical Education, Royal Australian College of General Practitioners, Suite 2, 16 Napier Close, Deakin, ACT, 2600, Australia
- Rula Ali
  - Discipline of General Practice, Faculty of Medicine and Health, University of New South Wales, Sydney, NSW, 2052, Australia
- Michael Bentley
  - General Practice Training Research, Royal Australian College of General Practitioners, 62 Patrick Street, Hobart, TAS, 7000, Australia
- Rachael Boland
  - General Practice Training Medical Education, GP Training, Royal Australian College of General Practitioners, 62 Patrick Street, Hobart, TAS, 7000, Australia
- Lina Zbaidi
  - Northern Territory General Practice Education, Level 3, Building 1, Yellow Precinct, Charles Darwin University, Ellengowan Drive, Casuarina, NT, 0810, Australia
- Elizabeth Holliday
  - School of Medicine & Public Health, University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
- Jean Ball
  - Clinical Research Design IT and Statistical Support Unit (CReDITSS), Hunter Medical Research Institute (HMRI), 1 Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia
- Mieke van Driel
  - General Practice Clinical Unit, Faculty of Medicine, Mayne Medical Building, The University of Queensland, Level 2, 288 Herston Road, Herston, QLD, 4006, Australia
- Linda Klein
  - School of Medicine & Public Health, University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
  - General Practice Training Research, Royal Australian College of General Practitioners, Level 1, 20 McIntosh Drive, Mayfield West, NSW, 2304, Australia
- Parker Magin
  - School of Medicine & Public Health, University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
  - General Practice Training Research, Royal Australian College of General Practitioners, Level 1, 20 McIntosh Drive, Mayfield West, NSW, 2304, Australia
2
Can M, Cengiz MI. The Impact of Instructional Approaches on Scaling and Root Planning Skill Learning in Preclinical Periodontal Education. J Dent Educ 2025:e13934. [PMID: 40346747] [DOI: 10.1002/jdd.13934] [Received: 02/04/2025] [Revised: 04/24/2025] [Accepted: 05/01/2025] [Indexed: 05/12/2025]
Abstract
OBJECTIVE This study aimed to evaluate the effectiveness of video-based learning, live demonstrations, and brochures in teaching scaling and root planing (SRP) skills to third-year dental students in a preclinical periodontal education setting. METHOD Ninety third-year dental students who had not previously received theoretical or practical training in SRP were included in this randomized study. The students were equally assigned to three instructional groups: brochure, video, and live demonstration. To evaluate the initial effectiveness of these methods, all participants first performed SRP on standardized typodont models coated with simulated calculus, created using a consistent application of black nail polish. Each student performed 10 strokes per surface using identical instruments to ensure standardization. Pre- and post-procedure images of the models were taken for objective assessment. The extent of calculus removal was quantified through pixel-area measurements using image analysis software. Following the practical sessions, a validated 10-item Likert-scale questionnaire was administered to assess students' perceptions of the instructional methods, focusing on clarity, engagement, satisfaction, and preferences for future learning. RESULTS Significant improvements in the quality of scaling and root planing were observed in all three groups after education. Video and live demonstration methods were significantly more effective than the brochure method. However, there were no significant differences between the video and live demonstration groups. Student feedback showed that both the video and live demonstration methods received significantly higher scores than the brochure method; however, there was no significant difference between the video and live demonstration methods. CONCLUSION Both video-based learning and live demonstrations are equally effective and superior to brochures for teaching critical periodontal skills.
These results suggest that interactive teaching methods should be prioritized in dental education to enhance student outcomes and better prepare them for clinical practice.
Affiliation(s)
- Merve Can
  - Department of Periodontology, Faculty of Dentistry, Istanbul Medipol University, Istanbul, Turkey
- Murat Inanc Cengiz
  - Department of Periodontology, Faculty of Dentistry, Zonguldak Bulent Ecevit University, Zonguldak, Turkey
3
Ekman N, Fors A, Moons P, Taft C. Gothenburg direct observation tool for assessing person-centred care (GDOT-PCC): evaluation of inter-rater reliability. BMJ Open 2025; 15:e096576. [PMID: 40246569] [PMCID: PMC12007047] [DOI: 10.1136/bmjopen-2024-096576] [Received: 11/14/2024] [Accepted: 04/02/2025] [Indexed: 04/19/2025]
Abstract
OBJECTIVE To assess the inter-rater reliability of the Gothenburg direct observation tool for person-centred care (GDOT-PCC) in assessing healthcare professionals' competency in delivering person-centred care (PCC). DESIGN Observational, fully-crossed inter-rater reliability study. SETTING The study was conducted between October and December 2022 at the participants' homes or offices. PARTICIPANTS AND METHODS Six health professionals individually rated 10 video-recorded, simulated consultations against the 53-item, 15-domain tool covering four major areas: PCC activities, clinician manner, clinician skills and PCC goals. Cronbach's α was used to assess internal consistency. Intraclass correlations (ICC) and 95% CIs were computed for the domains. RESULTS Two domains ('planning and documentation' and 'documentation') were excluded from analyses due to insufficient evaluable data. Cronbach's α was acceptable (>0.70) for all evaluated domains. ICC values were high (ICC ≥0.75) for 11 of the 13 domains; however, CIs were generally wide, and the lower bounds fell within the good range (ICC=0.60-0.74) for six domains and the fair range (ICC=0.40-0.59) for the remaining six. The ICC for the 'patient perspective' domain was non-informative due to its wide CI (ICC=0.74 (0.39-0.92)). CONCLUSION ICC estimates for most domains were comparable to or exceeded those reported for similar direct observation tools for assessing PCC, suggesting that they may reliably be used in, for example, education and quality improvement applications. Reliability for the 'planning and documentation' and 'documentation' domains needs to be assessed in studies sampling more documentation behaviours. The imprecise estimate for the 'patient perspective' domain may owe to methodological issues and should be reassessed in larger, better-designed studies.
Affiliation(s)
- Nina Ekman
  - Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - University of Gothenburg Centre for Person-Centred Care (GPCC), Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - Department of Public Health and Primary Care, Faculty of Medicine, KU Leuven, Leuven, Belgium
- Andreas Fors
  - Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - University of Gothenburg Centre for Person-Centred Care (GPCC), Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - Region Västra Götaland, Research, Education, Development and Innovation, Primary Health Care, Gothenburg, Sweden
- Philip Moons
  - Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - University of Gothenburg Centre for Person-Centred Care (GPCC), Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - Department of Public Health and Primary Care, Faculty of Medicine, KU Leuven, Leuven, Belgium
  - Department of Pediatrics and Child Health, University of Cape Town, Cape Town, South Africa
- Charles Taft
  - Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
  - University of Gothenburg Centre for Person-Centred Care (GPCC), Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
4
Sarica K, Güzel R, Bayraktar Z, Yildirim S, Yasar H, Sarica G, Sahın C. Local clinical practice patterns in urolithiasis guidelines: a critical evaluation from Turkey. World J Urol 2025; 43:97. [PMID: 39899122] [PMCID: PMC11790800] [DOI: 10.1007/s00345-025-05490-y] [Received: 12/01/2024] [Accepted: 01/27/2025] [Indexed: 02/04/2025]
Abstract
PURPOSE This study aimed to evaluate the current clinical practice patterns regarding the utilization of "Urolithiasis Guidelines" in Turkey and to identify critical factors influencing their application by urologists. METHODS The study targeted practicing urologists in Turkey, primarily those involved in the management of urolithiasis, to assess their perspectives and experiences regarding the clinical application of established guidelines. A total of 415 urology specialists were invited to participate in a survey-based study conducted via Google Forms. Participation was voluntary, and 65.08% of the invited urologists completed the survey. RESULTS Among the respondents, 84.7% reported utilizing the available guidelines in their routine clinical practice, with varying frequencies of reference. The primary motivations for guideline use were the prevention of potential complications and the avoidance of legal risks, as indicated by 90.5% of respondents. While 56.9% of participants adhered to the guidelines as a clinically standardized practice, 41.6% reported applying the recommendations on a case-by-case basis. Notably, 41.0% of respondents emphasized the need for locally adapted versions of guideline texts. Additionally, nearly half of the participants reported receiving no formal education or training on the significance, content, and practical application of these guidelines. Furthermore, 12.7% expressed skepticism about the evidence-based foundation of the guidelines, questioning whether the recommendations were derived from rigorously conducted studies. CONCLUSION The available urolithiasis guidelines are recognized as valuable resources offering key recommendations for the effective and safe management of urolithiasis. However, findings from this survey highlight significant variability in clinical practice patterns due to local conditions and the individual experience and attitudes of practicing urologists. 
The application of guideline recommendations is further influenced by perceptions regarding their development, content, and practicality. Insights gathered from this study may contribute to improving the preparation, dissemination, and implementation of urolithiasis guidelines, particularly in adapting them to local clinical settings.
Affiliation(s)
- Kemal Sarica
  - Department of Urology, Sancaktepe Research and Training Hospital, Health Sciences University, Istanbul, Turkey
  - Department of Urology, Biruni University Medical School, Istanbul, Turkey
- Rasim Güzel
  - Urology Clinic, Kavacık Medistate Hospital, Istanbul, Turkey
- Zeki Bayraktar
  - Department of Urology, Sancaktepe Research and Training Hospital, Health Sciences University, Istanbul, Turkey
- Salih Yildirim
  - Department of Urology, Sancaktepe Research and Training Hospital, Health Sciences University, Istanbul, Turkey
- Hikmet Yasar
  - Department of Urology, Sancaktepe Research and Training Hospital, Health Sciences University, Istanbul, Turkey
- Göksu Sarica
  - Medical intern, University Medical School, Istanbul, Turkey
- Cahit Sahın
  - Department of Urology, Sancaktepe Research and Training Hospital, Health Sciences University, Istanbul, Turkey
5
Yuan Y, Wang C, Wen S, Li Y, Xu C, Yu F, Li X, He Y, Chen L, Ren Y, Zhou L. Pilot Study of a Modified DOPS Scale for Insulin Pump and CGM Installation Training in Chinese Medical Students During Endocrinology Rotations. Diabetes Metab Syndr Obes 2025; 18:37-50. [PMID: 39802617] [PMCID: PMC11720810] [DOI: 10.2147/dmso.s489435] [Received: 07/31/2024] [Accepted: 12/31/2024] [Indexed: 01/16/2025]
Abstract
Background Direct Observation of Procedural Skills (DOPS) is a clinical assessment tool that enables trainers to observe medical students' procedural abilities in real-time clinical settings. It assesses students' knowledge application, decision-making, and skill proficiency during clinical tasks. Methods This study modified the DOPS to evaluate the operation of insulin pumps (PUMP) and continuous glucose monitoring systems (CGMS) in diabetes management. Key elements of the modified DOPS include: 1) Knowledge Assessment: evaluating understanding of PUMP and CGMS, including interpreting CGMS data for insulin adjustments; 2) Operational Skills: assessing correct PUMP needle insertion, programming, and adjustments; 3) Patient Safety: ensuring safe and aseptic procedures; 4) Feedback: providing constructive feedback to help students improve their skills. Results Training through DOPS led to significant improvements in all domains and overall performance scores, as well as reduced execution time for each domain. Correlations between domains showed that PUMP indication scores were linked to all other domains and execution times, including re-evaluation. Communication skills and seeking assistance were crucial factors influencing other domains. Multiple linear regression analysis revealed that while DOPS-CGMS (R² = 1.0) fully explained performance scores, DOPS-PUMP (R² = 0.984) indicated that additional personal qualities significantly impacted students' PUMP operation performance. Conclusion This customized DOPS form offers insights into students' abilities in managing diabetes with PUMP and CGMS, while emphasizing the need for training on both technical and interpersonal skills in future educational models.
Affiliation(s)
- Yue Yuan
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Congcong Wang
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Song Wen
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Yanyan Li
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Chenglin Xu
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Fang Yu
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Xiucai Li
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Yanju He
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Lijiao Chen
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Yishu Ren
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
- Ligang Zhou
  - Department of Endocrinology, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
  - Shanghai Key Laboratory of Vascular Lesions Regulation and Remodeling, Shanghai Pudong Hospital, Fudan University, Shanghai, 201399, People’s Republic of China
6
Gingerich A, Lingard L, Sebok-Syer SS, Watling CJ, Ginsburg S. "Praise in Public; Criticize in Private": Unwritable Assessment Comments and the Performance Information That Resists Being Written. Acad Med 2024; 99:1240-1246. [PMID: 39137257] [DOI: 10.1097/acm.0000000000005839] [Indexed: 08/15/2024]
Abstract
PURPOSE Written assessment comments are needed to archive feedback and inform decisions. Regrettably, comments are often impoverished, leaving performance-relevant information undocumented. Research has focused on comment content and on supervisors' ability and motivation to write, but has not sufficiently examined how well the undocumented information lends itself to being written as comments. Because missing information threatens the validity of assessment processes, this study examined the performance information that resists being written. METHOD Two sequential data collection methods and multiple elicitation techniques were used to triangulate unwritten assessment comments. Between November 2022 and January 2023, physicians in Canada were recruited by email and social media to describe experiences with wanting to convey assessment information but feeling unable to express it in writing. Fifty supervisors shared examples via survey. From January to May 2023, a subset of 13 participants were then interviewed to further explain what information resisted being written and why it seemed impossible to express in writing, and to write comments in response to a video prompt or for their own "unwritable" example. Constructivist grounded theory guided data collection and analysis. RESULTS Not all performance-relevant information was equally writable. Information resisted being written as assessment comments when it would require an essay to be expressed in writing, belonged in a conversation and not in writing, or was potentially irrelevant and unverifiable. In particular, disclosing sensitive information discussed in a feedback conversation required extensive recoding to protect the learner and the supervisor-learner relationship. CONCLUSIONS When documenting performance information as written comments is viewed as an act of disclosure, it becomes clear why supervisors may feel compelled to leave some comments unwritten.
Although supervisors can be supported in writing better assessment comments, their failure to write invites a reexamination of expectations for documenting feedback and performance information as written comments on assessment forms.
7
Kassam A, de Vries I, Zabar S, Durning SJ, Holmboe E, Hodges B, Boscardin C, Kalet A. The Next Era of Assessment Within Medical Education: Exploring Intersections of Context and Implementation. Perspect Med Educ 2024; 13:496-506. [PMID: 39399409] [PMCID: PMC11469546] [DOI: 10.5334/pme.1128] [Received: 08/16/2023] [Accepted: 09/11/2024] [Indexed: 10/15/2024]
Abstract
In competency-based medical education (CBME), which is being embraced globally, the patient-learner-educator encounter occurs in a highly complex context that contributes to a wide range of assessment outcomes. Current and historical barriers to considering context in assessment include the prevailing post-positivist epistemological stance that values objectivity and validity evidence over the variability introduced by context. This is most evident in standardized testing. While always critical to medical education, the impact of context on assessment is becoming more pronounced as many aspects of training diversify. This diversity includes an expanding interest beyond individual trainee competence to include the interdependency and collective nature of clinical competence, and the growing awareness that medical education needs to be co-produced among a wider group of stakeholders. In this Eye Opener, we wish to consider: 1) How might we best account for the influence of context in the clinical competence assessment of individuals in medical education? and, by doing so, 2) How could we usher in the next era of assessment that improves our ability to meet the dynamic needs of society and all its stakeholders? The purpose of this Eye Opener is thus two-fold. First, we conceptualize, from a variety of viewpoints, how we might address context in the assessment of competence at the level of the individual learner. Second, we present recommendations that address how to approach the implementation of a more contextualized competence assessment.
Affiliation(s)
- Aliya Kassam
  - Department of Community Health Sciences and Director of Scholarship in the Office of Postgraduate Medical Education at the Cumming School of Medicine, University of Calgary, Alberta, Canada
- Ingrid de Vries
  - Faculty of Education at Queen’s University, Kingston, Canada
- Sondra Zabar
  - Division of General Internal Medicine and Clinical Innovation at the NYU Grossman School of Medicine, New York, New York, USA
- Steven J. Durning
  - Center for Health Professions Education at the Uniformed Services University of the Health Sciences in Bethesda, Maryland, USA
- Eric Holmboe
- Brian Hodges
  - Temerty Faculty of Medicine at University of Toronto, Canada
  - Royal College of Physicians and Surgeons of Canada, Canada
- Christy Boscardin
  - Department of Medicine and Department of Anesthesia and Perioperative Care, and the Faculty Director of Assessment in the School of Medicine at the University of California, San Francisco, California, USA
- Adina Kalet
  - Department of Medicine, Center for the Advancement of Population Health at the Medical College of Wisconsin, Wisconsin, USA
8
Brecha FS, Friedman S, Costich M. Integrating Learners Into the Pediatric Primary Care Workflow: Strategies for Optimizing Teaching and Learning. Pediatr Ann 2024; 53:e386-e391. [PMID: 39377818] [DOI: 10.3928/19382359-20240811-04] [Indexed: 10/09/2024]
Abstract
Primary care pediatric providers play an important role in the education of future medical professionals. However, it may feel challenging to integrate a learner into outpatient practice, given both time constraints and varying levels of experience among learners. Here we discuss how learners at various stages of training from different medical professions can be integrated into the outpatient pediatric clinical environment. We review eight teaching strategies and provide examples of their use in practice. The goal is to introduce tools to support teachers working with learners in the clinical environment and to optimize educational experiences for both teachers and learners. [Pediatr Ann. 2024;53(10):e386-e391.]
9
Gravel JW. Missing Tools, Missing Out. Fam Med 2024; 56:612-614. [PMID: 39401142] [PMCID: PMC11493126] [DOI: 10.22454/fammed.2024.757299] [Indexed: 10/25/2024]
Affiliation(s)
- Joseph W Gravel
  - Medical College of Wisconsin Department of Family and Community Medicine, Milwaukee, WI
10
Osgood M, Silver B, Reidy J, Nagpal V. Curriculum Innovations: Enhancing Skills in Serious Illness Communication in Neurology Residents Using Simulation: A Pilot Study. Neurol Educ 2024; 3:e200140. [PMID: 39359652] [PMCID: PMC11419305] [DOI: 10.1212/ne9.0000000000200140] [Received: 10/04/2023] [Accepted: 05/20/2024] [Indexed: 10/04/2024]
Abstract
Background and Problem Statement Patients with acute ischemic stroke are faced with prognostic uncertainty, progressive decline, and early mortality. Many neurologists report a lack of education and experience in providing palliative care. We developed a simulation-based curriculum to improve residents' confidence and comfort with conducting late-stage goals of care (GOC) conversations. Objectives To assess and improve neurology residents' self-reported confidence and comfort around GOC discussions, prognostication, and hospice; encourage neurology residents to conduct GOC conversations early in the illness; introduce neurology residents to a structured framework for conducting GOC conversations; facilitate the residents to build rapport and convey a mindful presence during GOC conversations; provide direct, real-time feedback and an opportunity for redo and practice; and identify gaps for education. Methods and Curriculum Description The 3-hour experience included a didactic session followed by an interactive simulation and debriefing. The residents' objectives were to deliver difficult news, discuss prognosis, explore goals, navigate treatment options, and discuss end-of-life care including hospice. The faculty observed each interaction and called time-outs to allow the residents to self-assess and obtain feedback. Residents and faculty debriefed to identify take-home points and to reflect on their emotions, self-care, and sense of purpose in medicine. Results and Assessment Twenty-six neurology residents filled out an anonymous presurvey to self-assess their confidence and comfort surrounding GOC conversations. More than 50% of residents reported being confident in conducting GOC discussions, whereas only 42% reported adequate prior training. Postsession, more than 90% of residents reported that training was relevant, helpful, organized, and clear. 
Faculty identified that residents had difficulty addressing prognosis, assessing goals, planning treatment, using silence, responding to emotion, and displaying empathy. Fifteen residents filled out a postsurvey that revealed improved comfort with delivering prognosis, discussing hospice, and initiating early GOC discussions. Discussion and Lessons Learned Our project uniquely focuses on late-stage GOC conversations and builds on existing literature that supports a structured program with both didactic and simulation components to improve residents' abilities to effectively navigate GOC conversations with patients and families. Future work will focus on reinforcement and reassessment of communication skills.
Affiliation(s)
- Marcey Osgood
  - From the Department of Neurology (M.O.), Lahey Health and Medical Center, Burlington, MA; and Department of Neurology (B.S.), and Department of Palliative Care (J.R., V.N.), University of Massachusetts Chan Medical School, Worcester
- Brian Silver
  - From the Department of Neurology (M.O.), Lahey Health and Medical Center, Burlington, MA; and Department of Neurology (B.S.), and Department of Palliative Care (J.R., V.N.), University of Massachusetts Chan Medical School, Worcester
- Jennifer Reidy
  - From the Department of Neurology (M.O.), Lahey Health and Medical Center, Burlington, MA; and Department of Neurology (B.S.), and Department of Palliative Care (J.R., V.N.), University of Massachusetts Chan Medical School, Worcester
- Vandana Nagpal
  - From the Department of Neurology (M.O.), Lahey Health and Medical Center, Burlington, MA; and Department of Neurology (B.S.), and Department of Palliative Care (J.R., V.N.), University of Massachusetts Chan Medical School, Worcester
Collapse
|
11
|
Alex CP, Fromme HB, Greenberg L, Ryan MS, Gustafson S, Neeley MK, Nunez S, Rideout ME, VanNostrand J, Orlov NM. Exploring Medical Student Experiences With Direct Observation During the Pediatric Clerkship. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2024; 99:997-1006. [PMID: 38696720 DOI: 10.1097/acm.0000000000005747] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/04/2024]
Abstract
PURPOSE Direct observation (DO) enables assessment of vital competencies, such as clinical skills. Despite a national requirement that medical students experience DOs during each clerkship, the frequency, length, quality, and context of these DOs are not well established. This study examines the quality, quantity, and characteristics of DOs obtained during pediatrics clerkships across multiple institutions. METHOD This multimethod study was performed at 6 U.S.-based institutions from March to October 2022. In the qualitative phase, focus groups and/or semistructured interviews were conducted with third-year medical students at the conclusion of their pediatrics clerkships. In the quantitative phase, the authors administered an internally developed instrument after the focus group discussions or interviews. Qualitative data were analyzed using thematic analysis, and quantitative data were drawn from the anonymous survey responses. RESULTS Seventy-three medical students participated in 20 focus groups, and 71 (97.3%) completed the survey. The authors identified 7 themes, organized into key principles applying before, during, and after the DO. Most students reported that their DOs were conducted primarily by residents (62 [87.3%]) rather than attendings (6 [8.4%]) in inpatient settings. Participants reported daily attending observation of clinical reasoning (38 [53.5%]), communication (39 [54.9%]), and presentation skills (58 [81.7%]). One-third reported they were never observed taking a history by an inpatient attending (23 [32.4%]), and one-quarter reported they were never observed performing a physical exam (18 [25.4%]). CONCLUSIONS This study revealed that students are not being assessed by attendings on vital clinical skills in the inpatient setting as frequently as previously believed.
When observers set expectations, create a safe learning environment, and follow up with actionable feedback, medical students perceive the experience as valuable; however, the DO experience is currently suboptimal. High-quality, competency-based clinical education for medical students is therefore necessary to drive future patient care through a competent physician workforce.
12
Kawaguchi S, Myers J, Li M, Kurahashi AM, Sirianni G, Siemens I. Entrustable Professional Activities in Palliative Medicine: A Faculty and Learner Development Activity. J Palliat Med 2024; 27:1055-1059. [PMID: 39007218 DOI: 10.1089/jpm.2023.0682] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 07/16/2024] Open
Abstract
Background: Faculty development (FD) is critical to the implementation of competency-based medical education (CBME) and yet evidence to guide the design of FD activities is limited. Our aim with this study was to describe and evaluate an FD activity as part of CBME implementation. Methods: Palliative medicine faculty were introduced to entrustable professional activities (EPAs) and gained experience estimating a learner's level of readiness for entrustment by directly observing a simulated encounter. The variation that was found among assessments was discussed in facilitated debrief sessions. Attitudes and confidence levels were measured 1 week and 6 months following debriefs. Results: Participants were able to use the EPA framework when estimating the learner's readiness level for entrustment. Significant improvements in attitudes and level of confidence for several knowledge, skill, and behavior domains were maintained over time. Conclusions: Simulated direct observation and facilitated debriefs contributed to preparing both faculty and learners for CBME and EPA implementation.
Affiliation(s)
- Sarah Kawaguchi, Jeff Myers, Isaac Siemens
- Division of Palliative Care, Department of Family and Community Medicine, Sinai Health System, University of Toronto, Toronto, Canada
- Melissa Li
- Division of Palliative Care, Department of Family and Community Medicine, Toronto Western Hospital, University Health Network, University of Toronto, Toronto, Canada
- Allison M Kurahashi
- The Temmy Latner Centre for Palliative Care, Sinai Health System, Toronto, Canada
- Giovanna Sirianni
- Division of Palliative Care, Department of Family and Community Medicine, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Canada

13
Shuford A, Carney PA, Ketterer B, Jones RL, Phillipi CA, Kraakevik J, Hasan R, Moulton B, Smeraglio A. An Analysis of Workplace-Based Assessments for Core Entrustable Professional Activities for Entering Residency: Does Type of Clinical Assessor Influence Level of Supervision Ratings? ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2024; 99:904-911. [PMID: 38498305 DOI: 10.1097/acm.0000000000005691] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 03/20/2024]
Abstract
PURPOSE The authors describe use of the workplace-based assessment (WBA) coactivity scale according to entrustable professional activities (EPAs) and assessor type to examine how diverse assessors rate medical students using WBAs. METHOD A WBA data collection system was launched at Oregon Health and Science University to visualize learner competency in various clinical settings to foster EPA assessment. WBA data from January 14 to June 18, 2021, for medical students (all years) were analyzed. The outcome variable was level of supervisor involvement in each EPA, and the independent variable was assessor type. RESULTS A total of 7,809 WBAs were included. Most fourth-, third-, and second-year students were assessed by residents or fellows (755 [49.5%], 1,686 [48.5%], and 918 [49.9%], respectively) and first-year students by attending physicians (803 [83.0%]; P < .001). Attendings were least likely to use the highest rating of 4 ('I was available just in case'; 2,148 [56.7%] vs 2,368 [67.7%] for residents; P < .001). Learners more commonly sought WBAs from attendings for EPA 2 (prioritize differential diagnosis), EPA 5 (document clinical encounter), EPA 6 (provide oral presentation), EPA 7 (form clinical questions and retrieve evidence-based medicine), and EPA 12 (perform general procedures of a physician). Residents and fellows were more likely to assess students on EPA 3 (recommend and interpret diagnostic and screening tests), EPA 4 (enter and discuss orders and prescriptions), EPA 8 (give and receive patient handover for transitions in care), EPA 9 (collaborate as member of interprofessional team), EPA 10 (recognize and manage patient in need of urgent care), and EPA 11 (obtain informed consent). CONCLUSIONS Learners preferentially sought resident versus attending supervisors for different EPA assessments.
Future research should investigate why learners seek different assessors more frequently for various EPAs and if assessor type variability in WBA levels holds true across institutions.
14
Cardella L, Lang V, Cross W, Mooney C. Applying a Competency-Based Medical Education Framework to Development of Residents' Feedback Skills. ACADEMIC PSYCHIATRY : THE JOURNAL OF THE AMERICAN ASSOCIATION OF DIRECTORS OF PSYCHIATRIC RESIDENCY TRAINING AND THE ASSOCIATION FOR ACADEMIC PSYCHIATRY 2024; 48:329-333. [PMID: 38740718 DOI: 10.1007/s40596-024-01973-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/17/2023] [Accepted: 04/17/2024] [Indexed: 05/16/2024]
Abstract
OBJECTIVE Feedback is a critically important tool in medical education. This pilot program applies and evaluates a competency-based approach to develop residents' skills in providing feedback to medical students. METHODS In 2018-2019, a competency-based resident feedback skills program incorporating videorecording of skills, multi-source feedback using assessment tools with validity evidence, and sequential deliberate practice was piloted in a single-center, prospective study at the University of Rochester. Study participants included eight second-year psychiatry residents and 23 third-year clerkship students. After an introduction to foundational feedback concepts in didactic sessions, residents were videorecorded providing feedback to medical students. Recordings were reviewed with a faculty member for feedback. Skills were assessed by students who had received resident feedback, residents, and faculty utilizing a tool with validity evidence. Observations were repeated a total of three times. RESULTS Mean feedback scores increased from 2.70 at the first feedback observation, to 2.77 at the second feedback observation, to 2.89 at the third feedback observation (maximum 3.00 points). The differences between the first and third sessions (0.19) and second and third sessions (0.12) were statistically significant (p values were < .001 and .007, with SE of 0.4 and 0.4, respectively). CONCLUSIONS The observed competency-based feedback skills training program for residents using sequential, multi-source review and feedback was feasible and effective. Direct observation is a key component of high-quality feedback, and videorecording is an efficient methodology for observations, enabling both direct observation by the assessor and opportunity for enhanced self-assessment by residents viewing themselves in the feedback encounter.
Affiliation(s)
- Laura Cardella, Valerie Lang, Wendi Cross, Christopher Mooney
- University of Rochester School of Medicine and Dentistry, Rochester, NY, USA

15
Favier R, Proot J, Matiasovic M, Roos A, Knaake F, van der Lee A, den Toom M, Paes G, van Oostrom H, Verstappen F, Beukers M, van den Herik T, Bergknut N. Towards a flexible and personalised development of veterinarians and veterinary nurses working in a companion animal referral care setting. Vet Med Sci 2024; 10:e1518. [PMID: 38952266 PMCID: PMC11217593 DOI: 10.1002/vms3.1518] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2024] [Revised: 05/15/2024] [Accepted: 06/10/2024] [Indexed: 07/03/2024] Open
Abstract
In the Netherlands, the demand for veterinarians and veterinary nurses (VNs) working within referral care is rapidly growing and currently exceeds the number of available board-certified specialists. Simultaneously, a transparent structure to guide training and development and to assess the quality of non-specialist veterinarians and VNs working in a referral setting is lacking. In response, we developed learning pathways guided by an entrustable professional activity (EPA) framework and programmatic assessment to support the personalised development and competence of veterinarians and VNs working in referral settings. Between 4 and 35 EPAs, varying per discipline (n = 11), were developed. To date, 20 trainees across five disciplines have been entrusted. Trainees from these learning pathways have proceeded to acquire new EPAs in addition to their already entrusted set, or have progressed to specialist training during (n = 3) or after successfully completing (n = 1) the learning pathway. Owing to their outcome-based approach, the learning pathways support flexible ways of development.
Affiliation(s)
- Joachim Proot
- Evidensia Dierenziekenhuis Barendrecht, Barendrecht, The Netherlands
- Arno Roos
- Evidensia Dierenziekenhuis Nieuwegein, Nieuwegein, The Netherlands
- Frans Knaake
- Evidensia Dierenziekenhuis Den Haag, Den Haag, The Netherlands
- Geert Paes
- IVC Evidensia the Netherlands, Vleuten, The Netherlands
- Hugo van Oostrom
- Evidensia Dierenziekenhuis Barendrecht, Barendrecht, The Netherlands
- Evidensia Dierenziekenhuis Arnhem, Arnhem, The Netherlands
- Martijn Beukers
- Evidensia Dierenziekenhuis Barendrecht, Barendrecht, The Netherlands
- Evidensia Dierenziekenhuis Hart van Brabant, Waalwijk, The Netherlands
- Niklas Bergknut
- Evidensia Dierenziekenhuis Hart van Brabant, Waalwijk, The Netherlands

16
Booth GJ, Hauert T, Mynes M, Hodgson J, Slama E, Goldman A, Moore J. Fine-Tuning Large Language Models to Enhance Programmatic Assessment in Graduate Medical Education. THE JOURNAL OF EDUCATION IN PERIOPERATIVE MEDICINE : JEPM 2024; 26:E729. [PMID: 39354917 PMCID: PMC11441632 DOI: 10.46374/volxxvi_issue3_moore] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/03/2024]
Abstract
Background Natural language processing is a collection of techniques designed to empower computer systems to comprehend and/or produce human language. The purpose of this investigation was to train several large language models (LLMs) to explore the tradeoff between model complexity and performance while classifying narrative feedback on trainees into the Accreditation Council for Graduate Medical Education subcompetencies. We hypothesized that classification accuracy would increase with model complexity. Methods The authors fine-tuned several transformer-based LLMs (Bidirectional Encoder Representations from Transformers [BERT]-base, BERT-medium, BERT-small, BERT-mini, BERT-tiny, and SciBERT) to predict Accreditation Council for Graduate Medical Education subcompetencies on a curated dataset of 10,218 feedback comments. Performance was compared with the authors' previous work, which trained a FastText model on the same dataset. Performance metrics included F1 score for global model performance and area under the receiver operating characteristic curve for each competency. Results No model was superior to FastText. Only BERT-tiny performed worse than FastText. The smallest model with comparable performance to FastText, BERT-mini, was 94% smaller. Area under the receiver operating characteristic curve for each competency was similar on BERT-mini and FastText with the exceptions of Patient Care 7 (Situational Awareness and Crisis Management) and Systems-Based Practice. Discussion Transformer-based LLMs were fine-tuned to understand anesthesiology graduate medical education language. Complex LLMs did not outperform FastText. However, equivalent performance was achieved with a model that was 94% smaller, which may allow model deployment on personal devices to enhance speed and data privacy. This work advances our understanding of best practices when integrating LLMs into graduate medical education.
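The Booth et al. abstract compares classifiers by F1 score as the global performance metric. As a minimal, dependency-free sketch (the labels and data below are hypothetical illustrations, not the authors' code or dataset), micro-averaged F1 pools true positives, false positives, and false negatives across all classes before computing precision and recall:

```python
def micro_f1(y_true, y_pred, labels):
    """Micro-averaged F1: pool TP/FP/FN over all classes,
    then compute precision and recall once on the pooled counts."""
    tp = fp = fn = 0
    for label in labels:
        for t, p in zip(y_true, y_pred):
            if p == label and t == label:
                tp += 1  # correctly predicted this class
            elif p == label:
                fp += 1  # predicted this class, but it was another
            elif t == label:
                fn += 1  # missed an instance of this class
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical subcompetency labels for six feedback comments.
true_labels = ["PC7", "SBP", "PC7", "MK", "SBP", "MK"]
pred_labels = ["PC7", "SBP", "MK", "MK", "SBP", "PC7"]
score = micro_f1(true_labels, pred_labels, {"PC7", "SBP", "MK"})
```

For single-label multi-class classification, as above, micro-averaged F1 reduces to accuracy (here 4 of 6 correct); the two metrics diverge only in multi-label settings.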
Affiliation(s)
- Gregory J Booth, Thomas Hauert, Mike Mynes, John Hodgson, Elizabeth Slama, Ashton Goldman, Jeffrey Moore
- The following authors are in both the Department of Anesthesiology, Uniformed Services University, Bethesda, MD, and Department of Anesthesiology and Pain Medicine, Naval Medical Center Portsmouth, Portsmouth, VA: Gregory J. Booth is an Associate Professor at Uniformed Services University and Program Director, Anesthesiology Residency at Naval Medical Center Portsmouth; Mike Mynes and Elizabeth Slama are Assistant Professors at Uniformed Services University and Staff Anesthesiologists at Naval Medical Center Portsmouth; Jeffrey Moore is an Assistant Professor at Uniformed Services University and Program Director, Pain Medicine Fellowship, and Associate Designated Institutional Official at Naval Medical Center Portsmouth. Thomas Hauert is an Anesthesiology Resident Physician at Naval Medical Center Portsmouth, Portsmouth, VA. Ashton Goldman is an Associate Professor at Uniformed Services University, Bethesda, MD, and a Staff Orthopedic Surgeon at the Department of Orthopedic Surgery and Sports Medicine at Naval Medical Center Portsmouth, Portsmouth, VA. John Hodgson is an Associate Professor and Program Director, Anesthesiology Residency at University of South Florida, Tampa, FL.

17
Ekman N, Fors A, Moons P, Boström E, Taft C. Are the content and usability of a new direct observation tool adequate for assessing competency in delivering person-centred care: a think-aloud study with patients and healthcare professionals in Sweden. BMJ Open 2024; 14:e085198. [PMID: 38950999 PMCID: PMC11328633 DOI: 10.1136/bmjopen-2024-085198] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/09/2024] [Accepted: 05/30/2024] [Indexed: 07/03/2024] Open
Abstract
OBJECTIVE To evaluate the content and usability of a new direct observation tool for assessing competency in delivering person-centred care based on the Gothenburg Centre for Person-Centred Care (gPCC) framework. DESIGN This was a qualitative study using think-aloud techniques and retrospective probing interviews, analyzed using deductive content analysis. SETTING Sessions were conducted remotely via Zoom with participants in their homes or offices. PARTICIPANTS 11 participants with lengthy experience of receiving, delivering and/or implementing gPCC were recruited using purposeful sampling and selected to represent a broad variety of stakeholders and potential end-users. RESULTS Participants generally considered the content of the four main domains of the tool, that is, person-centred care activities, clinician manner, clinician skills and person-centred care goals, to be comprehensive and relevant for assessing person-centred care in general and gPCC in particular. Some participants pointed to the need to expand person-centred care activities to better reflect the emphasis on eliciting patient resources/capabilities and psychosocial needs in the gPCC framework. Think-aloud analyses revealed some usability issues, primarily difficulties or uncertainties in understanding several words and in using the rating scale. Probing interviews indicated that these problems could be mitigated by improving written instructions regarding response options and by replacing some words. Participants generally were satisfied with the layout and structure of the tool, but some suggested enlarging font size and text spacing to improve readability. CONCLUSION The tool appears to satisfactorily cover major person-centred care activities outlined in the gPCC framework.
The inclusion of content concerning clinician manner and skills was seen as a relevant embellishment of the framework and as contributing to a more comprehensive assessment of clinician performance in the delivery of person-centred care. A revised version addressing observed content and usability issues will be tested for inter-rater and intra-rater reliability and for feasibility of use in healthcare education and quality improvement efforts.
Affiliation(s)
- Nina Ekman
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
- Department of Public Health and Primary Care, Faculty of Medicine, KU Leuven, Leuven, Belgium
- Andreas Fors
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
- Region Västra Götaland, Research, Education, Development and Innovation, Primary Health Care, Gothenburg, Sweden
- Philip Moons
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
- Department of Public Health and Primary Care, Faculty of Medicine, KU Leuven, Leuven, Belgium
- Department of Pediatrics and Child Health, University of Cape Town, Cape Town, South Africa
- Eva Boström
- Department of Nursing, University of Umeå, Umeå, Sweden
- Charles Taft
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden

18
Griffith M, Zvonar I, Garrett A, Bayaa N. Making goals count: A theory-informed approach to on-shift learning goals. AEM EDUCATION AND TRAINING 2024; 8:e10993. [PMID: 38882241 PMCID: PMC11178521 DOI: 10.1002/aet2.10993] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/21/2024] [Revised: 04/23/2024] [Accepted: 04/25/2024] [Indexed: 06/18/2024]
Abstract
Supervisors often ask emergency medicine trainees for their learning goals at the start of a clinical shift, though they may do so without considering the reasons for this practice. Recognizing the underlying rationale for voicing on-shift learning goals and proactively considering solutions for some of the associated challenges can help learners and supervisors employ this practice to its full potential. Goal articulation is rooted in educational principles such as self-regulated learning, targeted performance feedback, and collaborative relationships between learner and supervisor. Despite the potential for on-shift learning goals to augment learning, numerous barriers make it challenging for learners and supervisors alike to create or follow up on meaningful goals. Learner-related challenges include uncertainty about how to develop goals within an unpredictable clinical environment and a tendency to create goals that are too narrow or too broad in scope. Supervisor-related challenges include difficulties integrating direct observation into the clinical workflow and a desire to avoid negative feedback. The learning environment also presents inherent challenges, such as lack of longitudinal supervisor-learner relationships, time constraints, space limitations, and incentives for learners to conceal their knowledge gaps. The authors discuss these challenges to effective on-shift learning goals and propose solutions that target the learner's approach, the supervisor's approach, and the learning environment itself.
Affiliation(s)
- Max Griffith, Ivan Zvonar, Alexander Garrett, Naeem Bayaa
- Department of Emergency Medicine, University of Washington, Seattle, Washington, USA

19
Richardson D, Landreville JM, Trier J, Cheung WJ, Bhanji F, Hall AK, Frank JR, Oswald A. Coaching in Competence by Design: A New Model of Coaching in the Moment and Coaching Over Time to Support Large Scale Implementation. PERSPECTIVES ON MEDICAL EDUCATION 2024; 13:33-43. [PMID: 38343553 PMCID: PMC10854464 DOI: 10.5334/pme.959] [Citation(s) in RCA: 7] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 03/08/2023] [Accepted: 11/23/2023] [Indexed: 02/15/2024]
Abstract
Coaching is an increasingly popular means to provide individualized, learner-centered, developmental guidance to trainees in competency-based medical education (CBME) curricula. Aligned with CBME's core components, coaching can assist in leveraging the full potential of this educational approach. With its focus on growth and improvement, coaching helps trainees develop clinical acumen and self-regulated learning skills. Developing a shared mental model for coaching in the medical education context is crucial to facilitate integration and subsequent evaluation of success. This paper describes the Royal College of Physicians and Surgeons of Canada's coaching model, one that is theory-based, evidence-informed, principle-driven, and iteratively developed by a multidisciplinary team. The coaching model was specifically designed to be fit for purpose in the postgraduate medical education (PGME) context and implemented as part of Competence by Design (CBD), a new competency-based PGME program. This coaching model differentiates two coaching roles, which reflect different contexts in which postgraduate trainees learn and develop skills. Both roles are supported by the RX-OCR process: developing Relationship/Rapport, setting eXpectations, Observing, a Coaching conversation, and Recording/Reflecting. The CBD Coaching Model and its associated RX-OCR faculty development tool support the implementation of coaching in CBME. Coaching in the moment and coaching over time offer important mechanisms by which CBD brings value to trainees. For sustained change to occur and for learners and coaches to experience the model's intended benefits, ongoing professional development efforts are needed. Early post-implementation reflections and lessons learned are provided.
Affiliation(s)
- Denyse Richardson
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Jessica Trier
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Warren J. Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Farhan Bhanji
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Education, Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Andrew K. Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Jason R. Frank
- University of Ottawa Faculty of Medicine, Ottawa, ON, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
20.
Alpine L, Barrett E, Broderick J, Mockler D, O'Connor A. Education programmes on performance-based assessment for allied health and nursing clinical educators: A scoping review protocol. HRB Open Res 2024; 6:11. [PMID: 39906759 PMCID: PMC11791400 DOI: 10.12688/hrbopenres.13669.2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 01/09/2024] [Indexed: 02/06/2025] Open
Abstract
Background Performance-based assessment (PBA) is a complex process undertaken in the workplace by healthcare practitioners known as clinical educators, who assist universities in determining health professional students' readiness for independent practice. Preparing healthcare professionals for PBA is considered essential to ensuring the quality of the assessment process in the clinical learning environment. A preliminary search of the literature indicated a paucity of research guiding the development of education programmes that support clinical educators to understand and implement PBA. Objective The aim of this scoping review is to investigate and describe education programmes delivered to allied health and nursing clinical educators to develop PBA knowledge and skills. Methods This review will follow the Joanna Briggs Institute (JBI) methodology for conducting scoping reviews. Electronic databases relevant to this research topic will be searched, including EMBASE, ERIC, MEDLINE (Ovid), Web of Science, and CINAHL, along with targeted databases for grey literature. Studies that include PBA as the main focus or as a component of education programmes, of any format, delivered to clinical educators in allied health and nursing will be included. Studies may report the design and/or implementation and/or evaluation of PBA education programmes. Relevant English language publications will be sought from January 2000 to October 2022. Two reviewers will screen all titles and abstracts against the inclusion/exclusion criteria, and publications deemed relevant will proceed to full-text screening to confirm appropriateness for inclusion in the scoping review. Data will be charted to create a table of the results, supported by a narrative summary of the findings in line with the review objectives.
Affiliation(s)
- Lucy Alpine
- Discipline of Physiotherapy, Trinity College Dublin, Trinity Center for Health Sciences, Dublin, D08W9RT, Ireland
- Emer Barrett
- Discipline of Physiotherapy, Trinity College Dublin, Trinity Center for Health Sciences, Dublin, D08W9RT, Ireland
- Julie Broderick
- Discipline of Physiotherapy, Trinity College Dublin, Trinity Center for Health Sciences, Dublin, D08W9RT, Ireland
- David Mockler
- John Sterne Library, Trinity College Dublin, Trinity Center for Health Sciences, Dublin, D08W9RT, Ireland
- Anne O'Connor
- Munster Physiotherapy Clinic, Limerick, V94R9VW, Ireland
21.
Mitchell EC, Ott M, Ross D, Grant A. Development of a Tool to Assess Surgical Resident Competence On-Call: The Western University Call Assessment Tool (WUCAT). JOURNAL OF SURGICAL EDUCATION 2024; 81:106-114. [PMID: 38008642 DOI: 10.1016/j.jsurg.2023.10.001] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/10/2023] [Revised: 09/13/2023] [Accepted: 10/02/2023] [Indexed: 11/28/2023]
Abstract
BACKGROUND A central tenet of competency-based medical education is the formative assessment of trainees. There are currently no assessments designed to examine resident competence on-call, despite the on-call period being a significant component of residency, characterized by less direct supervision than daytime work. The purpose of this study was to design a formative on-call assessment tool and collect validity evidence on its application. METHODS Nominal group technique was used to identify critical elements of surgical resident competence on-call to inform tool development. The tool was piloted over six months in the Division of Plastic & Reconstructive Surgery at our institution. Quantitative and qualitative evidence was collected to examine tool validity. RESULTS A ten-item tool was developed based on the consensus group results. Sixty-three assessments were completed by seven staff members on ten residents during the pilot. The tool had a reliability coefficient of 0.67 based on a generalizability study, and internal item consistency was 0.92. Scores were significantly associated with years of training. We found that the tool improved the quantity and structure of feedback given and was considered feasible and acceptable by both residents and staff members. CONCLUSIONS The Western University Call Assessment Tool (WUCAT) has multiple sources of evidence supporting its use in assessing resident competence on-call.
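The internal item consistency of 0.92 reported above is conventionally computed as Cronbach's alpha. As an illustration of that formula only (not the WUCAT authors' analysis, and using invented ratings), a minimal Python sketch:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    item_scores: one list per item, each holding that item's score for every
    assessment (all lists the same length).
    """
    k = len(item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    item_var = sum(pvariance(scores) for scores in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Invented ratings for a hypothetical 3-item tool across 4 assessments
items = [
    [3, 4, 5, 4],
    [3, 5, 5, 4],
    [2, 4, 5, 5],
]
alpha = cronbach_alpha(items)
```

Higher alpha indicates that the items move together across assessments; values near the 0.92 reported above suggest the tool's items measure a coherent construct.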
Affiliation(s)
- Eric C Mitchell
- Department of Surgery, Western University, London, Ontario, Canada
- Michael Ott
- Department of Surgery, Western University, London, Ontario, Canada
- Douglas Ross
- Department of Surgery, Western University, London, Ontario, Canada
- Aaron Grant
- Department of Surgery, Western University, London, Ontario, Canada
22.
Andreou V, Peters S, Eggermont J, Schoenmakers B. Evaluating Feedback Comments in Entrustable Professional Activities: A Cross-Sectional Study. JOURNAL OF MEDICAL EDUCATION AND CURRICULAR DEVELOPMENT 2024; 11:23821205241275810. [PMID: 39346122 PMCID: PMC11437546 DOI: 10.1177/23821205241275810] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 03/28/2024] [Accepted: 07/04/2024] [Indexed: 10/01/2024]
Abstract
INTRODUCTION Competency-based medical education (CBME) has transformed postgraduate medical training, prioritizing competency acquisition over traditional time-based curricula. Integral to CBME are Entrustable Professional Activities (EPAs), which aim to provide high-quality feedback for trainee development. Despite its importance, the quality of feedback within EPAs remains underexplored. METHODS We employed a cross-sectional study to explore feedback quality within EPAs and to examine factors influencing the length of written comments and their relationship to quality. We collected and analyzed 1163 written feedback comments using the Quality of Assessment for Learning (QuAL) score. The QuAL evaluates written feedback from low-stakes workplace assessments against 3 quality criteria (evidence, suggestion, connection). We then performed correlation and regression analyses to examine factors influencing feedback length and quality. RESULTS EPAs facilitated high-quality written feedback, with a significant proportion of comments meeting the quality criteria. Task-oriented and actionable feedback was prevalent, enhancing the value of low-stakes workplace assessments. The type of assessment tool significantly influenced feedback length and quality, indicating that direct and video observations can yield superior feedback compared with case-based discussions. However, no correlation between entrustment scores and feedback quality was found, suggesting potential discrepancies between the feedback and the score on the entrustability scale. CONCLUSION This study underscores the role of EPAs in fostering high-quality feedback within CBME. It also highlights the multifaceted dynamics of feedback, suggesting that factors such as feedback length and assessment tool influence feedback quality. Future research should further explore contextual factors for enhancing medical education practices.
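The three QuAL criteria named above (evidence, suggestion, connection) are typically scored as evidence 0-3 plus one point each for the presence of a suggestion and for a suggestion linked to the described behaviour, for a total out of 5. A minimal sketch of such a rubric tally; the function name and interface are invented for illustration, not taken from the study:

```python
def qual_score(evidence: int, has_suggestion: bool, is_connected: bool) -> int:
    """Total a QuAL-style rating out of 5.

    evidence: 0-3 rating of how well the comment describes observed performance.
    has_suggestion: a suggestion for improvement is present (worth 1 point).
    is_connected: the suggestion is linked to the described behaviour (1 point);
    a connection point only makes sense when a suggestion exists, so it is
    conditioned on has_suggestion.
    """
    if not 0 <= evidence <= 3:
        raise ValueError("evidence must be rated 0-3")
    return evidence + int(has_suggestion) + int(has_suggestion and is_connected)
```

Conditioning the connection point on the suggestion point mirrors the rubric's logic: a comment cannot link a nonexistent suggestion back to the behaviour it describes.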
Affiliation(s)
- Vasiliki Andreou
- Department of Public Health and Primary Care, Academic Centre for General Practice, KU Leuven, Leuven, Belgium
- Sanne Peters
- Department of Public Health and Primary Care, Academic Centre for General Practice, KU Leuven, Leuven, Belgium
- School of Health Sciences, Faculty of Medicine, Dentistry and Health Sciences, The University of Melbourne, Melbourne, Australia
- Jan Eggermont
- Department of Cellular and Molecular Medicine, KU Leuven, Leuven, Belgium
- Birgitte Schoenmakers
- Department of Public Health and Primary Care, Academic Centre for General Practice, KU Leuven, Leuven, Belgium
23.
Marty AP, Linsenmeyer M, George B, Young JQ, Breckwoldt J, Ten Cate O. Mobile technologies to support workplace-based assessment for entrustment decisions: Guidelines for programs and educators: AMEE Guide No. 154. MEDICAL TEACHER 2023; 45:1203-1213. [PMID: 36706225 DOI: 10.1080/0142159x.2023.2168527] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/18/2023]
Abstract
With the rise of competency-based medical education and workplace-based assessment (WBA) since the turn of the century, much has been written about methods of assessment. Direct observation and other sources of information have become standard in many clinical programs. Entrustable professional activities (EPAs) have also become a central focus of assessment in the clinical workplace. Paper and pencil (one of the earliest mobile technologies!) for documenting observations has become almost obsolete with the advent of digital technology. Typically, clinical supervisors are asked to document assessment ratings using forms on computers. However, accessing these forms can be cumbersome and is not easily integrated into existing clinical workflows. With a call for more frequent documentation, this practice is hardly sustainable, and mobile technology is quickly becoming indispensable. Documentation of learner performance at the point of care merges WBA with patient care, and WBA increasingly uses smartphone applications for this purpose. This AMEE Guide was developed to support institutions and programs that wish to use mobile technology to implement EPA-based assessment and, more generally, any type of workplace-based assessment. It covers the background of WBA, EPAs, and entrustment decision-making; provides guidance for choosing or developing mobile technology; discusses challenges; and describes best practices.
Affiliation(s)
- Machelle Linsenmeyer
- West Virginia School of Osteopathic Medicine, Lewisburg, WV, United States of America
- Brian George
- Surgery and Learning Health Sciences, University of Michigan, Ann Arbor, Michigan, United States of America
- John Q Young
- Department of Psychiatry, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell & Zucker Hillside Hospital, NY, United States of America
- Jan Breckwoldt
- Institute of Anesthesia at the University Hospital Zurich, Switzerland
- Olle Ten Cate
- Utrecht Center for Research and Development of Health Professions Education at UMC Utrecht, The Netherlands
24.
Hauer KE, Chang A, van Schaik SM, Lucey C, Cowell T, Teherani A. "It's All About the Trust And Building A Foundation:" Evaluation of a Longitudinal Medical Student Coaching Program. TEACHING AND LEARNING IN MEDICINE 2023; 35:550-564. [PMID: 35996842 DOI: 10.1080/10401334.2022.2111570] [Citation(s) in RCA: 13] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/07/2021] [Accepted: 08/01/2022] [Indexed: 06/15/2023]
Abstract
Coaching is increasingly implemented in medical education to support learners' growth, learning, and wellbeing. Data demonstrating the impact of longitudinal coaching programs are needed. We developed and evaluated a comprehensive longitudinal medical student coaching program designed to achieve three aims for students: fostering personal and professional development, advancing physician skills with a growth mindset, and promoting student wellbeing and belonging within an inclusive learning community. We also sought to advance coaches' development as faculty through satisfying education roles with structured training. Students meet with coaches weekly for the first 17 months of medical school for patient care and health systems skills learning, and at least twice yearly throughout the remainder of medical school for individual progress and planning meetings and small-group discussions about professional identity. Using the developmental evaluation framework, we iteratively evaluated the program over the first five years of implementation with multiple quantitative and qualitative measures of students' and coaches' experiences related to the three aims. The University of California, San Francisco, School of Medicine, developed a longitudinal coaching program in 2016 for medical students alongside reform of the four-year curriculum. The coaching program addressed unmet student needs for a longitudinal, non-evaluative relationship with a coach to support their development, shape their approach to learning, and promote belonging and community. In surveys and focus groups, students reported high satisfaction with coaching in measures of the three program aims. They appreciated coaches' availability and guidance for the range of academic, personal, career, and other questions they had throughout medical school. Students endorsed the value of a longitudinal relationship and coaches' ability to meet their changing needs over time. 
Students rated coaches' teaching of foundational clinical skills highly. Students observed coaches learning some clinical skills alongside them, namely skills outside a coach's daily practice. Students also raised some concerns about variability among coaches. Attention to wellbeing and belonging to a learning community were program highlights for students. Coaches benefited from relationships with students and other coaches and welcomed the professional development to equip them to support all student needs. Students perceive that a comprehensive medical student coaching program can achieve aims to promote their development and provide support. Within a non-evaluative longitudinal coach relationship, students build skills in driving their own learning and improvement. Coaches experience a satisfying yet challenging role. Ongoing faculty development within a coach community and funding for the role seem essential for coaches to fulfill their responsibilities.
Affiliation(s)
- Karen E Hauer
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Anna Chang
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Sandrijn M van Schaik
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Catherine Lucey
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Tami Cowell
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Arianne Teherani
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
25.
McCarthy N, Neville K, Pope A, Barry L, Livingstone V. Effectiveness of a proficiency-based progression e-learning approach to training in communication in the context of clinically deteriorating patients: a multi-arm randomised controlled trial. BMJ Open 2023; 13:e072488. [PMID: 37536965 PMCID: PMC10401258 DOI: 10.1136/bmjopen-2023-072488] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/03/2023] [Accepted: 06/08/2023] [Indexed: 08/05/2023] Open
Abstract
OBJECTIVE To determine the effectiveness of proficiency-based progression (PBP) e-learning in training in communication concerning clinically deteriorating patients. DESIGN Single-centre multi-arm randomised double-blind controlled trial with three parallel arms. RANDOMISATION, SETTING AND PARTICIPANTS A computer-generated program randomised and allocated 120 final year medical students in an Irish university into three trial groups. INTERVENTION Each group completed the standard Identification, Situation, Background, Assessment, Recommendation communication e-learning; group 1 (HSE), the Health Service Executive course group, performed this alone; group 2 (PBP) performed additional e-learning using PBP scenarios with expert-determined proficiency benchmarks comprising weighted marking schemes with cut-offs for steps, errors, and critical errors; group 3 (S) (self-directed, no PBP) performed additional e-learning with scenarios identical to (PBP) but without PBP. MAIN OUTCOME MEASURES Primary analysis was based on 114 students, comparing ability to reach the expert-determined predefined proficiency benchmark in a standardised low-fidelity simulation assessment before and after completion of each group's e-learning requirements. Performance was recorded and scored by two independent blinded assessors. RESULTS Post-intervention, proficiency in each group in the low-fidelity simulation environment improved, with a statistically significant difference in proficiency between groups (p<0.001). Proficiency was highest in (PBP) (81.1%, 30/37). Post hoc pairwise comparisons revealed statistically significant differences between (PBP) and self-directed (S) (p<0.001) and (HSE) (p<0.001). No statistically significant difference existed between (S) and (HSE) (p=0.479). Changes in proficiency from pre-intervention to post-intervention were significantly different between the three groups (p=0.001). Post-intervention, an extra 67.6% (25/37) in (PBP) achieved proficiency in the low-fidelity simulation.
Post hoc pairwise comparisons revealed statistically significant differences between (PBP) and both (S) (p=0.020) and (HSE) (p<0.001). No statistically significant difference was found between (S) and (HSE) (p=0.156). CONCLUSIONS PBP e-learning is a more effective way to train in communication concerning clinically deteriorating patients than standard e-learning or e-learning without PBP. TRIAL REGISTRATION NUMBER NCT02937597.
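The proficiency benchmark described above combines completed steps with error and critical-error cut-offs. A schematic of that pass/fail logic follows; the thresholds are illustrative defaults, not the trial's expert-determined values:

```python
def meets_proficiency(steps_completed: int, errors: int, critical_errors: int,
                      min_steps: int = 9, max_errors: int = 2) -> bool:
    # PBP-style benchmark: enough correctly performed scenario steps, errors
    # at or below the cut-off, and zero critical errors. The min_steps and
    # max_errors defaults are invented for illustration; the trial used
    # expert-determined weighted marking schemes.
    return (steps_completed >= min_steps
            and errors <= max_errors
            and critical_errors == 0)
```

Making critical errors an absolute disqualifier, independent of the step count, reflects the usual PBP convention that some failures cannot be offset by otherwise good performance.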
Affiliation(s)
- Nora McCarthy
- Medical Education Unit, School of Medicine, University College Cork, Cork, Ireland
- Karen Neville
- Department of Business Information Systems, Cork University Business School, University College Cork, Cork, Ireland
- Andrew Pope
- Department of Business Information Systems, Cork University Business School, University College Cork, Cork, Ireland
- Lee Barry
- ESA-BIC, Tyndall Institute, University College Cork, National University of Ireland, Cork, Ireland
- Vicki Livingstone
- INFANT Centre, University College Cork, National University of Ireland, Cork, Ireland
26.
Miller KA, Nagler J, Wolff M, Schumacher DJ, Pusic MV. It Takes a Village: Optimal Graduate Medical Education Requires a Deliberately Developmental Organization. PERSPECTIVES ON MEDICAL EDUCATION 2023; 12:282-293. [PMID: 37520509 PMCID: PMC10377742 DOI: 10.5334/pme.936] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/14/2023] [Accepted: 07/06/2023] [Indexed: 08/01/2023]
Abstract
Coaching is proposed as a means of improving the learning culture of medicine. By fostering trusting teacher-learner relationships, coaching encourages learners to embrace feedback and make the most of failure. This paper posits that a cultural shift is necessary to fully harness the potential of coaching in graduate medical education. We introduce the deliberately developmental organization framework, a conceptual model focusing on three core dimensions: developmental communities, developmental aspirations, and developmental practices. These dimensions broaden the scope of coaching interactions. Implementing this organizational change within graduate medical education may be challenging, yet we argue that embracing deliberately developmental principles can embed coaching into everyday interactions and foster a culture in which discussing failure to maximize learning becomes acceptable. By applying the dimensions of developmental communities, aspirations, and practices, we present a six-principle roadmap towards transforming graduate medical education training programs into deliberately developmental organizations.
Affiliation(s)
- Kelsey A. Miller
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Joshua Nagler
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Margaret Wolff
- Emergency Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Daniel J. Schumacher
- Cincinnati Children’s Hospital Medical Center and the University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Martin V. Pusic
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
27.
Berger S, Stalmeijer RE, Marty AP, Berendonk C. Exploring the Impact of Entrustable Professional Activities on Feedback Culture: A Qualitative Study of Anesthesiology Residents and Attendings. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2023; 98:836-843. [PMID: 36812061 DOI: 10.1097/acm.0000000000005188] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/18/2023]
Abstract
PURPOSE Entrustable professional activities (EPAs) were introduced as a potential way to optimize workplace-based assessments. Yet, recent studies suggest that EPAs have not yet overcome all of the challenges to implementing meaningful feedback. The aim of this study was to explore the extent to which the introduction of EPAs via a mobile app impacts feedback culture as experienced by anesthesiology residents and attending physicians. METHOD Using a constructivist grounded theory approach, the authors interviewed a purposive and theoretical sample of residents (n = 11) and attendings (n = 11) at the Institute of Anaesthesiology, University Hospital of Zurich, where EPAs had recently been implemented. Interviews took place between February and December 2021. Data collection and analysis were conducted iteratively. The authors used open, axial, and selective coding to gain knowledge and understanding of the interplay between EPAs and feedback culture. RESULTS Participants reflected on a number of changes in their day-to-day experience of feedback culture with the implementation of EPAs. Three main mechanisms were instrumental in this process: lowering the feedback threshold, a change in feedback focus, and gamification. Participants perceived a lower threshold to seeking and giving feedback; feedback conversations became more frequent, shorter, and more focused on a specific topic, while feedback content tended to concentrate more on technical skills, with more attention given to average performances. Residents indicated that the app-based approach fostered a game-like motivation to "climb levels," while attendings did not perceive a game-like experience. CONCLUSIONS EPAs may offer a solution to the problem of infrequent feedback and invite attention to average performances and technical competencies, but this may come at the expense of feedback on nontechnical skills.
This study suggests that feedback culture and feedback instruments have a mutually interacting influence on each other.
Affiliation(s)
- Sabine Berger
- S. Berger is a third-year medical resident, Internal Medicine Training Program, St. Claraspital, Basel, Switzerland
- Renee E Stalmeijer
- R.E. Stalmeijer is associate professor, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Adrian P Marty
- A.P. Marty is currently senior attending physician and team lead for education, Institute of Anaesthesiology, Intensive Care and Pain Medicine, Orthopedic University Hospital Balgrist, Zurich, Switzerland. At the time of writing, he was attending physician, Institute of Anaesthesiology, University of Zurich, University Hospital of Zurich, Zurich, Switzerland
- Christoph Berendonk
- C. Berendonk is senior lecturer in medical education, Institute for Medical Education, University of Bern, Bern, Switzerland
28.
Paterson QS, Alrimawi H, Sample S, Bouwsema M, Anjum O, Vincent M, Cheung WJ, Hall A, Woods R, Martin LJ, Chan T. Examining enablers and barriers to entrustable professional activity acquisition using the theoretical domains framework: A qualitative framework analysis study. AEM EDUCATION AND TRAINING 2023; 7:e10849. [PMID: 36994315 PMCID: PMC10041073 DOI: 10.1002/aet2.10849] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/04/2022] [Revised: 01/20/2023] [Accepted: 01/29/2023] [Indexed: 06/19/2023]
Abstract
Background Without a clear understanding of the factors contributing to the effective acquisition of high-quality entrustable professional activity (EPA) assessments, trainees, supervising faculty, and training programs may lack appropriate strategies for successful EPA implementation and utilization. The purpose of this study was to identify barriers and facilitators to acquiring high-quality EPA assessments in Canadian emergency medicine (EM) training programs. Methods We conducted a qualitative framework analysis study utilizing the Theoretical Domains Framework (TDF). Semistructured interviews of EM resident and faculty participants were audio recorded, deidentified, and coded line-by-line by two authors to extract themes and subthemes across the domains of the TDF. Results From 14 interviews (eight faculty and six residents), we identified, within the 14 TDF domains, major themes and subthemes for barriers and facilitators to EPA acquisition for both faculty and residents. The two most cited domains (and their frequencies) among residents and faculty were environmental context and resources (56) and behavioral regulation (48). Example strategies for improving EPA acquisition include orienting residents to the competency-based medical education (CBME) paradigm, recalibrating expectations relating to "low ratings" on EPAs, engaging in continuous faculty development to ensure familiarity and fluency with EPAs, and implementing longitudinal coaching programs between residents and faculty to encourage repeated longitudinal interactions and high-quality, specific feedback. Conclusions We identified key strategies to support residents, faculty, programs, and institutions in overcoming barriers and improving EPA assessment processes. This is an important step toward ensuring the successful implementation of CBME and the effective operationalization of EPAs within EM training programs.
Affiliation(s)
- Quinten S. Paterson
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Hussein Alrimawi
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Spencer Sample
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Melissa Bouwsema
- Department of Emergency Medicine, Queen’s University, Kingston, Ontario, Canada
- Omar Anjum
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Maggie Vincent
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Andrew Hall
- Department of Emergency Medicine, Queen’s University, Kingston, Ontario, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Lynsey J. Martin
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Teresa Chan
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
29.
Booth GJ, Ross B, Cronin WA, McElrath A, Cyr KL, Hodgson JA, Sibley C, Ismawan JM, Zuehl A, Slotto JG, Higgs M, Haldeman M, Geiger P, Jardine D. Competency-Based Assessments: Leveraging Artificial Intelligence to Predict Subcompetency Content. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2023; 98:497-504. [PMID: 36477379 DOI: 10.1097/acm.0000000000005115] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/17/2023]
Abstract
PURPOSE Faculty feedback on trainees is critical to guiding trainee progress in a competency-based medical education framework. The authors aimed to develop and evaluate a natural language processing (NLP) algorithm that automatically categorizes narrative feedback into the corresponding Accreditation Council for Graduate Medical Education Milestone 2.0 subcompetencies. METHOD Ten academic anesthesiologists analyzed 5,935 narrative evaluations of anesthesiology trainees at 4 graduate medical education (GME) programs between July 1, 2019, and June 30, 2021. Each sentence (n = 25,714) was labeled with the Milestone 2.0 subcompetency that best captured its content or was labeled as demographic or not useful. Interrater agreement was assessed with Fleiss' kappa. The authors trained an NLP model to predict feedback subcompetencies using data from 3 sites and evaluated its performance at a fourth site. Performance metrics included area under the receiver operating characteristic curve (AUC), positive predictive value, sensitivity, F1 score, and calibration curves. The model was implemented at 1 site in a self-assessment exercise. RESULTS Fleiss' kappa for subcompetency agreement was moderate (0.44). Model performance was good for professionalism, interpersonal and communication skills, and practice-based learning and improvement (AUC 0.79, 0.79, and 0.75, respectively). Performance for subcompetencies within medical knowledge and patient care ranged from fair to excellent (AUC 0.66-0.84 and 0.63-0.88, respectively). Performance for systems-based practice was poor (AUC 0.59). Performance for the demographic and not useful categories was excellent (AUC 0.87 for both). In approximately 1 minute, the model interpreted several hundred evaluations and produced individual trainee reports with organized feedback to guide a self-assessment exercise. The model was built into a web-based application. CONCLUSIONS The authors developed an NLP model that recognized the feedback language of anesthesiologists across multiple GME programs. The model was operationalized in a self-assessment exercise. It is a powerful tool that rapidly organizes large amounts of narrative feedback.
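As a rough illustration of the evaluation pattern this abstract describes (scoring each feedback sentence against a subcompetency and summarising discrimination with AUC), the following minimal, self-contained Python sketch uses an invented keyword scorer and toy sentences as stand-ins for the authors' trained NLP model; none of the function names, cue words, or example sentences come from the study.

```python
# Hypothetical sketch: score narrative-feedback sentences against one
# subcompetency ("professionalism") and compute AUC for that binary task.
# The keyword scorer stands in for a trained classifier's probability output.

def keyword_score(sentence, keywords):
    """Fraction of cue words present in the sentence (a toy 'probability')."""
    words = set(sentence.lower().split())
    return sum(1 for k in keywords if k in words) / len(keywords)

def auc(scores, labels):
    """AUC via its rank (Mann-Whitney) interpretation: the probability that
    a randomly chosen positive example outscores a randomly chosen negative,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy labeled evaluation set: 1 = professionalism sentence, 0 = other.
prof_cues = ["respectful", "professional", "integrity", "punctual"]
sentences = [
    ("Consistently respectful and professional with nursing staff.", 1),
    ("Arrived punctual and acted with integrity on rounds.", 1),
    ("Needs to read more about ventilator management.", 0),
    ("Differential diagnosis was thorough.", 0),
]
scores = [keyword_score(s, prof_cues) for s, _ in sentences]
labels = [y for _, y in sentences]
print(round(auc(scores, labels), 2))  # perfect separation here -> 1.0
```

Computing AUC directly from its rank interpretation matches how the per-subcompetency AUCs reported above can be read: an AUC of 0.79 means a sentence truly about that subcompetency outscores an unrelated sentence about 79% of the time.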
Affiliation(s)
- Gregory J Booth
- G.J. Booth is assistant professor, Uniformed Services University of the Health Sciences, and residency program director, Department of Anesthesiology and Pain Medicine, Naval Medical Center Portsmouth, Portsmouth, Virginia
- Benjamin Ross
- William A Cronin
- Angela McElrath
- Kyle L Cyr
- John A Hodgson
- Charles Sibley
- J Martin Ismawan
- Alyssa Zuehl
- James G Slotto
- Maureen Higgs
- Matthew Haldeman
- Phillip Geiger
- Dink Jardine

30
Hill AE, Bartle E, Copley JA, Olson R, Dunwoodie R, Barnett T, Zuber A. The VOTIS, part 1: development and pilot trial of a tool to assess students' interprofessional skill development using video-reflexive ethnography. J Interprof Care 2023; 37:223-231. [PMID: 35403549] [DOI: 10.1080/13561820.2022.2052270]
Abstract
This paper explores the development and evaluation of the Video Observation Tool for Interprofessional Skills (VOTIS). We describe the development of an authentic interprofessional assessment tool that incorporates video reflection and allows formative and summative assessment of individual learners' interprofessional skills within an authentic interprofessional context. We then investigate its validity and reliability. The VOTIS was developed using a modified Delphi technique. The tool was piloted with 61 students and 11 clinical educators, who completed the VOTIS following team meetings in which students interacted about their interprofessional clinical work. The following were calculated: internal consistency; students' proficiency levels; inter-rater reliability between students and clinical educators; and inter-rater reliability between clinical educators and an independent rater. Results indicate that the VOTIS has acceptable internal consistency and moderate reliability and has value in evaluating students' interprofessional skills. Study outcomes highlight the need for more explicit wording of the tool's content and instructions and for further clinical educator training to increase the utility and reliability of the VOTIS as a learning and assessment tool.
Affiliation(s)
- Anne E Hill
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Emma Bartle
- School of Dentistry, The University of Queensland, St. Lucia, Qld, Australia
- Jodie A Copley
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Rebecca Olson
- Sociology, School of Social Science, The University of Queensland, St. Lucia, Qld, Australia
- Ruth Dunwoodie
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Tessa Barnett
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Alice Zuber
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia

31
Rietmeijer CBT, van Esch SCM, Blankenstein AH, van der Horst HE, Veen M, Scheele F, Teunissen PW. A phenomenology of direct observation in residency: Is Miller's 'does' level observable? Med Educ 2023; 57:272-279. [PMID: 36515981] [PMCID: PMC10107098] [DOI: 10.1111/medu.15004]
Abstract
INTRODUCTION Guidelines on direct observation (DO) present DO as an assessment of Miller's 'does' level, that is, the learner's ability to function independently in clinical situations. The literature, however, indicates that residents may behave 'inauthentically' when observed. To minimise this 'observer effect', learners are encouraged to 'do what they would normally do' so that they can receive feedback on their actual work behaviour. Recent phenomenological research on patients' experiences with DO challenges this approach; patients needed, and caused, some participation of the observing supervisor. Although guidelines advise supervisors to minimise their presence, we are poorly informed about how some deliberate supervisor participation affects residents' experience in DO situations. We therefore investigated what residents essentially experienced in DO situations. METHODS We performed an interpretive phenomenological interview study with six general practice (GP) residents. We collected and analysed our data using the four phenomenological lenses of lived body, lived space, lived time and lived relationship. We grouped our open codes by interpreting what they revealed about common structures of residents' pre-reflective experiences. RESULTS Residents experienced the observing supervisor not just as an observer or assessor. They also experienced them as both a senior colleague and the patient's familiar GP, which led to many additional interactions. When residents tried to act as if the supervisor was not there, they could feel insecure and handicapped, because the supervisor was there, changing the situation. DISCUSSION Our results indicate that the 'observer effect' is much more material than previously understood. Consequently, observing residents' 'authentic' behaviour at Miller's 'does' level, as if the supervisor were not there, seems impossible and a misleading concept: misleading because it may frustrate residents and cause supervisors to neglect patients' and residents' needs in DO situations. We suggest that one-way DO is better replaced by bi-directional DO in working-and-learning-together sessions.
Affiliation(s)
- Chris B. T. Rietmeijer
- Department of General Practice, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Suzanne C. M. van Esch
- Department of General Practice, Amsterdam UMC, location University of Amsterdam, Amsterdam, The Netherlands
- Annette H. Blankenstein
- Department of General Practice, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Henriëtte E. van der Horst
- Department of General Practice, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Mario Veen
- Department of General Practice, Erasmus Medical Center, Rotterdam, The Netherlands
- Fedde Scheele
- School of Medical Sciences, Athena Institute for Transdisciplinary Research, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Pim W. Teunissen
- School of Health Professions Education, Maastricht University, Maastricht, The Netherlands

32
Adam P, Mauksch LB, Brandenburg DL, Danner C, Ross VR. Optimal training in communication model (OPTiCOM): A programmatic roadmap. Patient Educ Couns 2023; 107:107573. [PMID: 36410312] [DOI: 10.1016/j.pec.2022.107573]
Abstract
OBJECTIVES Teaching primary care residents patient communication skills is essential, complex, and impeded by barriers. We found no models that guide faculty in training residents in the workplace and that integrate the necessary system components, the science of physician-patient communication training, and competency-based medical education. The aim of this project was to create such a model. METHODS We created OPTiCOM in four steps: (1) communication educator interviews, analysis, and theme development; (2) initial model construction; (3) model refinement using expert feedback; and (4) a structured literature review to validate, refine, and finalize the model. RESULTS Our model contains ten interdependent building blocks organized into four developmental tiers. The Foundational value tier has one building block, Naming relationship as a core value. The Expertise and resources tier includes four building blocks: Curricular expertise, Curricular content, Leadership, and Time. The four building blocks in the Application and development tier are Observation form, Faculty development, Technology, and Formative assessment. The Language and culture tier comprises the final building block, Culture promoting continuous improvement in teaching communication. CONCLUSIONS OPTiCOM organizes ten interdependent system building blocks to maximize and sustain resident learning of communication skills. PRACTICE IMPLICATIONS Residency faculty can use OPTiCOM for self-assessment, program creation, and revision.
Affiliation(s)
- Patricia Adam
- Department of Family Medicine and Community Health, University of Minnesota, Smiley's Clinic, 2020 East 28th Street, Minneapolis, MN 55407, USA
- Larry B Mauksch
- Emeritus, Department of Family Medicine, University of Washington, 6026 30th Ave NE, Seattle, WA 98115, USA
- Dana L Brandenburg
- Department of Family Medicine and Community Health, University of Minnesota, Smiley's Clinic, 2020 East 28th Street, Minneapolis, MN 55407, USA
- Christine Danner
- Department of Family Medicine and Community Health, University of Minnesota, Bethesda Clinic, 580 Rice St, St Paul, MN 55103, USA
- Valerie R Ross
- University of Washington Department of Family Medicine, Family Medicine Residency Program, Box 356390, 331 N.E. Thornton Place, Seattle, WA 98125, USA

33
Kogan JR, Conforti LN, Holmboe ES. Faculty Perceptions of Frame of Reference Training to Improve Workplace-Based Assessment. J Grad Med Educ 2023; 15:81-91. [PMID: 36817545] [PMCID: PMC9934818] [DOI: 10.4300/jgme-d-22-00287.1]
Abstract
BACKGROUND Workplace-based assessment (WBA) is a key assessment strategy in competency-based medical education. However, its full potential has not been actualized, secondary to concerns about reliability, validity, and accuracy. Frame of reference training (FORT), a rater training technique that helps assessors distinguish between learner performance levels, can improve the accuracy and reliability of WBA, but the effect size is variable. Understanding FORT benefits and challenges helps improve this rater training technique. OBJECTIVE To explore faculty perceptions of the benefits and challenges associated with FORT. METHODS Subjects were internal medicine and family medicine physicians (n=41) who participated in a rater training intervention in 2018 consisting of in-person FORT followed by asynchronous online spaced learning. We assessed participants' perceptions of FORT in post-workshop focus groups and an end-of-study survey. Focus groups and survey free-text responses were coded using thematic analysis. RESULTS All subjects participated in 1 of 4 focus groups and completed the survey. Four benefits of FORT were identified: (1) the opportunity to apply skills frameworks via deliberate practice; (2) demonstration of the importance of certain evidence-based clinical skills; (3) practice that improved the ability to discriminate between resident skill levels; and (4) highlighting the importance of direct observation and the dangers of using proxy information in assessment. Challenges included time constraints and task repetitiveness. CONCLUSIONS Participants believe that FORT serves multiple purposes, including helping them distinguish between learner skill levels while demonstrating the impact of evidence-based clinical skills and the importance of direct observation.
Affiliation(s)
- Jennifer R. Kogan
- Jennifer R. Kogan, MD, is Associate Dean, Student Success and Professional Development, and Professor of Medicine, Perelman School of Medicine, University of Pennsylvania
- Lisa N. Conforti
- Lisa N. Conforti, MPH, is Research Associate for Milestones Evaluation, Accreditation Council for Graduate Medical Education (ACGME)
- Eric S. Holmboe
- Eric S. Holmboe, MD, is Chief Research, Milestone Development, and Evaluation Officer, ACGME

34
Kogan JR, Dine CJ, Conforti LN, Holmboe ES. Can Rater Training Improve the Quality and Accuracy of Workplace-Based Assessment Narrative Comments and Entrustment Ratings? A Randomized Controlled Trial. Acad Med 2023; 98:237-247. [PMID: 35857396] [DOI: 10.1097/acm.0000000000004819]
Abstract
PURPOSE Prior research evaluating workplace-based assessment (WBA) rater training effectiveness has not measured improvement in narrative comment quality and accuracy, nor the accuracy of prospective entrustment-supervision ratings. The purpose of this study was to determine whether rater training, using performance dimension and frame of reference training, could improve WBA narrative comment quality and accuracy. A secondary aim was to assess impact on entrustment rating accuracy. METHOD This single-blind, multi-institution, randomized controlled trial of a multifaceted, longitudinal rater training intervention consisted of in-person training followed by asynchronous online spaced learning. In 2018, investigators randomized 94 internal medicine and family medicine physicians involved with resident education. Participants assessed 10 scripted standardized resident-patient videos at baseline and follow-up. Differences in holistic assessment of narrative comment accuracy and specificity, accuracy of individual scenario observations, and entrustment rating accuracy were evaluated with t tests. Linear regression assessed the impact of participant demographics and baseline performance. RESULTS Seventy-seven participants completed the study. At follow-up, the intervention group (n = 41), compared with the control group (n = 36), had higher scores for narrative holistic specificity (2.76 vs 2.31, P < .001, Cohen V = .25), accuracy (2.37 vs 2.06, P < .001, Cohen V = .20), and mean quantity of accurate (6.14 vs 4.33, P < .001), inaccurate (3.53 vs 2.41, P < .001), and overall observations (2.61 vs 1.92, P = .002, Cohen V = .47). In aggregate, the intervention group had more accurate entrustment ratings (58.1% vs 49.7%, P = .006, Phi = .30). Baseline performance was significantly associated with performance on final assessments. CONCLUSIONS The quality and specificity of narrative comments improved with rater training; the effect was mitigated by inappropriate stringency. Training improved the accuracy of prospective entrustment-supervision ratings, but the effect was more limited. Participants with lower baseline rating skill may benefit most from training.
Affiliation(s)
- Jennifer R Kogan
- J.R. Kogan is associate dean, Student Success and Professional Development, and professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-8426-9506
- C Jessica Dine
- C.J. Dine is associate dean, Evaluation and Assessment, and associate professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-5894-0861
- Lisa N Conforti
- L.N. Conforti is research associate for milestones evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7317-6221
- Eric S Holmboe
- E.S. Holmboe is chief, research, milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021

35
Erumeda NJ, George AZ, Jenkins LS. Evaluating postgraduate family medicine supervisor feedback in registrars' learning portfolios. Afr J Prim Health Care Fam Med 2022; 14:e1-e10. [PMID: 36546494] [PMCID: PMC9772774] [DOI: 10.4102/phcfm.v14i1.3744]
Abstract
BACKGROUND Postgraduate supervision forms a vital component of decentralised family medicine training. While the components of effective supervisory feedback have been explored in high-income countries, how this construct is delivered in resource-constrained low- to middle-income countries has not been investigated adequately. AIM This article evaluated supervisory feedback in family medicine registrars' learning portfolios (LPs), as captured in their learning plans and mini-Clinical Evaluation Exercise (mini-CEX) forms, and whether the training district or the year of training affected the nature of the feedback. SETTING Registrars' LPs from 2020 across five decentralised sites affiliated with the University of the Witwatersrand in South Africa were analysed. METHODS Two modified tools were used to evaluate the quantity of the written feedback in 38 learning plans and 57 mini-CEX forms. Descriptive statistics, Fisher's exact and Wilcoxon rank-sum tests were used for analysis. Content analysis was used to derive counts of areas of feedback. RESULTS Most learning plans (61.2%) did not refer to registrars' clinical knowledge or offer an improvement strategy (86.1%). The 'extent of supervisors' feedback' was rated as 'poor' (63.2%), with only 14.0% rated as 'good'. The 'some' and 'no' feedback categories in the mini-CEX competencies (p < 0.001 to p = 0.014) and the 'extent of supervisors' feedback' (p < 0.001) were significantly associated with training district. Feedback focused less on clinical reasoning and negotiation skills. CONCLUSION Supervisors should provide specific and constructive narrative feedback and an action plan to improve registrars' future performance. CONTRIBUTION Supervisory feedback in postgraduate family medicine training needs overall improvement to develop skilled family physicians.
Affiliation(s)
- Neetha J. Erumeda
- Department of Family Medicine and Primary Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa; Gauteng Department of Health, Ekurhuleni District Health Services, Germiston, South Africa
- Ann Z. George
- Centre of Health Science Education, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa
- Louis S. Jenkins
- Division of Family Medicine and Primary Care, Department of Family and Emergency Medicine, Faculty of Health Sciences, Stellenbosch University, Cape Town, South Africa; George Hospital, Western Cape Department of Health, George, South Africa; Primary Health Care Directorate, Department of Family, Community and Emergency Care, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa

36
Core Entrustable Professional Activities for Entering Residency: A National Survey of Graduating Medical Students' Self-Assessed Skills by Specialty. J Am Coll Surg 2022; 235:940-951. [PMID: 36102502] [PMCID: PMC9653107] [DOI: 10.1097/xcs.0000000000000395]
Abstract
BACKGROUND The Association of American Medical Colleges described 13 Core Entrustable Professional Activities (EPAs) that graduating students should be prepared to perform under indirect supervision on day one of residency. Surgery program directors recently recommended entrustability in these Core EPAs for incoming surgery interns. We sought to determine whether graduating students intending to enter surgery agreed they had the skills to perform these Core EPAs. STUDY DESIGN Using de-identified, individual-level data collected from and about 2019 Association of American Medical Colleges Graduation Questionnaire respondents, latent profile analysis was used to group respondents based on their self-assessed Core EPA skills' response patterns. Associations between intended specialty, among other variables, and latent profile analysis group were assessed using independent-sample t-tests, chi-square tests, and multivariable logistic regression methods. RESULTS Among 12,308 Graduation Questionnaire respondents, latent profile analysis identified 2 respondent groups: 7,863 (63.9%) in a high skill acquisition agreement (SAA) group and 4,445 (36.1%) in a moderate SAA group. Specialty was associated with SAA group membership (p < 0.001), with general surgery, orthopaedic surgery, and emergency medicine respondents (among others) overrepresented in the high SAA group. In the multivariable logistic regression models, intention to enter anesthesiology, ophthalmology, pediatrics, psychiatry, or radiology (vs general surgery) was associated with lower odds of high SAA group membership. CONCLUSION Graduating students' self-assessed Core EPA skills were higher for those intending general surgery than for those intending some other specialties. Our findings can inform collaborative efforts to ensure graduates' acquisition of the skills expected of them at the start of residency.
37
Safavi AH, Papadakos J, Papadakos T, Quartey NK, Lawrie K, Klein E, Storer S, Croke J, Millar BA, Jang R, Bezjak A, Giuliani ME. Feedback Delivery in an Academic Cancer Centre: Reflections From an R2C2-based Microlearning Course. J Cancer Educ 2022; 37:1790-1797. [PMID: 34169464] [DOI: 10.1007/s13187-021-02028-9]
Abstract
Feedback delivery and training have not been characterized in the context of academic cancer centres. The purpose of this study was to assess the feasibility and utility of a microlearning course based on the R2C2 (Relationship, Reaction, Content, Coaching) feedback model and to characterize multidisciplinary healthcare provider (HCP) perspectives on existing feedback practices in an academic cancer centre. Five HCPs (two radiation oncologists, one medical oncologist, and two allied health professionals) with supervisory roles were selected by purposive sampling to participate in a prospective longitudinal qualitative study. Each participant completed a web-based multimedia course. Semi-structured one-on-one interviews were conducted with each participant at four time points: pre-course, immediately post-course, and at one and three months post-course. All participants found the course feasible in terms of time and completed it in 10-20 minutes. Participants expressed that the course fulfilled their need for feedback training and that its adoption may normalize a feedback culture in the cancer centre. Three themes were identified regarding perceptions of existing feedback practices: (1) hierarchical and interdisciplinary relationships modulate feedback delivery; (2) interest in feedback delivery varies by duration of the supervisory relationship; and (3) the transactionality of supervisor-trainee relationships influences feedback delivery. This study demonstrates the perceived feasibility and utility of a digital microlearning approach for developing feedback competencies in an academic cancer centre, perceptions of cultural barriers to feedback delivery, and the need for organizational commitment to developing a feedback culture.
Affiliation(s)
- Amir H Safavi
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Janet Papadakos
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Patient Education, Ontario Health, Toronto, ON, M5S 1A1, Canada
- Institute of Health Policy, Management and Evaluation, University of Toronto, 155 College St 4th Floor, Toronto, ON, M5T 3M6, Canada
- Tina Papadakos
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Patient Education, Ontario Health, Toronto, ON, M5S 1A1, Canada
- Naa Kwarley Quartey
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Karen Lawrie
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Eden Klein
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Sarah Storer
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Jennifer Croke
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Barbara-Ann Millar
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Raymond Jang
- Division of Medical Oncology and Hematology, University of Toronto, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Andrea Bezjak
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Meredith E Giuliani
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada

38
Kinnear B, Schumacher DJ, Driessen EW, Varpio L. How argumentation theory can inform assessment validity: A critical review. Med Educ 2022; 56:1064-1075. [PMID: 35851965] [PMCID: PMC9796688] [DOI: 10.1111/medu.14882]
Abstract
INTRODUCTION Many health professions education (HPE) scholars frame assessment validity as a form of argumentation in which interpretations and uses of assessment scores must be supported by evidence. However, what are purported to be validity arguments are often merely clusters of evidence without a guiding framework to evaluate, prioritise, or debate their merits. Argumentation theory is a field of study dedicated to understanding the production, analysis, and evaluation of arguments (spoken or written). The aim of this study is to describe argumentation theory, articulating the unique insights it can offer to HPE assessment, and presenting how different argumentation orientations can help reconceptualize the nature of validity in generative ways. METHODS The authors followed a five-step critical review process consisting of iterative cycles of focusing, searching, appraising, sampling, and analysing the argumentation theory literature. The authors generated and synthesised a corpus of manuscripts on argumentation orientations deemed to be most applicable to HPE. RESULTS We selected two argumentation orientations that we considered particularly constructive for informing HPE assessment validity: New rhetoric and informal logic. In new rhetoric, the goal of argumentation is to persuade, with a focus on an audience's values and standards. Informal logic centres on identifying, structuring, and evaluating arguments in real-world settings, with a variety of normative standards used to evaluate argument validity. DISCUSSION Both new rhetoric and informal logic provide philosophical, theoretical, or practical groundings that can advance HPE validity argumentation. New rhetoric's foregrounding of audience aligns with HPE's social imperative to be accountable to specific stakeholders such as the public and learners. Informal logic provides tools for identifying and structuring validity arguments for analysis and evaluation.
Affiliation(s)
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- School of Health Professions Education (SHE), Maastricht University, Maastricht, The Netherlands
- Daniel J. Schumacher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Erik W. Driessen
- School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Lara Varpio
- Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
39
Phinney LB, Fluet A, O'Brien BC, Seligman L, Hauer KE. Beyond Checking Boxes: Exploring Tensions With Use of a Workplace-Based Assessment Tool for Formative Assessment in Clerkships. ACADEMIC MEDICINE: JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2022; 97:1511-1520. [PMID: 35703235 DOI: 10.1097/acm.0000000000004774] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Indexed: 06/15/2023]
Abstract
PURPOSE To understand the role of a workplace-based assessment (WBA) tool in facilitating feedback for medical students, this study explored changes and tensions in a clerkship feedback activity system through the lens of cultural historical activity theory (CHAT) over 2 years of tool implementation. METHOD This qualitative study uses CHAT to explore WBA use in core clerkships by identifying feedback activity system elements (e.g., community, tools, rules, objects) and tensions among these elements. University of California, San Francisco core clerkship students were invited to participate in semistructured interviews eliciting experience with a WBA tool intended to enhance direct observation and feedback in year 1 (2019) and year 2 (2020) of implementation. In year 1, the WBA tool required supervisor completion in the school's evaluation system on a computer. In year 2, both students and supervisors had WBA completion abilities and could access the form via a smartphone separate from the school's evaluation system. RESULTS Thirty-five students participated in interviews. The authors identified tensions that shifted with time and tool iterations. Year 1 students described tensions related to cumbersome tool design, fear of burdening supervisors, confusion over WBA purpose, WBA as checking boxes, and WBA usefulness depending on clerkship context and culture. Students perceived dissatisfaction with the year 1 tool version among peers and supervisors. The year 2 mobile-based tool and student completion capabilities helped to reduce many of the tensions noted in year 1. Students expressed wider WBA acceptance among peers and supervisors in year 2 and reported understanding WBA to be for low-stakes feedback, thereby supporting formative assessment for learning. 
CONCLUSIONS Using CHAT to explore changes in a feedback activity system with WBA tool iterations revealed elements important to WBA implementation, including designing technology for tool efficiency and affording students autonomy to document feedback with WBAs.
Affiliation(s)
- Lauren B Phinney
- L.B. Phinney is a first-year internal medicine resident, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California
- Angelina Fluet
- A. Fluet is a fourth-year medical student, University of California, San Francisco School of Medicine, San Francisco, California
- Bridget C O'Brien
- B.C. O'Brien is professor of medicine and education scientist, Department of Medicine and Center for Faculty Educators, University of California, San Francisco School of Medicine, San Francisco, California
- Lee Seligman
- L. Seligman is a second-year internal medicine resident, Department of Medicine, New York-Presbyterian Hospital, Columbia University Irving Medical Center, New York, New York
- Karen E Hauer
- K.E. Hauer is associate dean for competency assessment and professional standards and professor, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California
40
Shafqat S, Tejani I, Ali M, Tariq H, Sabzwari S. Feasibility and Effectiveness of Mini-Clinical Evaluation Exercise (Mini-CEX) in an Undergraduate Medical Program: A Study From Pakistan. Cureus 2022; 14:e29563. [PMID: 36312643 PMCID: PMC9595266 DOI: 10.7759/cureus.29563] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Accepted: 09/25/2022] [Indexed: 11/07/2022]
Abstract
Background In clinical settings, direct observation (DO) with feedback is an effective method to assess and improve learner performance. One tool used for DO is the mini-clinical evaluation exercise (Mini-CEX). We conducted a study to assess the effectiveness and feasibility of Mini-CEX for medical students at Aga Khan University, Karachi. Methods Utilizing a purposive sampling technique, a total of 199 students in six core clerkships of Years 3 and 4 were selected for this study. Participating faculty underwent training workshops for use of Mini-CEX and feedback strategies. Each student was assessed twice by one faculty, using a modified version of the Mini-CEX, which assessed four domains of clinical skills: Data Gathering, Communication, Diagnosis/Differential, and Management Plan and Organization. Feedback was given after each encounter. Faculty and students also provided detailed feedback regarding the process of DO. Data were analyzed using Statistical Package for Social Sciences (SPSS) version 26 (IBM Corp., Armonk, NY, USA), with categorical variables arranged as frequencies and percentages. The Chi-squared test was used for further statistical analyses, and a P-value of < 0.05 was considered statistically significant. Effectiveness was assessed via a change in student performance between the first and second Mini-CEX, and feasibility was assessed via qualitative feedback. Results We obtained three sets of results: Mini-CEX forms (523), from which we included a total of 350 evaluations for analysis (216 from Year 3 and 134 from Year 4), and feedback on DO from students (70) and faculty (18). Year 3 students performed significantly better in all foci of the Mini-CEX between the first and second assessment (P ≤ 0.001), whereas in Year 4, significant improvement was limited to only two domains of the Mini-CEX [Communication of History/Physical Examination (P = 0.040) and Diagnosis/Differential and Management Plan (P < 0.001)]. Students (65.7%) and faculty (94.4%) felt this exercise improved their interaction. 83.3% of faculty recommended its formal implementation compared to 27.1% of students, who reported challenges in implementation of the Mini-CEX such as time constraints, logistics, the subjectivity of assessment, and varying interest by faculty. Conclusion Direct observation using Mini-CEX is effective in improving the clinical and diagnostic skills of medical students and strengthens student-faculty interaction. While challenges exist in its implementation, the strategic placement of Mini-CEX may enhance its utility in measuring student competency.
Affiliation(s)
- Shameel Shafqat
- College of Medicine, Aga Khan University Medical College, Karachi, PAK
- Isbaah Tejani
- College of Medicine, Aga Khan University Medical College, Karachi, PAK
- Muhammad Ali
- College of Medicine, Aga Khan University Medical College, Karachi, PAK
- Hemaila Tariq
- College of Medicine, Aga Khan University Medical College, Karachi, PAK
41
Landreville JM, Wood TJ, Frank JR, Cheung WJ. Does direct observation influence the quality of workplace-based assessment documentation? AEM EDUCATION AND TRAINING 2022; 6:e10781. [PMID: 35903424 PMCID: PMC9305723 DOI: 10.1002/aet2.10781] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 04/14/2022] [Revised: 06/07/2022] [Accepted: 06/08/2022] [Indexed: 05/30/2023]
Abstract
Background A key component of competency-based medical education (CBME) is direct observation of trainees. Direct observation has been emphasized as integral to workplace-based assessment (WBA) yet previously identified challenges may limit its successful implementation. Given these challenges, it is imperative to fully understand the value of direct observation within a CBME program of assessment. Specifically, it is not known whether the quality of WBA documentation is influenced by observation type (direct or indirect). Methods The objective of this study was to determine the influence of observation type (direct or indirect) on quality of entrustable professional activity (EPA) assessment documentation within a CBME program. EPA assessments were scored by four raters using the Quality of Assessment for Learning (QuAL) instrument, a previously published three-item quantitative measure of the quality of written comments associated with a single clinical performance score. An analysis of variance was performed to compare mean QuAL scores among the direct and indirect observation groups. The reliability of the QuAL instrument for EPA assessments was calculated using a generalizability analysis. Results A total of 244 EPA assessments (122 direct observation, 122 indirect observation) were rated for quality using the QuAL instrument. No difference in mean QuAL score was identified between the direct and indirect observation groups (p = 0.17). The reliability of the QuAL instrument for EPA assessments was 0.84. Conclusions Observation type (direct or indirect) did not influence the quality of EPA assessment documentation. This finding raises the question of how direct and indirect observation truly differ and the implications for meta-raters such as competence committees responsible for making judgments related to trainee promotion.
Affiliation(s)
- Timothy J. Wood
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, Ontario, Canada
- Jason R. Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
42
Gordon LB, Zelaya-Floyd M, White P, Hallen S, Varaklis K, Tavakolikashi M. Interprofessional bedside rounding improves quality of feedback to resident physicians. MEDICAL TEACHER 2022; 44:907-913. [PMID: 35373712 DOI: 10.1080/0142159x.2022.2049735] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 06/14/2023]
Abstract
PURPOSE Obtaining high quality feedback in residency education is challenging, in part due to limited opportunities for faculty observation of authentic clinical work. This study reviewed the impact of interprofessional bedside rounds ('iPACE™') on the length and quality of faculty narrative evaluations of residents as compared to usual inpatient teaching rounds. METHODS Narrative comments from faculty evaluations of Internal Medicine (IM) residents both on usual teaching service as well as the iPACE™ service (spanning 2017-2020) were reviewed and coded using a deductive content analysis approach. RESULTS Six hundred ninety-two narrative evaluations by 63 attendings of 103 residents were included. Evaluations of iPACE™ residents were significantly longer than those of residents on usual teams (109 vs. 69 words, p < 0.001). iPACE™ evaluations contained a higher average occurrence of direct observations of patient/family interactions (0.72 vs. 0.32, p < 0.001), references to interprofessionalism (0.17 vs. 0.05, p < 0.001), as well as specific (3.21 vs. 2.26, p < 0.001), actionable (1.01 vs. 0.69, p < 0.001), and corrective feedback (1.2 vs. 0.88, p = 0.001) per evaluation. CONCLUSIONS This study suggests that the iPACE™ model, which prioritizes interprofessional bedside rounds, had a positive impact on the quantity and quality of feedback, as measured via narrative comments on weekly evaluations.
Affiliation(s)
- Lesley B Gordon
- Tufts University School of Medicine, Boston, MA, USA
- Department of Medicine, Maine Medical Center, Portland, ME, USA
- Patricia White
- Department of Medical Education, Maine Medical Center, Portland, ME, USA
- Sarah Hallen
- Tufts University School of Medicine, Boston, MA, USA
- Division of Geriatrics, Maine Medical Center, Portland, ME, USA
- Kalli Varaklis
- Tufts University School of Medicine, Boston, MA, USA
- Department of Medical Education, Maine Medical Center, Portland, ME, USA
- Department of Obstetrics and Gynecology, Maine Medical Center, Portland, ME, USA
- Motahareh Tavakolikashi
- Department of Medical Education, Maine Medical Center, Portland, ME, USA
- Department of System Science and Industrial Engineering, Binghamton University, Binghamton, NY, USA
43
Reimagining the Clinical Competency Committee to Enhance Education and Prepare for Competency-Based Time-Variable Advancement. J Gen Intern Med 2022; 37:2280-2290. [PMID: 35445932 PMCID: PMC9021365 DOI: 10.1007/s11606-022-07515-3] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Received: 09/15/2021] [Accepted: 03/25/2022] [Indexed: 12/01/2022]
Abstract
Assessing residents and clinical fellows is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Assessment has historically been underemphasized although medical education oversight organizations have strengthened requirements in recent years. Growing acceptance of competency-based medical education and its logical extension to competency-based time-variable (CB-TV) graduate medical education (GME) further highlights the importance of implementing effective evidence-based approaches to assessment. The Clinical Competency Committee (CCC) has emerged as a key programmatic structure in graduate medical education. In the context of launching a multi-specialty pilot of CB-TV GME in our health system, we have examined several programs' CCC processes and reviewed the relevant literature to propose enhancements to CCCs. We recommend that all CCCs fulfill three core goals, regularly applied to every GME trainee: (1) discern and describe the resident's developmental status to individualize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We integrate the literature and observations from GME program CCCs in our institutions to evaluate how current CCC processes support or undermine these goals. Obstacles and key enablers are identified. Finally, we recommend ways to achieve the stated goals, including the following: (1) assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; (2) strengthen CCC assessment processes to determine trainee readiness for independent practice; and (3) promote trainee reflection and informed self-assessment. The importance of coaching for competency, robust workplace-based assessments, feedback, and co-production of individualized learning plans are emphasized. Individual programs and their CCCs must strengthen assessment tools and frameworks to realize the potential of competency-oriented education.
44
Peterson BD, Magee CD, Martindale JR, Dreicer JJ, Mutter MK, Young G, Sacco MJ, Parsons LC, Collins SR, Warburton KM, Parsons AS. REACT: Rapid Evaluation Assessment of Clinical Reasoning Tool. J Gen Intern Med 2022; 37:2224-2229. [PMID: 35710662 PMCID: PMC9202973 DOI: 10.1007/s11606-022-07513-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 09/15/2021] [Accepted: 03/25/2022] [Indexed: 11/04/2022]
Abstract
INTRODUCTION Clinical reasoning encompasses the process of data collection, synthesis, and interpretation to generate a working diagnosis and make management decisions. Situated cognition theory suggests that knowledge is relative to contextual factors, and clinical reasoning in urgent situations is framed by pressure of consequential, time-sensitive decision-making for diagnosis and management. These unique aspects of urgent clinical care may limit the effectiveness of traditional tools to assess, teach, and remediate clinical reasoning. METHODS Using two validated frameworks, a multidisciplinary group of clinicians trained to remediate clinical reasoning and with experience in urgent clinical care encounters designed the novel Rapid Evaluation Assessment of Clinical Reasoning Tool (REACT). REACT is a behaviorally anchored assessment tool scoring five domains used to provide formative feedback to learners evaluating patients during urgent clinical situations. A pilot study was performed to assess fourth-year medical students during simulated urgent clinical scenarios. Learners were scored using REACT by a separate, multidisciplinary group of clinician educators with no additional training in the clinical reasoning process. REACT scores were analyzed for internal consistency across raters and observations. RESULTS Overall internal consistency for the 41 patient simulations as measured by Cronbach's alpha was 0.86. A weighted kappa statistic was used to assess the overall score inter-rater reliability. Moderate reliability was observed at 0.56. DISCUSSION To our knowledge, REACT is the first tool designed specifically for formative assessment of a learner's clinical reasoning performance during simulated urgent clinical situations. With evidence of reliability and content validity, this tool guides feedback to learners during high-risk urgent clinical scenarios, with the goal of reducing diagnostic and management errors to limit patient harm.
Affiliation(s)
- Charles D Magee
- University of Virginia School of Medicine, Charlottesville, VA, USA
- M Kathryn Mutter
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Gregory Young
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Laura C Parsons
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Andrew S Parsons
- University of Virginia School of Medicine, Charlottesville, VA, USA
45
An Online Pattern Recognition-Oriented Workshop to Promote Interest among Undergraduate Students in How Mathematical Principles Could Be Applied within Veterinary Science. SUSTAINABILITY 2022. [DOI: 10.3390/su14116768] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 02/04/2023]
Abstract
Knowing the importance of mathematics and its relationship with veterinary medicine plays an important role for students. To promote interest in this relationship, we developed the workshop “Math in Nature” that utilizes the surrounding environment for stimulating pattern-recognition and observational skills. It consisted of four sections: A talk by a professional researcher, a question-and-answer section, a mathematical pattern identification session, and a discussion of the ideas proposed by students. The effectiveness of the program to raise interest in mathematics was evaluated using a questionnaire applied before and after the workshop. Following the course, a higher number of students agreed with the fact that biological phenomena can be explained and predicted by applying mathematics, and that it is possible to identify mathematical patterns in living beings. However, the students’ perspectives regarding the importance of mathematics in their careers, as well as their interest in deepening their mathematical knowledge, did not change. Arguably, “Math in Nature” could have exerted a positive effect on the students’ interest in mathematics. We thus recommend the application of similar workshops to improve interests and skills in relevant subjects among undergraduate students.
46
Maggio LA, Haustein S, Costello JA, Driessen EW, Artino AR. Joining the meta-research movement: A bibliometric case study of the journal Perspectives on Medical Education. PERSPECTIVES ON MEDICAL EDUCATION 2022; 11:127-136. [PMID: 35727471 PMCID: PMC9210332 DOI: 10.1007/s40037-022-00717-9] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 04/15/2022] [Revised: 05/07/2022] [Accepted: 05/13/2022] [Indexed: 06/15/2023]
Abstract
PURPOSE To conduct a bibliometric case study of the journal Perspectives on Medical Education (PME) to provide insights into the journal's inner workings and to "take stock" of where PME is today, where it has been, and where it might go. METHODS Data, including bibliographic metadata, reviewer and author details, and downloads, were collected for manuscripts submitted to and published in PME from the journal's Editorial Manager and Web of Science. Gender of authors and reviewers was predicted using Genderize.io. To visualize and analyze collaboration patterns, citation relationships and term co-occurrence social network analyses (SNA) were conducted. VOSviewer was used to visualize the social network maps. RESULTS Between 2012-2019 PME received, on average, 260 manuscripts annually (range = 73-402). Submissions were received from authors in 81 countries with the majority in the United States (US), United Kingdom, and the Netherlands. PME published 518 manuscripts with authors based in 31 countries, the majority being in the Netherlands, US, and Canada. PME articles were downloaded 717,613 times (mean per document: 1388). In total 1201 (55% women) unique peer reviewers were invited and 649 (57% women) completed reviews; 1227 (49% women) unique authors published in PME. SNA revealed that PME authors were quite collaborative, with most authoring articles with others and only a minority (n = 57) acting as single authors. DISCUSSION This case study provides a glimpse into PME and offers evidence for PME's next steps. In the future, PME is committed to growing the journal thoughtfully; diversifying and educating editorial teams, authors, and reviewers, and liberating and sharing journal data.
Affiliation(s)
- Lauren A Maggio
- Uniformed Services University of the Health Sciences, Bethesda, MD, USA.
- Stefanie Haustein
- School of Information Studies (ÉSIS) and Scholarly Communications Lab, University of Ottawa, Ottawa, ON, Canada
- Anthony R Artino
- The George Washington University School of Medicine and Health Sciences, Washington, DC, USA
47
Carbajal MM, Dadiz R, Sawyer T, Kane S, Frost M, Angert R. Part 5: Essentials of Neonatal-Perinatal Medicine Fellowship: evaluation of competence and proficiency using Milestones. J Perinatol 2022; 42:809-814. [PMID: 35149835 DOI: 10.1038/s41372-021-01306-0] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Received: 12/08/2020] [Revised: 11/03/2021] [Accepted: 12/23/2021] [Indexed: 11/09/2022]
Abstract
The Accreditation Council for Graduate Medical Education (ACGME) Pediatric Subspecialty Milestone Project competencies are used for Neonatal-Perinatal Medicine (NPM) fellows. Milestones are longitudinal markers that range from novice to expert (levels 1-5). There is no standard approach to the required biannual evaluation of fellows by fellowship programs, resulting in significant variability among programs regarding procedural experience and exposure to pathology during clinical training. In this paper, we discuss the opportunities that Milestones provide, potential strategies to address challenges, and future directions.
Affiliation(s)
- Melissa M Carbajal
- Department of Pediatrics, Section of Neonatology, Baylor College of Medicine, Houston, TX, USA.
- Rita Dadiz
- Departments of Pediatrics, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Taylor Sawyer
- Department of Pediatrics, University of Washington School of Medicine, Seattle, WA, USA
- Sara Kane
- Department of Pediatrics, Indiana University School of Medicine, Indianapolis, IN, USA
- Mackenzie Frost
- Department of Pediatrics, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA, USA
- Robert Angert
- Department of Pediatrics, New York University Grossman School of Medicine, New York, NY, USA
48
de Jonge LPJWM, Minkels FNE, Govaerts MJB, Muris JWM, Kramer AWM, van der Vleuten CPM, Timmerman AA. Supervisory dyads' communication and alignment regarding the use of workplace-based observations: a qualitative study in general practice residency. BMC MEDICAL EDUCATION 2022; 22:330. [PMID: 35484573 PMCID: PMC9052511 DOI: 10.1186/s12909-022-03395-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 11/09/2021] [Accepted: 04/21/2022] [Indexed: 06/14/2023]
Abstract
BACKGROUND In medical residency, performance observations are considered an important strategy to monitor competence development, provide feedback and warrant patient safety. The aim of this study was to gain insight into whether and how supervisor-resident dyads build a working repertoire regarding the use of observations, and how they discuss and align goals and approaches to observation in particular. METHODS We used a qualitative, social constructivist approach to explore if and how supervisory dyads work towards alignment of goals and preferred approaches to performance observations. We conducted semi-structured interviews with supervisor-resident dyads, performing a template analysis of the data thus obtained. RESULTS The supervisory dyads did not frequently communicate about the use of observations, except at the start of training and unless they were triggered by internal or external factors. Their working repertoire regarding the use of observations seemed to be primarily driven by patient safety goals and institutional assessment requirements rather than by providing developmental feedback. Although intended as formative, the institutional test was perceived as summative by supervisors and residents, and led to teaching to the test rather than educating for purposes of competence development. CONCLUSIONS To unlock the full educational potential of performance observations, and to foster the development of an educational alliance, it is essential that supervisory dyads and the training institute communicate clearly about these observations and the role of assessment practices of and for learning, in order to align their goals and respective approaches.
Affiliation(s)
- Laury P J W M de Jonge
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
- Floor N E Minkels
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
- Marjan J B Govaerts
- Department of Educational Research and Development, Maastricht University, Maastricht, The Netherlands
- Jean W M Muris
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
- Anneke W M Kramer
- Department of Family Medicine, Leiden University, Leiden, The Netherlands
- Cees P M van der Vleuten
- Department of Educational Research and Development, Maastricht University, Maastricht, The Netherlands
- Angelique A Timmerman
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
49
Swanberg M, Woodson-Smith S, Pangaro L, Torre D, Maggio L. Factors and Interactions Influencing Direct Observation: A Literature Review Guided by Activity Theory. TEACHING AND LEARNING IN MEDICINE 2022; 34:155-166. [PMID: 34238091 DOI: 10.1080/10401334.2021.1931871] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 10/29/2020] [Revised: 04/19/2021] [Accepted: 05/11/2021] [Indexed: 06/13/2023]
Abstract
Phenomenon: Ensuring that future physicians are competent to practice medicine is necessary for high quality patient care and safety. The shift toward competency-based education has placed renewed emphasis on direct observation via workplace-based assessments in authentic patient care contexts. Despite this interest and multiple studies focused on improving direct observation, challenges regarding the objectivity of this assessment approach remain underexplored and unresolved. Approach: We conducted a literature review of direct observation in authentic patient contexts by systematically searching databases PubMed, Embase, Web of Science, and ERIC. Included studies comprised original research conducted in the patient care context with authentic patients, either as a live encounter or a video recording of an actual encounter, which focused on factors affecting the direct observation of undergraduate medical education (UME) or graduate medical education (GME) trainees. Because the patient care context adds factors that contribute to the cognitive load of the learner and of the clinician-observer we focused our question on such contexts, which are most useful in judgments about advancement to the next level of training or practice. We excluded articles or published abstracts not conducted in the patient care context (e.g., OSCEs) or those involving simulation, allied health professionals, or non-UME/GME trainees. We also excluded studies focused on end-of-rotation evaluations and in-training evaluation reports. We extracted key data from the studies and used Activity Theory as a lens to identify factors affecting these observations and the interactions between them. Activity Theory provides a framework to understand and analyze complex human activities, the systems in which people work, and the interactions or tensions between multiple associated factors. Findings: Nineteen articles were included in the analysis; 13 involved GME learners and 6 UME learners. Of the 19, six studies were set in the operating room and four in the emergency department. Using Activity Theory, we discovered that while numerous studies focus on rater and tool influences, very few study the impact of social elements. These are the rules that govern how the activity happens, the environment and members of the community involved in the activity and how completion of the activity is divided up among the members of the community. Insights: Viewing direct observation via workplace-based assessment through the lens of Activity Theory may enable educators to implement curricular changes to improve direct observation of assessment. Activity Theory may allow researchers to design studies to focus on the identified underexplored interactions and influences in relation to direct observation.
Affiliation(s)
- Margaret Swanberg
- Department of Neurology, Uniformed Services University, Bethesda, Maryland, USA
- Sarah Woodson-Smith
- Department of Neurology, Naval Medical Center Portsmouth, Portsmouth, Virginia, USA
- Louis Pangaro
- Department of Medicine, Uniformed Services University, Bethesda, Maryland, USA
- Dario Torre
- Department of Medicine, Uniformed Services University, Bethesda, Maryland, USA
- Center for Health Professions Education, Uniformed Services University, Bethesda, Maryland, USA
- Lauren Maggio
- Department of Medicine, Uniformed Services University, Bethesda, Maryland, USA
- Center for Health Professions Education, Uniformed Services University, Bethesda, Maryland, USA
|
50
|
Lacasse M, Renaud JS, Côté L, Lafleur A, Codsi MP, Dove M, Pélissier-Simard L, Pitre L, Rheault C. [Feedback Guide for direct observation of family medicine residents in Canada: a francophone tool]. CANADIAN MEDICAL EDUCATION JOURNAL 2022; 13:29-54. [PMID: 35321416 PMCID: PMC8909829 DOI: 10.36834/cmej.72587] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/14/2023]
Abstract
BACKGROUND There is no CanMEDS-FM-based milestone tool to guide feedback during direct observation (DO). We developed a guide to support the documentation of feedback during DO in Canadian family medicine (FM) programs. METHODS The Guide was designed in three phases in collaboration with five Canadian FM programs, each with at least one French-speaking teaching site: 1) literature review and needs assessment; 2) development of the DO Feedback Guide; 3) testing of the Guide in a video-simulation context with qualitative content analysis. RESULTS Phase 1 demonstrated the need for a narrative guide aimed at 1) specifying mutual expectations according to the resident's level of training and the clinical context, 2) providing supervisors with tools and structure for their observations, and 3) facilitating the documentation of feedback. Phase 2 produced the Guide, in paper and electronic formats, meeting the identified needs. In phase 3, 15 supervisors used the Guide across three residency levels. Following this testing, the Guide was adjusted to prompt recall of phases of the clinical encounter that were often omitted during feedback (before the consultation, diagnosis, and follow-up) and to suggest preferred types of formulation (stimulating questions, clarifying questions, reflections). CONCLUSION Grounded in evidence and a collaborative approach, this Guide will equip French-speaking Canadian supervisors and residents performing DO in family medicine.
Affiliation(s)
- Luc Côté
- Université Laval, Québec, Canada