1
Chen F, Belgique ST, Canter C, Boscardin CK, Willie C, Mitchell JD, Sullivan K, Martinelli SM. Unprofessionalism in anesthesiology: A qualitative study on classifying unprofessional behavior in anesthesiology residency education. J Clin Anesth 2024; 95:111429. [PMID: 38460412] [DOI: 10.1016/j.jclinane.2024.111429]
Abstract
STUDY OBJECTIVE This study aims to identify the domains that constitute behaviors perceived to be unprofessional in anesthesiology residency training programs. DESIGN Qualitative study. SETTING Anesthesiology residency training programs. PATIENTS Not applicable. Participants were residents, fellows, and faculty members purposefully sampled from four US-based anesthesiology residency programs. INTERVENTIONS Participants were asked to submit, via a Qualtrics link, examples of unprofessional behavior they had witnessed in anesthesiology residents, fellows, or faculty members. MEASUREMENTS Not applicable. The behavior examples were independently reviewed and categorized into themes using content analysis. MAIN RESULTS A total of 116 vignettes were collected; after excluding those that did not describe behavior exhibited by anesthesiology faculty or trainees, 111 remained. Fifty-eight vignettes pertained to unprofessional behaviors observed in faculty members and 53 to behaviors observed in trainees (residents and fellows). Nine unprofessionalism themes emerged from the analysis; the most common were VERBAL, SUPERVISION, QUALITY, ENGAGEMENT, and TIME. Regarding the distribution of role group (faculty versus trainee) by theme, behaviors in the BIAS, GOSSIP, LEWD, and VERBAL categories were observed more often in faculty, whereas behaviors in the ENGAGEMENT, QUALITY, TIME, and SUPERVISION categories were primarily attributed to trainees. CONCLUSION By reviewing professionalism-related vignettes reported within residency training programs, we identified classification descriptors for defining unprofessional behavior specific to anesthesiology residency education. These findings enrich the definition of professionalism as a multi-dimensional competency in anesthesiology graduate medical education. This framework may facilitate preventative interventions and timely remediation plans for unprofessional behavior in residents and faculty.
Affiliation(s)
- Fei Chen
- Department of Anesthesiology, University of North Carolina at Chapel Hill, Chapel Hill, NC, 27599, United States.
- Courtney Canter
- Department of Anesthesiology, University of North Carolina at Chapel Hill, Chapel Hill, NC, 27599, United States
- Christy K Boscardin
- Department of Anesthesiology and Perioperative Care, University of California San Francisco, San Francisco, CA, 94143, United States
- Chelsea Willie
- Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, WI, 53226, United States
- John D Mitchell
- Department of Anesthesiology, Pain Management, and Perioperative Care, Henry Ford Health, Detroit, MI, 48105, United States; Department of Anesthesiology, Michigan State University College of Human Medicine, Grand Rapids, MI, 49503, United States
- Kristina Sullivan
- Department of Anesthesiology and Perioperative Care, University of California San Francisco, San Francisco, CA, 94143, United States
- Susan M Martinelli
- Department of Anesthesiology, University of North Carolina at Chapel Hill, Chapel Hill, NC, 27599, United States
2
Parikh R, Jones SB. Providing Impactful Feedback to the Current Generation of Anesthesiology Residents. Int Anesthesiol Clin 2024:00004311-990000000-00070. [PMID: 38798143] [DOI: 10.1097/aia.0000000000000441]
Affiliation(s)
- Reena Parikh
- Department of Anesthesiology, Albany Medical College, Albany, New York
3
Khalife R, Gupta M, Gonsalves C, Park YS, Riddle J, Tekian A, Horsley T. Patient involvement in assessment of postgraduate medical learners: A scoping review. Medical Education 2022; 56:602-613. [PMID: 34981565] [DOI: 10.1111/medu.14726]
Abstract
CONTEXT Competency-based assessment of learners may benefit from a more holistic, inclusive approach to determining readiness for unsupervised practice. However, despite movement towards greater patient partnership in health care generally, the inclusion of patients in the assessment of postgraduate medical learners is largely absent. METHODS We conducted a scoping review to map the nature, extent, and range of literature examining the inclusion (or exclusion) of patients in the assessment of postgraduate medical learners. Guided by Arksey and O'Malley's framework and informed by Levac et al. and Thomas et al., we searched two databases (MEDLINE® and Embase®) from inception until February 2021 using subheadings related to assessment, patients, and postgraduate learners. Data analysis examined the nature of, and the factors influencing, patient involvement in assessment. RESULTS We identified 41 papers spanning four decades. Some literature suggests patients are willing to be engaged in assessment but may choose not to engage when, for example, language barriers exist. When stratified by specialty or clinical setting, the influence of factors such as gender, race, ethnicity, or medical condition appears to remain consistent. Patients may participate in assessment as a stand-alone group or as part of a multi-source feedback process. Patients generally provided high ratings but commented on observed professional behaviours and communication skills, in contrast with physicians, who focused on medical expertise. CONCLUSION The factors that influence patient involvement in assessment are multifactorial, including patients' own willingness, language and reading-comprehension challenges, and the resources available to training programmes to facilitate the integration of patient assessments. These barriers, however, are not insurmountable. Although understudied, research examining patient involvement in assessment is increasing; our review suggests that the extent to which patients' unique insights will be taken up in postgraduate medical education may depend on the readiness of assessment systems and, in particular, on physicians' readiness to partner with patients in this way.
Affiliation(s)
- Roy Khalife
- Department of Medicine (Hematology), The Ottawa Hospital, University of Ottawa, Ottawa, Ontario, Canada
- Manika Gupta
- Department of Medicine (Hematology), The Ottawa Hospital, University of Ottawa, Ottawa, Ontario, Canada
- Carol Gonsalves
- Department of Medicine (Hematology), The Ottawa Hospital, University of Ottawa, Ottawa, Ontario, Canada
- Yoon Soo Park
- Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Janet Riddle
- Department of Medical Education, University of Illinois College of Medicine at Chicago, Chicago, Illinois, USA
- Ara Tekian
- Department of Medical Education, University of Illinois College of Medicine at Chicago, Chicago, Illinois, USA
- Tanya Horsley
- Research Unit, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada
- School of Epidemiology and Public Health, University of Ottawa, Ottawa, Ontario, Canada
4
Murton C, Spowart L, Anderson M. How psychiatrists’ attitudes towards multi-source feedback including patient feedback influenced the educational value: a qualitative study. MedEdPublish 2022. [DOI: 10.12688/mep.17531.1]
Abstract
Background: Multi-source feedback (MSF) is well established in psychiatric training. However, evidence on its educational impact is not definitive, and there is scant evidence exploring its value for the professional development of psychiatry trainees in the United Kingdom (UK). Evidence suggests the MSF tool currently used is not suitable for specialist trainees. This qualitative research project explored psychiatric doctors’ attitudes towards MSF that includes patient feedback, to determine how these attitudes influenced the feedback’s educational usefulness. Methods: A qualitative study using a phenomenological approach within a constructivist paradigm. Purposive sampling identified trainee psychiatrists, who completed a more extensive MSF, including patient feedback, than the one they currently use. They discussed their results in supervised sessions to plan how to use the feedback. Semi-structured interviews were conducted separately with trainees and their supervisors following completion of the MSF. The data were analysed thematically. The study was completed in 2020. Results: Seven trainees and five supervisors participated, and four themes were identified. Most had positive opinions about the educational usefulness of MSF including patient feedback, and made changes to their behaviour as a result. Interviewees valued patient feedback and identified it as important in psychiatry; most valued their patient feedback over their colleague feedback. The complexities of patient feedback in psychiatry, and how these may influence the educational usefulness of the feedback, were discussed in detail. Conclusions: The findings suggest a need to review the current system of MSF in psychiatry in order to maximise its educational benefits. In particular, this research points to the benefits of psychiatric trainees engaging with patient feedback.
5
Vilendrer SM, Kling SMR, Wang H, Brown-Johnson C, Jayaraman T, Trockel M, Asch SM, Shanafelt TD. How Feedback Is Given Matters: A Cross-Sectional Survey of Patient Satisfaction Feedback Delivery and Physician Well-being. Mayo Clin Proc 2021; 96:2615-2627. [PMID: 34479736] [DOI: 10.1016/j.mayocp.2021.03.039]
Abstract
OBJECTIVE To evaluate how variation in the way patient satisfaction feedback is delivered relates to physician well-being and to perceptions of its impact on patient care, job satisfaction, and clinical decision making. PARTICIPANTS AND METHODS A cross-sectional electronic survey was sent to faculty physicians at a large academic medical center on March 29, 2019. Physicians reported their exposure to feedback (timing, performance relative to peers, or channel) and related perceptions. The Professional Fulfillment Index captured burnout and professional fulfillment. Associations between feedback characteristics and well-being or perceived impact were tested using analysis of variance or logistic regression adjusted for covariates. RESULTS Of 1016 survey respondents, 569 (56.0%) reported receiving patient satisfaction feedback. Among those receiving feedback, 303 (53.2%) did not believe that this feedback improved patient care. Compared with physicians who never received feedback, those who received any type of feedback had higher professional fulfillment scores (mean, 6.6±2.1 vs 6.3±2.0; P=.03) but also reported an unfavorable impact on clinical decision making (odds ratio [OR], 2.9; 95% CI, 1.8 to 4.7; P<.001). Physicians who received feedback that included one-on-one discussions (as opposed to feedback without this channel) held more positive perceptions of the feedback's impact on patient care (OR, 2.0; 95% CI, 1.3 to 3.0; P=.003), whereas perceptions were less positive in physicians whose feedback included comparisons to named colleagues (OR, 0.5; 95% CI, 0.3 to 0.8; P=.003). CONCLUSION Providing patient satisfaction feedback to physicians was associated with mixed results, and physician perceptions of the impact of feedback depended on the characteristics of feedback delivery. Our findings suggest that feedback is viewed most constructively by physicians when delivered through one-on-one discussions and without comparison to peers.
Affiliation(s)
- Stacie M Vilendrer
- Division of Primary Care and Population Health, Stanford School of Medicine, Stanford, CA.
- Samantha M R Kling
- Division of Primary Care and Population Health, Stanford School of Medicine, Stanford, CA
- Hanhan Wang
- Stanford Medicine WellMD Center, Stanford School of Medicine, Stanford, CA
- Cati Brown-Johnson
- Division of Primary Care and Population Health, Stanford School of Medicine, Stanford, CA
- Mickey Trockel
- Stanford Medicine WellMD Center, Stanford School of Medicine, Stanford, CA; Department of Psychiatry and Behavioral Sciences, Stanford School of Medicine, Stanford, CA
- Steven M Asch
- Division of Primary Care and Population Health, Stanford School of Medicine, Stanford, CA; VA Center for Innovation to Implementation, Menlo Park, CA
- Tait D Shanafelt
- Stanford Medicine WellMD Center, Stanford School of Medicine, Stanford, CA
6
Sureda E, Chacón-Moscoso S, Sanduvete-Chaves S, Sesé A. A Training Intervention through a 360° Multisource Feedback Model. Int J Environ Res Public Health 2021; 18:9137. [PMID: 34501727] [PMCID: PMC8431571] [DOI: 10.3390/ijerph18179137]
Abstract
Physicians and other health sciences professionals need continuous training, not only in technical aspects of their activity but also in nontechnical, transversal competencies with a cost-efficient impact on the proper functioning of healthcare. The objective of this paper is to analyze the behavioral change among health professionals at a large public hospital following a training intervention on a set of core nontechnical competencies: Teamwork, Adaptability-Flexibility, Commitment-Engagement, Results Orientation, and Leadership Skills for Supervisors. The 360° Multisource Feedback (MSF) model was applied using three sources of information: supervisors, co-workers, and the workers themselves (self-assessment). A quasi-experimental pretest–post-test single-group design with two points in time was utilized. The training intervention improved the scores of only one of the trained competencies—the “Results Orientation” competency—although the scores were slightly inflated. Moreover, significant discrepancies were detected between the three sources, with supervisors awarding the highest scores. The magnitude of behavioral change was related to certain sociodemographic and organizational variables. The study was not immune to the ceiling effect, despite control measures aimed at avoiding it. The empirical evidence suggests that the 360° MSF model must be maintained over time to enhance and reinforce an evaluation culture for better patient care.
Affiliation(s)
- Elena Sureda
- Department of Psychology, University of Balearic Islands, 07122 Palma, Spain
- Salvador Chacón-Moscoso
- Experimental Psychology Department, Universidad de Sevilla, 41018 Sevilla, Spain
- Department of Psychology, Universidad Autónoma de Chile, Santiago 7500138, Chile
- Albert Sesé
- Department of Psychology, University of Balearic Islands, 07122 Palma, Spain
- Balearic Islands Health Research Institute (IdISBa), 07120 Palma, Spain
7
Sessler DI, Khan MZ, Maheshwari K, Liu L, Adegboye J, Saugel B, Mascha EJ. Blood Pressure Management by Anesthesia Professionals: Evaluating Clinician Skill From Electronic Medical Records. Anesth Analg 2021; 132:946-956. [PMID: 33031346] [DOI: 10.1213/ane.0000000000005198]
Abstract
BACKGROUND Avoiding intraoperative hypotension might serve as a measure of clinician skill. We, therefore, estimated the range of hypotension in patients of nurse anesthetists, and whether observed differences were associated with a composite of serious complications. METHODS First, we developed a multivariable model to predict the amount of hypotension, defined as minutes of mean arterial pressure (MAP) <65 mm Hg, for noncardiac surgical cases from baseline characteristics excluding nurse anesthetist. Second, we compared observed and predicted amounts of hypotension for each case and summarized "excess" amounts across providers. Third, we estimated the extent to which hypotension on an individual case level was independently associated with a composite of serious complications. Finally, we assessed the range of actual and excess minutes of MAP <65 mm Hg on a provider level, and the extent to which these pressure exposures were associated with complications. RESULTS We considered 110,391 hours of anesthesia by 99 nurse anesthetists. A total of 69% of 25,702 included cases had at least 1 minute of MAP <65 mm Hg, with a median (quartiles) of 4 (0-15) minutes on the case level. We were unable to explain much variance of intraoperative hypotension from baseline patient characteristics. However, cases in the highest 2 quartiles (>10 and >24 min/case more than predicted) were an estimated 27% (95% confidence interval [CI], 1.1-1.4) and 31% (95% CI, 1.2-1.5) more likely to experience complications compared to those with 0 excess minutes (both P < .001). There was little variation of the average excess minutes <65 mm Hg across the nurse anesthetists, with median (quartiles) of 1.6 (1.2-1.9) min/h. There was no association in confounder-adjusted models on the nurse anesthetist level between average excess hypotension and complications, either for continuous exposure (P = .09) or as quintiles (P = .30). 
CONCLUSIONS Hypotension is associated with complications on a case basis. But the average amount of hypotension for nurse anesthetists over hundreds of cases differed only slightly and was insufficient to explain meaningful differences in complications. Avoiding hypotension is a worthy clinical goal, but does not appear to be a useful metric of performance because the range of average amounts per clinician is not meaningfully associated with patient outcomes, at least among nurse anesthetists in 1 tertiary center.
Affiliation(s)
- Kamal Maheshwari
- Department of Outcomes Research; Department of General Anesthesiology, Anesthesiology Institute
- Liu Liu
- Department of Outcomes Research; Department of Quantitative Health Sciences, Cleveland Clinic, Cleveland, Ohio
- Janet Adegboye
- Cleveland Clinic Lerner College of Medicine, Cleveland, Ohio
- Bernd Saugel
- Department of Anesthesiology, Center of Anesthesiology and Intensive Care Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Outcomes Research Consortium, Cleveland, Ohio
- Edward J Mascha
- Department of Outcomes Research; Department of Quantitative Health Sciences, Cleveland Clinic, Cleveland, Ohio
8
Ragsdale JW, Berry A, Gibson JW, Herber-Valdez CR, Germain LJ, Engle DL. Evaluating the effectiveness of undergraduate clinical education programs. Medical Education Online 2020; 25:1757883. [PMID: 32352355] [PMCID: PMC7241512] [DOI: 10.1080/10872981.2020.1757883]
Abstract
Medical schools should use a variety of measures to evaluate the effectiveness of their clinical curricula. Both outcome measures and process measures should be included, and these can be organized according to the four-level training evaluation model developed by Donald Kirkpatrick. Managing evaluation data requires institutions to employ deliberate strategies to monitor signals in real time and to aggregate data so that informed decisions can be made. Future steps in program evaluation include an increased emphasis on patient outcomes and multi-source feedback, as well as better integration of existing data sources.
Affiliation(s)
- John W. Ragsdale
- Assistant Dean for Clinical Education, University of Kentucky College of Medicine, Lexington, KY, USA
- Andrea Berry
- Executive Director of Faculty Life, University of Central Florida College of Medicine, Orlando, FL, USA
- Jennifer W. Gibson
- Director, Office of Medical Education, Tulane University School of Medicine, New Orleans, LA, USA
- Christiane R. Herber-Valdez
- Assistant Professor, Department of Medical Education, Paul L. Foster School of Medicine, Texas Tech University Health Sciences Center at El Paso, El Paso, TX, USA
- Managing Director, Office of Institutional Research and Effectiveness, Texas Tech University Health Sciences Center at El Paso, El Paso, TX, USA
- Lauren J. Germain
- Director of Evaluation, Assessment and Research; Assistant Professor, Public Health and Preventive Medicine, SUNY Upstate Medical University, Syracuse, NY, USA
- Deborah L. Engle
- Assistant Dean, Assessment and Evaluation, Duke University School of Medicine, Durham, NC, USA
9
Abstract
Introduction: Anesthesiology requires procedural proficiency, real-time problem and crisis resolution, and the forecasting of complications, among other skills; therefore, the evaluation of its learning should center on how students achieve competence rather than solely on knowledge acquisition. The literature shows that, despite the existence of numerous evaluation strategies, these remain underused in most cases due to unawareness.
Objective: This article aims to explain the process of competency-based assessment in anesthesiology; to offer a brief description of the learning domains evaluated, theories of knowledge, and the instruments and assessment systems used in the area; and, finally, to present some of the most relevant findings regarding assessment systems in Colombia.
Methodology: The results obtained in the study “Characteristics of the evaluation systems used by anesthesiology residency programs” revealed a lack of awareness among stakeholders in the educational process, a fact that motivated the publishing of this discussion around competency-based assessment in anesthesiology. Following a bibliographic search of PubMed, OVID, ERIC, DIALNET, and REDALYC using the keywords, 110 articles were reviewed and 75 were deemed relevant to the research’s theoretical framework.
Results and conclusion: Assessment in anesthesiology should be conceived from the multidimensionality of the competency; it must be longitudinal and focused on the learning objectives.
10
Tenny SO, Schmidt KP, Thorell WE. Pilot project to assess and improve neurosurgery resident and staff perception of feedback to residents for self-improvement goal formation. J Neurosurg 2020; 132:1261-1264. [DOI: 10.3171/2018.11.jns181664]
Abstract
OBJECTIVE The Accreditation Council for Graduate Medical Education (ACGME) has pushed for more frequent and comprehensive feedback for residents during their training, but there is scant evidence for how neurosurgery residents view the current feedback system as it applies to providing information for self-improvement and goal formation. The authors sought to assess neurosurgery resident and staff perceptions of the current resident feedback system in providing specific, meaningful, achievable, realistic, and timely (SMART) goals. The authors then created a pilot project to improve the most unfavorably viewed aspect of the feedback system. METHODS The authors conducted an anonymous survey of neurosurgery residents and staff at an academic medical institution to assess SMART goals for resident feedback and used the results to create a pilot intervention addressing the most unfavorably viewed aspect of the feedback system. The authors then conducted a postintervention survey to see if perceptions had improved for the target of the intervention. RESULTS Neurosurgery residents and staff completed an anonymous online survey, the results of which indicated that resident feedback was not occurring in a timely manner. The authors created a simple anonymous feedback form, distributed it monthly to neurosurgery residents, neurosurgical staff, and nurses, and reported the results monthly to each resident for 6 months. A postintervention survey indicated that the opinions of neurosurgery residents and staff on the timeliness of resident feedback had changed from negative to nonnegative (p = 0.01). CONCLUSIONS The required ACGME feedback methods may not be providing adequate feedback for neurosurgery residents' goal formation and self-improvement. Simple interventions, such as anonymous feedback questionnaires, can improve neurosurgery resident and staff perception of feedback to residents for self-improvement and goal formation.
11
Brennan N, Price T, Archer J, Brett J. Remediating professionalism lapses in medical students and doctors: A systematic review. Medical Education 2020; 54:196-204. [PMID: 31872509] [DOI: 10.1111/medu.14016]
Abstract
CONTEXT A remediation intervention aims to facilitate the improvement of an individual whose competence in a particular skill has dropped below the level expected. Little is known regarding the effectiveness of remediation, especially in the area of professionalism. This review sought to identify and assess the effectiveness of interventions to remediate professionalism lapses in medical students and doctors. METHODS Databases Embase, MEDLINE, Education Resources Information Center and the British Education Index were searched in September 2017 and October 2018. Studies reporting interventions to remediate professionalism lapses in medical students and doctors were included. A standardised data extraction form incorporating a previously described behaviour change technique taxonomy was utilised. A narrative synthesis approach was adopted. Quality was assessed using the Critical Appraisal Skills Programme checklist. RESULTS A total of 19 studies on remediation interventions reported in 23 articles were identified. Of these, 13 were case studies, five were cohort studies and one was a qualitative study; 37% targeted doctors, 26% medical students, 16% residents and 21% involved mixed populations. Most interventions were multifaceted and addressed professionalism issues concomitantly with clinical skills, but some focused on specific areas (eg sexual boundaries and disruptive behaviours). Most used three or more behaviour change techniques. The included studies were predominantly of low quality as 13 of the 19 were case studies. It was difficult to assess the effectiveness of the interventions as the majority of studies did not carry out any evaluation. CONCLUSIONS The review identifies a paucity of evidence to guide best practice in the remediation of professionalism lapses in medical students and doctors. 
The literature tentatively suggests that remediating lapses in professionalism, as part of a wider programme of remediation, can help participants graduate from a programme of study and pass medical licensing and mock oral board examinations. However, it is not clear from this literature whether these interventions succeed in remediating lapses in professionalism specifically. Further research is required to improve the design and evaluation of interventions to remediate professionalism lapses.
Affiliation(s)
- Nicola Brennan
- Collaboration for the Advancement of Medical Education Research and Assessment (CAMERA), Peninsula Medical School, Faculty of Health: Medicine, Dentistry and Human Sciences, University of Plymouth, Plymouth, UK
- Tristan Price
- Collaboration for the Advancement of Medical Education Research and Assessment (CAMERA), Peninsula Medical School, Faculty of Health: Medicine, Dentistry and Human Sciences, University of Plymouth, Plymouth, UK
- Julian Archer
- Royal Australasian College of Surgeons, Melbourne, Victoria, Australia
- Joe Brett
- Emergency Department, Hammersmith Hospital, Imperial College Healthcare NHS Trust, London, UK
12
Learners and Luddites in the Twenty-first Century: Bringing Evidence-based Education to Anesthesiology. Anesthesiology 2020; 131:908-928. [PMID: 31365369] [DOI: 10.1097/aln.0000000000002827]
Abstract
Anesthesiologists are both teachers and learners and alternate between these roles throughout their careers. However, few anesthesiologists have formal training in the methodologies and theories of education. Many anesthesiology educators often teach as they were taught and may not be taking advantage of current evidence in education to guide and optimize the way they teach and learn. This review describes the most up-to-date evidence in education for teaching knowledge, procedural skills, and professionalism. Methods such as active learning, spaced learning, interleaving, retrieval practice, e-learning, experiential learning, and the use of cognitive aids will be described. We made an effort to illustrate the best available evidence supporting educational practices while recognizing the inherent challenges in medical education research. Similar to implementing evidence in clinical practice in an attempt to improve patient outcomes, implementing an evidence-based approach to anesthesiology education may improve learning outcomes.
13
Pinto-Powell R, Lahey T. Just a Game: the Dangers of Quantifying Medical Student Professionalism. J Gen Intern Med 2019; 34:1641-1644. [PMID: 31147979] [PMCID: PMC6667566] [DOI: 10.1007/s11606-019-05063-x]
Abstract
A medical student on her internal medicine clerkship says her numerical medical professionalism grade was "just a game." Building on this anecdote, we suggest there is good reason to believe that numerical summative assessments of medical student professionalism can, paradoxically, undermine medical student professionalism by sapping internal motivation and converting conversations about core professional values into just another hurdle to residency. We suggest better ways of supporting medical student professional development, including a portfolio comprised of written personal reflection and periodic 360° formative assessment in the context of longitudinal faculty coaching.
Affiliation(s)
- Roshini Pinto-Powell: Department of Medicine, Geisel School of Medicine at Dartmouth, Hanover, NH, USA; Dartmouth-Hitchcock Medical Center, Lebanon, NH, USA; Department of Medical Education, Geisel School of Medicine at Dartmouth, Hanover, NH, USA
- Timothy Lahey: Larner College of Medicine, University of Vermont, Burlington, VT, USA; The University of Vermont Medical Center, Burlington, VT, USA
14
Lockman JL, Yehya N, Schwartz AJ, Cronholm PF. Professionalism in pediatric anesthesiology: Affirmation of a definition based on results of a nationally administered survey of pediatric anesthesiologists. Paediatr Anaesth 2019; 29:345-352. PMID: 30710425; PMCID: PMC7029412; DOI: 10.1111/pan.13598.
Abstract
BACKGROUND Previously published work established the need for a specialty-specific definition of professionalism in pediatric anesthesiology. That work established a composite definition consisting of 11 domains and their component "defining themes" for professionalism in pediatric anesthesiology. As a next step toward assessing generalizability of our single-center findings, we sought to gain input from a national sample of pediatric anesthesiologists. AIMS The aim of this study was to establish the construct validity of our previously published multidimensional definition of professionalism in pediatric anesthesiology using a nationally representative sample of pediatric anesthesiologists. METHODS A survey was distributed via snowball sampling to the leaders of every pediatric anesthesiology fellowship program and pediatric anesthesia department or clinical division in the United States. Survey items were designed to validate individual component themes in the working definition. For affirmed items, the respondent was asked to rate the importance of the item. Respondents were also invited to suggest novel themes to be included in the definition. RESULTS A total of 216 pediatric anesthesiologists representing a variety of experience levels and practice settings responded to the survey. All 40 themes were strongly supported by the respondents, with the least supported theme receiving 71.6% approval. 92.8% of respondents indicated that the 11 domains previously identified formed a comprehensive list of domains for professionalism in pediatric anesthesiology. Four additional novel themes were suggested by respondents, including wellness/self-care/burnout prevention, political advocacy, justice within a practice organization, and respect for leadership/experienced partners. These are topics for future study. The survey responses also indicated a near-universal agreement that didactic lectures would be ineffective for teaching professionalism. 
CONCLUSION This national survey of pediatric anesthesiologists serves to confirm the construct validity of our prior working definition of professionalism in pediatric anesthesiology, and has uncovered several opportunities for further study. This definition can be used for both curriculum and policy development within the specialty.
Affiliation(s)
- Justin L. Lockman, Nadir Yehya, Alan Jay Schwartz: Department of Anesthesiology and Critical Care Medicine, The Children's Hospital of Philadelphia, Perelman School of Medicine at The University of Pennsylvania, Philadelphia, USA
- Peter F. Cronholm: Department of Family Medicine and Community Health; Center for Public Health Initiatives; Leonard Davis Institute of Health Economics, Perelman School of Medicine at The University of Pennsylvania, Philadelphia, USA
15
DeBlasio D, Real FJ, Ollberding NJ, Klein MD. Provision of Parent Feedback via the Communication Assessment Tool: Does It Improve Resident Communication Skills? Acad Pediatr 2019; 19:152-156. PMID: 29981855; DOI: 10.1016/j.acap.2018.06.013.
Abstract
OBJECTIVE To determine the impact of a curriculum that included parent feedback, via the Communication Assessment Tool (CAT), on resident communication skills. METHODS In a prospective, controlled study, categorical pediatric residents in continuity clinic were divided into control and intervention groups based on clinic day. Parent feedback was obtained for all residents at the beginning and end of the year using the CAT, a validated survey to assess physician communication. Intervention residents participated in learning conferences that reviewed communication best practices and received parental feedback via individual and group CAT scores. Scores were dichotomized as 5 (excellent) versus 1 to 4 (less than excellent) and reported as percentage of items rated excellent. Curriculum impact was assessed by comparing score changes between groups. Residents' scores in both arms were combined to assess changes from the beginning of the year to the end. Statistical testing was performed using generalized linear mixed-effects models. RESULTS All residents (N = 68) participated. Intervention (n = 38) and control (n = 30) residents received at least 10 CATs at the beginning and end of the year. The percentage of parents rating all items as excellent increased by similar percentages in intervention and control groups (60.9%-73.8% vs 61.1%-69.8%; P = .38). When scores of residents in both arms were combined, improvement was found from the beginning to the end of the year for all CAT items (P < .001). CONCLUSIONS A curriculum including parent feedback from CATs did not significantly impact communication skills. However, communication skills improved over the year in both intervention and control groups, suggesting that communication training occurs in multiple settings.
Affiliation(s)
- Dominick DeBlasio, Francis J Real, Melissa D Klein: Division of General and Community Pediatrics; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Nicholas J Ollberding: Division of Biostatistics and Epidemiology, Cincinnati Children's Hospital Medical Center; Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
16
Kwan YH, Png K, Phang JK, Leung YY, Goh H, Seah Y, Thumboo J, Ng ASC, Fong W, Lie D. A Systematic Review of the Quality and Utility of Observer-Based Instruments for Assessing Medical Professionalism. J Grad Med Educ 2018; 10:629-638. PMID: 30619519; PMCID: PMC6314360; DOI: 10.4300/jgme-d-18-00086.1.
Abstract
BACKGROUND Professionalism, which encompasses behavioral, ethical, and related domains, is a core competency of medical practice. While observer-based instruments to assess medical professionalism are available, information on their psychometric properties and utility is limited. OBJECTIVE We systematically reviewed the psychometric properties and utility of existing observer-based instruments for assessing professionalism in medical trainees. METHODS After selecting eligible studies, we employed the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) criteria to score study methodological quality. We identified eligible instruments and performed quality assessment of psychometric properties for each selected instrument. We scored the utility of each instrument based on the ability to distinguish performance levels over time, availability of objective scoring criteria, validity evidence in medical students and residents, and instrument length. RESULTS Ten instruments from 16 studies met criteria for consideration, with studies having acceptable methodological quality. Psychometric properties were variably assessed. Among the 10 instruments, the Education Outcomes Service (EOS) group questionnaire and the Professionalism Mini-Evaluation Exercise (P-MEX) possessed the best psychometric properties, with the P-MEX scoring higher on utility than the EOS group questionnaire. CONCLUSIONS We identified 2 instruments with the best psychometric properties, 1 of which also showed acceptable utility for assessing professionalism in trainees. The P-MEX may be an option for program directors to adopt as an observer-based instrument for formative assessment of medical professionalism. Further studies of the 2 instruments to aggregate additional validity evidence are recommended, particularly in the domain of content validity, before they are used in specific cultural settings and in summative assessments.
17
Abstract
PURPOSE OF REVIEW To assess the current literature on interventions directed toward the prevention of burnout in medicine, and particularly in anesthesiology. RECENT FINDINGS Burnout has recently been noted to lead to medication errors and subsequently increased harm to our patients. On a personal level, burnout can lead to depression and even suicide amongst physicians. Strategies to prevent burnout amongst anesthesiologists that have been studied in the literature include multisource feedback, mentorship, and early recognition. SUMMARY There remains no clear or definitive intervention to prevent burnout for physicians. However, changing our environment to embrace mentorship, the continual exchange of feedback, and the fostering of self-care could markedly improve our work environment.
18
Faucett EA, McCrary HC, Barry JY, Saleh AA, Erman AB, Ishman SL. High-Quality Feedback Regarding Professionalism and Communication Skills in Otolaryngology Resident Education. Otolaryngol Head Neck Surg 2017; 158:36-42. DOI: 10.1177/0194599817737758.
Abstract
Objective The Accreditation Council for Graduate Medical Education (ACGME) requires competency-based education for residents and recommends 5 basic features of high-quality feedback. Our aim was to examine the incorporation of feedback in articles regarding professionalism and interpersonal/communication skills for otolaryngology residency training curriculum. Data Sources PubMed, Embase, ERIC, Cochrane Library, Web of Science, Scopus, and ClinicalTrials.gov. Methods We used studies identified during a systematic review of all indexed years through October 4, 2016. Results Eighteen studies were included in this review. Professionalism was discussed in 16, of which 15 (94%) examined aspects of feedback. Interpersonal/communication skills were the focus of 16 articles, of which 14 (88%) discussed aspects of feedback. Our assessment demonstrated that timeliness was addressed in 8 (44%) articles, specificity in 4 (22%), learner reaction and reflection in 4 (22%), action plans in 3 (20%), and balancing reinforcing/corrective feedback in 2 (13%). Two articles did not address feedback, and 6 did not address aspects of high-quality feedback. The ACGME-recommended feedback systems of ADAPT (ask, discuss, ask, plan together) and R2C2 (relationship, reactions, content, and coach) were not reported in any of the studies. Conclusion Feedback is an essential component of graduate medical education and is required by the ACGME milestones assessment system. However, the core feedback components recommended by the ACGME are rarely included in the otolaryngology resident education literature.
Affiliation(s)
- Erynne A. Faucett: Department of Otolaryngology and Head and Neck Surgery, College of Medicine, University of Arizona, Tucson, Arizona, USA
- Jonnae Y. Barry: Department of Otolaryngology and Head and Neck Surgery, College of Medicine, University of Arizona, Tucson, Arizona, USA
- Ahlam A. Saleh: Health Sciences Library, University of Arizona, Tucson, Arizona, USA
- Audrey B. Erman: Department of Otolaryngology and Head and Neck Surgery, College of Medicine, University of Arizona, Tucson, Arizona, USA
- Stacey L. Ishman: Division of Pediatric Otolaryngology Head and Neck Surgery & Division of Pulmonary Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio, USA; Department of Otolaryngology, Head and Neck Surgery, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA