1. Johnson NR, Pelletier A, Berkowitz LR. Mini-Clinical Evaluation Exercise in the Era of Milestones and Entrustable Professional Activities in Obstetrics and Gynaecology: Resume or Reform? J Obstet Gynaecol Can 2020;42:718-725. [PMID: 31882285] [DOI: 10.1016/j.jogc.2019.10.002]
Abstract
OBJECTIVE The Accreditation Council for Graduate Medical Education (ACGME) milestones and the core Entrustable Professional Activities (EPAs) provide guiding frameworks and requirements for assessing residents' progress. The Mini-Clinical Evaluation Exercise (Mini-CEX) is a formative assessment tool used to provide feedback after direct observation of an ambulatory or clinical encounter. This study aimed to investigate the feasibility and reliability of the Mini-CEX in the authors' obstetrics and gynaecology (OB/GYN) residency program and its ability to measure residents' progress and competencies within the frameworks of ACGME milestones and EPAs. METHODS OB/GYN residents' Mini-CEX performance over 5 academic years was analyzed retrospectively to measure reliability and feasibility. Additionally, realistic evaluation was conducted to assess the usefulness of the Mini-CEX within the frameworks of ACGME milestones and EPAs. RESULTS A total of 395 Mini-CEX evaluations for 49 OB/GYN residents were analyzed. Mini-CEX evaluation data significantly discriminated among residents' training levels (P < 0.003). Residents completed an average of 8.1 evaluations each; 10% of second-year residents and 28% of third-year residents were evaluated 10 or more times per year, whereas no postgraduate year 1 or postgraduate year 4 residents achieved this number. Mini-CEX data could contribute to all 6 primary measurement domains of OB/GYN milestones and 8 of 10 EPAs required for first-year residents. CONCLUSION The Mini-CEX demonstrated potential for measuring residents' clinical competencies in their ACGME milestones. Faculty time commitment was the main challenge. Reform is necessary for the current feedback structure in the Mini-CEX, faculty development, and operational guidelines that help residency programs match residents' clinical competency ratings with ACGME milestones and EPAs.
Affiliation(s)
- Natasha R Johnson
- Department of Obstetrics and Gynecology, Brigham and Women's Hospital, Boston, MA; Department of Obstetrics, Gynecology and Reproductive Biology, Harvard Medical School, Boston, MA.
- Andrea Pelletier
- Department of Obstetrics and Gynecology, Brigham and Women's Hospital, Boston, MA
- Lori R Berkowitz
- Department of Obstetrics, Gynecology and Reproductive Biology, Harvard Medical School, Boston, MA; Department of Obstetrics and Gynecology, Massachusetts General Hospital, Boston, MA

2. CAEP 2019 Academic Symposium: Got competence? Best practices in trainee progress decisions. Can J Emerg Med 2020;22:187-193. [DOI: 10.1017/cem.2019.480]
Abstract
BACKGROUND Competence committees play a key role in a competency-based system of assessment. These committees are tasked with reviewing and synthesizing clinical performance data to make judgments regarding residents' competence. Canadian emergency medicine (EM) postgraduate training programs recently implemented competence committees; however, a paucity of literature guides their work. OBJECTIVE The objective of this study was to develop consensus-based recommendations to optimize the function and decisions of competence committees in Canadian EM training programs. METHODS Semi-structured interviews of EM competence committee chairs were conducted and analyzed. The interview guide was informed by a literature review of competence committee structure, processes, and best practices. Inductive thematic analysis of interview transcripts was conducted to identify emerging themes. Preliminary recommendations, based on themes, were drafted and presented at the 2019 CAEP Academic Symposium on Education. Through a live presentation and survey poll, symposium attendees representing the national EM community participated in a facilitated discussion of the recommendations. The authors incorporated this feedback and identified consensus among symposium attendees on a final set of nine high-yield recommendations. CONCLUSION The Canadian EM community used a structured process to develop nine best practice recommendations for competence committees addressing: committee membership, meeting processes, decision outcomes, use of high-quality performance data, and ongoing quality improvement. These recommendations can inform the structure and processes of competence committees in Canadian EM training programs.

3. Berkowitz LR, Johnson NR, Muret-Wagstaff S. Guided Self-Assessment and Action Plans: What Do Residents Need to Succeed? Med Sci Educ 2020;30:57-59. [PMID: 34457637] [PMCID: PMC8368872] [DOI: 10.1007/s40670-019-00853-8]
Abstract
Resident feedback and program evaluation are essential to ACGME-accredited training programs. We sought to integrate these requirements into our program by creating a systematic process for program improvement focusing on personal learning plans (PLPs). Residents completed a PLP tool every 6 months, followed by an evaluation completed with the program director. Among respondents, 96% reported the PLP process provided useful feedback. A majority found the PLP process useful in developing learning strategies and modeling lifelong learning. The integrated PLP/program improvement process serves as an effective strategy for quickly identifying and capitalizing on both individual and program opportunities for improvement.
Affiliation(s)
- Lori R. Berkowitz
- Harvard Medical School, Boston, MA USA
- Massachusetts General Hospital, 55 Fruit Street, Founders 452, Boston, MA 02114 USA
- Natasha R. Johnson
- Harvard Medical School, Boston, MA USA
- Brigham and Women’s Hospital, Boston, MA USA

4. Cai H, Jiang Z, Chen X, Bailey JL. How to Successfully Train a Modern Nephrologist: Experience from US Fellowship Training Practice. Kidney Dis 2019;5:204-210. [PMID: 31768377] [DOI: 10.1159/000502976]
Abstract
Background Graduate medical education varies in different countries, but there is general consensus on training methods, including residency and fellowship training systems. The graduate medical education system in western countries, including the UK and the USA, has been shown to be successful. The new graduate medical education training system in China was recently established and is still evolving and being implemented nationally. Summary This paper reviews the history of nephrology training programs in the USA, the role of the Accreditation Council for Graduate Medical Education (ACGME) in establishing and enforcing guidelines and curricula for specialty training programs, the fellowship application and Match system for the recruitment of prospective fellows, and the quality control of fellowship training programs through rigorous evaluation and the In-Training Examination. This review specifically discusses the nephrology subspecialty fellowship and ACGME-accredited training programs in nephrology. The authors also provide several critical suggestions on the newly established postgraduate medical education training system in China, particularly in nephrology, based on experiences from successful US nephrology fellowship practices. Key Messages The ACGME-accredited nephrology fellowship program has been shown to be effective and successful, which could provide insight into the newly established graduate medical education training system in China. The authors are optimistic that reforms in Chinese medical training systems will be successful in the near future.
Affiliation(s)
- Hui Cai
- Renal Division, Emory University School of Medicine, Atlanta, Georgia, USA
- Zhong Jiang
- Department of Pathology, University of Massachusetts Medical Center, Worcester, Massachusetts, USA
- Xiongying Chen
- Hospitalist Program, Jackson Hospital, Montgomery, Alabama, USA
- James L Bailey
- Renal Division, Emory University School of Medicine, Atlanta, Georgia, USA

5. Warm EJ, Kinnear B, Kelleher M, Sall D, Holmboe E. Transforming Resident Assessment: An Analysis Using Deming's System of Profound Knowledge. Acad Med 2019;94:195-201. [PMID: 30334842] [DOI: 10.1097/acm.0000000000002499]
Abstract
W. Edwards Deming, in his System of Profound Knowledge, asserts that leaders who wish to transform a system should understand four essential elements: appreciation for a system, theory of knowledge, knowledge about variation, and psychology. The Accreditation Council for Graduate Medical Education (ACGME) introduced the milestones program as a part of the Next Accreditation System to create developmental language for the six core competencies and facilitate programmatic assessment within graduate medical education systems. Viewed through Deming's lens, the ACGME can be seen as the steward of a large system, with everyone who provides assessment data as workers in that system. The authors use Deming's framework to illustrate the working components of the assessment system of the University of Cincinnati College of Medicine's internal medicine residency program and draw parallels to the macrocosm of graduate medical education. Successes and failures in transforming resident assessment can be understood and predicted by identifying the system and its aims, turning information into knowledge, developing an understanding of variation, and appreciating the psychology of motivation of participants. The authors offer insights from their experience for educational leaders who wish to apply Deming's elements to their own assessment systems, with questions to explore, pitfalls to avoid, and practical approaches in doing this type of work.
Affiliation(s)
- Eric J Warm
- E.J. Warm is professor of medicine and program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0002-6088-2434. B. Kinnear is assistant professor of medicine and pediatrics and associate program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. M. Kelleher is assistant professor of medicine and pediatrics and associate program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. D. Sall is assistant professor of medicine and associate program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio. E. Holmboe is senior vice president, Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; adjunct professor of medicine, Yale University, New Haven, Connecticut; and adjunct professor, Feinberg School of Medicine at Northwestern University, Chicago, Illinois

6. Kinnear B, Warm EJ, Hauer KE. Twelve tips to maximize the value of a clinical competency committee in postgraduate medical education. Med Teach 2018;40:1110-1115. [PMID: 29944025] [DOI: 10.1080/0142159x.2018.1474191]
Abstract
Medical education has shifted to a competency-based paradigm, leading to calls for improved learner assessment methods and validity evidence for how assessment data are interpreted. Clinical competency committees (CCCs) use the collective input of multiple people to improve the validity and reliability of decisions made and actions taken based on assessment data. Significant heterogeneity in CCC structure and function exists across postgraduate medical education programs and specialties, and while there is no "one-size-fits-all" approach, there are ways to maximize value for learners and programs. This paper collates available evidence and the authors' experiences to provide practical tips on CCC purpose, membership, processes, and outputs. These tips can benefit programs looking to start a CCC and those that are improving their current CCC processes.
Affiliation(s)
- Benjamin Kinnear
- Internal Medicine and Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric J Warm
- Richard W. Vilter Professor of Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Karen E Hauer
- Medicine, University of California, San Francisco School of Medicine, San Francisco, CA, USA

7. Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR. A call to action: The controversy of and rationale for competency-based medical education. Med Teach 2017;39:574-581. [PMID: 28598742] [DOI: 10.1080/0142159x.2017.1315067]
Abstract
Although medical education has enjoyed many successes over the last century, there is a recognition that health care is too often unsafe and of poor quality. Errors in diagnosis and treatment, communication breakdowns, poor care coordination, inappropriate use of tests and procedures, and dysfunctional collaboration harm patients and families around the world. These issues reflect on our current model of medical education and raise the question: Are physicians being adequately prepared for twenty-first century practice? Multiple reports have concluded the answer is "no." Concurrent with this concern is an increasing interest in competency-based medical education (CBME) as an approach to help reform medical education. The principles of CBME are grounded in providing better and safer care. As interest in CBME has increased, so have criticisms of the movement. This article summarizes and addresses objections and challenges related to CBME. These can provide valuable feedback to improve CBME implementation and avoid pitfalls. We strongly believe medical education reform should not be reduced to an "either/or" approach, but should blend theories and approaches to suit the needs and resources of the populations served. The incorporation of milestones and entrustable professional activities within existing competency frameworks speaks to the dynamic evolution of CBME, which should not be viewed as a fixed doctrine, but rather as a set of evolving concepts, principles, tools, and approaches that can enable important reforms in medical education that, in turn, enable the best outcomes for patients.
Affiliation(s)
- Eric S Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Jonathan Sherbino
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Canada
- Robert Englander
- School of Medicine, University of Minnesota, Minneapolis, MN, USA
- Linda Snell
- Centre for Medical Education and Department of General Internal Medicine, McGill University, Montreal, Quebec, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Jason R Frank
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Canada

8. Klick JC, Friebert S, Hutton N, Osenga K, Pituch KJ, Vesel T, Weidner N, Block SD, Morrison LJ. Developing competencies for pediatric hospice and palliative medicine. Pediatrics 2014;134:e1670-7. [PMID: 25404726] [DOI: 10.1542/peds.2014-0748]
Abstract
In 2006, hospice and palliative medicine (HPM) became an officially recognized subspecialty. This designation helped initiate the Accreditation Council for Graduate Medical Education Outcomes Project in HPM. As part of this process, a group of expert clinician-educators in HPM defined the initial competency-based outcomes for HPM fellows (General HPM Competencies). Concurrently, these experts recognized and acknowledged that additional expertise in pediatric HPM would ensure that the competencies for pediatric HPM were optimally represented. To fill this gap, a group of pediatric HPM experts used a product development method to define specific Pediatric HPM Competencies. This article describes the development process. With the ongoing evolution of HPM, these competencies will evolve. As part of the Next Accreditation System, the Accreditation Council for Graduate Medical Education uses milestones as a framework to better define competency-based, measurable outcomes for trainees. Currently, there are no milestones specific to HPM, although the field is designing curricular milestones with multispecialty involvement, including pediatrics. These competencies are the conceptual framework for the pediatric content in the HPM milestones. They are specific to the pediatric HPM subspecialist and should be integrated into the training of pediatric HPM subspecialists. They will serve a foundational role in HPM and should inform a wide range of emerging innovations, including the next evolution of HPM Competencies, development of HPM curricular milestones, and training of adult HPM and other pediatric subspecialists. They may also inform pediatric HPM outcome measures, as well as standards of practice and performance for pediatric HPM interdisciplinary teams.
Affiliation(s)
- Jeffrey C Klick
- Children's Healthcare of Atlanta, Emory University School of Medicine, Atlanta, Georgia
- Sarah Friebert
- Akron Children's Hospital, Northeast Ohio Medical University, Rootstown, Ohio
- Nancy Hutton
- Johns Hopkins University School of Medicine, Baltimore, Maryland
- Kaci Osenga
- Children's Hospital and Clinics of Minnesota, Minneapolis, Minnesota
- Kenneth J Pituch
- Mott Children's Hospital, The University of Michigan School of Medicine, Ann Arbor, Michigan
- Tamara Vesel
- Hospice of the North Shore & Greater Boston, Danvers, Massachusetts
- Norbert Weidner
- Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio
- Susan D Block
- Dana-Farber Cancer Institute and Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Laura J Morrison
- Yale-New Haven Hospital, Yale University School of Medicine, New Haven, Connecticut

9. Lowry BN, Vansaghi LM, Rigler SK, Stites SW. Applying the milestones in an internal medicine residency program curriculum: a foundation for outcomes-based learner assessment under the next accreditation system. Acad Med 2013;88:1665-1669. [PMID: 24072132] [DOI: 10.1097/acm.0b013e3182a8c756]
Abstract
In 2010, University of Kansas Medical Center internal medicine residency program leaders concluded that their competency-based curriculum and evaluation system was not sufficient to promote accurate assessment of learners' performance and needed revision to meet the requirements of the Accreditation Council for Graduate Medical Education (ACGME) Next Accreditation System (NAS). Evaluations of learners seldom referenced existing curricular goals and objectives and reflected an "everyone is exceptional, no one is satisfactory" view. The authors identified the American Board of Internal Medicine and ACGME's Developmental Milestones for Internal Medicine Residency Training as a published standard for resident development. They incorporated the milestones into templates, a format that could be modified for individual rotations. A milestones-based curriculum for each postgraduate year of training and every rotation was then created, with input from educational leaders within each division in the Department of Internal Medicine and with the support of the graduate medical education office. In this article, the authors share their implementation process, which took approximately one year, and discuss their current work to create a documentation system for direct observation of entrustable professional activities, with the aim of providing guidance to other programs challenged with developing an outcomes-based curriculum and assessment system within the time frame of the NAS.
Affiliation(s)
- Becky N Lowry
- Dr. Lowry is assistant professor, Department of Internal Medicine, and associate program director for compliance and programming, Internal Medicine Residency Program, University of Kansas Medical Center, Kansas City, Kansas. Dr. Vansaghi is associate professor, Department of Internal Medicine, and program director, Internal Medicine Residency Program, University of Kansas Medical Center, Kansas City, Kansas. Dr. Rigler is professor, Department of Internal Medicine, and director, Office of Scholarly, Academic and Research Mentoring, Department of Internal Medicine, University of Kansas Medical Center, Kansas City, Kansas. Dr. Stites is Peter T. Bohan Professor and Chair, Department of Internal Medicine, University of Kansas School of Medicine, and senior associate dean for clinical affairs, University of Kansas Medical Center, Kansas City, Kansas

10. Finlay K, Probyn L, Ho S. The CanMEDS resume: a useful educational portfolio tool for diagnostic radiology residents. Can Assoc Radiol J 2011;63:233-6. [PMID: 21873024] [DOI: 10.1016/j.carj.2011.02.008]
Affiliation(s)
- Karen Finlay
- Department of Radiology, McMaster University, Hamilton, Ontario, Canada.

11. Thomas MR, Beckman TJ, Mauck KF, Cha SS, Thomas KG. Group assessments of resident physicians improve reliability and decrease halo error. J Gen Intern Med 2011;26:759-64. [PMID: 21369769] [PMCID: PMC3138588] [DOI: 10.1007/s11606-011-1670-4]
Abstract
BACKGROUND Individual faculty assessments of resident competency are complicated by inconsistent application of standards, lack of reliability, and the "halo" effect. OBJECTIVE We determined whether the addition of faculty group assessments of residents in an ambulatory clinic, compared with individual faculty-of-resident assessments alone, have better reliability and reduced halo effects. DESIGN This prospective, longitudinal study was performed in the outpatient continuity clinics of a large internal medicine residency program. MAIN MEASURES Faculty-on-resident and group faculty-on-resident assessment scores were used for comparison. KEY RESULTS Overall mean scores were significantly higher for group than individual assessments (3.92 ± 0.51 vs. 3.83 ± 0.38, p = 0.0001). Overall inter-rater reliability increased when combining group and individual assessments compared to individual assessments alone (intraclass correlation coefficient, 95% CI = 0.828, 0.785-0.866 vs. 0.749, 0.686-0.804). Inter-item correlations were less for group (0.49) than individual (0.68) assessments. CONCLUSIONS This study demonstrates improved inter-rater reliability and reduced range restriction (halo effect) of resident assessment across multiple performance domains by adding the group assessment method to traditional individual faculty-on-resident assessment. This feasible model could help graduate medical education programs achieve more reliable and discriminating resident assessments.
Affiliation(s)
- Matthew R Thomas
- Division of Primary Care Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA.

12. Fragneto RY, DiLorenzo AN, Schell RM, Bowe EA. Evaluating practice-based learning and improvement: efforts to improve acceptance of portfolios. J Grad Med Educ 2010;2:638-43. [PMID: 22132291] [PMCID: PMC3010953] [DOI: 10.4300/jgme-d-10-00010.1]
Abstract
INTRODUCTION The Accreditation Council for Graduate Medical Education (ACGME) recommends resident portfolios as 1 method for assessing competence in practice-based learning and improvement. In July 2005, when anesthesiology residents in our department were required to start a portfolio, the residents and their faculty advisors did not readily accept this new requirement. Intensive education efforts addressing the goals and importance of portfolios were undertaken. We hypothesized that these educational efforts improved acceptance of the portfolio and retrospectively audited the portfolio evaluation forms completed by faculty advisors. METHODS Intensive education about the goals and importance of portfolios began in January 2006, including presentations at departmental conferences and one-on-one education sessions. Faculty advisors were instructed to evaluate each resident's portfolio and complete a review form. We retrospectively collected data to determine the percentage of review forms completed by faculty. The portfolio reviews also assessed the percentage of 10 required portfolio components residents had completed. RESULTS Portfolio review forms were completed by faculty advisors for 13% (5/38) of residents during the first advisor-advisee meeting in December 2005. Initiation of intensive education efforts significantly improved compliance, with review forms completed for 68% (26/38) of residents in May 2006 (P < .0001) and 95% (36/38) in December 2006 (P < .0001). Residents also significantly improved the completeness of portfolios between May and December of 2006. DISCUSSION Portfolios are considered a best methods technique by the ACGME for evaluation of practice-based learning and improvement. We have found that intensive education about the goals and importance of portfolios can enhance acceptance of this evaluation tool, resulting in improved compliance in completion and evaluation of portfolios.
Affiliation(s)
- Amy Noel DiLorenzo
- Corresponding author: Mrs. Amy Noel DiLorenzo, MA, Department of Anesthesiology, University of Kentucky, Chandler Medical Center, 800 Rose Street, N217A, Lexington, KY 40536, 859.323.5956 x 80084,

13. Parker MG. Nephrology Training in the 21st Century: Toward Outcomes-Based Education. Am J Kidney Dis 2010;56:132-42. [DOI: 10.1053/j.ajkd.2009.11.029]

14. Warm EJ, Schauer D, Revis B, Boex JR. Multisource feedback in the ambulatory setting. J Grad Med Educ 2010;2:269-77. [PMID: 21975632] [PMCID: PMC2941386] [DOI: 10.4300/jgme-d-09-00102.1]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education has mandated multisource feedback (MSF) in the ambulatory setting for internal medicine residents. Few published reports demonstrate actual MSF results for a residency class, and fewer still include clinical quality measures and knowledge-based testing performance in the data set. METHODS Residents participating in a year-long group practice experience called the "long-block" received MSF that included self, peer, staff, attending physician, and patient evaluations, as well as concomitant clinical quality data and knowledge-based testing scores. Residents were given a rank for each data point compared with peers in the class, and these data were reviewed with the chief resident and program director over the course of the long-block. RESULTS Multisource feedback identified residents who performed well on most measures compared with their peers (10%), residents who performed poorly on most measures compared with their peers (10%), and residents who performed well on some measures and poorly on others (80%). Each high-, intermediate-, and low-performing resident had at least one aspect of the MSF that was significantly lower than the others, and this served as the basis of formative feedback during the long-block. CONCLUSION Use of multisource feedback in the ambulatory setting can identify high-, intermediate-, and low-performing residents and suggest specific formative feedback for each. More research needs to be done on the effect of such feedback, as well as the relationships between each of the components in the MSF data set.
Affiliation(s)
- Eric J. Warm
- Corresponding author: Eric J. Warm, MD, Department of Internal Medicine, University of Cincinnati Academic Health Center, 231 Albert Sabin Way, Cincinnati, OH 45267-0557, 513.558.2590,

15. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach 2010;32:676-82. [PMID: 20662580] [DOI: 10.3109/0142159x.2010.500704]
Abstract
Competency-based medical education (CBME), by definition, necessitates a robust and multifaceted assessment system. Assessment and the judgments or evaluations that arise from it are important at the level of the trainee, the program, and the public. When designing an assessment system for CBME, medical education leaders must attend to the context of the multiple settings where clinical training occurs. CBME further requires assessment processes that are more continuous and frequent, criterion-based, developmental, and work-based where possible; that use assessment methods and tools meeting minimum requirements for quality; that use both quantitative and qualitative measures and methods; and that involve the wisdom of group process in making judgments about trainee progress. Like all changes in medical education, CBME is a work in progress. Given the importance of assessment and evaluation for CBME, the medical education community will need more collaborative research to address several major challenges in assessment, including "best practices" in the context of systems and institutional culture, and how best to train faculty to be better evaluators. Finally, we must remember that expertise, not competence, is the ultimate goal. CBME does not end with graduation from a training program, but should represent a career that includes ongoing assessment.

16. Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ 2009;1:278-86. [PMID: 21975993] [PMCID: PMC2931233] [DOI: 10.4300/jgme-d-09-00010.1]
Abstract
BACKGROUND The Outcome Project requires high-quality assessment approaches to provide reliable and valid judgments of the attainment of competencies deemed important for physician practice. INTERVENTION The Accreditation Council for Graduate Medical Education (ACGME) convened the Advisory Committee on Educational Outcome Assessment in 2007-2008 to identify high-quality assessment methods. The assessments selected by this body would form a core set that could be used by all programs in a specialty to assess resident performance and enable initial steps toward establishing national specialty databases of program performance. The committee identified a small set of methods for provisional use and further evaluation. It also developed frameworks and processes to support the ongoing evaluation of methods and the longer-term enhancement of assessment in graduate medical education. OUTCOME The committee constructed a set of standards, a methodology for applying the standards, and grading rules for their review of assessment method quality. It developed a simple report card for displaying grades on each standard and an overall grade for each method reviewed. It also described an assessment system of factors that influence assessment quality. The committee proposed a coordinated, national-level infrastructure to support enhancements to assessment, including method development and assessor training. It recommended the establishment of a new assessment review group to continue its work of evaluating assessment methods. The committee delivered a report summarizing its activities and 5 related recommendations for implementation to the ACGME Board in September 2008.
Affiliation(s)
- Susan R. Swing
- Corresponding author: Susan Swing, PhD, Accreditation Council for Graduate Medical Education (ACGME), 515 North State Street, Suite 2000, Chicago, IL 60654, 312.755.7447,

17. Nagler A, Andolsek K, Padmore JS. The unintended consequences of portfolios in graduate medical education. Acad Med 2009;84:1522-1526. [PMID: 19858808] [DOI: 10.1097/acm.0b013e3181bb2636]
Abstract
Portfolios have emerged in graduate medical education despite lack of consensus on their definition, purpose, or usefulness. Portfolios can be used as a tool for residents to record their accomplishments, reflect on their experiences, and gain formative feedback. This exercise may help prepare physicians for lifelong learning as well as enhance patient care. The Accreditation Council for Graduate Medical Education has endorsed and may soon require the use of portfolios as an assessment tool to evaluate resident competence. However, using portfolios for summative evaluation purposes such as making high-stakes decisions on resident promotion or matriculation may deter resident candidness. In addition, the use of portfolios in clinical settings raises issues unique to the health care setting such as patient privacy, disclosure of clinical information, and professional liability exposure of physicians. It is not clear that peer-review statutes that sometimes protect educational materials used in teaching and evaluation of residents would also bar disclosure and/or evidentiary use of portfolio contents. Is the teaching institution, resident, or graduate vulnerable to requests and subpoenas for the portfolio contents? If so, then a resident's documentation of insecurities, suboptimal performance, or bad outcomes would be ripe for discovery in a medical malpractice lawsuit. If embraced too quickly and without sufficient reflection on the nuances of implementation, this well-intentioned initiative may present unintended legal consequences.
Affiliation(s)
- Alisa Nagler
- Office of Graduate Medical Education, Duke University Hospital, Durham, North Carolina 27710, USA.

18. Heflin MT, Pinheiro S, Kaminetzky CP, McNeill D. 'So you want to be a clinician-educator...': designing a clinician-educator curriculum for internal medicine residents. Med Teach 2009;31:e233-40. [PMID: 19296370] [DOI: 10.1080/01421590802516772]
Abstract
BACKGROUND Despite a growing demand for skilled teachers and administrators in graduate medical education, clinician-educator tracks for residents are rare and though some institutions offer 'resident-as-teacher' programs to assist residents in developing teaching skills, the need exists to expand training opportunities in this area. METHODS The authors conducted a workshop at a national meeting to develop a description of essential components of a training pathway for internal medicine residents. Through open discussion and small group work, participants defined the various roles of clinician-educators and described goals, training opportunities, assessment and resource needs for such a program. RESULTS Workshop participants posited that the clinician-educator has several roles to fulfill beyond that of clinician, including those of teacher, curriculum developer, administrator and scholar. A pathway for residents aspiring to become clinician educators must offer structured training in each of these four areas to empower residents to effectively practice clinical education. In addition, the creation of such a track requires securing time and resources to support resident learning experiences and formal faculty development programs to support institutional mentors and leaders. CONCLUSION This article provides a framework by which leaders in medical education can begin to prepare current trainees interested in careers as clinician-educators.

19. Houston TK, Wall TC, Willet LL, Heudebert GR, Allison JJ. Can residents accurately abstract their own charts? Acad Med 2009;84:391-395. [PMID: 19240454] [DOI: 10.1097/acm.0b013e3181971d11]
Abstract
PURPOSE To assess the accuracy of residents' record review, using trained abstractors as a gold standard comparison. METHOD In 2005, the authors asked 74 residents to review their own charts (n = 392) after they received brief instruction on both how to locate data on the medical record and how to use a data abstraction form. Trained abstractors then re-reviewed these charts to assess performance of preventive health care measures in medicine (smoking screening, smoking cessation advice, mammography, colon cancer screening, lipid screening, and pneumonia vaccination) and pediatrics (parent smoking screening, parent smoking cessation advice, car seat safety, car restraint use, eye alignment, and immunizations up to date). The authors then quantified agreement between the two record reviews and assessed the sensitivity and specificity of the residents versus the trained abstractors. RESULTS Overall resident-measured performance was similar (within 5%) to that of the trained abstractor for five of six measures in medicine and four of six in pediatrics. For the various measures, sensitivity of resident-measured performance ranged from 100% to 15% and specificity from 100% to 33% compared with the trained abstractors. Relative to the trained abstractor record review, residents did not overestimate their performance. Most residents' (81%) relative performance rankings did not change when the basis for the ranking was resident measured versus trained abstractor measured. CONCLUSIONS Residents' self-abstraction can be an alternative to costly trained abstractors. Appropriate use of these data should be carefully considered, acknowledging the limitations.

20. Colbert CY, Ownby AR, Butler PM. A review of portfolio use in residency programs and considerations before implementation. Teach Learn Med 2008;20:340-345. [PMID: 18855239] [DOI: 10.1080/10401330802384912]
Abstract
BACKGROUND Portfolios, often described as collections of evidence, are discussed as a means of teaching or assessing the Accreditation Council for Graduate Medical Education competencies. Yet, it is unclear how many residency programs utilize portfolios. The purpose of this article is to (a) review the literature on portfolio use in graduate medical education; (b) examine efficacy of portfolio use, based upon studies in the field; and (c) offer a discussion of considerations for implementing portfolios. SUMMARY Two searches of PubMed, OVID, JSTOR, SCOPUS, and FirstSearch Wilson Select were conducted between October 2006 and April 2007 to identify studies and articles related to portfolio usage. Thirty-nine articles met criteria and were reviewed. CONCLUSIONS There is wide variation in how portfolios are utilized within U.S. residency programs. The challenge for graduate medical education is to create consensus on the definition and purpose of portfolios, such that best practices in portfolio implementation and assessment can be achieved.
Affiliation(s)
- Colleen Y Colbert
- Office of Educational Programs, University of Texas Medical School-Houston, Houston, Texas, USA.

21. Scott E, Borate U, Heitner S, Chaitowitz M, Tester W, Eiger G. Pain Management Practices by Internal Medicine Residents: A Comparison Before and After Educational and Institutional Interventions. Am J Hosp Palliat Care 2008;25:431-9. [DOI: 10.1177/1049909108320884]
Abstract
We aimed to improve internal medicine residents' deficiencies in pain management and evaluate the effectiveness of our intervention, which included an interactive conference series, e-mail vignettes, and didactic sessions. An anonymous survey was administered at the beginning and at the end of an academic year, before and after the intervention, respectively. We analyzed 65 preintervention and 63 postintervention surveys. Self-perception of competency in pain management increased from 40% to 60% (P = .02). Perception of adequacy of training increased from 38.5% to 55.6% (P = .05). Opioid conversion skills improved by 25% (P = .02). Overall, knowledge did not change significantly, except in the subgroup of residents who had completed the oncology rotation, in which it improved from 0.60 to 0.72 (P = .003). "Opiophobia" improved by 20% (P = .05). Documentation of pain improved (rank correlation = 21; P = .02). We concluded that educational and institutional interventions administered over an academic year improved pain management skills and documentation and reduced "opiophobia" among residents.
Affiliation(s)
- Mark Chaitowitz
- Thomas Jefferson University Hospital, Philadelphia, Pennsylvania

22. Gibson KA, Boyle P, Black DA, Cunningham M, Grimm MC, McNeil HP. Enhancing evaluation in an undergraduate medical education program. Acad Med 2008;83:787-93. [PMID: 18667897] [DOI: 10.1097/acm.0b013e31817eb8ab]
Abstract
Approaches to evaluation of medical student teaching programs have historically incorporated a range of methods and have had variable effectiveness. Such approaches are rarely comprehensive, typically evaluating only a component rather than the whole program, and are often episodic rather than continuous. There are growing pressures for significant improvement in academic program evaluation. The authors describe an initiative that arose after a radical reorganization of the undergraduate medical education program at the University of New South Wales, in part in response to feedback from the accrediting authority. The aim was to design a comprehensive, multicomponent, program-wide evaluation and improvement system. The framework envisages the quality of the program as comprising four main aspects: curriculum and resources; staff and teaching; student experience; and student and graduate outcomes. Key principles of the adopted approach include the views that both student and staff experiences provide valuable information; that measurement of student and graduate outcomes is needed; that an emphasis on action after evaluation is critical (closing the loop); that the strategies and processes need to be continual rather than episodic; and that evaluation should be used to recognize, report on, and reward excellence in teaching. In addition, an important philosophy adopted was that teachers, course coordinators, and administrators should undertake evaluation and improvement activities as an inherent part of teaching, rather than viewing evaluation as something that is externally managed. Examples of the strategy in action, which provide initial evidence of validation for this approach, are described.
Affiliation(s)
- Kathryn A Gibson
- South Western Sydney Clinical School, Liverpool Hospital, Liverpool, Australia.

23. Caverzagie KJ, Shea JA, Kogan JR. Resident identification of learning objectives after performing self-assessment based upon the ACGME core competencies. J Gen Intern Med 2008;23:1024-7. [PMID: 18612737] [PMCID: PMC2517926] [DOI: 10.1007/s11606-008-0571-7]
Abstract
BACKGROUND Self-assessment is increasingly being incorporated into competency evaluation in residency training. Little research has investigated the characteristics of residents' learning objectives and action plans after self-assessment. OBJECTIVE To explore the frequency and specificity of residents' learning objectives and action plans after completing either a highly or minimally structured self-assessment. DESIGN Internal Medicine residents (N = 90) were randomized to complete a highly or minimally structured self-assessment instrument based on the Accreditation Council for Graduate Medical Education Core Competencies. All residents then identified learning objectives and action plans. MEASUREMENTS Learning objectives and action plans were analyzed for content. Differences in specificity and content related to form, gender, and training level were assessed. RESULTS Seventy-six residents (84% response rate) identified 178 learning objectives. Objectives were general (79%), most often focused on medical knowledge (40%), and were not related to the type of form completed (p > 0.01). "Reading more" was the most common action plan. CONCLUSIONS Residents commonly identify general learning objectives focusing on medical knowledge regardless of the structure of the self-assessment form. Tools and processes that further facilitate self-assessment should be identified.
Affiliation(s)
- Kelly J Caverzagie
- Division of Hospitalist Medicine, Henry Ford Hospital, Detroit, MI, USA.

24. Morrison LJ, Scott JO, Block SD. Developing Initial Competency-based Outcomes for the Hospice and Palliative Medicine Subspecialist: phase I of the hospice and palliative medicine competencies project. J Palliat Med 2007;10:313-30. [PMID: 17472502] [DOI: 10.1089/jpm.2006.9980]
Abstract
As a newly recognized subspecialty, the field of hospice and palliative medicine (HPM) must transition existing pathways for board certification, fellowship standards, and fellowship accreditation to ones based on the Accreditation Council for Graduate Medical Education and American Board of Medical Specialties competency framework. The Competencies Work Group of the American Board of Hospice and Palliative Medicine, using an iterative process informed by the field, has developed a set of Initial Competency-based Outcomes for the HPM Subspecialist. These competencies will set the standard for the "competent hospice and palliative medicine subspecialist physician," guiding future HPM fellowship training and potential midcareer HPM training opportunities. Lessons learned are highlighted.
Affiliation(s)
- Laura J Morrison
- Section of Geriatrics, Baylor College of Medicine, Institute for Palliative Medicine, The Methodist Hospital, Houston, Texas 77030, USA.

25. Current World Literature. Curr Opin Anaesthesiol 2007;20:605-9. [DOI: 10.1097/aco.0b013e3282f355c3]

26. Zick A, Granieri M, Makoul G. First-year medical students' assessment of their own communication skills: a video-based, open-ended approach. Patient Educ Couns 2007;68:161-6. [PMID: 17640843] [DOI: 10.1016/j.pec.2007.05.018]
Abstract
OBJECTIVE Interpersonal and communication skills are a core area of competency for medical students, residents, and practicing physicians. As reflection and self-assessment are essential components of skill-building, we examined the content of medical students' assessments of their own developing communication skills. METHODS Between 2000 and 2003, a total of 674 first-year medical students completed self-assessments of their communication skills after viewing videotapes of their interaction with simulated patients. Self-assessment forms were open-ended, providing ample space for students to write about the strengths and weaknesses they observed. Completed forms were coded by two members of the research team trained in content analysis. Students identified an average of 5.0 things that went well (range 1-15, S.D.=2.2) and 2.8 areas for improvement (range 1-9, S.D.=1.3). RESULTS The most frequently observed strengths were: elicited information/covered important topics (54%); made a personal connection/established rapport (51%); was supportive/encouraging/helpful (40%); attended to conversational flow and transitions (34%); ensured patient comfort (32%). The most frequently noted weaknesses involved problems with: eliciting information/covering important topics (35%); paralanguage, particularly in terms of tone, rate, volume, and disfluencies such as "uh", "um" (32%); discussing health risks (26%); attending to conversational flow and transitions (23%); students' own comfort/organization/preparation (20%). CONCLUSION We observed that a video-based, open-ended approach to self-assessment is feasible, practical, and informative. While the self-assessments covered a broad scope, students clearly attended to tasks and skills relevant to effective communication and relationship building. PRACTICE IMPLICATIONS Videotaped clinical encounters allow learners to review their own behavior and make specific comments supported by tangible examples. An open-ended approach to self-assessment of communication skills can serve as one important component of a systematic education and evaluation program.
Affiliation(s)
- Amanda Zick
- Center for Communication and Medicine, Northwestern University Feinberg School of Medicine, Chicago, USA

27. Makoul G, Krupat E, Chang CH. Measuring patient views of physician communication skills: development and testing of the Communication Assessment Tool. Patient Educ Couns 2007;67:333-42. [PMID: 17574367] [DOI: 10.1016/j.pec.2007.05.005]
Abstract
OBJECTIVE Interpersonal and communication skills have been identified as a core competency that must be demonstrated by physicians. We developed and tested a tool that can be used by patients to assess the interpersonal and communication skills of physicians-in-training and physicians-in-practice. METHODS We began by engaging in a systematic scale development process to obtain a psychometrically sound Communication Assessment Tool (CAT). This process yielded a 15-item instrument that is written at the fourth grade reading level and employs a five-point response scale, with 5=excellent. Fourteen items focus on the physician and one targets the staff. Pilot testing established that the CAT differentiates between physicians who rated high or low on a separate satisfaction scale. We conducted a field test with physicians and patients from a variety of specialties and regions within the US to assess the feasibility of using the CAT in everyday practice. RESULTS Thirty-eight physicians and 950 patients (25 patients per physician) participated in the field test. The average patient-reported mean score per physician was 4.68 across all CAT items (S.D.=0.54, range 3.97-4.95). The average proportion of excellent scores was 76.3% (S.D.=11.1, range 45.7-95.1%). Overall scale reliability was high (Cronbach's alpha=0.96); alpha coefficients were uniformly high when reliability was examined per doctor. CONCLUSION The CAT is a reliable and valid instrument for measuring patient perceptions of physician performance in the area of interpersonal and communication skills. The field test demonstrated that the CAT can be successfully completed by both physicians and patients across clinical specialties. Reporting the proportion of "excellent" ratings given by patients is more useful than summarizing scores via means, which are highly skewed. PRACTICE IMPLICATIONS Specialty boards, residency programs, medical schools, and practice plans may find the CAT valuable for both collecting information and providing feedback about interpersonal and communication skills.
Affiliation(s)
- Gregory Makoul
- Center for Communication and Medicine, Northwestern University Feinberg School of Medicine, Division of General Internal Medicine, Chicago, IL 60611, USA.

28. Corbett EC, Payne NJ, Bradley EB, Maughan KL, Heald EB, Wang XQ. Enhancing clinical skills education: University of Virginia School of Medicine's Clerkship Clinical Skills Workshop Program. Acad Med 2007;82:690-5. [PMID: 17595568] [DOI: 10.1097/acm.0b013e31806745b4]
Abstract
In 1993, the University of Virginia School of Medicine began a clinical skills workshop program in an effort to improve the preparation of all clerkship students to participate in clinical care. This program involved the teaching of selected basic clinical skills by interested faculty to small groups of third-year medical students. Over the past 14 years, the number of workshops has increased from 11 to 31, and they now involve clerkship faculty from family medicine, internal medicine, and pediatrics. Workshops include a variety of common skills from the communication, physical examination, and clinical test and procedure domains such as pediatric phone triage, shoulder examination, ECG interpretation, and suturing. Workshop sessions allow students to practice skills on each other, with standardized patients, or with models, with the goal of improving competence and confidence in the performance of basic clinical skills. Students receive direct feedback from faculty on their skill performance. The style and content of these workshops are guided by an explicit set of educational criteria. A formal evaluation process ensures that faculty receive regular feedback from student evaluation comments so that adherence to workshop criteria is continuously reinforced. Student evaluations confirm that these workshops meet their skill-learning needs. Preliminary outcome measures suggest that workshop teaching can be linked to student assessment data and may improve students' skill performance. This program represents a work-in-progress toward the goal of providing a more comprehensive and developmental clinical skills curriculum in the school of medicine.
Affiliation(s)
- Eugene C Corbett
- Department of Medicine, University of Virginia School of Medicine, Charlottesville, Virginia, USA.

29. Peerschke EIB, Agrawal Y, Alexander CB, Bovill E, Laposata M. Proposed Research Training Guidelines for Residents in Laboratory Medicine. Clin Lab Med 2007;27:241-53; abstract v-vi. [PMID: 17556083] [DOI: 10.1016/j.cll.2007.03.002]
Abstract
It is expected that the role of the clinical pathologist will evolve from the more passive role of managing testing facilities to one of active service provider, using powerful molecular, cell biologic, and biochemical tools. The scope of knowledge required to be an effective physician scientist or an accomplished practicing clinical pathologist, however, cannot be acquired through clinical training alone and requires dedicated, structured research learning time. The goal of this article is to consider mechanisms that effectively integrate research training and scholarly activity into residency education in laboratory medicine/clinical pathology. The proposed curricula are purposely unstructured to allow maximum flexibility for training programs to meet the needs and career goals of individual residents.
Affiliation(s)
- Ellinor I B Peerschke
- Department of Pathology and Laboratory Medicine, Weill Medical College of Cornell University, New York, NY 10021, USA.

30. Rosenberg ME. Adult Nephrology Fellowship Training in the United States: Trends and Issues. J Am Soc Nephrol 2007;18:1027-33. [PMID: 17329571] [DOI: 10.1681/asn.2006101169]
Abstract
This article reviews trends and issues related to adult nephrology fellowship education in the United States. The number of nephrology fellowship programs and trainees has continued to increase slowly despite limitations in funding of graduate medical education. The use of the Electronic Residency Application System has provided information for the first time on the number, demographics, and behavior of applicants that can be used as baseline data for tracking trends in fellowship applications and for formulating training policies. Issues that nephrology training programs face are discussed in this review: (1) A more stringent graduate medical education regulatory environment, (2) the use of the National Resident Matching Program to enhance the nephrology fellowship applicant selection process, (3) future nephrology workforce shortages, and (4) the continued subspecialization of nephrology. By working together, nephrology fellowship programs can overcome barriers that are raised by these issues and improve the fellowship training experience.

31. Durning SJ, Hemmer P, Pangaro LN. The structure of program evaluation: an approach for evaluating a course, clerkship, or components of a residency or fellowship training program. Teach Learn Med 2007;19:308-18. [PMID: 17594228] [DOI: 10.1080/10401330701366796]
Abstract
BACKGROUND Directors of courses, clerkships, residencies, and fellowships are responsible not only for determining whether individual trainees have met educational goals but also for ensuring the quality of the training program itself. The purpose of this article is to discuss a framework for program evaluation that has sufficient rigor to satisfy accreditation requirements yet is flexible and responsive to the uniqueness of individual educational programs. SUMMARY We discuss key aspects of program evaluation, including cardinal definitions, measurements, needed resources, and analyses of qualitative and quantitative data. We propose a three-phase framework for data collection (Before, During, and After) that can be used across undergraduate, graduate, and continuing medical education. CONCLUSIONS This Before, During, and After model is a feasible and practical approach that is sufficiently rigorous to allow for conclusions that can lead to action. It can be readily implemented for new and existing medical education programs.
Affiliation(s)
- Steven J Durning
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland 20814, USA.