1
Tu W, Hibbert R, Kontolemos M, Dang W, Wood T, Verma R, McInnes MDF. Diagnostic Radiology Residency Assessment Tools: A Scoping Review. Can Assoc Radiol J 2021; 72:651-660. [PMID: 33401932] [DOI: 10.1177/0846537120981581]
Abstract
PURPOSE The multifaceted nature of learning in diagnostic radiology residency requires a variety of assessment methods. However, the scope and quality of assessment tools have not been formally examined. A scoping review was performed to identify assessment tools available for radiology resident training and to evaluate their validity. METHODS A literature search was conducted through multiple databases and online resources. Inclusion criteria were defined as any tool used in the assessment of radiology resident competence. Data regarding residents, evaluators, and the specifics of each tool were extracted. Each tool was subjected to a validation process with a customized rating scale using the 5 categories of validity: content, response process, internal structure, relations to other variables, and consequences. RESULTS The initial search returned 447 articles; 35 were included. The most frequently evaluated competency was overall knowledge (31%), the most common publishing journal was Academic Radiology (24%), and evaluations were most commonly set in the United States (57%). In terms of validation, we found low adherence to modern integrated validity, with 34% of studies including a definition of validity. When the 5 domains of validation evidence were specifically examined, most were either absent or of low rigor (70%). Only one study presented a modern definition of validation (3%, 1/35). CONCLUSION We identified 35 evaluation tools covering a variety of competency areas. However, few of these tools have been validated. Development of new validated assessment tools, or validation of existing tools, is essential for the ongoing transition to a competency-based curriculum.
Affiliation(s)
- Wendy Tu
- Department of Radiology, University of Ottawa, Ontario, Canada
- Rebecca Hibbert
- Department of Radiology, University of Ottawa, Ontario, Canada
- Mario Kontolemos
- Department of Radiology, University of Ottawa, Ontario, Canada
- Wilfred Dang
- Department of Radiology, University of Ottawa, Ontario, Canada
- Tim Wood
- Department of Innovation in Medical Education, University of Ottawa, Ontario, Canada
- Raman Verma
- Department of Radiology, University of Ottawa, Ontario, Canada
- Matthew D F McInnes
- Department of Radiology, University of Ottawa, Ontario, Canada; Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
2
Sistrom CL, Slater RM, Rajderkar DA, Grajo JR, Rees JH, Mancuso AA. Full Resolution Simulation for Evaluation of Critical Care Imaging Interpretation; Part 1: Fixed Effects Identify Influences of Exam, Specialty, Fatigue, and Training on Resident Performance. Acad Radiol 2020; 27:1006-1015. [PMID: 32376185] [DOI: 10.1016/j.acra.2019.11.023]
Abstract
RATIONALE AND OBJECTIVES To describe our full-resolution simulation of critical care imaging, coupled with post hoc grading of residents' interpretations, and to present results from the fixed-effects terms in a comprehensive mixed regression model of the resulting scores. MATERIALS AND METHODS The system delivered full-resolution DICOM studies via clinical-grade viewing software integrated with a custom-built, web-based workflow and reporting system. The interpretations submitted by participating residents from 47 different programs were graded (scores of 0-10) on a case-by-case basis by a cadre of faculty members from our department. The data from 5 yearly (2014-2018) cycles, consisting of 992 separate 65-case, 8-hour simulation sessions, were collated from the transaction records. We used a mixed (hierarchical) statistical model with nine fixed and four random independent variables. In this paper, we present the results from the nine fixed effects. RESULTS There were 19,916/63,839 (27.0%, CI 26.7%-27.4%) scores in the 0-2 range (i.e., clinically significant miss). Neurological cases were more difficult, with adjusted scores 2.3 (CI 1.9-3.2) points lower than body/musculoskeletal cases. There was a small (0.3, CI 0.20-0.38 points) but highly significant (p<0.0001) decrease in score for the final 13/65 cases (fifth quintile), evidence of fatigue during the last hour of an 8-hour shift. By comparing adjusted scores from mid-R1 (quarter 3) to late-R3 (quarter 12), we estimate the training effect as an increase of 2.2 (CI 1.90-2.50) points. CONCLUSION Full-resolution, simulation-based evaluation of critical care radiology interpretation is being conducted remotely and efficiently at large scale. Analysis of the resulting scores yields multiple insights into the interpretative process.
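The fixed-effects portion of such a model (case subspecialty, position of the case within the session, and training quarter predicting a 0-10 score) can be sketched with an ordinary least-squares fit. The data below are simulated, the variable names are hypothetical, and the four random-effects terms the authors modeled are omitted for simplicity; this is an illustration of the approach, not the study's analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Hypothetical predictors mirroring three of the abstract's fixed effects
neuro = rng.integers(0, 2, n)            # 1 = neurological case
fifth_quintile = rng.integers(0, 2, n)   # 1 = among the last 13 of 65 cases
quarter = rng.integers(1, 13, n)         # training quarter, 1..12

# Simulate scores with effect sizes similar in spirit to those reported
score = (5.0 - 2.3 * neuro - 0.3 * fifth_quintile
         + 0.2 * quarter + rng.normal(0, 1.5, n))

# Design matrix: intercept plus the three fixed effects
X = np.column_stack([np.ones(n), neuro, fifth_quintile, quarter])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(dict(zip(["intercept", "neuro", "fatigue", "quarter"], beta.round(2))))
```

With enough cases, the least-squares coefficients recover the simulated case-difficulty, fatigue, and training effects; a full analysis like the authors' would add random effects for resident, case, and grader.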
3
Diaz FN, Ulla M. Validation of an informatics tool to assess resident's progress in developing reporting skills. Insights Imaging 2019; 10:96. [PMID: 31549253] [PMCID: PMC6757078] [DOI: 10.1186/s13244-019-0772-0]
Abstract
Background Diagnostic radiology residency programs pursue, as main objectives, the development of diagnostic capabilities and of the written communication skills needed to answer the questions of referring clinicians. There has also been an increasing focus on competencies, rather than just educational inputs. To demonstrate ongoing professional development, a system is therefore needed to assess and document residents' competence in these areas. We propose the implementation of an informatics tool to objectively assess residents' progress in developing diagnostic and reporting skills. We expected to find decreased variability between preliminary and final reports over the course of each year of the residency program. Results We analyzed 12,162 evaluations from 32 residents (8 residents per year in a 4-year residency program) over a 7-month period; 73.96% of these evaluations belonged to 2nd-year residents. We chose two indicators to study the evolution of the evaluations: the total number of discrepancies over the total number of preliminary reports (excluding score 0), and the total number of likely clinically significant discrepancies (scores 2b, 3b, and 4b) over the total number of preliminary reports (excluding score 0). Analyzing these two indicators over the evaluations of 2nd-year residents, we found a slight decrease in the value of the first indicator and relatively stable behavior of the second. Conclusions This tool is useful for the objective assessment of the reporting skills of radiology residents. It can also provide an opportunity for continuing medical education, with case-based learning from those cases with clinically significant discrepancies between the preliminary and the final report.
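The two indicators can be sketched as a small function. This assumes a RADPEER-style scale in which score 0 marks reports excluded from review, score 1 marks concordance, and the "b" suffix marks a likely clinically significant discrepancy; the abstract names the "b" scores but the rest of the scale is an assumption here.

```python
def discrepancy_indicators(scores):
    """Compute the two indicators from a list of per-report scores
    such as '0', '1', '2a', '2b', '3a', '3b', '4b' (hypothetical scale)."""
    graded = [s for s in scores if s != "0"]           # exclude score 0
    total = len(graded)
    discrepancies = [s for s in graded if s != "1"]    # assumed: 1 = concordant
    significant = [s for s in graded if s in {"2b", "3b", "4b"}]
    return len(discrepancies) / total, len(significant) / total

overall, clinically_significant = discrepancy_indicators(
    ["1", "1", "2a", "2b", "0", "3b", "1"])
```

Tracking these two ratios per resident over time is what lets the tool show whether preliminary-to-final report variability falls during each training year.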
Affiliation(s)
- Facundo N Diaz
- Diagnóstico por Imágenes, Hospital Italiano de Buenos Aires, Juan Domingo Perón 4190, C1199AAB, Ciudad Autónoma de Buenos Aires, Argentina; Facultad de Medicina, Universidad de Buenos Aires, II Cátedra de Anatomía, Buenos Aires, Argentina
- Marina Ulla
- Diagnóstico por Imágenes, Hospital Italiano de Buenos Aires, Juan Domingo Perón 4190, C1199AAB, Ciudad Autónoma de Buenos Aires, Argentina
4
Nye JA, Cuddy M, Ruckdeschel T, Jallow N, Dharmadhikari S, Tang X, Mullins ME. Development of an Academic-Private Collaborative Medical Physics Imaging Residency Training Model. J Am Coll Radiol 2019; 16:355-359. [DOI: 10.1016/j.jacr.2018.08.034]
5
Mendoza D, Holbrook A, Bertino F, Balthazar P, Newell M, Meltzer CC. Supporting Radiology Residents' Professional Development Through a Competitive Intramural Grant. Acad Radiol 2019; 26:286-289. [PMID: 30107959] [DOI: 10.1016/j.acra.2018.07.009]
Abstract
Research and other scholarly activities are an important and required component of diagnostic radiology training. Several strategies, both at the departmental and the larger organizational levels, have been implemented to encourage radiology trainees to participate in these activities. In this article, we review and discuss our institution's 10-year experience in supporting the development and realization of scholarly projects through a competitive intramural grant for residents.
6
Agarwal V, Bump GM, Heller MT, Chen LW, Branstetter BF, Amesur NB, Hughes MA. Resident Case Volume Correlates with Clinical Performance: Finding the Sweet Spot. Acad Radiol 2019; 26:136-140. [PMID: 30087064] [DOI: 10.1016/j.acra.2018.06.023]
Abstract
RATIONALE AND OBJECTIVES To determine whether the total number of studies interpreted during radiology residency correlates with clinical performance as measured by objective criteria. MATERIALS AND METHODS We performed a retrospective cohort study of three graduating classes of radiology residents from a single residency program between the years 2015-2017. The total number of studies interpreted by each resident during residency was tracked. Clinical performance was determined by tracking each resident's major discordance rate. A major discordance was recorded when there was a difference between the preliminary resident interpretation and the final attending interpretation that could immediately impact patient care. Accreditation Council for Graduate Medical Education milestones at the completion of residency, Diagnostic Radiology In-Training examination scores in the third year, and scores from the American Board of Radiology Core Exam were also tabulated. Pearson correlation coefficients and polynomial regression analysis were used to identify correlations between the total number of interpreted studies and clinical, test, and milestone performance. RESULTS Thirty-seven residents interpreted a mean of 12,709 studies (range 8898-19,818; standard deviation [SD] 2351.9) in residency, with a mean major discordance rate of 1.1% (range 0.34%-2.54%; SD 0.49%). There was a nonlinear correlation between the total number of interpreted studies and performance. As the number of interpreted studies increased to approximately 16,000, clinical performance (p = 0.004) and test performance (p = 0.01) improved, but volumes over 16,000 correlated with worse performance. CONCLUSION The total number of studies interpreted during radiology training correlates with performance. Residencies should endeavor to find the "sweet spot": the amount of work that maximizes clinical exposure and knowledge without overburdening trainees.
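A nonlinear volume-performance relationship of this kind can be explored with a quadratic fit, whose vertex estimates the "sweet spot." The numbers below are illustrative only, not the study's data, and this is a generic sketch rather than the authors' analysis.

```python
import numpy as np

# Illustrative (not the study's) case volumes and performance scores
volumes = np.array([9000, 11000, 13000, 15000, 16000, 17500, 19500])
performance = np.array([70.0, 78.0, 84.0, 88.0, 89.0, 87.0, 82.0])

# Fit performance = a*v^2 + b*v + c; an inverted-U relationship gives a < 0
a, b, c = np.polyfit(volumes, performance, deg=2)
sweet_spot = -b / (2 * a)  # vertex of the fitted parabola
print(f"Estimated sweet spot: {sweet_spot:.0f} studies")
```

The vertex formula -b/(2a) gives the volume at which the fitted curve peaks, which is the quantity the abstract's "approximately 16,000" refers to in the real data.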
7
Schiller PT, Phillips AW, Straus CM. Radiology Education in Medical School and Residency: The Views and Needs of Program Directors. Acad Radiol 2018; 25:1333-1343. [PMID: 29748045] [DOI: 10.1016/j.acra.2018.04.004]
Abstract
RATIONALE AND OBJECTIVES The authors of this study used the perspectives of residency program directors (PDs) nationally to explore whether trainees are adequately prepared to utilize and interpret medical imaging as interns, to identify the types of imaging skills most important for residency, and to begin to address current shortcomings in radiology education. MATERIALS AND METHODS The authors created a survey using a modified version of Accreditation Council for Graduate Medical Education radiology milestones and sent it to 100 randomly selected PDs each in pediatrics, internal medicine, obstetrics and gynecology, and general surgery. The survey asked PDs to assess the actual and desired imaging skills of their incoming interns, the incoming interns' variability of skill level upon matriculation, and which imaging skills were most important from the PDs' perspective. RESULTS PDs from all specialties identified a significant shortcoming relative to their expectations for both image interpretation and utilization skills. Additionally, PDs identified a significant variability in imaging skills, and described that variability as a hindrance to their programs. All of the potential imaging skills were rated as highly important with little clinically relevant difference between them. DISCUSSION This multidisciplinary national survey found a deficiency in imaging education among interns across specialties and substantiates calls for formalized and improved radiology education in undergraduate medical education. Additionally, PDs had difficulty distinguishing which skills were most important, suggesting an unclear understanding of imaging ability needs for interns in respective specialties. More specific needs assessments are warranted on a national level.
8
Mendoza D, Peterson R, Ho C, Harri P, Baumgarten D, Mullins ME. Cultivating Future Radiology Educators: Development and Implementation of a Clinician-Educator Track for Residents. Acad Radiol 2018; 25:1227-1231. [PMID: 29731418] [DOI: 10.1016/j.acra.2018.03.032]
Abstract
Effective and dedicated educators are critical to the preservation and advancement of the practice of radiology. The need for innovative and adaptable educators is increasingly being recognized, with several institutions granting academic promotions through clinician-educator tracks. The implementation of resident "clinician-educator tracks" or "teaching tracks" should better prepare residents aspiring to become academic radiologists focused on teaching. In this work, we describe our experience in the development and implementation of a clinician-educator track for diagnostic radiology residents at our institution.
Affiliation(s)
- Dexter Mendoza
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd. NE, D125A, Atlanta, GA 30322
- Ryan Peterson
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd. NE, D125A, Atlanta, GA 30322
- Christopher Ho
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd. NE, D125A, Atlanta, GA 30322
- Peter Harri
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd. NE, D125A, Atlanta, GA 30322
- Deborah Baumgarten
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd. NE, D125A, Atlanta, GA 30322
- Mark E Mullins
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1364 Clifton Rd. NE, D125A, Atlanta, GA 30322
9
Durojaiye AB, Snyder E, Cohen M, Nagy P, Hong K, Johnson PT. Radiology Resident Assessment and Feedback Dashboard. Radiographics 2018; 38:1443-1453. [PMID: 30096050] [DOI: 10.1148/rg.2018170117]
Abstract
Assessment of residents is optimally performed through processes and platforms that provide daily feedback, which can be immediately acted on. Given the documentation required by the Accreditation Council for Graduate Medical Education (ACGME), effective data management, integration, and presentation are crucial to ease the burden of manual documentation and increase the timeliness of actionable information. To this end, the authors modeled the learning activities of residents using the Experience Application Programming Interface (xAPI) framework, which is a standard framework for the learning community. On the basis of the xAPI framework and using open-source software to extend their existing infrastructure, the authors developed a Web-based dashboard that provides residents with a more holistic view of their educational experience. The dashboard was designed around the ACGME radiology milestones and provides real-time feedback to residents using various assessment metrics derived from multiple data sources. The purpose of this article is to describe the dashboard's architecture and components, the design and technical considerations, and the lessons learned in implementing the dashboard. ©RSNA, 2018.
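xAPI records each learning event as an "actor-verb-object" JSON statement, which is the unit of data a dashboard like this one ingests. Below is a minimal example of such a statement; the activity IDs, names, and score are hypothetical, not those used by the authors, though the verb URI is a standard ADL vocabulary entry.

```python
import json

# A minimal xAPI statement: actor (who), verb (did what), object (to what)
statement = {
    "actor": {"name": "Resident A", "mbox": "mailto:resident.a@example.org"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/xapi/activities/chest-ct-dictation",
        "definition": {"name": {"en-US": "Chest CT preliminary report"}},
    },
    "result": {"score": {"scaled": 0.85}},  # e.g. a normalized assessment metric
}
print(json.dumps(statement, indent=2))
```

Because every data source (case logs, evaluations, conference attendance) can be normalized into this one statement shape, a single learning record store can feed milestone-aligned dashboard views.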
Affiliation(s)
- Ashimiyu B Durojaiye
- Department of Radiology, Johns Hopkins Medicine, 601 N Caroline St, Room 4223, Baltimore, MD 21287
- Elizabeth Snyder
- Department of Radiology, Johns Hopkins Medicine, 601 N Caroline St, Room 4223, Baltimore, MD 21287
- Michael Cohen
- Department of Radiology, Johns Hopkins Medicine, 601 N Caroline St, Room 4223, Baltimore, MD 21287
- Paul Nagy
- Department of Radiology, Johns Hopkins Medicine, 601 N Caroline St, Room 4223, Baltimore, MD 21287
- Kelvin Hong
- Department of Radiology, Johns Hopkins Medicine, 601 N Caroline St, Room 4223, Baltimore, MD 21287
- Pamela T Johnson
- Department of Radiology, Johns Hopkins Medicine, 601 N Caroline St, Room 4223, Baltimore, MD 21287
10
Sarkany D, DeBenedectis CM, Morrow M, Sotardi S, Del Re D, DiVito D, Slanetz PJ. Educating Radiology Residents About Patient- and Family-Centered Care: The Time Has Come. J Am Coll Radiol 2018; 15:897-899. [DOI: 10.1016/j.jacr.2018.02.018]
11
Keiper M, Donovan T, DeVries M. Comprehensive Health Care Economics Curriculum and Training in Radiology Residency. J Am Coll Radiol 2018; 15:900-904. [DOI: 10.1016/j.jacr.2018.02.022]
12
Shenoy-Bhangle AS, Eisenberg RL, Fineberg T, Slanetz PJ. Residency Mini-fellowships in the PGY-5 Year: Is There Added Value? Acad Radiol 2018; 25:708-713. [PMID: 29751857] [DOI: 10.1016/j.acra.2017.12.033]
Abstract
RATIONALE AND OBJECTIVES With the restructuring of radiology board certification, many residencies created PGY-5 "mini-fellowships," during which residents spend focused time pursuing advanced subspecialty training or developing nonclinical skills in leadership, health policy and health-care economics, education, quality improvement, informatics, research, or global health. We surveyed graduates of an academic diagnostic radiology residency to assess the relative value and impact of PGY-5 mini-fellowships on career satisfaction and success. METHODS From 2012 to 2016, 39 radiology residents at our institution were offered the opportunity to pursue a 3- to 6-month mini-fellowship during the PGY-5 year. Thirty of the 39 radiology residents (77%) participated, whereas 9 of 39 (23%) opted out. Of the 39 residents, 13 completed two clinical mini-fellowships, 3 completed research mini-fellowships only, and 14 completed one nonclinical and one clinical mini-fellowship. Through SurveyMonkey, 23 of 39 residents (59%) responded to a questionnaire that collected basic demographic information and asked respondents about the value of this experience as it relates to fellowship choice and career, using a five-point Likert scale. RESULTS Of the 23 respondents (14 male, 8 female, 1 not specified), 78.3% practice in an academic university-based setting, 8.7% in a community-based hospital practice, 4.3% in the veterans system, and 4.3% in a private practice setting. The most popular clinical mini-fellowships were magnetic resonance imaging (31.6%), neuroradiology (21.1%), and interventional radiology (15.8%); the most popular nonclinical mini-fellowships were research (10.5%), education (10.5%), global health (5.3%), and health-care economics (5.3%). Of the respondents who completed mini-fellowships, 95% felt that the mini-fellowship prepared them well for their career, 85% felt it gave them the necessary skills to succeed, 85% cited that it gave them additional skills beyond their peers, and 40% felt it helped them create a life-long connection to a mentor. Ninety-five percent of respondents would choose to do the mini-fellowship again. Respondents suggested increasing the duration to 6-9 months and developing a more structured curriculum and mentorship component. Only one respondent felt that the nonclinical mini-fellowship took away time from furthering clinical skills. CONCLUSIONS Graduates of a university-affiliated academic radiology residency who participated in clinical and nonclinical mini-fellowships during the PGY-5 year greatly value this experience and uniformly recommend that this type of program continue to be offered to trainees, given its ability to develop skills perceived to be vital to ultimate career satisfaction and success.
Affiliation(s)
- Anuradha S Shenoy-Bhangle
- Department of Radiology, Beth Israel Deaconess Medical Center, 330 Brookline Ave, Boston, MA 02215; Harvard Medical School, 25 Shattuck St, Boston, MA 02215
- Ronald L Eisenberg
- Department of Radiology, Beth Israel Deaconess Medical Center, 330 Brookline Ave, Boston, MA 02215; Harvard Medical School, 25 Shattuck St, Boston, MA 02215
- Tabitha Fineberg
- Department of Radiology, Beth Israel Deaconess Medical Center, 330 Brookline Ave, Boston, MA 02215
- Priscilla J Slanetz
- Department of Radiology, Beth Israel Deaconess Medical Center, 330 Brookline Ave, Boston, MA 02215; Harvard Medical School, 25 Shattuck St, Boston, MA 02215
13
Teaching and Assessing Professionalism in Radiology: Resources and Scholarly Opportunities to Contribute to Required Expectations. Acad Radiol 2018; 25:599-609. [PMID: 29478920] [DOI: 10.1016/j.acra.2018.01.008]
Abstract
Teaching and assessing trainees' professionalism now represents an explicit expectation for Accreditation Council for Graduate Medical Education (ACGME)-accredited radiology programs. Challenges to meeting this expectation include variability in defining the construct of professionalism; the limits, for professionalism, of traditional teaching and assessment methods used for competencies historically more prominent in medical education; and emerging expectations for credible and feasible professionalism teaching and assessment practices in the current context of health-care training and practice. This article identifies promising teaching resources and methods that can be used strategically to augment traditional teaching of the cognitive basis for professionalism, including role modeling, case-based scenarios, debriefing, simulations, narrative medicine (storytelling), guided discussions, peer-assisted learning, and reflective practice. It also summarizes assessment practices intended to promote learning, as well as to inform how and when to assess trainees as their professional identities develop over time, settings, and autonomous practice, particularly in terms of measurable behaviors. These include assessment tools (such as mini-observations, critical incident reports, and appreciative inquiry) for authentic assessment in the workplace; engaging multiple sources (self, peers, other health professionals, and patients) in assessment; and intentional practices through which trainees take responsibility for seeking out actionable feedback and reflection. The article examines the emerging evidence on the feasibility and added value of assessment of medical competency milestones, including professionalism, coordinated by the ACGME in radiology and other medical specialties. Radiology has a strategic opportunity to contribute to scholarship and inform policies in professionalism teaching and assessment practices.
14
Kelly AM, Mullan PB. Designing a Curriculum for Professionalism and Ethics Within Radiology: Identifying Challenges and Expectations. Acad Radiol 2018; 25:610-618. [PMID: 29580789] [DOI: 10.1016/j.acra.2018.02.007]
Abstract
Although professionalism and ethics represent required competencies, they are more challenging than other competencies to design a curriculum for and to teach. Reasons include variability in the agreed definitions of professionalism within medicine and radiology; the competency is also framed differently, whether as roles, duties, actions, skills, behaviors, beliefs, or attitudes. Standardizing a curriculum in professionalism is difficult because each learner's (medical student's or resident's) professional experiences and interactions will be unique. Professionalism is intertwined throughout all (sub)specialties and areas, and its teaching cannot occur in isolation as a standalone curriculum. In the past, professionalism was not emphasized enough, or at all, with global (or no) assessments, with the potential effect of trainees not valuing it. Although we can teach it formally in the classroom and informally in small groups, much of professionalism is witnessed and learned as a "hidden curriculum." The formal, informal, and hidden curricula often contradict each other, creating confusion, disillusionment, and cynicism in trainees. The corporatization of medicine pressures us to increase efficiency (throughput) with less focus on the aspects of professionalism that add value, creating a disconnect between what we do in practice and what we preach to trainees. Progressively, expectations for our curriculum include providing evidence of the impact of our efforts on patient outcomes. Generational differences in the perception of professionalism, and the increasingly diverse and multicultural society in which we live, affect our interpretation of professionalism, which can add to confusion and misunderstanding. The objectives of this article are to outline the challenges facing curriculum design in professionalism and to make suggestions to help educators avoid or overcome them.
15
Itri JN, Yacob S, Mithqal A. Teaching Communication Skills to Radiology Residents. Curr Probl Diagn Radiol 2017; 46:377-381. [DOI: 10.1067/j.cpradiol.2017.01.005]
16
Robbins JB, Sarkany D. Self-Study: Practical Tips for a Successful and Rewarding Experience. Acad Radiol 2017; 24:721-724. [PMID: 28262521] [DOI: 10.1016/j.acra.2016.10.015]
Abstract
The Accreditation Council for Graduate Medical Education (ACGME) self-study is a new process for ACGME-accredited radiology programs. This article traces the evolution of ACGME accreditation leading to the conception of the self-study process, details the self-study method, and offers practical advice to programs embarking upon their inaugural self-study.
17
Heitkamp DE, Johnson KS, Suh RD, Bedi HS, Oldham SA, Ho CP, Paladin AM. Sustaining Change in Radiology Education: The Need for Universal Curricula. J Am Coll Radiol 2017; 14:804-807. [DOI: 10.1016/j.jacr.2017.02.047]
18
Development of a Standardized Kalamazoo Communication Skills Assessment Tool for Radiologists: Validation, Multisource Reliability, and Lessons Learned. AJR Am J Roentgenol 2017; 209:351-357. [PMID: 28537754] [DOI: 10.2214/ajr.16.17439]
Abstract
OBJECTIVE The purpose of this study was to develop and test a standardized communication skills assessment instrument for radiology. MATERIALS AND METHODS The Delphi method was used to validate the Kalamazoo Communication Skills Assessment instrument for radiology by revising and achieving consensus on the 43 items of the preexisting instrument among an interdisciplinary team of experts consisting of five radiologists and four nonradiologists (two men, seven women). Reviewers assessed the applicability of the instrument to evaluation of conversations between radiology trainees and trained actors portraying concerned parents in enactments about bad news, radiation risks, and diagnostic errors that were video recorded during a communication workshop. Interrater reliability was assessed by use of the revised instrument to rate a series of enactments between trainees and actors video recorded in a hospital-based simulator center. Eight raters evaluated each of seven different video-recorded interactions between physicians and parent-actors. RESULTS The final instrument contained 43 items. After three review rounds, 42 of 43 (98%) items had an average rating of relevant or very relevant for bad news conversations. All items were rated as relevant or very relevant for conversations about error disclosure and radiation risk. Reliability and rater agreement measures were moderate. The intraclass correlation coefficient range was 0.07-0.58; mean, 0.30; SD, 0.13; and median, 0.30. The range of weighted kappa values was 0.03-0.47; mean, 0.23; SD, 0.12; and median, 0.22. Ratings varied significantly among conversations (χ² = 1186, df = 6; p < 0.0001) and varied significantly by viewing order, rater type, and rater sex. CONCLUSION The adapted communication skills assessment instrument is highly relevant for radiology, having moderate interrater reliability. These findings have important implications for assessing the relational competencies of radiology trainees.
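For reference, a weighted kappa of the kind reported can be computed as below. This is a generic linear-weighted implementation for two raters scoring the same items on an ordinal 0..k-1 scale, not the authors' code; the study's exact weighting scheme is not stated in the abstract.

```python
import numpy as np

def linear_weighted_kappa(r1, r2, k):
    """Linear weighted kappa for two raters over an ordinal 0..k-1 scale."""
    O = np.zeros((k, k))                       # observed rating-pair frequencies
    for a, b in zip(r1, r2):
        O[a, b] += 1
    O /= O.sum()
    E = np.outer(O.sum(axis=1), O.sum(axis=0))  # expected under independence
    idx = np.arange(k)
    W = np.abs(idx[:, None] - idx[None, :]) / (k - 1)  # disagreement weights
    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields 1.0 and chance-level agreement 0.0, so the reported mean of 0.23 indicates only modest agreement beyond chance.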
|
19
|
Meyer EC, Lamiani G, Luff D, Brown SD. Voices emerging from the shadows: Radiologic practitioners' experiences of challenging conversations. PATIENT EDUCATION AND COUNSELING 2017; 100:133-138. [PMID: 27639514 DOI: 10.1016/j.pec.2016.07.033] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/15/2016] [Revised: 07/27/2016] [Accepted: 07/29/2016] [Indexed: 06/06/2023]
Abstract
OBJECTIVE Traditionally, radiologists have practiced their profession behind the scenes. Today, radiologic practitioners face mounting expectations to communicate more directly with patients. However, their experiences with patient communication are not well understood. The aim of this study was to describe the challenges of radiologic practitioners when communicating with patients. METHODS Twelve day-long interprofessional communication skills workshops for radiologic clinicians were held at Boston Children's Hospital. Prior to each workshop, participants were asked to write narratives describing experiences with difficult radiologic conversations that they found particularly challenging or satisfying. The narratives were transcribed and analyzed through thematic content analysis by two researchers. RESULTS Radiologists, radiology trainees, technologists, nurses, and medical interpreters completed 92 narratives. The most challenging aspects of healthcare conversations included: Conveying Serious News (n=44/92; 48%); Expanded Scope of Radiologic Practice (n=37/92; 40%); Inexperience and Gaps in Education (n=15/92; 16%); Clinical Uncertainty (n=14/92; 15%); and Interprofessional Teamwork (n=9/92; 10%). CONCLUSION Radiologic clinicians face substantial communicative challenges focused on conveying serious, unexpected and uncertain diagnoses amid practical challenges and limited educational opportunities. PRACTICE IMPLICATIONS Innovative educational curricula that address these challenges may enhance radiologic practitioners' success in adopting patient-centered communication.
Affiliation(s)
- Elaine C Meyer
- Boston Children's Hospital and Harvard Medical School, 300 Longwood Ave, Boston, MA, USA.
- Giulia Lamiani
- Department of Health Sciences, Università degli Studi di Milano, San Paolo University Hospital, Via Di Rudinì 8, Milan, Italy.
- Donna Luff
- Boston Children's Hospital and Harvard Medical School, 300 Longwood Ave, Boston, MA, USA.
- Stephen D Brown
- Boston Children's Hospital and Harvard Medical School, 300 Longwood Ave, Boston, MA, USA.
|
22
|
Guy C. Genetic Counseling Milestones: A Framework for Student Competency Evaluation. J Genet Couns 2015; 25:635-43. [PMID: 26462934 DOI: 10.1007/s10897-015-9895-8] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2015] [Accepted: 09/23/2015] [Indexed: 11/25/2022]
Abstract
Graduate medical education has recently increased its focus on the development of medical specialty competency milestones to provide a targeted tool for medical resident evaluation. Milestones provide developmental assessment of the attainment of competencies over the course of an educational program. An educational framework is described for developing Genetic Counseling Milestones to evaluate genetic counseling students' attainment of competencies. Genetic Counseling Milestones may provide a valuable tool to assess genetic counseling students across all program activities. Historical educational context, current practices, and potential benefits and challenges in developing Genetic Counseling Milestones are discussed.
Affiliation(s)
- Carrie Guy
- Department of Pediatrics, Division of Genetics, Masters in Genetic Counseling Program, University of Oklahoma Health Science Center, 1200 Children's Ave, Ste 12100, Oklahoma City, OK, 73104, USA.
|
23
|
Schmitt JE, Scanlon MH, Servaes S, Levin D, Cook TS. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology. Acad Radiol 2015; 22:1287-93. [PMID: 25920551 DOI: 10.1016/j.acra.2015.02.013] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2014] [Revised: 12/19/2014] [Accepted: 02/16/2015] [Indexed: 11/25/2022]
Abstract
RATIONALE AND OBJECTIVES The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. MATERIALS AND METHODS We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for data management, processing, and reporting. RESULTS After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles summarize data and also allow program directors access to more granular information as needed. CONCLUSION Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff.
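The resident-specific profiling the authors describe can be sketched with free tools. Here is a minimal pandas example that aggregates raw assessment data points into a per-resident, per-milestone summary; the resident names, milestone codes, and field names are hypothetical, not the authors' actual schema.

```python
import pandas as pd

# Hypothetical assessment records, mimicking the kind of point-of-care
# data points a residency program might collect (illustrative values)
records = pd.DataFrame({
    "resident": ["Lee", "Lee", "Lee", "Patel", "Patel", "Patel"],
    "milestone": ["MK1", "MK1", "PC2", "MK1", "PC2", "PC2"],
    "score": [3, 4, 2, 4, 3, 4],  # 1-5 milestone levels
})

# Resident-specific profile: mean level and number of observations per
# milestone -- the kind of summary a program director might review
profile = (records
           .groupby(["resident", "milestone"])["score"]
           .agg(mean_level="mean", n="count")
           .reset_index())
print(profile)
```

From a summary like this, drilling down to the granular records for any resident is a simple filter on the original frame, which matches the two-level reporting (summary plus detail) described in the abstract.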
|
24
|
Rachlin S, Schonberger A, Nocera N, Acharya J, Shah N, Henkel J. Continuous Certification Within Residency: An Educational Model. Acad Radiol 2015; 22:1294-8. [PMID: 26314498 DOI: 10.1016/j.acra.2015.07.009] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2014] [Revised: 07/31/2015] [Accepted: 07/31/2015] [Indexed: 11/18/2022]
Abstract
Maintaining compliance with Maintenance of Certification is necessary for maintaining licensure to practice as a radiologist and for providing quality patient care. It is therefore important for radiology residents to practice fulfilling each part of the program during their training, not only to prepare for success after graduation but also to learn best practices from the beginning of their professional careers. This article discusses ways to implement continuous certification (called Continuous Residency Certification) as an educational model within the residency training program.
Affiliation(s)
- Susan Rachlin
- Department of Radiology, New York Medical College, 40 Sunshine Cottage Road, Valhalla, New York.
- Jay Acharya
- Department of Radiology, Westchester Medical Center
|
26
|
The use of ACR Appropriateness Criteria: a survey of radiology residents and program directors. Clin Imaging 2014; 39:334-8. [PMID: 25457568 DOI: 10.1016/j.clinimag.2014.10.011] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2014] [Revised: 10/03/2014] [Accepted: 10/20/2014] [Indexed: 11/23/2022]
Abstract
PURPOSE Assess the utilization of American College of Radiology Appropriateness Criteria (ACR-AC) among radiology residency program directors (PDs) and residents. METHODS Radiology PD and resident survey. RESULTS Seventy-four percent (46/62) of PDs promote ACR-AC in education (P<.05), and 84% (317/376) of residents have read at least a few (P<.05). Seventy-four percent (74/100) of first-year residents compared to 56.8% (157/276) of second- to fourth-year residents report at least occasional faculty reference of ACR-AC (P<.05). ACR-AC are well regarded (P<.05), but 40% believe that they are perplexing. CONCLUSION There is widespread resident awareness of ACR-AC and integration into resident training. However, faculty are only beginning to teach with them, and radiologists are not citing them with clinicians.
|
28
|
Section editor's notebook: Using self-assessment modules to document milestone achievement in radiology residency training. AJR Am J Roentgenol 2013; 201:1184-5. [PMID: 24261354 DOI: 10.2214/ajr.13.11452] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
|