1. Salhöfer L, Haubold J, Gutt M, Hosch R, Umutlu L, Meetschen M, Schuessler M, Forsting M, Nensa F, Schaarschmidt BM. The importance of educational tools and a new software solution for visualizing and quantifying report correction in radiology training. Sci Rep 2024; 14:1172. PMID: 38216664; PMCID: PMC10786897; DOI: 10.1038/s41598-024-51462-4.
Abstract
A novel software tool, DiffTool, was developed in-house to track changes made by board-certified radiologists to preliminary reports created by residents and to evaluate its impact on hands-on radiological training. Before (t0) and after (t2-4) the deployment of the software, 18 residents (median age: 29 years; 33% female) completed a standardized questionnaire on professional training. At t2-4, the participants were also asked to respond to three additional questions evaluating the software. Responses were recorded on a six-point Likert scale ranging from 1 ("strongly agree") to 6 ("strongly disagree"). Prior to the release of the software, 39% (7/18) of the residents strongly agreed with the statement that they manually tracked changes made by board-certified radiologists to each of their radiological reports, while 61% were less inclined to agree. At t2-4, 61% (11/18) stated that they used DiffTool to track differences. Furthermore, the share of residents who agreed with the statement "I profit from every corrected report" increased from 33% (6/18) to 44% (8/18). DiffTool was well accepted among residents, with a regular user base of 72% (13/18), while 78% (14/18) considered it a relevant improvement to their training. These results demonstrate the importance of providing a time-efficient way to analyze changes made to preliminary reports as an adjunct to professional training.
Affiliation(s)
- Luca Salhöfer
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Institute for Artificial Intelligence in Medicine, University Hospital Essen, Essen, Germany
- Johannes Haubold
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Institute for Artificial Intelligence in Medicine, University Hospital Essen, Essen, Germany
- Maurice Gutt
- Central IT Services, University Hospital Essen, Essen, Germany
- René Hosch
- Institute for Artificial Intelligence in Medicine, University Hospital Essen, Essen, Germany
- Lale Umutlu
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Mathias Meetschen
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Institute for Artificial Intelligence in Medicine, University Hospital Essen, Essen, Germany
- Maximilian Schuessler
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Institute for Artificial Intelligence in Medicine, University Hospital Essen, Essen, Germany
- Michael Forsting
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Felix Nensa
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
- Institute for Artificial Intelligence in Medicine, University Hospital Essen, Essen, Germany
- Benedikt Michael Schaarschmidt
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Hufelandstr. 55, 45147, Essen, Germany
2. Stewart M, Yang N, Lim R. Provision of feedback to radiology trainees: Barriers and inefficiencies, why it matters and a potential solution. J Med Imaging Radiat Oncol 2023; 67:77-80. PMID: 36480020; DOI: 10.1111/1754-9485.13497.
Affiliation(s)
- Michael Stewart
- Radiology Department, Austin Health, Melbourne, Victoria, Australia
- Natalie Yang
- Radiology Department, Austin Health, Melbourne, Victoria, Australia
- Ruth Lim
- Radiology Department, Austin Health, Melbourne, Victoria, Australia
3. Vosshenrich J, Nesic I, Cyriac J, Boll DT, Merkle EM, Heye T. Revealing the most common reporting errors through data mining of the report proofreading process. Eur Radiol 2020; 31:2115-2125. PMID: 32997178; PMCID: PMC7979672; DOI: 10.1007/s00330-020-07306-6.
Abstract
Objectives: To investigate the most common errors in residents' preliminary reports, whether structured reporting affects error types and frequencies, and to identify possible implications for resident education and patient safety. Materials and methods: Changes in report content were tracked on a word level by a report comparison tool and extracted for 78,625 radiology reports dictated from September 2017 to December 2018 in our department. Following data aggregation according to word stems and stratification by subspecialty (e.g., neuroradiology) and imaging modality, frequencies of additions/deletions were analyzed separately for the findings and impression report sections and compared between subgroups. Results: Overall modifications per report averaged 4.1 words, with demonstrably more changes for cross-sectional imaging (CT: 6.4; MRI: 6.7) than for non-cross-sectional imaging (radiographs: 0.2; ultrasound: 2.8). The four most frequently changed words (right, left, one, and none) remained almost identical among all subgroups (range: 0.072-0.117 per report; once every 9-14 reports). Albeit representing only 0.02% of analyzed words, they accounted for up to 9.7% of all observed changes. Subspecialties solely using structured reporting had substantially lower change ratios in the findings report section (mean: 0.2 per report) compared with prose-style reporting subspecialties (mean: 2.0). Relative frequencies of the most changed words remained unchanged. Conclusion: Residents' most common reporting errors across all subspecialties and modalities are laterality discriminator confusions (left/right) and unnoticed descriptor misregistration by speech recognition (one/none). Structured reporting reduces overall error rates but does not affect the occurrence of the most common errors. Increased error awareness and measures improving report correctness and ensuring patient safety are required.
Key Points: • The two most common reporting errors in residents' preliminary reports are laterality discriminator confusions (left/right) and unnoticed descriptor misregistration by speech recognition (one/none). • Structured reporting reduces the overall error frequency in the findings report section by a factor of 10 (structured reporting: mean 0.2 per report; prose-style reporting: 2.0) but does not affect the occurrence of the two major errors. • Staff radiologist review behavior noticeably differs between radiology subspecialties.
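The word-level change tracking this study relies on can be sketched with Python's standard difflib; the function name and example reports below are illustrative, not the authors' implementation. The sketch counts which words the attending deleted from and added to the resident's preliminary report:

```python
import difflib

def word_level_changes(preliminary: str, final: str):
    """Return (added, deleted) word lists between a resident's
    preliminary report and the attending's final version, roughly
    mirroring word-level change tracking."""
    prelim_words = preliminary.split()
    final_words = final.split()
    added, deleted = [], []
    matcher = difflib.SequenceMatcher(a=prelim_words, b=final_words)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op in ("delete", "replace"):
            deleted.extend(prelim_words[i1:i2])   # words removed by the attending
        if op in ("insert", "replace"):
            added.extend(final_words[j1:j2])      # words introduced by the attending
    return added, deleted

added, deleted = word_level_changes(
    "No focal lesion in the right lobe.",
    "No focal lesion in the left lobe.",
)
# 'right' is deleted and 'left' is added: the classic laterality swap
# the paper identifies as the most common correction
```

Aggregating such (added, deleted) pairs over many thousands of reports, stratified by modality and subspecialty, yields change frequencies of the kind reported above.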
Affiliation(s)
- Jan Vosshenrich
- Department of Radiology, University Hospital Basel, Petersgraben 4, 4031, Basel, Switzerland
- Ivan Nesic
- Department of Radiology, University Hospital Basel, Petersgraben 4, 4031, Basel, Switzerland
- Joshy Cyriac
- Department of Radiology, University Hospital Basel, Petersgraben 4, 4031, Basel, Switzerland
- Daniel T Boll
- Department of Radiology, University Hospital Basel, Petersgraben 4, 4031, Basel, Switzerland
- Elmar M Merkle
- Department of Radiology, University Hospital Basel, Petersgraben 4, 4031, Basel, Switzerland
- Tobias Heye
- Department of Radiology, University Hospital Basel, Petersgraben 4, 4031, Basel, Switzerland
4. Matalon SA, Souza DA, Gaviola GC, Silverman SG, Mayo-Smith WW, Lee LK. Trainee and Attending Perspectives on Remote Radiology Readouts in the Era of the COVID-19 Pandemic. Acad Radiol 2020; 27:1147-1153. PMID: 32507612; PMCID: PMC7245278; DOI: 10.1016/j.acra.2020.05.019.
Abstract
Rationale and Objectives: Social distancing mandates due to COVID-19 have necessitated adaptations to radiology trainee workflow and educational practices, including the radiology "readout." We describe how a large academic radiology department achieved socially distant "remote readouts," provide trainee and attending perspectives on this early experience, and propose ways in which "remote readouts" can be used effectively by training programs beyond COVID-19. Materials and Methods: Beginning March 2020, radiologists were relocated to workspaces outside of conventional reading rooms. Information technologies were employed to allow for "remote readouts" between trainees and attendings. An optional anonymous open-ended survey regarding remote readouts was administered to radiology trainees and attendings as a quality improvement initiative. Response themes were abstracted from the responses using thematic analysis, and descriptive statistics of the qualitative data were calculated. Results: Radiologist workstations from 14 traditional reading rooms were relocated to 36 workspaces across the hospital system. Two models of remote readouts, synchronous and asynchronous, were developed, facilitated by commercially available information technologies. Thirty-nine of 105 (37%) trainees and 42 of 90 (47%) attendings responded to the survey. Main response themes included social distancing, technology, autonomy/competency, efficiency, education/feedback, and atmosphere/professional relationship. One hundred forty-eight positive versus 97 negative comments were reported. Social distancing, technology, and autonomy/competency were most positively rated. Trainee and attending perspectives differed regarding the efficiency of remote readouts. Conclusion: "Remote readouts," compliant with social distancing measures, are feasible in academic radiology practice settings. Perspectives from our initial experience provide insight into how this can be accomplished and reveal opportunities for improvement and future application beyond the COVID-19 pandemic.
5. Ranking Significant Discrepancies in Clinical Reports. Lecture Notes in Computer Science 2020. PMCID: PMC7148074; DOI: 10.1007/978-3-030-45442-5_30.
6. Choi HH, Clark J, Jay AK, Filice RW. Minimizing Barriers in Learning for On-Call Radiology Residents-End-to-End Web-Based Resident Feedback System. J Digit Imaging 2019; 31:117-123. PMID: 28840360; DOI: 10.1007/s10278-017-0015-1.
Abstract
Feedback is an essential part of medical training, where trainees are provided with information regarding their performance and further directions for improvement. In diagnostic radiology, feedback entails a detailed review of the differences between the residents' preliminary interpretation and the attendings' final interpretation of imaging studies. While the on-call experience of independently interpreting complex cases is important to resident education, the more traditional synchronous "read-out" or joint review is impossible due to multiple constraints. Without an efficient method to compare reports, grade discrepancies, convey salient teaching points, and view images, valuable lessons in image interpretation and report construction are lost. We developed a streamlined web-based system, including report comparison and image viewing, to minimize barriers in asynchronous communication between attending radiologists and on-call residents. Our system provides real-time, end-to-end delivery of case-specific and user-specific feedback in a streamlined, easy-to-view format. We assessed quality improvement subjectively through surveys and objectively through participation metrics. Our web-based feedback system improved user satisfaction for both attending and resident radiologists, and increased attending participation, particularly with regard to cases where substantive discrepancies were identified.
Affiliation(s)
- Hailey H Choi
- Department of Radiology, MedStar Georgetown University Hospital, Washington, DC, USA
- Jennifer Clark
- Georgetown University School of Medicine, Washington, DC, USA
- Ann K Jay
- Department of Radiology, MedStar Georgetown University Hospital, Washington, DC, USA
- Ross W Filice
- Department of Radiology, MedStar Georgetown University Hospital, Washington, DC, USA
7. Kelahan LC, Kalaria AD, Filice RW. PathBot: A Radiology-Pathology Correlation Dashboard. J Digit Imaging 2018; 30:681-686. PMID: 28374195; DOI: 10.1007/s10278-017-9969-2.
Abstract
Pathology is considered the "gold standard" of diagnostic medicine. The importance of radiology-pathology correlation is seen in interdepartmental patient conferences such as "tumor boards" and by the tradition of radiology resident immersion in a radiologic-pathology course at the American Institute of Radiologic Pathology. In practice, consistent pathology follow-up can be difficult due to time constraints and cumbersome electronic medical records. We present a radiology-pathology correlation dashboard that presents radiologists with pathology reports matched to their dictations, for both diagnostic imaging and image-guided procedures. In creating our dashboard, we utilized the RadLex ontology and National Center for Biomedical Ontology (NCBO) Annotator to identify anatomic concepts in pathology reports that could subsequently be mapped to relevant radiology reports, providing an automated method to match related radiology and pathology reports. Radiology-pathology matches are presented to the radiologist on a web-based dashboard. We found that our algorithm was highly specific in detecting matches. Our sensitivity was slightly lower than expected and could be attributed to missing anatomy concepts in the RadLex ontology, as well as limitations in our parent term hierarchical mapping and synonym recognition algorithms. By automating radiology-pathology correlation and presenting matches in a user-friendly dashboard format, we hope to encourage pathology follow-up in clinical radiology practice for purposes of self-education and to augment peer review. We also hope to provide a tool to facilitate the production of quality teaching files, lectures, and publications. Diagnostic images have a richer educational value when they are backed up by the gold standard of pathology.
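The matching idea behind the dashboard can be illustrated with a toy sketch: extract anatomy concepts from each report and pair reports whose concept sets overlap. The mini-lexicon and function names below are hypothetical stand-ins for the RadLex ontology and the NCBO Annotator service the authors actually used:

```python
# Hypothetical mini-lexicon mapping surface terms to anatomy concepts;
# RadLex is far larger and hierarchical, and the paper used the NCBO
# Annotator rather than simple word lookup.
ANATOMY_LEXICON = {
    "liver": "liver", "hepatic": "liver",
    "kidney": "kidney", "renal": "kidney",
    "lung": "lung", "pulmonary": "lung",
}

def anatomy_concepts(report_text: str) -> set:
    """Extract the set of anatomy concepts mentioned in a report."""
    words = report_text.lower().replace(".", " ").replace(",", " ").split()
    return {ANATOMY_LEXICON[w] for w in words if w in ANATOMY_LEXICON}

def is_match(radiology_report: str, pathology_report: str) -> bool:
    """Declare a radiology-pathology match when the reports share
    at least one anatomy concept."""
    return bool(anatomy_concepts(radiology_report) & anatomy_concepts(pathology_report))

rad = "CT-guided biopsy of a hepatic lesion."
path = "Liver, core biopsy: well-differentiated hepatocellular carcinoma."
# both reports resolve to the 'liver' concept, so they would be paired
```

The sensitivity limitation the abstract mentions is visible even here: a term missing from the lexicon (e.g., "hepatocellular" is not mapped above) simply yields no concept, so some true pairs go unmatched.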
Affiliation(s)
- Linda C Kelahan
- Department of Radiology, MedStar Georgetown University Hospital, 3800 Reservoir Road NW, Washington, DC, 20007, USA
- Amit D Kalaria
- Department of Radiology, MedStar Georgetown University Hospital, 3800 Reservoir Road NW, Washington, DC, 20007, USA
- Ross W Filice
- Department of Radiology, MedStar Georgetown University Hospital, 3800 Reservoir Road NW, Washington, DC, 20007, USA
- MedStar Medical Group Radiology, Washington, DC, USA
8. Wildenberg JC, Chen PH, Scanlon MH, Cook TS. Attending Radiologist Variability and Its Effect on Radiology Resident Discrepancy Rates. Acad Radiol 2017; 24:694-699. PMID: 28130051; DOI: 10.1016/j.acra.2016.12.004.
Abstract
RATIONALE AND OBJECTIVES: Discrepancy rates for interpretations produced in a call situation are one metric to evaluate residents during training. Current benchmarks, reported in previous studies, do not consider the effects of practice pattern variability among attending radiologists. This study aims to investigate the impact of attending variability on resident discrepancy rates to determine if the current benchmarks are an accurate measure of resident performance and, if necessary, update discrepancy benchmarks to accurately identify residents performing below expectations. MATERIALS AND METHODS: All chest radiographs, musculoskeletal (MSK) radiographs, chest computed tomographies (CTs), abdomen and pelvis CTs, and head CTs interpreted by postgraduate year-3 residents in a call situation over 5 years were reviewed for the presence of a significant discrepancy, and composite results were compared to prior findings. Simulations of the expected discrepancy distribution for an "average resident" were then performed using Gibbs sampling, and this distribution was compared to the actual resident distribution. RESULTS: A strong inverse correlation between resident volume and discrepancy rates was found. There was wide variability among attendings in both overread volume and propensity to issue a discrepancy, although there was no significant correlation between the two. Simulations show that previous benchmarks match well for chest radiographs, abdomen and pelvis CTs, and head CTs but not for MSK radiographs and chest CTs. The simulations also demonstrate a large effect of attending practice patterns on resident discrepancy rates. CONCLUSIONS: The large variability in attending practice patterns suggests direct comparison of residents using discrepancy rates is unlikely to reflect true performance. Current benchmarks for chest radiographs, abdomen and pelvis CTs, and head CTs are appropriate and correctly flag residents whose performance may benefit from additional attention, whereas those for MSK radiographs and chest CTs are likely too strict.
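The simulation idea can be sketched as a simple Monte Carlo model (a deliberate simplification of the paper's Gibbs-sampling approach; the function name, counts, and attending rates below are invented for illustration): each "average resident" interprets the same number of studies, but each study is overread by a randomly assigned attending whose propensity to issue a discrepancy differs.

```python
import random

def simulate_resident_discrepancy_rates(
    n_residents=100, n_reads=400, attending_rates=(0.005, 0.01, 0.03), seed=0
):
    """Monte Carlo sketch: simulate discrepancy rates for identical
    'average residents' whose studies are overread by attendings with
    differing propensities to call a discrepancy. Returns one simulated
    discrepancy rate per resident."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_residents):
        # for each read, pick an attending at random, then draw whether
        # that attending issues a discrepancy for this study
        discrepancies = sum(
            rng.random() < rng.choice(attending_rates) for _ in range(n_reads)
        )
        rates.append(discrepancies / n_reads)
    return rates

rates = simulate_resident_discrepancy_rates()
# even identical simulated residents show a spread of discrepancy rates
# driven purely by which attendings overread their studies
```

The spread of the resulting distribution is what makes a single fixed benchmark misleading: a resident can land in the tail without performing any differently from peers.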
Affiliation(s)
- Joseph C Wildenberg
- Department of Radiology, Hospital of the University of Pennsylvania, 3400 Spruce St., Philadelphia, PA 19104
- Po-Hao Chen
- Department of Radiology, Hospital of the University of Pennsylvania, 3400 Spruce St., Philadelphia, PA 19104
- Mary H Scanlon
- Department of Radiology, Hospital of the University of Pennsylvania, 3400 Spruce St., Philadelphia, PA 19104
- Tessa S Cook
- Department of Radiology, Hospital of the University of Pennsylvania, 3400 Spruce St., Philadelphia, PA 19104