1
Ryan SL, Logan M, Liu X, Shahian DM, Mort E. Long-Term Sustainability and Adaptation of I-PASS Handovers. Jt Comm J Qual Patient Saf 2023;49:689-697. PMID: 37648628. DOI: 10.1016/j.jcjq.2023.07.007.
Abstract
BACKGROUND Inadequate communication during transitions of care is a major health care quality and safety vulnerability. In 2013 Massachusetts General Hospital (MGH) embarked on a comprehensive training program using a standardized handover system (I-PASS) that had been shown to reduce adverse events by 30%, even when not completely executed for each patient. In this cross-sectional study, the authors sought to characterize handover practices six years later. METHODS Using a standardized interview tool, the researchers evaluated handovers between responding clinicians in 10 departments and then validated these findings through direct observations, allowing for flexibility and customization in the I-PASS elements. The study qualitatively compared I-PASS element use in verbal handovers with MGH early postintervention data, as well as verbal and written handovers with the I-PASS Study Group's postintervention results. RESULTS The authors observed 156 verbal and reviewed 182 written patient handovers. Ninety percent of departments adhered at least partially to the I-PASS system. Average handover duration ranged from 0.6 to 2.1 minutes per established patient. The service with the best I-PASS adherence also consistently included the most information per unit of time. Acknowledging substantial differences in study technique, MGH adherence was, on average, comparable to or better than the I-PASS Study Group's postintervention results on all I-PASS elements in verbal handovers and on three of four elements of written handovers. CONCLUSION Although uptake has varied across services, six years after hospitalwide implementation of I-PASS, the majority of services are performing structured and sequenced handovers, most of which include some elements of the I-PASS system. The services with the best I-PASS adherence conducted the most efficient handovers.
2
Moore M, Bain-Donohue S, Barry M, Gray P. It sounds like a good handover but can I trust it: the correlation between perceived quality and accuracy? MedEdPublish (2016) 2021;10:102. PMID: 38486591. PMCID: PMC10939514. DOI: 10.15694/mep.2021.000102.1.
Abstract
Background Safe handover is crucial in healthcare and is taught in undergraduate and pre-vocational training curricula. It is now considered an Entrustable Professional Activity (EPA). Handover assessment tools have been developed, but the correlation between the perceived quality of a handover and its accuracy has not been studied. Aims This paper aims to determine the correlation between the perceived quality and the accuracy and safety of handover. Methods This descriptive, quantitative study looked at medical students on long-term rural clinical placements who gave clinical handovers to supervisors. The supervisors scored the handovers using the Clinical Handover Assessment Tool (CHAT) and assessed the accuracy and safety of the handover after seeing the patient. The correlation between handover scores, accuracy, and safety was calculated using Cramer's V coefficient. Results 114 handovers from 25 students were assessed. The correlation coefficient for a global assessment of quality and accuracy was 0.585 and for safety was 0.583, considered large effects (>0.35). This also held using a checklist quality assessment, but less strongly: 0.419 and 0.363, respectively. Conclusion These findings suggest that handovers that sound 'good' are likely to be accurate: clinicians can 'trust their gut feeling'. A high-quality handover reflects more than the trainee's clinical reasoning, communication, and organisational skills: it suggests that they can provide accurate and safe handover. This supports the use of global assessments of handover as an important part of the multi-source feedback required for summative entrustment decision-making.
Affiliation(s)
- Molly Barry
- Australian National University Medical School
3
Kashiouris MG, Stefanou C, Sharma D, Yshii-Tamashiro C, Vega R, Hartigan S, Albrecht C 3rd, Brown RH. A Handoffs Software Led to Fewer Errors of Omission and Better Provider Satisfaction: A Randomized Control Trial. J Patient Saf 2020;16:194-198. PMID: 28230581. DOI: 10.1097/PTS.0000000000000340.
Abstract
BACKGROUND Computer-assisted communication has been shown to prevent critical omissions ("errors") in the handoff process. OBJECTIVE The aim of the study was to examine this effect and related provider satisfaction, using standardized software. METHODS Fourteen internal medicine house officers, staffing six day teams and one cross-covering team, were randomized to either the intervention group or a control group employing the usual handoff, so that handoff information was exchanged only between same-group subjects (daily, for 28 days). RESULTS In the intervention group, fewer omissions (among those studied) occurred for intravenous access (17 versus 422, P < 0.001), code status (1 versus 158, P < 0.001), diet/nothing per mouth (28 versus 477, P < 0.001), and deep venous thrombosis prophylaxis (17 versus 284, P < 0.001); the time needed to compose the handoff was similar; and physicians perceived less workload adjusted for patient census and provider characteristics (P = 0.004), as well as better handoff quality (P < 0.001) and clarity (P < 0.001). CONCLUSIONS The intervention was associated with fewer errors and superior provider satisfaction.
4
Young JQ, Sugarman R, Schwartz J, O'Sullivan PS. Faculty and Resident Engagement With a Workplace-Based Assessment Tool: Use of Implementation Science to Explore Enablers and Barriers. Acad Med 2020;95:1937-1944. PMID: 32568853. DOI: 10.1097/ACM.0000000000003543.
Abstract
PURPOSE Implementation of workplace-based assessment programs has encountered significant challenges. Faculty and residents alike often have a negative view of these programs as "tick-box" or "jump through the hoops" exercises. A number of recommendations have been made to address these challenges. To understand the experience with a workplace-based assessment tool that follows many of these recommendations, the authors conducted a qualitative study using the Consolidated Framework for Implementation Research (CFIR) to identify enablers and barriers to engagement with the tool. METHOD The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) is a direct observation tool designed to assess resident performance during a psychiatric medication management visit. From August 2017 to February 2018, the P-SCO was implemented in the outpatient continuity clinics for second- and third-year residents at Zucker Hillside Hospital/Northwell Health. In February and March 2019, the authors conducted semistructured interviews of participating faculty and residents. Interview guides based on the CFIR were used to capture the enablers and barriers to engagement. Interview transcripts were independently coded. Codes were then organized into themes relevant to the domains of the CFIR. RESULTS Ten faculty and 10 residents were interviewed. Overall, participants had a positive experience with the P-SCO. Enabling factors for faculty and residents included the ongoing training, design features of the P-SCO, predisposing beliefs, dedicated faculty time, and the perception that the P-SCO improved verbal feedback quality. Barriers for faculty included checklist length and discomfort with feedback that threatens identity; barriers for residents included faculty variability in timeliness and quality of feedback and minimal review of the feedback after initial receipt. CONCLUSIONS This study demonstrates that the negative experience of faculty and residents with workplace-based assessment tools shown in prior studies can be overcome, at least in part, when specific implementation strategies are pursued. The findings provide guidance for future research and implementation efforts.
Affiliation(s)
- John Q Young
- J.Q. Young is professor of psychiatry and vice chair for education, Department of Psychiatry, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York, and Zucker Hillside Hospital at Northwell Health, Glen Oaks, New York
- Rebekah Sugarman
- R. Sugarman is a research assistant, Department of Psychiatry, Zucker Hillside Hospital at Northwell Health, Glen Oaks, New York
- Jessica Schwartz
- J. Schwartz is a resident, Department of Psychiatry, Zucker Hillside Hospital at Northwell Health, Glen Oaks, New York
- Patricia S O'Sullivan
- P.S. O'Sullivan is professor, Department of Medicine, and director of research and development in medical education, University of California, San Francisco, School of Medicine, San Francisco, California
5
6
Young JQ, Sugarman R, Schwartz J, McClure M, O'Sullivan PS. A mobile app to capture EPA assessment data: Utilizing the consolidated framework for implementation research to identify enablers and barriers to engagement. Perspect Med Educ 2020;9:210-219. PMID: 32504446. PMCID: PMC7459074. DOI: 10.1007/s40037-020-00587-z.
Abstract
INTRODUCTION Mobile apps that utilize the framework of entrustable professional activities (EPAs) to capture and deliver feedback are being implemented. If EPA apps are to be successfully incorporated into programmatic assessment, a better understanding of how they are experienced by the end-users will be necessary. The authors conducted a qualitative study using the Consolidated Framework for Implementation Research (CFIR) to identify enablers and barriers to engagement with an EPA app. METHODS Structured interviews of faculty and residents were conducted with an interview guide based on the CFIR. Transcripts were independently coded by two study authors using directed content analysis. Differences were resolved via consensus. The study team then organized codes into themes relevant to the domains of the CFIR. RESULTS Eight faculty and 10 residents chose to participate in the study. Both faculty and residents found the app easy to use and effective in facilitating feedback immediately after the observed patient encounter. Faculty appreciated how the EPA app forced brief, distilled feedback. Both faculty and residents expressed positive attitudes and perceived the app as aligned with the department's philosophy. Barriers to engagement included faculty not understanding the EPA framework and scale, competing clinical demands, residents preferring more detailed feedback and both faculty and residents noting that the app's feedback should be complemented by a tool that generates more systematic, nuanced, and comprehensive feedback. Residents rarely if ever returned to the feedback after initial receipt. DISCUSSION This study identified key enablers and barriers to engagement with the EPA app. The findings provide guidance for future research and implementation efforts focused on the use of mobile platforms to capture direct observation feedback.
Affiliation(s)
- John Q Young
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and the Zucker Hillside Hospital at Northwell Health, Hempstead, NY, USA
- Rebekah Sugarman
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and the Zucker Hillside Hospital at Northwell Health, Hempstead, NY, USA
- Jessica Schwartz
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and the Zucker Hillside Hospital at Northwell Health, Hempstead, NY, USA
- Matthew McClure
- Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and the Zucker Hillside Hospital at Northwell Health, Hempstead, NY, USA
- Patricia S O'Sullivan
- Department of Medicine, University of California at San Francisco School of Medicine, San Francisco, USA
7
Sun LW, Stahulak AL, Costakos DM. Utilizing Electronic Medical Records to Standardize Handoffs in Academic Ophthalmology. Journal of Academic Ophthalmology 2020. DOI: 10.1055/s-0040-1718566.
Abstract
Purpose Formalized handoff procedures have been shown to increase patient safety and quality of care across multiple medical and surgical specialties,1–4 but literature regarding handoffs in ophthalmology remains sparse. We instituted a standardized handoff utilizing an electronic medical record (EMR) system to improve care for patients shared by multiple resident physicians across weekday, weeknight, and weekend duty shifts. We measured efficiency, efficacy, and resident satisfaction before and after the standardized handoff was implemented.
Methods Resident physicians surveyed were primarily responsible for patient care on consult and call services at two quaternary academic medical centers in a major metropolitan area. Patient care was performed in outpatient, emergency, and inpatient settings. Annual anonymous questionnaires consisting of 6 questions were used to collect pre- and postintervention impressions of the standardized EMR handoff process from ophthalmology resident physicians (9 per year; 3 preintervention years and 1 postintervention year). An additional anonymous postintervention questionnaire consisting of 12 questions was used to further characterize resident response to the newly implemented handoff procedure.
Results Prior to implementation of a standardized EMR-based handoff procedure, residents unanimously reported incomplete, infrequently updated handoff reports that did not include important clinical and/or psychosocial information. Following implementation, residents reported a statistically significant increase in the completeness and timeliness of handoff reports. Additionally, resident perceptions of EMR handoff utility, efficiency, and usability were uniformly favorable. Residents reported that handoffs added a mean of only 6.5 minutes to a typical duty shift.
Conclusion Implementation of our protocol dramatically improved resident perceptions of the handoff process at our institution. Improvements included increased quality, ease-of-use, and efficiency. Our standardized EMR-based handoff procedure may be of use to other ambulatory-based services.
Affiliation(s)
- Lynn W. Sun
- Department of Ophthalmology, Medical College of Wisconsin, Milwaukee, Wisconsin
- Andrea L. Stahulak
- Department of Ophthalmology, Medical College of Wisconsin, Milwaukee, Wisconsin
- Deborah M. Costakos
- Department of Ophthalmology, Medical College of Wisconsin, Milwaukee, Wisconsin
8
Young JQ. Advancing Our Understanding of Narrative Comments Generated by Direct Observation Tools: Lessons From the Psychopharmacotherapy-Structured Clinical Observation. J Grad Med Educ 2019;11:570-579. PMID: 31636828. PMCID: PMC6795331. DOI: 10.4300/jgme-d-19-00207.1.
Abstract
BACKGROUND While prior research has focused on the validity of quantitative ratings generated by direct observation tools, much less is known about the written comments. OBJECTIVE This study examines the quality of written comments and their relationship with checklist scores generated by a direct observation tool, the Psychopharmacotherapy-Structured Clinical Observation (P-SCO). METHODS From 2008 to 2012, faculty in a postgraduate year 3 psychiatry outpatient clinic completed 601 P-SCOs. Twenty-five percent were randomly selected from each year; the sample included 8 faculty and 57 residents. To assess quality, comments were coded for valence (reinforcing or corrective), behavioral specificity, and content. To assess the relationship between comments and scores, the authors calculated the correlation between comment and checklist score valence and examined the degree to which comments and checklist scores addressed the same content. RESULTS Ninety-one percent of the comments were behaviorally specific. Sixty percent were reinforcing, and 40% were corrective. Eight themes were identified, including 2 constructs not adequately represented by the checklist. Comment and checklist score valence was moderately correlated (Spearman's rho = 0.57, P < .001). Sixty-seven percent of high and low checklist scores were associated with a comment of the same valence and content. Only 50% of overall comments were associated with a checklist score of the same valence and content. CONCLUSIONS A direct observation tool such as the P-SCO can generate high-quality written comments. Narrative comments both explain checklist scores and convey unique content. Thematic coding of comments can improve the content validity of a checklist.
9
Abstract
Communication errors during transitions of care are a leading source of adverse events for hospitalized patients. This article provides an overview of the role of communication errors in adverse events, describes the complexities of communication for hospitalized patients, and provides evidence regarding the positive effects of applying high-reliability principles to transitions of care and culture of safety. Elements of effective handoffs and a detailed approach for successful implementation of a handoff program are provided. The role of handoff communication in medical education at all levels, as well as for the interprofessional team, is discussed.
Affiliation(s)
- Shilpa J Patel
- John A. Burns School of Medicine, Kapi`olani Medical Center for Women & Children, Hawaii Pacific Health, 1319 Punahou Street, 7th Floor, Honolulu, HI 96826, USA
- Christopher P Landrigan
- Boston Children's Hospital, Brigham & Women's Hospital, Harvard Medical School, 300 Longwood Avenue, Enders 1, Boston, MA 02115, USA
10
Young JQ, Rasul R, O'Sullivan PS. Evidence for the Validity of the Psychopharmacotherapy-Structured Clinical Observation Tool: Results of a Factor and Time Series Analysis. Acad Psychiatry 2018;42:759-764. PMID: 29951950. DOI: 10.1007/s40596-018-0928-0.
Abstract
OBJECTIVE The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) is a direct observation tool designed to assess resident performance of a medication visit. This study examines two dimensions of validity for the P-SCO: internal structure and how scores correlate with another variable associated with competence (experience). METHODS The faculty completed 601 P-SCOs over 4 years. Multilevel exploratory factor analysis was performed with minimum thresholds for eigenvalue (≥ 1.0) and proportion of variance explained (≥ 5.0%). Internal reliability was assessed with Cronbach alpha. To examine how scores changed with experience, mean ratings (1-4 scale) were calculated for each factor by quarter of the academic year. Separate linear mixed models were also performed. RESULTS The analysis yielded three factors that explained 50% of the variance and demonstrated high internal reliability: affective tasks (alpha = 0.90), cognitive tasks (alpha = 0.84), and hard tasks (alpha = 0.74). Items within "hard tasks" were assessment of substance use, violence risk, and adherence, and inquiry about interactions with other providers. Monitoring adverse effects did not load on the hard task factor but also had overall low mean ratings. Compared to the first quarter, fourth quarter scores for affective tasks (b = 0.54, p < 0.01) and hard tasks (b = 0.46, p = 0.02) were significantly improved while cognitive tasks had a non-significant increase. For the hard tasks, the proportion of residents with a low mean rating improved but was still over 30% during the fourth quarter. CONCLUSIONS The results provide evidence for the validity of the P-SCO with respect to its internal structure and how scores correlate with experience. Curricular implications are explored, especially for the tasks that were hard to learn.
Affiliation(s)
- John Q Young
- Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA
11
Festekjian A, Mody AP, Chang TP, Ziv N, Nager AL. Novel Transfer of Care Sign-out Assessment Tool in a Pediatric Emergency Department. Acad Pediatr 2018;18:86-93. PMID: 28843485. DOI: 10.1016/j.acap.2017.08.009.
Abstract
OBJECTIVE Transfer of care sign-outs (TOCS) for admissions from a pediatric emergency department have unique challenges. Standardized and reliable assessment tools for TOCS remain elusive. We describe the development, reliability, and validity of a TOCS assessment tool. METHODS Video recordings of resident TOCS were assessed to capture 4 domains: completeness, synopsis, foresight, and professionalism. In phase 1, 56 TOCS were used to modify the tool and improve reliability. In phase 2, 91 TOCS were used to examine validity. Analyses included Cronbach's alpha for internal structure, intraclass correlation and Cohen's kappa for interrater reliability, Pearson's correlation for relationships between variables, and 95% confidence interval of the mean for resident group comparisons. RESULTS Cronbach's alpha was 0.52 for internal structure of the tool's subjective rating scale. Intraclass correlation for the subjective rating scale items ranged from 0.70 to 0.80. Cohen's kappa for most objective checklist items ranged from 0.43 to 1. Content completeness was significantly correlated with synopsis, foresight, and professionalism (Pearson's r ranged from 0.36 to 0.62, P values were <0.001). House staff senior residents scored higher (on average) than interns and rotating senior residents in synopsis and foresight. Also, house staff interns scored higher (on average) than rotating senior residents in professionalism. House staff senior residents scored higher (on average) than rotating senior residents in content completeness. CONCLUSIONS We provide validity evidence to support using scores from the TOCS tool to assess higher-level transfer of care comprehension and communication by pediatric emergency department residents and to test interventions to improve TOCS.
Affiliation(s)
- Ara Festekjian
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
- Ameer P Mody
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
- Todd P Chang
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
- Nurit Ziv
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif
- Alan L Nager
- Department of Pediatrics, Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, Calif; Keck School of Medicine, University of Southern California, Los Angeles, Calif
12
Thomson H, Tourangeau A, Jeffs L, Puts M. Factors affecting quality of nurse shift handover in the emergency department. J Adv Nurs 2017;74:876-886. DOI: 10.1111/jan.13499.
Affiliation(s)
- Heather Thomson
- Lawrence S. Bloomberg Faculty of Nursing; University of Toronto; Toronto ON Canada
- Ann Tourangeau
- Lawrence S. Bloomberg Faculty of Nursing; University of Toronto; Toronto ON Canada
- Lianne Jeffs
- Lawrence S. Bloomberg Faculty of Nursing; University of Toronto; Toronto ON Canada
- St. Michael's Hospital; Toronto ON Canada
- Institute for Health Policy Management and Evaluation; University of Toronto; Toronto ON Canada
- Martine Puts
- Lawrence S. Bloomberg Faculty of Nursing; University of Toronto; Toronto ON Canada
13
Affiliation(s)
- Glenn Rosenbluth
- Divisions of Pediatric Hospital Medicine and Medical Education, Department of Pediatrics, UCSF Benioff Children's Hospital, University of California, San Francisco, San Francisco, California
14
Moore M, Roberts C, Newbury J, Crossley J. Am I getting an accurate picture: a tool to assess clinical handover in remote settings? BMC Med Educ 2017;17:213. PMID: 29141622. PMCID: PMC5688655. DOI: 10.1186/s12909-017-1067-0.
Abstract
BACKGROUND Good clinical handover is critical to safe medical care, yet little research has investigated handover in rural settings. In a remote setting where nurses and medical students give telephone handover to an aeromedical retrieval service, we developed a tool by which the receiving clinician might assess the handover, and investigated factors affecting the reliability and validity of that assessment. METHODS Researchers consulted with clinicians to develop an assessment tool based on the ISBAR handover framework, combining validity evidence and the existing literature. The tool was applied 'live' by receiving clinicians and from recorded handovers by academic assessors. The tool's performance was analysed using generalisability theory. Receiving clinicians and assessors provided feedback. RESULTS Reliability for assessing a call was good (G = 0.73 with 4 assessments). The scale had a single-factor structure with good internal consistency (Cronbach's alpha = 0.8). The group mean for the global score for nurses and students was 2.30 (SD 0.85) out of a maximum of 3.0, with no difference between these sub-groups. CONCLUSIONS We have developed and evaluated a tool to assess high-stakes handover in a remote setting. It showed good reliability and was easy for working clinicians to use. Further investigation and use are warranted beyond this setting.
Affiliation(s)
- Malcolm Moore
- Rural Clinical School, Australian National University Medical School, 54 Mills Rd, Acton, ACT 2601 Australia
- Broken Hill University Department of Rural Health, University of Sydney, Broken Hill, Australia
- Chris Roberts
- Northern Clinical School, Sydney Medical School, University of Sydney, Sydney, Australia
- Jonathan Newbury
- Rural Clinical School, University of Adelaide, Adelaide, Australia
- Jim Crossley
- Medical School, University of Sheffield, Sheffield, UK
15
Starmer AJ, Schnock KO, Lyons A, Hehn RS, Graham DA, Keohane C, Landrigan CP. Effects of the I-PASS Nursing Handoff Bundle on communication quality and workflow. BMJ Qual Saf 2017;26:949-957. DOI: 10.1136/bmjqs-2016-006224.
16
Klein MD, Li ST. Building on the Shoulders of Giants: A Model for Developing Medical Education Scholarship Using I-PASS. Acad Pediatr 2016;16:499-500. PMID: 27262525. DOI: 10.1016/j.acap.2016.05.148.