1
Sarkies MN, Skinner EH, Bowles KA, Morris ME, Williams C, O'Brien L, Bardoel A, Martin J, Holland AE, Carey L, White J, Haines TP. A novel counterbalanced implementation study design: methodological description and application to implementation research. Implement Sci 2019; 14:45. PMID: 31046788; PMCID: PMC6498461; DOI: 10.1186/s13012-019-0896-0.
Abstract
Background Implementation research is increasingly being recognised for optimising the outcomes of clinical practice. Frequently, the benefits of new evidence are not implemented due to the difficulties of applying traditional research methodologies in implementation settings. Randomised controlled trials are not always practical for the implementation phase of knowledge transfer, as differences between individual and organisational readiness for change, combined with small sample sizes, can lead to imbalances between intervention and control groups in factors that impede or facilitate change. Within-cluster repeated measures designs could control for variance between intervention and control groups by allowing the same clusters to receive a sequence of conditions. In implementation settings, however, they can contaminate the intervention and control groups after the initial exposure to interventions. We propose the novel application of a counterbalanced design to implementation research, where repeated measures are employed through crossover, but contamination is averted by counterbalancing the different health contexts in which the implementation strategy is tested. Methods In a counterbalanced implementation study, the implementation strategy (independent variable) has two or more levels evaluated across an equivalent number of health contexts (e.g. community-acquired pneumonia and nutrition for critically ill patients) using the same outcome (dependent variable). This design limits each cluster to one distinct strategy related to one specific context; it therefore does not burden any cluster with more than one focussed implementation strategy for a particular outcome, and provides a ready-made control comparison. 
The different levels of the independent variable can be delivered concurrently because each level uses a different health context within each cluster to avoid the effect of treatment contamination from exposure to the intervention or control condition. Results An example application of the counterbalanced implementation design is presented in a hypothetical study to demonstrate the comparison of ‘video-based’ and ‘written-based’ evidence summary research implementation strategies for changing clinical practice in community-acquired pneumonia and nutrition in critically ill patient health contexts. Conclusion A counterbalanced implementation study design provides a promising model for concurrently investigating the success of research implementation strategies across multiple health context areas such as community-acquired pneumonia and nutrition for critically ill patients. Electronic supplementary material The online version of this article (10.1186/s13012-019-0896-0) contains supplementary material, which is available to authorized users.
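The allocation logic of the hypothetical 'video' versus 'written' study above can be sketched in a few lines of code. This is an illustrative sketch only, not from the paper: the cluster names and the function are hypothetical, and a real trial would add randomisation of the row order. The key property shown is that each cluster receives exactly one strategy per health context, while across clusters every strategy is tested in every context.

```python
# Hypothetical sketch of counterbalanced allocation (a Latin-square-style
# rotation): each cluster gets one strategy per context, and the
# strategy-context pairings rotate across clusters so both strategies are
# evaluated in both contexts without within-cluster contamination.
from itertools import permutations

STRATEGIES = ["video", "written"]
CONTEXTS = ["pneumonia", "nutrition"]

def counterbalanced_allocation(clusters):
    """Map each cluster to a dict of {context: strategy}.

    Rows cycle through the permutations of STRATEGIES, so no cluster is
    ever exposed to two strategies for the same context, yet every
    strategy-context pairing appears equally often across clusters.
    """
    rows = list(permutations(STRATEGIES))  # [('video','written'), ('written','video')]
    allocation = {}
    for i, cluster in enumerate(clusters):
        row = rows[i % len(rows)]
        allocation[cluster] = dict(zip(CONTEXTS, row))
    return allocation

alloc = counterbalanced_allocation(["hospital_A", "hospital_B"])
# hospital_A: pneumonia -> video,   nutrition -> written
# hospital_B: pneumonia -> written, nutrition -> video
```

Because the two strategy levels are delivered in different contexts within each cluster, both can run concurrently, which is the contamination-avoidance property the design description emphasises.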
Affiliation(s)
- Mitchell N Sarkies
- School of Primary and Allied Health Care, Monash University, Building G Peninsula Campus, McMahons Road, Frankston, Victoria, 3199, Australia; Allied Health Research Unit, Monash Health, 400 Warrigal Road, Cheltenham, Victoria, 3092, Australia; Department of Physiotherapy, Monash Health, 400 Warrigal Road, Cheltenham, Victoria, 3092, Australia
- Elizabeth H Skinner
- Allied Health Research Unit, Monash Health, 400 Warrigal Road, Cheltenham, Victoria, 3092, Australia
- Kelly-Ann Bowles
- Department of Community Emergency Health and Paramedic Practice, Monash University, Building H Peninsula Campus, McMahons Road, Frankston, Victoria, 3199, Australia
- Meg E Morris
- La Trobe Centre for Sport and Exercise Medicine Research, La Trobe University, Bundoora, Victoria, 3086, Australia; North Eastern Rehabilitation Centre, Healthscope, Ivanhoe, Victoria, 3079, Australia
- Cylie Williams
- Peninsula Health, 4 Hastings Road, Frankston, Victoria, 3199, Australia
- Lisa O'Brien
- Department of Occupational Therapy, Monash University, Building G Peninsula Campus, McMahons Road, Frankston, Victoria, 3199, Australia
- Anne Bardoel
- Department of Management and Marketing, Swinburne University, BA Building, John Street, Hawthorn Campus, Hawthorn, Victoria, 3122, Australia
- Jenny Martin
- Swinburne University, John Street, Hawthorn, Victoria, 3122, Australia
- Anne E Holland
- Alfred Health and La Trobe University, 99 Commercial Road, Melbourne, Victoria, 3004, Australia
- Leeanne Carey
- Occupational Therapy, School of Allied Health, La Trobe University, Bundoora, Victoria, 3086, Australia; Neurorehabilitation and Recovery, Melbourne Brain Centre, Florey Institute of Neuroscience and Mental Health, 245 Burgundy Street, Heidelberg, Victoria, 3084, Australia
- Jennifer White
- School of Primary and Allied Health Care, Monash University, Building G Peninsula Campus, McMahons Road, Frankston, Victoria, 3199, Australia
- Terry P Haines
- School of Primary and Allied Health Care, Monash University, Building G Peninsula Campus, McMahons Road, Frankston, Victoria, 3199, Australia
2
Petkovic J, Welch V, Jacob MH, Yoganathan M, Ayala AP, Cunningham H, Tugwell P. Do evidence summaries increase health policy-makers' use of evidence from systematic reviews? A systematic review. Campbell Systematic Reviews 2018; 14:1-52. PMID: 37131376; PMCID: PMC8428003; DOI: 10.4073/csr.2018.8.
Abstract
This review summarizes the evidence from six randomized controlled trials that judged the effectiveness of systematic review summaries on policymakers' decision making, or the most effective ways to present evidence summaries to increase policymakers' use of the evidence. 
Plain language summary: Policy briefs make systematic reviews easier to understand, but there is little evidence of impact on use of study findings. It is likely that evidence summaries are easier to understand than complete systematic reviews. Whether these summaries increase the use of evidence from systematic reviews in policymaking is not clear.
What is this review about? Systematic reviews are long and technical documents that may be hard for policymakers to use when making decisions. Evidence summaries are short documents that describe research findings in systematic reviews. These summaries may simplify the use of systematic reviews. Other names for evidence summaries are policy briefs, evidence briefs, summaries of findings, or plain language summaries. The goal of this review was to learn whether evidence summaries help policymakers use evidence from systematic reviews. This review also aimed to identify the best ways to present the evidence summary to increase the use of evidence.
What are the main findings of this review? This review included six randomized controlled studies. A randomized controlled study is one in which the participants are divided randomly (by chance) into separate groups to compare different treatments or other interventions. This method of dividing people into groups means that the groups will be similar and that the effects of the treatments they receive will be compared more fairly. At the time the study is done, it is not known which treatment is the better one. The researchers who did these studies invited people from Europe, North America, South America, Africa, and Asia to take part in them. Two studies looked at "policy briefs," one study looked at an "evidence summary," two looked at a "summary of findings table," and one compared a "summary of findings table" to an evidence summary. None of these studies looked at how policymakers directly used evidence from systematic reviews in their decision making, but two studies found that there was little to no difference in how they used the summaries. The studies relied on reports from decision makers. These studies included questions such as, "Is this summary easy to understand?" Some of the studies looked at users' knowledge, understanding, beliefs, or how credible (trustworthy) they believed the summaries to be. There was little to no difference in the studies that looked at these outcomes. Study participants rated the graded entry format higher for usability than the full systematic review. The graded entry format allows the reader to select how much information they want to read. The study participants felt that all evidence summary formats were easier to understand than full systematic reviews.
What do the findings of this review mean? Our review suggests that evidence summaries help policymakers to better understand the findings presented in systematic reviews. In short, evidence summaries should be developed to make it easier for policymakers to understand the evidence presented in systematic reviews. However, right now there is very little evidence on the best way to present systematic review evidence to policymakers.
How up to date is this review? The authors of this review searched for studies through June 2016.
Executive summary/Abstract
Background: Systematic reviews are important for decision makers. They offer many potential benefits but are often written in technical language, are too long, and do not contain contextual details, which makes them hard to use for decision-making. 
Strategies to promote the use of evidence by decision makers are required, and evidence summaries have been suggested as a facilitator. Evidence summaries include policy briefs, briefing papers, briefing notes, evidence briefs, abstracts, summary of findings tables, and plain language summaries. There are many organizations developing and disseminating systematic review evidence summaries for different populations or subsets of decision makers. However, evidence on the usefulness and effectiveness of systematic review summaries is lacking. We present an overview of the available evidence on systematic review evidence summaries.
Objectives: This systematic review aimed to 1) assess the effectiveness of evidence summaries on policy-makers' use of the evidence and 2) identify the most effective summary components for increasing policy-makers' use of the evidence.
Search methods: We searched several online databases (Medline, EMBASE, CINAHL, Cochrane Central Register of Controlled Trials, Global Health Library, Popline, Africa-wide, Public Affairs Information Services, Worldwide Political Science Abstracts, Web of Science, and DfID), websites of research groups and organizations which produce evidence summaries, and reference lists of included summaries and related systematic reviews. These databases were searched in March-April 2016.
Selection criteria: Eligible studies included randomised controlled trials (RCTs), non-randomised controlled trials (NRCTs), controlled before-after (CBA) studies, and interrupted time series (ITS) studies. We included studies of policymakers at all levels as well as health system managers. We included studies examining any type of "evidence summary", "policy brief", or other product derived from systematic reviews that presented evidence in a summarized form. These interventions could be compared to active comparators (e.g. other summary formats) or no intervention. The primary outcomes were 1) use of systematic review summaries in decision-making (e.g. self-reported use of the evidence in policy-making, decision-making) and 2) policymaker understanding, knowledge, and/or beliefs (e.g. changes in knowledge scores about the topic included in the summary). We also assessed perceived relevance, credibility, usefulness, understandability, and desirability (e.g. format) of the summaries.
Results: Our database search combined with our grey literature search yielded 10,113 references after removal of duplicates. From these, 54 were reviewed in full text and we included 6 studies (reported in 7 papers, 1661 participants) as well as protocols from 2 ongoing studies. Two studies assessed the use of evidence summaries in decision-making and found little to no difference in effect. There was also little to no difference in effect for knowledge, understanding or beliefs (4 studies) and perceived usefulness or usability (3 studies). Summary of findings tables and graded entry summaries were perceived as slightly easier to understand compared to complete systematic reviews. Two studies assessed formatting changes and found that for summary of findings tables certain elements, such as reporting study event rates and absolute differences, were preferred, as was avoiding the use of footnotes. No studies assessed adverse effects. The risk of bias in these studies was mainly assessed as unclear or low; however, two studies were assessed as at high risk of bias for incomplete outcome data due to very high rates of attrition.
Authors' conclusions: Evidence summaries may be easier to understand than complete systematic reviews. However, their ability to increase the use of systematic review evidence in policymaking is unclear.
3
Willging CE, Gunderson L, Green AE, Jaramillo ET, Garrison L, Ehrhart MG, Aarons GA. Perspectives from Community-Based Organizational Managers on Implementing and Sustaining Evidence-Based Interventions in Child Welfare. Human Service Organizations: Management, Leadership & Governance 2018; 42:359-379. PMID: 31179349; PMCID: PMC6553866; DOI: 10.1080/23303131.2018.1495673.
Abstract
The managers of community-based organizations that are contracted to deliver publicly funded programs, such as in the child welfare sector, occupy a crucial role in the implementation and sustainment of evidence-based interventions to improve the effectiveness of services, as they exert influence across levels of stakeholders in multitiered systems. This study utilized qualitative interviews to examine the perspectives and experiences of managers in implementing SafeCare®, an evidence-based intervention to reduce child maltreatment. Factors influencing managers' abilities to support SafeCare® included policy and ideological trends, characteristics of leadership in systems and organizations, public-private partnerships, procurement and contracting, collaboration and coopetition, and support for organizational staff.
Affiliation(s)
- Cathleen E Willging
- Behavioral Health Research Center of the Southwest, Pacific Institute for Research and Evaluation, Albuquerque, New Mexico, USA
- Lara Gunderson
- Behavioral Health Research Center of the Southwest, Pacific Institute for Research and Evaluation, Albuquerque, New Mexico, USA
- Amy E Green
- Department of Psychiatry, University of California, San Diego, California, USA
- Elise Trott Jaramillo
- Behavioral Health Research Center of the Southwest, Pacific Institute for Research and Evaluation, Albuquerque, New Mexico, USA
- Laura Garrison
- Behavioral Health Research Center of the Southwest, Pacific Institute for Research and Evaluation, Albuquerque, New Mexico, USA
- Mark G Ehrhart
- Department of Psychology, University of Central Florida, Orlando, Florida, USA
- Gregory A Aarons
- Department of Psychiatry, University of California, San Diego, California, USA
4
Tudisca V, Valente A, Castellani T, Stahl T, Sandu P, Dulf D, Spitters H, Van de Goor I, Radl-Karimi C, Syed MA, Loncarevic N, Lau CJ, Roelofs S, Bertram M, Edwards N, Aro AR. Development of measurable indicators to enhance public health evidence-informed policy-making. Health Res Policy Syst 2018; 16:47. PMID: 29855328; PMCID: PMC5984390; DOI: 10.1186/s12961-018-0323-z.
Abstract
BACKGROUND Ensuring health policies are informed by evidence still remains a challenge despite efforts devoted to this aim. Several tools and approaches aimed at fostering evidence-informed policy-making (EIPM) have been developed, yet there is a lack of availability of indicators specifically devoted to assess and support EIPM. The present study aims to overcome this by building a set of measurable indicators for EIPM intended to infer if and to what extent health-related policies are, or are expected to be, evidence-informed for the purposes of policy planning as well as formative and summative evaluations. METHODS The indicators for EIPM were developed and validated at international level by means of a two-round internet-based Delphi study conducted within the European project 'REsearch into POlicy to enhance Physical Activity' (REPOPA). A total of 82 researchers and policy-makers from the six European countries (Denmark, Finland, Italy, the Netherlands, Romania, the United Kingdom) involved in the project and international organisations were asked to evaluate the relevance and feasibility of an initial set of 23 indicators developed by REPOPA researchers on the basis of literature and knowledge gathered from the previous phases of the project, and to propose new indicators. RESULTS The first Delphi round led to the validation of 14 initial indicators and to the development of 8 additional indicators based on panellists' suggestions; the second round led to the validation of a further 11 indicators, including 6 proposed by panellists, and to the rejection of 6 indicators. A total of 25 indicators were validated, covering EIPM issues related to human resources, documentation, participation and monitoring, and stressing different levels of knowledge exchange and involvement of researchers and other stakeholders in policy development and evaluation. 
CONCLUSION Thanks to the active contribution of researchers and policy-makers, the study overcame the lack of indicators for assessing if, and to what extent, policies are realised in an evidence-informed manner. These indicators are intended to become a shared resource usable by policy-makers, researchers and other stakeholders, with a crucial impact on fostering the development of policies informed by evidence.
Affiliation(s)
- Timo Stahl
- The National Institute for Health and Welfare (THL), Tampere, Finland
- Petru Sandu
- Babeș-Bolyai University (BBU), Cluj-Napoca, Romania
- Diana Dulf
- Babeș-Bolyai University (BBU), Cluj-Napoca, Romania
- Christina Radl-Karimi
- Unit for Health Promotion Research, University of Southern Denmark (SDU), Odense, Denmark
- Natasa Loncarevic
- Unit for Health Promotion Research, University of Southern Denmark (SDU), Odense, Denmark
- Cathrine Juel Lau
- Center for Clinical Research and Disease Prevention, previously called Research Centre for Prevention and Health (RCPH), Bispebjerg and Frederiksberg Hospital, The Capital Region, Copenhagen, Denmark
- Maja Bertram
- Unit for Health Promotion Research, University of Southern Denmark (SDU), Odense, Denmark
- Arja R. Aro
- Unit for Health Promotion Research, University of Southern Denmark (SDU), Odense, Denmark
5
Sarkies MN, Bowles KA, Skinner EH, Haas R, Lane H, Haines TP. The effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare: a systematic review. Implement Sci 2017; 12:132. PMID: 29137659; PMCID: PMC5686806; DOI: 10.1186/s13012-017-0662-0.
Abstract
BACKGROUND It is widely acknowledged that health policy and management decisions rarely reflect research evidence. Therefore, it is important to determine how to improve evidence-informed decision-making. The primary aim of this systematic review was to evaluate the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. The secondary aim of the review was to describe factors perceived to be associated with effective strategies and the inter-relationship between these factors. METHODS An electronic search was developed to identify studies published between January 01, 2000, and February 02, 2016. This was supplemented by checking the reference lists of included articles and systematic reviews, and hand-searching publication lists from prominent authors. Two reviewers independently screened studies for inclusion, assessed methodological quality, and extracted data. RESULTS After duplicate removal, the search strategy identified 3830 titles. Following title and abstract screening, 96 full-text articles were reviewed, of which 19 studies (21 articles) met all inclusion criteria. Three studies were included in the narrative synthesis, which found that policy briefs including expert opinion might affect intended actions, with intentions persisting to actions, for public health policy in developing nations. Workshops, ongoing technical assistance, and distribution of instructional digital materials may improve knowledge and skills around evidence-informed decision-making in US public health departments. Tailored, targeted messages were more effective in increasing public health policies and programs in Canadian public health departments compared to messages and a knowledge broker. Sixteen studies (18 articles) were included in the thematic synthesis, leading to a conceptualisation of inter-relating factors perceived to be associated with effective research implementation strategies. A unidirectional, hierarchical flow was described from (1) establishing an imperative for practice change, (2) building trust between implementation stakeholders and (3) developing a shared vision, to (4) actioning change mechanisms. This was underpinned by the (5) employment of effective communication strategies and (6) provision of resources to support change. CONCLUSIONS Evidence is developing to support the use of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. The design of future implementation strategies should be based on the inter-relating factors perceived to be associated with effective strategies. TRIAL REGISTRATION This systematic review was registered with PROSPERO (record number: 42016032947).
Affiliation(s)
- Mitchell N. Sarkies
- Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia
- Kelly-Ann Bowles
- Monash University Department of Community Emergency Health and Paramedic Practice, Building H, McMahons Road, Frankston, VIC 3199, Australia
- Elizabeth H. Skinner
- Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia
- Romi Haas
- Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia
- Haylee Lane
- Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia
- Terry P. Haines
- Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia
6
Wilson PM, Farley K, Bickerdike L, Booth A, Chambers D, Lambert M, Thompson C, Turner R, Watt IS. Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study. Implement Sci 2017; 12:20. PMID: 28196539; PMCID: PMC5310088; DOI: 10.1186/s13012-017-0545-4.
Abstract
Background The Health and Social Care Act mandated research use as a core consideration of health service commissioning arrangements in England. We undertook a controlled before and after study to evaluate whether access to a demand-led evidence briefing service improved the use of research evidence by commissioners compared with less intensive and less targeted alternatives. Methods Nine Clinical Commissioning Groups (CCGs) in the North of England received one of three interventions: (A) access to an evidence briefing service; (B) contact plus an unsolicited push of non-tailored evidence; or (C) unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months using a survey instrument devised to assess an organisation's ability to acquire, assess, adapt and apply research evidence to support decision-making. Documentary and observational evidence of the use of the outputs of the service were sought. Results Over the course of the study, the service addressed 24 topics raised by participating CCGs. At 12 months, the evidence briefing service was not associated with increases in CCG capacity to acquire, assess, adapt and apply research evidence to support decision-making, individual intentions to use research findings or perceptions of CCG relationships with researchers. Regardless of intervention received, participating CCGs indicated that they remained inconsistent in their research-seeking behaviours and in their capacity to acquire research. The informal nature of decision-making processes meant that there was little traceability of the use of evidence. Low baseline and follow-up response rates and missing data limit the reliability of the findings. Conclusions Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear to be well intentioned but ad hoc users of research. Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research.
Affiliation(s)
- Paul M Wilson
- Alliance Manchester Business School, University of Manchester, Booth Street East, Manchester, M15 6PB, UK
- Kate Farley
- School of Healthcare, University of Leeds, Leeds, UK
- Liz Bickerdike
- Centre for Reviews and Dissemination, University of York, York, UK
- Duncan Chambers
- School of Health and Related Research, University of Sheffield, Sheffield, UK
- Mark Lambert
- Public Health England North East Centre, Newcastle upon Tyne, UK
- Carl Thompson
- School of Healthcare, University of Leeds, Leeds, UK
- Rhiannon Turner
- School of Psychology, Queen's University Belfast, Belfast, UK
- Ian S Watt
- Department of Health Sciences, University of York, York, UK
7
Wilson PM, Farley K, Bickerdike L, Booth A, Chambers D, Lambert M, Thompson C, Turner R, Watt IS. Effects of a demand-led evidence briefing service on the uptake and use of research evidence by commissioners of health services: a controlled before-and-after study. Health Services and Delivery Research 2017. DOI: 10.3310/hsdr05050.
Abstract
BackgroundThe Health and Social Care Act 2012 (Great Britain.Health and Social Care Act 2012. London: The Stationery Office; 2012) has mandated research use as a core consideration of health service commissioning arrangements. We evaluated whether or not access to a demand-led evidence briefing service improved the use of research evidence by commissioners, compared with less intensive and less targeted alternatives.DesignControlled before-and-after study.SettingClinical Commissioning Groups (CCGs) in the north of England.Main outcome measuresChange at 12 months from baseline of a CCG’s ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes measured individual clinical leads’ and managers’ intentions to use research evidence in decision-making.MethodsNine CCGs received one of three interventions: (1) access to an evidence briefing service; (2) contact plus an unsolicited push of non-tailored evidence; or (3) an unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months post intervention, using a survey instrument devised to assess an organisation’s ability to acquire, assess, adapt and apply research evidence to support decision-making. In addition, documentary and observational evidence of the use of the outputs of the service was sought and interviews with CCG participants were undertaken.ResultsMost of the requests were conceptual; they were not directly linked to discrete decisions or actions but were intended to provide knowledge about possible options for future actions. Symbolic use to justify existing decisions and actions were less frequent and included a decision to close a walk-in centre and to lend weight to a major initiative to promote self-care already under way. The opportunity to impact directly on decision-making processes was limited to work to establish disinvestment policies. 
In terms of overall impact, the evidence briefing service was not associated with increases in CCGs’ capacity to acquire, assess, adapt and apply research evidence to support decision-making, individual intentions to use research findings, or perceptions of CCGs’ relationships with researchers. Regardless of the intervention received, participating CCGs indicated at baseline that they were inconsistent in their research-seeking behaviours, and their capacity to acquire research remained so at follow-up. The informal nature of decision-making processes meant that there was little or no traceability of the use of evidence. Limitations Low baseline and follow-up response rates (of 68% and 44%, respectively) and missing data limit the reliability of these findings. Conclusions Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear to be well intentioned but ad hoc users of research. Future work Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research. Resource-intensive approaches to providing evidence may best be employed to support instrumental decision-making. Comparative evaluation of the impact of less intensive but targeted strategies on the uptake and use of research by commissioners is warranted. Funding The National Institute for Health Research Health Services and Delivery Research programme.
Collapse
Affiliation(s)
- Paul M Wilson
- Alliance Manchester Business School, University of Manchester, Manchester, UK
- Kate Farley
- School of Healthcare, University of Leeds, Leeds, UK
- Liz Bickerdike
- Centre for Reviews and Dissemination, University of York, York, UK
- Duncan Chambers
- School of Health and Related Research, University of Sheffield, Sheffield, UK
- Mark Lambert
- Public Health England North East Centre, Newcastle upon Tyne, UK
- Carl Thompson
- School of Healthcare, University of Leeds, Leeds, UK
- Rhiannon Turner
- School of Psychology, Queen’s University Belfast, Belfast, UK
- Ian S Watt
- Department of Health Sciences, University of York, York, UK
Collapse
|
8
|
Petkovic J, Welch V, Jacob MH, Yoganathan M, Ayala AP, Cunningham H, Tugwell P. The effectiveness of evidence summaries on health policymakers and health system managers use of evidence from systematic reviews: a systematic review. Implement Sci 2016; 11:162. [PMID: 27938409 PMCID: PMC5148903 DOI: 10.1186/s13012-016-0530-3] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2016] [Accepted: 12/02/2016] [Indexed: 11/20/2022] Open
Abstract
Background Systematic reviews are important for decision makers. They offer many potential benefits but are often written in technical language, are too long, and lack contextual details, which makes them hard to use for decision-making. Many organizations develop and disseminate derivative products, such as evidence summaries, from systematic reviews for different populations or subsets of decision makers. This systematic review aimed to (1) assess the effectiveness of evidence summaries on policymakers’ use of the evidence and (2) identify the most effective summary components for increasing policymakers’ use of the evidence. We present an overview of the available evidence on systematic review derivative products. Methods We included studies of policymakers at all levels as well as health system managers. We included studies examining any type of “evidence summary,” “policy brief,” or other products derived from systematic reviews that presented evidence in a summarized form. The primary outcomes were (1) use of systematic review summaries in decision-making (e.g., self-reported use of the evidence in policymaking and decision-making) and (2) policymakers’ understanding, knowledge, and/or beliefs (e.g., changes in knowledge scores about the topic included in the summary). We also assessed perceived relevance, credibility, usefulness, understandability, and desirability (e.g., format) of the summaries. Results Our database search combined with our gray literature search yielded 10,113 references after removal of duplicates. Of these, 54 were reviewed in full text, and we included six studies (reported in seven papers) as well as protocols from two ongoing studies. Two studies assessed the use of evidence summaries in decision-making and found little to no difference in effect. There was also little to no difference in effect for knowledge, understanding or beliefs (four studies), and perceived usefulness or usability (three studies). 
Summary of findings tables and graded entry summaries were perceived as slightly easier to understand than complete systematic reviews. Two studies assessed formatting changes and found that, for summary of findings tables, certain elements were preferred, such as reporting study event rates and absolute differences and avoiding the use of footnotes. Conclusions Evidence summaries are likely easier to understand than complete systematic reviews. However, their ability to increase the use of systematic review evidence in policymaking is unclear. Trial registration The protocol was published in the journal Systematic Reviews (2015;4:122). Electronic supplementary material The online version of this article (doi:10.1186/s13012-016-0530-3) contains supplementary material, which is available to authorized users.
Collapse
Affiliation(s)
- Jennifer Petkovic
- University of Split School of Medicine, Split, Croatia; Bruyère Research Institute, University of Ottawa, 43 Bruyère Street, Annex E room 302, Ottawa, ON, K1N 5C8, Canada
- Vivian Welch
- Bruyère Research Institute, University of Ottawa, 43 Bruyère Street, Annex E room 302, Ottawa, ON, K1N 5C8, Canada; School of Epidemiology, Public Health and Preventive Medicine, University of Ottawa, Ottawa, Canada
- Maria Helena Jacob
- Bruyère Research Institute, University of Ottawa, 43 Bruyère Street, Annex E room 302, Ottawa, ON, K1N 5C8, Canada
- Manosila Yoganathan
- Bruyère Research Institute, University of Ottawa, 43 Bruyère Street, Annex E room 302, Ottawa, ON, K1N 5C8, Canada
- Ana Patricia Ayala
- Gerstein Science Information Centre, University of Toronto, Toronto, Canada
- Heather Cunningham
- Gerstein Science Information Centre, University of Toronto, Toronto, Canada
- Peter Tugwell
- Department of Medicine, Faculty of Medicine, University of Ottawa, Ottawa, Canada; Ottawa Hospital Research Institute, Clinical Epidemiology Program, Ottawa, Canada; Department of Epidemiology and Community Medicine, Faculty of Medicine, University of Ottawa, Ottawa, Canada
Collapse
|