1.
Piotrowicz M, Gajewska M, Lewtak K, Urban E, Rutyna A, Nitsch-Osuch A. Best practice portals in health promotion and disease prevention: approaches, definitions, and intervention evaluation criteria. Front Public Health 2025;13:1480078. PMID: 39935881; PMCID: PMC11810953; DOI: 10.3389/fpubh.2025.1480078.
Abstract
Introduction: The evaluation of practices is a valuable source of evidence in the context of an evidence-based approach to public health. Best practice portals (BPPs) are promising tools for facilitating access to recommended programmes and for monitoring and improving the quality of interventions. Several such portals exist in Europe, but there is little work on the subject in the scientific literature. The study aimed to identify and characterise BPPs in health promotion and disease prevention and to analyse their approaches, definitions, and criteria for evaluating interventions. Methods: To identify portals, we searched the websites of public health institutions and organisations, the PubMed database, and grey literature. The material consisted of elements of each portal's design, information available on their websites, and collected publications. The study applied a qualitative analysis with a descriptive approach and covered a detailed description of the four selected portals. Results: Among the analysed BPPs, three were from the European region and one was from Canada (a pioneer in developing best practice tools). The portals were launched between 2003 and 2016, and the number of interventions collected in their databases ranged from 120 to 337. The portals were useful, well-designed, and well-developed tools. BPPs differed in their objectives and roles, the standards and criteria adopted for assessing practices, and other operational factors. In each portal, interventions underwent a rigorous, multilevel assessment process conducted by independent experts in the field and based on intervention evaluation criteria. Generally, the analysed catalogues described similar issues, e.g., Selection of the issue addressed by the practice, Description of a particular element of the practice, Theoretical foundation, or Evaluation/Effectiveness. However, we identified both similarities and differences in the adopted terms (names of criteria) and their definitions. Sometimes the same criterion had different names depending on the catalogue; conversely, criteria with identical or similar names could be defined differently within their detailed thematic scope. Conclusion: The similarities and differences presented in this work can serve as a valuable starting point for designing such tools to support practice-based and evidence-based decision-making in health promotion and disease prevention.
Affiliation(s)
- Maria Piotrowicz
- Department of Health Promotion and Chronic Disease Prevention, National Institute of Public Health NIH – National Research Institute, Warsaw, Poland
- Małgorzata Gajewska
- Department of Health Promotion and Chronic Disease Prevention, National Institute of Public Health NIH – National Research Institute, Warsaw, Poland
- Katarzyna Lewtak
- Department of Health Promotion and Chronic Disease Prevention, National Institute of Public Health NIH – National Research Institute, Warsaw, Poland
- Department of Social Medicine and Public Health, Medical University of Warsaw, Warsaw, Poland
- Ewa Urban
- Department of Health Promotion and Chronic Disease Prevention, National Institute of Public Health NIH – National Research Institute, Warsaw, Poland
- Anna Rutyna
- Department of Health Promotion and Chronic Disease Prevention, National Institute of Public Health NIH – National Research Institute, Warsaw, Poland
- Aneta Nitsch-Osuch
- Department of Social Medicine and Public Health, Medical University of Warsaw, Warsaw, Poland
2.
Brender R, Bremer K, Kula A, Groeger-Roth F, Walter U. [Evidence Register Green List Prevention: analysis of the listed effectiveness-tested programmes]. Das Gesundheitswesen 2024;86:474-482. PMID: 39013368; DOI: 10.1055/a-2308-7256.
Abstract
BACKGROUND In the areas of prevention and health promotion, there is a large number of measures for children and adolescents. One way of facilitating evidence-based action for those involved in these tasks is to make available online evidence registers with customised, effectiveness-tested measures. The Green List Prevention is such a register and offers an overview of evidence-based programmes in Germany, currently with a focus on psychosocial health. OBJECTIVE The aims of this study were (a) to analyse the characteristics of the available and evaluated programmes on the psychosocial health of children and adolescents, (b) to identify priorities and underrepresented areas of the Green List Prevention and (c) to optimise the search functions of the register. METHOD The characteristic features were recorded on the basis of the existing upper categories of the register entries, which were differentiated into subcategories in an inductive procedure by at least two persons. In addition, deductive categories were added for relevant aspects concerning content and implementation. The upper and lower categories formed were operationalised with characteristic values. All entries were analysed using a data sheet and descriptively evaluated. RESULTS The 102 programmes listed (as of 2/2024) addressed not only the primary target group of children and youth, but also secondary target groups (mainly teachers and guardians). Social and life skills programmes as well as trainings for guardians represented a focus. Behavioural prevention programmes on the topics of violence (including bullying) (63.7%), addiction (46.1%) and/or mental health (35.3%) were frequently represented, whereas nutrition and/or physical activity (4.9%) were hardly represented. Most of the programmes (88.2%) could be assigned to the eligibility criteria of the statutory health insurers (§20a SGB V). Potential digital implementation forms and further implementation aspects were identified.
CONCLUSION The Green List Prevention bundles a large number of different measures, and there is potential for expansion. Processing knowledge about effective measures in a user-friendly manner can be optimised through expanded search functions, so that resource-conserving, evidence-based action is facilitated.
Affiliation(s)
- Ricarda Brender
- Medizinische Hochschule Hannover, Institut für Epidemiologie, Sozialmedizin und Gesundheitssystemforschung, Hannover, Germany
- Katharina Bremer
- Medizinische Hochschule Hannover, Institut für Epidemiologie, Sozialmedizin und Gesundheitssystemforschung, Hannover, Germany
- Antje Kula
- Medizinische Hochschule Hannover, Institut für Epidemiologie, Sozialmedizin und Gesundheitssystemforschung, Hannover, Germany
- Frederick Groeger-Roth
- Niedersächsisches Justizministerium, Landespräventionsrat Niedersachsen, Hannover, Germany
- Ulla Walter
- Medizinische Hochschule Hannover, Institut für Epidemiologie, Sozialmedizin und Gesundheitssystemforschung, Hannover, Germany
3.
Grant S, Mayo-Wilson E, Kianersi S, Naaman K, Henschel B. Open Science Standards at Journals that Inform Evidence-Based Policy. Prev Sci 2023;24:1275-1291. PMID: 37178346; DOI: 10.1007/s11121-023-01543-z.
Abstract
Evidence-based policy uses intervention research to inform consequential decisions about resource allocation. Research findings are often published in peer-reviewed journals. Because detrimental research practices associated with closed science are common, journal articles report more false-positives and exaggerated effect sizes than would be desirable. Journal implementation of standards that promote open science, such as the Transparency and Openness Promotion (TOP) guidelines, could reduce detrimental research practices and improve the trustworthiness of research evidence on intervention effectiveness. We evaluated TOP implementation at 339 peer-reviewed journals that have been used to identify evidence-based interventions for policymaking and programmatic decisions. For each of the ten open science standards in TOP, most journals had not implemented it in their policies (instructions to authors), procedures (manuscript submission systems), or practices (published articles). Journals implementing at least one standard typically encouraged, but did not require, an open science practice. We discuss why and how journals could improve implementation of open science standards to safeguard evidence-based policy.
Affiliation(s)
- Sean Grant
- HEDCO Institute for Evidence-Based Educational Practice, College of Education, University of Oregon, Eugene, OR 97403-1215, USA
- Richard M. Fairbanks School of Public Health, Indiana University, Indianapolis, IN, USA
- Evan Mayo-Wilson
- Gillings School of Global Public Health, University of North Carolina, Chapel Hill, NC, USA
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
- Sina Kianersi
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
- Channing Division of Network Medicine, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, MA, USA
- Kevin Naaman
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
- School of Education, Indiana University, Bloomington, IN, USA
- Beate Henschel
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
4.
Iniesta J, Verdugo MA, Schalock RL. Organizational change and evidence-based practices in support services for people with intellectual and developmental disabilities. Eval Program Plann 2023;100:102337. PMID: 37451034; DOI: 10.1016/j.evalprogplan.2023.102337.
Abstract
The socioeconomic movements and theoretical reformulations of recent decades have had a profound impact on support services for persons with intellectual and developmental disabilities, making it necessary, in order to guarantee their sustainability, to carry out processes of profound change in their organizational culture by intervening in the elements that compose it. Among these elements are professional practices, the best route for intervening in culture, using comparative analysis between an organization's current practices and those expected with culture change. In this line, the organizational self-assessment tool "Organizational Effectiveness and Efficiency Scale" (OEES) was applied in a study with 24 organizations, using a collaborative assessment approach in the service of a set of evidence-based practices identified as standards in key aspects that guide culture change: specifically, a person-centered approach, participative structures, use of information systems and data management, implementation of quality systems, and participative and transformational leadership. The results show that a large majority of organizations have significant discrepancies between their current practices and evidence-based practices. The descriptive analysis confirms the usefulness of the scale for organizational diagnosis and for identifying strategies to guide transformational change.
5.
Sloboda Z, Johnson KA, Fishbein DH, Brown CH, Coatsworth JD, Fixsen DL, Kandel D, Paschall MJ, Silva FS, Sumnall H, Vanyukov M. Normalization of Prevention Principles and Practices to Reduce Substance Use Disorders Through an Integrated Dissemination and Implementation Framework. Prev Sci 2023;24:1078-1090. PMID: 37052866; PMCID: PMC10476513; DOI: 10.1007/s11121-023-01532-2.
Abstract
Major research breakthroughs over the past 30 years in the field of substance use prevention have served to: (1) enhance understanding of pharmacological effects on the central and peripheral nervous systems and the health and social consequences of use of psychoactive substances, particularly for children and adolescents; (2) delineate the processes that increase vulnerability to or protect from initiation of substance use and progression to substance use disorders (SUDs); and, based on this understanding, (3) develop effective strategies and practices to prevent the initiation and escalation of substance use. The challenge we now face as a field is to "normalize" what we have learned from this research so that it is incorporated into the work of those involved in supporting, planning, and delivering prevention programming to populations around the world, is integrated into health and social service systems, and helps to shape public policies. But we wish to go further, to incorporate these effective prevention practices into everyday life and the mind-sets of the public, particularly parents and educators. This paper reviews the advances that have been made in the field of prevention and presents a framework and recommendations, generated during several meetings of prevention and implementation science researchers sponsored by the International Consortium of Universities for Drug Demand Reduction (ICUDDR), that provide a roadmap to achieve "normalization."
Affiliation(s)
- Zili Sloboda
- Applied Prevention Science International, Ontario, OH, USA
- Kimberly A Johnson
- Department of Mental Health Law and Policy, College of Community and Behavioral Sciences, University of South Florida, Tampa, FL, USA
- International Consortium of Universities of Drug Demand Reduction, Tampa, FL, USA
- Diana H Fishbein
- Frank Porter Graham Child Development Institute, University of North Carolina-Chapel Hill, Chapel Hill, NC, USA
- National Prevention Science, The Pennsylvania State University, State College, Harrisburg, PA, USA
- Dean L Fixsen
- Active Implementation Research Network, Inc, Chapel Hill, NC, USA
- Denise Kandel
- Department of Psychiatry and School of Public Health, Columbia University, New York, NY, USA
- Mallie J Paschall
- Prevention Research Center (PRC), Pacific Institute for Research and Evaluation, Berkeley, CA, USA
- Harry Sumnall
- Faculty of Health, Public Health Institute, Liverpool John Moores University, Liverpool, UK
- Michael Vanyukov
- Departments of Pharmaceutical Sciences, Psychiatry, and Human Genetics, University of Pittsburgh, Pittsburgh, PA, USA
6.
Rossmann C, Krnel SR, Kylänen M, Lewtak K, Tortone C, Ragazzoni P, Grasso M, Maassen A, Costa L, van Dale D. Health promotion and disease prevention registries in the EU: a cross country comparison. Arch Public Health 2023;81:85. PMID: 37161420; PMCID: PMC10170815; DOI: 10.1186/s13690-023-01097-0.
Abstract
BACKGROUND Health promotion and disease prevention programme registries (HPPRs), also called 'best practice portals', serve as entry points and practical repositories that provide decision-makers with easy access to (evidence-based) practices. However, there is limited knowledge of the differences and overlaps in how current national HPPRs in Europe function, the context and circumstances in which these HPPRs were developed, and the mechanisms each HPPR uses for the assessment, classification and quality improvement of the included practices. This study prepared an overview of the different approaches in several national HPPRs and the EU Best Practice Portal (EU BPP) and identified commonalities and differences among the core characteristics of the HPPRs. METHODS We conducted a descriptive comparison, focused on six European countries with existing or recently developed/implemented national HPPRs and on the EU BPP, to create a comparative overview. We used coding mechanisms to identify commonalities and differences; data management, collection and consensus building took place during EuroHealthNet Thematic Working Group meetings. RESULTS All HPPRs offer a broad range of health promotion and disease prevention practices and serve to support practitioners, policymakers and researchers in selecting practices. Almost all HPPRs have an assessment process in place or planned, requiring the application of assessment criteria that differ among the HPPRs. All HPPRs collect and share recommendable practices, and some have implemented further measures to improve the quality of the submitted practices. Different dissemination tools and strategies are employed to promote the use of the HPPRs, including social media, newsletters and publications, as well as capacity building workshops for practice owners and technical options to connect citizens/patients with local practices.
CONCLUSIONS Collaboration between HPPRs (at national and EU level) is appreciated, especially regarding the use of consistent terminology to avoid misinterpretation, facilitate cross-country comparison and enable discussions on the adaptation of assessment criteria by national HPPRs. Greater efforts are needed to promote the actual implementation and transfer of practices at the national level to address public health challenges with proven and effective practices.
Affiliation(s)
- Marika Kylänen
- Finnish Institute for Health and Welfare (THL), PO Box 30, 00271 Helsinki, Finland
- Katarzyna Lewtak
- National Institute of Public Health NIH-NRI (NIPH NIH-NRI), 00-791 Warsaw, Poland
- Medical University of Warsaw, 02-007 Warsaw, Poland
- Claudio Tortone
- DoRS - Health Promotion Regional Documentation Centre, Regione Piemonte ASL TO3, I-10095 Grugliasco (Turin), Italy
- Paola Ragazzoni
- DoRS - Health Promotion Regional Documentation Centre, Regione Piemonte ASL TO3, I-10095 Grugliasco (Turin), Italy
- Mara Grasso
- DoRS - Health Promotion Regional Documentation Centre, Regione Piemonte ASL TO3, I-10095 Grugliasco (Turin), Italy
- Luciana Costa
- National Institute of Health Dr. Ricardo Jorge, 1649-016 Lisbon, Portugal
- BioISI-Biosystems and Integrative Sciences Institute, Faculty of Sciences, University of Lisbon, Lisbon, Portugal
- Djoeke van Dale
- National Institute for Public Health and the Environment, PO Box 1, 3720 Bilthoven, The Netherlands
7.
Lee-Easton MJ, Magura S, Maranda MJ, Landsverk J, Rolls-Royce J, Green B, DeCamp W, Abu-Obaid R. A Scoping Review of the Influence of Evidence-Based Program Resources (EBPR) Websites for Behavioral Health. Adm Policy Ment Health 2023;50:379-391. PMID: 36564667; PMCID: PMC10191876; DOI: 10.1007/s10488-022-01245-8.
Abstract
Evidence-based program resources (EBPR) websites evaluate behavioral health programs, practices or policies (i.e., interventions) according to a predetermined set of research criteria and standards, usually resulting in a summary rating of the strength of an intervention's evidence base. This study is a mixed-methods analysis of the peer-reviewed academic literature relating to the influence of EBPRs on clinical practice and policy in the behavioral health field. Using an existing framework for a scoping review, we searched for research articles in PubMed, Web of Science, SCOPUS, and ProQuest that were published between January 2002 and March 2022, referenced an EBPR or multiple EBPRs, and presented data showing the influence of one or more EBPRs on behavioral health. A total of 210 articles met the inclusion criteria and were classified into five distinct categories of influence, the most important of which was showing the direct impact of one or more EBPRs on behavioral health (8.1% of articles), defined as documenting observable changes in interventions or organizations that are at least partly due to information obtained from EBPR(s). These included impacts at the state legislative and policy-making level, at the community intervention level, provider agency level, and individual practitioner level. The majority of influences identified in the study were indirect demonstrations of how EBPRs are used in various ways. However, more studies are needed to learn about the direct impact of information from EBPRs on the behavioral health field, including impact on clinician practice and treatment outcomes for consumers.
Affiliation(s)
- Miranda J Lee-Easton
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Stephen Magura
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Michael J Maranda
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- John Landsverk
- Oregon Social Learning Center, 10 Shelton McMurphey Blvd, Eugene, OR 97401, USA
- Jennifer Rolls-Royce
- Chadwick Center, Rady Children's Hospital, 3020 Children's Way-Mailcode 5131, San Diego, CA 92123, USA
- Brandn Green
- Development Services Group Inc, 7315 Wisconsin Ave #800E, Bethesda, MD 20814, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Ruqayyah Abu-Obaid
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
8.
Magura S, Lee-Easton MJ, Abu-Obaid RN, Landsverk J, DeCamp W, Rolls-Reutz J, Moore K, Firpo-Triplett R, Buckley PR, Stout ER, Perkins DF. The influence of evidence-based program registry websites for dissemination of evidence-based interventions in behavioral healthcare. Eval Program Plann 2023;97:102214. PMID: 36586304; PMCID: PMC10121732; DOI: 10.1016/j.evalprogplan.2022.102214.
Abstract
PURPOSE Evidence-based program registries (EBPRs) are web-based databases of evaluation studies that summarize the available evidence for the effectiveness of behavioral healthcare programs, including programs addressing substance misuse, mental health, child welfare, or offender rehabilitation. The study determined the extent to which visitors to selected EBPRs accomplished the objectives of their visits and how often those visits resulted in the adoption of new or improved evidence-based interventions (EBIs). METHOD A follow-up telephone survey was conducted with 216 visitors to a convenience sample of six EBPRs, an average of six months after the visitors' incident visit to the EBPR. RESULTS The most frequent objective was to identify evidence-based programs/services, curricula or assessments, followed by finding resources to implement or improve these and by writing a grant proposal (including to comply with funding requirements); 71% of objectives were achieved across the full set. Implementation of an EBI was completely achieved for 31% of relevant objectives, and some progress on EBI implementation occurred for a further 19%. CONCLUSIONS This is the first study to document the usage of EBPRs as a modality to increase the utilization of EBIs in the actual practice of behavioral healthcare. The results support the continued use of web-based EBPRs for disseminating information on evidence-based interventions for behavioral healthcare.
Affiliation(s)
- Stephen Magura
- Evaluation Center, Western Michigan University, Kalamazoo, MI, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, Kalamazoo, MI, USA
- Pamela R Buckley
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, USA
- Ellyson R Stout
- Suicide Prevention Resource Center project, Education Development Center, Waltham, MA, USA
- Daniel F Perkins
- Clearinghouse for Military Family Readiness, Pennsylvania State University, University Park, PA, USA
9.
Evidence Clearinghouses as Tools to Advance Health Equity: What We Know from a Systematic Scan. Prev Sci 2023;24:613-624. PMID: 36856737; DOI: 10.1007/s11121-023-01511-7.
Abstract
Evidence clearinghouses evaluate and summarize literature to help decision-makers prioritize and invest in evidence-informed interventions. Clearinghouses and related practice-oriented tools are continuously evolving; however, the extent to which these tools assess and summarize evidence describing an intervention's impact on health equity is unclear. We conducted a systematic scan to explore how clearinghouses communicated an intervention's equity impact and to review their underlying methods and how they defined and operationalized health equity. In 2021, we identified 18 clearinghouses that were US-focused, web-based registries of interventions assigning an intervention effectiveness rating for improving community health and the social determinants of health. We reviewed each clearinghouse's website and collected publicly available information about their health equity impact review, review methods, and health equity definitions and values. We conducted a comparative analysis among select clearinghouses using qualitative methods. Among the 18 clearinghouses, fewer than half (seven) summarized an intervention's potential impact on health equity. Those seven clearinghouses defined and operationalized equity differently, and most lacked transparency in their review methods. Clearinghouses used one or more approaches to communicate findings from their review: summarizing study findings on differential impact for subpopulations, curating interventions that reduce health disparities, and/or assigning a disparity/equity rating to each intervention. Evidence clearinghouses can enhance equity-focused methods and be transparent in their underlying values to better support the uptake and implementation of evidence-informed interventions to advance health equity. However, clearinghouses cannot do so without underlying equity-focused empirical evidence.
10.
Lee-Easton MJ, Magura S. Discrepancies in Ratings of Behavioral Healthcare Interventions Among Evidence-Based Program Resources Websites. Inquiry 2023;60:469580231186836. PMID: 37462104; DOI: 10.1177/00469580231186836.
Abstract
Decision makers in the behavioral health disciplines could benefit from tools to assist them in identifying and implementing evidence-based interventions. One such tool is an evidence-based program resources website (EBPR). Prior studies documented that when multiple EBPRs rate an intervention, they may disagree, but research concerning the reasons for such conflicts is sparse. The present study examines how EBPRs rate interventions and the sources of disagreement between EBPRs when rating the same intervention. The study hypothesizes that EBPRs may disagree about intervention ratings because they use different rating paradigms, different studies as evidence of intervention effectiveness, or both. The study identified 15 EBPRs for inclusion. One author (M.J.L.E.) coded which "tiers of evidence" each EBPR used to classify behavioral health interventions and which criteria they used when rating interventions. The author then computed one Jaccard index of similarity for the criteria shared between each pair of EBPRs that co-rated interventions, and one for the studies used by EBPR rating pairs when rating the same program. The authors used a combination of chi-square, correlation, and binary logistic regression analyses to analyze the data. There was a statistically significant negative correlation between the number of Cochrane Risk of Bias criteria shared between two EBPRs and the likelihood of those two EBPRs agreeing on an intervention rating (r = -.12, P ≤ .01). There was no relationship between the number of studies evaluated by two EBPRs and the likelihood of those EBPRs agreeing on an intervention rating. The major reason for disagreements between EBPRs when rating the same intervention was differences in the rating criteria used by the EBPRs; the studies used by the EBPRs to rate programs do not appear to have an impact.
11.
Magura S, Lee MJ, Abu-Obaid RN, Landsverk J, DeCamp W, Rolls-Reutz J, Green B, Ingoglia C, Hollen V, Flagg A. State Department and Provider Agency Utilization of Evidence-Based Program Registries in Behavioral Healthcare and Child Welfare. Eval Health Prof 2022;45:397-410. PMID: 35446692; DOI: 10.1177/01632787221085754.
Abstract
Evidence-based program registries (EBPRs) are web-based compilations of behavioral healthcare programs/interventions that rely on research-based criteria to rate program efficacy or effectiveness in support of programmatic decision-making. The objective was to determine the extent to which behavioral health decision-makers access EBPRs and to understand whether and exactly how they use the information obtained from EBPRs. Single State Authorities (SSAs) and service provider agencies in the areas of behavioral health and child welfare were recruited nationally. Senior staff (n = 375) responsible for the selection and implementation of programs and/or policies were interviewed by telephone concerning their visits (if any) to 28 relevant EBPRs, the types of information they were seeking, whether they found it, and how they may have used that information to effect changes in their organizations. At least one EBPR was visited by 80% of the respondents, with a median of three different registries visited. Most visitors (55%) found all the information they were seeking; those who did not wanted more guidance or tools for individual program implementation or were unable to locate the program or practice they were seeking. Most visitors (65%) reported using the information obtained to make changes in their organizations, in particular to select, start or change a program, or to support the adoption or improvement of evidence-based clinical practices. EBPRs were shown to be important resources for the dissemination of research-based program effectiveness data, leading to increased use of evidence-based practices in the field, but the study also identified needs for greater awareness of EBPRs generally and for more attention to the implementation of specific recommended programs and practices.
Affiliation(s)
- Stephen Magura
- Evaluation Center, Western Michigan University, Kalamazoo, MI, USA
- Miranda J Lee
- Evaluation Center, Western Michigan University, Kalamazoo, MI, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, Kalamazoo, MI, USA
- Brandn Green
- Development Services Group Inc, Bethesda, MD, USA
- Charles Ingoglia
- National Council for Behavioral Health, Washington, DC, USA
- Vera Hollen
- National Association of State Mental Health Program Directors Research Institute, Falls Church, VA, USA
- Anne Flagg
- American Public Human Services Association, Arlington, VA, USA

12
Zheng J, Wadhwa M, Cook TD. How Consistently Do 13 Clearinghouses Identify Social and Behavioral Development Programs as "Evidence-Based"? Prevention Science: The Official Journal of the Society for Prevention Research 2022; 23:1343-1358. [PMID: 36040619 DOI: 10.1007/s11121-022-01407-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/01/2022] [Indexed: 11/30/2022]
Abstract
Clearinghouses develop scientific criteria that they then use to vet existing research studies on a program to reach a verdict about how evidence-based it is. This verdict is then recorded on a website in hopes that stakeholders in science, public policy, the media, and even the general public, will consult it. This paper (1) compares the causal design and analysis preferences of 13 clearinghouses that assess the effectiveness of social and behavioral development programs, (2) estimates how consistently these clearinghouses rank the same program, and then (3) uses case studies to probe why their conclusions differ. Most clearinghouses place their highest value on randomized controlled trials, but they differ in how they treat program implementation, quasi-experiments, and whether their highest program ratings require effects of a given size that independently replicate or that temporally persist. Of the 2525 social and behavioral development programs sampled over clearinghouses, 82% (n = 2069) were rated by a single clearinghouse. Of the 297 programs rated by two clearinghouses, agreement about program effectiveness was obtained for about 55% (n = 164), but the clearinghouses agreed much more on program ineffectiveness than effectiveness. Most of the inconsistency is due to clearinghouses' differences in requiring independently replicated and/or temporally sustained effects. Without scientific consensus about matters like these, "evidence-based" will remain more of an aspiration than achievement in the social and behavioral sciences.
Affiliation(s)
- Mansi Wadhwa
- George Washington University, Washington DC, USA
- Thomas D Cook
- George Washington University, Washington DC, USA; Northwestern University, Evanston, USA

13
Lee-Easton MJ, Magura S, Abu-Obaid RN, Landsverk J, DeCamp W, Rolls-Reutz J, Moore K, Firpo-Triplett R, Buckley PR, Stout ER, Perkins DF. Visitors' Assessment and Utilization of Evidence-Based Program Resources (EBPR) Websites. Subst Use Misuse 2022; 57:1688-1697. [PMID: 35968844 PMCID: PMC9552282 DOI: 10.1080/10826084.2022.2107675] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/15/2022]
Abstract
Background: Evidence-based program resources (EBPR) websites for behavioral health are a potentially useful tool to assist decision-makers and practitioners in deciding which behavioral health interventions to implement. EBPR websites apply rigorous research standards to assess the effectiveness of behavioral healthcare programs, models, and clinical practices. Method: Visitors to a convenience sample of six EBPR websites (N=369, excluding students) were recruited for telephone interviews primarily by means of a pop-up invitation on the sites. Results: The visitors view the EBPR sites as important sources of information to support the identification and adoption of evidence-based programs/practices (EBPs) in behavioral healthcare, which aligns with the primary mission of EBPRs. For repeat visitors, there was some indication that the information obtained helped effect certain changes in their agencies' programs and policies. However, increased or improved guidance on EBP implementation was also requested. Conclusion: EBPR websites should be better publicized to the behavioral healthcare field.
Affiliation(s)
- Stephen Magura
- Evaluation Center, Western Michigan University, Kalamazoo, MI
- Whitney DeCamp
- Department of Sociology, Western Michigan University, Kalamazoo, MI
- Pamela R. Buckley
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO
- Ellyson R. Stout
- Suicide Prevention Resource Center, Education Development Center, Waltham, MA
- Daniel F. Perkins
- Clearinghouse for Military Family Readiness, Pennsylvania State University, University Park, PA

14
Maranda MJ, Lee-Easton MJ, Magura S. Variations in Definitions of Evidence-Based Interventions for Behavioral Health in Eight Selected U.S. States. Evaluation Review 2022; 46:363-390. [PMID: 35544762 DOI: 10.1177/0193841X221100356] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
BACKGROUND U.S. state legislatures fill a vital role in supporting the use of evidence-based interventions (EBIs) through statutes and regulations (mandates). OBJECTIVE The study determined the terms used by selected states to describe EBIs and how those terms are defined in mandates. RESEARCH METHODS The mandates of eight purposely selected states were accessed and coded using the Westlaw Legal Research Database. RESULTS Considerable variation was found in the terms used by states to describe EBIs. Although "evidence-based" was the most frequently utilized term (60% of mandates), an additional 29 alternative terms appeared with varying frequencies. Most terms were simply mentioned, with no further definition or elaboration. When terms were further defined or elaborated, the majority were defined using numerous and different types of external sources or references. Three approaches were found in the mandates defining EBIs: "single definition," "hierarchies of evidence levels," and "best available evidence"; the states differed considerably in the approaches used in their mandates. CONCLUSIONS The variations in EBI-related terminology across states and within states, coupled with a lack of elaboration on the meaning of important terms and the predominant use of external rather than internal guidelines, may be a source of confusion for behavioral health provider agencies that seek direction about what constitutes an EBI. Prior studies indicate that many agencies may lack staff with the technical ability to adequately evaluate what constitutes an EBI. Thus, lack of clear guidance from official state government mandates may impede the implementation of EBIs within states.
15
Buckley PR, Edwards D, Ladika A, Steeger CM, Hill KG. Implementing Evidence-Based Preventive Interventions During a Pandemic. Global Implementation Research and Applications 2022; 2:266-277. [PMID: 35813089 PMCID: PMC9255843 DOI: 10.1007/s43477-022-00047-2] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 02/07/2022] [Accepted: 06/01/2022] [Indexed: 11/25/2022]
Affiliation(s)
- Pamela R. Buckley
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
- Dan Edwards
- Evidence-Based Associates, 1221 Taylor St NW, Washington, DC 20011 USA
- Amanda Ladika
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
- Christine M. Steeger
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
- Karl G. Hill
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA

16
Mayo-Wilson E, Grant S, Supplee LH. Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations. Prevention Science: The Official Journal of the Society for Prevention Research 2022. [PMID: 34357509 DOI: 10.1007/s11121-021-01284-x] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/02/2023]
Abstract
Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are "evidence-based," clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as "evidence-based." In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions-an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed "TOP Guidelines for Clearinghouses" includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving "evidence-based" designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.
Affiliation(s)
- Evan Mayo-Wilson
- Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, 1025 E. 7th Street, Bloomington, IN, 47405, USA; Department of Social & Behavioral Sciences, Richard M. Fairbanks School of Public Health, Indiana University, Indianapolis, IN, USA
- Sean Grant
- Department of Social & Behavioral Sciences, Richard M. Fairbanks School of Public Health, Indiana University, Indianapolis, IN, USA

17
Carr ES, Obertino-Norwood H. Legitimizing evidence: The trans-institutional life of evidence-based practice. Soc Sci Med 2022; 310:115130. [DOI: 10.1016/j.socscimed.2022.115130] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/06/2021] [Revised: 06/03/2022] [Accepted: 06/10/2022] [Indexed: 12/01/2022]
18
Lee MJ, Maranda MJ, Magura S, Greenman G. References to Evidence-based Program Registry (EBPR) websites for behavioral health in U.S. state government statutes and regulations. Journal of Applied Social Science 2022; 16:442-458. [PMID: 35873708 PMCID: PMC9306327 DOI: 10.1177/19367244221078278] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
BACKGROUND AND AIM U.S. state governments have the responsibility to regulate and license behavioral healthcare interventions, such as for addiction and mental illness, with increasing emphasis on implementing evidence-based programs (EBPs). A serious obstacle to this is lack of clarity or agreement about what constitutes "evidence-based." The study's purpose was to determine the extent to which and in what contexts web-based Evidence-based Program Registries (EBPRs) are referenced in state government statutes and regulations ("mandates") concerning behavioral healthcare. Examples are: What Works Clearinghouse; National Register of Evidence-based Programs and Practices; Cochrane Database of Systematic Reviews. METHODS The study employed the Westlaw Legal Research Database to search for 30 known EBPR websites relevant to behavioral healthcare within the statutes and regulations of all 50 states. RESULTS There was low prevalence of EBPR references in state statutes and regulations pertaining to behavioral healthcare; 20 states had a total of 33 mandates that referenced an EBPR. These mandates usually do not rely on an EBPR as the sole acceptable source for classifying a program or practice as "evidence-based." Instead, EBPRs were named in conjunction with internal state or external sources of information about putative program effectiveness, which may be less valid than EBPRs, to determine what is "evidence-based." CONCLUSION Greater awareness of scientifically based EBPRs and greater understanding of their advantages need to be fostered among state legislators and regulators charged with making policy to increase or improve the use of evidence-based programs and practices in behavioral healthcare in the U.S.
Affiliation(s)
- Miranda J. Lee
- The Evaluation Center at Western Michigan University, Kalamazoo, MI
- Stephen Magura
- The Evaluation Center at Western Michigan University, Kalamazoo, MI
- Gregory Greenman
- The Evaluation Center at Western Michigan University, Kalamazoo, MI

19
In Search of the Common Elements of Clinical Supervision: A Systematic Review. Administration and Policy in Mental Health and Mental Health Services Research 2022; 49:623-643. [PMID: 35129739 DOI: 10.1007/s10488-022-01188-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 01/20/2022] [Indexed: 10/19/2022]
Abstract
The importance of clinical supervision for supporting effective implementation of evidence-based treatments (EBTs) is widely accepted; however, very little is known about which supervision practice elements contribute to implementation effectiveness. This systematic review aimed to generate a taxonomy of empirically-supported supervision practice elements that have been used in treatment trials and shown to independently predict improved EBT implementation. Supervision practice elements were identified using a two-phase, empirically-validated distillation process. In Phase I, a systematic review identified supervision protocols that had evidence of effectiveness based on (a) inclusion in one or more EBT trials, and (b) independent association with improved EBT implementation in one or more secondary studies. In Phase II, a hybrid deductive-inductive coding process was applied to the supervision protocols to characterize the nature and frequency of supervision practice elements across EBTs. Twenty-one of the 876 identified articles assessed the associations of supervision protocols with implementation or clinical outcomes, representing 13 separate studies. Coding and distillation of the supervision protocols resulted in a taxonomy of 21 supervision practice elements. The most frequently used elements were: reviewing supervisees' practice (92%; n = 12), clinical suggestions (85%; n = 11), behavioral rehearsal (77%; n = 10), elicitation (77%; n = 10), and fidelity assessment (77%; n = 10). This review identified supervision practice elements that could be targets for future research testing which elements are necessary and sufficient to support effective EBT implementation. Discrepancies between supervision practice elements observed in trials as compared to routine practice highlights the importance of research addressing supervision-focused implementation strategies.
20
Lee-Easton MJ, Magura S, Maranda MJ. Utilization of Evidence-based Intervention Criteria in U.S. Federal Grant Funding Announcements for Behavioral Healthcare. Inquiry: The Journal of Health Care Organization, Provision, and Financing 2022; 59:469580221126295. [PMID: 36154326 PMCID: PMC9516425 DOI: 10.1177/00469580221126295] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Recent U.S. federal government policy has required or recommended the use of evidence-based interventions (EBIs), so it is important to determine the extent to which this priority is reflected in actual federal solicitations for intervention funding, particularly for behavioral healthcare interventions. Understanding how well such policies are incorporated in funding opportunity announcements (FOAs) for grants could improve compliance with policy and increase the societal use of evidence-based interventions for behavioral healthcare. FOAs for discretionary grants (n = 243) in fiscal year 2021 were obtained from the Grants.gov website for 44 federal departments, agencies and sub-agencies that were likely to fund interventions in behavioral health-related areas. FOAs for block/formula grants to states that included behavioral healthcare (n = 17) were obtained from the SAM.gov website. Across both discretionary and block grants, EBIs were required in 60% and recommended in 21% of these FOAs for funding. Numerous different terms were used to signify EBIs by the FOAs, with the greatest variation occurring among the block grants. Lack of adequate elaboration or definition of alternative EBI terms prominently characterized FOAs issued by the Department of Health and Human Services, although less so for those issued by the Departments of Justice and Education. Overall, 43% of FOAs referenced evidence-based program registers on the web, which are scientifically credible sources of EBIs. Otherwise, most of the remaining elaborations of EBI terms in these FOAs were quite brief, often idiosyncratic, and not scientifically vetted. The FOAs generally adhered to federal policy requiring or encouraging the use of EBIs for funding requests. However, an overall pattern of absent or inadequate elaboration of terms signifying EBIs makes it difficult for applicants to comply with federal policies regarding use of EBIs for behavioral healthcare.
Affiliation(s)
- Stephen Magura
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA

21
Mayo-Wilson E, Grant S, Supplee LH. Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations. Prevention Science: The Official Journal of the Society for Prevention Research 2021; 23:774-786. [PMID: 34357509 PMCID: PMC9283145 DOI: 10.1007/s11121-021-01284-x] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/14/2021] [Indexed: 12/22/2022]
Abstract
Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.
Affiliation(s)
- Evan Mayo-Wilson
- Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, 1025 E. 7th Street, Bloomington, IN, 47405, USA
- Department of Social & Behavioral Sciences, Richard M. Fairbanks School of Public Health, Indiana University, Indianapolis, IN, USA
- Sean Grant
- Department of Social & Behavioral Sciences, Richard M. Fairbanks School of Public Health, Indiana University, Indianapolis, IN, USA

22
Steeger CM, Buckley PR, Pampel FC, Gust CJ, Hill KG. Common Methodological Problems in Randomized Controlled Trials of Preventive Interventions. Prevention Science: The Official Journal of the Society for Prevention Research 2021; 22:1159-1172. [PMID: 34176002 DOI: 10.1007/s11121-021-01263-2] [Citation(s) in RCA: 28] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 06/09/2021] [Indexed: 12/28/2022]
Abstract
Randomized controlled trials (RCTs) are often considered the gold standard in evaluating whether intervention results are in line with causal claims of beneficial effects. However, given that poor design and incorrect analysis may lead to biased outcomes, simply employing an RCT is not enough to say an intervention "works." This paper applies a subset of the Society for Prevention Research (SPR) Standards of Evidence for Efficacy, Effectiveness, and Scale-up Research, with a focus on internal validity (making causal inferences) to determine the degree to which RCTs of preventive interventions are well-designed and analyzed, and whether authors provide a clear description of the methods used to report their study findings. We conducted a descriptive analysis of 851 RCTs published from 2010 to 2020 and reviewed by the Blueprints for Healthy Youth Development web-based registry of scientifically proven and scalable interventions. We used Blueprints' evaluation criteria that correspond to a subset of SPR's standards of evidence. Only 22% of the sample satisfied important criteria for minimizing biases that threaten internal validity. Overall, we identified an average of 1-2 methodological weaknesses per RCT. The most frequent sources of bias were problems related to baseline non-equivalence (i.e., differences between conditions at randomization) or differential attrition (i.e., differences between completers versus attritors or differences between study conditions that may compromise the randomization). Additionally, over half the sample (51%) had missing or incomplete tests to rule out these potential sources of bias. Most preventive intervention RCTs need improvement in rigor to permit causal inference claims that an intervention is effective. Researchers also must improve reporting of methods and results to fully assess methodological quality. These advancements will increase the usefulness of preventive interventions by ensuring the credibility and usability of RCT findings.
Affiliation(s)
- Christine M Steeger
- Institute of Behavioral Science, University of Colorado Boulder, 1440 15th St., Boulder, CO 80309, USA
- Pamela R Buckley
- Institute of Behavioral Science, University of Colorado Boulder, 1440 15th St., Boulder, CO 80309, USA
- Fred C Pampel
- Institute of Behavioral Science, University of Colorado Boulder, 1440 15th St., Boulder, CO 80309, USA
- Charleen J Gust
- Institute of Behavioral Science, University of Colorado Boulder, 1440 15th St., Boulder, CO 80309, USA
- Karl G Hill
- Institute of Behavioral Science, University of Colorado Boulder, 1440 15th St., Boulder, CO 80309, USA

23
Buckley PR, Ebersole CR, Steeger CM, Michaelson LE, Hill KG, Gardner F. The Role of Clearinghouses in Promoting Transparent Research: A Methodological Study of Transparency Practices for Preventive Interventions. Prevention Science: The Official Journal of the Society for Prevention Research 2021; 23:787-798. [PMID: 33983558 DOI: 10.1007/s11121-021-01252-5] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 04/26/2021] [Indexed: 11/29/2022]
Abstract
Transparency of research methods is vital to science, though incentives are variable, with only some journals and funders adopting transparency policies. Clearinghouses are also important stakeholders; however, to date none have implemented formal procedures that facilitate transparent research. Using data from the longest standing clearinghouse, we examine transparency practices for preventive interventions to explore the role of online clearinghouses in incentivizing researchers to make their research more transparent. We conducted a descriptive analysis of 88 evaluation reports reviewed in 2018-2019 by Blueprints for Healthy Youth Development, when the clearinghouse began checking for trial registrations, and expanded on these efforts by applying broader transparency standards to interventions eligible for an endorsement on the Blueprints website during the study period. Reports were recent, with 84% published between 2010 and 2019. We found that few reports had data, code, or research materials that were publicly available. Meanwhile, 40% had protocols that were registered, but only 8% were registered prospectively, while one-quarter were registered before conducting analyses. About one-third included details in a registered protocol describing the treatment contrast and planned inclusions, and less than 5% had a registered statistical analysis plan (e.g., planned analytical methods, pre-specified covariates). Confirmatory research was distinguished from exploratory work in roughly 40% of reports. Reports published more recently (after 2015) had higher rates of transparency. Preventive intervention research needs to be more transparent. Since clearinghouses rely on robust findings to make well-informed decisions and researchers are incentivized to meet clearinghouse standards, clearinghouses should consider policies that encourage transparency to improve the credibility of evidence-based interventions.
Affiliation(s)
- Pamela R Buckley
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, 80309, USA
- Christine M Steeger
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, 80309, USA
- Karl G Hill
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, 80309, USA
- Frances Gardner
- Centre for Evidence-Based Intervention, Department of Social Policy & Intervention, Oxford University, Oxford, England

24
Maranda MJ, Magura S, Gugerty R, Lee MJ, Landsverk JA, Rolls-Reutz J, Green B. State behavioral health agency website references to evidence-based program registers. Evaluation and Program Planning 2021; 85:101906. [PMID: 33567376 PMCID: PMC7932747 DOI: 10.1016/j.evalprogplan.2021.101906] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/01/2020] [Revised: 08/19/2020] [Accepted: 01/18/2021] [Indexed: 05/27/2023]
Abstract
PURPOSE Evidence-based program registers (EBPRs) are important tools for facilitating the use of evidence-based practices or programs (EBPs) by state statutory agencies responsible for behavioral healthcare, broadly defined as substance misuse, mental health, HIV/AIDS prevention, child welfare, and offender rehabilitation. There are currently no data on the purposes for which such state agencies reference EBPRs on their official websites. METHOD A webscraping method was used to identify and classify relevant "hits", defined as a state behavioral health webpage with single or multiple references to a study EBPR. A total of 778 hits (unique combinations of webpage and register) were coded. Up to three codes were applied to each hit for the "reasons for the EBPR reference" (EBPR use) dimension, one code was applied to each hit for the "purpose of the EBPR reference" and "intended audience of the webpage containing the hit" dimensions, and up to two codes were applied to each hit for the "funding mentions" dimension. RESULTS Three EBPRs out of 28 accounted for 73.6% of the hits. The most frequent reason for referencing EBPRs was as a resource for selecting EBPs or validating existing programs and practices. The references tended to appear in state reports, training materials, or guidelines. The references tended to address broad groups of behavioral healthcare professionals. EBPRs were frequently referenced in the context of federal block grants or other federal funding. CONCLUSIONS Increasing state agencies' awareness and use of the entire range of existing EBPRs may improve implementation of EBPs nationally.
Affiliation(s)
- Michael J Maranda
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Stephen Magura
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Miranda J Lee
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- John A Landsverk
- Oregon Social Learning Center, 10 Shelton McMurphey Blvd, Eugene, OR 97401, USA
- Jennifer Rolls-Reutz
- Chadwick Center for Children and Families, 3020 Children's Way, Mailcode 5131, San Diego, CA 92123, USA
- Brandn Green
- Development Services Group Inc., 7315 Wisconsin Avenue, 800 East, Bethesda, MD 20814-3210, USA

25
Buckley PR, Fagan AA, Pampel FC, Hill KG. Making Evidence-Based Interventions Relevant for Users: A Comparison of Requirements for Dissemination Readiness Across Program Registries. Evaluation Review 2020; 44:51-83. [PMID: 32588654 PMCID: PMC8022079 DOI: 10.1177/0193841X20933776] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/25/2023]
Abstract
This study compares prevention program registries in current use on their level of support for users seeking to implement evidence-based programs. Despite the importance of registries as intermediaries between researchers and the public, and although previous studies have examined how registries define their standards for methodological soundness and evidence of efficacy, little research has focused on the degree to which registries consider programs' dissemination readiness. The result is that registry users are uncertain whether listed programs and their necessary support materials are even available for implementation. This study evaluates 11 publicly and privately funded prevention registries that review the evidence base of programs seeking to improve child health and prosocial outcomes on the degree to which they use dissemination readiness as an evidentiary criterion for rating programs, and the extent and type of information they provide about dissemination readiness to support real-world implementation. The results show wide variability, with few having standards about dissemination readiness or making evidence-based information about interventions easily accessible to users. Findings indicate the need for registries to (1) do more to assess dissemination readiness before including programs on their website and (2) offer more complete information on dissemination readiness and implementation support to users.
|
26
|
Fagan AA, Bumbarger BK, Barth RP, Bradshaw CP, Cooper BR, Supplee LH, Walker DK. Scaling up Evidence-Based Interventions in US Public Systems to Prevent Behavioral Health Problems: Challenges and Opportunities. PREVENTION SCIENCE : THE OFFICIAL JOURNAL OF THE SOCIETY FOR PREVENTION RESEARCH 2019; 20:1147-1168. [PMID: 31444621 PMCID: PMC6881430 DOI: 10.1007/s11121-019-01048-8] [Citation(s) in RCA: 98] [Impact Index Per Article: 16.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/03/2023]
Abstract
A number of programs, policies, and practices have been tested using rigorous scientific methods and shown to prevent behavioral health problems (Catalano et al., Lancet 379:1653-1664, 2012; National Research Council and Institute of Medicine, 2009). Yet these evidence-based interventions (EBIs) are not widely used in public systems, and they have limited reach (Glasgow et al., American Journal of Public Health 102:1274-1281, 2012; National Research Council and Institute of Medicine 2009; Prinz and Sanders, Clinical Psychology Review 27:739-749, 2007). To address this challenge and improve public health and well-being at a population level, the Society for Prevention Research (SPR) formed the Mapping Advances in Prevention Science (MAPS) IV Translation Research Task Force, which considered ways to scale up EBIs in five public systems: behavioral health, child welfare, education, juvenile justice, and public health. After reviewing other efforts to scale up EBIs in public systems, the task force identified a common set of factors affecting scale-up in all five systems. The most important factor was the degree to which these systems enacted public policies (i.e., statutes, regulations, and guidance) requiring or recommending EBIs and provided public funds for EBIs. Across systems, other facilitators of scale-up were creating EBIs that are ready for scale-up, public awareness of and support for EBIs, community engagement and capacity to implement EBIs, leadership support for EBIs, a skilled workforce capable of delivering EBIs, and data monitoring and evaluation capacity.
It was concluded that the following actions are needed to significantly increase EBI scale-up in public systems: (1) provide more public policies and funding to support the creation, testing, and scaling up of EBIs; (2) develop and evaluate specific frameworks that address systems level barriers impeding EBI scale-up; and (3) promote public support for EBIs, community capacity to implement EBIs at scale, and partnerships between community stakeholders, policy makers, practitioners, and scientists within and across systems.
Affiliation(s)
- Abigail A Fagan
- Department of Sociology, Criminology & Law, University of Florida, 3362 Turlington Hall, P.O. Box 117330, Gainesville, FL 32611-7330, USA
- Richard P Barth
- School of Social Work, University of Maryland, Baltimore, Baltimore, MD, USA
|
27
|
Zack MK, Karre JK, Olson J, Perkins DF. Similarities and differences in program registers: A case study. EVALUATION AND PROGRAM PLANNING 2019; 76:101676. [PMID: 31252374 DOI: 10.1016/j.evalprogplan.2019.101676] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/04/2019] [Revised: 05/17/2019] [Accepted: 06/21/2019] [Indexed: 05/25/2023]
Abstract
Researchers, clinicians, and other professionals are increasingly in need of cost-effective, evidence-based programs and practices. However, these individuals may lack the time and, for some, the required expertise to search for and identify such interventions. To address this concern, several online registers that list or categorize programs according to their empirical evidence of effectiveness have been established. Although these registers are designed to simplify the task of selecting evidence-based interventions, the use of distinct review processes and standards by each register creates discrepancies in final program classifications, which can pose a challenge for users. The present case study highlights three programs that have been evaluated by more than one register and have received similar or different classifications. Reasons for inconsistencies are discussed, and several recommendations for evaluating organizations and register users are provided to enhance the functionality and ease of use of online program registers.
|
28
|
Beelmann A, Malti T, Noam GG, Sommer S. Innovation and Integrity: Desiderata and Future Directions for Prevention and Intervention Science. PREVENTION SCIENCE : THE OFFICIAL JOURNAL OF THE SOCIETY FOR PREVENTION RESEARCH 2019; 19:358-365. [PMID: 29372487 DOI: 10.1007/s11121-018-0869-6] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
This article summarizes essential implications of the papers within this special issue and discusses directions for future prevention and intervention research on conceptual issues, methodological and transfer-related challenges and opportunities. We identify a need to move from programs to principles in intervention research and encourage the implementation of research on potential mechanisms underlying intervention effectiveness. In addition, current methodological issues in intervention research are highlighted, including advancements in methodology and statistical procedures, extended outcome assessments, replication studies, and a thorough examination of potential biases. We further discuss transfer-related issues, for example, the need for more research on the flexibility and adaptability of programs and intervention approaches, as well as more general problems in knowledge translation that underscore the need for enhanced communication between practitioners, policy makers, and researchers. Finally, we briefly touch on the need to discuss the relation between single intervention programs, the mental health system, and changes of contextual conditions at the macro level.
Affiliation(s)
- Andreas Beelmann
- Institute of Psychology, Department of Research Synthesis, Intervention, Evaluation, Friedrich-Schiller-University Jena, Humboldtstrasse 26, 07743 Jena, Germany
- Tina Malti
- Department of Psychology and Department of Psychiatry, University of Toronto, Mississauga, Canada
- Gil G Noam
- Harvard Medical School, Harvard University, Boston, MA, USA
|
29
|
Westbrook TR, Avellar SA, Seftor N. Reviewing the Reviews: Examining Similarities and Differences Between Federally Funded Evidence Reviews. EVALUATION REVIEW 2017; 41:183-211. [PMID: 27694128 DOI: 10.1177/0193841x16666463] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
BACKGROUND The federal government's emphasis on supporting the implementation of evidence-based programs has fueled a need to conduct and assess rigorous evaluations of programs. Through partnerships with researchers, policy makers, and practitioners, evidence reviews-projects that identify, assess, and summarize existing research in a given area-play an important role in supporting the quality of these evaluations and how the findings are used. These reviews encourage the use of sound scientific principles to identify, select, and implement evidence-based programs. The goals and standards of each review determine its conclusions about whether a given evaluation is of high quality or a program is effective. It can be difficult for decision makers to synthesize the body of evidence when faced with results from multiple program evaluations. SAMPLE This study examined 14 federally funded evidence reviews to identify commonalities and differences in their assessments of evidence of effectiveness. FINDINGS There were both similarities and significant differences across the reviews. In general, the evidence reviews agreed on the broad critical elements to consider when assessing evaluation quality, such as research design, low attrition, and baseline equivalence. The similarities suggest that, despite differences in topic and the availability of existing research, reviews typically favor evaluations that limit potential bias in their estimates of program effects. However, the way in which some of the elements were assessed, such as what constituted acceptable amounts of attrition, differed. Further, and more substantially, the reviews showed greater variation in how they conceptualized "effectiveness."
Affiliation(s)
- T'Pring R Westbrook
- Office of Planning, Research and Evaluation, Administration for Children and Families, Washington, DC, USA
- Neil Seftor
- Mathematica Policy Research, Princeton, NJ, USA
|
30
|
Schalock RL, Gomez LE, Verdugo MA, Claes C. Evidence and Evidence-Based Practices: Are We There Yet? INTELLECTUAL AND DEVELOPMENTAL DISABILITIES 2017; 55:112-119. [PMID: 28375801 DOI: 10.1352/1934-9556-55.2.112] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
The purpose of this article is to move the field of intellectual and closely related developmental disabilities (IDD) towards a better understanding of evidence and evidence-based practices. To that end, we discuss (a) different perspectives on and levels of evidence, (b) commonly used evidence-gathering strategies, (c) standards to evaluate evidence, (d) the distinction between internal and external validity, and (e) guidelines for establishing evidence-based practices. We also describe how the conceptualization and use of evidence and evidence-based practices are changing to accommodate recent trends in the field.
|
31
|
Iudici A, Gagliardo Corsi A. Evaluation in the field of social services for minors: Measuring the efficacy of interventions in the Italian service for health protection and promotion. EVALUATION AND PROGRAM PLANNING 2017; 61:160-168. [PMID: 28107694 DOI: 10.1016/j.evalprogplan.2016.11.016] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/26/2015] [Revised: 11/24/2016] [Accepted: 11/27/2016] [Indexed: 06/06/2023]
Abstract
This article presents the application of a new methodology for evaluating the efficacy of interventions in the social sciences: the Method of Computerized Textual Data Analysis (M.A.D.I.T.). We first present elements of the international and Italian legislation concerning efficacy evaluation and child protection. We then describe the process of efficacy evaluation of a child-protection intervention delivered by a public Italian service, the Minor and Family Service (MiFa). The MADIT methodology, as applied to efficacy evaluation, focuses on discursive repertoires, defined as "a linguistically intended mode of construction of finite reality". The aim of the research is to show, by describing every step of the implementation of this text-analysis methodology, how it is possible to detect progress toward the objective of a child-protection intervention. The results describe how, from a starting situation of "first appearance of a psychiatric career" for the minor, the work of the MiFa Service psychologist produced a shift toward the intervention's objective of "developing the minor's competence to identify objectives". Through this work, we show how a rigorous methodology for assessing effectiveness may contribute to improving the quality of child-protection services and may also be suitable for new fields of social science.
Affiliation(s)
- Antonio Iudici
- Department of Philosophy, Sociology, Education and Applied Psychology, University of Padova, Via Venezia 8, Padova, Italy.
|
33
|
How Do Family-Focused Prevention Programs Work? A Review of Mediating Mechanisms Associated with Reductions in Youth Antisocial Behaviors. Clin Child Fam Psychol Rev 2016; 19:285-309. [DOI: 10.1007/s10567-016-0207-0] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
|
34
|
Implementing Effective Substance Abuse Treatments in General Medical Settings: Mapping the Research Terrain. J Subst Abuse Treat 2015; 60:110-8. [PMID: 26233697 DOI: 10.1016/j.jsat.2015.06.020] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2015] [Revised: 06/22/2015] [Accepted: 06/22/2015] [Indexed: 11/20/2022]
Abstract
The National Institute on Alcohol Abuse and Alcoholism (NIAAA), National Institute on Drug Abuse (NIDA), and Veterans Health Administration (VHA) share an interest in promoting high quality, rigorous health services research to improve the availability and utilization of evidence-based treatment for substance use disorders (SUD). Recent and continuing changes in the healthcare policy and funding environments prioritize the integration of evidence-based substance abuse treatments into primary care and general medical settings. This area is a prime candidate for implementation research. Recent and ongoing implementation projects funded by these agencies are reviewed. Research in five areas is highlighted: screening and brief intervention for risky drinking; screening and brief intervention for tobacco use; uptake of FDA-approved addiction pharmacotherapies; safe opioid prescribing; and disease management. Gaps in the portfolios, and priorities for future research, are described.
|
35
|
Burkhardt JT, Schröter DC, Magura S, Means SN, Coryn CLS. An overview of evidence-based program registers (EBPRs) for behavioral health. EVALUATION AND PROGRAM PLANNING 2015; 48:92-9. [PMID: 25450777 PMCID: PMC4413923 DOI: 10.1016/j.evalprogplan.2014.09.006] [Citation(s) in RCA: 47] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/17/2023]
Abstract
Evaluations of behavioral health interventions have identified many that are potentially effective. However, clinicians and other decision makers typically lack the time and ability to effectively search and synthesize the relevant research literature. In response to this gap, and to increasing policy and funding pressures for the use of evidence-based practices, a number of “what works” websites have emerged to assist decision makers in selecting interventions with the highest probability of benefit. However, these registers as a whole are not well understood. This article, which represents phase one of a concurrent mixed methods study, presents a review of the scopes, structures, dissemination strategies, uses, and challenges faced by evidence-based registers in the behavioral health disciplines. The major findings of this study show that, in general, registers of evidence-based practices meet this need to a degree by identifying the most effective practices. However, much needs to be done to improve the ability of the registers to fully realize their purpose.
|