1. Lee-Easton MJ, Magura S, Maranda MJ, Landsverk J, Rolls-Reutz J, Green B, DeCamp W, Abu-Obaid R. A Scoping Review of the Influence of Evidence-Based Program Resources (EBPR) Websites for Behavioral Health. Administration and Policy in Mental Health and Mental Health Services Research 2023; 50:379-391. [PMID: 36564667] [PMCID: PMC10191876] [DOI: 10.1007/s10488-022-01245-8]
Abstract
Evidence-based program resources (EBPR) websites evaluate behavioral health programs, practices or policies (i.e., interventions) according to a predetermined set of research criteria and standards, usually resulting in a summary rating of the strength of an intervention's evidence base. This study is a mixed-methods analysis of the peer-reviewed academic literature relating to the influence of EBPRs on clinical practice and policy in the behavioral health field. Using an existing framework for a scoping review, we searched for research articles in PubMed, Web of Science, SCOPUS, and ProQuest that were published between January 2002 and March 2022, referenced an EBPR or multiple EBPRs, and presented data showing the influence of one or more EBPRs on behavioral health. A total of 210 articles met the inclusion criteria and were classified into five distinct categories of influence, the most important of which was showing the direct impact of one or more EBPRs on behavioral health (8.1% of articles), defined as documenting observable changes in interventions or organizations that are at least partly due to information obtained from EBPR(s). These included impacts at the state legislative and policy-making level, at the community intervention level, provider agency level, and individual practitioner level. The majority of influences identified in the study were indirect demonstrations of how EBPRs are used in various ways. However, more studies are needed to learn about the direct impact of information from EBPRs on the behavioral health field, including impact on clinician practice and treatment outcomes for consumers.
Affiliation(s)
- Miranda J Lee-Easton
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- Stephen Magura
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- Michael J Maranda
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- John Landsverk
- Oregon Social Learning Center, 10 Shelton McMurphey Blvd, Eugene, OR, 97401, USA
- Jennifer Rolls-Reutz
- Chadwick Center, Rady Children's Hospital, 3020 Children's Way-Mailcode 5131, San Diego, CA, 92123, USA
- Brandn Green
- Development Services Group Inc, 7315 Wisconsin Ave #800E, Bethesda, MD, 20814, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- Ruqayyah Abu-Obaid
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
2. Magura S, Lee-Easton MJ, Abu-Obaid RN, Landsverk J, DeCamp W, Rolls-Reutz J, Moore K, Firpo-Triplett R, Buckley PR, Stout ER, Perkins DF. The influence of evidence-based program registry websites for dissemination of evidence-based interventions in behavioral healthcare. Evaluation and Program Planning 2023; 97:102214. [PMID: 36586304] [PMCID: PMC10121732] [DOI: 10.1016/j.evalprogplan.2022.102214]
Abstract
PURPOSE Evidence-based program registries (EBPRs) are web-based databases of evaluation studies that summarize the available evidence for the effectiveness of behavioral healthcare programs, including programs addressing substance misuse, mental health, child welfare, or offender rehabilitation. The study determined the extent to which visitors to selected EBPRs accomplished the objectives of their visits and how often those visits resulted in the adoption of new or improved evidence-based interventions (EBIs). METHOD A follow-up telephone survey was conducted with 216 visitors to a convenience sample of six EBPRs an average of six months after the visitors' incident visit to the EBPR. RESULTS The most frequent objective was to identify evidence-based programs/services, curricula or assessments, followed by finding resources to implement or improve the preceding and writing a grant proposal including to comply with funding requirements; 71% of such objectives were achieved across the full set of objectives. Implementation of an EBI was completely achieved for 31% of relevant objectives and some progress on EBI implementation occurred for 19% of relevant objectives. CONCLUSIONS This is the first study to document the usage of EBPRs as a modality to increase the utilization of EBIs in the actual practice of behavioral healthcare. The results support the continued use of web-based EBPRs for disseminating information on evidence-based interventions for behavioral healthcare.
Affiliation(s)
- Stephen Magura
- Evaluation Center, Western Michigan University, Kalamazoo, MI, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, Kalamazoo, MI, USA
- Pamela R Buckley
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, USA
- Ellyson R Stout
- Suicide Prevention Resource Center project, Education Development Center, Waltham, MA, USA
- Daniel F Perkins
- Clearinghouse for Military Family Readiness, Pennsylvania State University, University Park, PA, USA
3. Evidence Clearinghouses as Tools to Advance Health Equity: What We Know from a Systematic Scan. Prevention Science 2023; 24:613-624. [PMID: 36856737] [DOI: 10.1007/s11121-023-01511-7]
Abstract
Evidence clearinghouses evaluate and summarize literature to help decision-makers prioritize and invest in evidence-informed interventions. Clearinghouses and related practice-oriented tools are continuously evolving; however, it is unclear the extent to which these tools assess and summarize evidence describing an intervention's impact on health equity. We conducted a systematic scan to explore how clearinghouses communicated an intervention's equity impact and reviewed their underlying methods and how they defined and operationalized health equity. In 2021, we identified 18 clearinghouses that were US-focused, web-based registries of interventions that assigned an intervention effectiveness rating for improving community health and the social determinants of health. We reviewed each clearinghouse's website and collected publicly available information about their health equity impact review, review methods, and health equity definitions and values. We conducted a comparative analysis among select clearinghouses using qualitative methods. Among the 18 clearinghouses, fewer than half (only seven) summarized an intervention's potential impact on health equity. Overall, those seven clearinghouses defined and operationalized equity differently, and most lacked transparency in their review methods. Clearinghouses used one or more approaches to communicate findings from their review: summarize study findings on differential impact for subpopulations, curate interventions that reduce health disparities, and/or assign a disparity/equity rating to each intervention. Evidence clearinghouses can enhance equity-focused methods and be transparent in their underlying values to better support the uptake and implementation of evidence-informed interventions to advance health equity. However, clearinghouses are unable to do so without underlying equity-focused empirical evidence.
4. Lee-Easton MJ, Magura S. Discrepancies in Ratings of Behavioral Healthcare Interventions Among Evidence-Based Program Resources Websites. Inquiry: A Journal of Medical Care Organization, Provision and Financing 2023; 60:469580231186836. [PMID: 37462104] [DOI: 10.1177/00469580231186836]
Abstract
Decision makers in the behavioral health disciplines could benefit from tools to assist them in identifying and implementing evidence-based interventions. One such tool is an evidence-based program resources website (EBPR). Prior studies have documented that when multiple EBPRs rate the same intervention, they may disagree, but research on the reasons for such conflicts is sparse. The present study examines how EBPRs rate interventions and the sources of disagreement between EBPRs when rating the same intervention. This study hypothesizes that EBPRs may disagree about intervention ratings because they use different rating paradigms, use different studies as evidence of intervention effectiveness, or both. This study identified 15 EBPRs for inclusion. One author (M.J.L.E.) coded the EBPRs for which "tiers of evidence" each EBPR used to classify behavioral health interventions and which criteria they used when rating interventions. The author then computed one Jaccard index of similarity for the criteria shared between each pair of EBPRs that co-rated interventions, and one for the studies used by EBPR rating pairs when rating the same program. The authors used a combination of chi-square, correlation, and binary logistic regression analyses to analyze the data. There was a statistically significant negative correlation between the number of Cochrane Risk of Bias criteria shared between 2 EBPRs and the likelihood of those 2 EBPRs agreeing on an intervention rating (r = -.12, P ≤ .01). There was no relationship between the number of studies evaluated by 2 EBPRs and the likelihood of those EBPRs agreeing on an intervention rating. The major reason for disagreements between EBPRs when rating the same intervention was differences in the rating criteria used by the EBPRs; the studies used by the EBPRs to rate programs did not appear to have an impact.
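The Jaccard index used in this study's method is the size of the intersection of two sets divided by the size of their union. A minimal sketch, with hypothetical criteria sets standing in for the study's actual coding:

```python
def jaccard(a, b):
    """Jaccard index |A ∩ B| / |A ∪ B| for two sets of rating criteria."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # convention: two empty sets are treated as identical
    return len(a & b) / len(a | b)

# Hypothetical criteria lists for two registries (illustrative only,
# not the criteria coded in the study)
ebpr_1 = {"randomization", "attrition", "blinding", "outcome validity"}
ebpr_2 = {"randomization", "attrition", "fidelity"}
print(jaccard(ebpr_1, ebpr_2))  # 2 shared of 5 total -> 0.4
```

The same function applies unchanged to the study's second similarity measure, where the sets are the evaluation studies each registry relied on rather than rating criteria.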
5. Magura S, Lee MJ, Abu-Obaid RN, Landsverk J, DeCamp W, Rolls-Reutz J, Green B, Ingoglia C, Hollen V, Flagg A. State Department and Provider Agency Utilization of Evidence-Based Program Registries in Behavioral Healthcare and Child Welfare. Eval Health Prof 2022; 45:397-410. [PMID: 35446692] [DOI: 10.1177/01632787221085754]
Abstract
Evidence-based program registries (EBPRs) are web-based compilations of behavioral healthcare programs/interventions that rely on research-based criteria to rate program efficacy or effectiveness for support of programmatic decision-making. The objective was to determine the extent to which behavioral health decision-makers access EBPRs and to understand whether and exactly how they use the information obtained from EPBRs. Single State Authorities (SSAs) and service provider agencies in the areas of behavioral health and child welfare were recruited nationally. Senior staff (n = 375) responsible for the selection and implementation of programs and/or policies were interviewed by telephone concerning their visits (if any) to 28 relevant EBPRs, the types of information they were seeking, whether they found it, and how they may have used that information to effect changes in their organizations. At least one EBPR was visited by 80% of the respondents, with a median of three different registers being visited. Most visitors (55%) found all the information they were seeking; those who did not desired more guidance or tools for individual program implementation or were unable to locate the program or practice that they were seeking. Most visitors (65%) related using the information obtained to make changes in their organizations, in particular to select, start or change a program, or to support the adoption or improvement of evidence-based clinical practices. EBPRs were shown to be important resources for dissemination of research-based program effectiveness data, leading to increased use of evidence-based practices in the field, but the study also identified needs for greater awareness of EBPRs generally and for more attention to implementation of specific recommended programs and practices.
Affiliation(s)
- Stephen Magura
- Evaluation Center, Western Michigan University, Kalamazoo, MI, USA
- Miranda J Lee
- Evaluation Center, Western Michigan University, Kalamazoo, MI, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, Kalamazoo, MI, USA
- Brandn Green
- Development Services Group Inc, Bethesda, MD, USA
- Charles Ingoglia
- National Council for Behavioral Health, Washington, DC, USA
- Vera Hollen
- National Association of State Mental Health Program Directors Research Institute, Falls Church, VA, USA
- Anne Flagg
- American Public Human Services Association, Arlington, VA, USA
6. Lee-Easton MJ, Magura S, Abu-Obaid RN, Landsverk J, DeCamp W, Rolls-Reutz J, Moore K, Firpo-Triplett R, Buckley PR, Stout ER, Perkins DF. Visitors' Assessment and Utilization of Evidence-Based Program Resources (EBPR) Websites. Subst Use Misuse 2022; 57:1688-1697. [PMID: 35968844] [PMCID: PMC9552282] [DOI: 10.1080/10826084.2022.2107675]
Abstract
Background: Evidence-based program resources (EBPR) websites for behavioral health are a potentially useful tool to assist decision-makers and practitioners in deciding which behavioral health interventions to implement. EBPR websites apply rigorous research standards to assess the effectiveness of behavioral healthcare programs, models, and clinical practices. Method: Visitors to a convenience sample of six EBPR websites (N=369, excluding students) were recruited for telephone interviews primarily by means of a pop-up invitation on the sites. Results: The visitors view the EBPR sites as important sources of information to support the identification and adoption of evidence-based programs/practices (EBPs) in behavioral healthcare, which aligns with the primary mission of EBPRs. For repeat visitors, there was some indication that the information obtained helped effect certain changes in their agencies' programs and policies. However, increased or improved guidance on EBP implementation was also requested. Conclusion: EBPR websites should be better publicized to the behavioral healthcare field.
Affiliation(s)
- Stephen Magura
- Evaluation Center, Western Michigan University, Kalamazoo, MI
- Whitney DeCamp
- Department of Sociology, Western Michigan University, Kalamazoo, MI
- Pamela R. Buckley
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO
- Ellyson R. Stout
- Suicide Prevention Resource Center, Education Development Center, Waltham, MA
- Daniel F. Perkins
- Clearinghouse for Military Family Readiness, Pennsylvania State University, University Park, PA
7. Buckley PR, Edwards D, Ladika A, Steeger CM, Hill KG. Implementing Evidence-Based Preventive Interventions During a Pandemic. Global Implementation Research and Applications 2022; 2:266-277. [PMID: 35813089] [PMCID: PMC9255843] [DOI: 10.1007/s43477-022-00047-2]
Affiliation(s)
- Pamela R. Buckley
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
- Dan Edwards
- Evidence-Based Associates, 1221 Taylor St NW, Washington, DC 20011 USA
- Amanda Ladika
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
- Christine M. Steeger
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
- Karl G. Hill
- Institute of Behavioral Science, University of Colorado Boulder, UCB 483, Boulder, CO 80309 USA
8. Mayo-Wilson E, Grant S, Supplee LH. Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations. Prevention Science 2022. [PMID: 34357509] [DOI: 10.1007/s11121-021-01284-x]
Abstract
Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are "evidence-based," clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as "evidence-based." In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions-an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed "TOP Guidelines for Clearinghouses" includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving "evidence-based" designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.
Affiliation(s)
- Evan Mayo-Wilson
- Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, 1025 E. 7th Street, Bloomington, IN, 47405, USA
- Department of Social & Behavioral Sciences, Fairbanks School of Public Health, Indiana University Richard M, Indianapolis, IN, USA
- Sean Grant
- Department of Social & Behavioral Sciences, Fairbanks School of Public Health, Indiana University Richard M, Indianapolis, IN, USA
9. Lee MJ, Maranda MJ, Magura S, Greenman G. References to Evidence-based Program Registry (EBPR) websites for behavioral health in U.S. state government statutes and regulations. Journal of Applied Social Science 2022; 16:442-458. [PMID: 35873708] [PMCID: PMC9306327] [DOI: 10.1177/19367244221078278]
Abstract
BACKGROUND AND AIM U.S. state governments have the responsibility to regulate and license behavioral healthcare interventions, such as for addiction and mental illness, with increasing emphasis on implementing evidence-based programs (EBPs). A serious obstacle to this is lack of clarity or agreement about what constitutes "evidence-based." The study's purpose was to determine the extent to which and in what contexts web-based Evidence-based Program Registries (EBPRs) are referenced in state government statutes and regulations ("mandates") concerning behavioral healthcare. Examples are: What Works Clearinghouse; National Register of Evidence-based Programs and Practices; Cochrane Database of Systematic Reviews. METHODS The study employed the Westlaw Legal Research Database to search for 30 known EBPR websites relevant to behavioral healthcare within the statutes and regulations of all 50 states. RESULTS There was low prevalence of EBPR references in state statutes and regulations pertaining to behavioral healthcare; 20 states had a total of 33 mandates that referenced an EBPR. These mandates usually do not rely on an EBPR as the sole acceptable source for classifying a program or practice as "evidence-based." Instead, EBPRs were named in conjunction with internal state or external sources of information about putative program effectiveness, which may be less valid than EBPRs, to determine what is "evidence-based." CONCLUSION Greater awareness of scientifically-based EBPRs and greater understanding of their advantages need to be fostered among state legislators and regulators charged with making policy to increase or improve the use of evidence-based programs and practices in behavioral healthcare in the U.S.
Affiliation(s)
- Miranda J. Lee
- The Evaluation Center at Western Michigan University, Kalamazoo, MI
- Stephen Magura
- The Evaluation Center at Western Michigan University, Kalamazoo, MI
- Gregory Greenman
- The Evaluation Center at Western Michigan University, Kalamazoo, MI
10. Lee-Easton MJ, Magura S, Maranda MJ. Utilization of Evidence-based Intervention Criteria in U.S. Federal Grant Funding Announcements for Behavioral Healthcare. Inquiry: The Journal of Health Care Organization, Provision, and Financing 2022; 59:469580221126295. [PMID: 36154326] [PMCID: PMC9516425] [DOI: 10.1177/00469580221126295]
Abstract
Recent U.S. federal government policy has required or recommended the use of evidence-based interventions (EBIs), so it is important to determine the extent to which this priority is reflected in actual federal solicitations for intervention funding, particularly for behavioral healthcare interventions. Understanding how well such policies are incorporated in federal opportunity announcements (FOAs) for grant funding could improve compliance with policy and increase the societal use of evidence-based interventions for behavioral healthcare. FOAs for discretionary grants (n = 243) in fiscal year 2021 were obtained from the Grants.gov website for 44 federal departments, agencies and sub-agencies that were likely to fund interventions in behavioral health-related areas. FOAs for block/formula grants to states that included behavioral healthcare (n = 17) were obtained from the SAM.gov website. Across both discretionary and block grants, EBIs were required in 60% and recommended in 21% of these FOAs. Numerous different terms were used to signify EBIs in the FOAs, with the greatest variation occurring among the block grants. Lack of adequate elaboration or definition of alternative EBI terms prominently characterized FOAs issued by the Department of Health and Human Services, although less so for those issued by the Departments of Justice and Education. Overall, 43% of FOAs referenced evidence-based program registers on the web, which are scientifically credible sources of EBIs. Otherwise, most of the remaining elaborations of EBI terms in these FOAs were quite brief, often idiosyncratic, and not scientifically vetted. The FOAs generally adhered to federal policy requiring or encouraging the use of EBIs for funding requests. However, an overall pattern of absent or inadequate elaboration of the terms signifying EBIs makes it difficult for applicants to comply with federal policies regarding use of EBIs for behavioral healthcare.
Affiliation(s)
- Stephen Magura
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
11. Liu J, Gao L. Research on the Characteristics and Usefulness of User Reviews of Online Mental Health Consultation Services: A Content Analysis. Healthcare (Basel) 2021; 9:1111. [PMID: 34574885] [PMCID: PMC8472137] [DOI: 10.3390/healthcare9091111]
Abstract
Online consultation based on Internet technology is gradually becoming the main way to seek health information and professional assistance. Online user reviews, such as content reviews and star ratings, are an important basis for reflecting users' views on the effectiveness of health services. Here, we used user reviews related to online psychological consultation services for content feature mining and usefulness analyses. We used a professional online psychological counseling service platform in China to collect user reviews that were liked by users as a data sample for a content analysis. An LDA topic model, dictionary-based sentiment analysis, and the NRC Word-Emotion Association Lexicon were used to extract the topic, sentiment, and context features of the content of 4254 useful reviews, and the influence of these features on the usefulness of the reviews was verified by a multiple linear regression analysis. Our results show that the content of online reviews by psychological counseling users presented a positive emotional attitude as a whole and expressed more views on the process, effects, and future expectations of counseling than on other topics. There was a significant correlation between the topic, sentiment, and context features of a user review and its usefulness: reviews giving high scores and containing topics such as "ease emotions" and "consulting expectations" received more user likes. However, a review's usefulness was significantly reduced the longer it had been posted. This research provides valuable suggestions for understanding the needs and emotional attitudes of users with mental health problems in terms of online psychological consultation; identifying the factors that affect the number of likes a review receives can help platform users write more useful consultation evaluations. In addition, the use of user-generated online reviews for content analysis effectively supplements current research on online psychological counseling in terms of data and methods.
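The dictionary-based sentiment step this abstract describes can be sketched as follows. The lexicon here is a tiny hypothetical stand-in for a real resource such as the NRC Word-Emotion Association Lexicon, and the scoring rule (mean polarity of matched words) is one common convention, not necessarily the authors' exact formula:

```python
# Hypothetical polarity lexicon (illustrative stand-in, not the NRC lexicon)
LEXICON = {
    "helpful": 1, "caring": 1, "improved": 1, "hopeful": 1,
    "confused": -1, "unhelpful": -1, "worse": -1,
}

def sentiment_score(review: str) -> float:
    """Mean polarity of lexicon words found in the review (0.0 if none match)."""
    hits = [LEXICON[w] for w in review.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

print(sentiment_score("the counselor was helpful and caring"))  # -> 1.0
print(sentiment_score("sessions left me confused"))             # -> -1.0
```

In a pipeline like the one described, scores such as these would become one feature column, alongside LDA topic proportions and context features, in the multiple linear regression predicting review usefulness.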
Affiliation(s)
- Lu Gao
- School of Management, Shanghai University, Shanghai 201800, China
12. Mayo-Wilson E, Grant S, Supplee LH. Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations. Prevention Science 2021; 23:774-786. [PMID: 34357509] [PMCID: PMC9283145] [DOI: 10.1007/s11121-021-01284-x]
Abstract
Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.
Affiliation(s)
- Evan Mayo-Wilson
- Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, 1025 E. 7th Street, Bloomington, IN, 47405, USA
- Department of Social & Behavioral Sciences, Fairbanks School of Public Health, Indiana University Richard M, Indianapolis, IN, USA
- Sean Grant
- Department of Social & Behavioral Sciences, Fairbanks School of Public Health, Indiana University Richard M, Indianapolis, IN, USA
13. Buckley PR, Ebersole CR, Steeger CM, Michaelson LE, Hill KG, Gardner F. The Role of Clearinghouses in Promoting Transparent Research: A Methodological Study of Transparency Practices for Preventive Interventions. Prevention Science 2021; 23:787-798. [PMID: 33983558] [DOI: 10.1007/s11121-021-01252-5]
Abstract
Transparency of research methods is vital to science, though incentives are variable, with only some journals and funders adopting transparency policies. Clearinghouses are also important stakeholders; however, to date none have implemented formal procedures that facilitate transparent research. Using data from the longest standing clearinghouse, we examine transparency practices for preventive interventions to explore the role of online clearinghouses in incentivizing researchers to make their research more transparent. We conducted a descriptive analysis of 88 evaluation reports reviewed in 2018-2019 by Blueprints for Healthy Youth Development, when the clearinghouse began checking for trial registrations, and expanded on these efforts by applying broader transparency standards to interventions eligible for an endorsement on the Blueprints website during the study period. Reports were recent, with 84% published between 2010 and 2019. We found that few reports had data, code, or research materials that were publicly available. Meanwhile, 40% had protocols that were registered, but only 8% were registered prospectively, while one-quarter were registered before conducting analyses. About one-third included details in a registered protocol describing the treatment contrast and planned inclusions, and less than 5% had a registered statistical analysis plan (e.g., planned analytical methods, pre-specified covariates). Confirmatory research was distinguished from exploratory work in roughly 40% of reports. Reports published more recently (after 2015) had higher rates of transparency. Preventive intervention research needs to be more transparent. Since clearinghouses rely on robust findings to make well-informed decisions and researchers are incentivized to meet clearinghouse standards, clearinghouses should consider policies that encourage transparency to improve the credibility of evidence-based interventions.
Affiliation(s)
- Pamela R Buckley
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, 80309, USA.
- Christine M Steeger
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, 80309, USA
- Karl G Hill
- Institute of Behavioral Science, University of Colorado Boulder, Boulder, CO, 80309, USA
- Frances Gardner
- Centre for Evidence-Based Intervention, Department of Social Policy & Intervention, Oxford University, Oxford, England
|
14
|
Harding JF, Knab J, Zief S, Kelly K, McCallum D. A Systematic Review of Programs to Promote Aspects of Teen Parents' Self-sufficiency: Supporting Educational Outcomes and Healthy Birth Spacing. Matern Child Health J 2021; 24:84-104. [PMID: 31965469 PMCID: PMC7497377 DOI: 10.1007/s10995-019-02854-w] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
Introduction Expectant and parenting teens experience many challenges to achieving self-sufficiency and promoting their children’s healthy development. Teen parents need support to help them address these challenges, and many different types of programs aim to support them. In this systematic review, we examine the research about programs that aim to support aspects of teen parents’ self-sufficiency by promoting their educational outcomes and healthy birth spacing. Methods We conducted a comprehensive literature search of published and unpublished literature to identify studies of programs to support teen parents that met this review’s eligibility criteria. The quality and execution of the eligible study research designs were assessed to determine whether studies’ findings were at risk of bias. We then extracted information about study characteristics, outcomes, and program characteristics for studies considered to provide rigorous evidence. Results We identified 58 eligible studies. Twenty-three studies were considered to provide rigorous evidence about either education, contraceptive use, or repeat pregnancy or birth. Seventeen of these studies showed at least one favorable effect on an outcome in one of these domains, whereas the other six did not show any significant or substantial effects in these domains. These 17 studies represent 14 effective programs. Discussion Effective programs to support expectant and parenting teens have diverse characteristics, indicating there is no single approach for promoting teens’ education and healthy birth spacing. More rigorous studies of programs to support teen parents are needed to understand more about how to support teen fathers and the program characteristics associated with effectiveness.
Affiliation(s)
- Jean Knab
- Mathematica, P.O. Box 2393, Princeton, NJ, 08543-2393, USA
- Susan Zief
- Mathematica, P.O. Box 2393, Princeton, NJ, 08543-2393, USA
- Kevin Kelly
- Mathematica, P.O. Box 2393, Princeton, NJ, 08543-2393, USA
- Diana McCallum
- Mathematica, P.O. Box 2393, Princeton, NJ, 08543-2393, USA
|
15
|
Maranda MJ, Magura S, Gugerty R, Lee MJ, Landsverk JA, Rolls-Reutz J, Green B. State behavioral health agency website references to evidence-based program registers. EVALUATION AND PROGRAM PLANNING 2021; 85:101906. [PMID: 33567376 PMCID: PMC7932747 DOI: 10.1016/j.evalprogplan.2021.101906] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/01/2020] [Revised: 08/19/2020] [Accepted: 01/18/2021] [Indexed: 05/27/2023]
Abstract
PURPOSE Evidence-based program registers (EBPRs) are important tools for facilitating the use of evidence-based practices or programs (EBPs) by state statutory agencies responsible for behavioral healthcare, broadly defined as substance misuse, mental health, HIV/AIDS prevention, child welfare, and offender rehabilitation. There are currently no data on the purposes for which such state agencies reference EBPRs on their official websites. METHOD A webscraping method was used to identify and classify relevant "hits", defined as a state behavioral health webpage with single or multiple references to a study EBPR. A total of 778 hits (unique combinations of webpage and register) were coded. Up to three codes were applied to each hit for the "reasons for the EBPR reference" (EBPR use) dimension, one code was applied to each hit for the "purpose of the EBPR reference" and "intended audience of the webpage containing the hit" dimensions, and up to two codes were applied to each hit for the "funding mentions" dimension. RESULTS Three EBPRs out of 28 accounted for 73.6% of the hits. The most frequent reason for referencing EBPRs was as a resource for selecting EBPs or validating existing programs and practices. The references tended to appear in state reports, training materials, or guidelines. The references tended to address broad groups of behavioral healthcare professionals. EBPRs were frequently referenced in the context of federal block grants or other federal funding. CONCLUSIONS Increasing state agencies' awareness and use of the entire range of existing EBPRs may improve implementation of EBPs nationally.
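The multi-dimensional coding scheme in this abstract (up to three "reason" codes, one "purpose" code, one "audience" code, and up to two "funding" codes per hit) can be sketched as a small data structure. This is a minimal illustration of the scheme as described, not the authors' actual pipeline; the class, field, and example code values are assumptions.

```python
from dataclasses import dataclass, field

# Per-dimension code limits as reported in the study.
MAX_CODES = {"reason": 3, "purpose": 1, "audience": 1, "funding": 2}

@dataclass
class Hit:
    """One unique (webpage, register) combination from the webscrape."""
    webpage_url: str
    register: str
    codes: dict = field(default_factory=lambda: {d: [] for d in MAX_CODES})

    def add_code(self, dimension: str, code: str) -> None:
        # Enforce the study's per-dimension limits before attaching a code.
        if dimension not in MAX_CODES:
            raise KeyError(f"unknown coding dimension: {dimension}")
        if len(self.codes[dimension]) >= MAX_CODES[dimension]:
            raise ValueError(
                f"'{dimension}' allows at most {MAX_CODES[dimension]} code(s) per hit"
            )
        self.codes[dimension].append(code)

# Hypothetical hit (URL and register name are illustrative only).
hit = Hit("https://state.example.gov/ebp-guidelines", "Blueprints")
hit.add_code("reason", "resource for selecting EBPs")
hit.add_code("purpose", "guidelines")
hit.add_code("funding", "federal block grant")
```

Under this sketch, attempting a second "purpose" code on the same hit raises an error, mirroring the one-code rule for that dimension.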
Affiliation(s)
- Michael J Maranda
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Stephen Magura
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- Miranda J Lee
- Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI 49008, USA
- John A Landsverk
- Oregon Social Learning Center, 10 Shelton McMurphey Blvd, Eugene, OR 97401, USA
- Jennifer Rolls-Reutz
- Chadwick Center for Children and Families, 3020 Children's Way-Mailcode 5131, San Diego, CA 92123, USA
- Brandn Green
- Development Services Group Inc., 7315 Wisconsin Avenue, 800 East, Bethesda, MD 20814-3210, USA
|
16
|
Lawarée J, Jacob S, Ouimet M. A scoping review of knowledge syntheses in the field of evaluation across four decades of practice. EVALUATION AND PROGRAM PLANNING 2020; 79:101761. [PMID: 31812838 DOI: 10.1016/j.evalprogplan.2019.101761] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/04/2019] [Revised: 11/14/2019] [Accepted: 11/20/2019] [Indexed: 06/10/2023]
Abstract
This scoping review of 62 knowledge syntheses published in evaluation-focused journals between 1979 and May 2018 provides a portrait of synthesis practices and their evolution in the mainstream of the field of evaluation. Concerns surrounding the production of knowledge syntheses to answer policy questions are not new in the field of evaluation. However, during the last decade, knowledge synthesis methods have expanded as a means to go beyond the limits and constraints of singular evaluations. This scoping review reveals and discusses two key issues with regard to the expansion of knowledge synthesis practices within the field of evaluation: the diversity, and muddling, of methodological practices and synthesis designs, and the frequent omission of quality appraisals.
Affiliation(s)
- Justin Lawarée
- Université Laval, CAPP - Centre d'analyse Des Politiques Publiques, Pavillon Charles-De Koninck, 1030 Avenue Des Sciences Humaines, Québec, QC, G1V 0A6, Canada.
- Steve Jacob
- Université Laval, CAPP - Centre d'analyse Des Politiques Publiques, Pavillon Charles-De Koninck, 1030 Avenue Des Sciences Humaines, Québec, QC, G1V 0A6, Canada
- Mathieu Ouimet
- Université Laval, CAPP - Centre d'analyse Des Politiques Publiques, Pavillon Charles-De Koninck, 1030 Avenue Des Sciences Humaines, Québec, QC, G1V 0A6, Canada
|
17
|
Buckley PR, Fagan AA, Pampel FC, Hill KG. Making Evidence-Based Interventions Relevant for Users: A Comparison of Requirements for Dissemination Readiness Across Program Registries. EVALUATION REVIEW 2020; 44:51-83. [PMID: 32588654 PMCID: PMC8022079 DOI: 10.1177/0193841x20933776] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/25/2023]
Abstract
This study compares prevention program registries in current use on their level of support for users seeking to implement evidence-based programs. Despite the importance of registries as intermediaries between researchers and the public, and although previous studies have examined how registries define their standards for methodological soundness and evidence of efficacy, little research has focused on the degree to which registries consider programs' dissemination readiness. The result is that registry users are uncertain whether listed programs and their necessary support materials are even available for implementation. This study evaluates 11 publicly and privately funded prevention registries that review the evidence base of programs seeking to improve child health and prosocial outcomes, assessing the degree to which each registry uses dissemination readiness as an evidentiary criterion for rating programs and the extent and type of information it provides about dissemination readiness to support real-world implementation. The results show wide variability, with few registries having standards about dissemination readiness or making evidence-based information about interventions easily accessible to users. Findings indicate the need for registries to (1) do more to assess dissemination readiness before including programs on their websites and (2) offer more complete information on dissemination readiness and implementation support to users.
|
18
|
Green-Hennessy S. Suspension of the National Registry of Evidence-Based Programs and Practices: the importance of adhering to the evidence. SUBSTANCE ABUSE TREATMENT PREVENTION AND POLICY 2018; 13:26. [PMID: 29929542 PMCID: PMC6013894 DOI: 10.1186/s13011-018-0162-5] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 02/28/2018] [Accepted: 05/29/2018] [Indexed: 11/10/2022]
Abstract
Recently the United States Assistant Secretary of Mental Health and Substance Use disclosed having suspended the National Registry of Evidence-Based Programs and Practices, stating it was so deficient in both rigor and breadth that it must be replaced. However, a closer examination of her claims about the Registry indicates many of them to be inaccurate. Contrary to her assertions, the Registry is not devoid of medication-assisted treatments for opioid use; nor does it contain but a scant few interventions related to schizophrenia and psychosis. Moreover, many of her criticisms regarding rigor pertain to reviews completed since late 2015, when the Substance Abuse and Mental Health Services Administration altered key aspects of the Registry. In contrast to reviews generated under the 2007 rules, these newer reviews rely on fewer references, incorporate less expert input, are more likely to be based exclusively on gray literature, and are no longer required either to provide dissemination readiness information or meet certain minimum research quality standards. However, only 123 (25.7%) of the 479 Registry interventions have been reviewed solely using the problematic 2015 criteria, with the remaining 356 interventions having a review that used the 2007 guidelines. Yet, rather than address the agency’s recent missteps and expand the Registry’s content coverage, the agency appears to have decided to invest considerable resources into replacing it, relying heavily on expert consensus versus empirical data in its initial attempt to do so. This raises questions about the agency’s current commitment to evidence-based practice.
Affiliation(s)
- Sharon Green-Hennessy
- Department of Psychology, Loyola University Maryland, 4501 N. Charles Street, Baltimore, MD, 21210, USA.
|
19
|
Klerman JA. Editor in Chief's Comment: External Validity in Systematic Reviews. EVALUATION REVIEW 2017; 41:391-402. [PMID: 30231693 DOI: 10.1177/0193841x17746740] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
|
20
|
Haegerich TM, David-Ferdon C, Noonan RK, Manns BJ, Billie HC. Technical Packages in Injury and Violence Prevention to Move Evidence Into Practice: Systematic Reviews and Beyond. EVALUATION REVIEW 2017; 41:78-108. [PMID: 27604301 PMCID: PMC5340632 DOI: 10.1177/0193841x16667214] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/17/2023]
Abstract
Injury and violence prevention strategies have greater potential for impact when they are based on scientific evidence. Systematic reviews of the scientific evidence can contribute key information about which policies and programs might have the greatest impact when implemented. However, systematic reviews have limitations, such as lack of implementation guidance and contextual information, that can limit the application of knowledge. "Technical packages," developed by knowledge brokers such as the federal government, nonprofit agencies, and academic institutions, have the potential to be an efficient mechanism for making information from systematic reviews actionable. Technical packages provide information about specific evidence-based prevention strategies, along with the estimated costs and impacts, and include accompanying implementation and evaluation guidance to facilitate adoption, implementation, and performance measurement. We describe how systematic reviews can inform the development of technical packages for practitioners, provide examples of technical packages in injury and violence prevention, and explain how enhancing review methods and reporting could facilitate the use and applicability of scientific evidence.
Affiliation(s)
- Tamara M Haegerich
- Division of Unintentional Injury Prevention, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, GA, USA
- Corinne David-Ferdon
- Division of Violence Prevention, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, GA, USA
- Rita K Noonan
- Division of Unintentional Injury Prevention, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, GA, USA
- Brian J Manns
- Office of the Associate Director for Policy, Centers for Disease Control and Prevention, Atlanta, GA, USA
- Holly C Billie
- Division of Unintentional Injury Prevention, National Center for Injury Prevention and Control, Centers for Disease Control and Prevention, Atlanta, GA, USA
|