1.
Ng JY, Lin B, Parikh T, Cramer H, Moher D. Investigating the nature of open science practices across complementary, alternative, and integrative medicine journals: An audit. PLoS One 2024; 19:e0302655. PMID: 38701100; PMCID: PMC11068175; DOI: 10.1371/journal.pone.0302655.
Abstract
BACKGROUND Open science practices are implemented across many scientific fields to improve transparency and reproducibility in research. Complementary, alternative, and integrative medicine (CAIM) is a growing field that may benefit from the adoption of open science practices. The efficacy and safety of CAIM practices, a prominent concern in the field, can be validated or refuted through transparent and reliable research. Auditing CAIM journals against the Transparency and Openness Promotion (TOP) guidelines can help promote open science practices across these journals. The purpose of this study is to conduct an audit that compares and ranks the open science practices adopted by CAIM journals against the TOP guidelines laid out by the Center for Open Science (COS). METHODS CAIM-specific journals with titles containing the words "complementary", "alternative", and/or "integrative" were included in this audit. Each of the eight TOP criteria was used to extract open science practices from each of the CAIM journals. Data were summarized by TOP guideline and ranked using the TOP Factor to identify commonalities and differences in practices across the included journals. RESULTS A total of 19 CAIM journals were included in this audit. Across all journals, the mean TOP Factor was 2.95, with a median score of 2. The findings of this study reveal high variability in the open science practices required by journals in this field. Four journals (21%) had a final TOP score of 0, while the total scores of the remaining 15 (79%) ranged from 1 to 8. CONCLUSION While several studies have audited open science practices across discipline-specific journals, none have focused on CAIM journals. The results of this study indicate that CAIM journals provide minimal guidelines to encourage or require authors to adhere to open science practices, and there is an opportunity to improve the use of open science practices in the field.
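The TOP Factor used for ranking sums a journal's level (0-3) on each of the eight TOP standards, and the summary statistics above are then taken across journals. A minimal sketch of that arithmetic, using hypothetical journal scores rather than the study's data:

```python
from statistics import mean, median

# Hypothetical per-criterion levels (0-3) for three journals. The eight TOP
# standards cover citation; data, code, and materials transparency; design
# and analysis transparency; preregistration of studies and of analysis
# plans; and replication.
journals = {
    "Journal A": [0, 0, 0, 0, 0, 0, 0, 0],
    "Journal B": [1, 2, 1, 0, 0, 1, 1, 2],
    "Journal C": [1, 1, 0, 0, 0, 0, 0, 0],
}

# TOP Factor per journal: the sum of the eight criterion levels.
top_factor = {name: sum(levels) for name, levels in journals.items()}

scores = list(top_factor.values())
print(top_factor)                   # {'Journal A': 0, 'Journal B': 8, 'Journal C': 2}
print(mean(scores), median(scores))
```

With these illustrative scores the median is 2 even though one journal scores 8, which mirrors the skew the audit reports (mean 2.95, median 2).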
Affiliation(s)
- Jeremy Y. Ng
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
  - Ottawa Hospital Research Institute, Centre for Journalology, Ottawa Methods Centre, Ottawa, Ontario, Canada
- Brenda Lin
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Tisha Parikh
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Holger Cramer
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- David Moher
  - Ottawa Hospital Research Institute, Centre for Journalology, Ottawa Methods Centre, Ottawa, Ontario, Canada
  - School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada
2.
Kashif Al-Ghita M, Cobey K, Moher D, Leeflang MMG, Ebrahimzadeh S, Lam E, Rooprai P, Khalil AA, Islam N, Algodi H, Dawit H, Adamo R, Zeghal M, McInnes MDF. Cross-Sectional Evaluation of Open Science Practices at Imaging Journals: A Meta-Research Study. Can Assoc Radiol J 2024; 75:330-343. PMID: 37997809; DOI: 10.1177/08465371231211290.
Abstract
Objective: To evaluate the open science policies of imaging journals and compliance with these policies in published articles. Methods: From the imaging journals listed, we extracted open science policy details: protocol registration, reporting guidelines, funding, ethics and conflicts of interest (COI), data sharing, and open access publishing. The 10 most recently published studies from each journal were assessed to determine adherence to these policies. We summarized the proportion of open science policies met as an Open Science Score (OSS) for all journals and articles, and evaluated relationships between OSS and journal/article level variables. Results: 82 journals/820 articles were included. The OSS of journals and articles was 58.3% and 31.8%, respectively. Of the journals, 65.9% had registration policies and 78.1% had reporting guideline policies; 79.3% were members of COPE, 81.7% had plagiarism policies, 100% required disclosure of funding, and 97.6% required disclosure of COI and ethics approval. 81.7% had data sharing policies and 15.9% were fully open access. Of the articles, 7.8% had a registered protocol, 8.4% followed a reporting guideline, 77.4% disclosed funding, 88.7% disclosed COI, and 85.6% reported ethics approval. 12.3% of articles shared their data, and 51% were available through open access or as a preprint. OSS was higher for journals with DOAJ membership (80% vs 54.2%; P < .0001). Impact factor was not correlated with journal OSS. Knowledge synthesis articles had higher OSS scores (44.5%) than prospective (32.6%) and retrospective (30.0%) studies (P < .0001). Conclusion: Imaging journals endorsed just over half of the open science practices considered; however, the application of these practices at the article level was lower.
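The Open Science Score described here is the proportion of assessed open science items met. A minimal sketch with a hypothetical journal checklist (the item names are illustrative, not the study's exact instrument):

```python
# Hypothetical policy checklist for one journal: True if the item is met.
checklist = {
    "protocol_registration": True,
    "reporting_guidelines": True,
    "funding_disclosure": True,
    "coi_disclosure": True,
    "ethics_approval": True,
    "data_sharing_policy": True,
    "fully_open_access": False,
    "plagiarism_policy": True,
}

# OSS: percentage of assessed items met.
oss = 100 * sum(checklist.values()) / len(checklist)
print(f"OSS = {oss:.1f}%")  # OSS = 87.5%
```

The same calculation applied to articles (registration, reporting guideline use, disclosures, data sharing, open access) yields the article-level scores reported above.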
Affiliation(s)
- Kelly Cobey
  - School of Epidemiology and Public Health, University of Ottawa Heart Institute, Ottawa, ON, Canada
- David Moher
  - School of Epidemiology and Public Health, University of Ottawa Heart Institute, Ottawa, ON, Canada
  - Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON, Canada
- Mariska M G Leeflang
  - Epidemiology and Data Science, Amsterdam University Medical Centers, University of Amsterdam, Amsterdam, The Netherlands
- Eric Lam
  - Ottawa Hospital Research Institute, Ottawa, ON, Canada
- Paul Rooprai
  - Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Ahmed Al Khalil
  - Department of Radiology, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Nabil Islam
  - Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Hamza Algodi
  - Faculty of Biology, University of Western Ontario, London, ON, Canada
- Haben Dawit
  - School of Epidemiology, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Robert Adamo
  - Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Mahdi Zeghal
  - Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Matthew D F McInnes
  - Department of Radiology, School of Epidemiology and Public Health, Ottawa Hospital Research Institute, University of Ottawa, Ottawa, ON, Canada
3.
Gnambs T, Schroeders U. Accuracy and precision of fixed and random effects in meta-analyses of randomized control trials for continuous outcomes. Res Synth Methods 2024; 15:86-106. PMID: 37751893; DOI: 10.1002/jrsm.1673.
Abstract
Meta-analyses of treatment effects in randomized control trials are often faced with the problem of missing information required to calculate effect sizes and their sampling variances. In particular, correlations between pre- and posttest scores are frequently not available. As an ad hoc solution, researchers impute a constant value for the missing correlation. As an alternative, we propose adopting a multivariate meta-regression approach that models independent group effect sizes and accounts for the dependency structure using robust variance estimation or three-level modeling. A comprehensive simulation study mimicking realistic conditions of meta-analyses in clinical and educational psychology suggested that imputing a fixed correlation of 0.8 or adopting a multivariate meta-regression with robust variance estimation works well for estimating the pooled effect but leads to slightly distorted between-study heterogeneity estimates. In contrast, three-level meta-regressions resulted in largely unbiased fixed effects but more inconsistent prediction intervals. Based on these results, recommendations for meta-analytic practice and future meta-analytic developments are provided.
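The imputed pre-post correlation enters the effect-size calculation through the standard deviation of change scores, SD_d = sqrt(SD_pre^2 + SD_post^2 - 2 r SD_pre SD_post). A minimal sketch with illustrative values:

```python
import math

def change_score_sd(sd_pre: float, sd_post: float, r: float) -> float:
    """SD of pre-post difference scores given the pre-post correlation r."""
    return math.sqrt(sd_pre**2 + sd_post**2 - 2 * r * sd_pre * sd_post)

# Illustrative values: equal pre/post SDs of 10 and an imputed r = 0.8
# (the fixed value the simulation found to work well for the pooled effect).
sd_d = change_score_sd(10.0, 10.0, 0.8)
print(sd_d)  # sqrt(200 - 160) = sqrt(40) ≈ 6.32
```

The higher the imputed correlation, the smaller the change-score SD, and hence the larger the standardized pre-post effect, which is why a misjudged constant distorts downstream estimates.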
Affiliation(s)
- Timo Gnambs
  - Leibniz Institute for Educational Trajectories, Bamberg, Germany
4.
Bullock GS, Ward P, Impellizzeri FM, Kluzek S, Hughes T, Hillman C, Waterman BR, Danelson K, Henry K, Barr E, Healy K, Räisänen AM, Gomez C, Fernandez G, Wolf J, Nicholson KF, Sell T, Zerega R, Dhiman P, Riley RD, Collins GS. Up Front and Open? Shrouded in Secrecy? Or Somewhere in Between? A Meta-Research Systematic Review of Open Science Practices in Sport Medicine Research. J Orthop Sports Phys Ther 2023; 53:1-13. PMID: 37860866; DOI: 10.2519/jospt.2023.12016.
Abstract
OBJECTIVE: To investigate open science practices in research published in the top 5 sports medicine journals between May 1, 2022, and October 1, 2022. DESIGN: A meta-research systematic review. LITERATURE SEARCH: Open science practices were searched in MEDLINE. STUDY SELECTION CRITERIA: We included original scientific research published in one of the identified top 5 sports medicine journals in 2022 as ranked by Clarivate: (1) British Journal of Sports Medicine, (2) Journal of Sport and Health Science, (3) American Journal of Sports Medicine, (4) Medicine and Science in Sports and Exercise, and (5) Sports Medicine-Open. Studies were excluded if they were systematic reviews, qualitative research, gray literature, or animal or cadaver models. DATA SYNTHESIS: Open science practices were extracted in accordance with the Transparency and Openness Promotion guidelines, along with patient and public involvement. RESULTS: Two hundred forty-three studies were included. The median number of open science practices in each study was 2, out of a maximum of 12 (range: 0-8; interquartile range: 2). Two hundred thirty-four studies (96%, 95% confidence interval [CI]: 94%-99%) provided an author conflict-of-interest statement and 163 (67%, 95% CI: 62%-73%) reported funding. Twenty-one studies (9%, 95% CI: 5%-12%) provided open-access data. Fifty-four studies (22%, 95% CI: 17%-27%) included a data availability statement and 3 (1%, 95% CI: 0%-3%) made code available. Seventy-six studies (32%, 95% CI: 25%-37%) had transparent materials and 30 (12%, 95% CI: 8%-16%) used a reporting guideline. Twenty-eight studies (12%, 95% CI: 8%-16%) were preregistered. Six studies (3%, 95% CI: 1%-4%) published a protocol. Four studies (2%, 95% CI: 0%-3%) reported an analysis plan a priori. Seven studies (3%, 95% CI: 1%-5%) reported patient and public involvement. CONCLUSION: Open science practices in the sports medicine field are extremely limited. The least followed practices were sharing code, data, and analysis plans.
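The proportions above are reported with 95% confidence intervals. One standard way to compute such an interval for a count x out of n is the Wilson score interval; a sketch follows (Wilson is an assumption on my part, and the paper may have used a different interval method):

```python
import math

def wilson_ci(x: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a binomial proportion x/n."""
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Example: 234 of 243 studies provided a conflict-of-interest statement.
lo, hi = wilson_ci(234, 243)
print(f"{234/243:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```

Unlike the simple normal approximation, the Wilson interval behaves sensibly for proportions near 0% or 100%, which several of the reported practices (e.g., code sharing at 1%) approach.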
Affiliation(s)
- Garrett S Bullock
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
  - Department of Biostatistics and Data Science, Wake Forest School of Medicine, Winston-Salem, NC
  - Centre for Sport, Exercise and Osteoarthritis Research Versus Arthritis, University of Oxford, Oxford, United Kingdom
  - Sport Injury Prevention Research Center, University of Calgary, Calgary, AB, Canada
- Franco M Impellizzeri
  - School of Sport, Exercise, and Rehabilitation, University of Technology Sydney, Sydney, Australia
- Stefan Kluzek
  - Centre for Sport, Exercise and Osteoarthritis Research Versus Arthritis, University of Oxford, Oxford, United Kingdom
  - Sports Medicine Research Department, University of Nottingham, Nottingham, UK
  - English Institute of Sport, Marlow, United Kingdom
- Tom Hughes
  - Department of Health Professions, Manchester Metropolitan University, Manchester, United Kingdom
- Charles Hillman
  - Sports Medicine Research Department, University of Nottingham, Nottingham, UK
- Brian R Waterman
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kerry Danelson
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kaitlin Henry
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Emily Barr
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kelsey Healy
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Anu M Räisänen
  - Department of Physical Therapy Education - Oregon, College of Health Sciences-Northwest, Western University of Health Sciences, Lebanon, OR
  - Sport Injury Prevention Research Centre, Faculty of Kinesiology, University of Calgary, Calgary, AB, Canada
- Christina Gomez
  - Department of Physical Therapy Education - Oregon, College of Health Sciences-Northwest, Western University of Health Sciences, Lebanon, OR
- Garrett Fernandez
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Jakob Wolf
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kristen F Nicholson
  - Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Paula Dhiman
  - Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology, and Musculoskeletal Sciences, University of Oxford, Oxford, United Kingdom
- Richard D Riley
  - Institute of Applied Health Research, College of Medical and Dental Sciences, University of Birmingham, Birmingham, United Kingdom
- Gary S Collins
  - Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology, and Musculoskeletal Sciences, University of Oxford, Oxford, United Kingdom
5.
Hamilton DG, Hong K, Fraser H, Rowhani-Farid A, Fidler F, Page MJ. Prevalence and predictors of data and code sharing in the medical and health sciences: systematic review with meta-analysis of individual participant data. BMJ 2023; 382:e075767. PMID: 37433624; DOI: 10.1136/bmj-2023-075767.
Abstract
OBJECTIVES To synthesise research investigating data and code sharing in medicine and health to establish an accurate representation of the prevalence of sharing, how this frequency has changed over time, and what factors influence availability. DESIGN Systematic review with meta-analysis of individual participant data. DATA SOURCES Ovid Medline, Ovid Embase, and the preprint servers medRxiv, bioRxiv, and MetaArXiv were searched from inception to 1 July 2021. Forward citation searches were also performed on 30 August 2022. REVIEW METHODS Meta-research studies that investigated data or code sharing across a sample of scientific articles presenting original medical and health research were identified. Two authors screened records, assessed the risk of bias, and extracted summary data from study reports when individual participant data could not be retrieved. Key outcomes of interest were the prevalence of statements that declared that data or code were publicly or privately available (declared availability) and the success rates of retrieving these products (actual availability). The associations between data and code availability and several factors (eg, journal policy, type of data, trial design, and human participants) were also examined. A two-stage approach to meta-analysis of individual participant data was performed, with proportions and risk ratios pooled with the Hartung-Knapp-Sidik-Jonkman method for random effects meta-analysis. RESULTS The review included 105 meta-research studies examining 2 121 580 articles across 31 specialties. Eligible studies examined a median of 195 primary articles (interquartile range 113-475), with a median publication year of 2015 (interquartile range 2012-2018). Only eight studies (8%) were classified as having a low risk of bias. Meta-analyses showed a prevalence of declared and actual public data availability of 8% (95% confidence interval 5% to 11%) and 2% (1% to 3%), respectively, between 2016 and 2021. For public code sharing, the prevalence of both declared and actual availability was estimated to be <0.5% since 2016. Meta-regressions indicated that only declared public data sharing prevalence estimates have increased over time. Compliance with mandatory data sharing policies ranged from 0% to 100% across journals and varied by type of data. In contrast, success in privately obtaining data and code from authors historically ranged between 0% and 37% and 0% and 23%, respectively. CONCLUSIONS The review found that public code sharing was persistently low across medical research. Declarations of data sharing were also low, albeit increasing over time, and did not always correspond to actual sharing of data. The effectiveness of mandatory data sharing policies varied substantially by journal and type of data, a finding that might be informative for policy makers when designing policies and allocating resources to audit compliance. SYSTEMATIC REVIEW REGISTRATION Open Science Framework doi:10.17605/OSF.IO/7SX8U.
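The Hartung-Knapp-Sidik-Jonkman (HKSJ) method mentioned here adjusts the variance of a random-effects pooled estimate, with confidence intervals based on a t distribution with k-1 degrees of freedom. A minimal sketch of the pooling machinery, using a DerSimonian-Laird heterogeneity estimator and illustrative effect sizes (the review pooled proportions and risk ratios from its own data, which this does not reproduce):

```python
import math

def hksj_pool(y, v):
    """Random-effects pooling with the HKSJ variance adjustment.

    y: study effect estimates; v: their within-study variances.
    Returns (pooled estimate, HKSJ standard error, tau^2).
    """
    k = len(y)
    w = [1 / vi for vi in v]                       # fixed-effect weights
    mu_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)             # DerSimonian-Laird tau^2
    w_re = [1 / (vi + tau2) for vi in v]           # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    # HKSJ: variance from weighted squared deviations around mu, scaled by k - 1.
    q_star = sum(wi * (yi - mu) ** 2 for wi, yi in zip(w_re, y))
    se_hksj = math.sqrt(q_star / ((k - 1) * sum(w_re)))
    return mu, se_hksj, tau2

# Illustrative study effects (e.g., logit proportions) and variances.
y = [0.1, 0.5, 0.3, -0.1, 0.6]
v = [0.02, 0.03, 0.025, 0.04, 0.05]
mu, se, tau2 = hksj_pool(y, v)
print(round(mu, 3), round(se, 3), round(tau2, 4))
```

Compared with the conventional random-effects standard error, the HKSJ adjustment tends to widen intervals when heterogeneity is present, which is its main appeal for meta-analyses with few studies.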
Affiliation(s)
- Daniel G Hamilton
  - MetaMelb Research Group, School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
  - Melbourne Medical School, Faculty of Medicine, Dentistry, and Health Sciences, University of Melbourne, Melbourne, VIC, Australia
- Kyungwan Hong
  - Department of Practice, Sciences, and Health Outcomes Research, University of Maryland School of Pharmacy, Baltimore, MD, USA
- Hannah Fraser
  - MetaMelb Research Group, School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
- Anisa Rowhani-Farid
  - Department of Practice, Sciences, and Health Outcomes Research, University of Maryland School of Pharmacy, Baltimore, MD, USA
- Fiona Fidler
  - MetaMelb Research Group, School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
  - School of Historical and Philosophical Studies, University of Melbourne, Melbourne, VIC, Australia
- Matthew J Page
  - Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
6.
Louderback ER, Gainsbury SM, Heirene RM, Amichia K, Grossman A, Bernhard BJ, LaPlante DA. Open Science Practices in Gambling Research Publications (2016-2019): A Scoping Review. J Gambl Stud 2022; 39:987-1011. PMID: 35678905; PMCID: PMC9178323; DOI: 10.1007/s10899-022-10120-y.
Abstract
The replication crisis has stimulated researchers around the world to adopt open science research practices intended to reduce publication bias and improve research quality. Open science practices include study pre-registration, open data, open access, and avoiding methods that can lead to publication bias and low replication rates. Although gambling studies uses research methods similar to those of behavioral research fields that have struggled with replication, we know little about the uptake of open science research practices in gambling-focused research. We conducted a scoping review of 500 recent (1/1/2016–12/1/2019) studies focused on gambling and problem gambling to examine the use of open science and transparent research practices. Our results showed that only a small percentage of studies used most practices: whereas 54.6% (95% CI: [50.2, 58.9]) of studies used at least one of nine open science practices, the prevalence of each practice was: 1.6% for pre-registration (95% CI: [0.8, 3.1]), 3.2% for open data (95% CI: [2.0, 5.1]), 0% for open notebook, 35.2% for open access (95% CI: [31.1, 39.5]), 7.8% for open materials (95% CI: [5.8, 10.5]), 1.4% for open code (95% CI: [0.7, 2.9]), and 15.0% for preprint posting (95% CI: [12.1, 18.4]). In all, 6.4% (95% CI: [4.6, 8.9]) of the studies included a power analysis and 2.4% (95% CI: [1.4, 4.2]) were replication studies. Exploratory analyses showed that studies that used any open science practice, and open access in particular, had higher citation counts. We suggest several practical ways to enhance the uptake of open science principles and practices both within gambling studies and in science more generally.
Affiliation(s)
- Eric R Louderback
  - Division on Addiction, Cambridge Health Alliance, a Harvard Medical School Teaching Hospital, Malden, MA, USA
  - Harvard Medical School, Boston, MA, USA
- Karen Amichia
  - Division on Addiction, Cambridge Health Alliance, a Harvard Medical School Teaching Hospital, Malden, MA, USA
- Alessandra Grossman
  - Division on Addiction, Cambridge Health Alliance, a Harvard Medical School Teaching Hospital, Malden, MA, USA
- Bo J Bernhard
  - International Gaming Institute, University of Nevada, Las Vegas, NV, USA
  - University of Nevada, Reno, NV, USA
- Debi A LaPlante
  - Division on Addiction, Cambridge Health Alliance, a Harvard Medical School Teaching Hospital, Malden, MA, USA
  - Harvard Medical School, Boston, MA, USA
7.
Ehring T, Limburg K, Kunze AE, Wittekind CE, Werner GG, Wolkenstein L, Guzey M, Cludius B. (When and how) does basic research in clinical psychology lead to more effective psychological treatment for mental disorders? Clin Psychol Rev 2022; 95:102163. DOI: 10.1016/j.cpr.2022.102163.
8.
Hardwicke TE, Thibault RT, Kosie JE, Wallach JD, Kidwell MC, Ioannidis JPA. Estimating the Prevalence of Transparency and Reproducibility-Related Research Practices in Psychology (2014-2017). Perspect Psychol Sci 2022; 17:239-251. PMID: 33682488; PMCID: PMC8785283; DOI: 10.1177/1745691620979806.
Abstract
Psychologists are navigating an unprecedented period of introspection about the credibility and utility of their discipline. Reform initiatives emphasize the benefits of transparency and reproducibility-related research practices; however, adoption across the psychology literature is unknown. Estimating the prevalence of such practices will help to gauge the collective impact of reform initiatives, track progress over time, and calibrate future efforts. To this end, we manually examined a random sample of 250 psychology articles published between 2014 and 2017. Over half of the articles were publicly available (154/237, 65%, 95% confidence interval [CI] = [59%, 71%]); however, sharing of research materials (26/183; 14%, 95% CI = [10%, 19%]), study protocols (0/188; 0%, 95% CI = [0%, 1%]), raw data (4/188; 2%, 95% CI = [1%, 4%]), and analysis scripts (1/188; 1%, 95% CI = [0%, 1%]) was rare. Preregistration was also uncommon (5/188; 3%, 95% CI = [1%, 5%]). Many articles included a funding disclosure statement (142/228; 62%, 95% CI = [56%, 69%]), but conflict-of-interest statements were less common (88/228; 39%, 95% CI = [32%, 45%]). Replication studies were rare (10/188; 5%, 95% CI = [3%, 8%]), and few studies were included in systematic reviews (21/183; 11%, 95% CI = [8%, 16%]) or meta-analyses (12/183; 7%, 95% CI = [4%, 10%]). Overall, the results suggest that transparency and reproducibility-related research practices were far from routine. These findings establish baseline prevalence estimates against which future progress toward increasing the credibility and utility of psychology research can be compared.
Affiliation(s)
- Tom E. Hardwicke
  - Department of Psychology, University of Amsterdam
  - Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité–Universitätsmedizin Berlin
- Robert T. Thibault
  - School of Psychological Science, University of Bristol
  - MRC Integrative Epidemiology Unit at the University of Bristol
- Joshua D. Wallach
  - Department of Environmental Health Sciences, Yale School of Public Health
- John P. A. Ioannidis
  - Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité–Universitätsmedizin Berlin
  - Department of Medicine, Stanford University
  - Meta-Research Innovation Center at Stanford, Stanford University
9.
Burke NL, Frank GKW, Hilbert A, Hildebrandt T, Klump KL, Thomas JJ, Wade TD, Walsh BT, Wang SB, Weissman RS. Open science practices for eating disorders research. Int J Eat Disord 2021; 54:1719-1729. PMID: 34555191; PMCID: PMC9107337; DOI: 10.1002/eat.23607.
Abstract
This editorial seeks to encourage the increased application of three open science practices in eating disorders research: Preregistration, Registered Reports, and the sharing of materials, data, and code. For each of these practices, we introduce updated International Journal of Eating Disorders author and reviewer guidance. Updates include the introduction of open science badges; specific instructions about how to improve transparency; and the introduction of Registered Reports of systematic or meta-analytical reviews. The editorial also seeks to encourage the study of open science practices. Open science practices pose considerable time and other resource burdens. Therefore, research is needed to help determine the value of these added burdens and to identify efficient strategies for implementing open science practices.
Affiliation(s)
- Natasha L. Burke
  - Department of Psychology, Fordham University, Bronx, New York, USA
- Guido K. W. Frank
  - Department of Psychiatry, University of California San Diego, San Diego, California, USA
- Anja Hilbert
  - Department of Psychosomatic Medicine and Psychotherapy, Integrated Research and Treatment Center Adiposity Diseases, Behavioral Medicine Research Unit, Leipzig, Germany
- Thomas Hildebrandt
  - Center of Excellence in Eating and Weight Disorders, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Kelly L. Klump
  - Department of Psychology, Michigan State University, East Lansing, Michigan, USA
- Jennifer J. Thomas
  - Eating Disorders Clinical and Research Program, Massachusetts General Hospital and Department of Psychiatry, Harvard Medical School, Boston, Massachusetts, USA
- Tracey D. Wade
  - Blackbird Initiative, Órama Institute for Mental Health and Well-Being, Flinders University, Adelaide, South Australia, Australia
- B. Timothy Walsh
  - New York State Psychiatric Institute and Department of Psychiatry, Columbia University Irving Medical Center, New York, New York, USA
- Shirley B. Wang
  - Department of Psychology, Harvard University, Cambridge, Massachusetts, USA
10.
Abstract
OBJECTIVES To explore the impact of data-sharing initiatives on the intent to share data, on actual data sharing, on the use of shared data, and on research output and impact of shared data. ELIGIBILITY CRITERIA All studies investigating data-sharing practices for individual participant data (IPD) from clinical trials. SOURCES OF EVIDENCE We searched the Medline database, the Cochrane Library, the Science Citation Index Expanded and the Social Sciences Citation Index via Web of Science, and preprints and proceedings of the International Congress on Peer Review and Scientific Publication. In addition, we inspected major clinical trial data-sharing platforms and contacted major journals/publishers, editorial groups, and some funders. CHARTING METHODS Two reviewers independently extracted information on methods and results from resources identified using a standardised questionnaire. A map of the extracted data was constructed and accompanied by a narrative summary for each outcome domain. RESULTS 93 studies identified in the literature search (published between 2001 and 2020, median: 2018) and 5 from additional information sources were included in the scoping review. Most studies were descriptive and focused on early phases of the data-sharing process. While the willingness to share IPD from clinical trials is extremely high, actual data-sharing rates are suboptimal. A survey of journal data suggests poor to moderate enforcement of the policies by publishers. Metrics provided by platforms suggest that a large majority of data remains unrequested. When requested, the purpose of the reuse is more often secondary analyses and meta-analyses, rarely re-analyses. Finally, studies focused on the real impact of data-sharing were rare and used surrogates such as citation metrics. CONCLUSIONS There is currently a gap in the evidence base for the impact of IPD sharing, which entails uncertainties in the implementation of current data-sharing policies. High-level evidence is needed to assess whether the value of medical research increases with data-sharing practices.
Collapse
Affiliation(s)
- Christian Ohmann
- European Clinical Research Infrastructure Network, Paris, France
- David Moher
- Ottawa Methods Centre, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- Maximilian Siebert
- CHU Rennes, CIC 1414 (Centre d'Investigation Clinique de Rennes), University Rennes, Rennes, France
- Edith Motschall
- Institute of Medical Biometry and Statistics, Faculty of Medicine and Medical Center - University of Freiburg, Freiburg, Baden-Württemberg, Germany
- Florian Naudet
- CHU Rennes, INSERM CIC 1414 (Centre d'Investigation Clinique de Rennes), University Rennes, Rennes, Bretagne, France
|
11
|
Norris E, He Y, Loh R, West R, Michie S. Assessing Markers of Reproducibility and Transparency in Smoking Behaviour Change Intervention Evaluations. J Smok Cessat 2021; 2021:6694386. [PMID: 34306236 PMCID: PMC8279208 DOI: 10.1155/2021/6694386] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2020] [Accepted: 11/27/2020] [Indexed: 12/16/2022] Open
Abstract
INTRODUCTION Activities promoting research reproducibility and transparency are crucial for generating trustworthy evidence. Evaluation of smoking interventions is one area where vested interests may motivate reduced reproducibility and transparency. AIMS Assess markers of transparency and reproducibility in smoking behaviour change intervention evaluation reports. METHODS One hundred evaluation reports of smoking behaviour change intervention randomised controlled trials published in 2018-2019 were identified. Reproducibility markers of pre-registration; protocol sharing; data, material, and analysis script sharing; replication of a previous study; and open access publication were coded in identified reports. Transparency markers of funding and conflict of interest declarations were also coded. Coding was performed by two researchers, with inter-rater reliability calculated using Krippendorff's alpha. RESULTS Seventy-one percent of reports were open access, and 73% were pre-registered. However, only 13% provided accessible materials, 7% accessible data, and 1% accessible analysis scripts. No reports were replication studies. Ninety-four percent of reports provided a funding source statement, and 88% provided a conflict of interest statement. CONCLUSIONS Open data, materials, analysis, and replications are rare in smoking behaviour change interventions, whereas funding source and conflict of interest declarations are common. Future smoking research should be more reproducible to enable knowledge accumulation. This study was pre-registered: https://osf.io/yqj5p.
Affiliation(s)
- Emma Norris
- Health Behaviour Change Research Group, Department of Health Sciences, Brunel University, UK
- Centre for Behaviour Change, University College London, UK
- Yiwei He
- Psychology & Language Sciences, University College London, UK
- Rachel Loh
- Psychology & Language Sciences, University College London, UK
- Robert West
- Research Department of Epidemiology & Public Health, University College London, UK
- Susan Michie
- Centre for Behaviour Change, University College London, UK
|
12
|
Stoll M, Mancini A, Hubenschmid L, Dreimüller N, König J, Cuijpers P, Barth J, Lieb K. Discrepancies from registered protocols and spin occurred frequently in randomized psychotherapy trials—A meta-epidemiologic study. J Clin Epidemiol 2020; 128:49-56. [DOI: 10.1016/j.jclinepi.2020.08.013] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2020] [Revised: 07/28/2020] [Accepted: 08/18/2020] [Indexed: 02/06/2023]
|
13
|
Cristea IA, Naudet F. Increase value and reduce waste in research on psychological therapies. Behav Res Ther 2019; 123:103479. [PMID: 31639527 DOI: 10.1016/j.brat.2019.103479] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2019] [Revised: 07/25/2019] [Accepted: 09/09/2019] [Indexed: 12/27/2022]
Abstract
A seminal Lancet series focused on increasing value and reducing waste in biomedical research, providing a transferable template to diagnose problems in research. Our goal was to document how some of these sources of waste apply to mental health and particularly psychological treatments research. We synthesize and critically evaluate empirical findings in relation to four major sources: i) defining research priorities; ii) research design, methods and analysis; iii) accessibility of research information; iv) accuracy and usability of research reports. We demonstrate that each source of waste considered is well-represented and amply documented within this field. We describe hype and insufficient consideration of what is known in defining research priorities; persistent risk of bias, particularly due to selective outcome reporting, for psychotherapy trials across mental disorders; intellectual and financial biases; direct and indirect evidence of publication bias; largely nonexistent adoption of data sharing; issues of multiplicity and fragmentation of data and findings; and insufficient adoption of reporting guidelines. We expand on a few general solutions, including supporting meta-research, properly testing interventions to increase research quality, placing open science at the center of psychological treatment research, and remaining vigilant particularly regarding the strains of research currently prioritized, such as experimental psychopathology.
|