1. Briney KA. Measuring data rot: An analysis of the continued availability of shared data from a single university. PLoS One 2024; 19:e0304781. [PMID: 38838010] [DOI: 10.1371/journal.pone.0304781]
Abstract
To determine where data is shared and what data is no longer available, this study analyzed data shared by researchers at a single university. 2166 supplemental data links were harvested from the university's institutional repository and web scraped using R. All links that failed to scrape or could not be tested algorithmically were tested for availability by hand. Trends in data availability by link type, age of publication, and data source were examined for patterns. Results show that researchers shared data in hundreds of places. About two-thirds of links to shared data were URLs and one-third were DOIs, with a handful of FTP links and links directly to files. A surprising 13.4% of shared URL links pointed to a website homepage rather than a specific record on that website. After testing, 5.4% of the 2166 supplemental data links were found to be no longer available. DOIs were the link type least likely to disappear, with a 1.7% loss, compared with a 5.9% loss for URLs averaged over time. Links from older publications were more likely to be unavailable, with a data disappearance rate estimated at 2.6% per year, as were links to data hosted on journal websites. The results support best-practice guidance to share data in a data repository using a permanent identifier.
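The link-classification and loss-tallying logic the abstract describes can be sketched as follows. This is an illustrative Python sketch (the study used R), with hypothetical example links and made-up availability flags; the matching rules are assumptions, not the author's harvesting code.

```python
from urllib.parse import urlparse

def classify_link(link: str) -> str:
    """Classify a supplemental-data link as a DOI, FTP link, or plain URL.

    The categories mirror those reported in the study; the matching rules
    here are illustrative assumptions, not the author's code.
    """
    if link.lower().startswith("ftp://"):
        return "FTP"
    parsed = urlparse(link)
    if "doi.org" in parsed.netloc or link.lower().startswith("10."):
        return "DOI"
    return "URL"

def loss_rate(links: list[tuple[str, bool]]) -> dict[str, float]:
    """Percent of links unavailable, per link type.

    `links` pairs each link with a boolean: True if it still resolves.
    """
    totals: dict[str, int] = {}
    lost: dict[str, int] = {}
    for link, available in links:
        kind = classify_link(link)
        totals[kind] = totals.get(kind, 0) + 1
        if not available:
            lost[kind] = lost.get(kind, 0) + 1
    return {k: 100 * lost.get(k, 0) / n for k, n in totals.items()}

# Hypothetical links with made-up availability flags:
sample = [
    ("https://doi.org/10.5061/dryad.abc123", True),
    ("10.5281/zenodo.1234567", True),
    ("https://example.edu/data/table1.csv", False),
    ("ftp://ftp.example.org/pub/dataset/", True),
]
print(loss_rate(sample))  # {'DOI': 0.0, 'URL': 100.0, 'FTP': 0.0}
```

In the real workflow the availability flags would come from attempting to resolve each link (the study scraped links in R and hand-checked the failures); here they are supplied directly so the tallying step is self-contained.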
Affiliation(s)
- Kristin A Briney
  - Caltech Library, California Institute of Technology, Pasadena, CA, United States of America
2. Prosser AMB, Bagnall R, Higson-Sweeney N. Reflection over compliance: Critiquing mandatory data sharing policies for qualitative research. J Health Psychol 2024; 29:653-658. [PMID: 38282356] [PMCID: PMC11141091] [DOI: 10.1177/13591053231225903]
Abstract
Many journals are moving towards a 'Mandatory Inclusion of Raw Data' (MIRD) model of data sharing, where it is expected that raw data be publicly accessible at article submission. While open data sharing is beneficial for some research topics and methodologies within health psychology, in other cases it may be ethically and epistemologically questionable. Here, we outline several questions that qualitative researchers might consider surrounding the ethics of open data sharing. Overall, we argue that universal open raw data mandates cannot adequately represent the diversity of qualitative research, and that MIRD may harm rigorous and ethical research practice within health psychology and beyond. Researchers should instead find ways to demonstrate rigour through engagement with questions surrounding data sharing. We propose that all researchers utilise the increasingly common 'data availability statement' to demonstrate reflexive engagement with issues of ethics, epistemology and participant protection when considering whether to open data.
3. Ng JY, Lin B, Parikh T, Cramer H, Moher D. Investigating the nature of open science practices across complementary, alternative, and integrative medicine journals: An audit. PLoS One 2024; 19:e0302655. [PMID: 38701100] [PMCID: PMC11068175] [DOI: 10.1371/journal.pone.0302655]
Abstract
BACKGROUND Open science practices are implemented across many scientific fields to improve transparency and reproducibility in research. Complementary, alternative, and integrative medicine (CAIM) is a growing field that may benefit from adoption of open science practices. The efficacy and safety of CAIM practices, a popular concern with the field, can be validated or refuted through transparent and reliable research. Investigating open science practices across CAIM journals by using the Transparency and Openness Promotion (TOP) guidelines can potentially promote open science practices across CAIM journals. The purpose of this study is to conduct an audit that compares and ranks open science practices adopted by CAIM journals against TOP guidelines laid out by the Center for Open Science (COS). METHODS CAIM-specific journals with titles containing the words "complementary", "alternative" and/or "integrative" were included in this audit. Each of the eight TOP criteria was used to extract open science practices from each of the CAIM journals. Data were summarized by TOP guideline and ranked using the TOP Factor to identify commonalities and differences in practices across the included journals. RESULTS A total of 19 CAIM journals were included in this audit. Across all journals, the mean TOP Factor was 2.95 with a median score of 2. The findings of this study reveal high variability among the open science practices required by journals in this field. Four journals (21%) had a final TOP score of 0, while the total scores of the remaining 15 (79%) ranged from 1 to 8. CONCLUSION While several studies have audited open science practices across discipline-specific journals, none have focused on CAIM journals. The results of this study indicate that CAIM journals provide minimal guidelines to encourage or require authors to adhere to open science practices and there is an opportunity to improve the use of open science practices in the field.
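Summary statistics like those in this audit are straightforward to reproduce from a list of per-journal scores. The scores below are hypothetical values, chosen only so that the mean, median, and zero-score count echo the figures in the abstract; they are not the study's actual data.

```python
from statistics import mean, median

# Hypothetical TOP Factor scores for 19 journals (0 = no open science
# requirements). Illustrative values only, chosen so the summary
# statistics match the reported mean (~2.95), median (2), and 21% zeros.
scores = [0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 7, 7, 8]

print(f"n={len(scores)} mean={mean(scores):.2f} median={median(scores)}")
zero = sum(1 for s in scores if s == 0)
print(f"{zero} journals ({100 * zero / len(scores):.0f}%) scored 0")
```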
Affiliation(s)
- Jeremy Y. Ng
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
  - Ottawa Hospital Research Institute, Centre for Journalology, Ottawa Methods Centre, Ottawa, Ontario, Canada
- Brenda Lin
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Tisha Parikh
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Holger Cramer
  - Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
  - Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- David Moher
  - Ottawa Hospital Research Institute, Centre for Journalology, Ottawa Methods Centre, Ottawa, Ontario, Canada
  - School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada
4. Fernández-Castilla B, Said-Metwaly S, Kreitchmann RS, Van Den Noortgate W. What do meta-analysts need in primary studies? Guidelines and the SEMI checklist for facilitating cumulative knowledge. Behav Res Methods 2024; 56:3315-3329. [PMID: 38627324] [PMCID: PMC11133106] [DOI: 10.3758/s13428-024-02373-9]
Abstract
Meta-analysis is often recognized as the highest level of evidence due to its notable advantages. Therefore, ensuring the precision of its findings is of utmost importance. Insufficient reporting in primary studies poses challenges for meta-analysts, hindering study identification, effect size estimation, and meta-regression analyses. This manuscript provides concise guidelines for the comprehensive reporting of qualitative and quantitative aspects in primary studies. Adhering to these guidelines may help researchers enhance the quality of their studies and increase their eligibility for inclusion in future research syntheses, thereby enhancing research synthesis quality. Recommendations include incorporating relevant terms in titles and abstracts to facilitate study retrieval and reporting sufficient data for effect size calculation. Additionally, a new checklist is introduced to help applied researchers thoroughly report various aspects of their studies.
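One concrete reason to report group means, SDs, and sample sizes is that they are exactly what a meta-analyst needs to compute a standardized mean difference. A minimal sketch, with illustrative numbers that are not from any real study:

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation.

    This is the kind of computation a meta-analyst can only perform when
    primary studies report group means, SDs, and sample sizes, which is
    why reporting guidelines ask for exactly these numbers.
    """
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Illustrative numbers (not from any real study):
d = cohens_d(m1=104.0, sd1=10.0, n1=50, m2=100.0, sd2=10.0, n2=50)
print(round(d, 2))  # 0.4
```

When a primary study reports only a p-value or a bare "significant difference", this calculation is impossible and the study may be excluded from the synthesis, which is the loss the guidelines aim to prevent.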
Affiliation(s)
- Belén Fernández-Castilla
  - Faculty of Psychology, Universidad Nacional de Educación a Distancia, Juan del Rosal 10, 28040, Madrid, Spain
- Sameh Said-Metwaly
  - Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
  - Imec-Itec, KU Leuven, Leuven, Belgium
  - Faculty of Education, Damanhour University, Damanhour, Egypt
- Rodrigo S Kreitchmann
  - Faculty of Psychology, Universidad Nacional de Educación a Distancia, Juan del Rosal 10, 28040, Madrid, Spain
- Wim Van Den Noortgate
  - Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
  - Imec-Itec, KU Leuven, Leuven, Belgium
5. Chin J, Zeiler K, Dilevski N, Holcombe A, Gatfield-Jeffries R, Bishop R, Vazire S, Schiavone S. The transparency of quantitative empirical legal research published in highly ranked law journals (2018-2020): an observational study. F1000Res 2024; 12:144. [PMID: 37600907] [PMCID: PMC10435919] [DOI: 10.12688/f1000research.127563.1]
Abstract
Background Scientists are increasingly concerned with making their work easy to verify and build upon. Associated practices include sharing data, materials, and analytic scripts, and preregistering protocols. This shift towards increased transparency and rigor has been referred to as a "credibility revolution." The credibility of empirical legal research has been questioned in the past due to its distinctive peer review system and because the legal background of its researchers means that many are not trained in study design or statistics. Still, there has been no systematic study of transparency and credibility-related characteristics of published empirical legal research. Methods To fill this gap and provide an estimate of current practices that can be tracked as the field evolves, we assessed 300 empirical articles from highly ranked law journals, including both faculty-edited and student-edited journals. Results We found high levels of article accessibility (86%, 95% CI = [82%, 90%]), especially among student-edited journals (100%). Few articles stated that a study's data are available (19%, 95% CI = [15%, 23%]). Statements of preregistration (3%, 95% CI = [1%, 5%]) and availability of analytic scripts (6%, 95% CI = [4%, 9%]) were very uncommon. Conclusion We suggest that empirical legal researchers and the journals that publish their work cultivate norms and practices to encourage research credibility. Our estimates may be revisited to track the field's progress in the coming years.
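Interval estimates like those quoted in this abstract can be obtained with a simple normal-approximation formula. The paper does not state which interval method was used, so the sketch below is an assumption for illustration, and the count of 57 articles is inferred from 19% of 300 rather than taken from the paper.

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% CI for a proportion.

    Assumption: the paper may have used a different interval method;
    this one is shown only because it reproduces the quoted bounds.
    """
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return (p - half, p + half)

# 57 of 300 articles (19%) stated that data were available
# (57 is inferred from the reported 19%, not stated in the paper):
lo, hi = wald_ci(57, 300)
print(f"19% (95% CI = [{lo:.0%}, {hi:.0%}])")  # 19% (95% CI = [15%, 23%])
```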
Affiliation(s)
- Jason Chin
  - College of Law, Australian National University, Canberra, ACT, Australia
- Natali Dilevski
  - Centre for Investigative Interviewing, Griffith Criminology Institute, Griffith University, Brisbane, Qld, Australia
- Alex Holcombe
  - Psychology, University of Sydney, Sydney, NSW, Australia
- Ruby Bishop
  - School of Law, University of Sydney, Sydney, NSW, Australia
- Simine Vazire
  - Melbourne School of Psychological Sciences, University of Melbourne, Melbourne, Vic, Australia
6. Gooden A. A pathway to strengthening open science: comments on the draft South African Ethics in Health Research Guidelines. Front Pharmacol 2024; 15:1304950. [PMID: 38572431] [PMCID: PMC10989741] [DOI: 10.3389/fphar.2024.1304950]
Abstract
The recently released draft South African Ethics in Health Research Guidelines: Principles, Processes and Structures (Draft Guidelines) by the National Health Research Ethics Council recognize open data and provide guiding principles for this in the context of health research in South Africa. While its inclusion is a positive development, there is room for improvement. Although the Draft Guidelines leverage the Draft National Policy on Data and Cloud, it lacks incorporation of other relevant government policies, notably the Draft National Open Science Policy, and fails to sufficiently detail the principles of open science and open access. This limited scope and lack of comprehensive definition and detailed guidance present challenges for researchers in conducting ethical and responsible health research in South Africa. It constrains the Draft Guidelines from fully aligning with national imperatives and from fostering African-centric approaches. To address these issues, it is recommended that the Draft Guidelines integrate broader policies and principles, enhance clarity through comprehensive definitions, provide detailed guidance on open access, and promote African-centric approaches. Implementing these solutions will strengthen the Draft Guidelines, aligning them with national visions of open science, and thereby harnessing the full potential of South Africa's diverse scientific community in advancing health research.
Affiliation(s)
- Amy Gooden
  - School of Law, University of KwaZulu-Natal, Durban, South Africa
7. Quinn TP, Hess JL, Marshe VS, Barnett MM, Hauschild AC, Maciukiewicz M, Elsheikh SSM, Men X, Schwarz E, Trakadis YJ, Breen MS, Barnett EJ, Zhang-James Y, Ahsen ME, Cao H, Chen J, Hou J, Salekin A, Lin PI, Nicodemus KK, Meyer-Lindenberg A, Bichindaritz I, Faraone SV, Cairns MJ, Pandey G, Müller DJ, Glatt SJ. A primer on the use of machine learning to distil knowledge from data in biological psychiatry. Mol Psychiatry 2024; 29:387-401. [PMID: 38177352] [DOI: 10.1038/s41380-023-02334-2]
Abstract
Applications of machine learning in the biomedical sciences are growing rapidly. This growth has been spurred by diverse cross-institutional and interdisciplinary collaborations, public availability of large datasets, an increase in the accessibility of analytic routines, and the availability of powerful computing resources. With this increased access and exposure to machine learning comes a responsibility for education and a deeper understanding of its bases and bounds, borne equally by data scientists seeking to ply their analytic wares in medical research and by biomedical scientists seeking to harness such methods to glean knowledge from data. This article provides an accessible and critical review of machine learning for a biomedically informed audience, as well as its applications in psychiatry. The review covers definitions and expositions of commonly used machine learning methods, and historical trends of their use in psychiatry. We also provide a set of standards, namely Guidelines for REporting Machine Learning Investigations in Neuropsychiatry (GREMLIN), for designing and reporting studies that use machine learning as a primary data-analysis approach. Lastly, we propose the establishment of the Machine Learning in Psychiatry (MLPsych) Consortium, enumerate its objectives, and identify areas of opportunity for future applications of machine learning in biological psychiatry. This review serves as a cautiously optimistic primer on machine learning for those on the precipice as they prepare to dive into the field, either as methodological practitioners or well-informed consumers.
Affiliation(s)
- Thomas P Quinn
  - Applied Artificial Intelligence Institute (A2I2), Burwood, VIC, 3125, Australia
- Jonathan L Hess
  - Department of Psychiatry and Behavioral Sciences, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
- Victoria S Marshe
  - Institute of Medical Science, University of Toronto, Toronto, ON, M5S 1A1, Canada
  - Pharmacogenetics Research Clinic, Campbell Family Mental Health Research Institute, Centre for Addiction and Mental Health, Toronto, ON, M5S 1A1, Canada
- Michelle M Barnett
  - School of Biomedical Sciences and Pharmacy, The University of Newcastle, Callaghan, NSW, 2308, Australia
  - Precision Medicine Research Program, Hunter Medical Research Institute, Newcastle, NSW, 2308, Australia
- Anne-Christin Hauschild
  - Department of Medical Informatics, Medical University Center Göttingen, Göttingen, Lower Saxony, 37075, Germany
- Malgorzata Maciukiewicz
  - Hospital Zurich, University of Zurich, Zurich, 8091, Switzerland
  - Department of Rheumatology and Immunology, University Hospital Bern, Bern, 3010, Switzerland
  - Department for Biomedical Research (DBMR), University of Bern, Bern, 3010, Switzerland
- Samar S M Elsheikh
  - Pharmacogenetics Research Clinic, Campbell Family Mental Health Research Institute, Centre for Addiction and Mental Health, Toronto, ON, M5S 1A1, Canada
- Xiaoyu Men
  - Pharmacogenetics Research Clinic, Campbell Family Mental Health Research Institute, Centre for Addiction and Mental Health, Toronto, ON, M5S 1A1, Canada
  - Department of Pharmacology and Toxicology, University of Toronto, Toronto, ON, M5S 1A1, Canada
- Emanuel Schwarz
  - Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Baden-Württemberg, J5 68159, Germany
- Yannis J Trakadis
  - Department of Human Genetics, McGill University Health Centre, Montreal, QC, H4A 3J1, Canada
- Michael S Breen
  - Psychiatry, Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, 10029, USA
- Eric J Barnett
  - Department of Neuroscience and Physiology, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
- Yanli Zhang-James
  - Department of Psychiatry and Behavioral Sciences, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
- Mehmet Eren Ahsen
  - Department of Business Administration, Gies College of Business, University of Illinois at Urbana-Champaign, Champaign, IL, 61820, USA
  - Department of Biomedical and Translational Sciences, Carle-Illinois School of Medicine, University of Illinois at Urbana-Champaign, Champaign, IL, 61820, USA
- Han Cao
  - Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Baden-Württemberg, J5 68159, Germany
- Junfang Chen
  - Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Baden-Württemberg, J5 68159, Germany
- Jiahui Hou
  - Department of Psychiatry and Behavioral Sciences, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
  - Department of Neuroscience and Physiology, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
- Asif Salekin
  - Electrical Engineering and Computer Science, Syracuse University, Syracuse, NY, 13244, USA
- Ping-I Lin
  - Discipline of Psychiatry and Mental Health, University of New South Wales, Sydney, NSW, 2052, Australia
  - Mental Health Research Unit, South Western Sydney Local Health District, Liverpool, NSW, 2170, Australia
- Andreas Meyer-Lindenberg
  - Clinical Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Baden-Württemberg, J5 68159, Germany
- Isabelle Bichindaritz
  - Biomedical and Health Informatics/Computer Science Department, State University of New York at Oswego, Oswego, NY, 13126, USA
  - Intelligent Bio Systems Lab, State University of New York at Oswego, Oswego, NY, 13126, USA
- Stephen V Faraone
  - Department of Psychiatry and Behavioral Sciences, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
  - Department of Neuroscience and Physiology, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
- Murray J Cairns
  - School of Biomedical Sciences and Pharmacy, The University of Newcastle, Callaghan, NSW, 2308, Australia
  - Precision Medicine Research Program, Hunter Medical Research Institute, Newcastle, NSW, 2308, Australia
- Gaurav Pandey
  - Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, 10029, USA
- Daniel J Müller
  - Pharmacogenetics Research Clinic, Campbell Family Mental Health Research Institute, Centre for Addiction and Mental Health, Toronto, ON, M5S 1A1, Canada
  - Department of Psychiatry, University of Toronto, Toronto, ON, M5S 1A1, Canada
  - Department of Psychiatry, Psychosomatics and Psychotherapy, Center of Mental Health, University Hospital of Würzburg, Würzburg, 97080, Germany
- Stephen J Glatt
  - Department of Psychiatry and Behavioral Sciences, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
  - Department of Neuroscience and Physiology, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
  - Department of Public Health and Preventive Medicine, Norton College of Medicine at SUNY Upstate Medical University, Syracuse, NY, 13210, USA
8. Liu Q, Hu Q, Liu S, Hutson A, Morgan M. ReUseData: an R/Bioconductor tool for reusable and reproducible genomic data management. BMC Bioinformatics 2024; 25:8. [PMID: 38172657] [PMCID: PMC10765726] [DOI: 10.1186/s12859-023-05626-0]
Abstract
BACKGROUND The increasing volume and complexity of genomic data pose significant challenges for effective data management and reuse. Public genomic data often undergo similar preprocessing across projects, leading to redundant or inconsistent datasets and inefficient use of computing resources. This is especially pertinent for bioinformaticians engaged in multiple projects. Tools have been created to address the challenges of managing and accessing curated genomic datasets; however, such tools are most useful for researchers who work with specific types of data or favor a particular programming language. Currently, there is no R-specific solution for efficient data management and versatile data reuse. RESULTS Here we present ReUseData, an R software tool that overcomes some of the limitations of existing solutions and provides a versatile and reproducible approach to effective data management within R. ReUseData facilitates the transformation of ad hoc scripts for data preprocessing into Common Workflow Language (CWL)-based data recipes, allowing for the reproducible generation of curated data files in their generic formats. The data recipes are standardized and self-contained, making them portable and reproducible across computing platforms. ReUseData also streamlines the reuse of curated data files and their integration into downstream analysis tools and workflows with different frameworks. CONCLUSIONS ReUseData provides a reliable and reproducible approach to genomic data management within the R environment, enhancing the accessibility and reusability of genomic data. The package is available on Bioconductor ( https://bioconductor.org/packages/ReUseData/ ) with additional information on the project website ( https://rcwl.org/dataRecipes/ ).
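The "data recipe" idea, generating a curated file once per unique (recipe, parameters) pair and reusing it thereafter, can be sketched conceptually. This is a Python analogy only, not the ReUseData API (which is R/Bioconductor- and CWL-based); the recipe name, toy build function, and cache layout are all hypothetical.

```python
import hashlib
import json
from pathlib import Path

def run_recipe(name, params, build, cache_dir="recipes"):
    """Run a named data-preprocessing step, caching its output.

    Conceptual sketch only: ReUseData itself expresses recipes in CWL
    and runs inside R. This mimics the core idea that an identical
    (recipe, parameters) pair yields one reusable, annotated file.
    """
    key = hashlib.sha256(
        json.dumps([name, params], sort_keys=True).encode()
    ).hexdigest()[:12]
    out = Path(cache_dir) / f"{name}-{key}.txt"
    if not out.exists():  # reuse the file if it was already generated
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_text(build(**params))
        # sidecar metadata makes the output self-describing
        out.with_suffix(".json").write_text(
            json.dumps({"recipe": name, "params": params})
        )
    return out

# A toy "recipe": keep chromosome names matching a prefix.
def keep_chromosomes(chroms, prefix):
    return "\n".join(c for c in chroms if c.startswith(prefix))

path = run_recipe(
    "subset-chroms",
    {"chroms": ["chr1", "chr2", "chrX", "scaffold_1"], "prefix": "chr"},
    keep_chromosomes,
)
print(path.read_text().splitlines())  # ['chr1', 'chr2', 'chrX']
```

Calling `run_recipe` again with the same parameters returns the cached file instead of recomputing it, which is the reuse behavior the abstract describes.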
Affiliation(s)
- Qian Liu
  - Department of Biostatistics and Bioinformatics, Roswell Park Comprehensive Cancer Center, Buffalo, NY, 14263, USA
- Qiang Hu
  - Department of Biostatistics and Bioinformatics, Roswell Park Comprehensive Cancer Center, Buffalo, NY, 14263, USA
- Song Liu
  - Department of Biostatistics and Bioinformatics, Roswell Park Comprehensive Cancer Center, Buffalo, NY, 14263, USA
- Alan Hutson
  - Department of Biostatistics and Bioinformatics, Roswell Park Comprehensive Cancer Center, Buffalo, NY, 14263, USA
- Martin Morgan
  - Department of Biostatistics and Bioinformatics, Roswell Park Comprehensive Cancer Center, Buffalo, NY, 14263, USA
9. Luijken K, Lohmann A, Alter U, Claramunt Gonzalez J, Clouth FJ, Fossum JL, Hesen L, Huizing AHJ, Ketelaar J, Montoya AK, Nab L, Nijman RCC, Penning de Vries BBL, Tibbe TD, Wang YA, Groenwold RHH. Replicability of simulation studies for the investigation of statistical methods: the RepliSims project. R Soc Open Sci 2024; 11:231003. [PMID: 38234442] [PMCID: PMC10791519] [DOI: 10.1098/rsos.231003]
Abstract
Results of simulation studies evaluating the performance of statistical methods can have a major impact on the way empirical research is implemented. However, so far there is limited evidence of the replicability of simulation studies. Eight highly cited statistical simulation studies were selected, and their replicability was assessed by teams of replicators with formal training in quantitative methodology. The teams used information in the original publications to write simulation code with the aim of replicating the results. The primary outcome was to determine the feasibility of replication based on reported information in the original publications and supplementary materials. Replicability varied greatly: some original studies provided detailed information leading to almost perfect replication of results, whereas other studies did not provide enough information to implement any of the reported simulations. Factors facilitating replication included availability of code, detailed reporting or visualization of data-generating procedures and methods, and replicator expertise. Replicability of statistical simulation studies was mainly impeded by lack of information and sustainability of information sources. We encourage researchers publishing simulation studies to transparently report all relevant implementation details either in the research paper itself or in easily accessible supplementary material and to make their simulation code publicly available using permanent links.
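A simulation study becomes replicable when its seed, data-generating process, and estimator are all stated explicitly. The following is a minimal, self-contained example of the kind of fully specified simulation code the authors recommend sharing; the scenario (coverage of a normal confidence interval with known variance) is illustrative, not one of the eight studies examined.

```python
import math
import random

def coverage_of_normal_ci(n_sims: int = 2000, n: int = 30, seed: int = 1) -> float:
    """Fraction of simulated samples whose 95% CI covers the true mean.

    Everything a replicator would need is fixed and stated here: the
    seed, the data-generating process (standard normal, n observations),
    and the estimator (mean plus/minus 1.96 * sigma / sqrt(n), sigma known).
    """
    rng = random.Random(seed)  # fixed seed: reruns give identical results
    hits = 0
    for _ in range(n_sims):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
        mean = sum(sample) / n
        half = 1.96 * 1.0 / math.sqrt(n)  # known sigma = 1
        hits += abs(mean) <= half         # CI covers the true mean, 0
    return hits / n_sims

print(coverage_of_normal_ci())  # close to the nominal 0.95
```

Because the seed is fixed, rerunning the function reproduces the same estimate exactly; omitting any one of the stated details is precisely the kind of reporting gap the RepliSims teams found blocked replication.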
Affiliation(s)
- K. Luijken
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
  - Department of Epidemiology, Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, University Utrecht, Utrecht, The Netherlands
- A. Lohmann
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- U. Alter
  - Department of Psychology, York University, Toronto, Ontario, Canada
- J. Claramunt Gonzalez
  - Methodology and Statistics Unit, Institute of Psychology, Leiden University, Leiden, The Netherlands
- F. J. Clouth
  - Department of Methodology and Statistics, Tilburg University, Tilburg, The Netherlands
  - Netherlands Comprehensive Cancer Organisation (IKNL), Utrecht, The Netherlands
- J. L. Fossum
  - Department of Psychology, University of California, Los Angeles, CA, USA
  - Department of Psychology, Seattle Pacific University, Seattle, WA, USA
- L. Hesen
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- A. H. J. Huizing
  - TNO (Netherlands Organization for Applied Scientific Research), Expertise Group Child Health, Leiden, The Netherlands
- J. Ketelaar
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- A. K. Montoya
  - Department of Psychology, University of California, Los Angeles, CA, USA
- L. Nab
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- R. C. C. Nijman
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- B. B. L. Penning de Vries
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
  - Department of Biomedical Data Sciences, Leiden University Medical Centre, Leiden, The Netherlands
- T. D. Tibbe
  - Department of Psychology, University of California, Los Angeles, CA, USA
- Y. A. Wang
  - Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- R. H. H. Groenwold
  - Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
  - Department of Biomedical Data Sciences, Leiden University Medical Centre, Leiden, The Netherlands
10. Hardwicke TE, Vazire S. Transparency Is Now the Default at Psychological Science. Psychol Sci 2023:9567976231221573. [PMID: 38150599] [DOI: 10.1177/09567976231221573]
11. Siraji MA, Rahman M. Primer on Reproducible Research in R: Enhancing Transparency and Scientific Rigor. Clocks Sleep 2023; 6:1-10. [PMID: 38534796] [DOI: 10.3390/clockssleep6010001]
Abstract
Achieving research reproducibility is a precarious aspect of scientific practice. However, many studies across disciplines fail to be fully reproduced due to inadequate dissemination methods. Traditional publication practices often fail to provide a comprehensive description of the research context and procedures, hindering reproducibility. To address these challenges, this article presents a tutorial on reproducible research using the R programming language. The tutorial aims to equip researchers, including those with limited coding knowledge, with the necessary skills to enhance reproducibility in their work. It covers three essential components: version control using Git, dynamic document creation using rmarkdown, and managing R package dependencies with renv. The tutorial also provides insights into sharing reproducible research and offers specific considerations for the field of sleep and chronobiology research. By following the tutorial, researchers can adopt practices that enhance the transparency, rigor, and replicability of their work, contributing to a culture of reproducible research and advancing scientific knowledge.
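The renv idea the tutorial covers, snapshotting exact dependency versions so an analysis environment can be restored later, can be illustrated by analogy. The Python sketch below mimics renv::snapshot() and a drift check using only the standard library; it is not part of renv (which stores R package versions in renv.lock), and the lockfile name and functions here are hypothetical.

```python
import json
import sys
from importlib import metadata
from pathlib import Path

def snapshot(packages, lockfile="lock.json"):
    """Record interpreter and package versions, renv::snapshot()-style.

    An analogy to the renv workflow described in the tutorial; the real
    renv does this for R packages, writing renv.lock.
    """
    versions = {"python": sys.version.split()[0]}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None  # recorded as absent
    Path(lockfile).write_text(json.dumps(versions, indent=2))

def check(lockfile="lock.json"):
    """Return packages whose installed version no longer matches the lock."""
    recorded = json.loads(Path(lockfile).read_text())
    drift = []
    for pkg, ver in recorded.items():
        if pkg == "python":
            continue
        try:
            current = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            current = None
        if current != ver:
            drift.append(pkg)
    return drift

snapshot(["pip"])
print(check())  # [] when nothing has changed since the snapshot
```

The point, in both languages, is the same: a collaborator who receives the lockfile can detect (and then repair) any mismatch between their environment and the one the analysis was run in.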
Affiliation(s)
- Mushfiqul Anwar Siraji
  - Department of Psychology, Jeffery Cheah School of Medicine and Health Science, Monash University Malaysia, Jalan Lagoon Selatan, Bandar Sunway, Selangor Darul Ehsan 47500, Malaysia
  - Department of History and Psychology, School of Humanities and Social Sciences, North South University, Dhaka 1229, Bangladesh
- Munia Rahman
  - Department of Psychology, University of Dhaka, Dhaka 1000, Bangladesh
Collapse
|
12
|
Bullock GS, Ward P, Impellizzeri FM, Kluzek S, Hughes T, Hillman C, Waterman BR, Danelson K, Henry K, Barr E, Healy K, Räisänen AM, Gomez C, Fernandez G, Wolf J, Nicholson KF, Sell T, Zerega R, Dhiman P, Riley RD, Collins GS. Up Front and Open? Shrouded in Secrecy? Or Somewhere in Between? A Meta-Research Systematic Review of Open Science Practices in Sport Medicine Research. J Orthop Sports Phys Ther 2023; 53:1-13. [PMID: 37860866] [DOI: 10.2519/jospt.2023.12016]
Abstract
OBJECTIVE: To investigate open science practices in research published in the top 5 sports medicine journals between May 1, 2022, and October 1, 2022. DESIGN: A meta-research systematic review. LITERATURE SEARCH: Open science practices were searched in MEDLINE. STUDY SELECTION CRITERIA: We included original scientific research published in one of the identified top 5 sports medicine journals in 2022 as ranked by Clarivate: (1) British Journal of Sports Medicine, (2) Journal of Sport and Health Science, (3) American Journal of Sports Medicine, (4) Medicine and Science in Sports and Exercise, and (5) Sports Medicine-Open. Studies were excluded if they were systematic reviews, qualitative research, gray literature, or animal or cadaver models. DATA SYNTHESIS: Open science practices were extracted in accordance with the Transparency and Openness Promotion guidelines and patient and public involvement. RESULTS: Two hundred forty-three studies were included. The median number of open science practices in each study was 2, out of a maximum of 12 (range: 0-8; interquartile range: 2). Two hundred thirty-four studies (96%, 95% confidence interval [CI]: 94%-99%) provided an author conflict-of-interest statement and 163 (67%, 95% CI: 62%-73%) reported funding. Twenty-one studies (9%, 95% CI: 5%-12%) provided open-access data. Fifty-four studies (22%, 95% CI: 17%-27%) included a data availability statement and 3 (1%, 95% CI: 0%-3%) made code available. Seventy-six studies (32%, 95% CI: 25%-37%) had transparent materials and 30 (12%, 95% CI: 8%-16%) used a reporting guideline. Twenty-eight studies (12%, 95% CI: 8%-16%) were preregistered. Six studies (3%, 95% CI: 1%-4%) published a protocol. Four studies (2%, 95% CI: 0%-3%) reported an analysis plan a priori. Seven studies (3%, 95% CI: 1%-5%) reported patient and public involvement. CONCLUSION: Open science practices in the sports medicine field are extremely limited. The least followed practices were sharing code, data, and analysis plans.
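The confidence intervals quoted in this abstract are straightforward to check; a minimal sketch, assuming the authors used a normal-approximation (Wald) interval, which matches the reported numbers:

```python
# Reproduce the abstract's 95% CIs for reported proportions, e.g.
# 21/243 studies with open-access data (9%, CI 5%-12%) and
# 234/243 with a conflict-of-interest statement (96%, CI 94%-99%).
import math

def prop_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(p - z * se, 0.0), min(p + z * se, 1.0)

for k in (21, 234):
    p, lo, hi = prop_ci(k, 243)
    print(f"{k}/243: {100*p:.0f}% (95% CI: {100*lo:.0f}%-{100*hi:.0f}%)")
```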
Affiliation(s)
- Garrett S Bullock
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Department of Biostatistics and Data Science, Wake Forest School of Medicine, Winston-Salem, NC
- Centre for Sport, Exercise and Osteoarthritis Research Versus Arthritis, University of Oxford, Oxford, United Kingdom
- Sport Injury Prevention Research Center, University of Calgary, Calgary, AB, Canada
- Franco M Impellizzeri
- School of Sport, Exercise, and Rehabilitation, University of Technology Sydney, Sydney, Australia
- Stefan Kluzek
- Centre for Sport, Exercise and Osteoarthritis Research Versus Arthritis, University of Oxford, Oxford, United Kingdom
- Sports Medicine Research Department, University of Nottingham, Nottingham, UK
- English Institute of Sport, Marlow, United Kingdom
- Tom Hughes
- Department of Health Professions, Manchester Metropolitan University, Manchester, United Kingdom
- Charles Hillman
- Sports Medicine Research Department, University of Nottingham, Nottingham, UK
- Brian R Waterman
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kerry Danelson
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kaitlin Henry
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Emily Barr
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kelsey Healy
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Anu M Räisänen
- Department of Physical Therapy Education - Oregon, College of Health Sciences-Northwest, Western University of Health Sciences, Lebanon, OR
- Sport Injury Prevention Research Centre, Faculty of Kinesiology, University of Calgary, Calgary, AB, Canada
- Christina Gomez
- Department of Physical Therapy Education - Oregon, College of Health Sciences-Northwest, Western University of Health Sciences, Lebanon, OR
- Garrett Fernandez
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Jakob Wolf
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Kristen F Nicholson
- Department of Orthopaedic Surgery & Rehabilitation, Wake Forest School of Medicine, Winston-Salem, NC
- Paula Dhiman
- Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology, and Musculoskeletal Sciences, University of Oxford, Oxford, United Kingdom
- Richard D Riley
- Institute of Applied Health Research, College of Medical and Dental Sciences, University of Birmingham, Birmingham, United Kingdom
- Gary S Collins
- Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology, and Musculoskeletal Sciences, University of Oxford, Oxford, United Kingdom
13
Dumanis SB, Ratan K, McIntosh S, Shah HV, Lewis M, Vines TH, Schekman R, Riley EA. From policy to practice: Lessons learned from an open science funding initiative. PLoS Comput Biol 2023; 19:e1011626. [PMID: 38060981] [PMCID: PMC10703508] [DOI: 10.1371/journal.pcbi.1011626]
Affiliation(s)
- Sonya B. Dumanis
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Kristen Ratan
- Strategies for Open Science (Stratos) and Incentivizing Collaborative Open Research (ICOR), Santa Cruz, California, United States of America
- Souad McIntosh
- DataSeer Research Data Services, Vancouver, British Columbia, Canada
- Hetal V. Shah
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Matt Lewis
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Timothy H. Vines
- DataSeer Research Data Services, Vancouver, British Columbia, Canada
- Randy Schekman
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Howard Hughes Medical Institute, University of California, Berkeley, Berkeley, United States of America
- Ekemini A. Riley
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
14
DuBois JM, Mozersky J, Parsons M, Walsh HA, Friedrich A, Pienta A. Exchanging words: Engaging the challenges of sharing qualitative research data. Proc Natl Acad Sci U S A 2023; 120:e2206981120. [PMID: 37831745] [PMCID: PMC10614603] [DOI: 10.1073/pnas.2206981120]
Abstract
In January 2023, a new NIH policy on data sharing went into effect. The policy applies to both quantitative and qualitative research (QR) data such as data from interviews or focus groups. QR data are often sensitive and difficult to deidentify, and thus have rarely been shared in the United States. Over the past 5 y, our research team has engaged stakeholders on QR data sharing, developed software to support data deidentification, produced guidance, and collaborated with the ICPSR data repository to pilot the deposit of 30 QR datasets. In this perspective article, we share important lessons learned by addressing eight clusters of questions on issues such as where, when, and what to share; how to deidentify data and support high-quality secondary use; budgeting for data sharing; and the permissions needed to share data. We also offer a brief assessment of the state of preparedness of data repositories, QR journals, and QR textbooks to support data sharing. While QR data sharing could yield important benefits to the research community, we quickly need to develop enforceable standards, expertise, and resources to support responsible QR data sharing. Absent these resources, we risk violating participant confidentiality and wasting a significant amount of time and funding on data that are not useful for either secondary use or data transparency and verification.
Affiliation(s)
- James M. DuBois
- Bioethics Research Center, Department of Medicine, Washington University School of Medicine, St. Louis, MO 63110
- Jessica Mozersky
- Bioethics Research Center, Department of Medicine, Washington University School of Medicine, St. Louis, MO 63110
- Meredith Parsons
- Bioethics Research Center, Department of Medicine, Washington University School of Medicine, St. Louis, MO 63110
- Heidi A. Walsh
- Bioethics Research Center, Department of Medicine, Washington University School of Medicine, St. Louis, MO 63110
- Annie Friedrich
- Bioethics Research Center, Department of Medicine, Washington University School of Medicine, St. Louis, MO 63110
- Amy Pienta
- ICPSR, Institute for Social Research, University of Michigan, Ann Arbor, MI 48106
15
Thibault RT, Amaral OB, Argolo F, Bandrowski AE, Davidson AR, Drude NI. Open Science 2.0: Towards a truly collaborative research ecosystem. PLoS Biol 2023; 21:e3002362. [PMID: 37856538] [PMCID: PMC10617723] [DOI: 10.1371/journal.pbio.3002362]
Abstract
Conversations about open science have reached the mainstream, yet many open science practices such as data sharing remain uncommon. Our efforts towards openness therefore need to increase in scale and aim for a more ambitious target. We need an ecosystem not only where research outputs are openly shared but also in which transparency permeates the research process from the start and lends itself to more rigorous and collaborative research. To support this vision, this Essay provides an overview of a selection of open science initiatives from the past 2 decades, focusing on methods transparency, scholarly communication, team science, and research culture, and speculates about what the future of open science could look like. It then draws on these examples to provide recommendations for how funders, institutions, journals, regulators, and other stakeholders can create an environment that is ripe for improvement.
Affiliation(s)
- Robert T. Thibault
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- Olavo B. Amaral
- Institute of Medical Biochemistry Leopoldo de Meis, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil
- Anita E. Bandrowski
- FAIR Data Informatics Lab, Department of Neuroscience, UCSD, San Diego, California, United States of America
- SciCrunch Inc., San Diego, California, United States of America
- Alexandra R. Davidson
- Institute for Evidence-Based Health Care, Bond University, Robina, Australia
- Faculty of Health Science and Medicine, Bond University, Robina, Australia
- Natascha I. Drude
- Berlin Institute of Health (BIH) at Charité, BIH QUEST Center for Responsible Research, Berlin, Germany
16
Vermeir JF, White MJ, Johnson D, Crombez G, Van Ryckeghem DML. Development and Evaluation of Linguistic Stimuli for Pain Research. J Pain 2023; 24:1843-1858. [PMID: 37268166] [DOI: 10.1016/j.jpain.2023.05.011]
Abstract
Linguistic stimuli are commonly used in research to investigate the processing of pain. To provide researchers with a dataset of pain-related and non-pain-related linguistic stimuli, this research investigated 1) the associative strength between pain-related words and the pain construct; 2) the pain-relatedness ratings of pain words; and 3) the variability in the relatedness of pain words within pain word classifications (eg, sensory pain words). In Study 1, 194 pain-related and matched non-pain-related words were retrieved by reviewing the pain-related attentional bias literature. In Study 2, adults with (n = 85) and without (n = 48) self-reported chronic pain completed a speeded word categorization paradigm and rated the pain-relatedness of a subset of pain words. Analyses revealed that 1) despite differences in associative strength of 11.3% of the words between chronic and non-chronic pain groups, no overall group difference was found, 2) the chronic pain group rated the pain words as more pain-related compared to the non-chronic pain group, and 3) there was variability in the relatedness of pain words within pain word classifications. The findings highlight the importance of validating linguistic pain stimuli. The resulting dataset is openly accessible and new published sets can be added to the Linguistic Materials for Pain (LMaP) Repository. PERSPECTIVE: This article presents the development and preliminary evaluation of a large pool of pain-related and non-pain-related words in adults with and without self-reported chronic pain. Findings are discussed and guidelines are offered to select the most suitable stimuli for future research.
Affiliation(s)
- Julie F Vermeir
- Queensland University of Technology (QUT), Faculty of Health, School of Psychology and Counselling, Brisbane, Australia
- Melanie J White
- Queensland University of Technology (QUT), Faculty of Health, School of Psychology and Counselling, Brisbane, Australia
- Daniel Johnson
- Queensland University of Technology (QUT), Faculty of Science, School of Computer Science, Brisbane, Australia
- Geert Crombez
- Ghent University, Department of Experimental Clinical and Health Psychology, Ghent, Belgium
- Dimitri M L Van Ryckeghem
- Ghent University, Department of Experimental Clinical and Health Psychology, Ghent, Belgium; Maastricht University, Department of Clinical Psychological Science, Maastricht, Netherlands; University of Luxembourg, Department of Behavioural and Cognitive Sciences, Esch-sur-Alzette, Luxembourg
17
Kimmel K, Avolio ML, Ferraro PJ. Empirical evidence of widespread exaggeration bias and selective reporting in ecology. Nat Ecol Evol 2023; 7:1525-1536. [PMID: 37537387] [DOI: 10.1038/s41559-023-02144-3]
Abstract
In many scientific disciplines, common research practices have led to unreliable and exaggerated evidence about scientific phenomena. Here we describe some of these practices and quantify their pervasiveness in recent ecology publications in five popular journals. In an analysis of over 350 studies published between 2018 and 2020, we detect empirical evidence of exaggeration bias and selective reporting of statistically significant results. This evidence implies that the published effect sizes in ecology journals exaggerate the importance of the ecological relationships that they aim to quantify. An exaggerated evidence base hinders the ability of empirical ecology to reliably contribute to science, policy, and management. To increase the credibility of ecology research, we describe a set of actions that ecologists should take, including changes to scientific norms about what high-quality ecology looks like and expectations about what high-quality studies can deliver.
Affiliation(s)
- Kaitlin Kimmel
- Mad Agriculture, Boulder, CO, USA
- Department of Earth and Planetary Sciences, Johns Hopkins University, Baltimore, MD, USA
- Meghan L Avolio
- Department of Earth and Planetary Sciences, Johns Hopkins University, Baltimore, MD, USA
- Paul J Ferraro
- Carey Business School, Johns Hopkins University, Baltimore, MD, USA
- Department of Environmental Health and Engineering, a joint department of the Bloomberg School of Public Health and the Whiting School of Engineering, Johns Hopkins University, Baltimore, MD, USA
18
Keener SK, Kepes S, Torka AK. The trustworthiness of the cumulative knowledge in industrial/organizational psychology: The current state of affairs and a path forward. Acta Psychol (Amst) 2023; 239:104005. [PMID: 37625919] [DOI: 10.1016/j.actpsy.2023.104005]
Abstract
The goal of industrial/organizational (IO) psychology is to build and organize trustworthy knowledge about people-related phenomena in the workplace. Unfortunately, as with other scientific disciplines, our discipline may be experiencing a "crisis of confidence" stemming from the lack of reproducibility and replicability of many of our field's research findings, which would suggest that much of our research may be untrustworthy. If a scientific discipline's research is deemed untrustworthy, it can have dire consequences, including the withdrawal of funding for future research. In this focal article, we review the current state of reproducibility and replicability in IO psychology and related fields. As part of this review, we discuss factors that make it less likely that research findings will be trustworthy, including the prevalence of scientific misconduct, questionable research practices (QRPs), and errors. We then identify some root causes of these issues and provide several potential remedies. In particular, we highlight the need for improved research methods and statistics training as well as a re-alignment of the incentive structure in academia. To accomplish this, we advocate for changes in the reward structure, improvements to the peer review process, and the implementation of open science practices. Overall, addressing the current "crisis of confidence" in IO psychology requires individual researchers, academic institutions, and publishers to embrace system-wide change.
Affiliation(s)
- Sheila K Keener
- Department of Management, Old Dominion University, Norfolk, VA, United States of America
- Sven Kepes
- Department of Management and Entrepreneurship, Virginia Commonwealth University, Richmond, VA, United States of America
- Ann-Kathrin Torka
- Department of Social, Work, and Organizational Psychology, TU Dortmund University, Dortmund, Germany
19
Brewin CR. Inaccuracy in the Scientific Record and Open Postpublication Critique. Perspect Psychol Sci 2023; 18:1244-1253. [PMID: 36745732] [PMCID: PMC10475207] [DOI: 10.1177/17456916221141357]
Abstract
There is growing evidence that the published psychological literature is marred by multiple errors and inaccuracies and often fails to reflect the changing nature of the knowledge base. At least four types of error are common: citation error, methodological error, statistical error, and interpretation error. In the face of the apparent inevitability of these inaccuracies, core scientific values such as openness and transparency require that correction mechanisms are readily available. In this article, I reviewed standard mechanisms in psychology journals and found them to have limitations. The effects of more widely enabling open postpublication critique in the same journal in addition to conventional peer review are considered. This mechanism is well established in medicine and the life sciences but rare in psychology and may assist psychological science to correct itself.
Affiliation(s)
- Chris R. Brewin
- Research Department of Clinical, Educational & Health Psychology, University College London
20
Zettersten M, Yurovsky D, Xu TL, Uner S, Tsui ASM, Schneider RM, Saleh AN, Meylan SC, Marchman VA, Mankewitz J, MacDonald K, Long B, Lewis M, Kachergis G, Handa K, deMayo B, Carstensen A, Braginsky M, Boyce V, Bhatt NS, Bergey CA, Frank MC. Peekbank: An open, large-scale repository for developmental eye-tracking data of children's word recognition. Behav Res Methods 2023; 55:2485-2500. [PMID: 36002623] [PMCID: PMC9950292] [DOI: 10.3758/s13428-022-01906-4]
Abstract
The ability to rapidly recognize words and link them to referents is central to children's early language development. This ability, often called word recognition in the developmental literature, is typically studied in the looking-while-listening paradigm, which measures infants' fixation on a target object (vs. a distractor) after hearing a target label. We present a large-scale, open database of infant and toddler eye-tracking data from looking-while-listening tasks. The goal of this effort is to address theoretical and methodological challenges in measuring vocabulary development. We first present how we created the database, its features and structure, and associated tools for processing and accessing infant eye-tracking datasets. Using these tools, we then work through two illustrative examples to show how researchers can use Peekbank to interrogate theoretical and methodological questions about children's developing word recognition ability.
Affiliation(s)
- Martin Zettersten
- Department of Psychology, Princeton University, 218 Peretsman Scully Hall, Princeton, NJ, 08540, USA
- Daniel Yurovsky
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, USA
- Tian Linger Xu
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
- Sarp Uner
- Data Science Institute, Vanderbilt University, Nashville, TN, USA
- Rose M Schneider
- Department of Psychology, University of California, San Diego, CA, USA
- Annissa N Saleh
- Department of Psychology, The University of Texas at Austin, Austin, TX, USA
- Stephan C Meylan
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Bria Long
- Department of Psychology, Stanford University, Stanford, CA, USA
- Molly Lewis
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, USA
- George Kachergis
- Department of Psychology, Stanford University, Stanford, CA, USA
- Benjamin deMayo
- Department of Psychology, Princeton University, 218 Peretsman Scully Hall, Princeton, NJ, 08540, USA
- Mika Braginsky
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Veronica Boyce
- Department of Psychology, Stanford University, Stanford, CA, USA
- Naiti S Bhatt
- Department of Psychology, New York University, New York, NY, USA
- Michael C Frank
- Department of Psychology, Stanford University, Stanford, CA, USA
21
Botvinik-Nezer R, Wager TD. Reproducibility in Neuroimaging Analysis: Challenges and Solutions. Biol Psychiatry Cogn Neurosci Neuroimaging 2023; 8:780-788. [PMID: 36906444] [DOI: 10.1016/j.bpsc.2022.12.006]
Abstract
Recent years have marked a renaissance in efforts to increase research reproducibility in psychology, neuroscience, and related fields. Reproducibility is the cornerstone of a solid foundation of fundamental research, one that will support new theories built on valid findings and technological innovation that works. The increased focus on reproducibility has made the barriers to it increasingly apparent, along with the development of new tools and practices to overcome these barriers. Here, we review challenges, solutions, and emerging best practices with a particular emphasis on neuroimaging studies. We distinguish 3 main types of reproducibility, discussing each in turn. Analytical reproducibility is the ability to reproduce findings using the same data and methods. Replicability is the ability to find an effect in new datasets, using the same or similar methods. Finally, robustness to analytical variability refers to the ability to identify a finding consistently across variation in methods. The incorporation of these tools and practices will result in more reproducible, replicable, and robust psychological and brain research and a stronger scientific foundation across fields of inquiry.
Affiliation(s)
- Rotem Botvinik-Nezer
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, New Hampshire
- Tor D Wager
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, New Hampshire
22
Cavanaugh R, Quique YM, Swiderski AM, Kallhoff L, Terhorst L, Wambaugh J, Hula WD, Evans WS. Reproducibility in Small-N Treatment Research: A Tutorial Using Examples From Aphasiology. J Speech Lang Hear Res 2023; 66:1908-1927. [PMID: 36542852] [PMCID: PMC10465158] [DOI: 10.1044/2022_jslhr-22-00333]
Abstract
PURPOSE: Small-N studies are the dominant study design supporting evidence-based interventions in communication science and disorders, including treatments for aphasia and related disorders. However, there is little guidance for conducting reproducible analyses or selecting appropriate effect sizes in small-N studies, which has implications for scientific review, rigor, and replication. This tutorial aims to (a) demonstrate how to conduct reproducible analyses using effect sizes common to research in aphasia and related disorders and (b) provide a conceptual discussion to improve the reader's understanding of these effect sizes. METHOD: We provide a tutorial on reproducible analyses of small-N designs in the statistical programming language R using published data from Wambaugh et al. (2017). In addition, we discuss the strengths, weaknesses, reporting requirements, and impact of experimental design decisions on effect sizes common to this body of research. RESULTS: Reproducible code demonstrates implementation and comparison of within-case standardized mean difference, proportion of maximal gain, tau-U, and frequentist and Bayesian mixed-effects models. Data, code, and an interactive web application are available as a resource for researchers, clinicians, and students. CONCLUSIONS: Pursuing reproducible research is key to promoting transparency in small-N treatment research. Researchers and clinicians must understand the properties of common effect size measures to make informed decisions in order to select ideal effect size measures and act as informed consumers of small-N studies. Together, a commitment to reproducibility and a keen understanding of effect sizes can improve the scientific rigor and synthesis of the evidence supporting clinical services in aphasiology and in communication sciences and disorders more broadly. Supplemental Material and Open Science Form: https://doi.org/10.23641/asha.21699476.
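One effect size this tutorial compares, the within-case standardized mean difference, is easy to compute by hand. The sketch below uses invented probe scores and the common Busk-Serlin form (treatment-minus-baseline mean difference scaled by the baseline SD); the tutorial's own R implementation may differ in detail.

```python
# Within-case standardized mean difference for a single-case design:
# d = (mean of treatment phase - mean of baseline phase) / SD of baseline.
# The probe scores below are invented for illustration.
import statistics

def within_case_smd(baseline, treatment):
    """Busk-Serlin within-case d using the baseline-phase SD as the scale."""
    return (statistics.mean(treatment) - statistics.mean(baseline)) / statistics.stdev(baseline)

baseline_probes = [2, 3, 2, 4, 3]    # correct responses per baseline session
treatment_probes = [6, 7, 8, 8, 9]   # correct responses per treatment session
print(round(within_case_smd(baseline_probes, treatment_probes), 2))  # 5.74
```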
Affiliation(s)
- Robert Cavanaugh
- Department of Communication Science and Disorders, University of Pittsburgh, PA
- Audiology and Speech Pathology Program, VA Pittsburgh Healthcare System, PA
- Yina M. Quique
- Center for Education in Health Sciences and Shirley Ryan Ability Lab, Northwestern University, Evanston, IL
- Alexander M. Swiderski
- Department of Communication Science and Disorders, University of Pittsburgh, PA
- Audiology and Speech Pathology Program, VA Pittsburgh Healthcare System, PA
- Center for Neural Basis of Cognition, Carnegie Mellon University, Pittsburgh, PA
- Lydia Kallhoff
- Department of Communication Sciences and Disorders, University of Utah, Salt Lake City
- Lauren Terhorst
- Department of Occupational Therapy, University of Pittsburgh, PA
- Julie Wambaugh
- Department of Communication Sciences and Disorders, University of Utah, Salt Lake City
- William D. Hula
- Department of Communication Science and Disorders, University of Pittsburgh, PA
- Audiology and Speech Pathology Program, VA Pittsburgh Healthcare System, PA
- William S. Evans
- Department of Communication Science and Disorders, University of Pittsburgh, PA
23
Liu L, Jones BF, Uzzi B, Wang D. Data, measurement and empirical methods in the science of science. Nat Hum Behav 2023. [PMID: 37264084] [DOI: 10.1038/s41562-023-01562-4]
Abstract
The advent of large-scale datasets that trace the workings of science has encouraged researchers from many different disciplinary backgrounds to turn scientific methods into science itself, cultivating a rapidly expanding 'science of science'. This Review considers this growing, multidisciplinary literature through the lens of data, measurement and empirical methods. We discuss the purposes, strengths and limitations of major empirical approaches, seeking to increase understanding of the field's diverse methodologies and expand researchers' toolkits. Overall, new empirical developments provide enormous capacity to test traditional beliefs and conceptual frameworks about science, discover factors associated with scientific productivity, predict scientific outcomes and design policies that facilitate scientific progress.
Affiliation(s)
- Lu Liu
- Center for Science of Science and Innovation, Northwestern University, Evanston, IL, USA
- Northwestern Institute on Complex Systems, Northwestern University, Evanston, IL, USA
- Kellogg School of Management, Northwestern University, Evanston, IL, USA
- College of Information Sciences and Technology, Pennsylvania State University, University Park, PA, USA
- Benjamin F Jones
- Center for Science of Science and Innovation, Northwestern University, Evanston, IL, USA
- Northwestern Institute on Complex Systems, Northwestern University, Evanston, IL, USA
- Kellogg School of Management, Northwestern University, Evanston, IL, USA
- National Bureau of Economic Research, Cambridge, MA, USA
- Brookings Institution, Washington, DC, USA
- Brian Uzzi
- Center for Science of Science and Innovation, Northwestern University, Evanston, IL, USA
- Northwestern Institute on Complex Systems, Northwestern University, Evanston, IL, USA
- Kellogg School of Management, Northwestern University, Evanston, IL, USA
- Dashun Wang
- Center for Science of Science and Innovation, Northwestern University, Evanston, IL, USA
- Northwestern Institute on Complex Systems, Northwestern University, Evanston, IL, USA
- Kellogg School of Management, Northwestern University, Evanston, IL, USA
- McCormick School of Engineering, Northwestern University, Evanston, IL, USA
24
Mathur MB, Fox MP. Toward Open and Reproducible Epidemiology. Am J Epidemiol 2023; 192:658-664. [PMID: 36627249] [PMCID: PMC10089067] [DOI: 10.1093/aje/kwad007]
Abstract
Starting in the 2010s, researchers in the experimental social sciences rapidly began to adopt increasingly open and reproducible scientific practices. These practices include publicly sharing deidentified data when possible, sharing analytical code, and preregistering study protocols. Empirical evidence from the social sciences suggests such practices are feasible, can improve analytical reproducibility, and can reduce selective reporting. In academic epidemiology, adoption of open-science practices has been slower than in the social sciences (with some notable exceptions, such as registering clinical trials). Epidemiologic studies are often large, complex, conceived after data have already been collected, and difficult to replicate directly by collecting new data. These characteristics make it especially important to ensure their integrity and analytical reproducibility. Open-science practices can also pay immediate dividends to researchers' own work by clarifying scientific reasoning and encouraging well-documented, organized workflows. We consider how established epidemiologists and early-career researchers alike can help midwife a culture of open science in epidemiology through their research practices, mentorship, and editorial activities.
Affiliation(s)
- Maya B Mathur
- Correspondence to Dr. Maya B. Mathur, Quantitative Sciences Unit, 3180 Porter Drive, Palo Alto, CA 94304
25
Sadeh Y, Denejkina A, Karyotaki E, Lenferink LIM, Kassam-Adams N. Opportunities for improving data sharing and FAIR data practices to advance global mental health. Glob Ment Health (Camb) 2023; 10:e14. [PMID: 37860102] [PMCID: PMC10581864] [DOI: 10.1017/gmh.2023.7]
Abstract
It is crucial to optimize global mental health research to address the high burden of mental health challenges and mental illness for individuals and societies. Data sharing and reuse have demonstrated value for advancing science and accelerating knowledge development. The FAIR (Findable, Accessible, Interoperable, and Reusable) Guiding Principles for scientific data provide a framework to improve the transparency, efficiency, and impact of research. In this review, we describe ethical and equity considerations in data sharing and reuse, delineate the FAIR principles as they apply to mental health research, and consider the current state of FAIR data practices in global mental health research, identifying challenges and opportunities. We describe noteworthy examples of collaborative efforts, often across disciplinary and national boundaries, to improve Findability and Accessibility of global mental health data, as well as efforts to create integrated data resources and tools that improve Interoperability and Reusability. Based on this review, we propose a vision for the future of FAIR global mental health research and suggest practical steps for researchers with regard to study planning, data preservation and indexing, machine-actionable metadata, data reuse to advance science and improve equity, and metrics and recognition.
Affiliation(s)
- Yaara Sadeh
- Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA, USA
- Trauma Data Institute, Lovingston, VA, USA
- Anna Denejkina
- Graduate Research School, Western Sydney University, Penrith, NSW, Australia
- Translational Health Research Institute, Sydney, Australia
- Young and Resilient Research Centre, Sydney, Australia
- Eirini Karyotaki
- Department of Clinical, Neuro- and Developmental Psychology, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Amsterdam Public Health Institute, Amsterdam, Netherlands
- Lonneke I. M. Lenferink
- Department of Psychology, Health & Technology, University of Twente, Enschede, Netherlands
- Department of Clinical Psychology, Utrecht University, Utrecht, Netherlands
- Department of Clinical Psychology and Experimental Psychopathology, University of Groningen, Groningen, Netherlands
- Nancy Kassam-Adams
- Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA, USA
- Trauma Data Institute, Lovingston, VA, USA
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
26
Buzbas EO, Devezer B, Baumgaertner B. The logical structure of experiments lays the foundation for a theory of reproducibility. ROYAL SOCIETY OPEN SCIENCE 2023; 10:221042. [PMID: 36938532] [PMCID: PMC10014247] [DOI: 10.1098/rsos.221042]
Abstract
The scientific reform movement has proposed openness as a potential remedy to the putative reproducibility or replication crisis. However, the conceptual relationship among openness, replication experiments and results reproducibility has been obscure. We analyse the logical structure of experiments, define the mathematical notion of idealized experiment and use this notion to advance a theory of reproducibility. Idealized experiments clearly delineate the concepts of replication and results reproducibility, and capture key differences with precision, allowing us to study the relationship among them. We show how results reproducibility varies as a function of the elements of an idealized experiment, the true data-generating mechanism, and the closeness of the replication experiment to an original experiment. We clarify how openness of experiments is related to designing informative replication experiments and to obtaining reproducible results. With formal backing and evidence, we argue that the current 'crisis' reflects inadequate attention to a theoretical understanding of results reproducibility.
Affiliation(s)
- Erkan O. Buzbas
- Department of Mathematics and Statistical Science, University of Idaho, Moscow, ID 83844, USA
- Berna Devezer
- Department of Mathematics and Statistical Science, University of Idaho, Moscow, ID 83844, USA
- Department of Business, University of Idaho, Moscow, ID 83844, USA
- Bert Baumgaertner
- Department of Politics and Philosophy, University of Idaho, Moscow, ID 83844, USA
27
Wulff JN, Sajons GB, Pogrebna G, Lonati S, Bastardoz N, Banks GC, Antonakis J. Common methodological mistakes. THE LEADERSHIP QUARTERLY 2023. [DOI: 10.1016/j.leaqua.2023.101677]
28
Si L, Liu L, He Y. Scientific data management policy in China: a quantitative content analysis based on policy text. ASLIB J INFORM MANAG 2023. [DOI: 10.1108/ajim-05-2022-0257]
Abstract
Purpose: This paper aims to understand the current development of scientific data management policy in China, analyze the content structure of the policy and provide a theoretical basis for the improvement and optimization of the policy system. Design/methodology/approach: China's scientific data management policies were obtained through various channels, such as searching government websites and policy and legal databases, and 209 policies were finally identified as the sample for analysis after screening and integration. A three-dimensional framework was constructed from the perspective of policy tools, combining stakeholder and lifecycle theories, and the content of the policy texts was coded and quantitatively analyzed according to this framework. Findings: China's scientific data management policies can be divided into four stages in chronological order: infancy, preliminary exploration, comprehensive promotion and key implementation. The policies use a combination of three types of policy tools (supply-side, environmental-side and demand-side), involve multiple stakeholders and cover all stages of the lifecycle, but the application of policy tools across stakeholders and lifecycle stages is imbalanced. Future policy development should balance the use of policy tools, promote the participation of multiple subjects and focus on supervision of the whole lifecycle. Originality/value: This paper constructs a three-dimensional analytical framework and uses content analysis to quantitatively analyze scientific data management policy texts, extending the research perspective and content in the field of scientific data management. The study identifies policy focuses and proposes several strategies to help optimize the scientific data management policy system.
29
Claesen A, Vanpaemel W, Maerten AS, Verliefde T, Tuerlinckx F, Heyman T. Data sharing upon request and statistical consistency errors in psychology: A replication of Wicherts, Bakker and Molenaar (2011). PLoS One 2023; 18:e0284243. [PMID: 37053137] [PMCID: PMC10101414] [DOI: 10.1371/journal.pone.0284243]
Abstract
Sharing research data allows the scientific community to verify and build upon published work. However, data sharing is not yet common practice. The reasons for not sharing data are myriad: some are practical, others are more fear-related. One particular fear is that a reanalysis may expose errors. To evaluate this explanation, it would be useful to know whether authors who do not share data genuinely made more errors than authors who do share data. Wicherts, Bakker and Molenaar (2011) examined errors that can be discovered from the published manuscript alone, because it is impossible to reanalyze unavailable data. They found a higher prevalence of such errors in papers for which the data were not shared. However, Nuijten et al. (2017) did not find support for this finding in three large studies. To shed more light on this relation, we conducted a replication of the study by Wicherts et al. (2011). Our study consisted of two parts. In the first part, we reproduced the analyses from Wicherts et al. (2011) to verify the results, and we carried out several alternative analytical approaches to evaluate the robustness of the results against other analytical decisions. In the second part, we used a unique and larger data set on data sharing upon request for reanalysis, originating from Vanpaemel et al. (2015), to replicate the findings of Wicherts et al. (2011). We applied statcheck to detect consistency errors in all included papers and manually corrected false positives. Finally, we again assessed the robustness of the replication results against other analytical decisions. Taken together, we found no robust empirical evidence for the claim that not sharing research data for reanalysis is associated with consistency errors.
Affiliation(s)
- Aline Claesen
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
- Wolf Vanpaemel
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
- Anne-Sofie Maerten
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
- Thomas Verliefde
- Department of Psychology, University of Tuebingen, Tuebingen, Germany
- Francis Tuerlinckx
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
- Tom Heyman
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
- Faculty of Social Sciences, Leiden University, Leiden, The Netherlands
30
Huff M, Bongartz EC. Low Research-Data Availability in Educational-Psychology Journals: No Indication of Effective Research-Data Policies. ADVANCES IN METHODS AND PRACTICES IN PSYCHOLOGICAL SCIENCE 2023. [DOI: 10.1177/25152459231156419]
Abstract
Research-data availability contributes to the transparency of the research process and the credibility of educational-psychology research and science in general. Recently, there have been many initiatives to increase the availability and quality of research data. Many research institutions have adopted research-data policies. This increased awareness might have led to more sharing of research data in empirical articles. To test this idea, we coded 1,242 publications from six educational-psychology journals and the psychological journal Cognition (as a baseline) published in 2018 and 2020. Research-data availability was low (3.85% compared with 62.74% in Cognition) but increased from 0.32% (2018) to 7.16% (2020). However, neither the data-transparency level of the journal nor the existence of an official research-data policy at the level of the corresponding author’s institution was related to research-data availability. We discuss the consequences of these findings for institutional research-data-management processes.
31
Abstract
Concerns about a crisis of mass irreplicability across scientific fields ("the replication crisis") have stimulated a movement for open science, encouraging or even requiring researchers to publish their raw data and analysis code. Recently, a rule at the US Environmental Protection Agency (US EPA) would have imposed a strong open data requirement. The rule prompted significant public discussion about whether open science practices are appropriate for fields of environmental public health. The aims of this paper are to assess (1) whether the replication crisis extends to fields of environmental public health; and (2) in general whether open science requirements can address the replication crisis. There is little empirical evidence for or against mass irreplicability in environmental public health specifically. Without such evidence, strong claims about whether the replication crisis extends to environmental public health - or not - seem premature. By distinguishing three concepts - reproducibility, replicability, and robustness - it is clear that open data initiatives can promote reproducibility and robustness but do little to promote replicability. I conclude by reviewing some of the other benefits of open science, and offer some suggestions for funding streams to mitigate the costs of adoption of open science practices in environmental public health.
32
Déglin SE, Burstyn I, Chen CL, Miller DJ, Gribble MO, Hamade AK, Chang ET, Avanasi R, Boon D, Reed J. Considerations towards the better integration of epidemiology into quantitative risk assessment. GLOBAL EPIDEMIOLOGY 2022; 4:100084. [PMID: 37637021] [PMCID: PMC10445996] [DOI: 10.1016/j.gloepi.2022.100084]
Abstract
Environmental epidemiology has proven critical for studying associations between environmental exposures and adverse human health effects. However, there is a perception that it often does not sufficiently inform quantitative risk assessment. To help address this concern, in 2017, the Health and Environmental Sciences Institute initiated a project engaging the epidemiology, exposure science, and risk assessment communities, with tripartite representation from government agencies, industry, and academia, in a dialogue on the use of environmental epidemiology for quantitative risk assessment and public health decision making. As part of this project, four meetings attended by experts in epidemiology, exposure science, toxicology, statistics, and risk assessment, as well as one additional meeting engaging funding agencies, were organized to explore incentives and barriers to realizing the full potential of epidemiological data in quantitative risk assessment. A set of questions was shared with workshop participants prior to the meetings, and two case studies were used to support the discussion. Five key ideas emerged from these meetings as areas of desired improvement to ensure that human data can more consistently become an integral part of quantitative risk assessment: 1) reducing confirmation and publication bias, 2) increasing communication with funding agencies to raise awareness of research needs, 3) developing alternative funding channels targeted to support quantitative risk assessment, 4) making data available for reuse and analysis, and 5) developing cross-disciplinary and cross-sectoral interactions, collaborations, and training. We explored and integrated these themes into a roadmap illustrating the need for a multi-stakeholder effort to ensure that epidemiological data can fully contribute to the quantitative evaluation of human health risks, and to build confidence in a reliable decision-making process that leverages the totality of scientific evidence.
Affiliation(s)
- Sandrine E. Déglin
- Health and Environmental Sciences Institute, Washington, DC, United States of America
- Igor Burstyn
- Department of Environmental and Occupational Health, Drexel University, Philadelphia, PA, United States of America
- Connie L. Chen
- Health and Environmental Sciences Institute, Washington, DC, United States of America
- David J. Miller
- U.S. Environmental Protection Agency, Washington, DC, United States of America
- Matthew O. Gribble
- Department of Epidemiology, University of Alabama at Birmingham School of Public Health, Birmingham, AL, United States of America
- Ali K. Hamade
- Oregon Health Authority, Portland, OR, United States of America
- Ellen T. Chang
- Center for Health Sciences, Exponent, Inc., Menlo Park, CA, United States of America
- Denali Boon
- Corteva Agriscience, Indianapolis, IN, United States of America
- Jennifer Reed
- Bayer Crop Science, Chesterfield, MO, United States of America
33
Nguyen PY, Kanukula R, McKenzie JE, Alqaidoom Z, Brennan SE, Haddaway NR, Hamilton DG, Karunananthan S, McDonald S, Moher D, Nakagawa S, Nunan D, Tugwell P, Welch VA, Page MJ. Changing patterns in reporting and sharing of review data in systematic reviews with meta-analysis of the effects of interventions: cross sectional meta-research study. BMJ 2022; 379:e072428. [PMID: 36414269] [PMCID: PMC9679891] [DOI: 10.1136/bmj-2022-072428]
Abstract
OBJECTIVES To examine changes in completeness of reporting and frequency of sharing data, analytical code, and other review materials in systematic reviews over time; and factors associated with these changes. DESIGN Cross sectional meta-research study. POPULATION Random sample of 300 systematic reviews with meta-analysis of aggregate data on the effects of a health, social, behavioural, or educational intervention. Reviews were indexed in PubMed, Science Citation Index, Social Sciences Citation Index, Scopus, and Education Collection in November 2020. MAIN OUTCOME MEASURES The extent of complete reporting and the frequency of sharing review materials in the systematic reviews indexed in 2020 were compared with 110 systematic reviews indexed in February 2014. Associations between completeness of reporting and various factors (eg, self-reported use of reporting guidelines, journal policies on data sharing) were examined by calculating risk ratios and 95% confidence intervals. RESULTS Several items were reported suboptimally among 300 systematic reviews from 2020, such as a registration record for the review (n=113; 38%), a full search strategy for at least one database (n=214; 71%), methods used to assess risk of bias (n=185; 62%), methods used to prepare data for meta-analysis (n=101; 34%), and source of funding for the review (n=215; 72%). Only a few items not already reported at a high frequency in 2014 were reported more frequently in 2020. No evidence indicated that reviews using a reporting guideline were more completely reported than reviews not using a guideline. Reviews published in 2020 in journals that mandated either data sharing or inclusion of data availability statements were more likely to share their review materials (eg, data, code files) than reviews in journals without such mandates (16/87 (18%) v 4/213 (2%)). 
CONCLUSION Incomplete reporting of several recommended items for systematic reviews persists, even in reviews that claim to have followed a reporting guideline. Journal policies on data sharing might encourage sharing of review materials.
Affiliation(s)
- Phi-Yen Nguyen
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Raju Kanukula
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Joanne E McKenzie
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Zainab Alqaidoom
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Sue E Brennan
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- Neal R Haddaway
- Leibniz-Centre for Agricultural Landscape Research, Müncheberg, Germany
- Stockholm Environment Institute, Stockholm, Sweden
- African Centre for Evidence, University of Johannesburg, Johannesburg, South Africa
- Daniel G Hamilton
- School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
- Sathya Karunananthan
- Interdisciplinary School of Health Sciences, University of Ottawa, Ottawa, ON, Canada
- Bruyère Research Institute, Ottawa, ON, Canada
- Steve McDonald
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
- David Moher
- Centre for Journalology, Clinical Epidemiology Programme, Ottawa Hospital Research Institute, Ottawa, ON, Canada
- School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Shinichi Nakagawa
- Evolution & Ecology Research Centre and School of Biological, Earth and Environmental Sciences, University of New South Wales, Sydney, NSW, Australia
- David Nunan
- Centre for Evidence-Based Medicine, Nuffield Department of Primary Care Health Sciences, Oxford University, Oxford, UK
- Peter Tugwell
- Bruyère Research Institute, Ottawa, ON, Canada
- School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Department of Medicine, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Vivian A Welch
- Bruyère Research Institute, Ottawa, ON, Canada
- School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
- Matthew J Page
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
34
Ehlers MR, Lonsdorf TB. Data sharing in experimental fear and anxiety research: From challenges to a dynamically growing database in 10 simple steps. Neurosci Biobehav Rev 2022; 143:104958. [DOI: 10.1016/j.neubiorev.2022.104958]
35
Schiavone SR, Vazire S. Reckoning With Our Crisis: An Agenda for the Field of Social and Personality Psychology. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2022; 18:710-722. [PMID: 36301777] [DOI: 10.1177/17456916221101060]
Abstract
The replication crisis and credibility revolution in the 2010s brought a wave of doubts about the credibility of social and personality psychology. We argue that as a field, we must reckon with the concerns brought to light during this critical decade. How the field responds to this crisis will reveal our commitment to self-correction. If we do not take the steps necessary to address our problems and simply declare the crisis to be over or the problems to be fixed without evidence, we risk further undermining our credibility. To fully reckon with this crisis, we must empirically assess the state of the field to take stock of how credible our science actually is and whether it is improving. We propose an agenda for metascientific research, and we review approaches to empirically evaluate and track where we are as a field (e.g., analyzing the published literature, surveying researchers). We describe one such project (Surveying the Past and Present State of Published Studies in Social and Personality Psychology) underway in our research group. Empirical evidence about the state of our field is necessary if we are to take self-correction seriously and if we hope to avert future crises.
Affiliation(s)
- Simine Vazire
- Melbourne School of Psychological Sciences, University of Melbourne
36
Forscher PS, Wagenmakers EJ, Coles NA, Silan MA, Dutra N, Basnight-Brown D, IJzerman H. The Benefits, Barriers, and Risks of Big-Team Science. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2022; 18:607-623. [PMID: 36190899] [DOI: 10.1177/17456916221082970]
Abstract
Progress in psychology has been frustrated by challenges concerning replicability, generalizability, strategy selection, inferential reproducibility, and computational reproducibility. Although often discussed separately, these five challenges may share a common cause: insufficient investment of intellectual and nonintellectual resources into the typical psychology study. We suggest that the emerging emphasis on big-team science can help address these challenges by allowing researchers to pool their resources together to increase the amount available for a single study. However, the current incentives, infrastructure, and institutions in academic science have all developed under the assumption that science is conducted by solo principal investigators and their dependent trainees, an assumption that creates barriers to sustainable big-team science. We also anticipate that big-team science carries unique risks, such as the potential for big-team-science organizations to be co-opted by unaccountable leaders, become overly conservative, and make mistakes at a grand scale. Big-team-science organizations must also acquire personnel who are properly compensated and have clear roles. Not doing so raises risks related to mismanagement and a lack of financial sustainability. If researchers can manage its unique barriers and risks, big-team science has the potential to spur great progress in psychology and beyond.
Affiliation(s)
- Patrick S Forscher
- Research and Innovation Division, Busara Center for Behavioral Economics, Nairobi, Kenya
- Laboratoire Interuniversitaire de Psychologie, Université Grenoble Alpes
- Nicholas A Coles
- Center for the Study of Language and Information, Stanford University
- Miguel Alejandro Silan
- Unité de recherche Développement Individu Processus Handicap Éducation, Université Lumière Lyon 2
- Annecy Behavioral Science Lab, Menthon-Saint-Bernard, France
- Social and Political Laboratory, Psychology Department, University of the Philippines Diliman
- Natália Dutra
- Núcleo de Teoria e Pesquisa do Comportamento, Universidade Federal do Pará
- Hans IJzerman
- Laboratoire Interuniversitaire de Psychologie, Université Grenoble Alpes
- Institut Universitaire de France
37
Federer LM. Long-term availability of data associated with articles in PLOS ONE. PLoS One 2022; 17:e0272845. [PMID: 36001577] [PMCID: PMC9401135] [DOI: 10.1371/journal.pone.0272845]
Abstract
The adoption of journal policies requiring authors to include a Data Availability Statement has helped to increase the availability of research data associated with research articles. However, having a Data Availability Statement is not a guarantee that readers will be able to locate the data; even if provided with an identifier like a uniform resource locator (URL) or a digital object identifier (DOI), the data may become unavailable due to link rot and content drift. To explore the long-term availability of resources including data, code, and other digital research objects associated with papers, this study extracted 8,503 URLs and DOIs from a corpus of nearly 50,000 Data Availability Statements from papers published in PLOS ONE between 2014 and 2016. These URLs and DOIs were used to attempt to retrieve the data through both automated and manual means. Overall, 80% of the resources could be retrieved automatically, compared to much lower retrieval rates of 10–40% found in previous papers that relied on contacting authors to locate data. Because a URL or DOI might be valid but still not point to the resource, a subset of 350 URLs and 350 DOIs were manually tested, with 78% and 98% of resources, respectively, successfully retrieved. Having a DOI and being shared in a repository were both positively associated with availability. Although resources associated with older papers were slightly less likely to be available, this difference was not statistically significant, suggesting that URLs and DOIs may be an effective means for accessing data over time. These findings point to the value of including URLs and DOIs in Data Availability Statements to ensure access to data on a long-term basis.
Collapse
Affiliation(s)
- Lisa M. Federer
- Office of Strategic Initiatives, National Library of Medicine, National Institutes of Health, Bethesda, Maryland, United States of America

| |
Collapse
|
38
|
Abstract
Concern over social scientists' inability to reproduce empirical research has spawned a vast and rapidly growing literature. The size and growth of this literature make it difficult for newly interested academics to come up to speed. Here, we provide a formal text modeling approach to characterize the entirety of the field, which allows us to summarize the breadth of this literature and identify core themes. We construct and analyze text networks built from 1,947 articles to reveal differences across social science disciplines within the body of reproducibility publications and to discuss the diversity of subtopics addressed in the literature. This field-wide view suggests that reproducibility is a heterogeneous problem with multiple sources for errors and strategies for solutions, a finding that is somewhat at odds with calls for largely passive remedies reliant on open science. We propose an alternative rigor and reproducibility model that takes an active approach to rigor prior to publication, which may overcome some of the shortfalls of the postpublication model.
Collapse
Affiliation(s)
- James W Moody
- Department of Sociology, Duke University, Durham, North Carolina, USA
- Duke Network Analysis Center, Duke University, Durham, North Carolina, USA
| | - Lisa A Keister
- Department of Sociology, Duke University, Durham, North Carolina, USA
- Duke Network Analysis Center, Duke University, Durham, North Carolina, USA
- Sanford School of Public Policy, Duke University, Durham, North Carolina, USA
| | - Maria C Ramos
- Interdisciplinary Social Science Program, Florida State University, Tallahassee, Florida, USA
| |
Collapse
|
39
|
Jiao C, Li K, Fang Z. Data sharing practices across knowledge domains: A dynamic examination of data availability statements in PLOS ONE publications. J Inf Sci 2022. [DOI: 10.1177/01655515221101830] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
As the importance of research data gradually grows in sciences, data sharing has come to be encouraged and even mandated by journals and funders in recent years. Following this trend, the data availability statement has been increasingly embraced by academic communities as a means of sharing research data as part of research articles. This article presents a quantitative study of which mechanisms and repositories are used to share research data in PLOS ONE articles. We offer a dynamic examination of this topic from the disciplinary and temporal perspectives based on all statements in English-language research articles published between 2014 and 2020 in the journal. We find a slow yet steady growth in the use of data repositories to share data over time, as opposed to sharing data in the article and/or supplementary materials; this indicates improved compliance with the journal’s data sharing policies. We also find that multidisciplinary data repositories have been increasingly used over time, whereas some disciplinary repositories show a decreasing trend. Our findings can help academic publishers and funders to improve their data sharing policies and serve as an important baseline dataset for future studies on data sharing activities.
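The abstract distinguishes sharing via data repositories from sharing within the article or supplementary materials. A keyword-based classifier over Data Availability Statements might look like the sketch below; the repository names and phrases are illustrative assumptions, not the coding scheme the authors actually used.

```python
import re

# Illustrative patterns only; the study's real coding scheme is not given
# in the abstract, so these keyword rules are assumptions.
REPOSITORY = re.compile(
    r"\b(dryad|zenodo|figshare|osf|dataverse|genbank)\b", re.IGNORECASE
)
IN_ARTICLE = re.compile(
    r"\b(within the (paper|article)|supporting information|supplementary)\b",
    re.IGNORECASE,
)


def classify_das(statement):
    """Roughly bin a Data Availability Statement by sharing mechanism."""
    if REPOSITORY.search(statement):
        return "repository"
    if IN_ARTICLE.search(statement):
        return "article/supplement"
    return "other"
```

Such rules only approximate a manual coding pass, but they make the repository-versus-supplement distinction the study tracks over time concrete.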
Collapse
Affiliation(s)
- Chenyue Jiao
- School of Information Sciences, University of Illinois Urbana-Champaign, USA
| | - Kai Li
- School of Information Resource Management, Renmin University of China, China
| | - Zhichao Fang
- Centre for Science and Technology Studies, Leiden University, The Netherlands
| |
Collapse
|
40
|
Roche DG, Berberi I, Dhane F, Lauzon F, Soeharjono S, Dakin R, Binning SA. Slow improvement to the archiving quality of open datasets shared by researchers in ecology and evolution. Proc Biol Sci 2022; 289:20212780. [PMID: 35582791 PMCID: PMC9114975 DOI: 10.1098/rspb.2021.2780] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/04/2023] Open
Abstract
Many leading journals in ecology and evolution now mandate open data upon publication. Yet, there is very little oversight to ensure the completeness and reusability of archived datasets, and we currently have a poor understanding of the factors associated with high-quality data sharing. We assessed 362 open datasets linked to first- or senior-authored papers published by 100 principal investigators (PIs) in the fields of ecology and evolution over a period of 7 years to identify predictors of data completeness and reusability (data archiving quality). Datasets scored low on these metrics: 56.4% were complete and 45.9% were reusable. Data reusability, but not completeness, was slightly higher for more recently archived datasets and PIs with less seniority. Journal open data policy, PI gender and PI corresponding author status were unrelated to data archiving quality. However, PI identity explained a large proportion of the variance in data completeness (27.8%) and reusability (22.0%), indicating consistent inter-individual differences in data sharing practices by PIs across time and contexts. Several PIs consistently shared data of either high or low archiving quality, but most PIs were inconsistent in how well they shared. One explanation for the high intra-individual variation we observed is that PIs often conduct research through students and postdoctoral researchers, who may be responsible for the data collection, curation and archiving. Levels of data literacy vary among trainees and PIs may not regularly perform quality control over archived files. Our findings suggest that research data management training and culture within a PI's group are likely to be more important determinants of data archiving quality than other factors such as a journal's open data policy. Greater incentives and training for individual researchers at all career stages could improve data sharing practices and enhance data transparency and reusability.
Collapse
Affiliation(s)
- Dominique G. Roche
- Department of Biology, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario, Canada K1S 5B6; Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7; Institut de Biologie, Université de Neuchâtel, Neuchâtel 2000, Switzerland
| | - Ilias Berberi
- Department of Biology, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario, Canada K1S 5B6
| | - Fares Dhane
- Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7
| | - Félix Lauzon
- Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7; Department of Biology, McGill University, Montréal, Canada H3A 1B1
| | - Sandrine Soeharjono
- Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7
| | - Roslyn Dakin
- Department of Biology, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario, Canada K1S 5B6
| | - Sandra A. Binning
- Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7
| |
Collapse
|
41
|
Pessin VZ, Yamane LH, Siman RR. Smart bibliometrics: an integrated method of science mapping and bibliometric analysis. Scientometrics 2022. [DOI: 10.1007/s11192-022-04406-6] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
42
|
Evans TR, Pownall M, Collins E, Henderson EL, Pickering JS, O'Mahony A, Zaneva M, Jaquiery M, Dumbalska T. A network of change: united action on research integrity. BMC Res Notes 2022; 15:141. [PMID: 35421988 PMCID: PMC9008612 DOI: 10.1186/s13104-022-06026-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/16/2022] [Accepted: 04/04/2022] [Indexed: 11/10/2022] Open
Abstract
The last decade has seen renewed concern within the scientific community over the reproducibility and transparency of research findings. This paper outlines some of the various responsibilities of stakeholders in addressing the systemic issues that contribute to this concern. In particular, this paper asserts that a united, joined-up approach is needed, in which all stakeholders, including researchers, universities, funders, publishers, and governments, work together to set standards of research integrity and engender scientific progress and innovation. Using two developments as examples: the adoption of Registered Reports as a discrete initiative, and the use of open data as an ongoing norm change, we discuss the importance of collaboration across stakeholders.
Collapse
Affiliation(s)
- Thomas Rhys Evans
- School of Human Sciences, University of Greenwich, London, England; Institute for Lifecourse Development, University of Greenwich, London, England.
| | | | | | | | | | - Aoife O'Mahony
- School of Psychology, Cardiff University, Cardiff, Wales
| | | | | | | |
Collapse
|
43
|
Abstract
Background: Numerous mechanisms exist to incentivise researchers to share their data. This scoping review aims to identify and summarise evidence of the efficacy of different interventions to promote open data practices and provide an overview of current research. Methods: This scoping review is based on data identified from Web of Science and LISTA, limited from 2016 to 2021. A total of 1128 papers were screened, with 38 items being included. Items were selected if they focused on designing or evaluating an intervention or presenting an initiative to incentivise sharing. Items comprised a mixture of research papers, opinion pieces and descriptive articles. Results: Seven major themes in the literature were identified: publisher/journal data sharing policies, metrics, software solutions, research data sharing agreements in general, open science ‘badges’, funder mandates, and initiatives. Conclusions: A number of key messages for data sharing include: the need to build on existing cultures and practices, meeting people where they are and tailoring interventions to support them; the importance of publicising and explaining the policy/service widely; the need to have disciplinary data champions to model good practice and drive cultural change; the requirement to resource interventions properly; and the imperative to provide robust technical infrastructure and protocols, such as labelling of data sets, use of DOIs, data standards and use of data repositories.
Collapse
Affiliation(s)
- Helen Buckley Woods
- Research on Research Institute, Information School, University of Sheffield, Sheffield, South Yorkshire, S10 2TN, UK
| | - Stephen Pinfield
- Research on Research Institute, Information School, University of Sheffield, Sheffield, South Yorkshire, S10 2TN, UK
| |
Collapse
|
45
|
Walia T, Kalra G, Mathur VP, Dhillon JK. Authors submission guidelines, a survey of pediatric dentistry journals regarding ethical issues. PLoS One 2022; 17:e0261881. [PMID: 35045095 PMCID: PMC8769321 DOI: 10.1371/journal.pone.0261881] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/05/2021] [Accepted: 12/13/2021] [Indexed: 12/24/2022] Open
Abstract
OBJECTIVE To assess the pattern of instructions regarding the ethical requirements given to authors in various Pediatric Dental Journals. MATERIAL & METHODS A cross-sectional survey of 'instructions for authors' was done to analyse guidelines on ethical processes. Instructions to authors in journals of pediatric dentistry across the globe were reviewed for guidelines with regards to fourteen key ethical issues. Descriptive statistics were used, and results were expressed in percentages as well as numbers. RESULTS Of the 18 journals of pediatric dentistry, all 14 ethical issues were covered by the instructions to authors in only three journals, with only 50% of these providing clarity about authorship using ICMJE guidelines. Furthermore, COI declaration was found to be mandatory in about 44% of the journals. 38.9% of the sampled journals mentioned guidelines on research misconduct, publication issues such as plagiarism, overlapping/fragmented publications, and availability of raw research data from authors. Guidelines on handling of complaints about the editorial team were provided to authors by slightly over 33% of the selected pediatric dentistry titles, while handling of complaints about authors and reviewers was mentioned in 16.7% and 55.6% of the journals respectively. CONCLUSION A significant proportion of Journals of Pediatric Dentistry did not provide adequate instructions to authors regarding ethical issues.
Collapse
Affiliation(s)
- Tarun Walia
- College of Dentistry, Ajman University, Ajman, United Arab Emirates
| | - Gauri Kalra
- Department of Pedodontics & Preventive Dentistry, Sudha Rustagi College of Dental Sciences & Research, Faridabad, Haryana, India
| | - Vijay Prakash Mathur
- Division of Pedodontics and Preventive Dentistry, Centre for Dental Education and Research, All India Institute of Medical Sciences, New Delhi, India
| | - Jatinder Kaur Dhillon
- College of Dental Medicine, Nova Southeastern University, Florida, United States of America
| |
Collapse
|
46
|
Scheel AM. Why most psychological research findings are not even wrong. Infant Child Dev 2022. [DOI: 10.1002/icd.2295] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/08/2023]
Affiliation(s)
- Anne M. Scheel
- Human-Technology Interaction Group, Eindhoven University of Technology, Eindhoven, The Netherlands
| |
Collapse
|
47
|
Hardwicke TE, Thibault RT, Kosie JE, Wallach JD, Kidwell MC, Ioannidis JPA. Estimating the Prevalence of Transparency and Reproducibility-Related Research Practices in Psychology (2014-2017). Perspect Psychol Sci 2022; 17:239-251. [PMID: 33682488 PMCID: PMC8785283 DOI: 10.1177/1745691620979806] [Citation(s) in RCA: 34] [Impact Index Per Article: 17.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Psychologists are navigating an unprecedented period of introspection about the credibility and utility of their discipline. Reform initiatives emphasize the benefits of transparency and reproducibility-related research practices; however, adoption across the psychology literature is unknown. Estimating the prevalence of such practices will help to gauge the collective impact of reform initiatives, track progress over time, and calibrate future efforts. To this end, we manually examined a random sample of 250 psychology articles published between 2014 and 2017. Over half of the articles were publicly available (154/237, 65%, 95% confidence interval [CI] = [59%, 71%]); however, sharing of research materials (26/183; 14%, 95% CI = [10%, 19%]), study protocols (0/188; 0%, 95% CI = [0%, 1%]), raw data (4/188; 2%, 95% CI = [1%, 4%]), and analysis scripts (1/188; 1%, 95% CI = [0%, 1%]) was rare. Preregistration was also uncommon (5/188; 3%, 95% CI = [1%, 5%]). Many articles included a funding disclosure statement (142/228; 62%, 95% CI = [56%, 69%]), but conflict-of-interest statements were less common (88/228; 39%, 95% CI = [32%, 45%]). Replication studies were rare (10/188; 5%, 95% CI = [3%, 8%]), and few studies were included in systematic reviews (21/183; 11%, 95% CI = [8%, 16%]) or meta-analyses (12/183; 7%, 95% CI = [4%, 10%]). Overall, the results suggest that transparency and reproducibility-related research practices were far from routine. These findings establish baseline prevalence estimates against which future progress toward increasing the credibility and utility of psychology research can be compared.
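The prevalence estimates above pair each proportion with a 95% confidence interval. The abstract does not state which interval method the authors used, so the Wilson score interval below (a common choice for proportions near 0 or 1) is an illustrative assumption rather than their actual procedure.

```python
from math import sqrt


def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion k/n.

    Returns (lower, upper) bounds. Unlike the simple Wald interval,
    the Wilson interval behaves sensibly for small counts.
    """
    p = k / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half
```

For the 154/237 publicly available articles this gives roughly [59%, 71%], matching the interval reported in the abstract.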
Collapse
Affiliation(s)
- Tom E. Hardwicke
- Department of Psychology, University of Amsterdam
- Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité–Universitätsmedizin Berlin
| | - Robert T. Thibault
- School of Psychological Science, University of Bristol
- MRC Integrative Epidemiology Unit at the University of Bristol
| | | | - Joshua D. Wallach
- Department of Environmental Health Sciences, Yale School of Public Health
| | | | - John P. A. Ioannidis
- Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité–Universitätsmedizin Berlin
- Department of Medicine, Stanford University
- Meta-Research Innovation Center at Stanford, Stanford University
| |
Collapse
|
48
|
Woods HB, Pinfield S. Incentivising research data sharing: a scoping review. Wellcome Open Res 2021; 6:355. [DOI: 10.12688/wellcomeopenres.17286.1] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/11/2021] [Indexed: 11/20/2022] Open
Abstract
Background: Numerous mechanisms exist to incentivise researchers to share their data. This scoping review aims to identify and summarise evidence of the efficacy of different interventions to promote open data practices and provide an overview of current research. Methods: This scoping review is based on data identified from Web of Science and LISTA, limited from 2016 to 2021. A total of 1128 papers were screened, with 38 items being included. Items were selected if they focused on designing or evaluating an intervention or presenting an initiative to incentivise sharing. Items comprised a mixture of research papers, opinion pieces and descriptive articles. Results: Seven major themes in the literature were identified: publisher/journal data sharing policies, metrics, software solutions, research data sharing agreements in general, open science ‘badges’, funder mandates, and initiatives. Conclusions: A number of key messages for data sharing include: the need to build on existing cultures and practices, meeting people where they are and tailoring interventions to support them; the importance of publicising and explaining the policy/service widely; the need to have disciplinary data champions to model good practice and drive cultural change; the requirement to resource interventions properly; and the imperative to provide robust technical infrastructure and protocols, such as labelling of data sets, use of DOIs, data standards and use of data repositories.
Collapse
|
49
|
Rauh S, Bowers A, Rorah D, Tritz D, Pate H, Frye L, Vassar M. Evaluating the reproducibility of research in obstetrics and gynecology. Eur J Obstet Gynecol Reprod Biol 2021; 269:24-29. [PMID: 34954422 DOI: 10.1016/j.ejogrb.2021.12.021] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/08/2021] [Revised: 11/19/2021] [Accepted: 12/11/2021] [Indexed: 01/21/2023]
Abstract
OBJECTIVE Reproducibility is a core tenet of scientific research. A reproducible study is one where the results can be recreated by using the same methodology and materials as the original researchers. Unfortunately, reproducibility is not a standard to which the majority of research is currently adherent. METHODS Our cross-sectional survey evaluated 300 trials in the field of Obstetrics and Gynecology. Our primary objective was to identify nine indicators of reproducibility and transparency. These indicators include availability of data, analysis scripts, pre-registration information, study protocols, funding source, conflict of interest statements and whether or not the study was available via Open Access. RESULTS Of the 300 trials in our sample, 208 contained empirical data that could be assessed for reproducibility. None of the trials in our sample provided a link to their protocols or provided a statement on availability of materials. None were replication studies. Just 10.58% provided a statement regarding their data availability, while only 5.82% provided a statement on preregistration. 25.85% failed to report the presence or absence of conflicts of interest and 54.08% did not state the origin of their funding. CONCLUSION In the studies we examined, research in the field of Obstetrics and Gynecology is not consistently reproducible and frequently lacks conflict of interest disclosure. Consequences of this could be far-reaching and include increased research waste, widespread acceptance of misleading results and erroneous conclusions guiding clinical decision-making.
Collapse
Affiliation(s)
- Shelby Rauh
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States.
| | - Aaron Bowers
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
| | - Drayton Rorah
- Kansas City University of Medicine and Biosciences, Joplin, MO, United States
| | - Daniel Tritz
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
| | - Heather Pate
- Department of Obstetrics and Gynecology, Oklahoma State University Medical Center, Tulsa, OK, United States
| | - Lance Frye
- Department of Obstetrics and Gynecology, Oklahoma State University Medical Center, Tulsa, OK, United States
| | - Matt Vassar
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
| |
Collapse
|
50
|
Abstract
Computational reproducibility is the ability to obtain identical results from the same data with the same computer code. It is a building block for transparent and cumulative science because it enables the originator and other researchers, on other computers and later in time, to reproduce and thus understand how results came about, while avoiding a variety of errors that may lead to erroneous reporting of statistical and computational results. In this tutorial, we demonstrate how the R package repro supports researchers in creating fully computationally reproducible research projects with tools from the software engineering community. Building upon this notion of fully automated reproducibility, we present several applications including the preregistration of research plans with code (Preregistration as Code, PAC). PAC eschews all ambiguity of traditional preregistration and offers several more advantages. Making technical advancements that serve reproducibility more widely accessible for researchers holds the potential to innovate the research process and to help it become more productive, credible, and reliable.
Collapse
|