1
Linardon J, Xie Q, Swords C, Torous J, Sun S, Goldberg SB. Methodological quality in randomised clinical trials of mental health apps: systematic review and longitudinal analysis. BMJ Mental Health 2025; 28:e301595. [PMID: 40221143] [PMCID: PMC11997814] [DOI: 10.1136/bmjment-2025-301595] [Received: 02/03/2025] [Accepted: 03/18/2025] [Indexed: 04/14/2025]
Abstract
QUESTION: This study investigated the methodological rigour of randomised controlled trials (RCTs) of mental health apps for depression and anxiety, and whether quality has improved over time.
STUDY SELECTION AND ANALYSIS: RCTs were drawn from the most recent meta-analysis of mental health apps for depression and anxiety symptoms. 20 indicators of study quality were coded, encompassing risk of bias, participant diversity, study design features and app accessibility measures. Regression models tested associations between year of publication and each quality indicator.
FINDINGS: 176 RCTs conducted between 2011 and 2023 were included. Methodological concerns were common for several quality indicators (eg, <20% were replication trials, <35% of trials reported adverse events). Regression models revealed only three significant changes over time: an increase in preregistration (OR=1.27; 95% CI 1.10, 1.46) and reporting of adverse events (OR=1.32; 95% CI 1.11, 1.56), and a decrease in apps reported to be compatible with iOS and/or Android (OR=0.78; 95% CI 0.64, 0.96). Results were unchanged when excluding outliers. Results were similar when excluding three high-quality studies published between 2011 and 2013, with additional evidence for an increase in modern missing data methods (OR=1.22; 95% CI 1.04, 1.42) and studies reporting intention-to-treat analysis (OR=1.20; 95% CI 1.03, 1.39).
CONCLUSIONS: Findings provide minimal evidence of improvements in the quality of clinical trials of mental health apps, highlighting the need for higher methodological standards in future research to ensure the reliability and generalisability of evidence for these digital tools.
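The trend analysis described above — a logistic regression of each binary quality indicator on year of publication, summarised as an odds ratio per year — can be sketched in a few lines. This is a minimal illustration with simulated data (the simulated effect size and seed are assumptions, not the study's data):

```python
import numpy as np

# Hypothetical data: publication year and a binary quality indicator
# (e.g. preregistered yes/no) for 176 simulated trials, 2011-2023.
rng = np.random.default_rng(0)
year = rng.integers(2011, 2024, size=176).astype(float)
x = year - 2017.0                     # centre year for a stable intercept
true_logit = -1.0 + 0.24 * x          # illustrative upward trend
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by Newton-Raphson (iteratively reweighted LS).
X = np.column_stack([np.ones_like(x), x])
beta = np.zeros(2)
for _ in range(50):
    p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
    W = p * (1.0 - p)                     # logistic weights
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)

# Odds ratio per publication year with a Wald 95% CI.
se = np.sqrt(np.linalg.inv(hess)[1, 1])
or_per_year = np.exp(beta[1])
ci_low, ci_high = np.exp(beta[1] - 1.96 * se), np.exp(beta[1] + 1.96 * se)
print(f"OR per year: {or_per_year:.2f} (95% CI {ci_low:.2f}, {ci_high:.2f})")
```

An OR above 1 indicates the indicator became more common over time, as reported for preregistration and adverse-event reporting.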
Affiliation(s)
- Jake Linardon
- SEED Lifespan Strategic Research Centre, School of Psychology, Faculty of Health, Deakin University, Geelong, Victoria, Australia
- Qiang Xie
- Department of Counselling Psychology, University of Wisconsin-Madison, Madison, Wisconsin, USA
- University of Wisconsin-Madison, Madison, Wisconsin, USA
- Center for Healthy Minds, University of Wisconsin-Madison, Madison, Wisconsin, USA
- Caroline Swords
- Department of Counselling Psychology, University of Wisconsin-Madison, Madison, Wisconsin, USA
- University of Wisconsin-Madison, Madison, Wisconsin, USA
- John Torous
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts, USA
- Shufang Sun
- Department of Behavioral and Social Sciences, School of Public Health, Brown University, Providence, Rhode Island, USA
- Simon B Goldberg
- Department of Counselling Psychology, University of Wisconsin-Madison, Madison, Wisconsin, USA
- University of Wisconsin-Madison, Madison, Wisconsin, USA
2
Blackwell J, Beitner J, Holcombe A. How Transparent and Reproducible Are Studies That Use Animal Models of Opioid Addiction? Addict Biol 2025; 30:e70027. [PMID: 40190211] [PMCID: PMC11973454] [DOI: 10.1111/adb.70027] [Received: 07/20/2024] [Revised: 01/27/2025] [Accepted: 02/25/2025] [Indexed: 04/10/2025]
Abstract
The reproducibility crisis in psychology has caused various fields to consider the reliability of their own findings. Many of the unfortunate aspects of research design that undermine reproducibility also threaten translation potential. In preclinical addiction research, rates of translation have been disappointing. We tallied indices of transparency and of accurate and thorough reporting in animal models of opioid addiction from 2019 to 2023. By examining the prevalence of these practices, we aimed to understand whether efforts to improve reproducibility are relevant to this field. For 255 articles, we report the prevalence of transparency measures such as preregistration, registered reports, open data and open code, as well as compliance with the Animal Research: Reporting of In Vivo Experiments (ARRIVE) guidelines. We also report rates of bias minimization practices (randomization, masking and data exclusion), sample size calculations and multiple-comparison adjustments. Lastly, we estimated the accuracy of test statistic reporting using a version of StatCheck. All the transparency measures and the ARRIVE guideline items had low prevalence, including no cases of study preregistration and no cases where authors shared their analysis code. Similarly, the levels of bias minimization practices and sample size calculations were unsatisfactory. In contrast, adjustments for multiple comparisons were implemented in most articles (76.5%). Finally, p-value inconsistencies with test statistics were detected in about half of the papers, and 11% contained statistical significance errors. We recommend that researchers, journal editors and others take steps to improve study reporting and to facilitate both replication and translation.
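The StatCheck-style consistency check mentioned above recomputes a p-value from a reported test statistic and compares it with the reported p. A stdlib-only sketch of the idea for z statistics (StatCheck itself also handles t, F, χ² and r, which need other distribution functions; the tolerance value here is an illustrative assumption, not StatCheck's rule):

```python
import math

def p_from_z(z: float) -> float:
    """Two-sided p-value for a z statistic, via the standard normal CDF."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def check_reported(z: float, reported_p: float, tol: float = 0.005) -> bool:
    """Return True if the reported p-value agrees with its test statistic."""
    return abs(p_from_z(z) - reported_p) <= tol

# 'z = 2.10, p = .036' is internally consistent; 'z = 2.10, p = .01' is not.
print(check_reported(2.10, 0.036))  # expected: True
print(check_reported(2.10, 0.010))  # expected: False
```

A "statistical significance error" in the abstract's sense is the stronger case where the recomputed p falls on the other side of the significance threshold than the reported one.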
Affiliation(s)
- Julia Beitner
- Department of Psychology, Goethe University Frankfurt, Frankfurt am Main, Germany
- Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- Addiction Behavior and Addiction Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- German Center for Mental Health (DZPG), Partner Site Mannheim-Heidelberg-Ulm, Mannheim, Germany
3
van den Akker OR, Thibault RT, Ioannidis JPA, Schorr SG, Strech D. Transparency in the secondary use of health data: assessing the status quo of guidance and best practices. R Soc Open Sci 2025; 12:241364. [PMID: 40144285] [PMCID: PMC11937929] [DOI: 10.1098/rsos.241364] [Received: 08/11/2024] [Revised: 12/18/2024] [Accepted: 12/31/2024] [Indexed: 03/28/2025]
Abstract
We evaluated what guidance exists in the literature to improve the transparency of studies that make secondary use of health data. To find peer-reviewed papers, we searched PubMed and Google Scholar. To find institutional documents, we used our personal expertise to draft a list of health organizations and searched their websites. We quantitatively and qualitatively coded different types of research transparency: registration, methods reporting, results reporting, data sharing and code sharing. We found 56 documents that provide recommendations to improve the transparency of studies making secondary use of health data, mainly in relation to study registration (n = 27) and/or methods reporting (n = 39). Only three documents made recommendations on data sharing or code sharing. Recommendations for study registration and methods reporting mainly came in the form of structured documents like registration templates and reporting guidelines. Aside from the recommendations aimed directly at researchers, we also found recommendations aimed at the wider research community, typically on how to improve research infrastructure. Limitations or challenges of improving transparency were rarely mentioned, highlighting the need for more nuance in providing transparency guidance for studies that make secondary use of health data.
Affiliation(s)
- Robert T. Thibault
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA
- Coalition for Aligning Science, Chevy Chase, MD, USA
- John P. A. Ioannidis
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA
- Departments of Medicine and of Epidemiology and Population Health, Stanford University, Stanford, CA, USA
- Susanne G. Schorr
- QUEST Center for Responsible Research, Berlin Institute of Health, Berlin, Germany
- Daniel Strech
- QUEST Center for Responsible Research, Berlin Institute of Health, Berlin, Germany
4
Murigu A, Wong KHF, Mercer RT, Hinchliffe RJ, Twine CP. Reporting and Methodological Quality of Systematic Reviews Underpinning Clinical Practice Guidelines for Vascular Surgery: A Systematic Review. Eur J Vasc Endovasc Surg 2024:S1078-5884(24)00966-3. [PMID: 39547389] [DOI: 10.1016/j.ejvs.2024.11.010] [Received: 06/28/2024] [Revised: 09/30/2024] [Accepted: 11/08/2024] [Indexed: 11/17/2024]
Abstract
OBJECTIVE: Clinical practice guideline recommendations are often informed by systematic reviews. This review aimed to appraise the reporting and methodological quality of systematic reviews informing clinical practice recommendations relevant to vascular surgery.
DATA SOURCES: MEDLINE and Embase.
METHODS: MEDLINE and Embase were searched from 1 January 2021 to 5 May 2023 for clinical practice guidelines relevant to vascular surgery. Guidelines were then screened for systematic reviews informing recommendations. The reporting and methodological quality of these systematic reviews were assessed using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 statement and the Assessment of Multiple Systematic Reviews 2 (AMSTAR 2) 2017 tool. Pearson correlation and multiple regression analyses were performed to determine associations between these scores and extracted study characteristics.
RESULTS: Eleven clinical practice guidelines were obtained, containing 1783 references informing guideline recommendations. From these, 215 systematic reviews were included for synthesis. PRISMA item completeness ranged from 14% to 100%, with a mean of 63% across reviews. AMSTAR 2 item completeness ranged from 2% to 95%, with a mean of 50%. Pearson correlation highlighted a statistically significant association between a review's PRISMA and AMSTAR 2 scores (r = 0.85, p < .001). A more recent publication year was associated with a statistically significant increase in both scores (PRISMA coefficient 1.28, p < .001; AMSTAR 2 coefficient 1.31, p < .001). Similarly, the presence of funding in a systematic review was statistically significantly associated with an increase in both PRISMA and AMSTAR 2 scores (coefficients 4.93, p = .024, and 6.07, p = .019, respectively).
CONCLUSION: Systematic reviews informing clinical practice guidelines relevant to vascular surgery were of moderate quality at best. Organisations producing clinical practice guidelines should consider funding systematic reviews to improve the quality of their recommendations.
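The headline association here is Pearson's r between two per-review completeness scores. A small, self-contained sketch of that computation; the scores below are hypothetical, not the review's data:

```python
import math

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical % item-completeness scores for five reviews (PRISMA, AMSTAR 2).
prisma = [14.0, 40.0, 63.0, 80.0, 100.0]
amstar = [2.0, 30.0, 50.0, 70.0, 95.0]
print(f"r = {pearson_r(prisma, amstar):.2f}")
```

A strong positive r, as the review found (r = 0.85), means reviews that report well against PRISMA also tend to score well on AMSTAR 2 methodology items.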
Affiliation(s)
- Alex Murigu
- Bristol Medical School, University of Bristol, Bristol, UK
- Kitty H F Wong
- Bristol Medical School, University of Bristol, Bristol, UK; North Bristol NHS Trust, Bristol, UK
- Ross T Mercer
- University Hospitals Bristol and Weston NHS Foundation Trust, Bristol, UK
- Robert J Hinchliffe
- North Bristol NHS Trust, Bristol, UK; University Hospitals Bristol and Weston NHS Foundation Trust, Bristol, UK
- Christopher P Twine
- North Bristol NHS Trust, Bristol, UK; University Hospitals Bristol and Weston NHS Foundation Trust, Bristol, UK.
5
Ng JY, Lin BX, Kreuder L, Cramer H, Moher D. Open science practices among authors published in complementary, alternative, and integrative medicine journals: An international, cross-sectional survey. Medicine (Baltimore) 2024; 103:e40259. [PMID: 39495970] [PMCID: PMC11537614] [DOI: 10.1097/md.0000000000040259] [Received: 07/19/2024] [Accepted: 10/08/2024] [Indexed: 11/06/2024]
Abstract
Open science practices aim to increase transparency in research and to make research more available through open data, open access platforms, and public access. Given the increasing popularity of complementary, alternative, and integrative medicine (CAIM) research, our study aimed to explore current open science practices and perceived barriers among CAIM researchers with respect to their own research articles. We conducted an international cross-sectional online survey of authors who had published articles in MEDLINE-indexed journals categorized under the broad subject "Complementary Therapies" or articles indexed under the MeSH term "Complementary Therapies." Articles were extracted to obtain the names and emails of all corresponding authors. We emailed the survey to 8786 researchers; it included questions about participants' familiarity with open science practices, their own open science practices, and perceived barriers to open science in CAIM, each with respect to their most recently published article. Basic descriptive statistics were generated from the quantitative data. The survey was completed by 292 participants (3.32% response rate). Most participants were "very familiar" (n = 83, 31.68%) or "moderately familiar" (n = 83, 31.68%) with the concept of open science practices while creating their study. Open access publishing was the most familiar practice, with 51.96% (n = 136) of respondents having published open access. Although participants were familiar with other open science practices, their actual implementation of these practices was low. Common barriers included not knowing where to share study materials, where to share data, or how to make a preprint, as well as a general lack of knowledge about open science and a lack of funding or institutional support. Future efforts should explore how to improve open science training for CAIM researchers.
Affiliation(s)
- Jeremy Y. Ng
- Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
- Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Centre for Journalology, Ottawa Methods Centre, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- Brenda X. Lin
- Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
- Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Liliane Kreuder
- Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
- Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- Holger Cramer
- Institute of General Practice and Interprofessional Care, University Hospital Tübingen, Tübingen, Germany
- Robert Bosch Center for Integrative Medicine and Health, Bosch Health Campus, Stuttgart, Germany
- David Moher
- Centre for Journalology, Ottawa Methods Centre, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada
6
Purgar M, Glasziou P, Klanjscek T, Nakagawa S, Culina A. Supporting study registration to reduce research waste. Nat Ecol Evol 2024; 8:1391-1399. [PMID: 38839851] [DOI: 10.1038/s41559-024-02433-5] [Received: 10/05/2023] [Accepted: 05/08/2024] [Indexed: 06/07/2024]
Abstract
An estimated 82-89% of ecological research and 85% of medical research have limited or no value to the end user because of various inefficiencies. We argue that registration and registered reports can enhance the quality and impact of ecological research. Drawing on evidence from other fields, chiefly medicine, we support our claim that registration can reduce research waste. However, improvement in registration rates, quality and impact will be very slow without a coordinated effort by funders, publishers and research institutions. We therefore call on them to facilitate the adoption of registration by providing adequate support. We outline several aspects to consider when designing a registration system that would best serve the field of ecology. To further inform the development of such a system, we call for more research into the causes of low registration rates in ecology. We suggest short- and long-term actions to bolster registration and reduce research waste.
Affiliation(s)
- Paul Glasziou
- Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Queensland, Australia
- Shinichi Nakagawa
- Evolution & Ecology Research Centre and School of Biological, Earth and Environmental Sciences, University of New South Wales, Sydney, New South Wales, Australia
- Theoretical Sciences Visiting Program, Okinawa Institute of Science and Technology Graduate University, Onna, Japan
- Antica Culina
- Ruđer Bošković Institute, Zagreb, Croatia.
- Netherlands Institute of Ecology, Royal Netherlands Academy of Arts and Sciences, Wageningen, the Netherlands.
7
Paredes J, Carré D. Looking for a broader mindset in psychometrics: the case for more participatory measurement practices. Front Psychol 2024; 15:1389640. [PMID: 38601828] [PMCID: PMC11004427] [DOI: 10.3389/fpsyg.2024.1389640] [Received: 02/21/2024] [Accepted: 03/14/2024] [Indexed: 04/12/2024]
Abstract
Psychometrics, and the consequences of its use as the method of quantitative empirical psychology, has been continuously criticized by both psychologists and psychometrists. However, proposed solutions to these issues have mostly focused on establishing methodological-statistical best practices for researchers, without regard to the pitfalls of earlier stages of measurement or to theory development of the targeted phenomenon. Conversely, other researchers advance the idea that, since psychometrics is riddled with so many issues, the best way forward is a complete rework of the discipline, even if it leaves psychologists and other practitioners without any way to measure quantitatively for a long period of time. Given these tensions, we advocate for an alternative path to consider while we work toward substantive change in measurement. We propose a set of research practices focused on the inclusion and active participation of the groups involved in measurement activities: psychometrists and researchers but, most importantly, practitioners and potential participants. Involving a wider community in psychological measurement could tackle key issues and take us closer to a more authentic approach to our phenomenon of interest.
Affiliation(s)
- Javiera Paredes
- Laboratorio de Lenguaje, Interacción y Fenomenología, Escuela de Psicología, Pontificia Universidad Católica de Chile, Santiago, Chile
- David Carré
- Instituto de Ciencias de la Salud, Universidad de O’Higgins, Rancagua, Chile
8
Syed M, Frank MC, Roisman GI. Registered Reports in Child Development: Introduction to the Special Section. Child Dev 2023; 94:1093-1101. [PMID: 37603615] [DOI: 10.1111/cdev.14003] [Received: 08/04/2023] [Accepted: 08/04/2023] [Indexed: 08/23/2023]
Abstract
Registered Reports (RRs) are an emerging format for publishing empirical journal articles in which the decision to publish an article is based on sound conceptualization, methods, and planned analyses rather than the specific nature of the results. This article introduces the Special Section on Registered Reports in Child Development by describing what RRs are and why they are necessary, outlining the thought process that guided the Special Section, describing key thematic insights across the eight articles included in the collection, and providing recommendations for developmental researchers interested in publishing via the RR format. This article also serves as a formal announcement that RRs will be a standard publishing option at Child Development, effective immediately.
Affiliation(s)
- Moin Syed
- Department of Psychology, University of Minnesota, Minneapolis, Minnesota, USA
- Michael C Frank
- Department of Psychology, Stanford University, Stanford, California, USA
- Glenn I Roisman
- Institute of Child Development, University of Minnesota, Minneapolis, Minnesota, USA