1. Braun V, Clarke V. Is thematic analysis used well in health psychology? A critical review of published research, with recommendations for quality practice and reporting. Health Psychol Rev 2023; 17:695-718. PMID: 36656762. DOI: 10.1080/17437199.2022.2161594.
Abstract
Despite the persistent dominance of a 'scientific psychology' paradigm in health psychology, the use of qualitative research continues to grow. Qualitative approaches are often based on fundamentally different values from (post)positivist empiricism, raising important considerations for quality, and whether qualitative work adheres to, and is judged by, appropriate publication standards. Thematic analysis (TA) has become a particularly popular method in qualitative health psychology, but poor practice is widespread. To support high quality, methodologically coherent TA practice and reporting, we critically reviewed 100 systematically selected papers reporting TA, published in five prominent health psychology journals. Our review assessed reported practice, and considered this in relation to methodological and quality recommendations. We identified 10 common areas of problematic practice in the reviewed papers, the majority citing reflexive TA. Considering the role of three 'arbiters of quality' in a peer review publication system - authors, reviewers, and editors - we developed 20 recommendations for authors, to support them in conducting and reporting high quality TA research, with associated questions for reviewers and editors to consider when evaluating TA manuscripts for publication. We end with considerations for facilitating better qualitative research, and enriching the understandings and knowledge base from which health psychology is practiced.
Affiliation(s)
- Virginia Braun: Te Kura Mātai Hinengaro/School of Psychology, Waipapa Taumata Rau/The University of Auckland, Auckland, Aotearoa/New Zealand
- Victoria Clarke: School of Social Sciences, University of the West of England, Bristol, UK
2. Deardorff WJ, Diaz-Ramirez LG, Boscardin WJ, Smith AK, Lee SJ. Around the EQUATOR with Clin-STAR: Prediction modeling opportunities and challenges in aging research. J Am Geriatr Soc 2023. PMID: 38032070. DOI: 10.1111/jgs.18704.
Abstract
The 2015 Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Statement was published to improve reporting transparency for prediction modeling studies. The objective of this review is to highlight methodologic challenges that aging-focused researchers will encounter when designing and reporting studies involving prediction models for older adults and provide guidance for addressing these challenges. In following the 22-item TRIPOD checklist, researchers must consider the representativeness of cohorts used (e.g., whether older adults with frailty, cognitive impairment, and social isolation were included), strategies for incorporating common geriatric predictors (e.g., age, comorbidities, functional status, and frailty), methods for handling missing data and competing risk of death, and assessment of model performance heterogeneity across important subgroups (e.g., age, sex, race, and ethnicity). We provide guidance to help aging-focused researchers develop, validate, and report models that can inform and improve patient care, which we label "TRIPOD-65."
Affiliation(s)
- W James Deardorff: Division of Geriatrics, University of California, San Francisco, San Francisco, California, USA; Geriatrics, Palliative and Extended Care Service Line, San Francisco Veterans Affairs Medical Center, San Francisco, California, USA
- L Grisell Diaz-Ramirez: Division of Geriatrics, University of California, San Francisco, San Francisco, California, USA; Geriatrics, Palliative and Extended Care Service Line, San Francisco Veterans Affairs Medical Center, San Francisco, California, USA
- W John Boscardin: Division of Geriatrics, University of California, San Francisco, San Francisco, California, USA; Geriatrics, Palliative and Extended Care Service Line, San Francisco Veterans Affairs Medical Center, San Francisco, California, USA; Department of Epidemiology and Biostatistics, University of California, San Francisco, San Francisco, California, USA
- Alexander K Smith: Division of Geriatrics, University of California, San Francisco, San Francisco, California, USA; Geriatrics, Palliative and Extended Care Service Line, San Francisco Veterans Affairs Medical Center, San Francisco, California, USA
- Sei J Lee: Division of Geriatrics, University of California, San Francisco, San Francisco, California, USA; Geriatrics, Palliative and Extended Care Service Line, San Francisco Veterans Affairs Medical Center, San Francisco, California, USA
3. Villuendas H, Vilches C, Quidant R. Standardization of In Vitro Studies for Plasmonic Photothermal Therapy. ACS Nanosci Au 2023; 3:347-352. PMID: 37868227. PMCID: PMC10588432. DOI: 10.1021/acsnanoscienceau.3c00011.
Abstract
Lack of standardization is a systematic problem that impacts nanomedicine by hindering the comparison of data from different studies. Translation from preclinical to clinical stages requires reproducible data that can be easily accessed and compared. In this work, we propose a series of experimental standards for in vitro plasmonic photothermal therapy (PPTT). This best-practice guide covers the five main aspects of in vitro PPTT studies: nanomaterials, biological samples, and characterization before, during, and after irradiation. We are confident that such standardization of experimental protocols and reported data will benefit the development of PPTT as a transversal therapy.
Affiliation(s)
- Helena Villuendas: Nanophotonic Systems Laboratory, Department of Mechanical and Process Engineering, ETH Zürich, 8092 Zürich, Switzerland
- Clara Vilches: ICFO − Institut de Ciències Fotòniques, The Barcelona Institute of Science and Technology, 08860 Castelldefels, Barcelona, Spain
- Romain Quidant: Nanophotonic Systems Laboratory, Department of Mechanical and Process Engineering, ETH Zürich, 8092 Zürich, Switzerland
4. Kostygina G, Kim Y, Seeskin Z, LeClere F, Emery S. Disclosure Standards for Social Media and Generative Artificial Intelligence Research: Toward Transparency and Replicability. Soc Media Soc 2023; 9. PMID: 38239338. PMCID: PMC10795517. DOI: 10.1177/20563051231216947.
Abstract
Social media dominate today's information ecosystem and provide valuable information for social research. Market researchers, social scientists, policymakers, government entities, public health researchers, and practitioners recognize the potential of social data to inspire innovation, support products and services, characterize public opinion, and guide decisions. The appeal of mining these rich datasets is clear. However, there is a potential risk of data misuse, underscoring an equally fundamental flaw in the research: there are no procedural standards and little transparency. Transparency across the processes of collecting and analyzing social media data is often limited by proprietary algorithms. Spurious findings and biases introduced by artificial intelligence (AI) demonstrate the challenges this lack of transparency poses for research. Social media research remains a virtual "wild west," with no clear standards for reporting on data retrieval, preprocessing steps, analytic methods, or interpretation. Use of emerging generative AI technologies to augment social media analytics can undermine the validity and replicability of findings, potentially turning this research into a "black box" enterprise. Clear guidance for social media analyses and reporting is needed to assure the quality of the resulting research. In this article, we propose criteria for evaluating the quality of studies using social media data, grounded in established scientific practice. We offer clear documentation guidelines to ensure that social data are used properly and transparently in research and applications. A checklist of disclosure elements to meet minimal reporting standards is proposed. These criteria will make it possible for scholars and practitioners to assess the quality, credibility, and comparability of research findings using digital data.
5. Cowan IA, Floyd RA. Measurement of radiologist reporting times: Assessment of precision, comparison of three different measurement techniques and review of potential applications. J Med Imaging Radiat Oncol 2023; 67:734-741. PMID: 37608491. DOI: 10.1111/1754-9485.13570.
Abstract
INTRODUCTION Radiologist reporting times are a key component of radiology department workload assessment, but reliable measurement remains challenging. Currently, there are three contenders for this task: median reporting times (MRTs), extracted directly from a department's radiology information system (RIS); study-ascribed times (SATs), using published tables of individual descriptors derived from a combination of measurement and consensus; and radiology reporting figures (RRFs), using published tables of measured times based on modality and number of anatomical areas. METHODS We review these techniques, their possible uses, and some potential pitfalls. We discuss the level of precision that can realistically be attained in measuring reporting times, and list the strengths and weaknesses of each technique, comparing them in relation to each of eight potential applications. RESULTS We believe SATs are impractical to use because of their static nature, lack of common descriptors, and large number. RRFs are more user-friendly but are also static and require ongoing updates; currently, they do not include ultrasound. MRTs cannot yet be extracted from every RIS, but where available they are easy to use, and their dynamic nature provides the most objective data. However, because they exclude the unmeasurable components of a radiologist's work, they underestimate the total time spent in a reporting session. CONCLUSION MRTs are superior to the other methods in flexibility, precision, and ease of use. All institutions should have access to these data, and we call on vendors of radiology information systems that cannot currently provide them to make the necessary modifications.
Affiliation(s)
- Ian A Cowan: Everlight Radiology, Sydney, New South Wales, Australia
- Richard A Floyd: Radiology Department, Christchurch Hospital, Christchurch, New Zealand
6. Fingerhut J, Moeyaert M, Manolov R, Xu X, Park KH. Systematic Review of Descriptions and Justifications Provided for Single-Case Quantification Techniques. Behav Modif 2023; 47:1115-1143. PMID: 37254563. DOI: 10.1177/01454455231178469.
Abstract
There are currently a multitude of quantification techniques developed for use with single-case designs. As a result, choosing an appropriate quantification technique can be overwhelming, and it can be difficult for researchers to properly describe and justify their use of quantification techniques. However, providing clear descriptions and justifications is important for enhancing the credibility of single-case research and allowing others to evaluate the appropriateness of the quantification technique used. The aim of this systematic literature review is to provide an overview of the quantification techniques used to analyze single-case designs, with a focus on the descriptions and justifications provided. A total of 290 quantifications occurred across 218 articles; we systematically examine the descriptions and justifications provided for the quantification techniques used. Results show that certain quantification techniques, such as the non-overlap indices, are more commonly used. Descriptions and justifications provided for using the quantification techniques are sometimes vague or subjective. Single-case researchers are encouraged to complement visual analysis with quantification techniques for which they can provide objective and appropriate descriptions and justifications, and to use tools to guide their choice of quantification techniques.
Affiliation(s)
- Xinyun Xu: State University of New York, Albany, USA
7. Perrin Franck C, Babington-Ashaye A, Dietrich D, Bediang G, Veltsos P, Gupta PP, Juech C, Kadam R, Collin M, Setian L, Serrano Pons J, Kwankam SY, Garrette B, Barbe S, Bagayoko CO, Mehl G, Lovis C, Geissbuhler A. iCHECK-DH: Guidelines and Checklist for the Reporting on Digital Health Implementations. J Med Internet Res 2023; 25:e46694. PMID: 37163336. PMCID: PMC10209789. DOI: 10.2196/46694.
Abstract
BACKGROUND Implementation of digital health technologies has grown rapidly, but many remain limited to pilot studies due to challenges such as a lack of evidence or barriers to implementation. Overcoming these challenges requires learning from previous implementations and systematically documenting implementation processes, to better understand the real-world impact of a technology and identify effective strategies for future implementation. OBJECTIVE A group of global experts, facilitated by the Geneva Digital Health Hub, developed the Guidelines and Checklist for the Reporting on Digital Health Implementations (iCHECK-DH, pronounced "I checked") to improve the completeness of reporting on digital health implementations. METHODS A guideline development group was convened to define key considerations and criteria for reporting on digital health implementations. To ensure the practicality and effectiveness of the checklist, it was pilot-tested on several real-world digital health implementations, and adjustments were made based on the feedback received. The guiding principle in developing iCHECK-DH was to identify the minimum set of information needed to comprehensively define a digital health implementation, to support the identification of key factors for success and failure, and to enable others to replicate it in different settings. RESULTS The result was a 20-item checklist, with detailed explanations and examples in this paper. The authors anticipate that widespread adoption will standardize the quality of reporting and, indirectly, improve implementation standards and best practices. CONCLUSIONS Guidelines for reporting on digital health implementations are important to ensure the accuracy, completeness, and consistency of reported information, which allows for meaningful comparison and evaluation of results, supports transparency and accountability, and informs stakeholder decision-making. iCHECK-DH facilitates standardization of the way information is collected and reported, improving systematic documentation and knowledge transfer that can lead to the development of more effective digital health interventions and better health outcomes.
Affiliation(s)
- Caroline Perrin Franck: Department of Radiology and Medical Informatics, Faculty of Medicine, University of Geneva, Geneva, Switzerland; Geneva Digital Health Hub, Geneva, Switzerland
- Awa Babington-Ashaye: Department of Radiology and Medical Informatics, Faculty of Medicine, University of Geneva, Geneva, Switzerland; Geneva Digital Health Hub, Geneva, Switzerland
- Georges Bediang: Faculty of Medicine and Biomedical Sciences, University of Yaoundé 1, Yaoundé, Cameroon
- Claudia Juech: Government Innovation, Bloomberg Philanthropies, New York, NY, United States
- Rigveda Kadam: Foundation for Innovative New Diagnostics, Geneva, Switzerland
- S Yunkap Kwankam: International Society for Telemedicine & eHealth, Basel, Switzerland
- Cheick Oumar Bagayoko: Centre d'Innovation et de Santé Digitale, DigiSanté-Mali, Université des sciences, des techniques et des technologies de Bamako, Bamako, Mali; Centre d'Expertise et de Recherche en Télémédecine et E-Santé, Bamako, Mali
- Garrett Mehl: Department of Digital Health and Innovation, World Health Organization, Geneva, Switzerland
- Christian Lovis: Department of Radiology and Medical Informatics, Faculty of Medicine, University of Geneva, Geneva, Switzerland; Division of Medical Information Sciences, Geneva University Hospitals, Geneva, Switzerland
- Antoine Geissbuhler: Department of Radiology and Medical Informatics, Faculty of Medicine, University of Geneva, Geneva, Switzerland; Geneva Digital Health Hub, Geneva, Switzerland; Division of Medical Information Sciences, Geneva University Hospitals, Geneva, Switzerland
8. Lundin RM, Yeap Y, Menkes DB. Adverse Effects of Virtual and Augmented Reality Interventions in Psychiatry: Systematic Review. JMIR Ment Health 2023; 10:e43240. PMID: 37145841. DOI: 10.2196/43240.
Abstract
BACKGROUND Virtual reality (VR) and augmented reality (AR) are emerging treatment modalities in psychiatry, capable of producing clinical outcomes broadly comparable to those achieved with standard psychotherapies. OBJECTIVE Because the side effect profile associated with the clinical use of VR and AR remains largely unknown, we systematically reviewed the available evidence on their adverse effects. METHODS A systematic review was conducted in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) framework across 3 mental health databases (PubMed, PsycINFO, and Embase) to identify VR and AR interventions targeting mental health diagnoses. RESULTS Of 73 studies meeting the inclusion criteria, 7 reported worsening clinical symptoms or an increased fall risk. Another 21 studies claimed "no adverse effects" even though obvious adverse effects, mainly cybersickness, were documented in their results. More concerning, 45 of the 73 studies made no mention of adverse effects whatsoever. CONCLUSIONS An appropriate screening tool would help ensure that VR adverse effects are correctly identified and reported.
Affiliation(s)
- Robert M Lundin: Change to Improve Mental Health, Mental Health Drugs and Alcohol Services, Barwon Health, Geelong, Australia; Institute for Mental and Physical Health and Clinical Translation, Deakin University, Geelong, Australia; Waikato Clinical Campus, University of Auckland, Hamilton, New Zealand
- Yuhern Yeap: Mental Health and Addictions, Waikato District Health Board, Hamilton, New Zealand
- David B Menkes: Waikato Clinical Campus, University of Auckland, Hamilton, New Zealand
9. Carpenter CR, Southerland LT, Lucey BP, Prusaczyk B. Around the EQUATOR with clinician-scientists transdisciplinary aging research (Clin-STAR) principles: Implementation science challenges and opportunities. J Am Geriatr Soc 2022; 70:3620-3630. PMID: 36005482. PMCID: PMC10538952. DOI: 10.1111/jgs.17993.
Abstract
The Institute of Medicine and the National Institute on Aging increasingly recognize that knowledge alone is necessary but insufficient to improve healthcare outcomes. Adapting the behaviors of clinicians, patients, and stakeholders to new standards of evidence-based clinical practice is often significantly delayed. In response, over the past twenty years, Implementation Science has developed as the study of methods and strategies that facilitate the uptake of evidence-based practice into regular use by practitioners and policymakers. One important advance in Implementation Science research was the development of the Standards for Reporting Implementation Studies (StaRI), a 27-item checklist for researchers to consistently report essential elements of implementation and intervention strategies. Using StaRI as a framework, this review discusses specific Implementation Science challenges for research with older adults, provides solutions for those obstacles, and identifies opportunities to improve the value of this evolving approach, reducing the knowledge translation losses between published research and clinical practice.
Affiliation(s)
- Christopher R Carpenter: Department of Emergency Medicine and Emergency Care Research Core, Washington University in St. Louis School of Medicine, St. Louis, Missouri, USA
- Lauren T Southerland: Department of Emergency Medicine, The Ohio State University, Columbus, Ohio, USA
- Brendan P Lucey: Department of Neurology, Washington University in St. Louis School of Medicine, St. Louis, Missouri, USA
- Beth Prusaczyk: Department of Medicine Institute for Informatics, Washington University in St. Louis School of Medicine, St. Louis, Missouri, USA
10. Carpenter CR, Gill TM. Transparent transdisciplinary reporting in geriatric research using the EQUATOR Network. J Am Geriatr Soc 2022; 70:3352-3355. PMID: 36289574. PMCID: PMC9772164. DOI: 10.1111/jgs.18097.
Abstract
This editorial comments on the article by Carpenter et al. in this issue.
Affiliation(s)
- Christopher R. Carpenter: Department of Emergency Medicine and Emergency Care Research Core, Washington University in St. Louis School of Medicine, St. Louis, MO
- Thomas M. Gill: Department of Internal Medicine, Yale School of Medicine, New Haven, CT
11. Lin Y, Tu JF, Wang LQ, Shi GX, Yang JW, Li HW, Qi LY, Yu FT, Kang SB, Liu CZ. [Application of "patient and public involvement" in acupuncture clinical research]. Zhongguo Zhen Jiu 2022; 42:1179-1183. PMID: 37199211. DOI: 10.13703/j.0255-2930.20211231-0001.
Abstract
To explore the application of "patient and public involvement" (PPI) in acupuncture clinical research, we collate the connotation, reporting standards, and research status of PPI in China and abroad, and consider and summarize the key problems PPI encounters in acupuncture clinical research. We suggest that the short-form checklist of the Guidance for Reporting Involvement of Patients and the Public, 2nd edition (GRIPP2), be applied to acupuncture clinical research. PPI provides a new perspective for acupuncture clinical research: it benefits each stage of research, contributes to improving the acupuncture medical service model, and increases the success rate and cost-effectiveness of research, thereby promoting the innovation and development of acupuncture science.
Affiliation(s)
- Ying Lin, Jian-Feng Tu, Li-Qiong Wang, Guang-Xia Shi, Jing-Wen Yang, He-Wen Li, Ling-Yu Qi, Fang-Ting Yu, Si-Bo Kang, Cun-Zhi Liu: International Acupuncture and Moxibustion Innovation Institute, School of Acupuncture-Moxibustion and Tuina, Beijing University of Chinese Medicine, Beijing 100029, China
12. Shukralla A, Carton R, Benson KA, El Naggar H, Lacey A, Cavalleri G, Delanty N. Whole exome sequencing studies in epilepsy: A deep analysis of the published literature. Am J Med Genet A 2022; 188:1407-1419. PMID: 35088532. DOI: 10.1002/ajmg.a.62655.
Abstract
We aimed to evaluate the quality of reporting of whole-exome sequencing (WES) in the epilepsy literature, to compare studies by journal type, and to examine whether outcome reporting biases exist, using a self-constructed benchmark to quantitatively analyze studies. We included 451 publications. Reporting was heterogeneous, with poor reporting of (1) application of the ACMG guidelines (13% of studies), (2) Human Phenotype Ontology (HPO) numbers (3%), and (3) variants of uncertain significance (VUS; 19%). Journal type and journal impact factor were predictors of reporting; date of publication and publication type were not. Pairwise comparisons of genetics versus neurology journals using relative risks yielded significant differences in reporting of ACMG guideline application (RR 1.88, 95% CI 1.04-3.38), HPO numbers (RR 8.62, 95% CI 1.08-63.37), and deposition of findings to ClinVar (RR 2.50, 95% CI 1.03-6.1). Reporting in the WES literature is heterogeneous in quality, and poor reporting hinders collaboration and accession of data into large databases such as OMIM and Orphanet. This study highlights reporting bias in this area; formal structural guidelines, like the CONSORT guidelines used in the reporting of clinical trials, are needed to address the issue.
Affiliation(s)
- Arif Shukralla: The National Epilepsy Programme, Beaumont Hospital, Dublin, Ireland
- Robert Carton: FutureNeuro, The SFI Research Centre for Chronic and Rare Neurological Disease, Dublin, Ireland; The Royal College of Surgeons in Ireland, Dublin, Ireland; School of Pharmacy and Biomolecular Science, RCSI, Dublin, Ireland
- Katherine A Benson: FutureNeuro, The SFI Research Centre for Chronic and Rare Neurological Disease, Dublin, Ireland; The Royal College of Surgeons in Ireland, Dublin, Ireland; School of Pharmacy and Biomolecular Science, RCSI, Dublin, Ireland
- Hany El Naggar: The National Epilepsy Programme, Beaumont Hospital, Dublin, Ireland; FutureNeuro, The SFI Research Centre for Chronic and Rare Neurological Disease, Dublin, Ireland; The Royal College of Surgeons in Ireland, Dublin, Ireland
- Austin Lacey: FutureNeuro, The SFI Research Centre for Chronic and Rare Neurological Disease, Dublin, Ireland; The Royal College of Surgeons in Ireland, Dublin, Ireland
- Gianpiero Cavalleri: FutureNeuro, The SFI Research Centre for Chronic and Rare Neurological Disease, Dublin, Ireland; The Royal College of Surgeons in Ireland, Dublin, Ireland; School of Pharmacy and Biomolecular Science, RCSI, Dublin, Ireland
- Norman Delanty: The National Epilepsy Programme, Beaumont Hospital, Dublin, Ireland; FutureNeuro, The SFI Research Centre for Chronic and Rare Neurological Disease, Dublin, Ireland; The Royal College of Surgeons in Ireland, Dublin, Ireland
13. Augustovski F, García Martí S, Espinoza MA, Palacios A, Husereau D, Pichon-Riviere A. [Consolidated Health Economic Evaluation Reporting Standards: Spanish adaptation of the CHEERS 2022 checklist]. Value Health Reg Issues 2022; 27:110-114. PMID: 35031081. DOI: 10.1016/j.vhri.2021.11.001.
Abstract
OBJECTIVES Health economic evaluations (HEEs) are comparative analyses of courses of action in terms of both costs and consequences. The original version of the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) and its adaptation to Spanish were published in 2013. Its objectives were to ensure that HEEs are identifiable, interpretable, and useful for decision making, and to serve as a reporting guide. The new CHEERS 2022 replaces the previous version, aims to be more easily applied to any HEE, and incorporates recent methodological advances and the importance of stakeholder involvement, including patients and the general public. METHODS The present adaptation followed these stages: (1) independent translations of the original list into Spanish; (2) blind back-translations; (3) evaluation of their quality; (4) preparation of a new version in Spanish; (5) review and improvement by the author team; (6) preparation of a new version in Spanish; (7) distribution of the preliminary Spanish version and the original to the American HTA Network (Red de las Américas de Evaluación de Tecnologías Sanitarias) and Spanish-speaking experts for evaluation and feedback; (8) monitoring of changes to the original list under peer review at the British Medical Journal; and (9) consolidation of the final adaptation of the Spanish CHEERS 2022 checklist. RESULTS In this article, we detail the process and the Spanish adaptation of the 28-item CHEERS 2022 checklist and its recommendations. CONCLUSIONS This list is intended for researchers reporting HEEs in peer-reviewed journals, as well as reviewers, editors, and health technology assessment bodies, among others.
Affiliation(s)
- Federico Augustovski
- Instituto de Efectividad Clínica y Sanitaria, Buenos Aires, Argentina; Universidad de Buenos Aires, Buenos Aires, Argentina; Escuela de Salud Pública, Facultad de Medicina, Universidad de Buenos Aires, Buenos Aires, Argentina
- Sebastián García Martí
- Instituto de Efectividad Clínica y Sanitaria, Buenos Aires, Argentina; Universidad de Buenos Aires, Buenos Aires, Argentina
- Manuel A Espinoza
- Departamento de Salud Pública, Pontificia Universidad Católica de Chile, Santiago, Chile; Unidad de Evaluación de Tecnologías en Salud, Facultad de Medicina, Pontificia Universidad Católica de Chile, Santiago, Chile
- Alfredo Palacios
- Instituto de Efectividad Clínica y Sanitaria, Buenos Aires, Argentina; Universidad de Buenos Aires, Buenos Aires, Argentina; Facultad de Ciencias Económicas, Universidad de Buenos Aires, Buenos Aires, Argentina
- Don Husereau
- School of Epidemiology and Public Health, University of Ottawa, Ontario, Canada; Institute of Health Economics, Alberta, Canada
- Andrés Pichon-Riviere
- Instituto de Efectividad Clínica y Sanitaria, Buenos Aires, Argentina; Universidad de Buenos Aires, Buenos Aires, Argentina; Escuela de Salud Pública, Facultad de Medicina, Universidad de Buenos Aires, Buenos Aires, Argentina
14
Abstract
As the final outputs of the Reproducibility Project: Cancer Biology are published, it is clear that preclinical research in cancer biology is not as reproducible as it should be.
15
Hepkema WM, Horbach SPJM, Hoek JM, Halffman W. Misidentified biomedical resources: Journal guidelines are not a quick fix. Int J Cancer 2021; 150:1233-1243. [PMID: 34807460] [PMCID: PMC9300184] [DOI: 10.1002/ijc.33882]
Abstract
Biomedical researchers routinely use a variety of biological models and resources, such as cultured cell lines, antibodies and laboratory animals. Unfortunately, these resources are not flawless: cell lines can be misidentified; for antibodies, problems with specificity, lot-to-lot consistency and sensitivity are common; and the reliability of animal models is questioned due to poor translation of animal studies to human clinical trials. In some cases, these problems can render the results of a study meaningless. In response, some journals have implemented guidelines regarding the use and reporting of cell lines, antibodies and laboratory animals. In our study, we use a portfolio of existing and newly created datasets to investigate identification and authentication information for cell lines, antibodies and organisms before and after guideline introduction, compared to journals without guidelines. We observed a general improvement of reporting quality over time, which the implementation of guidelines accelerated only in some cases. We therefore conclude that the effectiveness of journal guidelines is likely to be context dependent, affected by factors such as implementation conditions, research community support and monitoring, and resource availability. Hence, journal reporting guidelines in themselves are not a quick fix for shortcomings in biomedical resource documentation, even though they can be part of the solution.
Affiliation(s)
- Wytske M Hepkema
- Institute of Sociology, Technische Universität Berlin, Berlin, Germany
- Serge P J M Horbach
- Danish Centre for Studies in Research and Research Policy, Aarhus University, Aarhus, Denmark; Centre for Science and Technology Studies, Leiden University, Leiden, The Netherlands
- Joyce M Hoek
- Department of Psychology, University of Groningen, Groningen, The Netherlands
- Willem Halffman
- Institute for Science in Society, Radboud University Nijmegen, Nijmegen, The Netherlands
16
Wu Q, Zhang P. Longitudinal validity of self-rated health: the presence and impact of response shift. Psychol Health 2021:1-21. [PMID: 34714204] [DOI: 10.1080/08870446.2021.1994571]
Abstract
Objective: This paper aimed to examine the longitudinal validity of self-rated health (SRH) and whether it is affected by possible changes in evaluation standards (i.e., response shift) over time. Design: Data are from a longitudinal survey of a nationally representative sample in China. The analytical sample was restricted to respondents aged 45 and above (n = 15,893). Individual fixed effects models were used to analyze changes in ratings of health anchoring vignettes and self-rated health over time. Main outcome measures: SRH at two time points with a two-year span. Results: Both SRH and anchoring vignette ratings displayed changes over a two-year span for all the studied age groups. Compared with the self-assessed change in health ("How would you rate your health as compared to that of last year?"), changes in SRH reported over time displayed a more stable and optimistic pattern. SRH responded to doctor-diagnosed chronic disease and changes in functional limitation, both before and after adjusting for evaluation standards. Conclusion: SRH is responsive to newly diagnosed chronic disease and functional limitation, regardless of whether we account for response shift within the same respondents over time.
Affiliation(s)
- Qiong Wu
- Institute of Social Science Survey, Peking University, Beijing, China
- Peikang Zhang
- Graduate School of Education, Peking University, Beijing, China
17
Pascoe MC, Bailey AP, Craike M, Carter T, Patten RK, Stepto NK, Parker AG. Poor reporting of physical activity and exercise interventions in youth mental health trials: A brief report. Early Interv Psychiatry 2021; 15:1414-1422. [PMID: 32924318] [PMCID: PMC8451843] [DOI: 10.1111/eip.13045]
Abstract
AIM To describe the quality and completeness of the description and reporting of physical activity and exercise interventions delivered to young people to promote mental health or treat mental illness. METHODS We conducted a series of scoping reviews identifying 64 controlled trials of physical activity and exercise interventions delivered to young people. We extracted: intervention characteristics, personnel and delivery format, and the intensity, duration, frequency and type of physical activity or exercise. RESULTS There was limited reporting of intervention details across studies: 52% of trials did not provide enough information to confidently assess intervention intensity, 29% did not state who delivered the intervention, and 44% did not specify the intervention delivery format. CONCLUSIONS We recommend that authors adhere to the CONSORT reporting requirements and its intervention reporting extensions, namely the Template for Intervention Description and Replication and the Consensus on Exercise Reporting Template, and, as part of this, detail the frequency, intensity, time and type of the physical activity recommended and prescribed. Without this, future trials cannot replicate and extend previous work to support or disconfirm existing knowledge, leading to research waste and diminishing translation and implementation potential.
Affiliation(s)
- Michaela C. Pascoe
- Institute for Health and Sport, Victoria University, Melbourne, Victoria, Australia
- Peter MacCallum Cancer Centre, Melbourne, Victoria, Australia
- Alan P. Bailey
- Orygen, and Centre for Youth Mental Health, The University of Melbourne, Parkville, Victoria, Australia
- Melinda Craike
- Institute for Health and Sport, Victoria University, Melbourne, Victoria, Australia
- Mitchell Institute, Victoria University, Melbourne, Victoria, Australia
- Tim Carter
- Institute of Mental Health, School of Health Sciences, University of Nottingham, Nottingham, UK
- Rhiannon K. Patten
- Institute for Health and Sport, Victoria University, Melbourne, Victoria, Australia
- Nigel K. Stepto
- Institute for Health and Sport, Victoria University, Melbourne, Victoria, Australia
- Alexandra G. Parker
- Institute for Health and Sport, Victoria University, Melbourne, Victoria, Australia
- Orygen, and Centre for Youth Mental Health, The University of Melbourne, Parkville, Victoria, Australia
18
Abstract
PURPOSE OF REVIEW This article reviews recent efforts toward standardized imaging features and reporting of chronic pancreatitis, as well as recently published or ongoing imaging studies that aim to establish novel imaging biomarkers for detection of the parenchymal changes seen in chronic pancreatitis. RECENT FINDINGS Novel MRI techniques are being developed to increase the diagnostic yield of chronic pancreatitis, specifically in the early stage. T1 relaxation time, T1 signal intensity ratio, and extracellular volume fraction offer potential advantages over conventional cross-sectional imaging, including simplicity of analysis and more objective interpretation of observations, allowing population-based comparisons. In addition, standardized definitions and reporting guidelines for chronic pancreatitis based on available evidence and expert consensus have been proposed. These new imaging biomarkers and reporting guidelines are being validated for prognostic/therapeutic assessment of adult patients participating in longitudinal studies of The Consortium for the Study of Chronic Pancreatitis, Diabetes and Pancreatic Cancer. SUMMARY New imaging biomarkers derived from novel MRI sequences promise a new chapter for the diagnosis and severity assessment of chronic pancreatitis, including cross-sectional imaging-based diagnostic criteria that combine ductal and parenchymal findings. Standardized imaging findings and reporting guidelines for chronic pancreatitis would enhance longitudinal assessment of disease severity in clinical trials and improve communication between radiologists and pancreatologists in clinical practice.
Affiliation(s)
- Temel Tirkes
- Associate Professor of Radiology, Imaging Sciences, Medicine and Urology, Department of Radiology, Indiana University School of Medicine, Indianapolis, IN, USA
- Anil K. Dasyam
- Associate Professor of Radiology and Medicine, Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, PA, USA
- Zarine K. Shah
- Associate Professor of Radiology, Department of Radiology, Ohio State University Wexner Medical Center, Columbus, OH, USA
- Evan L. Fogel
- Professor of Medicine, Lehman, Bucksot and Sherman Section of Pancreatobiliary Endoscopy, Indiana University School of Medicine, Indianapolis, IN, USA
19
Hernandez-Boussard T, Lundgren MP, Shah N. Conflicting information from the Food and Drug Administration: Missed opportunity to lead standards for safe and effective medical artificial intelligence solutions. J Am Med Inform Assoc 2021; 28:1353-1355. [PMID: 33674865] [PMCID: PMC8661389] [DOI: 10.1093/jamia/ocab035]
Abstract
The Food and Drug Administration (FDA) is considering the permanent exemption of premarket notification requirements for several Class I and II medical device products, including several artificial intelligence (AI)-driven devices. The exemption is based on the need to more quickly disseminate devices to the public, estimated cost savings, and a lack of documented adverse events reported to the FDA's database. However, this ignores emerging issues related to AI-based devices, including utility, reproducibility, and bias, that may affect not only individuals but entire populations. We urge the FDA to reinforce the messaging on safety and effectiveness regulations of AI-based Software as a Medical Device products, to better promote fair AI-driven clinical decision tools and to prevent harm to the patients we serve.
Affiliation(s)
- Nigam Shah
- Department of Medicine, Stanford University, Stanford, California, USA
20
Hernandez-Boussard T, Bozkurt S, Ioannidis JPA, Shah NH. MINIMAR (MINimum Information for Medical AI Reporting): Developing reporting standards for artificial intelligence in health care. J Am Med Inform Assoc 2021; 27:2011-2015. [PMID: 32594179] [PMCID: PMC7727333] [DOI: 10.1093/jamia/ocaa088]
Abstract
The rise of digital data and computing power has contributed to significant advancements in artificial intelligence (AI), leading to the use of classification and prediction models in health care to enhance clinical decision-making for diagnosis, treatment, and prognosis. However, such advances are limited by the lack of reporting standards for the data used to develop those models, the model architecture, and the model evaluation and validation processes. Here, we present MINIMAR (MINimum Information for Medical AI Reporting), a proposal describing the minimum information necessary to understand a model's intended predictions, target populations, hidden biases, and ability to generalize. We call for a standard to accurately and responsibly report on AI in health care. This will facilitate the design and implementation of these models and promote the development and use of associated clinical decision support tools, as well as manage concerns regarding accuracy and bias.
Affiliation(s)
- Tina Hernandez-Boussard
- Department of Medicine, Stanford University, Stanford, California, USA; Department of Biomedical Data Science, Stanford University, Stanford, California, USA; Department of Surgery, Stanford University, Stanford, California, USA
- Selen Bozkurt
- Department of Medicine, Stanford University, Stanford, California, USA
- John P A Ioannidis
- Department of Medicine, Stanford University, Stanford, California, USA; Department of Statistics, Stanford University, Stanford, California, USA; Meta-Research Innovation Center at Stanford, Stanford University, Stanford, California, USA
- Nigam H Shah
- Department of Medicine, Stanford University, Stanford, California, USA; Department of Biomedical Data Science, Stanford University, Stanford, California, USA
21
Abstract
PURPOSE OF REVIEW Polygenic scores (PGS) are used to quantify the genetic predisposition for heritable traits, with hypothesized utility for personalized risk assessments. Lipid PGS are primed for clinical translation, but evidence-based practice changes will require rigorous PGS standards to ensure reproducibility and generalizability. Here we review applicable reporting and technical standards for dyslipidemia PGS translation along the phases of the ACCE (Analytical validity, Clinical validity, Clinical utility, Ethical considerations) framework for evaluating genetic tests. RECENT FINDINGS New guidance suggests existing standards for study designs incorporating the ACCE framework are applicable to PGS and should be adopted. One recent example is the Clinical Genome Resource (ClinGen) and Polygenic Score Catalog's PRS reporting standards, which define minimal requirements for describing the rationale for score development, study population definitions and data parameters, risk model development and application, risk model evaluation, and translational considerations such as generalizability beyond the target population studied. SUMMARY Lipid PGS are likely to be integrated into clinical practice in the future. Clinicians will need to be prepared to determine if and when lipid PGS are useful and valid. This decision-making will depend on the quality of evidence for the clinical use of PGS. Establishing reporting standards for PGS will help facilitate data sharing and transparency for critical evaluation, ultimately benefiting the efficiency of evidence-based practice.
Affiliation(s)
| | - Joshua W. Knowles
- Division of Cardiovascular Medicine
- Cardiovascular Institute
- Stanford Diabetes Research Center
- Stanford Prevention Research Center, Stanford University, Stanford
- The FH Foundation, Pasadena, California, USA
| | | |
22
Kohane IS, Aronow BJ, Avillach P, Beaulieu-Jones BK, Bellazzi R, Bradford RL, Brat GA, Cannataro M, Cimino JJ, García-Barrio N, Gehlenborg N, Ghassemi M, Gutiérrez-Sacristán A, Hanauer DA, Holmes JH, Hong C, Klann JG, Loh NHW, Luo Y, Mandl KD, Daniar M, Moore JH, Murphy SN, Neuraz A, Ngiam KY, Omenn GS, Palmer N, Patel LP, Pedrera-Jiménez M, Sliz P, South AM, Tan ALM, Taylor DM, Taylor BW, Torti C, Vallejos AK, Wagholikar KB, Weber GM, Cai T. What Every Reader Should Know About Studies Using Electronic Health Record Data but May Be Afraid to Ask. J Med Internet Res 2021; 23:e22219. [PMID: 33600347] [PMCID: PMC7927948] [DOI: 10.2196/22219]
Abstract
Coincident with the tsunami of COVID-19–related publications, there has been a surge of studies using real-world data, including those obtained from the electronic health record (EHR). Unfortunately, several of these high-profile publications were retracted because of concerns regarding the soundness and quality of the studies and the EHR data they purported to analyze. These retractions highlight that although a small community of EHR informatics experts can readily identify strengths and flaws in EHR-derived studies, many medical editorial teams and otherwise sophisticated medical readers lack the framework to fully critically appraise these studies. In addition, conventional statistical analyses cannot overcome the need for an understanding of the opportunities and limitations of EHR-derived studies. We distill here from the broader informatics literature six key considerations that are crucial for appraising studies utilizing EHR data: data completeness; data collection and handling (eg, transformation); data type (ie, codified, textual); robustness of methods against EHR variability (within and across institutions, countries, and time); transparency of data and analytic code; and a multidisciplinary approach. These considerations will inform researchers, clinicians, and other stakeholders as to the recommended best practices in reviewing manuscripts, grants, and other outputs from EHR-data-derived studies, and thereby promote and foster rigor, quality, and reliability of this rapidly growing field.
Affiliation(s)
- Isaac S Kohane
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Bruce J Aronow
- Biomedical Informatics, Cincinnati Children's Hospital Medical Center, University of Cincinnati, Cincinnati, OH, United States
- Paul Avillach
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Riccardo Bellazzi
- Department of Electrical, Computer and Biomedical Engineering, University of Pavia, Pavia, Italy; ICS Maugeri, Pavia, Italy
- Robert L Bradford
- North Carolina Translational and Clinical Sciences Institute, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
- Gabriel A Brat
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Mario Cannataro
- Data Analytics Research Center, University Magna Graecia of Catanzaro, Catanzaro, Italy; Department of Medical and Surgical Sciences, University Magna Graecia of Catanzaro, Catanzaro, Italy
- James J Cimino
- Informatics Institute, University of Alabama at Birmingham, Birmingham, AL, United States
- Nils Gehlenborg
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Marzyeh Ghassemi
- Department of Computer Science and Medicine, University of Toronto, Toronto, ON, Canada
- David A Hanauer
- Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, United States
- John H Holmes
- Department of Biostatistics, Epidemiology, and Informatics, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
- Chuan Hong
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Jeffrey G Klann
- Department of Medicine, Harvard Medical School, Boston, MA, United States; Laboratory of Computer Science, Massachusetts General Hospital, Boston, MA, United States
- Yuan Luo
- Department of Preventive Medicine, Northwestern University, Chicago, IL, United States
- Kenneth D Mandl
- Computational Health Informatics Program, Boston Children's Hospital, Boston, MA, United States
- Mohamad Daniar
- Clinical Research Informatics, Boston Children's Hospital, Boston, MA, United States
- Jason H Moore
- Institute for Biomedical Informatics, University of Pennsylvania, Philadelphia, PA, United States
- Shawn N Murphy
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States; Department of Neurology, Massachusetts General Hospital, Boston, MA, United States
- Antoine Neuraz
- Department of Biomedical Informatics, Necker-Enfants Malades Hospital, Assistance Publique - Hôpitaux de Paris, Paris, France; Centre de Recherche des Cordeliers, INSERM UMRS 1138 Team 22, Université de Paris, Paris, France
- Kee Yuan Ngiam
- National University Health Systems, Singapore, Singapore
- Gilbert S Omenn
- Department of Computational Medicine & Bioinformatics, University of Michigan, Ann Arbor, MI, United States
- Nathan Palmer
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Lav P Patel
- Department of Internal Medicine, Division of Medical Informatics, University of Kansas Medical Center, Kansas City, KS, United States
- Piotr Sliz
- Computational Health Informatics Program, Boston Children's Hospital, Boston, MA, United States
- Andrew M South
- Section of Nephrology, Department of Pediatrics, Brenner Children's Hospital, Wake Forest School of Medicine, Winston Salem, NC, United States
- Amelia Li Min Tan
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States; Department of Biomedical Informatics, National University of Singapore, Singapore, Singapore
- Deanne M Taylor
- Department of Biomedical and Health Informatics, The Children's Hospital of Philadelphia, Philadelphia, PA, United States; Department of Pediatrics, Perelman School of Medicine, The University of Pennsylvania, Philadelphia, PA, United States
- Bradley W Taylor
- Clinical and Translational Science Institute, Medical College of Wisconsin, Milwaukee, WI, United States
- Carlo Torti
- Department of Medical and Surgical Sciences, University Magna Graecia of Catanzaro, Catanzaro, Italy
- Andrew K Vallejos
- Clinical and Translational Science Institute, Medical College of Wisconsin, Milwaukee, WI, United States
- Kavishwar B Wagholikar
- Department of Medicine, Harvard Medical School, Boston, MA, United States; Laboratory of Computer Science, Massachusetts General Hospital, Boston, MA, United States
- Griffin M Weber
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
- Tianxi Cai
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, United States
23
Elliott L, Coulman K, Blencowe NS, Qureshi MI, Lee KS, Hinchliffe RJ, Mouton R. A systematic review of reporting quality for anaesthetic interventions in randomised controlled trials. Anaesthesia 2020; 76:832-836. [PMID: 33150618] [PMCID: PMC8246731] [DOI: 10.1111/anae.15294]
Abstract
Interventions from randomised controlled trials can only be replicated if they are reported in sufficient detail. The results of trials can only be confidently interpreted if the delivery of the intervention was systematic and the protocol adhered to. We systematically reviewed trials of anaesthetic interventions published in 12 journals from January 2016 to September 2019. We assessed the detail with which interventions were reported, using the Consolidated Standards of Reporting Trials statement for non‐pharmacological treatments. We analysed 162 interventions reported by 78 trials in 18,675 participants. Detail sufficiently precise to replicate the intervention was reported for 111 (69%) interventions. Intervention standardisation was reported for 135 (83%) out of the 162 interventions, and protocol adherence was reported for 20 (12%) interventions. Sixty (77%) out of the 78 trials reported the administrative context in which interventions were delivered and 36 (46%) trials detailed the expertise of the practitioners. We conclude that bespoke reporting tools should be developed for anaesthetic interventions and interventions in other areas such as critical care.
Affiliation(s)
- L Elliott
- Department of Anaesthesia, Bristol Centre for Surgical Research, Bristol, UK
- K Coulman
- Department of Vascular Surgery, Bristol Centre for Surgical Research, Bristol, UK
- N S Blencowe
- Department of Vascular Surgery, Bristol Centre for Surgical Research, Bristol, UK
- M I Qureshi
- Department of Vascular Surgery, Bristol Centre for Surgical Research, Bristol, UK
- K S Lee
- Bristol Centre for Surgical Research, Bristol, UK
- R Mouton
- North Bristol NHS Trust, Bristol, UK
24
Nuijten MB, Polanin JR. "statcheck": Automatically detect statistical reporting inconsistencies to increase reproducibility of meta-analyses. Res Synth Methods 2020; 11:574-579. [PMID: 32275351] [PMCID: PMC7540394] [DOI: 10.1002/jrsm.1408]
Abstract
We present the R package and web app statcheck, which automatically detects statistical reporting inconsistencies in primary studies and meta-analyses. Previous research has shown a high prevalence of reported p-values that are inconsistent, meaning that a re-calculated p-value, based on the reported test statistic and degrees of freedom, does not match the author-reported p-value. Such inconsistencies affect the reproducibility and evidential value of published findings. The tool statcheck can help researchers identify statistical inconsistencies so that they may correct them. In this paper, we provide an overview of the prevalence and consequences of statistical reporting inconsistencies. We also discuss statcheck in more detail and give an example of how it can be used in a meta-analysis. We end with some recommendations concerning the use of statcheck in meta-analyses and make a case for better reporting standards for statistical results.
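The core consistency check described in the abstract can be sketched in a few lines (in Python rather than R; this is an illustrative simplification, not statcheck's actual implementation, and the function name and rounding rule are assumptions): given a reported z statistic and p-value, recompute the two-sided p and ask whether the two agree once rounding of the reported value is taken into account.

```python
from statistics import NormalDist

def check_z_report(z, reported_p, decimals=2):
    """Recompute the two-sided p-value implied by a reported z statistic
    and flag the report as inconsistent when it disagrees with the
    author-reported p beyond what rounding to `decimals` places explains."""
    recomputed = 2 * (1 - NormalDist().cdf(abs(z)))
    consistent = round(recomputed, decimals) == round(reported_p, decimals)
    return recomputed, consistent

# "z = 2.20, p = .03": the recomputed p is about .028, which rounds to .03
p, ok = check_z_report(2.20, 0.03)    # consistent
p2, bad = check_z_report(2.20, 0.01)  # inconsistent: .03 vs reported .01
```

statcheck itself additionally parses APA-style result strings out of full texts and handles other test families (t, F, chi-square, r); the rounding tolerance above stands in for its more careful decision rule.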
Affiliation(s)
- Michèle B. Nuijten
- Department of Methodology and Statistics, Tilburg University, Tilburg, The Netherlands
- Joshua R. Polanin
- Research & Evaluation, American Institutes for Research, Washington, DC, USA
25
Abstract
A variety of microscopy techniques are used by researchers in the life and biomedical sciences. As these techniques become more powerful and more complex, it is vital that scientific articles containing images obtained with advanced microscopes include full details about how each image was obtained. To explore the reporting of such details we examined 240 original research articles published in eight journals. We found that the quality of reporting was poor, with some articles containing no information about how images were obtained, and many articles lacking important basic details. Efforts by researchers, funding agencies, journals, equipment manufacturers and staff at shared imaging facilities are required to improve the reporting of experiments that rely on microscopy techniques.
Affiliation(s)
- Guillermo Marqués
- University Imaging Centers and Department of Neuroscience, University of Minnesota, Minneapolis, United States
- Thomas Pengo
- University of Minnesota Informatics Institute, University of Minnesota, Minneapolis, United States
- Mark A Sanders
- University Imaging Centers and Department of Neuroscience, University of Minnesota, Minneapolis, United States
26
Abstract
Because of the different philosophy of Bayesian statistics, where parameters are random variables and data are considered fixed, the analysis and presentation of results will differ from that of frequentist statistics. Most importantly, the probabilities that a parameter is in certain regions of the parameter space are crucial quantities in Bayesian statistics that are not calculable (or considered important) in the frequentist approach that is the basis of much of traditional statistics. In this article, I discuss the implications of these differences for presentation of the results of Bayesian analyses. In doing so, I present more detailed guidelines than are usually provided and explain the rationale for my suggestions.
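As a concrete illustration of the region probabilities the abstract highlights (this example is an assumption for exposition, not taken from the article): under a Beta(1,1) prior with 7 successes in 10 trials, the posterior for a proportion theta is Beta(8, 4), and a Bayesian report can state P(theta > 0.5 | data) directly, here estimated by Monte Carlo draws from the posterior.

```python
import random

random.seed(42)

# Beta(1,1) prior + 7 successes / 3 failures -> Beta(8, 4) posterior.
# Estimate the posterior probability that theta exceeds 0.5, a quantity
# frequentist output does not provide but a Bayesian report centres on.
draws = [random.betavariate(8, 4) for _ in range(100_000)]
prob_gt_half = sum(d > 0.5 for d in draws) / len(draws)
# The exact value is 1816/2048, about 0.887, so the estimate lands nearby
```

In practice such probabilities come from the fitted posterior (analytically, or from MCMC draws), and the guidelines discussed here concern how to present them alongside intervals and posterior summaries.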
Affiliation(s)
- David Rindskopf
- Educational Psychology, CUNY Graduate Center, New York, NY, USA
27
Gadaire DM, Kilmer RP. Use of the Template for Intervention Description and Replication (TIDieR) Checklist in Social Work Research. J Evid Based Soc Work (2019) 2020; 17:137-148. [PMID: 33300468] [DOI: 10.1080/26408066.2020.1724226]
Abstract
Social work has a longstanding commitment to sound research and the development and dissemination of evidence-based practice. To that end, multiple professional groups have developed or refined guidelines for reporting research procedures and findings, with the objectives of enhancing transparency, integrity, and rigor in science. Such guidelines can also facilitate replication and systematic review. The Template for Intervention Description and Replication (TIDieR) checklist represents the culmination of a multi-stage process to expand upon existing reporting guidelines. As such, the checklist provides a framework for more transparent communication about empirically-grounded interventions addressing a broad range of social and behavioral health issues. Use of this checklist can be beneficial for researchers, practitioners, and recipients of social work interventions. After discussing selected background regarding the need for and benefit of reporting standards and describing the TIDieR measure, we outline practical considerations in the checklist's use by those engaged in social work research.
Affiliation(s)
- Dana M Gadaire
- Department of Counseling and Human Services, University of Scranton, Scranton, USA
- Ryan P Kilmer
- Department of Psychological Science, University of North Carolina at Charlotte, Charlotte, USA
28
Booth A, Briscoe S, Wright JM. The "realist search": A systematic scoping review of current practice and reporting. Res Synth Methods 2019; 11:14-35. [PMID: 31714016] [DOI: 10.1002/jrsm.1386]
Abstract
The requirement that literature searches identifying studies for inclusion in systematic reviews be systematic, explicit, and reproducible extends, at least by implication, to other types of literature review. However, realist reviews commonly require literature searches that challenge systematic reporting; searches are iterative and involve multiple search strategies and approaches. Notwithstanding these challenges, reporting of the "realist search" can be structured to be transparent and to facilitate identification of innovative retrieval practices. Our six-component search framework consolidates and extends the structure advanced by Pawson, one of the originators of realist review: formulating the question, conducting the background search, searching for program theory, searching for empirical studies, searching to refine program theory and identifying relevant mid-range theory, and documenting and reporting the search process. This study reviews reports of search methods in 34 realist reviews published within the calendar year of 2016. Data from all eligible reviews were extracted against the search framework. Realist search reports poorly differentiate between the different search components. Review teams often conduct a single "big bang" multipurpose search to fulfill multiple functions within the review. However, it is acknowledged that realist searches are likely to be iterative and responsive to emergent data. Overall, the search for empirical studies appears most comprehensive in conduct and reporting detail. In contrast, searches to identify and refine program theory are poorly conducted, if conducted at all, and poorly reported. Use of this framework offers greater transparency in conduct and reporting while preserving flexibility and methodological innovation.
Affiliation(s)
- Andrew Booth
- School of Health and Related Research (ScHARR), University of Sheffield, Sheffield, UK
- Simon Briscoe
- Exeter HS&DR Evidence Synthesis Centre, Institute of Health Research, College of Medicine and Health, University of Exeter, Exeter, UK
- Judy M Wright
- Academic Unit of Health Economics, Leeds Institute of Health Science, University of Leeds, Leeds, UK
29
Webster RK, Howick J, Hoffmann T, Macdonald H. Inadequate description of placebo and sham controls in a systematic review of recent trials. Eur J Clin Invest 2019; 49:e13169. [PMID: 31519047] [PMCID: PMC6819221] [DOI: 10.1111/eci.13169]
Abstract
BACKGROUND Poorly described placebo/sham controls make it difficult to appraise active intervention benefits and harms. The 12-item Template for Intervention Description and Replication (TIDieR) checklist was developed to improve the reporting of active interventions. The extent to which TIDieR has been used to improve description of placebo or sham controls is not known. MATERIALS AND METHODS We systematically identified and examined all placebo/sham-controlled randomised trials published in 2018 in the top six general medical journals. We reported how many of the TIDieR checklist items were used to describe the placebo/sham control(s). We supplemented this with a sample of 100 placebo/sham-controlled trials from any journal and searched Google Scholar to identify placebo/sham-controlled trials citing TIDieR. RESULTS We identified 94 placebo/sham-controlled trials published in the top journals in 2018. None reported using TIDieR, and none reported placebo or sham components completely. On average eight TIDieR items were addressed, with placebo/sham control name (100%) and when and how much was administered (97.9%) most commonly reported. Some items (rationale, 8.5%; whether there were modifications, 25.5%) were less often reported. In our sample of less well-cited journals, reporting was poorer (average of six items) and followed a similar pattern. Since TIDieR's first publication, six placebo-controlled trials have cited it according to Google Scholar. Two of these used the checklist to describe placebo controls; neither one completely described the placebo intervention. CONCLUSIONS Placebo and sham controls are poorly described within randomised trials, and TIDieR is rarely used to guide these descriptions. We recommend developing guidelines to promote better descriptions of placebo/sham control components within clinical trials.
30
Fergusson DA, Wesch NL, Leung GJ, MacNeil JL, Conic I, Presseau J, Cobey KD, Diallo JS, Auer R, Kimmelman J, Kekre N, El-Sayes N, Krishnan R, Keller BA, Ilkow C, Lalu MM. Assessing the Completeness of Reporting in Preclinical Oncolytic Virus Therapy Studies. Mol Ther Oncolytics 2019; 14:179-187. [PMID: 31276026] [PMCID: PMC6586991] [DOI: 10.1016/j.omto.2019.05.004]
Abstract
Irreproducibility of preclinical findings could be a significant barrier to the "bench-to-bedside" development of oncolytic viruses (OVs). A contributing factor is the incomplete and non-transparent reporting of study methodology and design. Using the NIH Principles and Guidelines for Reporting Preclinical Research, a core set of seven recommendations, we evaluated the completeness of reporting of preclinical OV studies. We also developed an evidence map identifying the current trends in OV research. A systematic search of MEDLINE and Embase identified all relevant articles published over an 18 month period. We screened 1,554 articles, and 236 met our a priori-defined inclusion criteria. Adenovirus (43%) was the most commonly used viral platform. Frequently investigated cancers included colorectal (14%), skin (12%), and breast (11%). Xenograft implantation (61%) in mice (96%) was the most common animal model. The use of preclinical reporting guidelines was listed in 0.4% of articles. Biological and technical replicates were completely reported in 1% of studies, statistics in 49%, randomization in 1%, blinding in 2%, sample size estimation in 0%, and inclusion/exclusion criteria in 0%. Overall, completeness of reporting in the preclinical OV therapy literature is poor. This may hinder efforts to interpret, replicate, and ultimately translate promising preclinical OV findings.
Affiliation(s)
- Dean A. Fergusson
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Neil L. Wesch
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Garvin J. Leung
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Jenna L. MacNeil
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Isidora Conic
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Justin Presseau
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Kelly D. Cobey
- School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Centre for Journalology, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Department of Psychology, University of Stirling, Stirling FK9 4LA, UK
- Jean-Simon Diallo
- Cancer Therapeutics Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Rebecca Auer
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Cancer Therapeutics Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Natasha Kekre
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Nader El-Sayes
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Cancer Therapeutics Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Ramya Krishnan
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Cancer Therapeutics Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Brian A. Keller
- Faculty of Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Cancer Therapeutics Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Carolina Ilkow
- Cancer Therapeutics Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Manoj M. Lalu
- Clinical Epidemiology Program, BLUEPRINT Translational Research Group, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Regenerative Medicine Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Department of Anesthesiology and Pain Medicine, University of Ottawa, The Ottawa Hospital, Ottawa, ON K1H 8L6, Canada
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
31
Affiliation(s)
- Adelson Pinon
- Independent Researcher, Santiago de Compostela, Spain
32
Song Y, Darzi A, Ballesteros M, Martínez García L, Alonso-Coello P, Arayssi T, Bhaumik S, Chen Y, Cluzeau F, Ghersi D, Padilla PF, Langlois EV, Schünemann HJ, Vernooij RWM, Akl EA. Extending the RIGHT statement for reporting adapted practice guidelines in healthcare: the RIGHT-Ad@pt Checklist protocol. BMJ Open 2019; 9:e031767. [PMID: 31551391] [PMCID: PMC6773334] [DOI: 10.1136/bmjopen-2019-031767]
Abstract
INTRODUCTION The adaptation of guidelines is an increasingly used methodology for the efficient development of contextualised recommendations. Nevertheless, there is no specific reporting guidance. The essential Reporting Items of Practice Guidelines in Healthcare (RIGHT) statement could be useful for reporting adapted guidelines, but it does not address all the important aspects of the adaptation process. The objective of our project is to develop an extension of the RIGHT statement for the reporting of adapted guidelines (the RIGHT-Ad@pt Checklist). METHODS AND ANALYSIS To develop the RIGHT-Ad@pt Checklist, we will use a multistep process that includes: (1) establishment of a Working Group; (2) generation of an initial checklist based on the RIGHT statement; (3) optimisation of the checklist (an initial assessment of adapted guidelines, semistructured interviews, a Delphi consensus survey, an external review by guideline developers and users, and a final assessment of adapted guidelines); and (4) approval of the final checklist. At each step of the process, we will calculate absolute frequencies and proportions, use content analysis to summarise and draw conclusions, discuss the results, draft a report and refine the checklist. ETHICS AND DISSEMINATION We have obtained a waiver of approval from the Clinical Research Ethics Committee at the Hospital de la Santa Creu i Sant Pau (Barcelona, Spain). We will disseminate the RIGHT-Ad@pt Checklist by publishing it in a peer-reviewed journal, presenting it to relevant stakeholders and translating it into different languages. We will continuously seek feedback from stakeholders, monitor new relevant evidence and, if necessary, update the checklist.
Affiliation(s)
- Yang Song
- Iberoamerican Cochrane Centre-Biomedical Research Institute Sant Pau (IIB Sant Pau), Barcelona, Spain
- Andrea Darzi
- AUB GRADE Center, American University of Beirut, Beirut, Lebanon
- Laura Martínez García
- Iberoamerican Cochrane Centre-Biomedical Research Institute Sant Pau (IIB Sant Pau), Barcelona, Spain
- CIBER de Epidemiología y Salud Pública (CIBERESP), Barcelona, Spain
- Pablo Alonso-Coello
- Iberoamerican Cochrane Centre-Biomedical Research Institute Sant Pau (IIB Sant Pau), Barcelona, Spain
- CIBER de Epidemiología y Salud Pública (CIBERESP), Barcelona, Spain
- Department of Health Research Methods, Evidence, and Impact, McMaster GRADE center, McMaster University, Hamilton, Ontario, Canada
- Yaolong Chen
- Evidence-Based Medicine Center, School of Basic Medical Sciences, Lanzhou University, Lanzhou, China
- WHO Collaborating Centre for Guideline Implementation and Knowledge Translation, Lanzhou, China
- Francoise Cluzeau
- Faculty of Medicine, School of Public Health, Imperial College London, London, UK
- Davina Ghersi
- National Health and Medical Research Council, Canberra, Australian Capital Territory, Australia
- Paulina F Padilla
- Facultad de Medicina y Odontología, Universidad de Antofagasta, Antofagasta, Chile
- Etienne V Langlois
- Alliance for Health Policy and Systems Research, World Health Organization, Geneve, Switzerland
- Holger J Schünemann
- Department of Health Research Methods, Evidence, and Impact, McMaster GRADE center, McMaster University, Hamilton, Ontario, Canada
- Robin W M Vernooij
- Department of Research, Netherlands Comprehensive Cancer Organisation (IKNL), Utrecht, The Netherlands
- Elie A Akl
- AUB GRADE Center, American University of Beirut, Beirut, Lebanon
- Department of Health Research Methods, Evidence, and Impact, McMaster GRADE center, McMaster University, Hamilton, Ontario, Canada
- Department of Internal Medicine, American University of Beirut, Beirut, Lebanon
33

34
Beintner I, Vollert B, Zarski AC, Bolinski F, Musiat P, Görlich D, Ebert DD, Jacobi C. Adherence Reporting in Randomized Controlled Trials Examining Manualized Multisession Online Interventions: Systematic Review of Practices and Proposal for Reporting Standards. J Med Internet Res 2019; 21:e14181. [PMID: 31414664] [PMCID: PMC6713038] [DOI: 10.2196/14181]
Abstract
Background Adherence reflects the extent to which individuals experience or engage with the content of online interventions and poses a major challenge. Neglecting to examine and report adherence and its relation to outcomes can compromise the interpretation of research findings. Objective The aim of this systematic review is to analyze how adherence is accounted for in publications and to propose standards for measuring and reporting adherence to online interventions. Methods We performed a systematic review of randomized controlled trials on online interventions for the prevention and treatment of common mental disorders (depression, anxiety disorders, substance-related disorders, and eating disorders) published between January 2006 and May 2018 and indexed in Medline and Web of Science. We included primary publications on manualized online treatments (more than 1 session and successive access to content) and examined how adherence was reported in these publications. Results We identified 216 publications that met our inclusion criteria. Adherence was addressed in 85% of full-text manuscripts, but in only 31% of abstracts. A median of three usage metrics was reported; the most frequently reported usage metric (61%) was intervention completion. Manuscripts published in specialized electronic health journals more frequently included information on the relation of adherence and outcomes. Conclusions We found substantial variety in the reporting of adherence and in the usage metrics used to operationalize adherence. This limits the comparability of results and impedes the integration of findings from different studies. Based on our findings, we propose reporting standards for future publications on online interventions.
Affiliation(s)
- Ina Beintner
- Faculty of Psychology, School of Science, Technische Universität Dresden, Dresden, Germany
- Bianka Vollert
- Faculty of Psychology, School of Science, Technische Universität Dresden, Dresden, Germany
- Anna-Carlotta Zarski
- Institute of Psychology, Faculty of Humanities, Social Sciences, and Theology, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen, Germany
- Felix Bolinski
- Department of Clinical, Neuro- and Developmental Psychology, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Peter Musiat
- Department of Psychological Medicine, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom
- Dennis Görlich
- Institute of Biostatistics and Clinical Research, Faculty of Medicine, Westfälische Wilhelms-Universität Münster, Münster, Germany
- David Daniel Ebert
- Institute of Psychology, Faculty of Humanities, Social Sciences, and Theology, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen, Germany
- Corinna Jacobi
- Faculty of Psychology, School of Science, Technische Universität Dresden, Dresden, Germany
35
Mo M, Liao X, Zhang XX, Li B, Chen W, Zhao GZ, Guo YB. [Reporting standards for expert consensus on clinical practice of Chinese patent medicines of China Association of Chinese Medicine]. Zhongguo Zhong Yao Za Zhi 2019; 44:2644-2651. [PMID: 31359735] [DOI: 10.19540/j.cnki.cjcmm.20190308.002]
Abstract
In 2018, the Standardization Department of the China Association of Chinese Medicine invited methodologists with backgrounds in evidence-based medicine to discuss and draft a series of standards for expert consensus on the clinical practice of Chinese patent medicines. These standards were made with reference to published standards for developing expert consensus and clinical practice guidelines, and with full consideration of the current state of evidence and the history of clinical practice of Chinese patent medicines. The standards comprise four parts, namely information summary items, normative general items, normative technical items and information supplementary items, including cover, content, preface, introduction, title, scope, basic information of the Chinese patent medicine, suggestions for clinical application, safety, conflict of interest, appendix, and references, so as to provide a reference for improving the quality of expert consensus compilation and enhancing the applicability of expert consensus.
Affiliation(s)
- Mei Mo
- Standardization Department, China Association of Chinese Medicine, Beijing 100029, China
- Xing Liao
- Center of Evidence Based Traditional Chinese Medicine, Institute of Basic Research in Clinical Medicine, China Academy of Chinese Medical Sciences, Beijing 100700, China
- Xiao-Xiao Zhang
- Standardization Department, China Association of Chinese Medicine, Beijing 100029, China
- Bo Li
- Beijing Hospital of Traditional Chinese Medicine Affiliated to Capital Medical University, Beijing 100010, China
- Wei Chen
- Centre for Evidence-Based Chinese Medicine, Beijing University of Chinese Medicine, Beijing 100029, China
- Guo-Zhen Zhao
- Beijing Hospital of Traditional Chinese Medicine Affiliated to Capital Medical University, Beijing 100010, China
- Yu-Bo Guo
- Standardization Department, China Association of Chinese Medicine, Beijing 100029, China
36
Abstract
Qualitative evidence synthesis (QES) encompasses more than 20 methods for synthesizing qualitative accounts of research phenomena documenting real-life contexts. However, tensions frequently arise from the different heritages that shape QES methodology: namely, systematic reviews of effectiveness and primary qualitative research. Methodological innovations either derive from each heritage or are stimulated when both are in juxtaposition; it is important to broker a rapprochement. This article draws on practical experience from a range of syntheses and methodological development work conducted with the Cochrane Qualitative and Implementation Methods Group. The legacy of both heritages is briefly characterized. Three stages of the QES process offer exemplars: searching/sampling, quality assessment, and data synthesis. Rather than an antagonistic clash of research paradigms, this dual heritage offers an opportunity to harness the collective energies of both paradigms. Future methodological research is needed to identify further applications by which this dual heritage might be optimally harnessed.
Affiliation(s)
- Andrew Booth
- The University of Sheffield, Sheffield, United Kingdom
37
Grant S. The CONSORT-SPI 2018 extension: a new guideline for reporting social and psychological intervention trials. Addiction 2019; 114:4-8. [PMID: 30091280] [DOI: 10.1111/add.14411]
Affiliation(s)
- RAND Corporation, Santa Monica, CA, USA
38
Bolam FC, Grainger MJ, Mengersen KL, Stewart GB, Sutherland WJ, Runge MC, McGowan PJK. Using the Value of Information to improve conservation decision making. Biol Rev Camb Philos Soc 2018; 94:629-647. [PMID: 30280477] [DOI: 10.1111/brv.12471]
Abstract
Conservation decisions are challenging, not only because they often involve difficult conflicts among outcomes that people value, but because our understanding of the natural world and our effects on it is fraught with uncertainty. Value of Information (VoI) methods provide an approach for understanding and managing uncertainty from the standpoint of the decision maker. These methods are commonly used in other fields (e.g. economics, public health) and are increasingly used in biodiversity conservation. This decision-analytical approach can identify the best management alternative to select where the effectiveness of interventions is uncertain, and can help to decide when to act and when to delay action until after further research. We review the use of VoI in the environmental domain, reflect on the need for greater uptake of VoI, particularly for strategic conservation planning, and suggest promising areas for new research. We also suggest common reporting standards as a means of increasing the leverage of this powerful tool. The environmental science, ecology and biodiversity categories of the Web of Knowledge were searched using the terms 'Value of Information,' 'Expected Value of Perfect Information,' and the abbreviation 'EVPI.' Google Scholar was searched with the same terms, and additionally the terms decision and biology, biodiversity conservation, fish, or ecology. We identified 1225 papers from these searches. Included studies were limited to those that showed an application of VoI in biodiversity conservation rather than simply describing the method. All examples of use of VoI were summarised regarding the application of VoI, the management objectives, the uncertainties, the models used, how the objectives were measured, and the type of VoI.
While the use of VoI appears to be on the increase in biodiversity conservation, the reporting of results is highly variable, which can make it difficult to understand the decision context and which uncertainties were considered. Moreover, it was unclear if, and how, the papers informed management and policy interventions, which is why we suggest a range of reporting standards that would aid the use of VoI. The use of VoI in conservation settings is at an early stage. There are opportunities for broader applications, not only for species-focussed management problems, but also for setting local or global research priorities for biodiversity conservation, making funding decisions, or designing or improving protected area networks and management. The long-term benefits of applying VoI methods to biodiversity conservation include a more structured and decision-focused allocation of resources to research.
Affiliation(s)
- Friederike C Bolam
- School of Natural and Environmental Sciences, Newcastle University, Newcastle upon Tyne, NE1 7RU, U.K
- Matthew J Grainger
- School of Natural and Environmental Sciences, Newcastle University, Newcastle upon Tyne, NE1 7RU, U.K
- Kerrie L Mengersen
- School of Mathematical Sciences, Queensland University of Technology, 2 George St, Brisbane, QLD 4001, Australia
- Gavin B Stewart
- School of Natural and Environmental Sciences, Newcastle University, Newcastle upon Tyne, NE1 7RU, U.K
- William J Sutherland
- Conservation Science Group, Department of Zoology, Cambridge University, The David Attenborough Building, Cambridge, CB2 3QZ, U.K
- Michael C Runge
- US Geological Survey, Patuxent Wildlife Research Centre, 12100 Beech Forest Road, Laurel, MD 20708, U.S.A
- Philip J K McGowan
- School of Natural and Environmental Sciences, Newcastle University, Newcastle upon Tyne, NE1 7RU, U.K
39
Abstract
This article is based on a presentation made at the 2014 annual meeting of the editorial board of Health Education &amp; Behavior. The article addresses critical issues related to standards of scientific reporting in journals, including concerns about external and internal validity and reporting bias. It reviews current reporting guidelines and the effects of adopting them, and offers suggestions for improving reporting. The evidence about the effects of guideline adoption and implementation is briefly reviewed. Recommendations for the adoption and implementation of appropriate guidelines, including considerations for journals, are provided.
Affiliation(s)
- Evan Mayo-Wilson
- Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
40
Abstract
Objective: There is a striking paucity of consensus on the terminology, definition, and diagnostic criteria of mycotic aortic aneurysms. This literature study aims to elucidate this scientific omission, discuss its consequences, and present a proposition for reporting items on this disease. Methods: A systematic literature review of PubMed and Medline for mycotic and infected aortic aneurysms between 1850 and 2017 was performed. Articles were assessed according to a protocol regarding terminology, definition, and diagnostic criteria. Case series with fewer than 5 patients were excluded. Results: A total of 49 articles were included. The most prevalent term was mycotic aortic aneurysm, but there was no widely accepted definition. Most modern publications used a diagnostic workup based on a combination of clinical presentation, laboratory results, imaging findings, and intraoperative findings. How these protean variables should be balanced was unclear. A proposition of reporting items was framed, consisting of the definition of disease used, basis of diagnostic workup, exclusion criteria, patient characteristics, laboratory and imaging findings, aneurysm anatomy, details on treatment, pre/postoperative antibiotic treatment, and details on follow-up. Conclusions: This article emphasizes the need to standardize the definition, terminology, and diagnostic criteria for mycotic aortic aneurysms and proposes reporting items enhancing comparability between studies.
Affiliation(s)
- Karl Sörelius
- Department of Surgical Sciences, Section of Vascular Surgery, Uppsala University, Uppsala, Sweden
- Pietro G di Summa
- Department of Plastic, Reconstructive, and Hand Surgery, Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland
41
Thomas R, Sims R, Degeling C, Street JM, Carter SM, Rychetnik L, Whitty JA, Wilson A, Ward P, Glasziou P. CJCheck Stage 1: development and testing of a checklist for reporting community juries - Delphi process and analysis of studies published in 1996-2015. Health Expect 2016; 20:626-637. [PMID: 27704684] [PMCID: PMC5513001] [DOI: 10.1111/hex.12493]
Abstract
Background Opportunities for community members to actively participate in policy development are increasing. Community/citizens' juries (CJs) are a deliberative democratic process aimed at eliciting informed community perspectives on difficult topics. How comprehensively these processes are reported in the peer-reviewed literature, however, is unknown. Adequate reporting of methodology enables others to judge process quality, compare outcomes, facilitate critical reflection and potentially repeat a process. We aimed to identify important elements for reporting CJs, to develop an initial checklist and to review published health and health policy CJs to examine reporting standards. Design Using the literature and expertise from CJ researchers and policy advisors, a list of important CJ reporting items was suggested and further refined. We then reviewed published CJs within the health literature and used the checklist to assess the comprehensiveness of reporting. Results CJCheck was developed and examined reporting of CJ planning, juror information, procedures and scheduling. We screened 1711 studies and extracted data from 38. No studies fully reported the checklist items. The item most consistently reported was juror numbers (92%, 35/38), while the least reported was the availability of expert presentations (5%, 2/38). Recruitment strategies were described in 66% of studies (25/38); however, the frequency and timing of deliberations was inadequately described (29%, 11/38). Conclusions Currently, CJ publications in the health and health policy literature are inadequately reported, hampering their use in policy making. We propose broadening the CJCheck by creating a reporting standards template in collaboration with international CJ researchers, policy advisors and consumer representatives to ensure standardized, systematic and transparent reporting.
Affiliation(s)
- Rae Thomas
- Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Qld, Australia
- Rebecca Sims
- Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Qld, Australia
- Chris Degeling
- Centre for Values, Ethics and the Law in Medicine, University of Sydney, Sydney, NSW, Australia
- Jackie M Street
- School of Public Health, The University of Adelaide, Adelaide, SA, Australia
- Stacy M Carter
- Centre for Values, Ethics and the Law in Medicine, University of Sydney, Sydney, NSW, Australia
- Lucie Rychetnik
- School of Medicine, University of Notre Dame, Sydney, NSW, Australia; School of Public Health, University of Sydney, Sydney, NSW, Australia
- Jennifer A Whitty
- Health Economics Group, Norwich Medical School, University of East Anglia, Norwich, UK
- Andrew Wilson
- Menzies Centre for Health Policy, University of Sydney, Sydney, NSW, Australia
- Paul Ward
- Discipline of Public Health, School of Health Sciences, Faculty of Medicine, Nursing and Health Sciences, Flinders University, Adelaide, SA, Australia
- Paul Glasziou
- Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Qld, Australia
|
42
|
Abstract
Objective: The objective of the present study was to survey and determine the reporting standards of animal studies published during the three years 2012 to 2014 in the Indian Journal of Pharmacology (IJP). Material and Methods: All issues of IJP published in 2012, 2013 and 2014 were reviewed to identify animal studies. Each animal study was searched for 15 parameters specifically designed to review standards of animal experimentation and research methodology. Observation: All published studies had clearly defined aims and objectives, while a statement on ethical clearance of the study protocol was provided in 97% of papers. Information about animal strain and sex was given in 91.8% and 90% of papers, respectively. The age of experimental animals was mentioned in 44.4% of papers, while the source of animals was given in 50.8%. Randomization was reported by 37.4% of studies, while 9.9% reported blinding. Only 3.5% of studies mentioned any limitations of their work. Conclusion: The present study demonstrates relatively good reporting standards in animal studies published in IJP. The items which need to be improved are randomization, blinding, sample size calculation, stating the limitations of the study, sources of support and conflict of interest. The knowledge shared in the present paper could be used for better reporting of animal-based experiments.
Affiliation(s)
- Umme Aiman
- Department of Pharmacology, Jawaharlal Nehru Medical College, Aligarh Muslim University, Aligarh, Uttar Pradesh, India
- Syed Ziaur Rahman
- Department of Pharmacology, Jawaharlal Nehru Medical College, Aligarh Muslim University, Aligarh, Uttar Pradesh, India
|
43
|
Abstract
We discuss the articles of this special issue with reference to an important yet previously only implicit dimension of study quality: alignment across the theoretical and methodological decisions that collectively define an approach to self-regulated learning. Integrating and extending work by leaders in the field, we propose a framework for evaluating alignment in the way self-regulated learning research is both conducted and reported. Within this framework, the special issue articles provide a springboard for discussing methodological promises and pitfalls of increasingly sophisticated research on the dynamic, contingent, and contextualized features of self-regulated learning.
|
44
|
Pinnock H, Epiphaniou E, Sheikh A, Griffiths C, Eldridge S, Craig P, Taylor SJC. Developing standards for reporting implementation studies of complex interventions (StaRI): a systematic review and e-Delphi. Implement Sci 2015; 10:42. [PMID: 25888928 PMCID: PMC4393562 DOI: 10.1186/s13012-015-0235-z] [Citation(s) in RCA: 79] [Impact Index Per Article: 8.8] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/16/2014] [Accepted: 03/16/2015] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Dissemination and implementation of health care interventions are currently hampered by the variable quality of reporting of implementation research. Reporting of other study types has been improved by the introduction of reporting standards (e.g. CONSORT). We are therefore developing guidelines for reporting implementation studies (StaRI). METHODS Using established methodology for developing health research reporting guidelines, we systematically reviewed the literature to generate items for a checklist of reporting standards. We then recruited an international, multidisciplinary panel for an e-Delphi consensus-building exercise which comprised an initial open round to revise/suggest a list of potential items for scoring in the subsequent two scoring rounds (scale 1 to 9). Consensus was defined a priori as 80% agreement with the priority scores of 7, 8, or 9. RESULTS We identified eight papers from the literature review from which we derived 36 potential items. We recruited 23 experts to the e-Delphi panel. Open round comments resulted in revisions, and 47 items went forward to the scoring rounds. Thirty-five items achieved consensus: 19 achieved 100% agreement. Prioritised items addressed the need to: provide an evidence-based justification for implementation; describe the setting, professional/service requirements, eligible population and intervention in detail; measure process and clinical outcomes at population level (using routine data); report impact on health care resources; describe local adaptations to the implementation strategy and describe barriers/facilitators. Over-arching themes from the free-text comments included balancing the need for detailed descriptions of interventions with publishing constraints, addressing the dual aims of reporting on the process of implementation and effectiveness of the intervention and monitoring fidelity to an intervention whilst encouraging adaptation to suit diverse local contexts. 
CONCLUSIONS We have identified priority items for reporting implementation studies and key issues for further discussion. An international, multidisciplinary workshop, where participants will debate the issues raised, clarify specific items and develop StaRI standards that fit within the suite of EQUATOR reporting guidelines, is planned. REGISTRATION The protocol is registered with Equator: http://www.equator-network.org/library/reporting-guidelines-under-development/#17 .
Affiliation(s)
- Hilary Pinnock
- Asthma UK Centre for Applied Research, Allergy and Respiratory Research Group, Centre for Population Health Sciences, University of Edinburgh, Doorway 3, Medical School, Teviot Place, Edinburgh, EH8 9AG, Scotland, UK.
- Eleni Epiphaniou
- Centre for Primary Care and Public Health, Blizard Institute, Barts and The London School of Medicine and Dentistry, London, UK.
- Aziz Sheikh
- Asthma UK Centre for Applied Research, Allergy and Respiratory Research Group, Centre for Population Health Sciences, University of Edinburgh, Doorway 3, Medical School, Teviot Place, Edinburgh, EH8 9AG, Scotland, UK.
- Chris Griffiths
- Centre for Primary Care and Public Health, Blizard Institute, Barts and The London School of Medicine and Dentistry, London, UK.
- Sandra Eldridge
- Pragmatic Clinical Trials Unit, Centre for Primary Care and Public Health, Blizard Institute, Barts and The London School of Medicine and Dentistry, London, UK.
- Peter Craig
- MRC/CSO Social and Public Health Sciences Unit, University of Glasgow, Glasgow, UK.
- Stephanie J C Taylor
- Centre for Primary Care and Public Health, Blizard Institute, Barts and The London School of Medicine and Dentistry, London, UK.
|
45
|
Abstract
Published scientific protocols are advocated as a means of controlling bias in research reporting. Indeed, many journals require a study protocol with manuscript submission. However, publishing protocols of partnered research (PPR) can be challenging in light of the research model's dynamic nature, especially as no current reporting standards exist. Nevertheless, as these protocols become more prevalent, a priori documentation of methods in partnered research studies becomes increasingly important. Using as illustration a suite of studies aimed at improving coordination and communication in the primary care setting, we sought to identify challenges in publishing PPR relative to traditional designs, present alternative solutions to PPR publication, and propose an initial checklist of content to be included in protocols of partnered research. Challenges to publishing PPR include reporting details of research components intended to be co-created with operational partners, changes to sampling and entry strategy, and alignment of scientific and operational goals. Proposed solutions include emulating reporting standards of qualitative research, participatory action research, and adaptive trial designs, as well as embracing technological tools that facilitate publishing adaptive protocols, with version histories that are able to be updated as major protocol changes occur. Finally, we present a proposed checklist of reporting elements for partnered research protocols.
|
46
|
Wagaman AS, Coburn A, Brand-Thomas I, Dash B, Jaswal SS. A comprehensive database of verified experimental data on protein folding kinetics. Protein Sci 2014; 23:1808-12. [PMID: 25229122 DOI: 10.1002/pro.2551] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2014] [Accepted: 09/15/2014] [Indexed: 11/11/2022]
Abstract
Insights into protein folding rely increasingly on the synergy between experimental and theoretical approaches. Developing successful computational models requires access to experimental data of sufficient quantity and high quality. We compiled folding rate constants for what initially appeared to be 184 proteins from 15 published collections/web databases. To generate the highest confidence in the dataset, we verified the reported ln kf value and exact experimental construct and conditions from the original experimental report(s). The resulting comprehensive database of 126 verified entries, ACPro, will serve as a freely accessible resource (https://www.ats.amherst.edu/protein/) for the protein folding community to enable confident testing of predictive models. In addition, we provide a streamlined submission form for researchers to add new folding kinetics results, requiring specification of all the relevant experimental information according to the standards proposed in 2005 by the protein folding consortium organized by Plaxco. As the number and diversity of proteins whose folding kinetics are studied expands, our curated database will enable efficient and confident incorporation of new experimental results into a standardized collection. This database will support a more robust symbiosis between experiment and theory, leading ultimately to more rapid and accurate insights into protein folding, stability, and dynamics.
Affiliation(s)
- Amy S Wagaman
- Department of Mathematics and Statistics, Amherst College, Amherst, Massachusetts
|
47
|
Rader T, Mann M, Stansfield C, Cooper C, Sampson M. Methods for documenting systematic review searches: a discussion of common issues. Res Synth Methods 2014; 5:98-115. [PMID: 26052650 DOI: 10.1002/jrsm.1097] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2012] [Revised: 08/19/2013] [Accepted: 09/01/2013] [Indexed: 11/11/2022]
Abstract
INTRODUCTION As standardized reporting requirements for systematic reviews are being adopted more widely, review authors are under greater pressure to accurately record their search process. With careful planning, documentation to fulfill the Preferred Reporting Items for Systematic Reviews and Meta-Analyses requirements can become a valuable tool for organizing a systematic review literature search and planning updates. METHODS A working group of information specialists convened to discuss current practice and were informed by a Web-based survey of over 260 systematic review authors, trials search coordinators, librarians, and other information specialists conducted in February/March 2011. DISCUSSION Survey responses provided insight into current practices and difficulties of reporting searches. These included a lack of time, tools, clear understanding of the requirements, and uncertainty about responsibility for documenting these elements. This paper will present some of the practical aspects of documenting the systematic literature search. Section 1 provides background information and rationale for this paper. Section 2 discusses issues and recommendations arising from survey results. Section 3 outlines specific elements to be recorded. Section 4 guides the reader through the information management process. Section 5 concludes with implications for future research and practice. These principles are applicable to any large literature search for systematic reviews, health technology assessments, and guideline development.
Affiliation(s)
- Tamara Rader
- Cochrane Musculoskeletal Group, University of Ottawa, Ottawa, Canada
- Mala Mann
- Support Unit for Research Evidence, Cardiff University, Cardiff, Wales
- Claire Stansfield
- Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre), Social Science Research Unit, Institute of Education, University of London, London, UK
- Chris Cooper
- Peninsula Technology Assessment Group (PenTAG), Peninsula College of Medicine and Dentistry, University of Exeter, Exeter, UK
- Margaret Sampson
- Children's Hospital of Eastern Ontario Research Institute, Ottawa, Canada
|
48
|
Stovold E, Beecher D, Foxlee R, Noel-Storr A. Study flow diagrams in Cochrane systematic review updates: an adapted PRISMA flow diagram. Syst Rev 2014; 3:54. [PMID: 24886533 PMCID: PMC4046496 DOI: 10.1186/2046-4053-3-54] [Citation(s) in RCA: 175] [Impact Index Per Article: 17.5] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/27/2014] [Accepted: 05/21/2014] [Indexed: 11/13/2022] Open
Abstract
Cochrane systematic reviews are conducted and reported according to rigorous standards. A study flow diagram must be included in a new review, and there is clear guidance from the PRISMA statement on how to do this. However, for a review update, there is currently no guidance on how study flow diagrams should be presented. To address this, a working group was formed to find a solution and produce guidance on how to use these diagrams in review updates. A number of different options were devised for how these flow diagrams could be used in review updates, and also in cases where multiple searches for a review or review update have been conducted. These options were circulated to the Cochrane information specialist community for consultation and feedback. Following the consultation period, the working group refined the guidance and made the recommendation that for review updates an adapted PRISMA flow diagram should be used, which includes an additional box with the number of previously included studies feeding into the total. Where multiple searches have been conducted, the results should be added together and treated as one set of results. There is no existing guidance for using study flow diagrams in review updates. Our adapted diagram is a simple and pragmatic solution for showing the flow of studies in review updates.
Affiliation(s)
- Elizabeth Stovold
- Cochrane Airways Group, St George's, University of London, Cranmer Terrace, Tooting, London SW19 2HG, UK.
|
49
|
Grant S, Montgomery P, Hopewell S, Macdonald G, Moher D, Mayo-Wilson E. Developing a Reporting Guideline for Social and Psychological Intervention Trials. Res Soc Work Pract 2013; 23:595-602. [PMID: 25076832 PMCID: PMC4108299 DOI: 10.1177/1049731513498118] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Social and psychological interventions are often complex. Understanding randomized controlled trials (RCTs) of these complex interventions requires a detailed description of the interventions tested and the methods used to evaluate them; however, RCT reports often omit, or inadequately report, this information. Incomplete and inaccurate reporting hinders the optimal use of research, wastes resources, and fails to meet ethical obligations to research participants and consumers. In this article, we explain how reporting guidelines have improved the quality of reports in medicine and describe the ongoing development of a new reporting guideline for RCTs: Consolidated Standards of Reporting Trials-SPI (an extension for social and psychological interventions). We invite readers to participate in the project by visiting our website, in order to help us reach the best-informed consensus on these guidelines (http://tinyurl.com/CONSORT-study).
Affiliation(s)
- Sean Grant
- Centre for Evidence-Based Intervention, University of Oxford, Oxford, UK
- Paul Montgomery
- Centre for Evidence-Based Intervention, University of Oxford, Oxford, UK
- Sally Hopewell
- Centre for Statistics in Medicine, University of Oxford, Oxford, UK
- David Moher
- Clinical Epidemiology Program, Ottawa Hospital Research Institute, Centre for Practice-Changing Research (CPCR), The Ottawa Hospital, Ottawa, ON, Canada
- Evan Mayo-Wilson
- Research Department of Clinical, Educational & Health Psychology, Centre for Outcomes Research and Effectiveness, University College London, London, UK
|
50
|
Mayo-Wilson E, Grant S, Hopewell S, Macdonald G, Moher D, Montgomery P. Developing a reporting guideline for social and psychological intervention trials. Trials 2013; 14:242. [PMID: 23915044 PMCID: PMC3734002 DOI: 10.1186/1745-6215-14-242] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2012] [Accepted: 07/22/2013] [Indexed: 11/29/2022] Open
Abstract
Social and psychological interventions are often complex. Understanding randomised controlled trials (RCTs) of these complex interventions requires a detailed description of the interventions tested and the methods used to evaluate them; however, RCT reports often omit, or inadequately report, this information. Incomplete and inaccurate reporting hinders the optimal use of research, wastes resources, and fails to meet ethical obligations to research participants and consumers. In this paper, we explain how reporting guidelines have improved the quality of reports in medicine, and describe the ongoing development of a new reporting guideline for RCTs: CONSORT-SPI (an Extension for social and psychological interventions). We invite readers to participate in the project by visiting our website, in order to help us reach the best-informed consensus on these guidelines (http://tinyurl.com/CONSORT-study).
Affiliation(s)
- Evan Mayo-Wilson
- Centre for Outcomes Research and Effectiveness, Research Department of Clinical, Educational & Health Psychology, University College London, 1-19 Torrington Place, London WC1E 7HB, UK
- Sean Grant
- Centre for Evidence-Based Intervention, University of Oxford Barnett House, 32 Wellington Square, Oxford OX1 2ER, UK
- Sally Hopewell
- Centre for Statistics in Medicine, University of Oxford Wolfson College Annexe, Linton Road, Oxford OX2 6UD, UK
- Geraldine Macdonald
- Institute of Child Care Research, Queen’s University Belfast, 6 College Park, Belfast BT7 1LP, UK
- David Moher
- Clinical Epidemiology Program, Ottawa Hospital Research Institute Centre for Practice-Changing Research (CPCR), The Ottawa Hospital − General Campus, 501 Smyth Rd Room L1288, Ottawa, ON K1H 8L6, Canada
- Paul Montgomery
- Centre for Evidence-Based Intervention, University of Oxford Barnett House, 32 Wellington Square, Oxford OX1 2ER, UK
|