1
Schulte PJ, Goldberg JD, Oster RA, Ambrosius WT, Bonner LB, Cabral H, Carter RE, Chen Y, Desai M, Li D, Lindsell CJ, Pomann GM, Slade E, Tosteson TD, Yu F, Spratt H. Peer review of clinical and translational research manuscripts: Perspectives from statistical collaborators. J Clin Transl Sci 2024; 8:e20. [PMID: 38384899 PMCID: PMC10879991 DOI: 10.1017/cts.2023.707] [Received: 08/23/2023] [Revised: 11/29/2023] [Accepted: 12/19/2023]
Abstract
Research articles in the clinical and translational science literature commonly use quantitative data to inform evaluation of interventions, learn about the etiology of disease, or develop methods for diagnostic testing or risk prediction of future events. The peer review process must evaluate the methodology used therein, including use of quantitative statistical methods. In this manuscript, we provide guidance for peer reviewers tasked with assessing quantitative methodology, intended to complement guidelines and recommendations that exist for manuscript authors. We describe components of clinical and translational science research manuscripts that require assessment including study design and hypothesis evaluation, sampling and data acquisition, interventions (for studies that include an intervention), measurement of data, statistical analysis methods, presentation of the study results, and interpretation of the study results. For each component, we describe what reviewers should look for and assess; how reviewers should provide helpful comments for fixable errors or omissions; and how reviewers should communicate uncorrectable and irreparable errors. We then discuss the critical concepts of transparency and acceptance/revision guidelines when communicating with responsible journal editors.
Affiliation(s)
- Phillip J. Schulte
- Division of Clinical Trials and Biostatistics, Department of Quantitative Health Sciences, Mayo Clinic, Rochester, MN, USA
- Judith D. Goldberg
- Division of Biostatistics, Department of Population Health, New York University Grossman School of Medicine, New York, NY, USA
- Robert A. Oster
- Department of Medicine, Division of Preventive Medicine, University of Alabama at Birmingham, Birmingham, AL, USA
- Walter T. Ambrosius
- Department of Biostatistics and Data Science, Wake Forest University School of Medicine, Winston-Salem, NC, USA
- Lauren Balmert Bonner
- Department of Preventive Medicine, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
- Howard Cabral
- Department of Biostatistics, Boston University School of Public Health, Boston, MA, USA
- Rickey E. Carter
- Department of Quantitative Health Sciences, Mayo Clinic, Jacksonville, FL, USA
- Ye Chen
- Biostatistics, Epidemiology and Research Design (BERD), Tufts Clinical and Translational Science Institute (CTSI), Boston, MA, USA
- Manisha Desai
- Quantitative Sciences Unit, Departments of Medicine, Biomedical Data Science, and Epidemiology and Population Health, Stanford University, Stanford, CA, USA
- Dongmei Li
- Department of Clinical and Translational Research, Obstetrics and Gynecology and Public Health Sciences, University of Rochester Medical Center, Rochester, NY, USA
- Gina-Maria Pomann
- Department of Biostatistics and Bioinformatics, Duke University, Durham, NC, USA
- Emily Slade
- Department of Biostatistics, University of Kentucky, Lexington, KY, USA
- Tor D. Tosteson
- Department of Biomedical Data Science, Geisel School of Medicine, Dartmouth College, Hanover, NH, USA
- Fang Yu
- Department of Biostatistics, University of Nebraska Medical Center, Omaha, NE, USA
- Heidi Spratt
- Department of Biostatistics and Data Science, School of Public and Population Health, University of Texas Medical Branch, Galveston, TX, USA
2
Abstract
INTRODUCTION In recent years, the quality of statistical analyses in the medical literature has unfortunately been low; this also applies to COVID-19 research. METHODS The study included 2600 medical articles published between the beginning of 2020 and June 2021 in which the authors reported results related to COVID-19. RESULTS Of the analysed articles, 39% were correct in terms of the statistical analysis performed. CONCLUSIONS Greater emphasis should be placed on statistical review of submissions on the various aspects of COVID-19.
Affiliation(s)
- Michal Ordak
- Department of Pharmacodynamics, Centre for Preclinical Research and Technology (CePT), Medical University of Warsaw, Warsaw, Poland
3
Rebrova OYu, Fedyaeva VK, Aksenov VA. Towards Good Statistical Practice. CORSTAN Validated Questionnaire for Assessing the Correctness of Statistical Analysis in Medical Research. Probl Endokrinol (Mosk) 2021; 67:11-17. [PMID: 35018757 PMCID: PMC9112928 DOI: 10.14341/probl12797] [Received: 08/06/2021] [Accepted: 10/19/2021]
Abstract
BACKGROUND In evidence-based medicine, the research methodology is determined by the risks of systematic biases and incorrect data analysis. Minimizing both risks increases the internal validity of the study. There are numerous recommendations and guidelines for data analysis and reporting, but the international community has not yet developed a questionnaire for reviewers to assess the quality of statistical analysis. AIM To develop a tool for formalized assessment of the quality of statistical analysis presented in scientific medical publications. MATERIALS AND METHODS The questionnaire was developed based on the authors' decades of experience in statistical data analysis and in reviewing the statistical aspects of biomedical articles and dissertations. The SAMPL guidelines, ICH E9, and other guidelines were taken into account when developing the questionnaire. Internal validation of the questionnaire was based on an independent assessment by two experts of 20 randomly selected articles on randomized controlled trials (RCTs) from elibrary.ru, and further statistical analysis of the agreement of the experts' conclusions. RESULTS The CORSTAN (CORrect STatistical ANalysis) questionnaire was developed, which consists of two parts: the first part (10 questions) is intended for evaluating studies of any design, while the second (a further eight questions) is for additional assessment of RCTs. A stratification of the risk of incorrect statistical analysis is proposed. The evaluation of the questionnaire's internal validity showed substantial to almost-perfect agreement for each question and each article, both in the total score and in the risk level. CONCLUSION The use of the questionnaire will simplify and harmonize the statistical review of publications and manuscripts in various settings, such as scientific journals and dissertation boards. The questionnaire can also be helpful for authors when preparing manuscripts, and it should help improve the quality of publications and of the research itself. We plan to improve the questionnaire as we gain experience in its application.
Affiliation(s)
- O. Yu. Rebrova
- Pirogov Russian National Research Medical University;
National Medical Research Center of Endocrinology;
Interregional Public Organization "Society of Evidence-Based Medicine Specialists"
- V. K. Fedyaeva
- Interregional Public Organization "Society of Evidence-Based Medicine Specialists"
- V. A. Aksenov
- Interregional Public Organization "Society of Evidence-Based Medicine Specialists"
4
Nieminen P, Uribe SE. The Quality of Statistical Reporting and Data Presentation in Predatory Dental Journals Was Lower Than in Non-Predatory Journals. Entropy (Basel) 2021; 23:468. [PMID: 33923391 PMCID: PMC8071575 DOI: 10.3390/e23040468] [Received: 03/12/2021] [Revised: 04/05/2021] [Accepted: 04/14/2021]
Abstract
Proper peer review and quality of published articles are often regarded as signs of reliable scientific journals. The aim of this study was to compare whether the quality of statistical reporting and data presentation differs among articles published in 'predatory dental journals' and in other dental journals. We evaluated 50 articles published in 'predatory open access (OA) journals' and 100 clinical trials published in legitimate dental journals between 2019 and 2020. The quality of statistical reporting and data presentation of each paper was assessed on a scale from 0 (poor) to 10 (high). The mean (SD) quality score of the statistical reporting and data presentation was 2.5 (1.4) for the predatory OA journals, 4.8 (1.8) for the legitimate OA journals, and 5.6 (1.8) for the more visible dental journals. The mean values differed significantly (p < 0.001). The quality of statistical reporting of clinical studies published in predatory journals was found to be lower than in legitimate open access and highly cited journals. This difference in quality is a wake-up call to read study results critically. Poor statistical reporting points to generally lower quality in publications whose authors and journals are less likely to be critiqued through peer review.
Affiliation(s)
- Pentti Nieminen
- Medical Informatics and Data Analysis Research Group, University of Oulu, 90014 Oulu, Finland
- Sergio E. Uribe
- Department of Conservative Dentistry and Oral Health, Riga Stradins University, LV-1007 Riga, Latvia
- School of Dentistry, Universidad Austral de Chile, Rudloff, Valdivia 1640, Chile
- Baltic Biomaterials Centre of Excellence, Headquarters at Riga Technical University, LV-1658 Riga, Latvia
5
Dijkers MP. A guide to peer reviewing for Spinal Cord. Spinal Cord 2021; 59:571-581. [PMID: 33828248 DOI: 10.1038/s41393-021-00627-3] [Received: 09/17/2020] [Revised: 12/27/2020] [Accepted: 12/29/2020]
Abstract
Peer reviewing is a key mechanism underlying science publishing, but during their graduate training clinicians and researchers are unlikely to be taught the skill. This paper sets forth the art of peer reviewing in general, and the types of reviews that are most useful to the Editors of Spinal Cord (SC). The topics addressed are: the SC editorial process; the role of the referee; review process steps; the content and language of a review; and resources available to peer reviewers.
6
Nüst D, Eglen SJ. CODECHECK: an Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility. F1000Res 2021; 10:253. [PMID: 34367614 PMCID: PMC8311796 DOI: 10.12688/f1000research.51738.2] [Accepted: 06/15/2021]
Abstract
The traditional scientific paper falls short of effectively communicating computational research. To help improve this situation, we propose a system by which the computational workflows underlying research articles are checked. The CODECHECK system uses open infrastructure and tools and can be integrated into review and publication processes in multiple ways. We describe these integrations along multiple dimensions (importance, who, openness, when). In collaboration with academic publishers and conferences, we demonstrate CODECHECK with 25 reproductions of diverse scientific publications. These CODECHECKs show that asking for reproducible workflows during a collaborative review can effectively improve executability. While CODECHECK has clear limitations, it may represent a building block in Open Science and publishing ecosystems for improving the reproducibility, appreciation, and, potentially, the quality of non-textual research artefacts. The CODECHECK website can be accessed here: https://codecheck.org.uk/.
Affiliation(s)
- Daniel Nüst
- Institute for Geoinformatics, University of Münster, Münster, Germany
- Stephen J. Eglen
- Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge, UK
7
Nüst D, Eglen SJ. CODECHECK: an Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility. F1000Res 2021; 10:253. [PMID: 34367614 PMCID: PMC8311796 DOI: 10.12688/f1000research.51738.1] [Accepted: 03/22/2021]
8
Ellsworth PC. Truth and Advocacy: Reducing Bias in Policy-Related Research. Perspect Psychol Sci 2021; 16:1226-1241. [PMID: 33593149 DOI: 10.1177/1745691620959832]
Abstract
Critics have suggested that psychological research is characterized by a pervasive liberal bias, and this problem may be particularly acute in research on issues related to public policy. In this article, I consider the sources of bias in basic and applied research in the evaluation, conduct, and communication of research. Techniques are suggested for counteracting bias at each of these stages.
9
Grossetta Nardini HK, Batten J, Funaro MC, Garcia-Milian R, Nyhan K, Spak JM, Wang L, Glover JG. Librarians as methodological peer reviewers for systematic reviews: results of an online survey. Res Integr Peer Rev 2019; 4:23. [PMID: 31798974 PMCID: PMC6882225 DOI: 10.1186/s41073-019-0083-5] [Received: 04/19/2019] [Accepted: 10/10/2019]
Abstract
BACKGROUND Developing a comprehensive, reproducible literature search is the basis for a high-quality systematic review (SR). Librarians and information professionals, as expert searchers, can improve the quality of systematic review searches, methodology, and reporting. Likewise, journal editors and authors often seek to improve the quality of published SRs and other evidence syntheses through peer review. Health sciences librarians contribute to systematic review production but little is known about their involvement in peer reviewing SR manuscripts. METHODS This survey aimed to assess how frequently librarians are asked to peer review systematic review manuscripts and to determine characteristics associated with those invited to review. The survey was distributed to a purposive sample through three health sciences information professional listservs. RESULTS There were 291 complete survey responses. Results indicated that 22% (n = 63) of respondents had been asked by journal editors to peer review systematic review or meta-analysis manuscripts. Of the 78% (n = 228) of respondents who had not already been asked, 54% (n = 122) would peer review, and 41% (n = 93) might peer review. Only 4% (n = 9) would not review a manuscript. Respondents had peer reviewed manuscripts for 38 unique journals and believed they were asked because of their professional expertise. Of respondents who had declined to peer review (32%, n = 20), the most common explanation was "not enough time" (60%, n = 12) followed by "lack of expertise" (50%, n = 10). The vast majority of respondents (95%, n = 40) had "rejected or recommended a revision of a manuscript" after peer review. They based their decision on the "search methodology" (57%, n = 36), "search write-up" (46%, n = 29), or "entire article" (54%, n = 34). Those who selected "other" (37%, n = 23) listed a variety of reasons for rejection, including problems or errors in the PRISMA flow diagram; tables of included, excluded, and ongoing studies; data extraction; reporting; and pooling methods. CONCLUSIONS Despite being experts in conducting literature searches and supporting SR teams through the review process, few librarians have been asked to review SR manuscripts, or even just the search strategies, yet many are willing to provide this service. Editors should involve experienced librarians in peer review, and we suggest some strategies to consider.
Affiliation(s)
- Holly K. Grossetta Nardini
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Janene Batten
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Melissa C. Funaro
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Rolando Garcia-Milian
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Kate Nyhan
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Judy M. Spak
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Lei Wang
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
- Janis G. Glover
- Harvey Cushing/John Hay Whitney Medical Library, Yale University, 333 Cedar Street, New Haven, CT 06520-8014 USA
10
García Garmendia J, Maroto Monserrat F. Interpretación de resultados estadísticos [Interpretation of statistical results]. Med Intensiva 2018; 42:370-9. [DOI: 10.1016/j.medin.2017.12.013] [Received: 11/08/2017] [Revised: 12/18/2017] [Accepted: 12/25/2017]
11
Nieminen P, Virtanen JI, Vähänikkilä H. An instrument to assess the statistical intensity of medical research papers. PLoS One 2017; 12:e0186882. [PMID: 29053734 PMCID: PMC5650171 DOI: 10.1371/journal.pone.0186882] [Received: 03/20/2017] [Accepted: 10/09/2017]
Abstract
Background There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity of research articles in a standardized way. Methods A checklist-type measurement scale was developed by selecting and refining items from previous reports about the statistical content of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles published between 2007 and 2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity across sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. Results The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 across all four raters. Individual item analysis showed very high agreement between the rater pairs; the percentage agreement ranged from 91.7% to 95.2%. Conclusions A reliable and applicable instrument for evaluating the statistical intensity of research papers was developed. It is a helpful tool for comparing statistical intensity across sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.
Affiliation(s)
- Pentti Nieminen
- Medical Informatics and Statistics Research Group, University of Oulu, Oulu, Finland
- Jorma I. Virtanen
- Research Unit of Oral Health Sciences, Faculty of Medicine, University of Oulu, Oulu, Finland
- Medical Research Center, Oulu University Hospital, Oulu, Finland
- Hannu Vähänikkilä
- Research Unit of Oral Health Sciences, Faculty of Medicine, University of Oulu, Oulu, Finland
- Medical Research Center, Oulu University Hospital, Oulu, Finland
12
Abstract
This editorial introduces a series of tutorials by experts, who provide tips and advice for junior reviewers on how to conduct peer review based on specific study designs. The aim of these articles is to provide an easy-to-use, quick reference for those who are seeking more guidance on how to peer review biomedical research papers. Unlike previous tips and guides on peer review, this series is the first to provide advice from experts for those in their specific fields.
Affiliation(s)
- Sabina Alam
- BioMed Central Ltd, 236 Gray's Inn Road, London, WC1X 8HB, UK.
- Jigisha Patel
- BioMed Central Ltd, 236 Gray's Inn Road, London, WC1X 8HB, UK.