1
Stafford T, Rombach I, Hind D, Mateen B, Woods HB, Dimario M, Wilsdon J. Where next for partial randomisation of research funding? The feasibility of RCTs and alternatives. Wellcome Open Res 2024; 8:309. PMID: 37663796; PMCID: PMC10474338; DOI: 10.12688/wellcomeopenres.19565.1.
Abstract
We outline essential considerations for any study of partial randomisation of research funding, and consider scenarios in which randomised controlled trials (RCTs) would be feasible and appropriate. We highlight the interdependence of target outcomes, sample availability and statistical power for determining the cost and feasibility of a trial. For many choices of target outcome, RCTs may be less practical and more expensive than they at first appear (in large part due to issues pertaining to sample size and statistical power). As such, we briefly discuss alternatives to RCTs. It is worth noting that many of the considerations relevant to experiments on partial randomisation may also apply to other potential experiments on funding processes (as described in The Experimental Research Funder's Handbook. RoRI, June 2022).
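The interdependence of target outcome, sample availability and statistical power that the abstract highlights can be made concrete with a standard two-proportion sample-size calculation (a generic sketch, not taken from the paper; the 10% vs 15% outcome rates, alpha and power values are illustrative assumptions):

```python
import math
from statistics import NormalDist

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per arm for a two-proportion z-test
    (normal approximation), e.g. comparing the rate of some binary
    target outcome between lottery-allocated and panel-allocated grants."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a 10% vs 15% outcome rate needs roughly 700 grants per arm,
# more than many funding schemes could randomise.
print(n_per_arm(0.10, 0.15))
```

Smaller plausible effects or rarer outcomes push the required sample higher still, which is the paper's point about RCTs being less practical than they first appear.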
Affiliation(s)
- Tom Stafford
- The University of Sheffield, Sheffield, England, UK
- Ines Rombach
- The University of Sheffield, Sheffield, England, UK
- Dan Hind
- The University of Sheffield, Sheffield, England, UK
2
Dresler M. FENS-Kavli Network of Excellence: Postponed, non-competitive peer review for research funding. Eur J Neurosci 2023; 58:4441-4448. PMID: 36085597; DOI: 10.1111/ejn.15818.
Abstract
Receiving research grants is among the highlights of an academic career, affirming previous accomplishments and enabling new research endeavours. Much of the process of acquiring research funding, however, belongs to the less favourite duties of many researchers: It is time consuming, often stressful and, in the majority of cases, unsuccessful. This resentment towards funding acquisition is backed up by empirical research: The current system to distribute research funding, via competitive calls for extensive research applications that undergo peer review, has repeatedly been shown to fail in its task to reliably rank proposals according to their merit, while at the same time being highly inefficient. The simplest, fairest and broadly supported alternative would be to distribute funding more equally across researchers, for example, by an increase of universities' base funding, thereby saving considerable time that can be spent on research instead. Here, I propose how to combine such a 'funding flat rate' model-or other efficient distribution strategies-with quality control through postponed, non-competitive peer review using open science practices.
Affiliation(s)
- Martin Dresler
- Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Radboud University Medical Center, Nijmegen, The Netherlands
3
van Ravenzwaaij D, Bakker M, Heesen R, Romero F, van Dongen N, Crüwell S, Field SM, Held L, Munafò MR, Pittelkow MM, Tiokhin L, Traag VA, van den Akker OR, van 't Veer AE, Wagenmakers EJ. Perspectives on scientific error. R Soc Open Sci 2023; 10:230448. PMID: 37476516; PMCID: PMC10354464; DOI: 10.1098/rsos.230448.
Abstract
Theoretical arguments and empirical investigations indicate that a high proportion of published findings do not replicate and are likely false. The current position paper provides a broad perspective on scientific error, which may lead to replication failures. This broad perspective focuses on reform history and on opportunities for future reform. We organize our perspective along four main themes: institutional reform, methodological reform, statistical reform and publishing reform. For each theme, we illustrate potential errors by narrating the story of a fictional researcher during the research cycle. We discuss future opportunities for reform. The resulting agenda provides a resource to usher in an era that is marked by a research culture that is less error-prone and a scientific publication landscape with fewer spurious findings.
Affiliation(s)
- D. van Ravenzwaaij
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- M. Bakker
- Tilburg University, 5037 AB Tilburg, The Netherlands
- R. Heesen
- University of Western Australia, Perth, Western Australia 6009, Australia
- London School of Economics and Political Science, London WC2A 2AE, UK
- F. Romero
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- N. van Dongen
- University of Amsterdam, 1012 WP Amsterdam, The Netherlands
- S. Crüwell
- Department of History and Philosophy of Science, University of Cambridge, Cambridge CB2 1TN, UK
- S. M. Field
- Centre for Science and Technology Studies (CWTS), Leiden University, 2311 EZ Leiden, The Netherlands
- L. Held
- University of Zurich, 8006 Zürich, Switzerland
- M. R. Munafò
- School of Psychological Science, University of Bristol, Bristol BS8 1QU, UK
- M. M. Pittelkow
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité—Universitätsmedizin, 10178 Berlin, Germany
- L. Tiokhin
- IG&H Consulting, 3528 AC Utrecht, The Netherlands
- V. A. Traag
- Centre for Science and Technology Studies (CWTS), Leiden University, 2311 EZ Leiden, The Netherlands
- O. R. van den Akker
- Tilburg University, 5037 AB Tilburg, The Netherlands
- QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité—Universitätsmedizin, 10178 Berlin, Germany
- A. E. van 't Veer
- Methodology and Statistics Unit, Institute of Psychology, Leiden University, 2333 AK Leiden, The Netherlands
4
Luebber F, Krach S, Martinez Mateo M, Paulus FM, Rademacher L, Rahal RM, Specht J. Rethink funding by putting the lottery first. Nat Hum Behav 2023. PMID: 37349356; DOI: 10.1038/s41562-023-01649-y.
Affiliation(s)
- Finn Luebber
- Department of Psychiatry and Psychotherapy, University of Lübeck, Lübeck, Germany
- Department of Rheumatology and Clinical Immunology, University of Lübeck, Lübeck, Germany
- Open Science Initiative, University of Lübeck, Lübeck, Germany
- Sören Krach
- Department of Psychiatry and Psychotherapy, University of Lübeck, Lübeck, Germany
- Open Science Initiative, University of Lübeck, Lübeck, Germany
- Frieder M Paulus
- Department of Psychiatry and Psychotherapy, University of Lübeck, Lübeck, Germany
- Open Science Initiative, University of Lübeck, Lübeck, Germany
- Lena Rademacher
- Department of Psychiatry and Psychotherapy, University of Lübeck, Lübeck, Germany
- Open Science Initiative, University of Lübeck, Lübeck, Germany
- Rima-Maria Rahal
- Behavioral Law & Economics, Max Planck Institute for Research on Collective Goods, Bonn, Germany
- Jule Specht
- Department of Psychology, Humboldt Universität zu Berlin, Berlin, Germany
5
Dresler M, Buddeberg E, Endesfelder U, Haaker J, Hof C, Kretschmer R, Pflüger D, Schmidt F. Effective or predatory funding? Evaluating the hidden costs of grant applications. Immunol Cell Biol 2023; 101:104-111. PMID: 36214095; DOI: 10.1111/imcb.12592.
Abstract
Researchers are spending an increasing fraction of their time on applying for funding; however, the current funding system has considerable deficiencies in reliably evaluating the merit of research proposals, despite extensive efforts on the sides of applicants, grant reviewers and decision committees. For some funding schemes, the systemic costs of the application process as a whole can even outweigh the granted resources-a phenomenon that could be considered as predatory funding. We present five recommendations to remedy this unsatisfactory situation.
Affiliation(s)
- Martin Dresler
- Radboud University Medical Center, Donders Institute for Brain, Cognition and Behavior, Nijmegen, The Netherlands
- Jan Haaker
- University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Christian Hof
- Terrestrial Ecology Research Group, TUM School of Life Sciences, Technical University of Munich, Freising, Germany
- Fabian Schmidt
- Max Planck Institute for Astrophysics, Garching, Germany
6
Urai AE, Kelly C. Rethinking academia in a time of climate crisis. eLife 2023; 12:e84991. PMID: 36748915; PMCID: PMC9904754; DOI: 10.7554/elife.84991.
Abstract
Addressing the climate crisis requires radical and urgent action at all levels of society. Universities are ideally positioned to lead such action but are largely failing to do so. At the same time, many academic scientists find their work impeded by bureaucracy, excessive competitiveness, and a loss of academic freedom. Here, drawing on the framework of "Doughnut Economics," developed by Kate Raworth, we suggest seven new principles for rethinking the norms of scientific practice. Based on these, we propose a call to action, and encourage academics to take concrete steps towards the creation of a flourishing scientific enterprise that is fit for the challenges of the 21st century.
7
Dienes Z. The credibility crisis and democratic governance: how to reform university governance to be compatible with the nature of science. R Soc Open Sci 2023; 10:220808. PMID: 36704257; PMCID: PMC9874275; DOI: 10.1098/rsos.220808.
Abstract
To address the credibility crisis facing many disciplines, change is needed at the institutional level. Science will only function optimally if the culture by which it is governed becomes aligned with the way of thinking required in science itself. The paper suggests a series of graduated reforms to university governance, to radically reform the operation of universities. The reforms are based on existing established open democratic practices. The aim is for universities to become consistent with the flourishing of science and research more generally.
Affiliation(s)
- Zoltan Dienes
- School of Psychology, University of Sussex, Brighton, UK
8
Shaw J. Peer review in funding-by-lottery: A systematic overview and expansion. Res Eval 2022. DOI: 10.1093/reseval/rvac022.
Abstract
Despite the surging interest in introducing lottery mechanisms into decision-making procedures for science funding bodies, the discourse on funding-by-lottery remains underdeveloped and, at times, misleading. Funding-by-lottery is sometimes presented as if it were a single mechanism when, in reality, there are many funding-by-lottery mechanisms with important distinguishing features. Moreover, funding-by-lottery is sometimes portrayed as an alternative to traditional methods of peer review when peer review is still used within funding-by-lottery approaches. This obscures a proper analysis of the (hypothetical and actual) variants of funding-by-lottery and important differences amongst them. The goal of this article is to provide a preliminary taxonomy of funding-by-lottery variants and evaluate how the existing evidence on peer review might lend differentiated support for variants of funding-by-lottery. Moreover, I point to gaps in the literature on peer review that must be addressed in future research. I conclude by building off of the work of Avin in moving toward a more holistic evaluation of funding-by-lottery. Specifically, I consider implications funding-by-lottery variants may have regarding trust and social responsibility.
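To illustrate why "funding-by-lottery" is not a single mechanism, one hypothetical variant keeps peer review as a triage step and randomises only within the middle scoring band where review scores are least reliable. The cutoffs and function below are invented for illustration, not drawn from the article:

```python
import random

def partial_lottery(scores, budget, fund_cutoff=0.8, reject_cutoff=0.4, seed=None):
    """One hypothetical funding-by-lottery variant (illustrative only).

    scores: dict mapping proposal id -> mean review score in [0, 1]
    budget: total number of proposals that can be funded
    Proposals at or above fund_cutoff are funded outright; proposals
    below reject_cutoff are rejected; the middle band enters a lottery
    for any budget that remains.
    """
    rng = random.Random(seed)
    top = [p for p, s in scores.items() if s >= fund_cutoff]
    middle = [p for p, s in scores.items() if reject_cutoff <= s < fund_cutoff]
    funded = top[:budget]
    remaining = budget - len(funded)
    if remaining > 0 and middle:
        funded += rng.sample(middle, min(remaining, len(middle)))  # lottery tier
    return funded
```

Variants differ in, for example, whether an outright-fund tier exists at all and where the cutoffs sit; the sketch above is one possibility among the many the article taxonomises.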
Affiliation(s)
- Jamie Shaw
- Institut für Philosophie, Leibniz Universität Hannover, Hannover, Germany
9
Schiavone SR, Vazire S. Reckoning With Our Crisis: An Agenda for the Field of Social and Personality Psychology. Perspect Psychol Sci 2022; 18:710-722. PMID: 36301777; DOI: 10.1177/17456916221101060.
Abstract
The replication crisis and credibility revolution in the 2010s brought a wave of doubts about the credibility of social and personality psychology. We argue that as a field, we must reckon with the concerns brought to light during this critical decade. How the field responds to this crisis will reveal our commitment to self-correction. If we do not take the steps necessary to address our problems and simply declare the crisis to be over or the problems to be fixed without evidence, we risk further undermining our credibility. To fully reckon with this crisis, we must empirically assess the state of the field to take stock of how credible our science actually is and whether it is improving. We propose an agenda for metascientific research, and we review approaches to empirically evaluate and track where we are as a field (e.g., analyzing the published literature, surveying researchers). We describe one such project (Surveying the Past and Present State of Published Studies in Social and Personality Psychology) underway in our research group. Empirical evidence about the state of our field is necessary if we are to take self-correction seriously and if we hope to avert future crises.
Affiliation(s)
- Simine Vazire
- Melbourne School of Psychological Sciences, University of Melbourne
10
Forscher PS, Wagenmakers EJ, Coles NA, Silan MA, Dutra N, Basnight-Brown D, IJzerman H. The Benefits, Barriers, and Risks of Big-Team Science. Perspect Psychol Sci 2022; 18:607-623. PMID: 36190899; DOI: 10.1177/17456916221082970.
Abstract
Progress in psychology has been frustrated by challenges concerning replicability, generalizability, strategy selection, inferential reproducibility, and computational reproducibility. Although often discussed separately, these five challenges may share a common cause: insufficient investment of intellectual and nonintellectual resources into the typical psychology study. We suggest that the emerging emphasis on big-team science can help address these challenges by allowing researchers to pool their resources together to increase the amount available for a single study. However, the current incentives, infrastructure, and institutions in academic science have all developed under the assumption that science is conducted by solo principal investigators and their dependent trainees, an assumption that creates barriers to sustainable big-team science. We also anticipate that big-team science carries unique risks, such as the potential for big-team-science organizations to be co-opted by unaccountable leaders, become overly conservative, and make mistakes at a grand scale. Big-team-science organizations must also acquire personnel who are properly compensated and have clear roles. Not doing so raises risks related to mismanagement and a lack of financial sustainability. If researchers can manage its unique barriers and risks, big-team science has the potential to spur great progress in psychology and beyond.
Affiliation(s)
- Patrick S Forscher
- Research and Innovation Division, Busara Center for Behavioral Economics, Nairobi, Kenya
- Laboratoire Interuniversitaire de Psychologie, Université Grenoble Alpes
- Nicholas A Coles
- Center for the Study of Language and Information, Stanford University
- Miguel Alejandro Silan
- Unité de recherche Développement Individu Processus Handicap Éducation, Université Lumière Lyon 2
- Annecy Behavioral Science Lab, Menthon-Saint-Bernard, France
- Social and Political Laboratory, Psychology Department, University of the Philippines Diliman
- Natália Dutra
- Núcleo de Teoria e Pesquisa do Comportamento, Universidade Federal do Pará
- Hans IJzerman
- Laboratoire Interuniversitaire de Psychologie, Université Grenoble Alpes
- Institut Universitaire de France
11
Kowalczyk OS, Lautarescu A, Blok E, Dall'Aglio L, Westwood SJ. What senior academics can do to support reproducible and open research: a short, three-step guide. BMC Res Notes 2022; 15:116. PMID: 35317865; PMCID: PMC8938725; DOI: 10.1186/s13104-022-05999-0.
Abstract
Increasingly, policies are being introduced to reward and recognise open research practices, while the adoption of such practices into research routines is being facilitated by many grassroots initiatives. However, despite this widespread endorsement and support, as well as various efforts led by early career researchers, open research is yet to be widely adopted. For open research to become the norm, initiatives should engage academics from all career stages, particularly senior academics (namely senior lecturers, readers, professors) given their routine involvement in determining the quality of research. Senior academics, however, face unique challenges in implementing policy changes and supporting grassroots initiatives. Given that—like all researchers—senior academics are motivated by self-interest, this paper lays out three feasible steps that senior academics can take to improve the quality and productivity of their research, that also serve to engender open research. These steps include changing (a) hiring criteria, (b) how scholarly outputs are credited, and (c) how we fund and publish in line with open research principles. The guidance we provide is accompanied by material for further reading.
Affiliation(s)
- Olivia S Kowalczyk
- Department of Neuroimaging, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Alexandra Lautarescu
- Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology, Neuroscience, King's College London, London, UK
- Department of Perinatal Imaging and Health, Centre for the Developing Brain, School of Biomedical Imaging and Medical Sciences, King's College London, London, UK
- Elisabet Blok
- Department of Child and Adolescent Psychiatry/Psychology, Erasmus MC-Sophia Children's Hospital, University Medical Centre Rotterdam, Rotterdam, The Netherlands
- The Generation R Study Group, Erasmus MC, University Medical Centre Rotterdam, Rotterdam, The Netherlands
- Lorenza Dall'Aglio
- Department of Child and Adolescent Psychiatry/Psychology, Erasmus MC-Sophia Children's Hospital, University Medical Centre Rotterdam, Rotterdam, The Netherlands
- The Generation R Study Group, Erasmus MC, University Medical Centre Rotterdam, Rotterdam, The Netherlands
- Samuel J Westwood
- Institute of Psychiatry, Psychology, Neuroscience, King's College London, London, UK
- Department of Psychology, School of Social Science, University of Westminster, 115 New Cavendish Street, London, W1W 6UW, UK
12
Recio-Saucedo A, Crane K, Meadmore K, Fackrell K, Church H, Fraser S, Blatch-Jones A. What works for peer review and decision-making in research funding: a realist synthesis. Res Integr Peer Rev 2022; 7:2. PMID: 35246264; PMCID: PMC8894828; DOI: 10.1186/s41073-022-00120-2.
Abstract
Introduction: Allocation of research funds relies on peer review to support funding decisions, and these processes can be susceptible to biases and inefficiencies. The aim of this work was to determine which past interventions to peer review and decision-making have worked to improve research funding practices, how they worked, and for whom.
Methods: Realist synthesis of peer-reviewed publications and grey literature reporting interventions in peer review for research funding.
Results: We analysed 96 publications and 36 website sources. Sixty publications enabled us to extract stakeholder-specific context-mechanism-outcome configurations (CMOCs) for 50 interventions, which formed the basis of our synthesis. Shorter applications, reviewer and applicant training, virtual funding panels, enhanced decision models, institutional submission quotas, and applicant training in peer review and grant-writing reduced interrater variability, increased the relevance of funded research, reduced the time taken to write and review applications, promoted increased investment into innovation, and lowered the cost of panels.
Conclusions: Reports of 50 interventions in different areas of peer review provide useful guidance on ways of solving common issues with the peer review process. Evidence of the broader impact of these interventions on the research ecosystem is still needed, and future research should aim to identify processes that consistently work to improve peer review across funders and research contexts.
Supplementary information: The online version contains supplementary material available at 10.1186/s41073-022-00120-2.
Affiliation(s)
- Alejandra Recio-Saucedo
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Ksenia Crane
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Katie Meadmore
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Kathryn Fackrell
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Hazel Church
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Simon Fraser
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- School of Primary Care, Population Sciences and Medical Education, Faculty of Medicine, University of Southampton, Southampton, SO17 1BJ, UK
- Amanda Blatch-Jones
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
13
The interplay of the size of the research system, ways of collaboration, level, and method of funding in determining bibliometric outputs. Scientometrics 2022. DOI: 10.1007/s11192-021-04232-2.
14
Braganza O. Proxyeconomics, a theory and model of proxy-based competition and cultural evolution. R Soc Open Sci 2022; 9:211030. PMID: 35223051; PMCID: PMC8864350; DOI: 10.1098/rsos.211030.
Abstract
Competitive societal systems by necessity rely on imperfect proxy measures. For instance, profit is used to measure value to consumers, patient volumes to measure hospital performance, or the journal impact factor to measure scientific value. While there are numerous reasons why proxies will deviate from the underlying societal goals, they will nevertheless determine the selection of cultural practices and guide individual decisions. These considerations suggest that the study of proxy-based competition requires the integration of cultural evolution theory and economics or decision theory. Here, we attempt such an integration in two ways. First, we describe an agent-based simulation model, combining methods and insights from these disciplines. The model suggests that an individual intrinsic incentive can constrain a cultural evolutionary pressure, which would otherwise enforce fully proxy-oriented practices. The emergent outcome is distinct from that with either the isolated economic or evolutionary mechanism. It reflects what we term lock-in, where competitive pressure can undermine the ability of agents to pursue the shared social goal. Second, we elaborate the broader context, outlining the system-theoretic foundations as well as some philosophical and practical implications, towards a broader theory. Overall, we suggest such a theory may offer an explanatory and predictive framework for diverse subjects, ranging from scientific replicability to climate inaction, and outlining strategies for diagnosis and mitigation.
Affiliation(s)
- Oliver Braganza
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany
- Center for Science and Thought, University of Bonn, Bonn, Germany
15
Roberts RG. The first six years of meta-research at PLOS Biology. PLoS Biol 2022; 20:e3001553. PMID: 35100252; PMCID: PMC8830785; DOI: 10.1371/journal.pbio.3001553.
Affiliation(s)
- Roland G. Roberts
- Public Library of Science, San Francisco, California, United States of America and Cambridge, United Kingdom
17
Brette R. [A critique of the managerial model of research]. Med Sci (Paris) 2022; 38:84-88. PMID: 35060892; DOI: 10.1051/medsci/2021247.
Abstract
To be a scientist is to make an implicit ethical commitment: to try to tell the truth about the world. The managerial model of research, which is the ideological foundation of modern political reforms of the research system around the world, stands in direct conflict with this assertion. It consists in identifying the scientist with a homo economicus looking to maximize its own profit, which a bureaucracy is tasked to align with performance objectives. This model is incoherent and destructive. Science is made possible by curiosity, emulation and intellectual ethics. These are the human traits that a rational research organization should try to favor and exploit.
Affiliation(s)
- Romain Brette
- Sorbonne Université, UPMC Univ Paris 6, Inserm, CNRS, Institut de la vision, 17 rue Moreau, 75012 Paris, France
18
OUP accepted manuscript. Res Eval 2022. DOI: 10.1093/reseval/rvac007.
19
Bekele A, Chu K, D'Ambruoso L, Davies JI, Ferriolli E, Greig C, Manaseki-Holland S, Regnier D, Siddiqi S. Global health research funding applications: brain drain under another name? Lancet Glob Health 2022; 10:e22-e23. PMID: 34919848; DOI: 10.1016/s2214-109x(21)00505-2.
Affiliation(s)
- Abebe Bekele
- University of Global Health Equity, Butaro, Rwanda
- Kathryn Chu
- Centre for Global Surgery, Department of Global Health, Stellenbosch University, Cape Town, South Africa
- Lucia D'Ambruoso
- Aberdeen Centre for Health Data Science, Institute of Applied Health Sciences, School of Medicine, University of Aberdeen, Aberdeen, UK
- Medical Research Council/Wits University Rural Public Health and Health Transitions Research Unit, Faculty of Health Sciences, School of Public Health, University of the Witwatersrand, Johannesburg, South Africa
- Justine I Davies
- Centre for Global Surgery, Department of Global Health, Stellenbosch University, Cape Town, South Africa
- Medical Research Council/Wits University Rural Public Health and Health Transitions Research Unit, Faculty of Health Sciences, School of Public Health, University of the Witwatersrand, Johannesburg, South Africa
- Institute of Applied Health Sciences, University of Birmingham, Birmingham, UK
- Eduardo Ferriolli
- Ribeirão Preto School of Medicine (FMRP), University of Sao Paulo, Sao Paulo, Brazil
- Carolyn Greig
- School of Sport, Exercise, and Rehabilitation Sciences, University of Birmingham, Birmingham, UK
- Sameen Siddiqi
- Department of Community Health Sciences, Aga Khan University, Karachi, Pakistan
20
Abstract
Scientists in some fields are concerned that many published results are false. Recent models predict selection for false positives as the inevitable result of pressure to publish, even when scientists are penalized for publications that fail to replicate. We model the cultural evolution of research practices when laboratories are allowed to expend effort on theory, enabling them, at a cost, to identify hypotheses that are more likely to be true, before empirical testing. Theory can restore high effort in research practice and suppress false positives to a technical minimum, even without replication. The mere ability to choose between two sets of hypotheses, one with greater prior chance of being correct, promotes better science than can be achieved with effortless access to the set of stronger hypotheses. Combining theory and replication can have synergistic effects. On the basis of our analysis, we propose four simple recommendations to promote good science.
Collapse
|
21
|
Tiokhin L, Yan M, Morgan TJH. Competition for priority harms the reliability of science, but reforms can help. Nat Hum Behav 2021; 5:857-867. [PMID: 33510392 DOI: 10.1038/s41562-020-01040-1] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2020] [Accepted: 12/18/2020] [Indexed: 01/30/2023]
Abstract
Incentives for priority of discovery are hypothesized to harm scientific reliability. Here, we evaluate this hypothesis by developing an evolutionary agent-based model of a competitive scientific process. We find that rewarding priority of discovery causes populations to culturally evolve towards conducting research with smaller samples. This reduces research reliability and the information value of the average study. Increased start-up costs for setting up single studies and increased payoffs for secondary results (also known as scoop protection) attenuate the negative effects of competition. Furthermore, large rewards for negative results promote the evolution of smaller sample sizes. Our results confirm the logical coherence of scoop protection reforms at several journals. Our results also imply that reforms to increase scientific efficiency, such as rapid journal turnaround times, may produce collateral damage by incentivizing lower-quality research; in contrast, reforms that increase start-up costs, such as pre-registration and registered reports, may generate incentives for higher-quality research.
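The competitive dynamic described here can be caricatured in a few lines (a toy illustration with invented numbers, not the paper's agent-based model): if completion time is proportional to sample size, smaller studies win the priority race more often, and scoop protection narrows the payoff gap between small and large studies.

```python
def expected_reward(n_self, n_other, scoop_protection):
    # Toy priority race: completion time is proportional to sample size,
    # so the lab with the smaller sample publishes first more often.
    p_first = n_other / (n_self + n_other)
    # The scooped lab still collects a fraction of the reward.
    return p_first + (1 - p_first) * scoop_protection

# Without scoop protection, small studies dominate the race for priority.
gap_no_protection = expected_reward(10, 50, 0.0) - expected_reward(50, 10, 0.0)
# Rewarding secondary results narrows that gap, easing pressure toward small n.
gap_with_protection = expected_reward(10, 50, 0.8) - expected_reward(50, 10, 0.8)
```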
Collapse
Affiliation(s)
- Leonid Tiokhin
- Human-Technology Interaction Group, Eindhoven University of Technology, Eindhoven, the Netherlands.
| | - Minhua Yan
- School of Human Evolution and Social Change, Arizona State University, Tempe, AZ, USA.,Institute of Human Origins, Arizona State University, Tempe, AZ, USA
| | - Thomas J H Morgan
- School of Human Evolution and Social Change, Arizona State University, Tempe, AZ, USA.,Institute of Human Origins, Arizona State University, Tempe, AZ, USA
| |
Collapse
|
22
|
De Peuter S, Conix S. The modified lottery: Formalizing the intrinsic randomness of research funding. Account Res 2021; 29:324-345. [PMID: 33970719 DOI: 10.1080/08989621.2021.1927727] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Abstract
Competition for research funds has become hypercompetitive over the past decade. Commonly, a system of peer review is used to determine which proposals receive funding; it is widely accepted, easily understood, and broadly trusted among researchers. It is often considered the best system in use, but it suffers from important shortcomings, and adaptations to overcome them have small and often short-lived effects. Hence, the preference for peer review does not mean it necessarily outperforms all other systems. In fact, it is time for an open discussion about alternative allocation mechanisms. Random allocation of research funding may be a viable alternative to the current peer review system. In particular, the "organized randomness" of a modified lottery is interesting, combining the benefits of randomization with some of the most valuable aspects of peer review. Still, many questions remain, and this is certainly not a plea to allocate all research funds by lottery without further research. But we need to be prepared to consider alternatives, even imperfect ones, and modified lotteries should be part of the solution.
Collapse
Affiliation(s)
- Steven De Peuter
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
| | - S Conix
- Centre for Logic and Philosophy of Science, Institute of Philosophy, KU Leuven, Leuven, Belgium
| |
Collapse
|
23
|
Balietti S, Riedl C. Incentives, competition, and inequality in markets for creative production. RESEARCH POLICY 2021. [DOI: 10.1016/j.respol.2021.104212] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
|
24
|
Triggle CR, MacDonald R, Triggle DJ, Grierson D. Requiem for impact factors and high publication charges. Account Res 2021; 29:133-164. [PMID: 33787413 DOI: 10.1080/08989621.2021.1909481] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/08/2023]
Abstract
Journal impact factors, publication charges and assessment of quality and accuracy of scientific research are critical for researchers, managers, funders, policy makers, and society. Editors and publishers compete for impact factor rankings, to demonstrate how important their journals are, and researchers strive to publish in perceived top journals, despite high publication and access charges. This raises questions of how top journals are identified, whether assessments of impacts are accurate and whether high publication charges borne by the research community are justified, bearing in mind that they also collectively provide free peer-review to the publishers. Although traditional journals accelerated peer review and publication during the COVID-19 pandemic, preprint servers made a greater impact with over 30,000 open access articles becoming available and accelerating a trend already seen in other fields of research. We review and comment on the advantages and disadvantages of a range of assessment methods and the way in which they are used by researchers, managers, employers and publishers. We argue that new approaches to assessment are required to provide a realistic and comprehensive measure of the value of research and journals and we support open access publishing at a modest, affordable price to benefit research producers and consumers.
Collapse
Affiliation(s)
- Chris R Triggle
- Departments of Medical Education & Pharmacology, Weill Cornell Medicine-Qatar, Doha, Qatar
| | - Ross MacDonald
- Distributed eLibrary, Weill Cornell Medicine-Qatar, Doha, Qatar
| | - David J Triggle
- School of Pharmacy and Pharmaceutical Sciences, State University of New York, Buffalo, New York, USA
| | - Donald Grierson
- School of Biosciences, University of Nottingham, Loughborough, UK
| |
Collapse
|
25
|
The imaginary carrot: no correlation between raising funds and research productivity in geosciences. Scientometrics 2021. [DOI: 10.1007/s11192-020-03855-1] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
The ability of researchers to raise funding is central to academic achievement. However, whether success in obtaining research funds correlates with the productivity, quality or impact of a researcher is debated. Here we analyse 10 years of grant funding by the Swiss National Science Foundation in Earth and Environmental Sciences, and compare it to the publication record of the researchers who were awarded the funds. No significant statistical correlation can be established between the publication or citation record of a researcher and the amount of money this researcher obtains in grant funding. These results imply that researchers successful in raising funds are not necessarily in a position to be more productive or produce more impactful publications. These results should be considered when deciding whether to use grant funding as a criterion in career advancement procedures.
Collapse
|
26
|
Tiokhin L, Panchanathan K, Lakens D, Vazire S, Morgan T, Zollman K. Honest signaling in academic publishing. PLoS One 2021; 16:e0246675. [PMID: 33621261 PMCID: PMC7901761 DOI: 10.1371/journal.pone.0246675] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2020] [Accepted: 01/22/2021] [Indexed: 11/19/2022] Open
Abstract
Academic journals provide a key quality-control mechanism in science. Yet, information asymmetries and conflicts of interests incentivize scientists to deceive journals about the quality of their research. How can honesty be ensured, despite incentives for deception? Here, we address this question by applying the theory of honest signaling to the publication process. Our models demonstrate that several mechanisms can ensure honest journal submission, including differential benefits, differential costs, and costs to resubmitting rejected papers. Without submission costs, scientists benefit from submitting all papers to high-ranking journals, unless papers can only be submitted a limited number of times. Counterintuitively, our analysis implies that inefficiencies in academic publishing (e.g., arbitrary formatting requirements, long review times) can serve a function by disincentivizing scientists from submitting low-quality work to high-ranking journals. Our models provide simple, powerful tools for understanding how to promote honest paper submission in academic publishing.
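The role of submission costs can be sketched as a minimal decision rule in the spirit of these models (payoffs and the acceptance function are invented; the paper's actual signaling models are richer):

```python
def submits_to_top(quality, top_benefit, low_benefit, resubmission_cost,
                   p_accept=lambda q: q):
    # Expected value of trying the high-ranking journal first: acceptance
    # pays top_benefit; rejection means falling back to the low-ranking
    # journal after paying a resubmission cost (review delay, reformatting).
    expected = (p_accept(quality) * top_benefit
                + (1 - p_accept(quality)) * (low_benefit - resubmission_cost))
    return expected > low_benefit

# With costless resubmission, even weak papers go to the top journal first.
free_low = submits_to_top(0.2, top_benefit=2.0, low_benefit=1.0,
                          resubmission_cost=0.0)
# A resubmission cost makes only sufficiently strong papers worth the gamble.
costly_low = submits_to_top(0.2, top_benefit=2.0, low_benefit=1.0,
                            resubmission_cost=0.5)
costly_high = submits_to_top(0.8, top_benefit=2.0, low_benefit=1.0,
                             resubmission_cost=0.5)
```

This is the counterintuitive point of the abstract in miniature: the "inefficiency" of costly resubmission is what separates honest from indiscriminate submission.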
Collapse
Affiliation(s)
- Leonid Tiokhin
- Department of Industrial Engineering & Innovation Sciences, Human Technology Interaction Group, Eindhoven University of Technology, Eindhoven, The Netherlands
| | - Karthik Panchanathan
- Department of Anthropology, University of Missouri, Columbia, Missouri, United States of America
| | - Daniel Lakens
- Department of Industrial Engineering & Innovation Sciences, Human Technology Interaction Group, Eindhoven University of Technology, Eindhoven, The Netherlands
| | - Simine Vazire
- Melbourne School of Psychological Sciences, University of Melbourne, Melbourne, Victoria, Australia
| | - Thomas Morgan
- School of Human Evolution and Social Change, Arizona State University, Tempe, Arizona, United States of America
- Institute of Human Origins, Arizona State University, Tempe, Arizona, United States of America
| | - Kevin Zollman
- Department of Philosophy, Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
| |
Collapse
|
27
|
Philipps A. Science rules! A qualitative study of scientists’ approaches to grant lottery. RESEARCH EVALUATION 2020. [DOI: 10.1093/reseval/rvaa027] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
Using peer review to assess research proposals has always had its fair share of critics, including a more-than-fair share of scholars. That debate now extends to a more radical alternative: awarding grants by lottery. Some of the same scholars have suggested that random allocation would be more even-handed, less biased and more supportive of innovative research. But we know little of what researchers actually think about grant lotteries, and even less about the views of the scientists who depend on funding. This paper examines scientists’ perspectives on selecting grants by ‘lots’ and how they justify their support or opposition. How do they approach something scientifically that is, in itself, not scientific? These questions were investigated through problem-centered interviews with natural scientists in Germany. The qualitative interviews reveal that scientists in both dominated and dominating field positions are, more or less, open to giving selection by lots a try. Nonetheless, they are against pure randomization because, from their point of view, it is incompatible with scientific principles. They instead favor a combination of grant lottery and peer review, assuming that only under these conditions could randomly allocated funding be an integral and legitimate part of science.
Collapse
Affiliation(s)
- Axel Philipps
- Leibniz Center for Science and Society (LCSS), Leibniz University Hannover, Lange Laube 32, 30159 Hannover, Germany
- Institute of Sociology, Leibniz University Hannover, Schneiderberg 50, 30167 Hannover, Germany
| |
Collapse
|
28
|
Braganza O. A simple model suggesting economically rational sample-size choice drives irreproducibility. PLoS One 2020; 15:e0229615. [PMID: 32160229 PMCID: PMC7065751 DOI: 10.1371/journal.pone.0229615] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/23/2019] [Accepted: 02/10/2020] [Indexed: 11/30/2022] Open
Abstract
Several systematic studies have suggested that a large fraction of published research is not reproducible. One probable reason for low reproducibility is insufficient sample size, resulting in low power and low positive predictive value. It has been suggested that insufficient sample-size choice is driven by a combination of scientific competition and 'positive publication bias'. Here we formalize this intuition in a simple model, in which scientists choose economically rational sample sizes, balancing the cost of experimentation against income from publication. Specifically, assuming that a scientist's income derives only from 'positive' findings (positive publication bias) and that individual samples cost a fixed amount lets us turn basic statistical formulas into an economic optimality prediction. We find that if effects have i) low base probability, ii) small effect size or iii) low grant income per publication, then the rational (economically optimal) sample size is small. Furthermore, for plausible distributions of these parameters we find a robust emergence of a bimodal distribution of obtained statistical power and low overall reproducibility rates, both matching empirical findings. Finally, we explore conditional equivalence testing as a means to align economic incentives with adequate sample sizes. Overall, the model describes a simple mechanism explaining both the prevalence and the persistence of small sample sizes, and is well suited for empirical validation. It proposes economic rationality, or economic pressures, as a principal driver of irreproducibility and suggests strategies to change this.
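The optimality argument in this abstract can be reproduced in miniature (a sketch under the abstract's own assumptions — income only from positive results, fixed cost per sample — using a normal-approximation power function and invented parameter values, not the paper's exact model):

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power(n, d, z_crit=1.6449):
    # Normal-approximation power of a one-sided two-sample test with
    # n subjects per group and true standardized effect size d.
    return norm_cdf(d * math.sqrt(n / 2.0) - z_crit)

def expected_payoff(n, prior, d, income, cost_per_sample, alpha=0.05):
    # Income accrues only from 'positive' results (positive publication bias):
    # true positives (prior * power) plus false positives ((1 - prior) * alpha).
    p_positive = prior * power(n, d) + (1.0 - prior) * alpha
    return income * p_positive - cost_per_sample * 2 * n  # two groups of n

def rational_n(prior, d, income, cost_per_sample, n_max=500):
    # The economically optimal per-group sample size.
    return max(range(2, n_max),
               key=lambda n: expected_payoff(n, prior, d, income, cost_per_sample))

# A low base probability of true effects pushes the rational sample size down.
n_low_prior = rational_n(prior=0.1, d=0.5, income=1000, cost_per_sample=1)
n_high_prior = rational_n(prior=0.9, d=0.5, income=1000, cost_per_sample=1)
```

With these invented parameters the rational sample size for low-prior hypotheses lands an order of magnitude below that for high-prior ones, illustrating prediction (i) above.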
Collapse
Affiliation(s)
- Oliver Braganza
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany
| |
Collapse
|
29
|
Liu M, Choy V, Clarke P, Barnett A, Blakely T, Pomeroy L. The acceptability of using a lottery to allocate research funding: a survey of applicants. Res Integr Peer Rev 2020; 5:3. [PMID: 32025338 PMCID: PMC6996170 DOI: 10.1186/s41073-019-0089-z] [Citation(s) in RCA: 24] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/22/2019] [Accepted: 12/23/2019] [Indexed: 11/24/2022] Open
Abstract
Background The Health Research Council of New Zealand is the first major government funding agency to use a lottery to allocate research funding for their Explorer Grant scheme. This is a somewhat controversial approach because, despite the documented problems of peer review, many researchers believe that funding should be allocated solely using peer review, and peer review is used almost ubiquitously by funding agencies around the world. Given the rarity of alternative funding schemes, there is interest in hearing from the first cohort of researchers to ever experience a lottery. Additionally, the Health Research Council of New Zealand wanted to hear from applicants about the acceptability of the randomisation process and anonymity of applicants. Methods This paper presents the results of a survey of Health Research Council applicants from 2013 to 2019. The survey asked about the acceptability of using a lottery and if the lottery meant researchers took a different approach to their application. Results The overall response rate was 39% (126 of 325 invites), with 30% (76 of 251) from applicants in the years 2013 to 2018, and 68% (50 of 74) for those in the year 2019 who were not aware of the funding result. There was agreement that randomisation is an acceptable method for allocating Explorer Grant funds with 63% (n = 79) in favour and 25% (n = 32) against. There was less support for allocating funds randomly for other grant types with only 40% (n = 50) in favour and 37% (n = 46) against. Support for a lottery was higher amongst those that had won funding. Multiple respondents stated that they supported a lottery when ineligible applications had been excluded and outstanding applications funded, so that the remaining applications were truly equal. Most applicants reported that the lottery did not change the time they spent preparing their application. 
Conclusions The Health Research Council’s experience through the Explorer Grant scheme supports further uptake of a modified lottery.
Collapse
Affiliation(s)
- Mengyao Liu
- Health Research Council of New Zealand, Auckland, New Zealand
| | - Vernon Choy
- Health Research Council of New Zealand, Auckland, New Zealand
| | - Philip Clarke
- Health Economics Research Centre, Nuffield Department of Population Health, University of Oxford, Oxford, UK
| | - Adrian Barnett
- Institute of Health and Biomedical Innovation & School of Public Health and Social Work, Queensland University of Technology, Brisbane, Queensland, Australia
| | - Tony Blakely
- Melbourne School of Population and Global Health, University of Melbourne, Melbourne, Victoria, Australia
| | - Lucy Pomeroy
- Health Research Council of New Zealand, Auckland, New Zealand
| |
Collapse
|
30
|
Abstract
The purpose of this paper is to analyze the causes and effects of arbitrariness in the peer review process. The paper focuses on two main reasons for this arbitrariness. The first is that referees are not homogeneous and display homophily in their taste for and perception of innovative ideas. The second is that reviewers differ in the time they allocate to peer review. Our model replicates the NIPS experiment of 2014, showing that peer review ratings are not robust and that changing reviewers has a dramatic impact on the ranking of papers. The paper also shows that innovative work is not highly ranked in the existing peer review process and is consequently often rejected.
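The two mechanisms named in this abstract — taste homophily and unequal review effort — are easy to make concrete (a deterministic toy, not the paper's model; panels, tastes and effort weights are invented):

```python
def ranking(papers, reviewers):
    # A referee's rating falls off with the distance between a paper's
    # position on a conventional-to-innovative axis and the referee's own
    # taste (homophily), weighted by how much effort the referee invests.
    def score(paper):
        return sum(r["effort"] * (1.0 - abs(r["taste"] - paper))
                   for r in reviewers)
    return sorted(papers, key=score, reverse=True)

papers = [0.1, 0.5, 0.9]  # conventional ... highly innovative
panel_a = [{"taste": 0.2, "effort": 1.0}, {"taste": 0.3, "effort": 0.5}]
panel_b = [{"taste": 0.9, "effort": 1.0}, {"taste": 0.5, "effort": 0.5}]
order_a = ranking(papers, panel_a)
order_b = ranking(papers, panel_b)
```

Swapping the panel reverses the ranking, and under the conventional-leaning panel the most innovative paper finishes last — the non-robustness and anti-innovation effects the abstract describes.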
Collapse
|
31
|
Bedessem B. Should we fund research randomly? An epistemological criticism of the lottery model as an alternative to peer review for the funding of science. RESEARCH EVALUATION 2019. [DOI: 10.1093/reseval/rvz034] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
The way research is, and should be, funded by the public sphere is the subject of renewed interest in sociology, economics, management science and, more recently, the philosophy of science. In this contribution, I propose a qualitative, epistemological criticism of the funding-by-lottery model, which a growing number of scholars advocate as an alternative to peer review. The lottery scheme starts from the lack of efficiency and robustness of peer-review-based evaluation to argue that the majority of public resources for basic science should be allocated randomly. I first differentiate between two distinct arguments used to defend this alternative funding scheme, based on considerations about the logic of scientific research. To assess their epistemological limits, I then present and develop a conceptual frame, grounded in the notion of ‘system of practice’, which can be used to understand what precisely it means for a research project to be interesting or significant. I use this epistemological analysis to show that the lottery model is not theoretically optimal, since it underestimates the integration of all scientific projects in densely interconnected systems of conceptual, experimental and technical practices, which confer their proper interest on them. I also apply these arguments to criticize the classical peer-review process. Finally, I suggest, as a discussion, that some recently proposed models that foreground a principle of decentralization of the evaluation and selection process may constitute a better alternative, if the practical conditions of their implementation are adequately settled.
Collapse
Affiliation(s)
- Baptiste Bedessem
- Laboratoire IRPHIL, Université Lyon 3 (Faculté de Philosophie), 18 rue Chevreul, Lyon 69007, France
| |
Collapse
|
32
|
Dunn BD, O'Mahen H, Wright K, Brown G. A commentary on research rigour in clinical psychological science: How to avoid throwing out the innovation baby with the research credibility bath water in the depression field. Behav Res Ther 2019; 120:103417. [DOI: 10.1016/j.brat.2019.103417] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2019] [Revised: 05/14/2019] [Accepted: 06/03/2019] [Indexed: 11/27/2022]
|
33
|
Smaldino PE, Turner MA, Contreras Kallens PA. Correction to 'Open science and modified funding lotteries can impede the natural selection of bad science'. ROYAL SOCIETY OPEN SCIENCE 2019; 6:191249. [PMID: 31543978 PMCID: PMC6731693 DOI: 10.1098/rsos.191249] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
[This corrects the article DOI: 10.1098/rsos.190194.].
Collapse
|
34
|
Rodriguez RM. The Folly of the R: A Case Study. Acad Emerg Med 2019; 26:956-958. [PMID: 30933407 DOI: 10.1111/acem.13759] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Affiliation(s)
- Robert M Rodriguez
- Department of Emergency Medicine, The University of California San Francisco School of Medicine, Zuckerberg San Francisco General Hospital, San Francisco, CA
| |
Collapse
|
35
|
Smaldino PE, Turner MA, Contreras Kallens PA. Open science and modified funding lotteries can impede the natural selection of bad science. ROYAL SOCIETY OPEN SCIENCE 2019; 6:190194. [PMID: 31417725 PMCID: PMC6689639 DOI: 10.1098/rsos.190194] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/04/2019] [Accepted: 06/04/2019] [Indexed: 06/10/2023]
Abstract
Assessing scientists using exploitable metrics can lead to the degradation of research methods even without any strategic behaviour on the part of individuals, via 'the natural selection of bad science.' Institutional incentives to maximize metrics like publication quantity and impact drive this dynamic. Removing these incentives is necessary, but institutional change is slow. However, recent developments suggest possible solutions with more rapid onsets. These include what we call open science improvements, which can reduce publication bias and improve the efficacy of peer review. In addition, there have been increasing calls for funders to move away from prestige- or innovation-based approaches in favour of lotteries. We investigated whether such changes are likely to improve the reproducibility of science even in the presence of persistent incentives for publication quantity through computational modelling. We found that modified lotteries, which allocate funding randomly among proposals that pass a threshold for methodological rigour, effectively reduce the rate of false discoveries, particularly when paired with open science improvements that increase the publication of negative results and improve the quality of peer review. In the absence of funding that targets rigour, open science improvements can still reduce false discoveries in the published literature but are less likely to improve the overall culture of research practices that underlie those publications.
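The funding rule studied here is simple to state as code (a schematic of the allocation rule only, not the authors' full evolutionary simulation; the proposal data and threshold are invented):

```python
import random

def modified_lottery(proposals, rigour_threshold, n_awards, rng):
    # Fund a uniformly random subset of the proposals that clear a
    # methodological-rigour bar; merit beyond the bar does not affect
    # the draw, removing the incentive to chase flashy results.
    eligible = [p for p in proposals if p["rigour"] >= rigour_threshold]
    return rng.sample(eligible, min(n_awards, len(eligible)))

proposals = [{"id": i, "rigour": r}
             for i, r in enumerate([0.9, 0.4, 0.7, 0.95, 0.2, 0.8])]
funded = modified_lottery(proposals, rigour_threshold=0.6, n_awards=2,
                          rng=random.Random(7))
```

Because selection above the threshold is random, every funded proposal is guaranteed to meet the rigour bar, which is the property driving the reduction in false discoveries reported in the abstract.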
Collapse
Affiliation(s)
- Paul E. Smaldino
- Department of Cognitive and Information Sciences, University of California, Merced, CA, USA
| | - Matthew A. Turner
- Department of Cognitive and Information Sciences, University of California, Merced, CA, USA
| | | |
Collapse
|