1
Eben C, Bőthe B, Brevers D, Clark L, Grubbs JB, Heirene R, Kräplin A, Lewczuk K, Palmer L, Perales JC, Peters J, van Holst RJ, Billieux J. The landscape of open science in behavioral addiction research: Current practices and future directions. J Behav Addict 2023; 12:862-870. [PMID: 38141055] [PMCID: PMC10786235] [DOI: 10.1556/2006.2023.00052] [Received: 04/17/2023] [Revised: 07/24/2023] [Accepted: 09/10/2023] Open Access
Abstract
Open science refers to a set of practices that aim to make scientific research more transparent, accessible, and reproducible, including pre-registration of study protocols, sharing of data and materials, the use of transparent research methods, and open access publishing. In this commentary, we describe and evaluate the current state of open science practices in behavioral addiction research. We highlight the specific value of open science practices for the field; discuss recent field-specific meta-scientific reviews that show the adoption of such practices remains in its infancy; address the challenges to engaging with open science; and make recommendations for how researchers, journals, and scientific institutions can work to overcome these challenges and promote high-quality, transparently reported behavioral addiction research. By collaboratively promoting open science practices, the field can create a more sustainable and productive research environment that benefits both the scientific community and society as a whole.
Affiliation(s)
- Charlotte Eben
- Department of Experimental Psychology, Ghent University, Ghent, Belgium
- Beáta Bőthe
- Département de Psychologie, Université de Montréal, Montréal, Canada
- Damien Brevers
- Louvain for Experimental Psychopathology Research Group (LEP), Psychological Sciences Research Institute, UCLouvain, Louvain-la-Neuve, Belgium
- Luke Clark
- Centre for Gambling Research at UBC, Department of Psychology, University of British Columbia, Vancouver, B.C., Canada
- Djavad Mowafaghian Centre for Brain Health, University of British Columbia, Vancouver, B.C., Canada
- Joshua B. Grubbs
- Department of Psychology, University of New Mexico, Albuquerque, NM, USA
- Center for Alcohol, Substance Use, And Addiction (CASAA), University of New Mexico, Albuquerque, NM, USA
- Robert Heirene
- School of Psychology, University of Plymouth, Plymouth, UK
- Anja Kräplin
- Faculty of Psychology, Technische Universität Dresden, Dresden, Germany
- Karol Lewczuk
- Institute of Psychology, Cardinal Stefan Wyszynski University in Warsaw, Warsaw, Poland
- Lucas Palmer
- Centre for Gambling Research at UBC, Department of Psychology, University of British Columbia, Vancouver, B.C., Canada
- José C. Perales
- Department of Experimental Psychology; Mind, Brain and Behavior Research Center (CIMCYC), University of Granada, Granada, Spain
- Jan Peters
- Department of Psychology, Biological Psychology, University of Cologne, Cologne, Germany
- Ruth J. van Holst
- Department of Psychiatry, Amsterdam UMC - University of Amsterdam, Amsterdam, The Netherlands
- Center for Urban Mental Health, University of Amsterdam, Amsterdam, The Netherlands
- Joël Billieux
- Institute of Psychology, University of Lausanne, Lausanne, Switzerland
- Center for Excessive Gambling, Addiction Medicine, Lausanne University Hospital (CHUV), Lausanne, Switzerland
2
Boyce V, Mathur M, Frank MC. Eleven years of student replication projects provide evidence on the correlates of replicability in psychology. R Soc Open Sci 2023; 10:231240. [PMID: 38026006] [PMCID: PMC10645069] [DOI: 10.1098/rsos.231240] [Received: 08/22/2023] [Accepted: 10/20/2023]
Abstract
Cumulative scientific progress requires empirical results that are robust enough to support theory construction and extension. Yet in psychology, some prominent findings have failed to replicate, and large-scale studies suggest replicability issues are widespread. However, identifying predictors of replication success is hampered by the difficulty of assembling large samples of independent replication experiments: most investigations reanalyse the same set of 170 replications. We introduce a new dataset of 176 replications from students in a graduate-level methods course. Replications were judged successful in 49% of cases; of the 136 where effect sizes could be numerically compared, 46% had point estimates within the prediction interval of the original outcome (versus the expected 95%). Larger original effect sizes and within-participants designs were especially related to replication success. Our results indicate that, consistent with prior reports, the robustness of the psychology literature is low enough to limit cumulative progress by student investigators.
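The prediction-interval criterion mentioned in this abstract can be made concrete. A standard approach treats a replication as consistent with the original study when its point estimate falls inside a 95% prediction interval built from both studies' standard errors. The sketch below is a minimal illustration of that rule under a normal approximation, not the authors' actual analysis code; the function names are ours.

```python
import math

def prediction_interval(orig_est, orig_se, rep_se, z=1.96):
    """95% prediction interval for a replication effect estimate,
    given the original estimate and both standard errors."""
    half_width = z * math.sqrt(orig_se ** 2 + rep_se ** 2)
    return orig_est - half_width, orig_est + half_width

def replication_consistent(orig_est, orig_se, rep_est, rep_se):
    """True if the replication point estimate lies inside the
    95% prediction interval implied by the original result."""
    lo, hi = prediction_interval(orig_est, orig_se, rep_se)
    return lo <= rep_est <= hi

# Hypothetical example: original d = 0.50 (SE 0.10), replication SE 0.12
print(prediction_interval(0.50, 0.10, 0.12))
```

If original studies were unbiased and effects homogeneous, about 95% of replication estimates would land inside this interval; the 46% reported above is the sense in which the observed rate falls short of that benchmark.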
Affiliation(s)
- Veronica Boyce
- Department of Psychology, Stanford University, Stanford, CA, USA
- Maya Mathur
- Quantitative Sciences Unit, Department of Medicine, Stanford University, Stanford, CA, USA
- Michael C. Frank
- Department of Psychology, Stanford University, Stanford, CA, USA
3
Strand JF, Brown VA. Spread the Word: Enhancing Replicability of Speech Research Through Stimulus Sharing. J Speech Lang Hear Res 2023; 66:1967-1976. [PMID: 36749834] [PMCID: PMC10465150] [DOI: 10.1044/2022_jslhr-22-00267] [Received: 05/12/2022] [Revised: 08/03/2022] [Accepted: 10/26/2022]
Abstract
PURPOSE: The ongoing replication crisis within and beyond psychology has revealed the numerous ways in which flexibility in the research process can affect study outcomes. In speech research, examples of these "researcher degrees of freedom" include the particular syllables, words, or sentences presented; the talkers who produce the stimuli and the instructions given to them; the population tested; whether and how stimuli are matched on amplitude; the type of masking noise used and its presentation level; and many others. In this research note, we argue that even seemingly minor methodological choices have the potential to affect study outcomes. To that end, we present a reanalysis of six existing data sets on spoken word identification in noise to assess how differences in talkers, stimulus processing, masking type, and listeners affect identification accuracy. CONCLUSIONS: Our reanalysis revealed relatively low correlations among word identification rates across studies. The data suggest that some of the seemingly innocuous methodological details that differ across studies (details that cannot possibly be reported in text, given the idiosyncrasies inherent to speech) introduce unknown variability that may affect the replicability of our findings. We therefore argue that publicly sharing stimuli is a crucial step toward improved replicability in speech research. SUPPLEMENTAL MATERIAL: https://doi.org/10.23641/asha.21985907.
Affiliation(s)
- Violet A. Brown
- Department of Psychological & Brain Sciences, Washington University in St. Louis, MO
4
Pownall M, Azevedo F, König LM, Slack HR, Evans TR, Flack Z, Grinschgl S, Elsherif MM, Gilligan-Lee KA, de Oliveira CMF, Gjoneska B, Kalandadze T, Button K, Ashcroft-Jones S, Terry J, Albayrak-Aydemir N, Děchtěrenko F, Alzahawi S, Baker BJ, Pittelkow MM, Riedl L, Schmidt K, Pennington CR, Shaw JJ, Lüke T, Makel MC, Hartmann H, Zaneva M, Walker D, Verheyen S, Cox D, Mattschey J, Gallagher-Mitchell T, Branney P, Weisberg Y, Izydorczak K, Al-Hoorie AH, Creaven AM, Stewart SLK, Krautter K, Matvienko-Sikar K, Westwood SJ, Arriaga P, Liu M, Baum MA, Wingen T, Ross RM, O'Mahony A, Bochynska A, Jamieson M, Tromp MV, Yeung SK, Vasilev MR, Gourdon-Kanhukamwe A, Micheli L, Konkol M, Moreau D, Bartlett JE, Clark K, Brekelmans G, Gkinopoulos T, Tyler SL, Röer JP, Ilchovska ZG, Madan CR, Robertson O, Iley BJ, Guay S, Sladekova M, Sadhwani S. Teaching open and reproducible scholarship: a critical review of the evidence base for current pedagogical methods and their outcomes. R Soc Open Sci 2023; 10:221255. [PMID: 37206965] [PMCID: PMC10189598] [DOI: 10.1098/rsos.221255] [Received: 09/27/2022] [Accepted: 04/26/2023]
Abstract
In recent years, the scientific community has called for improvements in the credibility, robustness, and reproducibility of research, reflected in increased interest in and promotion of open and transparent research practices. While progress has been positive, little consideration has been given to how this approach can be embedded into undergraduate and postgraduate research training. Specifically, a critical overview of the literature investigating how integrating open and reproducible science may influence student outcomes is needed. In this paper, we provide the first critical review of the literature on integrating open and reproducible scholarship into teaching and learning and its associated student outcomes. Our review highlighted that embedding open and reproducible scholarship appears to be associated with (i) students' scientific literacies (i.e. students' understanding of open research, consumption of science, and the development of transferable skills); (ii) student engagement (i.e. motivation and engagement with learning, collaboration, and engagement in open research); and (iii) students' attitudes towards science (i.e. trust in science and confidence in research findings). However, our review also identified a need for more robust and rigorous methods within pedagogical research, including more interventional and experimental evaluations of teaching practice. We discuss implications for teaching and learning scholarship.
Affiliation(s)
- Flávio Azevedo
- Department of Psychology, University of Cambridge, CB2 3EB, UK
- Laura M. König
- Faculty of Life Sciences: Food, Nutrition and Health, University of Bayreuth, 95447 Bayreuth, Germany
- Hannah R. Slack
- School of Psychology, University of Nottingham, Nottingham NG7 2RD, UK
- Thomas Rhys Evans
- School of Human Sciences, University of Greenwich, London SE10 9LS, UK
- Centre for Workforce Development, Institute for Lifecourse Development, University of Greenwich, London SE10 9LS, UK
- Zoe Flack
- School of Humanities and Social Science, University of Brighton, BN2 0JY, UK
- Biljana Gjoneska
- Macedonian Academy of Sciences and Arts, North Macedonia
- Tamara Kalandadze
- Faculty of Teacher Education and Languages, Department of Education, ICT and Learning, Ostfold University College, 1757 Halden, Norway
- Sarah Ashcroft-Jones
- Department of Experimental Psychology, University of Oxford, Oxford OX1 4BH, UK
- Jenny Terry
- School of Psychology, University of Sussex, Brighton BN1 9RH, UK
- Nihan Albayrak-Aydemir
- School of Psychology and Counselling, The Open University, Milton Keynes MK7 6AA, UK
- Department of Psychological and Behavioural Science, London School of Economics and Political Science, UK
- Filip Děchtěrenko
- Department of Mathematics, College of Polytechnics Jihlava, 1556/16, 586 01, Czech Republic
- Bradley J. Baker
- Department of Sport and Recreation Management, Temple University, PA 19122, USA
- Merle-Marie Pittelkow
- Department of Psychology, University of Groningen, 9712 CP Groningen, The Netherlands
- Lydia Riedl
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, D-35039 Marburg, Germany
- John J. Shaw
- Division of Psychology, De Montfort University, Leicester LE1 9BH, UK
- Timo Lüke
- Institute for Educational Research and Teacher Education, University of Graz, 8010 Graz, Austria
- Helena Hartmann
- Department for Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna 1010, Austria
- Mirela Zaneva
- Department of Experimental Psychology, University of Oxford, Oxford OX1 4BH, UK
- Daniel Walker
- Department of Psychology, Faculty of Management, Law and Social Sciences, University of Bradford, Bradford BD7 1DP, UK
- Steven Verheyen
- Department of Psychology, Education and Child Studies, Erasmus University Rotterdam, Rotterdam 3000, The Netherlands
- Daniel Cox
- Division of Neuroscience and Experimental Psychology, University of Manchester, Manchester M13 9PL, UK
- Jennifer Mattschey
- School of Psychology and Counselling, The Open University, Milton Keynes MK7 6AA, UK
- Peter Branney
- Department of Psychology, Faculty of Management, Law and Social Sciences, University of Bradford, Bradford BD7 1DP, UK
- Yanna Weisberg
- Department of Psychology, Linfield University, USA
- Kamil Izydorczak
- Faculty of Psychology in Wrocław, SWPS University of Social Sciences and Humanities, Wrocław, Poland
- Ali H. Al-Hoorie
- Jubail English Language and Preparatory Year Institute, Royal Commission for Jubail and Yanbu, Saudi Arabia
- Kai Krautter
- Department of Psychology, Saarland University, 66123 Saarbrücken, Germany
- Samuel J. Westwood
- Department of Psychology, School of Social Science, University of Westminster, London W1B 2HW, UK
- Patrícia Arriaga
- Iscte-University Institute of Lisbon, CIS-IUL, 1649-026 Lisboa, Portugal
- Meng Liu
- Faculty of Education, University of Cambridge, Cambridge CB2 1TN, UK
- Myriam A. Baum
- Department of Psychology, Saarland University, 66123 Saarbrücken, Germany
- Tobias Wingen
- Institute of General Practice and Family Medicine, University Hospital Bonn, University of Bonn, 53127 Bonn, Germany
- Robert M. Ross
- Department of Philosophy, Macquarie University, NSW 2109, Australia
- Aoife O'Mahony
- School of Psychology, Cardiff University, Cardiff CF10 3AT, UK
- Michelle Jamieson
- School of Social and Political Sciences, University of Glasgow, Glasgow G12 8QQ, UK
- Myrthe Vel Tromp
- Department of Psychology, Leiden University, 2311 EZ Leiden, The Netherlands
- Siu Kit Yeung
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong SAR, People's Republic of China
- Martin R. Vasilev
- Department of Psychology, Bournemouth University, Poole BH12 5BB, UK
- Leticia Micheli
- Department of Psychology III, University of Würzburg, 97070 Würzburg, Germany
- Markus Konkol
- Faculty for Geo-Information Science and Earth Observation, University of Twente, 7522 NB, The Netherlands
- David Moreau
- School of Psychology, University of Auckland, Auckland 1142, New Zealand
- James E. Bartlett
- School of Psychology and Neuroscience, University of Glasgow, Glasgow G12 8QQ, UK
- Kait Clark
- Department of Social Sciences, University of the West of England, Bristol BS16 1QY, UK
- Gwen Brekelmans
- Department of Biological and Experimental Psychology, Queen Mary University of London, E1 4NS, UK
- Samantha L. Tyler
- Department of Neuroscience, Psychology and Behaviour, University of Leicester, UK
- Olly Robertson
- Departments of Psychiatry and Experimental Psychology, University of Oxford, UK
- School of Psychology, Keele University, Newcastle ST5 5BG, UK
- Bethan J. Iley
- School of Psychology, Queen's University Belfast, Belfast BT7 1NN, UK
- Samuel Guay
- Department of Psychology, University of Montreal, Canada
- Martina Sladekova
- School of Humanities and Social Science, University of Brighton, BN2 0JY, UK
- School of Psychology, University of Sussex, Brighton BN1 9RH, UK
- Shanu Sadhwani
- School of Humanities and Social Science, University of Brighton, BN2 0JY, UK
- FORRT
- Framework for Open and Reproducible Research Training
5
Reyes ML, Daganzo MAA, Calleja MO. International Large-Scale Recurring Surveys with Open Access to Data: Contributions to International Research Collaborations and Publications in Psychology. Trends in Psychol 2023. [DOI: 10.1007/s43076-023-00273-z]
6
Quintana DS. A Guide for Calculating Study-Level Statistical Power for Meta-Analyses. Advances in Methods and Practices in Psychological Science 2023. [DOI: 10.1177/25152459221147260]
Abstract
Meta-analysis is a popular approach in the psychological sciences for synthesizing data across studies. However, the credibility of meta-analytic outcomes depends on the evidential value of the studies included in the body of evidence used for data synthesis. One important consideration for determining a study's evidential value is the statistical power of the study's design/statistical-test combination for detecting hypothetical effect sizes of interest. Studies with a design/test combination that cannot reliably detect a wide range of effect sizes are more susceptible to questionable research practices and exaggerated effect sizes. Therefore, determining the statistical power of the design/test combinations of studies included in meta-analyses can help researchers make decisions regarding confidence in the body of evidence. Because the true population effect size is unknown when testing hypotheses, an alternative approach is to determine statistical power for a range of hypothetical effect sizes. This tutorial introduces the metameta R package and web app, which facilitate the straightforward calculation and visualization of study-level statistical power in meta-analyses for a range of hypothetical effect sizes. Readers will be shown how to reanalyze data using information typically presented in meta-analysis forest plots or tables and how to integrate the metameta package when reporting novel meta-analyses. A step-by-step companion screencast video tutorial is also provided to assist readers using the R package.
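The calculation underlying such study-level power curves can be illustrated outside R. metameta itself is an R package; the sketch below is a minimal Python analogue, assuming a two-sided, two-sample design and a normal approximation to the t distribution. The function names and example sample/effect sizes are ours, not taken from the tutorial.

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_sample(d, n_per_group, z_crit=1.959963984540054):
    """Approximate power of a two-sided, two-sample test to detect a
    standardized mean difference d with n_per_group per arm, using a
    normal approximation to the t distribution (z_crit is the two-sided
    critical value for alpha = .05)."""
    se = math.sqrt(2.0 / n_per_group)   # standard error of d with equal ns
    ncp = d / se                        # noncentrality on the z scale
    return normal_cdf(ncp - z_crit) + normal_cdf(-ncp - z_crit)

# Study-level power across a range of hypothetical effect sizes,
# as one would tabulate for each study entering a meta-analysis
for n in (20, 50, 200):
    row = [round(power_two_sample(d, n), 2) for d in (0.2, 0.5, 0.8)]
    print(f"n/group={n}: power at d = 0.2/0.5/0.8 -> {row}")
```

Evaluating each included study over the same grid of hypothetical effect sizes, rather than at a single assumed true effect, is the idea the tutorial's "firepower" style plots visualize.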
7
Forscher PS, Wagenmakers EJ, Coles NA, Silan MA, Dutra N, Basnight-Brown D, IJzerman H. The Benefits, Barriers, and Risks of Big-Team Science. Perspect Psychol Sci 2022; 18:607-623. [PMID: 36190899] [DOI: 10.1177/17456916221082970]
Abstract
Progress in psychology has been frustrated by challenges concerning replicability, generalizability, strategy selection, inferential reproducibility, and computational reproducibility. Although often discussed separately, these five challenges may share a common cause: insufficient investment of intellectual and nonintellectual resources into the typical psychology study. We suggest that the emerging emphasis on big-team science can help address these challenges by allowing researchers to pool their resources together to increase the amount available for a single study. However, the current incentives, infrastructure, and institutions in academic science have all developed under the assumption that science is conducted by solo principal investigators and their dependent trainees, an assumption that creates barriers to sustainable big-team science. We also anticipate that big-team science carries unique risks, such as the potential for big-team-science organizations to be co-opted by unaccountable leaders, become overly conservative, and make mistakes at a grand scale. Big-team-science organizations must also acquire personnel who are properly compensated and have clear roles. Not doing so raises risks related to mismanagement and a lack of financial sustainability. If researchers can manage its unique barriers and risks, big-team science has the potential to spur great progress in psychology and beyond.
Affiliation(s)
- Patrick S. Forscher
- Research and Innovation Division, Busara Center for Behavioral Economics, Nairobi, Kenya
- Laboratoire Interuniversitaire de Psychologie, Université Grenoble Alpes
- Nicholas A. Coles
- Center for the Study of Language and Information, Stanford University
- Miguel Alejandro Silan
- Unité de recherche Développement Individu Processus Handicap Éducation, Université Lumière Lyon 2
- Annecy Behavioral Science Lab, Menthon-Saint-Bernard, France
- Social and Political Laboratory, Psychology Department, University of the Philippines Diliman
- Natália Dutra
- Núcleo de Teoria e Pesquisa do Comportamento, Universidade Federal do Pará
- Hans IJzerman
- Laboratoire Interuniversitaire de Psychologie, Université Grenoble Alpes
- Institut Universitaire de France
8
Plakhotnik MS. Co-authoring with undergraduate students: An emerging process from the semi-periphery of the world of science. Learned Publishing 2022. [DOI: 10.1002/leap.1469]
9
Wagge JR, Hurst MA, Brandt MJ, Lazarevic LB, Legate N, Grahe JE. Teaching Research in Principle and in Practice: What Do Psychology Instructors Think of Research Projects in Their Courses? Psychology Learning & Teaching 2022. [DOI: 10.1177/14757257221101942]
Abstract
Psychology majors typically conduct at least one research project during their undergraduate studies, yet these projects rarely make a scientific contribution beyond the classroom. In this study, we explored one potential reason for this: student projects may not be aligned with best practices in the field. In other words, we wondered whether there was a mismatch between what instructors teach in principle and what student projects are in practice. To answer this, we asked psychology instructors (n = 111) who regularly teach courses involving research projects about these projects. Instructors endorsed many of the commonly assumed pitfalls of student projects, such as not using rigorous methodology. Notably, the characteristics of these typical student projects did not align with the qualities instructors reported as being important in research practice. We highlight opportunities to align these qualities by employing resources such as crowdsourced projects specifically developed for student researchers.
10
Azevedo F, Liu M, Pennington CR, Pownall M, Evans TR, Parsons S, Elsherif MM, Micheli L, Westwood SJ. Towards a culture of open scholarship: the role of pedagogical communities. BMC Res Notes 2022; 15:75. [PMID: 35193662] [PMCID: PMC8862562] [DOI: 10.1186/s13104-022-05944-1] [Received: 11/08/2021] [Accepted: 02/01/2022] Open Access
Abstract
The UK House of Commons Science and Technology Committee has called for evidence on the roles that different stakeholders play in reproducibility and research integrity. Of central priority are proposals for improving research integrity and quality, as well as guidance and support for researchers. In response to this, we argue that there is one important component of research integrity that is often absent from discussion: the pedagogical consequences of how we teach, mentor, and supervise students through open scholarship. We justify the need to integrate open scholarship principles into research training within higher education and argue that pedagogical communities play a key role in fostering an inclusive culture of open scholarship. We illustrate these benefits by presenting the Framework for Open and Reproducible Research Training (FORRT), an international grassroots community whose goal is to provide support, resources, visibility, and advocacy for the adoption of principled, open teaching and mentoring practices, whilst generating conversations about the ethics and social impact of higher-education pedagogy. Representing a diverse group of early-career researchers and students across specialisms, we advocate for greater recognition of and support for pedagogical communities, and encourage all research stakeholders to engage with these communities to enable long-term, sustainable change.
11
Abstract
In recent years, psychology has engaged intensively with the consequences of the replication crisis and the resulting discourse on how the discipline should develop. The use of questionable research practices (QRPs) has been identified as one reason for the poor replicability of psychological research. While the prevalence of QRPs among researchers has been studied extensively, little is known so far about how widespread these practices are among students. The work presented here reports the first large survey of 1,397 psychology students in German-speaking countries, assessing the prevalence of QRPs in student projects as well as the current state of academic teaching regarding the replication crisis and open science. Considering teaching and the use of questionable research practices together promises insight into how psychology instruction relates to students' empirical practice. The results show that QRPs also occur in student projects, with large differences in the prevalence of individual QRPs. Differences also emerged between project types: QRP use was most pronounced in experimental lab courses and least pronounced in master's theses. Overall, our data indicate that the self-reported prevalence of QRPs decreases over the course of study. Moreover, most students appear to have already encountered the topic of the replication crisis in their coursework; it is covered mostly in methods courses and less in substantively specialized classes. We close with suggestions for the further development of psychology teaching, in which the principles of openness, transparency, and collaboration in producing robust research are foregrounded already during students' studies.
12
Chen J, Kwan LC, Ma LY, Choi HY, Lo YC, Au SY, Tsang CH, Cheng BL, Feldman G. Retrospective and prospective hindsight bias: Replications and extensions of Fischhoff (1975) and Slovic and Fischhoff (1977). Journal of Experimental Social Psychology 2021. [DOI: 10.1016/j.jesp.2021.104154]
13
Abstract
The scientific enterprise has long been based on the presumption of replication, although scientists have recently become aware of various corruptions of the enterprise that have hurt replicability. In this article, we begin by considering three illustrations of research paradigms that have all been subject to intense scrutiny through replications and theoretical concerns. The three paradigms are one for which the corpus of research points to a real finding, one for which the corpus of research points to a significantly attenuated effect, and one for which the debate is ongoing. We then discuss what scientists can learn, and how science can be improved, through replications more generally. From there, we discuss what we believe needs to be done to improve scientific inquiry with regard to replication moving forward. Finally, we conclude by providing readers with several different approaches to replication and how these approaches advance science. The approaches discussed include multilab replications of many effects, multilab replications of specific effects, adversarial collaborations, and stand-alone applications.
Affiliation(s)
- John E. Edlund
- Department of Psychology, Rochester Institute of Technology
- Kelly Cuccolo
- Department of Psychology, University of North Dakota
- Martha S. Zlokovich
- Psi Chi, the International Honor Society in Psychology, Chattanooga, Tennessee
14
Hanmer J. Cross-sectional validation of the PROMIS-Preference scoring system by its association with social determinants of health. Qual Life Res 2021; 30:881-889. [PMID: 33161483] [DOI: 10.1007/s11136-020-02691-3] [Accepted: 10/31/2020]
Abstract
PURPOSE: PROMIS-Preference (PROPr) is a generic, societal, preference-based summary score that uses seven domains from the Patient-Reported Outcomes Measurement Information System (PROMIS). This report evaluates the construct validity of PROPr via its association with social determinants of health (SDoH). METHODS: An online panel survey of the US adult population included PROPr, SDoH, demographics, chronic conditions, and four other scores: the EuroQol-5D-5L (EQ-5D-5L), Health Utilities Index (HUI) Mark 2 and Mark 3, and the Short Form-6D (SF-6D). Each score was regressed on age, gender, health conditions, and a single SDoH. The SDoH coefficient represents the strength of its association with the score and was used to assess known-groups validity. Convergent validity was evaluated using Pearson correlations between different summary scores and Spearman correlations between SDoH coefficients from different summary scores. RESULTS: From 4142 participants, all summary scores had statistically significant differences for variables related to education, income, food and financial insecurity, and social interactions. Of the 42 SDoH variables tested, the number of statistically significant variables was 27 for EQ-5D-5L, 17 for HUI Mark 2, 23 for HUI Mark 3, 27 for PROPr, and 27 for SF-6D. The average SDoH coefficients were −0.086 for EQ-5D-5L, −0.039 for HUI Mark 2, −0.063 for HUI Mark 3, −0.064 for PROPr, and −0.037 for SF-6D. Despite the difference in magnitude across the measures, Pearson correlations were 0.60 to 0.76 and Spearman correlations were 0.74 to 0.87. CONCLUSIONS: These results provide evidence of construct validity supporting the use of PROPr to monitor population health in the general US population.
Collapse
|
15
|
Ghelfi E, Christopherson CD, Urry HL, Lenne RL, Legate N, Ann Fischer M, Wagemans FMA, Wiggins B, Barrett T, Bornstein M, de Haan B, Guberman J, Issa N, Kim J, Na E, O’Brien J, Paulk A, Peck T, Sashihara M, Sheelar K, Song J, Steinberg H, Sullivan D. Reexamining the Effect of Gustatory Disgust on Moral Judgment: A Multilab Direct Replication of Eskine, Kacinik, and Prinz (2011). Advances in Methods and Practices in Psychological Science 2020. [DOI: 10.1177/2515245919881152] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022]
Abstract
Eskine, Kacinik, and Prinz’s (2011) influential experiment demonstrated that gustatory disgust triggers a heightened sense of moral wrongness. We report a large-scale multisite direct replication of this study conducted by labs in the Collaborative Replications and Education Project. Subjects in each sample were randomly assigned to one of three beverage conditions: bitter (disgusting), control (neutral), or sweet. Then, subjects made a series of judgments about the moral wrongness of the behavior depicted in six vignettes. In the original study (N = 57), drinking the bitter beverage led to higher ratings of moral wrongness than did drinking the control or sweet beverage; a contrast between the bitter condition and the other two conditions was significant among conservative (n = 19) but not liberal (n = 25) subjects. In the current project, random-effects meta-analyses across all subjects (N = 1,137, k = 11 studies), conservative subjects (n = 142, k = 5), and liberal subjects (n = 635, k = 9) revealed standardized overall effect sizes across replications that were smaller than reported in the original study. Some were in the opposite of the predicted direction; all had 95% confidence intervals containing zero, and all were smaller than the effect size the original authors could have meaningfully detected. Results of linear mixed-effects regressions revealed that drinking the bitter beverage led to higher ratings of moral wrongness than did drinking the control beverage but not the sweet beverage. Bayes factor tests revealed greater relative support for the null than for the replication hypothesis. The overall pattern provides little to no support for the theory that physical disgust via taste perception harshens judgments of moral wrongness.
Collapse
Affiliation(s)
- Eric Ghelfi
- Department of Psychology, Brigham Young University
| | | | | | - Richie L. Lenne
- Department of Psychology, University of Minnesota, Minneapolis
| | - Nicole Legate
- Department of Psychology, Illinois Institute of Technology
| | | | - Fieke M. A. Wagemans
- Department of Psychology, Tilburg University
- Institute for Socio-Economics, University of Duisburg-Essen
| | - Brady Wiggins
- Department of Psychology, Brigham Young University–Idaho
| | | | | | | | | | - Nada Issa
- Department of Psychology, Indiana University Northwest
| | - Joan Kim
- Department of Medicine, Massachusetts General Hospital, Boston, Massachusetts
| | - Elim Na
- Department of Medicine, Boston University School of Medicine
| | | | - Aidan Paulk
- Department of Acupuncture, Oregon College of Oriental Medicine
| | - Tayler Peck
- Department of Psychology, Southern Oregon University
| | | | - Karen Sheelar
- Department of Psychology, Southern Oregon University
| | | | - Hannah Steinberg
- Department of Psychology, Pacific Graduate School of Psychology/Stanford University Doctor of Psychology Consortium
| | | |
Collapse
|
16
|
Olsen J, Mosen J, Voracek M, Kirchler E. Research practices and statistical reporting quality in 250 economic psychology master's theses: a meta-research investigation. R Soc Open Sci 2019; 6:190738. [PMID: 31903199 PMCID: PMC6936276 DOI: 10.1098/rsos.190738] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/19/2019] [Accepted: 11/28/2019] [Indexed: 05/30/2023]
Abstract
The replicability of research findings has recently been disputed across multiple scientific disciplines. In constructive reaction, the research culture in psychology is facing fundamental changes, but investigations of the research practices that led to these improvements have almost exclusively focused on academic researchers. By contrast, we investigated the statistical reporting quality and selected indicators of questionable research practices (QRPs) in psychology students' master's theses. In a total of 250 theses, we investigated the utilization and magnitude of standardized effect sizes, along with statistical power, the consistency and completeness of reported results, and possible indications of p-hacking and further testing. Effect sizes were reported for 36% of focal tests (median r = 0.19), and only a single formal power analysis was reported for sample size determination (median observed power 1 - β = 0.67). Statcheck revealed inconsistent p-values in 18% of cases, with 2% leading to decision errors. There were no clear indications of p-hacking or further testing. We discuss our findings in light of promoting open science standards in teaching and student supervision.
Collapse
Affiliation(s)
- Jerome Olsen
- Faculty of Psychology, Department of Applied Psychology: Work, Education and Economy, University of Vienna, Universitaetsstrasse 7, 1010 Vienna, Austria
| | - Johanna Mosen
- Faculty of Psychology, Department of Applied Psychology: Work, Education and Economy, University of Vienna, Universitaetsstrasse 7, 1010 Vienna, Austria
| | - Martin Voracek
- Faculty of Psychology, Department of Basic Psychological Research and Research Methods, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria
| | - Erich Kirchler
- Faculty of Psychology, Department of Applied Psychology: Work, Education and Economy, University of Vienna, Universitaetsstrasse 7, 1010 Vienna, Austria
| |
Collapse
|
17
|
Abstract
Summary. In (educational) psychology, replication studies have so far been extremely rare exceptions. This article argues that, and why, replication studies are indispensable. It further examines why, despite their enormous added value, almost no replications are published and why many "results" of psychological research are not replicable. That these are not mere conjectures is documented by existing investigations. The causes lie at several, partly interdependent, levels of the scientific system: the widespread but mistaken view that "statistical significance" also indicates the probability of replicating a finding; the confusion of "statistically significant" with relevant; the malpractice of formulating the tested hypotheses only after the fact (ex post), i.e., with knowledge of a study's results, yet presenting them in the publication as theoretically derived starting points (i.e., as formulated a priori); alpha-error inflation through multiple statistical significance tests; the exclusive reporting of results that support the research hypotheses, combined with the suppression of deviating findings; inadequate construct validity of the measurement instruments used; fraud and deception in science; and the low regard for replications among journal editors, reviewers, and funding agencies. All of this leads to the almost exclusive publication of "statistically significant" and "new" results and to the persistence of false theories.
Suggested countermeasures include: generous financial support for replication projects and their publication; emphatic reviewer endorsement of publishing methodologically adequate replication studies; the willingness of journals to provide sufficient space for them; and recognition of the great scientific value of replication studies, including in hiring procedures. It follows that the proposed measures and demands for establishing and promoting replication studies must address different stakeholders in parallel. Lasting change, however, can only be achieved if the individual actors (researchers, reviewers, journal editors, hiring committees, funding agencies) acknowledge their individual responsibility and act accordingly.
Collapse
Affiliation(s)
- Detlef H. Rost
- Southwest University Chongqing, Faculty of Psychology, Chongqing, P. R. China
- Philipps-Universität Marburg, Department of Psychology, Marburg, Germany
| | - Marc Bienefeld
- Universität Bielefeld, Faculty of Educational Science, Bielefeld, Germany
| |
Collapse
|
18
|
Grahe JE, Cuccolo K, Leighton DC, Cramblet Alvarez LD. Open Science Promotes Diverse, Just, and Sustainable Research and Educational Outcomes. Psychology Learning & Teaching 2019. [DOI: 10.1177/1475725719869164] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Open science initiatives, which are often collaborative efforts focused on making research more transparent, have experienced increasing popularity in the past decade. Open science principles of openness and transparency provide opportunities to advance diversity, justice, and sustainability by promoting diverse, just, and sustainable outcomes among both undergraduate and senior researchers. We review models that demonstrate the importance of greater diversity, justice, and sustainability in psychological science before describing how open science initiatives promote these values. Open science initiatives also promote diversity, justice, and sustainability through increased levels of inclusion and access, equitable distribution of opportunities and dissemination of knowledge, and increased sustainability stemming from increased generalizability. In order to provide an application of the concepts discussed, we offer a set of diversity, justice, and sustainability lens questions for individuals to use while assessing research projects and other organizational systems and consider concrete classroom applications for these initiatives.
Collapse
Affiliation(s)
- Jon E Grahe
- Psychology Department, Pacific Lutheran University, USA
| | - Kelly Cuccolo
- Psychology Department, University of North Dakota, USA
| | - Dana C Leighton
- College of Arts, Science, and Education, Texas A&M University – Texarkana, USA
| | | |
Collapse
|
19
|
Jekel M, Fiedler S, Allstadt Torras R, Mischkowski D, Dorrough AR, Glöckner A. How to Teach Open Science Principles in the Undergraduate Curriculum—The Hagen Cumulative Science Project. Psychology Learning & Teaching 2019. [DOI: 10.1177/1475725719868149] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
Abstract
The Hagen Cumulative Science Project is a large-scale replication project based on students’ thesis work. In the project, we aim to (a) teach students to conduct the entire research process of a replication according to open science standards and (b) contribute to cumulative science by increasing the number of direct replications. We describe the procedural steps of the project, from choosing suitable replication studies to guiding students through the process of conducting a replication and processing the results in a meta-analysis. Based on the experience of more than 80 replications, we summarize how such a project can be implemented. We present practical solutions that have been shown to be successful and discuss typical obstacles and how they can be solved. We argue that replication projects are beneficial for all groups involved: Students benefit by being guided through a highly structured protocol and making actual contributions to science. Instructors benefit by using time resources effectively for cumulative science and fulfilling teaching obligations in a meaningful way. The scientific community benefits from the resulting greater number of replications and from the teaching of state-of-the-art methodology. We encourage the use of student thesis-based replication projects for thesis work in academic bachelor and master curricula.
Collapse
Affiliation(s)
| | | | | | | | | | - Andreas Glöckner
- University of Cologne, Germany
- Max Planck Institute for Collective Goods, Germany
| |
Collapse
|
20
|
Skorinko JLM. Scholarship of Discovery and Beyond: Thinking About Multiple Forms of Scholarship and Elements of Project-Based Learning to Engage Undergraduates in Publishable Research. Front Psychol 2019; 10:917. [PMID: 31110486 PMCID: PMC6499216 DOI: 10.3389/fpsyg.2019.00917] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2018] [Accepted: 04/05/2019] [Indexed: 11/13/2022] Open
Affiliation(s)
- Jeanine L. M. Skorinko
- Department of Social Science and Policy Studies, Worcester Polytechnic Institute, Worcester, MA, United States
| |
Collapse
|
21
|
Affiliation(s)
- Traci A. Giuliano
- Department of Psychology, Southwestern University, Georgetown, TX, United States
| |
Collapse
|