1
Petersen IT, Apfelbaum KS, McMurray B. Adapting Open Science and Pre-registration to Longitudinal Research. Infant Child Dev 2024; 33:e2315. PMID: 38425545; PMCID: PMC10904029; DOI: 10.1002/icd.2315.
Abstract
Open science practices, such as pre-registration and data sharing, increase transparency and may improve the replicability of developmental science. However, developmental science has lagged behind other fields in implementing open science practices. This lag may arise from unique challenges and considerations of longitudinal research. In this paper, preliminary guidelines are provided for adapting open science practices to longitudinal research to facilitate researchers' use of these practices. The guidelines propose a serial and modular approach to registration that includes an initial pre-registration of the methods and focal hypotheses of the longitudinal study, along with subsequent pre- or co-registered questions, hypotheses, and analysis plans associated with specific papers. Researchers are encouraged to share their research materials and relevant data with associated papers, and to report sufficient information for replicability. In addition, there should be careful consideration about requirements regarding the timing of data sharing, to avoid disincentivizing longitudinal research.
Affiliation(s)
- Isaac T Petersen
- Department of Psychological and Brain Sciences, University of Iowa
- Bob McMurray
- Department of Psychological and Brain Sciences, Department of Communication Sciences and Disorders and Department of Linguistics, University of Iowa
2
Li W, Germine LT, Mehr SA, Srinivasan M, Hartshorne J. Developmental psychologists should adopt citizen science to improve generalization and reproducibility. Infant Child Dev 2024; 33:e2348. PMID: 38515737; PMCID: PMC10957098; DOI: 10.1002/icd.2348.
Abstract
Widespread failures of replication and generalization are, ironically, a scientific triumph, in that they confirm the fundamental metascientific theory that underlies our field. Generalizable and replicable findings require testing large numbers of subjects from a wide range of demographics, with a large, randomly sampled stimulus set, and using a variety of experimental parameters. Because few studies accomplish any of this, meta-scientists predict that findings will frequently fail to replicate or generalize. We argue that to be more robust and replicable, developmental psychology needs a mechanism for collecting data at greater scale and from more diverse populations. Luckily, this mechanism already exists: citizen science, in which large numbers of uncompensated volunteers provide data. While best known for its contributions to astronomy and ecology, citizen science has also produced major findings in neuroscience and psychology, and increasingly in developmental psychology. We provide examples, address practical challenges, discuss limitations, and compare citizen science to other methods of obtaining large datasets. Ultimately, we argue that the range of studies where it makes sense *not* to use citizen science is steadily dwindling.
Affiliation(s)
- Wei Li
- Department of Psychology and Neuroscience, Boston College, Chestnut Hill, MA, USA
- Laura Thi Germine
- McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Cambridge, MA
- Samuel A. Mehr
- Data Science Initiative, Harvard University, Cambridge, MA
- School of Psychology, Victoria University of Wellington, Wellington, New Zealand
- Joshua Hartshorne
- Department of Psychology and Neuroscience, Boston College, Chestnut Hill, MA, USA
3
Abstract
We are now in a time of readily available brain imaging data. Not only are researchers sharing data more than ever before, but large-scale data-collection initiatives are also underway with the vision that many future researchers will use the data for secondary analyses. Here I provide an overview of available datasets and some example use cases. Example use cases include examining individual differences, obtaining more robust findings, supporting reproducibility (both through publicly available input data and through use as a replication sample), and methods development. I further discuss a variety of considerations associated with using existing data and the opportunities associated with large datasets. Suggestions for further reading on general neuroimaging and topic-specific discussions are also provided.
4
Ashworth M, Palikara O, Burchell E, Purser H, Nikolla D, Van Herwegen J. Online and Face-to-Face Performance on Two Cognitive Tasks in Children With Williams Syndrome. Front Psychol 2021; 11:594465. PMID: 33613354; PMCID: PMC7889503; DOI: 10.3389/fpsyg.2020.594465.
Abstract
Cognitive assessment via the Internet has increased, especially since the coronavirus disease 2019 (COVID-19) pandemic heightened the need for remote psychological assessment. This is the first study to investigate the appropriateness of conducting cognitive assessments online with children with a neurodevelopmental condition and intellectual disability, namely Williams syndrome (WS). This study compared Raven's Colored Progressive Matrices (RCPM) and British Picture Vocabulary Scale (BPVS) scores from two different groups of children with WS aged 10–11 years who were assessed online (n = 14) or face-to-face (RCPM n = 12; BPVS n = 24). Bayesian t-tests showed that children's RCPM scores were similar across testing conditions but suggested that BPVS scores were higher for participants assessed online. The differences between task protocols are discussed in line with these findings, as well as the implications for neurodevelopmental research.
Affiliation(s)
- Maria Ashworth
- Department of Psychology and Human Development, UCL Institute of Education, University College London, London, United Kingdom
- Olympia Palikara
- Department of Education Studies, University of Warwick, Coventry, United Kingdom
- Elizabeth Burchell
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
- Harry Purser
- Department of Psychology, Nottingham Trent University, Nottingham, United Kingdom
- Dritan Nikolla
- Department of Psychology, Kingston University, Kingston upon Thames, United Kingdom
- Jo Van Herwegen
- Department of Psychology and Human Development, UCL Institute of Education, University College London, London, United Kingdom
5
Brinberg M, Ram N, Yang X, Cho MJ, Sundar SS, Robinson TN, Reeves B. The Idiosyncrasies of Everyday Digital Lives: Using the Human Screenome Project to Study User Behavior on Smartphones. Comput Human Behav 2021; 114:106570. PMID: 33041494; PMCID: PMC7543997; DOI: 10.1016/j.chb.2020.106570.
Abstract
Most methods used to make theory-relevant observations of technology use rely on self-report or application logging data, in which individuals' digital experiences are purposively summarized into aggregates meant to describe how the average individual engages with broadly defined segments of content. This aggregation and averaging masks heterogeneity in how and when individuals actually engage with their technology. In this study, we examine individuals' digital experiences using screenshots (N > 6 million) collected every five seconds from 132 smartphone users over several weeks; the screenshots were sequenced and processed with text and image extraction tools into content-, context-, and temporally informative "screenomes". Analyses of screenomes highlight extreme between-person and within-person heterogeneity in how individuals switch among and titrate their engagement with different content. Our simple quantifications of textual and graphical content and flow throughout the day illustrate the value screenomes have for the study of individuals' smartphone use and the cognitive and psychological processes that drive use. We demonstrate how temporal, textual, graphical, and topical features of people's smartphone screens can lay the foundation for expanding the Human Screenome Project with full-scale mining that will inform researchers' knowledge of digital life.
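A screenome is, at bottom, a per-person time series of screen content sampled at a fixed interval. As a minimal sketch of the kind of quantification this abstract describes, the following Python snippet (toy data; the participants, category labels, and `switch_rate` helper are illustrative, not from the study) computes each person's content-switch rate from 5-second screenshot sequences and the spread of those rates across people:

```python
# Toy "screenomes": the content category visible at each 5-second screenshot.
# Participants and categories are hypothetical, for illustration only.
from statistics import pstdev

screenomes = {
    "p1": ["social", "social", "news", "social", "news", "news"],
    "p2": ["games", "games", "games", "games", "games", "games"],
    "p3": ["social", "games", "news", "social", "games", "news"],
}

def switch_rate(seq):
    """Fraction of consecutive screenshots where the content changes."""
    return sum(a != b for a, b in zip(seq, seq[1:])) / (len(seq) - 1)

# Within-person behavior: how often each person switches content.
rates = {person: switch_rate(seq) for person, seq in screenomes.items()}
print(rates)  # p1 switches on 3 of 5 transitions, p2 never, p3 always

# Between-person heterogeneity: spread of switch rates across people.
print(pstdev(rates.values()))
```

Real screenomes pair each frame with timestamps, OCR-extracted text, and image features, but the same sequence-level statistics apply at scale.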
6
Wang Z. When Large-Scale Assessments Meet Data Science: The Big-Fish-Little-Pond Effect in Fourth- and Eighth-Grade Mathematics Across Nations. Front Psychol 2020; 11:579545. PMID: 33101148; PMCID: PMC7554313; DOI: 10.3389/fpsyg.2020.579545.
Abstract
The programming language R has useful data science tools that can automate analysis of large-scale educational assessment data such as those available from the United States Department of Education's National Center for Education Statistics (NCES). This study used three R packages (EdSurvey, MplusAutomation, and tidyverse) to examine the big-fish-little-pond effect (BFLPE) in mathematics in 56 countries in fourth grade and 46 countries in eighth grade, using data from the Trends in International Mathematics and Science Study (TIMSS) 2015. The BFLPE refers to the phenomenon that students in higher-achieving contexts tend to have lower self-concept than similarly able students in lower-achieving contexts due to social comparison. In this study, it is used as a substantive theory to illustrate the implementation of data science tools to carry out large-scale cross-national analysis. For each country and grade, two statistical models were applied: one for cross-level measurement invariance testing and one for testing the BFLPE. The first model was a multilevel confirmatory factor analysis for the measurement of mathematics self-concept using three items. The second model was a multilevel latent variable model that decomposed the effect of achievement on self-concept into between and within components; the difference between them was the contextual effect of the BFLPE. The BFLPE was found in 51 of the 56 countries in fourth grade and 44 of the 46 countries in eighth grade. The study provides syntax and discusses problems encountered while using the tools for modeling and processing of modeling results.
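The between/within decomposition in the second model can be illustrated outside the paper's R and Mplus toolchain. Below is a simplified Python sketch on simulated data (an OLS stand-in for the paper's multilevel latent variable models; all variable names and numbers are hypothetical): achievement is split into a between-school component (the school mean) and a within-school deviation, and the BFLPE contextual effect is the between coefficient minus the within coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools, n_students = 50, 30
school = np.repeat(np.arange(n_schools), n_students)

# Simulate achievement with a school-level and a student-level component.
school_mean = rng.normal(0.0, 1.0, n_schools)
ach = school_mean[school] + rng.normal(0.0, 1.0, school.size)

# Self-concept rises with own achievement but, at equal own achievement,
# is lower in higher-achieving schools (a negative contextual effect).
selfcon = 0.5 * ach - 0.9 * school_mean[school] + rng.normal(0.0, 1.0, school.size)

# Decompose observed achievement into between (school mean) and within parts.
counts = np.bincount(school)
ach_between = (np.bincount(school, weights=ach) / counts)[school]
ach_within = ach - ach_between

# OLS with both components; the contextual effect is between minus within.
X = np.column_stack([np.ones(school.size), ach_within, ach_between])
beta, *_ = np.linalg.lstsq(X, selfcon, rcond=None)
within_eff, between_eff = beta[1], beta[2]
contextual = between_eff - within_eff
print(f"within = {within_eff:.2f}, between = {between_eff:.2f}, "
      f"contextual (BFLPE) = {contextual:.2f}")
```

In the study itself, this decomposition was estimated with multilevel latent variable models, which additionally account for measurement error in the three-item self-concept scale; the sketch conveys only the between/within logic.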
Affiliation(s)
- Ze Wang
- Department of Educational, School & Counseling Psychology, University of Missouri, Columbia, MO, United States
7
Barbot B, Hein S, Trentacosta C, Beckmann JF, Bick J, Crocetti E, Liu Y, Rao SF, Liew J, Overbeek G, Ponguta LA, Scheithauer H, Super C, Arnett J, Bukowski W, Cook TD, Côté J, Eccles JS, Eid M, Hiraki K, Johnson M, Juang L, Landi N, Leckman J, McCardle P, Mulvey KL, Piquero AR, Preiss DD, Siegler R, Soenens B, Yousafzai AK, Bornstein MH, Cooper CR, Goossens L, Harkness S, van IJzendoorn MH. Manifesto for new directions in developmental science. New Dir Child Adolesc Dev 2020; 2020:135-149. PMID: 32960503; DOI: 10.1002/cad.20359.
Abstract
Although developmental science has always been evolving, these times of fast-paced and profound social and scientific changes easily lead to disorienting fragmentation rather than coherent scientific advances. What directions should developmental science pursue to meaningfully address real-world problems that impact human development throughout the lifespan? What conceptual or policy shifts are needed to steer the field in these directions? The present manifesto is proposed by a group of scholars from various disciplines and perspectives within developmental science to spark conversations and action plans in response to these questions. After highlighting four critical content domains that merit concentrated and often urgent research efforts, two issues regarding "how" we do developmental science and "what for" are outlined. This manifesto concludes with five proposals, calling for integrative, inclusive, transdisciplinary, transparent, and actionable developmental science. Specific recommendations, prospects, pitfalls, and challenges to reach this goal are discussed.
Affiliation(s)
- Baptiste Barbot
- Psychological Sciences Research Institute, UCLouvain, Belgium & Yale Child Study Center, Yale University, USA
- Johanna Bick
- Department of Psychology, University of Houston, USA
- Jeffrey Liew
- Department of Educational Psychology, Texas A&M University, USA
- Charles Super
- Department of Human Development and Family Sciences & Center for the Study of Culture, Health, and Human Development, University of Connecticut, USA
- Thomas D Cook
- GW Institute of Public Policy, George Washington University & Northwestern University, USA
- James Côté
- Department of Sociology, University of Western Ontario, Canada
- Michael Eid
- Department of Education and Psychology, Freie Universität Berlin, Germany
- Kazuo Hiraki
- Department of General Systems Studies, University of Tokyo, Japan
- Nicole Landi
- Department of Psychological Sciences, University of Connecticut, USA
- Peggy McCardle
- Haskins Laboratories & Peggy McCardle Consulting, LLC, USA
- David D Preiss
- Psychology, Pontifical Catholic University of Chile, Chile
- Bart Soenens
- Department of Developmental, Personality, and Social Psychology, Ghent University, Belgium
- Aisha Khizar Yousafzai
- Department of Global Health and Population, Harvard T.H. Chan School of Public Health, USA
- Luc Goossens
- School Psychology and Development, KU Leuven, Belgium
- Sara Harkness
- Center for the Study of Culture, Health, and Human Development and Department of Human Development and Family Sciences, University of Connecticut, USA
8
Abstract
Emerging technologies for analyzing biospecimens have led to advances in understanding the interacting role of genetics and environment on development and individual responsivity to prevention and intervention programs. The scientific study of gene-environment influences has also benefited from the growth of Big Data tools that allow linking genomic data to health, educational, and other information stored in large integrated datasets. These advances have created a new frontier of ethical challenges for scientists as they collect, store, or engage in secondary use of potentially identifiable information and biospecimens. To address challenges arising from technological advances and the expanding contexts in which potentially identifiable information and biospecimens are collected and stored, the Office of Human Research Protections has revised federal regulations for the protection of human subjects. The revised regulations create new format, content, and transparency requirements for informed consent, including a new mechanism known as broad consent. Broad consent offers participants a range of choices regarding consent for the storage and future use of their personally identifiable data. These regulations have important implications for how prevention scientists and oversight boards acquire participant consent for the collection, storage, and future use of their data by other investigators for scientific purposes significantly different from the original study. This article describes regulatory changes and challenges affecting traditional informed consent for prevention research, followed by a description of the rationale and requirements for obtaining broad consent, and concludes with a discussion of future challenges involving ongoing transparency and protections for participants and their communities.
Affiliation(s)
- Celia B Fisher
- Department of Psychology, Fordham University, Dealy Hall 441, East Fordham Road, Bronx, NY, 10458, USA
- Deborah M Layman
- Department of Psychology, Fordham University, Dealy Hall 441, East Fordham Road, Bronx, NY, 10458, USA
9
Abstract
Widespread sharing of data and materials (including displays and text- and video-based descriptions of experimental procedures) will improve the reproducibility of psychological science and accelerate the pace of discovery. In this article, we discuss some of the challenges to open sharing and offer practical solutions for researchers who wish to share more of the products, and the process, of their research. Many of these solutions were devised by the Databrary.org data library for storing and sharing video, audio, and other forms of sensitive or personally identifiable data. We also discuss ways in which researchers can make shared data and materials easier for others to find and reuse. Widely adopted, these solutions and practices will increase transparency and speed progress in psychological science.
10
Gilmore RO, Diaz MT, Wyble BA, Yarkoni T. Progress toward openness, transparency, and reproducibility in cognitive neuroscience. Ann N Y Acad Sci 2017; 1396:5-18. PMID: 28464561; PMCID: PMC5545750; DOI: 10.1111/nyas.13325.
Abstract
Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remains the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery.
Affiliation(s)
- Michele T. Diaz
- Department of Psychology, The Pennsylvania State University
- Social, Life, & Engineering Sciences Imaging Center
- Brad A. Wyble
- Department of Psychology, The Pennsylvania State University
11
Gilmore RO, Adolph KE, Millman DS, Gordon A. Transforming Education Research Through Open Video Data Sharing. Adv Eng Educ 2016; 5. http://advances.asee.org/wp-content/uploads/vol05/issue02/Papers/AEE-18-Gilmore.pdf. PMID: 28042361; PMCID: PMC5199018.
Abstract
Open data sharing promises to accelerate the pace of discovery in the developmental and learning sciences, but significant technical, policy, and cultural barriers have limited its adoption. As a result, most research on learning and development remains shrouded in a culture of isolation. Data sharing is the rare exception (Gilmore, 2016). Many researchers who study teaching and learning in classroom, laboratory, museum, and home contexts use video as a primary source of raw research data. Unlike other measures, video captures the complexity, richness, and diversity of behavior. Moreover, because video is self-documenting, it presents significant potential for reuse. However, the potential for reuse goes largely unrealized because videos are rarely shared. Research videos contain information about participants' identities, making the materials challenging to share. The large size of video files, diversity of formats, and incompatible software tools pose technical challenges. The Databrary (databrary.org) digital library enables researchers who study learning and development to store, share, stream, and annotate videos. In this article, we describe how Databrary has overcome barriers to sharing research videos and associated data and metadata. Databrary has developed solutions for respecting participants' privacy; for storing, streaming, and sharing videos; and for managing videos and associated metadata. The Databrary experience suggests ways that videos and other identifiable data collected in the context of educational research might be shared. Open data sharing enabled by Databrary can serve as a catalyst for a truly multidisciplinary science of learning.