Looking for Data Curators for FORRT’s Replication Database

The Framework for Open and Reproducible Research Training (FORRT) and the Münster Center for Open Science (MüCOS) seek dedicated researchers to expand the world’s largest replication database. Supported by the Center for Open Science (COS) and the Netherlands Organisation for Scientific Research (NWO), we aim to increase coverage of social, cognitive, and behavioral science replications and to improve data quality through validation, strengthening this open-science community resource that supports meta-science.

Position Details

Duration: Starting as soon as possible, with the potential to work until September 2025. We would expect you to work 4-15 hours per week, depending on your availability.

Training: Training and supervision will be provided.

Trial: Chosen applicants will be assigned a set of studies to code and will submit a dataset with the extracted values. Based on their performance on that initial set of studies, they will receive €250 for strict adherence to the guidelines, €200 if they meet the guidelines only after a revision, or €50 for their participation if they do not meet the guidelines.

Compensation: Candidates who meet the guidelines are offered further work at a base pay of €20/hour, with additional performance bonuses based on productivity and accuracy. You will usually work as an independent freelancer, so you may need to pay applicable taxes yourself.

Transparent contributions: Contributors will receive immediate recognition on FORRT’s official contributorship page. Moreover, there are opportunities for ongoing academic and scientific contributions, with potential co-authorship on future manuscripts.

Work Mode: 100% remote (working language: English).

Responsibilities

As a replication recorder or coder, you will:

Record (or code) replications: Extract details from published articles and add them to FORRT’s Replication Database (FReD; https://forrt.org/replication-hub/). An illustrative sketch of such a record follows this list.

Validate and expand existing entries: Check the accuracy of existing data and enhance records with additional details on statistics and results.

Record, validate, and report: Maintain transparent documentation of coding processes, double-check work done by other coders, and resolve discrepancies through discussion.
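To make the coding task more concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of record a coder might assemble for one replication. All field names and values are hypothetical placeholders; they do not represent FReD’s actual coding scheme, submission format, or data from any real study.

import math  # only needed if you later convert statistics

# Purely illustrative: field names and values are hypothetical placeholders,
# NOT FReD's actual coding scheme or data from any real study.
replication_record = {
    "original_study": "Original Author (Year), Study 1",
    "replication_study": "Replicating Author (Year), Study 2",
    "claim": "Short description of the claim being tested",
    "original_effect_size_r": 0.30,     # effect size reported in the original
    "replication_effect_size_r": 0.10,  # effect size found in the replication
    "replication_sample_size": 250,     # sample size of the replication study
    "result": "not significant",        # outcome of the replication test
    "page_number": 12,                  # page where the key statistic is reported
    "coder_notes": "Effect size converted from a reported t-statistic.",
}

# A simple validation pass (cf. "Validate and expand existing entries") might
# just check that the required fields are present and non-empty:
required = ["original_study", "replication_study", "claim",
            "replication_sample_size", "result"]
missing = [field for field in required if not replication_record.get(field)]
assert not missing, f"Missing fields: {missing}"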

Expected Skills & Qualifications

Ideal candidates for this role will:

● Have an academic background or demonstrated interest in social, cognitive, or behavioral sciences.

● Have experience with data coding (e.g., for meta-analyses) or systematic literature reviews.

● Be motivated to contribute to replication research and open science initiatives.

● Demonstrate meticulous attention to detail and a commitment to accuracy.

● Be comfortable working independently and adhering to structured coding protocols.

● Have a solid understanding of statistics, so that they can interpret quantitative results reported across various social science disciplines.

Application Process

Please complete the tasks below and fill out the application form.

● Submit a brief CV and the latest academic transcript (if available).

● Applications will be reviewed on a rolling basis. If you have any questions, contact Lukas Röseler at lukas.roeseler@uni-muenster.de and cc info@forrt.org.

Assessment form

The tasks we are recruiting for require a special skill set involving experience with different types of effect sizes and research articles, as well as a high degree of diligence in coding values. Before reaching out to us, please check whether you are up to the task with this self-assessment. Each (sub)task should not take you longer than 5 minutes.
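As a quick illustration of what working with “different types of effect sizes” can involve, here is a minimal Python sketch of two standard conversions coders often need: from a reported t-statistic to Cohen’s d and to the point-biserial correlation r. These are generic textbook formulas, not FReD-specific procedures, and the numbers are placeholders rather than values from any of the articles below.

import math

def d_from_t(t, n1, n2):
    # Cohen's d from an independent-samples t-test (pooled-SD formulation)
    return t * math.sqrt(1 / n1 + 1 / n2)

def r_from_t(t, df):
    # Point-biserial correlation r from a t-statistic and its degrees of freedom
    return t / math.sqrt(t ** 2 + df)

# Placeholder numbers, not taken from any of the articles in the tasks:
t, n1, n2 = 2.10, 40, 42
print(round(d_from_t(t, n1, n2), 3))        # ~0.464
print(round(r_from_t(t, n1 + n2 - 2), 3))   # ~0.229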

Task 1: Check inclusion criteria

Find out whether these studies meet the definition of a replication, that is: self-identification as a replication; reference to a specific original study that is the replication target; testing a claim that was already tested in the original study; and collecting new data and reporting the test results.

  1. Magne, V. (2024). Replication research in the domain of perceived L2 fluency: Approximate and close replications of Kormos and Dénes (2004) and Rossiter (2009). Language Teaching, 1-9. https://doi.org/10.1017/S0261444824000120

  2. Stephenson, C. (2024). Trends in U.S. Wage Inequality: Revising the Revisionists. A Replication Study of Autor, Katz, and Kearney (2008). Journal of Comments and Replications in Economics, Vol. 3 (2024-4). https://doi.org/10.18718/81781.34

  3. Huensch, A. (2024). Clarifying the role of inhibitory control in L2 phonological processing: A preregistered, close replication of Darcy et al. (2016). Studies in Second Language Acquisition, 1-21. https://doi.org/10.1017/S0272263124000238

Task 2: Quick but rigorous identification of metrics

Extract the sample size for Study 3 (the paper is available for download on ResearchGate: https://www.researchgate.net/publication/319414600_Awe_and_Humility).

Stellar, J. E., Gordon, A., Anderson, C. L., Piff, P. K., McNeil, G. D., & Keltner, D. (2018). Awe and humility. Journal of Personality and Social Psychology, 114(2), 258–269. https://doi.org/10.1037/pspi0000109

Task 3: Extraction and documentation of multiple values

Extract the claim, page number, and result from the replication study in the following article (Study 2).

Murtagh, A. M., & Todd, S. A. (2004). Self-regulation: A challenge to the strength model. Journal of Articles in Support of the Null Hypothesis, 3(1), 19-51.

About FORRT’s Replication Database (FReD)

FORRT’s Replication Database aims to document replication studies across the social, behavioral, and cognitive sciences and to make them discoverable through various interfaces. So far, we have identified more than 1,600 replication results from large-scale replication projects and individual articles, and many more remain to be added. You can explore the full database through the FReD Explorer or search for specific results in the FReD Annotator.