“The proof established by the test must have a specific form, namely, repeatability. The issue of the experiment must be a statement of the hypothesis, the conditions of test, and the results, in such form that another experimenter, from the description alone, may be able to repeat the experiment. Nothing is accepted as proof, in psychology or in any other science, which does not conform to this requirement.” – (Dunlap 1926)
Repeatability is the cornerstone of many sciences: Much scientific progress rests on accumulating evidence for claims through reproductions and replications, that is, repeated tests of a hypothesis with the same (reproduction) or different (replication) data. Such repeated testing is what establishes robust discoveries.
Cumulative science is costly without repetition. The aim of this guide is to empower researchers to conduct high-quality reproductions and replications and thereby make their fields of research more cumulative and robust. Issues of replicability have been discussed across many disciplines, such as psychology (Open Science Collaboration 2015), economics (Dreber and Johannesson 2024), biology (Errington et al. 2021), marketing (Urminsky and Dietvorst 2024), linguistics (McManus 2024), computer science (Hummel and Manner 2024), and epidemiology (Lash, Collin, and Van Dyke 2018), and the number of replications has been rising sharply (see Figure 1).
Figure 1
Number of replication studies by year of publication, based on the FORRT Replication Database (FReD; Röseler et al. 2024), version of July 16, 2025. Code to reproduce the figure: https://osf.io/dznrb.

While the number of replication and reproduction studies has increased, their overall proportion remains very small, with reviews finding yearly replication rates of up to 1% (Perry, Morris, and Lea 2022). Moreover, much of the guidance on replications is still being actively developed (Clarke et al. 2024) and within narrow parts of science, which leads to fragmentation, siloing, and potentially inconsistent information.
Here we attempt to integrate useful guidelines (e.g., Block and Kuckertz 2018; Jekel et al. 2020) into a comprehensive overview that allows diverse fields to profit from each other. In sum, this guide covers the entire research process, allowing researchers at all career stages to plan, conduct, and publish reproduction and replication studies. We limit our scope to quantitative research, given that the concepts of reproducibility and replicability are themselves highly contested among qualitative researchers (see Makel, Plucker, and Hegarty 2012; Cole et al. 2024; Pownall 2022; Bennett 2021).
Bennett, E. A. 2021.
“Open Science from a Qualitative, Feminist Perspective: Epistemological Dogmas and a Call for Critical Examination.” Psychology of Women Quarterly 45 (4): 448–56.
https://doi.org/10.1177/03616843211036460.
Block, J., and A. Kuckertz. 2018.
“Seven Principles of Effective Replication Studies: Strengthening the Evidence Base of Management Research.” Management Review Quarterly 68 (4): 355–59.
https://doi.org/10.1007/s11301-018-0149-3.
Clarke, B., P. Y. (K.) Lee, S. R. Schiavone, M. Rhemtulla, and S. Vazire. 2024.
“The Prevalence of Direct Replication Articles in Top-Ranking Psychology Journals.” American Psychologist.
https://doi.org/10.1037/amp0001385.
Cole, N. L., S. Ulpts, A. Bochynska, E. Kormann, M. Good, B. Leitner, and T. Ross-Hellauer. 2024.
“Reproducibility and Replicability of Qualitative Research: An Integrative Review of Concepts, Barriers and Enablers.” https://doi.org/10.31222/osf.io/n5zkw_v1.
Dreber, A., and M. Johannesson. 2024.
“A Framework for Evaluating Reproducibility and Replicability in Economics.” Economic Inquiry.
https://doi.org/10.1111/ecin.13244.
Dunlap, K. 1926.
“The Experimental Methods of Psychology.” In
Psychologies of 1925, edited by C. Murchison, 331–51. Clark University Press.
https://doi.org/10.1037/11020-022.
Errington, T. M., M. Mathur, C. K. Soderberg, A. Denis, N. Perfito, E. Iorns, and B. A. Nosek. 2021.
“Investigating the Replicability of Preclinical Cancer Biology.” eLife 10: e71601.
https://doi.org/10.7554/eLife.71601.
Hummel, T., and J. Manner. 2024.
“A Literature Review on Reproducibility Studies in Computer Science.” In Proceedings of the 16th ZEUS Workshop on Services and Their Composition (ZEUS 2024), CEUR Workshop Proceedings, Vol. 3673.
Jekel, M., S. Fiedler, R. Allstadt Torras, D. Mischkowski, A. R. Dorrough, and A. Glöckner. 2020.
“How to Teach Open Science Principles in the Undergraduate Curriculum—the Hagen Cumulative Science Project.” Psychology Learning & Teaching 19 (1): 91–106.
https://doi.org/10.1177/1475725719868149.
Lash, T. L., L. J. Collin, and M. E. Van Dyke. 2018.
“The Replication Crisis in Epidemiology: Snowball, Snow Job, or Winter Solstice?” Current Epidemiology Reports 5: 175–83.
Makel, M. C., J. A. Plucker, and B. Hegarty. 2012.
“Replications in Psychology Research: How Often Do They Really Occur?” Perspectives on Psychological Science 7 (6): 537–42.
https://doi.org/10.1177/1745691612460688.
McManus, K. 2024.
“Replication Studies in Second Language Acquisition Research: Definitions, Issues, Resources, and Future Directions: Introduction to the Special Issue.” Studies in Second Language Acquisition 46 (5): 1299–319.
https://doi.org/10.1017/S0272263124000652.
Open Science Collaboration. 2015.
“Estimating the Reproducibility of Psychological Science.” Science 349 (6251): aac4716.
https://doi.org/10.1126/science.aac4716.
Perry, T., R. Morris, and R. Lea. 2022.
“A Decade of Replication Study in Education? A Mapping Review (2011–2020).” Educational Research and Evaluation 27 (1-2): 12–34.
https://doi.org/10.1080/13803611.2021.2022315.
Pownall, M. 2022.
“Is Replication Possible for Qualitative Research?” https://doi.org/10.31234/osf.io/dwxeg.
Röseler, L., L. Kaiser, C. Doetsch, N. Klett, C. Seida, A. Schütz, and Y. Zhang. 2024.
“The Replication Database: Documenting the Replicability of Psychological Science.” Journal of Open Psychology Data 12 (1): 8.
https://doi.org/10.5334/jopd.101.
Urminsky, O., and B. J. Dietvorst. 2024.
“Taking the Full Measure: Integrating Replication into Research Practice to Assess Generalizability.” Journal of Consumer Research 51 (1): 157–68.
https://doi.org/10.1093/jcr/ucae007.