8 Replication and meta-research

7 sub-clusters · 82 references

This cluster builds a grounding in replication research, which takes a variety of forms, each with a different purpose and contribution. Replicable science requires replication research. Students should understand the purpose of, and need for, replications in their various forms, and be able to conduct (and join) replication projects. There are 7 sub-clusters which parse the learning and teaching process:

Conducting replication studies; challenges, limitations, and comparisons with the original study

A replication study attempts to repeat the findings of previous research using identical or similar methods, to determine whether consistent results can be obtained. Limits can arise from protocol drift, differences in context or measurement, and low statistical power.
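
To make the low-power point concrete, the minimal sketch below uses statsmodels to compare the replication sample size needed to detect the original point estimate against the size needed for a smaller but still meaningful effect. All numbers are hypothetical; because published estimates are often inflated, powering a replication only for the original estimate is a common pitfall.

```python
# Minimal sketch: sample-size planning for a direct replication.
# The effect sizes below are hypothetical. Because published estimates
# tend to be inflated, powering a replication for a smaller effect than
# the original point estimate is a common safeguard.
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()

d_original = 0.50   # Cohen's d reported by the (hypothetical) original study
d_plausible = 0.25  # smaller effect the replication should still detect

for d in (d_original, d_plausible):
    n_per_group = power_analysis.solve_power(
        effect_size=d, alpha=0.05, power=0.90, alternative="two-sided"
    )
    print(f"d = {d:.2f}: ~{n_per_group:.0f} participants per group for 90% power")
```

Running this shows the required n roughly quadrupling when the target effect halves, which is why many "failed" replications are partly a power story.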

  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
  • Auspurg, K., & Brüderl, J. (2024). Toward a more credible assessment of the credibility of science by many-analyst studies. Proceedings of the National Academy of Sciences, 121(38). https://doi.org/10.1073/pnas.2404035121
  • Bartscherer, S. F., & Reinhart, M. (2025). The (Non)Academic Community Forming around Replications: Mapping the International Open Science space via its Replication Initiatives. https://doi.org/10.31235/osf.io/rbyt6_v1
  • Davis-Stober, C. P., Sarafoglou, A., Aczel, B., Chandramouli, S. H., Errington, T. M., Field, S. M., Fishbach, A., Freire, J., Ioannidis, J. P. A., Oberauer, K., Pestilli, F., Ressl, S., Schad, D. J., ter Schure, J., Tentori, K., van Ravenzwaaij, D., Vandekerckhove, J., & Gundersen, O. E. (2025). How can we make sound replication decisions? Proceedings of the National Academy of Sciences, 122(5). https://doi.org/10.1073/pnas.2401236121
  • Devezer, B., & Buzbas, E. O. (2025). Minimum viable experiment to replicate. PhilSci Archive. https://philsci-archive.pitt.edu/24738/
  • Errington, T. M. (2024). Building reproducible bridges to cross the “valley of death.” Journal of Clinical Investigation, 134(1). https://doi.org/10.1172/JCI177383
  • Frank, M. C., & Saxe, R. (2012). Teaching Replication. Perspectives on Psychological Science, 7(6), 600–604. https://doi.org/10.1177/1745691612460686
  • Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016). Comment on “Estimating the reproducibility of psychological science.” Science, 351(6277), 1037–1037. https://doi.org/10.1126/science.aad7243
  • Grahe, J. E., Brandt, M. J., Wagge, J. R., Legate, N., Wiggins, B. J., Christopherson, C. D., Weisberg, Y., Corker, K. S., Chartier, C. R., Fallon, M., Hildebrandt, L., Hurst, M. A., Lazarevic, L., Levitan, C., McFall, J., McLaughlin, H., Pazda, A., Ijzerman, H., Nosek, B. A., … & France, H. (2018). Collaborative Replications and Education Project (CREP). https://osf.io/wfc6u/
  • Grahe, J. E., Reifman, A., Hermann, A. D., Walker, M., Oleson, K. C., Nario-Redmond, M., & Wiebe, R. P. (2012). Harnessing the Undiscovered Resource of Student Research Projects. Perspectives on Psychological Science, 7(6), 605–607. https://doi.org/10.1177/1745691612459057
  • Harmon-Jones, E., Harmon-Jones, C., Amodio, D. M., Gable, P. A., & Schmeichel, B. J. (2025). Valid replications require valid methods: Recommendations for best methodological practices with lab experiments. Motivation Science, 11(3), 235–245. https://doi.org/10.1037/mot0000398
  • Horbach, S. P. J. M., Cole, N. L., Kopeinik, S., Leitner, B., Ross-Hellauer, T., & Tijdink, J. (2025). How to get there from here? Barriers and enablers on the road towards reproducibility in research [Manuscript]. OSF. https://osf.io/n28sg/
  • Karhulahti, V.-M., Martončik, M., & Adamkovic, M. (2024). Pre-replication: anything goes, once. https://doi.org/10.31234/osf.io/5gn7m
  • King, G. (1995). Replication, replication. PS: Political Science & Politics, 28(3), 444-452. https://gking.harvard.edu/files/replication.pdf
  • Lavelle, J. S. (2023). Growth From Uncertainty: Understanding the Replication ‘Crisis’ in Infant Cognition. Philosophy of Science, 91(2), 390–409. https://doi.org/10.1017/psa.2023.157
  • Lenne & Mann. (2016). CREP project report. OSF. https://osf.io/sdj7e/
  • McIntosh, B., Ichikawa, K., & Nelson, N. C. (2025). Adversarial reanalysis and the challenge of open data in regulatory science. https://doi.org/10.31222/osf.io/jfbr8_v1
  • Pratt, M. G., Kaplan, S., & Whittington, R. (2019). Editorial Essay: The Tumult over Transparency: Decoupling Transparency from Replication in Establishing Trustworthy Qualitative Research. Administrative Science Quarterly, 65(1), 1–19. https://doi.org/10.1177/0001839219887663
  • Ross-Hellauer, T., Klebel, T., Bannach-Brown, A., Horbach, S. P. J. M., Jabeen, H., Manola, N., Metodiev, T., Papageorgiou, H., Reczko, M., Sansone, S.-A., Schneider, J., Tijdink, J., & Vergoulis, T. (2022). TIER2: enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility. Research Ideas and Outcomes, 8. https://doi.org/10.3897/rio.8.e98457
  • Stanley, D. J., & Spence, J. R. (2014). Expectations for Replications. Perspectives on Psychological Science, 9(3), 305–318. https://doi.org/10.1177/1745691614528518
  • Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing Research With Undergraduate Students via Replication Work: The Collaborative Replications and Education Project. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.00247

Direct vs. conceptual replications

Direct replications repeat a study with the same methods and materials; conceptual replications test the same hypothesis with different methods, materials, or both. There is ongoing debate about how “direct” a replication can ever be.

  • Derksen, M., & Morawski, J. (2022). Kinds of Replication: Examining the Meanings of “Conceptual Replication” and “Direct Replication.” Perspectives on Psychological Science, 17(5), 1490–1505. https://doi.org/10.1177/17456916211041116
  • Hüffmeier, J., Mazei, J., & Schultze, T. (2016). Reconceptualizing replication as a sequence of different studies: A replication typology. Journal of Experimental Social Psychology, 66, 81–92. https://doi.org/10.1016/j.jesp.2015.09.009
  • Hutmacher, F., & Franz, D. J. (2025). Approaching psychology’s current crises by exploring the vagueness of psychological concepts: Recommendations for advancing the discipline. American Psychologist, 80(2), 220–231. https://doi.org/10.1037/amp0001300
  • Kunert, R. (2016). Internal conceptual replications do not increase independent replication success. Psychonomic Bulletin & Review, 23(5), 1631–1638. https://doi.org/10.3758/s13423-016-1030-9
  • Simons, D. J. (2014). The Value of Direct Replication. Perspectives on Psychological Science, 9(1), 76–80. https://doi.org/10.1177/1745691613514755
  • Van Bavel, J. J., Mende-Siedlecki, P., Brady, W. J., & Reinero, D. A. (2016). Contextual sensitivity in scientific reproducibility. Proceedings of the National Academy of Sciences, 113(23), 6454–6459. https://doi.org/10.1073/pnas.1521897113
  • Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2017). Making replication mainstream. Behavioral and Brain Sciences, 41. https://doi.org/10.1017/S0140525X17001972

Meta-analyses

Meta-analysis pools estimates to show the bigger picture. Careful work starts with a prespecified plan, aligns effect sizes, and checks bias and sensitivity. It reports heterogeneity and uses prediction intervals to show what a future study might find. Data and code are shared so the synthesis can be audited and updated.
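
To make pooling, heterogeneity, and prediction intervals concrete, here is a minimal NumPy/SciPy sketch of a DerSimonian-Laird random-effects model. The study estimates are invented for illustration; a real synthesis would add the prespecified bias and sensitivity checks described above.

```python
# Minimal sketch: random-effects meta-analysis (DerSimonian-Laird) with a
# 95% prediction interval. The effect estimates and variances are made up.
import numpy as np
from scipy import stats

yi = np.array([0.42, 0.18, 0.55, 0.10, 0.31])  # per-study effect estimates
vi = np.array([0.02, 0.05, 0.03, 0.04, 0.02])  # their sampling variances
k = len(yi)

# Fixed-effect weights and Cochran's Q (heterogeneity statistic)
w = 1.0 / vi
mu_fixed = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - mu_fixed) ** 2)

# DerSimonian-Laird estimate of between-study variance tau^2
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled estimate and its standard error
w_re = 1.0 / (vi + tau2)
mu_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

# 95% CI for the MEAN effect vs. 95% prediction interval for a NEW study
# (Higgins-style PI: t with k-2 df, variance tau^2 + se^2)
ci = mu_re + np.array([-1.0, 1.0]) * stats.norm.ppf(0.975) * se_re
pi = mu_re + np.array([-1.0, 1.0]) * stats.t.ppf(0.975, df=k - 2) * np.sqrt(tau2 + se_re**2)

print(f"pooled effect = {mu_re:.3f}, tau^2 = {tau2:.3f}")
print(f"95% CI = [{ci[0]:.3f}, {ci[1]:.3f}] (uncertainty about the mean)")
print(f"95% PI = [{pi[0]:.3f}, {pi[1]:.3f}] (what a future study might find)")
```

Once tau^2 > 0 the prediction interval is noticeably wider than the confidence interval, which is exactly the paragraph's point about what a future study might find.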

  • Bartoš, F., Maier, M., Shanks, D. R., Stanley, T. D., Sladekova, M., & Wagenmakers, E.-J. (2023). Meta-analyses in psychology often overestimate evidence for and size of effects. Royal Society Open Science, 10(7). https://doi.org/10.1098/rsos.230224
  • Bartoš, F., Maier, M., Wagenmakers, E.-J., Doucouliagos, H., & Stanley, T. D. (2022). Robust Bayesian meta-analysis: Model-averaging across complementary publication bias adjustment methods. Research Synthesis Methods, 14(1), 99–116. https://doi.org/10.1002/jrsm.1594
  • Bartoš, F., Maier, M., Wagenmakers, E.-J., Nippold, F., Doucouliagos, H., Ioannidis, J. P. A., Otte, W. M., Sladekova, M., Deressa, T. K., Bruns, S. B., Fanelli, D., & Stanley, T. D. (2024). Footprint of publication selection bias on meta-analyses in medicine, environmental sciences, psychology, and economics. Research Synthesis Methods, 15(3), 500–511. https://doi.org/10.1002/jrsm.1703
  • Carter, E. C., Schönbrodt, F. D., Gervais, W. M., & Hilgard, J. (2019). Correcting for Bias in Psychology: A Comparison of Meta-Analytic Methods. Advances in Methods and Practices in Psychological Science, 2(2), 115–144. https://doi.org/10.1177/2515245919847196
  • Fanelli, D., Costas, R., & Ioannidis, J. P. A. (2017). Meta-assessment of bias in science. Proceedings of the National Academy of Sciences, 114(14), 3714–3719. https://doi.org/10.1073/pnas.1618569114
  • Fanelli, D. (2010). “Positive” Results Increase Down the Hierarchy of the Sciences. PLoS ONE, 5(4), e10068. https://doi.org/10.1371/journal.pone.0010068
  • Hong, S., & Reed, W. R. (2020). Using Monte Carlo experiments to select meta-analytic estimators. Research Synthesis Methods, 12(2), 192–215. https://doi.org/10.1002/jrsm.1467
  • Ioannidis, J. P. A., Stanley, T. D., & Doucouliagos, H. (2017). The Power of Bias in Economics Research. The Economic Journal, 127(605), F236–F265. https://doi.org/10.1111/ecoj.12461
  • Ivimey-Cook, E. R., Noble, D. W. A., Nakagawa, S., Lajeunesse, M. J., & Pick, J. L. (2023). Advice for improving the reproducibility of data extraction in meta-analysis. Research Synthesis Methods, 14(6), 911–915. https://doi.org/10.1002/jrsm.1663
  • Kvarven, A., Strømland, E., & Johannesson, M. (2019). Comparing meta-analyses and preregistered multiple-laboratory replication projects. Nature Human Behaviour, 4(4), 423–434. https://doi.org/10.1038/s41562-019-0787-z
  • Lau, J., Ioannidis, J. P. A., Terrin, N., Schmid, C. H., & Olkin, I. (2006). The case of the misleading funnel plot. BMJ, 333(7568), 597–600. https://doi.org/10.1136/bmj.333.7568.597
  • Schwab, S., Kreiliger, G., & Held, L. (2021). Assessing treatment effects and publication bias across different specialties in medicine: a meta-epidemiological study. BMJ Open, 11(9), e045942. https://doi.org/10.1136/bmjopen-2020-045942
  • Stanley, T. D., Carter, E. C., & Doucouliagos, H. (2018). What meta-analyses reveal about the replicability of psychological research. Psychological Bulletin, 144(12), 1325–1346. https://doi.org/10.1037/bul0000169
  • Topor, M., Pickering, J. S., Barbosa Mendes, A., Bishop, D. V. M., Büttner, F., Elsherif, M. M., Evans, T. R., Henderson, E. L., Kalandadze, T., Nitschke, F. T., Staaks, J. P. C., Van den Akker, O. R., Yeung, S. K., Zaneva, M., Lam, A., Madan, C. R., Moreau, D., O’Mahony, A., Parker, A. J., … Westwood, S. J. (2023). An integrative framework for planning and conducting Non-Intervention, Reproducible, and Open Systematic Reviews (NIRO-SR). Meta-Psychology, 7. https://doi.org/10.15626/MP.2021.2840
  • van Aert, R. C. M., Wicherts, J. M., & van Assen, M. A. L. M. (2019). Publication bias examined in meta-analyses from psychology and medicine: A meta-meta-analysis. PLOS ONE, 14(4), e0215052. https://doi.org/10.1371/journal.pone.0215052

Meta-research

Meta-research studies how research is done. It maps power, bias, reporting quality, and the uptake of open practices. It tests which interventions improve credibility and efficiency. The aim is practical guidance that helps fields do better work and waste less effort. Findings are shared openly so policies, training, and incentives can respond.
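
One classic meta-research calculation, from the Ioannidis (2005) paper listed below, shows how power and pre-study odds drive the credibility of significant findings. A minimal sketch, with illustrative numbers only:

```python
# Minimal sketch: positive predictive value (PPV) of a significant finding,
# in the spirit of Ioannidis (2005). R is the pre-study odds that a tested
# relationship is true; the values below are illustrative, not field estimates.
alpha = 0.05  # type I error rate
power = 0.80  # 1 - type II error rate

for R in (1.0, 0.5, 0.1, 0.01):
    # Expected true positives scale with power * R, false positives with alpha.
    ppv = (power * R) / (power * R + alpha)
    print(f"pre-study odds R = {R:>5}: PPV = {ppv:.2f}")
```

As R falls (exploratory fields, long-shot hypotheses), the PPV drops sharply, which is one argument these readings make for replication and better-powered designs.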

  • Bak-Coleman, J., & Devezer, B. (2024). Claims about scientific rigour require rigour. Nature Human Behaviour, 8(10), 1890–1891. https://doi.org/10.1038/s41562-024-01982-w
  • Bak-Coleman, J. B., Mann, R. P., Bergstrom, C. T., Gross, K., & West, J. (2022). Revisiting the replication crisis without false positives. https://doi.org/10.31235/osf.io/rkyf7
  • Bartscherer, S. F., & Reinhart, M. (2025). The (Non)Academic Community Forming around Replications: Mapping the International Open Science space via its Replication Initiatives. https://doi.org/10.31235/osf.io/rbyt6_v1
  • Claesen, A., Gomes, S., Tuerlinckx, F., & Vanpaemel, W. (2021). Comparing dream to reality: an assessment of adherence of the first generation of preregistered studies. Royal Society Open Science, 8(10). https://doi.org/10.1098/rsos.211037
  • Davis-Stober, C. P., Sarafoglou, A., Aczel, B., Chandramouli, S. H., Errington, T. M., Field, S. M., Fishbach, A., Freire, J., Ioannidis, J. P. A., Oberauer, K., Pestilli, F., Ressl, S., Schad, D. J., ter Schure, J., Tentori, K., van Ravenzwaaij, D., Vandekerckhove, J., & Gundersen, O. E. (2025). How can we make sound replication decisions? Proceedings of the National Academy of Sciences, 122(5). https://doi.org/10.1073/pnas.2401236121
  • Devezer, B., & Buzbas, E. O. (2025). Minimum viable experiment to replicate. PhilSci Archive. https://philsci-archive.pitt.edu/24738/
  • Dudda, L., Kormann, E., Kozula, M., DeVito, N. J., Klebel, T., Dewi, A. P. M., Spijker, R., Stegeman, I., Van den Eynden, V., Ross-Hellauer, T., & Leeflang, M. M. G. (2025). Open science interventions to improve reproducibility and replicability of research: a scoping review. Royal Society Open Science, 12(4). https://doi.org/10.1098/rsos.242057
  • Field, S. M., Volz, L., Kaznatcheev, A., & van Dongen, N. (2024). Can a Good Theory Be Built Using Bad Ingredients? Computational Brain & Behavior, 7(4), 608–615. https://doi.org/10.1007/s42113-024-00220-w
  • Horbach, S. P. J. M., & Halffman, W. (2016). Promoting Virtue or Punishing Fraud: Mapping Contrasts in the Language of ‘Scientific Integrity.’ Science and Engineering Ethics, 23(6), 1461–1485. https://doi.org/10.1007/s11948-016-9858-y
  • Horbach, S. P. J. M., & Halffman, W. (2017). The ghosts of HeLa: How cell line misidentification contaminates the scientific literature. PLOS ONE, 12(10), e0186281. https://doi.org/10.1371/journal.pone.0186281
  • Horbach, S. P. J. M., & Halffman, W. (2018). The changing forms and expectations of peer review. Research Integrity and Peer Review, 3(1). https://doi.org/10.1186/s41073-018-0051-5
  • Horbach, S. P. J. M., & Halffman, W. (2018). The ability of different peer review procedures to flag problematic publications. Scientometrics, 118(1), 339–373. https://doi.org/10.1007/s11192-018-2969-2
  • Horbach, S. P. J. M., & Halffman, W. (2019). The extent and causes of academic text recycling or ‘self-plagiarism.’ Research Policy, 48(2), 492–502. https://doi.org/10.1016/j.respol.2017.09.004
  • Horbach, S. P. J. M., & Halffman, W. (2019). Journal Peer Review and Editorial Evaluation: Cautious Innovator or Sleepy Giant? Minerva, 58(2), 139–161. https://doi.org/10.1007/s11024-019-09388-z
  • Horbach, S. P. J. M. (2021). No time for that now! Qualitative changes in manuscript peer review during the Covid-19 pandemic. Research Evaluation, 30(3), 231–239. https://doi.org/10.1093/reseval/rvaa037
  • Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine, 2(8), e124. https://doi.org/10.1371/journal.pmed.0020124
  • Lengersdorff, L. L., & Lamm, C. (2025). With Low Power Comes Low Credibility? Toward a Principled Critique of Results From Underpowered Tests. Advances in Methods and Practices in Psychological Science, 8(1). https://doi.org/10.1177/25152459241296397
  • Pownall, M., Pennington, C. R., Norris, E., Juanchich, M., Smailes, D., Russell, S., Gooch, D., Evans, T. R., Persson, S., Mak, M. H. C., Tzavella, L., Monk, R., Gough, T., Benwell, C. S. Y., Elsherif, M., Farran, E., Gallagher-Mitchell, T., Kendrick, L. T., Bahnmueller, J., … Clark, K. (2023). Evaluating the Pedagogical Effectiveness of Study Preregistration in the Undergraduate Dissertation. Advances in Methods and Practices in Psychological Science, 6(4). https://doi.org/10.1177/25152459231202724
  • Rubin, M. (2025). What is critical metascience and why is it important? https://doi.org/10.31234/osf.io/4tpk8_v2
  • Spitzer, L., & Mueller, S. (2023). Registered report: Survey on attitudes and experiences regarding preregistration in psychological research. PLOS ONE, 18(3), e0281086. https://doi.org/10.1371/journal.pone.0281086
  • Syed, M. (2023). Some Data Indicating that Editors and Reviewers Do Not Check Preregistrations during the Review Process. https://osf.io/nh7qw/
  • Ulpts, S., Bartscherer, S. F., Field, S. M., & Penders, B. (2025). The social replication of replication: Moving replication through epistemic communities. https://doi.org/10.31235/osf.io/pqc4v_v1
  • van den Akker, O. R., van Assen, M. A. L. M., Bakker, M., Elsherif, M., Wong, T. K., & Wicherts, J. M. (2023). Preregistration in practice: A comparison of preregistered and non-preregistered studies in psychology. Behavior Research Methods, 56(6), 5424–5433. https://doi.org/10.3758/s13428-023-02277-0

Purposes of replication attempts - what is a ‘failed’ replication?

This sub-cluster explains the diverse aims of replication and clarifies that “failure” is not a verdict on truth but evidence about effect size, robustness, and conditions. It encourages nuanced interpretation (e.g., meta-analytic and design-aware) over binary success/failure narratives and highlights responsible communication of discrepant findings.
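
One design-aware tool for interpreting a discrepant result is Simonsohn's “small telescopes” test: rather than asking whether the replication is significant, ask whether its estimate is significantly smaller than an effect the original design had only 33% power to detect. A minimal sketch, with all study numbers hypothetical:

```python
# Minimal sketch: a "small telescopes"-style check (Simonsohn, 2015).
# All study numbers are hypothetical.
import numpy as np
from scipy import stats
from statsmodels.stats.power import TTestIndPower

n_orig = 20    # per-group n of the original study
n_rep = 80     # per-group n of the replication
d_rep = 0.12   # replication effect estimate (Cohen's d)

# d33: the effect size the ORIGINAL design had only 33% power to detect.
d33 = TTestIndPower().solve_power(
    nobs1=n_orig, alpha=0.05, power=0.33, alternative="two-sided"
)

# One-sided test: is the replication estimate significantly below d33?
se_rep = np.sqrt(2 / n_rep + d_rep**2 / (4 * n_rep))  # approximate SE of d
p_smaller = stats.norm.cdf((d_rep - d33) / se_rep)

print(f"d33 = {d33:.2f}, replication d = {d_rep:.2f}")
print(f"one-sided p that the true effect is below d33 = {p_smaller:.3f}")
```

A small p here says the replication is inconsistent even with an effect the original study could barely have detected: evidence about size and conditions, not a binary verdict.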

Registered Replication Reports

Registered Reports are studies that are peer-reviewed prior to data collection, with an agreement between the journal and the author(s) that the study will be published regardless of outcome, provided the preregistered methods are reasonably followed. Registered Replication Reports are a special category of these devoted exclusively to replications.

  • Alogna, V. K., Attaya, M. K., Aucoin, P., Bahník, Š., Birch, S., Birt, A. R., Bornstein, B. H., Bouwmeester, S., Brandimonte, M. A., Brown, C., Buswell, K., Carlson, C., Carlson, M., Chu, S., Cislak, A., Colarusso, M., Colloff, M. F., Dellapaolera, K. S., Delvenne, J.-F., … Zwaan, R. A. (2014). Registered Replication Report: Schooler and Engstler-Schooler (1990). Perspectives on Psychological Science, 9(5), 556–578. https://doi.org/10.1177/1745691614545653
  • Association for Psychological Science. (n.d.). Ongoing Replication Projects. https://www.psychologicalscience.org/publications/replication/ongoing-projects
  • Chambers, C. D., & Tzavella, L. (2021). The past, present and future of Registered Reports. Nature Human Behaviour, 6(1), 29–42. https://doi.org/10.1038/s41562-021-01193-7
  • Eerland, A., Sherrill, A. M., Magliano, J. P., Zwaan, R. A., Arnal, J. D., Aucoin, P., Berger, S. A., Birt, A. R., Capezza, N., Carlucci, M., Crocker, C., Ferretti, T. R., Kibbe, M. R., Knepp, M. M., Kurby, C. A., Melcher, J. M., Michael, S. W., Poirier, C., & Prenoveau, J. M. (2016). Registered Replication Report: Hart & Albarracín (2011). Perspectives on Psychological Science, 11(1), 158–171. https://doi.org/10.1177/1745691615605826
  • Simons, D. J., Holcombe, A. O., & Spellman, B. A. (2014). An Introduction to Registered Replication Reports at Perspectives on Psychological Science. Perspectives on Psychological Science, 9(5), 552–555. https://doi.org/10.1177/1745691614543974

The politics of replicating famous studies

Responses to replication research can be negative, even hostile. Failed replications of famous findings, most notably power posing, ego depletion, stereotype threat, and facial feedback, have attracted considerable attention and controversy.

  • Doyen, S., Klein, O., Pichon, C.-L., & Cleeremans, A. (2012). Behavioral Priming: It’s All in the Mind, but Whose Mind? PLoS ONE, 7(1), e29081. https://doi.org/10.1371/journal.pone.0029081
  • Neuliep, J. W., & Crandall, R. (1990). Editorial bias against replication research. Journal of Social Behavior & Personality, 5(4), 85-90.
  • Neuliep, J. W., & Crandall, R. (1993). Reviewer bias against replication research. Journal of Social Behavior & Personality, 8(6), 21-29.