1 Replication Crisis and Credibility Revolution

7 sub-clusters · 146 references

This cluster builds foundational knowledge on the importance of reproducible and open research (i.e., the motivations and theoretical underpinnings of Open and Reproducible Science) and integrates it with field-specific content (i.e., grounding it in the history of replicability). Seven sub-clusters further organize the learning and teaching process:

History of the replication crisis & credibility revolution

To understand and weigh in on current developments, we first need to understand how the Open and Reproducible Science movement started, from its origins through the replicability/reproducibility crisis to the credibility revolution.

  • Archer, R. (2024). Retiring Popper: Critical realism, falsificationism, and the crisis of replication. Theory & Psychology, 34(5), 561–584. https://doi.org/10.1177/09593543241250079
  • Arel-Bundock, V., Briggs, R. C., Doucouliagos, H., Aviña, M. M., & Stanley, T. D. (2026). Quantitative Political Science Research Is Greatly Underpowered. The Journal of Politics, 88(1), 36–46. https://doi.org/10.1086/734279
  • Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533(7604), 452–454. https://doi.org/10.1038/533452a
  • Stoddart, C. (2016). Is there a reproducibility crisis in science? Nature. https://doi.org/10.1038/d41586-019-00067-3
  • Bird, A. (2021). Understanding the Replication Crisis as a Base Rate Fallacy. The British Journal for the Philosophy of Science, 72(4), 965–993. https://doi.org/10.1093/bjps/axy051
  • Chambers, C. (2017). The Seven Deadly Sins of Psychology. https://doi.org/10.1515/9781400884940
  • Cova, F., Strickland, B., Abatista, A., Allard, A., Andow, J., Attie, M., Beebe, J., Berniūnas, R., Boudesseul, J., Colombo, M., Cushman, F., Diaz, R., N’Djaye Nikolai van Dongen, N., Dranseika, V., Earp, B. D., Torres, A. G., Hannikainen, I., Hernández-Conde, J. V., Hu, W., … Zhou, X. (2018). Estimating the Reproducibility of Experimental Philosophy. Review of Philosophy and Psychology, 12(1), 9–44. https://doi.org/10.1007/s13164-018-0400-9
  • Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J. C., Orben, A., Parsons, S., & Schulte-Mecklenbeck, M. (2019). Seven Easy Steps to Open Science. Zeitschrift Für Psychologie, 227(4), 237–248. https://doi.org/10.1027/2151-2604/a000387
  • Edwards, M. A., & Roy, S. (2017). Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition. Environmental Engineering Science, 34(1), 51–61. https://doi.org/10.1089/ees.2016.0223
  • Feest, U. (2024). What is the Replication Crisis a Crisis Of? Philosophy of Science, 91(5), 1361–1371. https://doi.org/10.1017/psa.2024.2
  • Haven, T. L., & Ioannidis, J. P. A. (2025). Reproducibility Failure in Biomedical Research: Problems and Solutions. https://doi.org/10.31222/osf.io/k5su6_v1
  • Korbmacher, M., Azevedo, F., Pennington, C. R., Hartmann, H., Pownall, M., Schmidt, K., Elsherif, M., Breznau, N., Robertson, O., Kalandadze, T., Yu, S., Baker, B. J., O’Mahony, A., Olsnes, J. Ø.-S., Shaw, J. J., Gjoneska, B., Yamada, Y., Röer, J. P., Murphy, J., … Evans, T. (2023). The replication crisis has led to positive structural, procedural, and community changes. Communications Psychology, 1(1). https://doi.org/10.1038/s44271-023-00003-2
  • Lakens, D. (2023). Concerns about Replicability, Theorizing, Applicability, Generalizability, and Methodology across Two Crises in Social Psychology. https://doi.org/10.31234/osf.io/dtvs7
  • Leonelli, S. (2023). Philosophy of open science. Cambridge University Press. http://philsci-archive.pitt.edu/id/eprint/21986
  • Merton, R. K. (1968). The Matthew Effect in Science. Science, 159(3810), 56–63. https://doi.org/10.1126/science.159.3810.56
  • Merton, R. K. (1988). The Matthew Effect in Science, II: Cumulative Advantage and the Symbolism of Intellectual Property. Isis, 79(4), 606–623. https://doi.org/10.1086/354848
  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1). https://doi.org/10.1038/s41562-016-0021
  • Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s Renaissance. Annual Review of Psychology, 69(1), 511–534. https://doi.org/10.1146/annurev-psych-122216-011836
  • Nosek, B. A., Hardwicke, T. E., Moshontz, H., Allard, A., Corker, K. S., Dreber, A., Fidler, F., Hilgard, J., Kline Struhl, M., Nuijten, M. B., Rohrer, J. M., Romero, F., Scheel, A. M., Scherer, L. D., Schönbrodt, F. D., & Vazire, S. (2022). Replicability, Robustness, and Reproducibility in Psychological Science. Annual Review of Psychology, 73(1), 719–748. https://doi.org/10.1146/annurev-psych-020821-114157
  • Nuzzo, R. (2015). How scientists fool themselves – and how they can stop. Nature, 526(7572), 182–185. https://doi.org/10.1038/526182a
  • Devezer, B., & Buzbas, E. O. (2025). Minimum viable experiment to replicate. PhilSci Archive. https://philsci-archive.pitt.edu/24738/
  • Parsons, S., Azevedo, F., Elsherif, M. M., Guay, S., Shahim, O. N., Govaart, G. H., Norris, E., O’Mahony, A., Parker, A. J., Todorovic, A., Pennington, C. R., Garcia-Pelegrin, E., Lazić, A., Robertson, O., Middleton, S. L., Valentini, B., McCuaig, J., Baker, B. J., Collins, E., … Aczel, B. (2022). A community-sourced glossary of open scholarship terms. Nature Human Behaviour, 6(3), 312–318. https://doi.org/10.1038/s41562-021-01269-4
  • Penders, B. (2024). Renovating the Theatre of Persuasion. ManyLabs as Collaborative Prototypes for the Production of Credible Knowledge. https://doi.org/10.31222/osf.io/vhmk2
  • Peterson, D., & Panofsky, A. (2023). Metascience as a Scientific Social Movement. Minerva, 61(2), 147–174. https://doi.org/10.1007/s11024-023-09490-3
  • Phaf, R. H. (2024). Positive Deviance Underlies Successful Science: Normative Methodologies Risk Throwing out the Baby With the Bathwater. Review of General Psychology, 28(3), 219–236. https://doi.org/10.1177/10892680241235120
  • Vazire, S. (2018). Implications of the Credibility Revolution for Productivity, Creativity, and Progress. Perspectives on Psychological Science, 13(4), 411–417. https://doi.org/10.1177/1745691617751884

Scientific Misconduct: Fabrication and Falsification

To understand how the reproducibility crisis started, we first need to understand scientific misconduct, especially data fabrication and falsification. These practices erode trust in science and distort the research record. Fabrication involves inventing data, participants, or outcomes; falsification involves altering materials, methods, measurements, images, or reporting so that findings are misrepresented. Because intent to mislead is central, these acts are distinct from questionable research practices and from honest mistakes. Recognizing the role of misconduct is therefore essential for understanding how unreliable or non-replicable studies entered the literature and contributed to the broader crisis.

  • Fanelli, D. (2009). How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data. PLoS ONE, 4(5), e5738. https://doi.org/10.1371/journal.pone.0005738
  • Fang, F. C., Steen, R. G., & Casadevall, A. (2012). Misconduct accounts for the majority of retracted scientific publications. Proceedings of the National Academy of Sciences, 109(42), 17028–17033. https://doi.org/10.1073/pnas.1212247109
  • Haven, T., & van Woudenberg, R. (2021). Explanations of Research Misconduct, and How They Hang Together. Journal for General Philosophy of Science, 52(4), 543–561. https://doi.org/10.1007/s10838-021-09555-5
  • Jamieson, K. H., McNutt, M., Kiermer, V., & Sever, R. (2019). Signaling the trustworthiness of science. Proceedings of the National Academy of Sciences, 116(39), 19231–19236. https://doi.org/10.1073/pnas.1913039116
  • Jamieson, K. H., McNutt, M., Kiermer, V., & Sever, R. (2019). Reply to Kornfeld and Titus: No distraction from misconduct. Proceedings of the National Academy of Sciences, 117(1), 42. https://doi.org/10.1073/pnas.1918001116
  • Kombe, F., Anunobi, E. N., Tshifugula, N. P., Wassenaar, D., Njadingwe, D., Mwalukore, S., Chinyama, J., Randrianasolo, B., Akindeh, P., Dlamini, P. S., Ramiandrisoa, F. N., & Ranaivo, N. (2013). Promoting Research Integrity in Africa: An African Voice of Concern on Research Misconduct and the Way Forward. Developing World Bioethics, 14(3), 158–166. https://doi.org/10.1111/dewb.12024
  • Kornfeld, D. S., & Titus, S. L. (2016). Stop ignoring misconduct. Nature, 537(7618), 29–30. https://doi.org/10.1038/537029a
  • Kornfeld, D. S., & Titus, S. L. (2019). Signaling the trustworthiness of science should not be a substitute for direct action against research misconduct. Proceedings of the National Academy of Sciences, 117(1), 41. https://doi.org/10.1073/pnas.1917490116
  • Stroebe, W., Postmes, T., & Spears, R. (2012). Scientific Misconduct and the Myth of Self-Correction in Science. Perspectives on Psychological Science, 7(6), 670–688. https://doi.org/10.1177/1745691612460687
  • Tijdink, J. K., Verbeke, R., & Smulders, Y. M. (2014). Publication Pressure and Scientific Misconduct in Medical Scientists. Journal of Empirical Research on Human Research Ethics, 9(5), 64–71. https://doi.org/10.1177/1556264614552421

Questionable research practices & their prevalence

Questionable research practices (QRPs) are behaviors and decisions that increase the probability of obtaining a researcher's desired result. They can occur consciously or unconsciously, which distinguishes them from deliberate scientific misconduct, but they still compromise research integrity because they can lead to misleading conclusions. Examples include p-hacking, selective reporting, and HARKing (Hypothesizing After the Results are Known).

Collection of large-scale replications

This is a collection of large-scale replication projects that estimate the rate of reproducibility of entire (sub)disciplines, offering a big-picture view of replication efforts and the current state of replicability across fields.

  • Ankel-Peters, J., Fiala, N., & Neubauer, F. (2023). Do economists replicate? Journal of Economic Behavior & Organization, 212, 219–232. https://doi.org/10.1016/j.jebo.2023.05.009
  • Arel-Bundock, V., Briggs, R. C., Doucouliagos, H., Aviña, M. M., & Stanley, T. D. (2026). Quantitative Political Science Research Is Greatly Underpowered. The Journal of Politics, 88(1), 36–46. https://doi.org/10.1086/734279
  • Baumeister, R. F., Tice, D. M., & Bushman, B. J. (2022). A Review of Multisite Replication Projects in Social Psychology: Is It Viable to Sustain Any Confidence in Social Psychology’s Knowledge Base? Perspectives on Psychological Science, 18(4), 912–935. https://doi.org/10.1177/17456916221121815
  • Camerer, C. F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Almenberg, J., Altmejd, A., Chan, T., Heikensten, E., Holzmeister, F., Imai, T., Isaksson, S., Nave, G., Pfeiffer, T., Razen, M., & Wu, H. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351(6280), 1433–1436. https://doi.org/10.1126/science.aaf0918
  • Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B. A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637–644. https://doi.org/10.1038/s41562-018-0399-z
  • Chang, A. C., & Li, P. (2015). Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say “Usually Not.” Finance and Economics Discussion Series, 2015.0(83), 1–26. https://doi.org/10.17016/feds.2015.083
  • Clarke, B., Lee, P. Y., Schiavone, S. R., Rhemtulla, M., & Vazire, S. (2023). The Prevalence of Direct Replication Articles in Top-Ranking Psychology Journals. https://doi.org/10.31234/osf.io/sa6rc
  • Ebersole, C. R., Atherton, O. E., Belanger, A. L., Skulborstad, H. M., Allen, J. M., Banks, J. B., Baranski, E., Bernstein, M. J., Bonfiglio, D. B. V., Boucher, L., Brown, E. R., Budiman, N. I., Cairo, A. H., Capaldi, C. A., Chartier, C. R., Chung, J. M., Cicero, D. C., Coleman, J. A., Conway, J. G., … Nosek, B. A. (2016). Many Labs 3: Evaluating participant pool quality across the academic semester via replication. Journal of Experimental Social Psychology, 67, 68–82. https://doi.org/10.1016/j.jesp.2015.10.012
  • Ebersole, C. R., Mathur, M. B., Baranski, E., Bart-Plange, D.-J., Buttrick, N. R., Chartier, C. R., Corker, K. S., Corley, M., Hartshorne, J. K., IJzerman, H., Lazarević, L. B., Rabagliati, H., Ropovik, I., Aczel, B., Aeschbach, L. F., Andrighetto, L., Arnal, J. D., Arrow, H., Babincak, P., … Nosek, B. A. (2020). Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability. Advances in Methods and Practices in Psychological Science, 3(3), 309–331. https://doi.org/10.1177/2515245920958687
  • Errington, T. M., Mathur, M., Soderberg, C. K., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A. (2021). Investigating the replicability of preclinical cancer biology. eLife, 10. https://doi.org/10.7554/eLife.71601
  • Finger, R., Grebitus, C., & Henningsen, A. (2023). Replications in agricultural economics. Applied Economic Perspectives and Policy, 45(3), 1258–1274. https://doi.org/10.1002/aepp.13386
  • Christopherson, C. D., Hildebrandt, L., Adeyemi Adetula, Wiggins, B. J., McLaughlin, H., Hurst, M. A., IJzerman, H., Levitan, C., Legate, N., Pazda, A., Kaylis Hase, VanBenschoten, A., Fallon, M., LePine, S., Gervais, H., Lazarevic, L., Chartier, C. R., Corker, K. S., France, H., … Wagge, J. (2013). Collaborative Replications and Education Project (CREP). OSF. https://doi.org/10.17605/OSF.IO/WFC6U
  • Hardwicke, T. E., Thibault, R. T., Kosie, J. E., Wallach, J. D., Kidwell, M. C., & Ioannidis, J. P. A. (2021). Estimating the Prevalence of Transparency and Reproducibility-Related Research Practices in Psychology (2014–2017). Perspectives on Psychological Science, 17(1), 239–251. https://doi.org/10.1177/1745691620979806
  • Kelly, C. D. (2006). Replicating Empirical Research In Behavioral Ecology: How And Why It Should Be Done But Rarely Ever Is. The Quarterly Review of Biology, 81(3), 221–236. https://doi.org/10.1086/506236
  • Kelly, C. D. (2019). Rate and success of study replication in ecology and evolution. PeerJ, 7, e7654. https://doi.org/10.7717/peerj.7654
  • Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., Bocian, K., Brandt, M. J., Brooks, B., Brumbaugh, C. C., Cemalcilar, Z., Chandler, J., Cheong, W., Davis, W. E., Devos, T., Eisner, M., Frankowska, N., Furrow, D., Galliani, E. M., … Nosek, B. A. (2014). Investigating Variation in Replicability. Social Psychology, 45(3), 142–152. https://doi.org/10.1027/1864-9335/a000178
  • Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., Aveyard, M., Axt, J. R., Babalola, M. T., Bahník, Š., Batra, R., Berkics, M., Bernstein, M. J., Berry, D. R., Bialobrzeska, O., Binan, E. D., Bocian, K., Brandt, M. J., Busching, R., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490. https://doi.org/10.1177/2515245918810225
  • Kobrock, K., & Roettger, T. B. (2023). Assessing the replication landscape in experimental linguistics. Glossa Psycholinguistics, 2(1). https://doi.org/10.5070/g6011135
  • Makel, M. C., & Plucker, J. A. (2014). Facts Are More Important Than Novelty. Educational Researcher, 43(6), 304–316. https://doi.org/10.3102/0013189X14545513
  • Makel, M. C., Plucker, J. A., & Hegarty, B. (2012). Replications in Psychology Research. Perspectives on Psychological Science, 7(6), 537–542. https://doi.org/10.1177/1745691612460688
  • Makel, M. C., Plucker, J. A., Freeman, J., Lombardi, A., Simonsen, B., & Coyne, M. (2016). Replication of Special Education Research. Remedial and Special Education, 37(4), 205–212. https://doi.org/10.1177/0741932516646083
  • ManyPrimates. (n.d.). ManyPrimates. https://manyprimates.github.io/
  • Marsden, E., Morgan‐Short, K., Thompson, S., & Abugaber, D. (2018). Replication in Second Language Research: Narrative and Systematic Reviews and Recommendations for the Field. Language Learning, 68(2), 321–391. https://doi.org/10.1111/lang.12286
  • McNeeley, S., & Warner, J. J. (2015). Replication in criminology: A necessary practice. European Journal of Criminology, 12(5), 581–597. https://doi.org/10.1177/1477370815578197
  • Mueller-Langer, F., Fecher, B., Harhoff, D., & Wagner, G. G. (2019). Replication studies in economics—How many and which papers are chosen for replication, and why? Research Policy, 48(1), 62–83. https://doi.org/10.1016/j.respol.2018.07.019
  • Open Science Collaboration. (2012). An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science. Perspectives on Psychological Science, 7(6), 657–660. https://doi.org/10.1177/1745691612462588
  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251). https://doi.org/10.1126/science.aac4716
  • Protzko, J., Krosnick, J., Nelson, L., Nosek, B. A., Axt, J., Berent, M., Buttrick, N., DeBell, M., Ebersole, C. R., Lundmark, S., MacInnis, B., O’Donnell, M., Perfecto, H., Pustejovsky, J. E., Roeder, S. S., Walleczek, J., & Schooler, J. W. (2023). RETRACTED ARTICLE: High replicability of newly discovered social-behavioural findings is achievable. Nature Human Behaviour, 8(2), 311–319. https://doi.org/10.1038/s41562-023-01749-9
  • Van Bavel, J. J., Mende-Siedlecki, P., Brady, W. J., & Reinero, D. A. (2016). Contextual sensitivity in scientific reproducibility. Proceedings of the National Academy of Sciences, 113(23), 6454–6459. https://doi.org/10.1073/pnas.1521897113

Proposed science improvement initiatives on statistics, measurement, teaching, data sharing, code sharing, pre-registration, & replication

Published checklists and other resources that can be used to shift researcher behavior toward improved practices.

  • Abadie, A. (2020). Statistical Nonsignificance in Empirical Economics. American Economic Review: Insights, 2(2), 193–208. https://doi.org/10.1257/aeri.20190252
  • Azevedo, F., Liu, M., Pennington, C. R., Pownall, M., Evans, T. R., Parsons, S., Elsherif, M. M., Micheli, L., & Westwood, S. J. (2022). Towards a culture of open scholarship: the role of pedagogical communities. BMC Research Notes, 15(1). https://doi.org/10.1186/s13104-022-05944-1
  • Azevedo, F., Parsons, S., Micheli, L., Strand, J. F., Rinke, E. M., Guay, S., Elsherif, M. M., Quinn, K. A., Wagge, J. R., Steltenpohl, C. N., Kalandadze, T., Vasilev, M. R., Oliveira, C. M., Aczel, B., Miranda, J. F., Baker, B. J., Galang, C. M., Pennington, C. R., Marques, T., … FORRT. (2019). Introducing a Framework for Open and Reproducible Research Training (FORRT). https://doi.org/10.31219/osf.io/bnh7p
  • Bryan, C. J., Tipton, E., & Yeager, D. S. (2021). Behavioural science is unlikely to change the world without a heterogeneity revolution. Nature Human Behaviour, 5(8), 980–989. https://doi.org/10.1038/s41562-021-01143-3
  • Button, K. S., & Munafò, M. R. (2017). Powering Reproducible Research. Psychological Science Under Scrutiny, 22–33. https://doi.org/10.1002/9781119095910.ch2
  • Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J. C., Orben, A., Parsons, S., & Schulte-Mecklenbeck, M. (2019). Seven Easy Steps to Open Science. Zeitschrift Für Psychologie, 227(4), 237–248. https://doi.org/10.1027/2151-2604/a000387
  • Devezer, B., Navarro, D. J., Vandekerckhove, J., & Ozge Buzbas, E. (2021). The case for formal methodology in scientific reform. Royal Society Open Science, 8(3). https://doi.org/10.1098/rsos.200805
  • Dudda, L., Kormann, E., Kozula, M., DeVito, N. J., Klebel, T., Dewi, A. P. M., Spijker, R., Stegeman, I., Van den Eynden, V., Ross-Hellauer, T., & Leeflang, M. M. G. (2025). Open science interventions to improve reproducibility and replicability of research: a scoping review. Royal Society Open Science, 12(4). https://doi.org/10.1098/rsos.242057
  • Gervais, W. M. (2021). Practical Methodological Reform Needs Good Theory. Perspectives on Psychological Science, 16(4), 827–843. https://doi.org/10.1177/1745691620977471
  • Harmon-Jones, E., Harmon-Jones, C., Amodio, D. M., Gable, P. A., & Schmeichel, B. J. (2025). Valid replications require valid methods: Recommendations for best methodological practices with lab experiments. Motivation Science, 11(3), 235–245. https://doi.org/10.1037/mot0000398
  • Ioannidis, J. P. A., Munafò, M. R., Fusar-Poli, P., Nosek, B. A., & David, S. P. (2014). Publication and other reporting biases in cognitive sciences: detection, prevalence, and prevention. Trends in Cognitive Sciences, 18(5), 235–241. https://doi.org/10.1016/j.tics.2014.02.010
  • Kathawalla, U.-K., Silverstein, P., & Syed, M. (2021). Easing Into Open Science: A Guide for Graduate Students and Their Advisors. Collabra: Psychology, 7(1). https://doi.org/10.1525/collabra.18684
  • Krähmer, D., Schächtele, L., & Auspurg, K. (2026). Code sharing and reproducibility in survey-based social research: evidence from a large-scale audit. Royal Society Open Science, 13(3). https://doi.org/10.1098/rsos.251997
  • Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., Aveyard, M., Axt, J. R., Babalola, M. T., Bahník, Š., Batra, R., Berkics, M., Bernstein, M. J., Berry, D. R., Bialobrzeska, O., Binan, E. D., Bocian, K., Brandt, M. J., Busching, R., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490. https://doi.org/10.1177/2515245918810225
  • Lindsay, D. S. (2020). Seven steps toward transparency and replicability in psychological science. Canadian Psychology / Psychologie Canadienne, 61(4), 310–317. https://doi.org/10.1037/cap0000222
  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1). https://doi.org/10.1038/s41562-016-0021
  • Checklists work to improve science. (2018). Nature, 556(7701), 273–274. https://doi.org/10.1038/d41586-018-04590-7
  • Peng, R. (2015). The Reproducibility Crisis in Science: A Statistical Counterattack. Significance, 12(3), 30–32. https://doi.org/10.1111/j.1740-9713.2015.00827.x
  • Pownall, M., Azevedo, F., Aldoh, A., Elsherif, M., Vasilev, M., Pennington, C. R., Robertson, O., Tromp, M. V., Liu, M., Makel, M. C., Tonge, N., Moreau, D., Horry, R., Shaw, J., Tzavella, L., McGarrigle, R., Talbot, C., & Parsons, S. (2024). Embedding open and reproducible science into teaching: A bank of lesson plans and resources. Scholarship of Teaching and Learning in Psychology, 10(3), 342–349. https://doi.org/10.1037/stl0000307
  • Pownall, M., Azevedo, F., König, L. M., Slack, H. R., Evans, T. R., Flack, Z., Grinschgl, S., Elsherif, M. M., Gilligan-Lee, K. A., de Oliveira, C. M. F., Gjoneska, B., Kalandadze, T., Button, K., Ashcroft-Jones, S., Terry, J., Albayrak-Aydemir, N., Děchtěrenko, F., Alzahawi, S., … Baker, B. J. (2023). Teaching open and reproducible scholarship: a critical review of the evidence base for current pedagogical methods and their outcomes. Royal Society Open Science, 10(5). https://doi.org/10.1098/rsos.221255
  • Proulx, T., & Morey, R. D. (2021). Beyond Statistical Ritual: Theory in Psychological Science. Perspectives on Psychological Science, 16(4), 671–681. https://doi.org/10.1177/17456916211017098
  • Silverstein, P., Elman, C., Montoya, A. K., McGillivray, B., Pennington, C. R., Harrison, C. H., Steltenpohl, C. N., Röer, J. P., Corker, K. S., Charron, L. M., Elsherif, M. M., na, ana, Hayes-Harb, R., Grinschgl, S., Neal, T. M. S., Evans, T. R., Karhulahti, V.-M., Krenzer, W. L. D., Belaus, A., … Syed, M. (2023). A Guide for Social Science Journal Editors on Easing into Open Science. https://doi.org/10.31219/osf.io/5dar8

Ethical considerations for improved practices

Engaging in Open and Reproducible Science practices comes with ethical challenges that must be navigated sensitively (e.g., when sharing data openly).

  • Brabeck, M. M. (2021). Open Science and Feminist Ethics: Promises and Challenges of Open Access. Psychology of Women Quarterly, 45(4), 457–474. https://doi.org/10.1177/03616843211030926
  • Bol, T., de Vaan, M., & van de Rijt, A. (2018). The Matthew effect in science funding. Proceedings of the National Academy of Sciences, 115(19), 4887–4890. https://doi.org/10.1073/pnas.1719557115
  • Chakravorty, N., Sharma, C. S., Molla, K. A., & Pattanaik, J. K. (2022). Open Science: Challenges, Possible Solutions and the Way Forward. Proceedings of the Indian National Science Academy, 88(3), 456–471. https://doi.org/10.1007/s43538-022-00104-2
  • Chopik, W. J., Bremner, R. H., Defever, A. M., & Keller, V. N. (2018). How (and Whether) to Teach Undergraduates About the Replication Crisis in Psychological Science. Teaching of Psychology, 45(2), 158–163. https://doi.org/10.1177/0098628318762900
  • Edwards, M. A., & Roy, S. (2017). Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition. Environmental Engineering Science, 34(1), 51–61. https://doi.org/10.1089/ees.2016.0223
  • Fell, M. J. (2019). The Economic Impacts of Open Science: A Rapid Evidence Assessment. Publications, 7(3), 46. https://doi.org/10.3390/publications7030046
  • Jacobs, A. M., Büthe, T., Arjona, A., Arriola, L. R., Bellin, E., Bennett, A., Björkman, L., Bleich, E., Elkins, Z., Fairfield, T., Gaikwad, N., Greitens, S. C., Hawkesworth, M., Herrera, V., Herrera, Y. M., Johnson, K. S., Karakoç, E., Koivu, K., Kreuzer, M., … Yashar, D. J. (2021). The Qualitative Transparency Deliberations: Insights and Implications. Perspectives on Politics, 19(1), 171–208. https://doi.org/10.1017/S1537592720001164
  • Gefenas, E. (2006). The concept of risk and responsible conduct of research. Science and Engineering Ethics, 12(1), 75–83. https://doi.org/10.1007/s11948-006-0007-x
  • Khalil, A. T., Shinwari, Z. K., & Islam, A. (2022). Fostering openness in open science: An ethical discussion of risks and benefits. Frontiers in Political Science, 4. https://doi.org/10.3389/fpos.2022.930574
  • Lamb, D., Russell, A., Morant, N., & Stevenson, F. (2024). The challenges of open data sharing for qualitative researchers. Journal of Health Psychology, 29(7), 659–664. https://doi.org/10.1177/13591053241237620
  • Lupia, A. (2020). Practical and Ethical Reasons for Pursuing a More Open Science. PS: Political Science & Politics, 54(2), 301–304. https://doi.org/10.1017/S1049096520000979
  • Pratt, M. G., Kaplan, S., & Whittington, R. (2019). Editorial Essay: The Tumult over Transparency: Decoupling Transparency from Replication in Establishing Trustworthy Qualitative Research. Administrative Science Quarterly, 65(1), 1–19. https://doi.org/10.1177/0001839219887663
  • Prosser, A. M., Bagnall, R., & Higson-Sweeney, N. (2024). Reflection over compliance: Critiquing mandatory data sharing policies for qualitative research. Journal of Health Psychology, 29(7), 653–658. https://doi.org/10.1177/13591053231225903

Ongoing debates (e.g., incentives for and against open science practices)

Open Science is not a monolith, and continued scrutiny of its proposed practices and reforms is valuable: it helps us understand why resistance exists (and how to address anti-open arguments) and pushes us to evaluate the potential positive and negative impacts of reforms.

  • Bahlai, C., Bartlett, L., Burgio, K., Fournier, A., Keiser, C., Poisot, T., & Whitney, K. (2019). Open Science Isn’t Always Open to All Scientists. American Scientist, 107(2), 78. https://doi.org/10.1511/2019.107.2.78
  • Bak-Coleman, J. B., Mann, R. P., Bergstrom, C. T., Gross, K., & West, J. (2022). Revisiting the replication crisis without false positives. https://doi.org/10.31235/osf.io/rkyf7
  • Breznau, N. (2021). Does Sociology Need Open Science? Societies, 11(1), 9. https://doi.org/10.3390/soc11010009
  • Burgos, J. E. (2025). Getting ontologically serious about the replication crisis in psychology. Journal of Theoretical and Philosophical Psychology, 45(2), 79–100. https://doi.org/10.1037/teo0000281
  • Chen, X., Dallmeier-Tiessen, S., Dasler, R., Feger, S., Fokianos, P., Gonzalez, J. B., Hirvonsalo, H., Kousidis, D., Lavasa, A., Mele, S., Rodriguez, D. R., Šimko, T., Smith, T., Trisovic, A., Trzcinska, A., Tsanaktsidis, I., Zimmermann, M., Cranmer, K., Heinrich, L., … Neubert, S. (2018). Open is not enough. Nature Physics, 15(2), 113–119. https://doi.org/10.1038/s41567-018-0342-2
  • Dienlin, T., Johannes, N., Bowman, N. D., Masur, P. K., Engesser, S., Kümpel, A. S., Lukito, J., Bier, L. M., Zhang, R., Johnson, B. K., Huskey, R., Schneider, F. M., Breuer, J., Parry, D. A., Vermeulen, I., Fisher, J. T., Banks, J., Weber, R., Ellis, D. A., … de Vreese, C. (2020). An Agenda for Open Science in Communication. Journal of Communication, 71(1), 1–26. https://doi.org/10.1093/joc/jqz052
  • Drummond, C. (2017). Reproducible research: a minority opinion. Journal of Experimental & Theoretical Artificial Intelligence, 30(1), 1–11. https://doi.org/10.1080/0952813X.2017.1413140
  • Fanelli, D., & Ioannidis, J. P. A. (2013). US studies may overestimate effect sizes in softer research. Proceedings of the National Academy of Sciences, 110(37), 15031–15036. https://doi.org/10.1073/pnas.1302997110
  • Fanelli, D. (2018). Is science really facing a reproducibility crisis, and do we need it to? Proceedings of the National Academy of Sciences, 115(11), 2628–2631. https://doi.org/10.1073/pnas.1708272114
  • Fell, M. J. (2019). The Economic Impacts of Open Science: A Rapid Evidence Assessment. Publications, 7(3), 46. https://doi.org/10.3390/publications7030046
  • Fernández Pinto, M. (2020). Open Science for private Interests? How the Logic of Open Science Contributes to the Commercialization of Research. Frontiers in Research Metrics and Analytics, 5. https://doi.org/10.3389/frma.2020.588331
  • Iso-Ahola, S. E. (2025). Science of psychological phenomena and their testing. American Psychologist, 80(7), 1057–1072. https://doi.org/10.1037/amp0001362
  • Klonsky, E. D. (2024). Campbell’s Law Explains the Replication Crisis: Pre-Registration Badges Are History Repeating. Assessment, 32(2), 224–234. https://doi.org/10.1177/10731911241253430
  • Klonsky, E. D. (2024). How to Produce, Identify, and Motivate Robust Psychological Science: A Roadmap and a Response to Vize et al. Assessment, 32(2), 244–252. https://doi.org/10.1177/10731911241299723
  • Crespo López, M. de los Á., Pallise Perello, C., de Ridder, J., & Labib, K. (2025). Open Science as Confused: Contradictory and Conflicting Discourses in Open Science Guidance to Researchers. https://doi.org/10.31222/osf.io/zr35u_v1
  • Lewandowsky, S., & Oberauer, K. (2020). Low replicability can support robust and efficient science. Nature Communications, 11(1). https://doi.org/10.1038/s41467-019-14203-0
  • Makel, M. C., Meyer, M. S., Simonsen, M. A., Roberts, A. M., & Plucker, J. A. (2022). Replication is relevant to qualitative research. Educational Research and Evaluation, 27(1–2), 215–219. https://doi.org/10.1080/13803611.2021.2022310
  • Mayrhofer, R., Büchner, I. C., & Hevesi, J. (2024). The quantitative paradigm and the nature of the human mind. The replication crisis as an epistemological crisis of quantitative psychology in view of the ontic nature of the psyche. Frontiers in Psychology, 15. https://doi.org/10.3389/fpsyg.2024.1390233
  • Maziarz, M. (2024). Conflicting Results and Statistical Malleability: Embracing Pluralism of Empirical Results. Perspectives on Science, 32(6), 701–728. https://doi.org/10.1162/posc_a_00627
  • McDermott, R. (2022). Breaking free. Politics and the Life Sciences, 41(1), 55–59. https://doi.org/10.1017/pls.2022.4
  • Reality check on reproducibility. (2016). Nature, 533(7604), 437. https://doi.org/10.1038/533437a
  • Pashler, H., & Harris, C. R. (2012). Is the Replicability Crisis Overblown? Three Arguments Examined. Perspectives on Psychological Science, 7(6), 531–536. https://doi.org/10.1177/1745691612463401
  • Penders, B. (2024). Scandal in scientific reform: the breaking and remaking of science. Journal of Responsible Innovation, 11(1). https://doi.org/10.1080/23299460.2024.2371172
  • Pethick, S., Wass, M. N., & Michaelis, M. (2025). Is There a Reproducibility Crisis? On the Need for Evidence-based Approaches. International Studies in the Philosophy of Science, 38(4), 287–303. https://doi.org/10.1080/02698595.2025.2538937
  • Pham, M. T., & Oh, T. T. (2021). Preregistration Is Neither Sufficient nor Necessary for Good Science. Journal of Consumer Psychology, 31(1), 163–176. https://doi.org/10.1002/jcpy.1209
  • Pratt, M. G., Kaplan, S., & Whittington, R. (2019). Editorial Essay: The Tumult over Transparency: Decoupling Transparency from Replication in Establishing Trustworthy Qualitative Research. Administrative Science Quarterly, 65(1), 1–19. https://doi.org/10.1177/0001839219887663
  • Ràfols, I. (2025). Rethinking open science: Towards care for equity and inclusion. Global Dialogue. https://globaldialogue.isa-sociology.org/articles/rethinking-open-science-towards-care-for-equity-and-inclusion
  • Rubin, M. (2025). A brief review of research that questions the impact of questionable research practices. https://doi.org/10.31234/osf.io/ah9wb_v3
  • Rubin, M. (2025). Preregistration does not improve the transparent evaluation of severity in Popper’s philosophy of science or when deviations are allowed. Synthese, 206(3). https://doi.org/10.1007/s11229-025-05191-4
  • Rubin, M. (2025). The replication crisis is less of a “crisis” in Lakatos’ philosophy of science than it is in Popper’s. European Journal for Philosophy of Science, 15(1). https://doi.org/10.1007/s13194-024-00629-x
  • Sanbonmatsu, D. M., Neufeld, B., & Posavac, S. S. (2025). There is no theory crisis in psychological science. Journal of Theoretical and Philosophical Psychology. https://doi.org/10.1037/teo0000301
  • Souza-Neto, V., & Moyle, B. (2025). Preregistration is not a panacea, but why? A rejoinder to “infusing preregistration into tourism research.” Tourism Management, 107, 105061. https://doi.org/10.1016/j.tourman.2024.105061
  • Ulpts, S., Bartscherer, S. F., Penders, B., & Nelson, N. (2025). Epistemic oligarchies: capture and concentration through science reform. Zenodo. https://doi.org/10.5281/zenodo.17136864