References
Note — Preliminary Version 0.1
This is a preliminary version. Feedback welcome: lukas.roeseler@uni-muenster.de or GitHub.
Abramian, D., & Eklund, A. (2019). Refacing: Reconstructing
anonymized facial features using GANs. 2019 IEEE 16th International
Symposium on Biomedical Imaging (ISBI 2019), 1104–1108. https://doi.org/10.1109/ISBI.2019.8759515
Adler, S. J., Röseler, L., & Schöniger, M. K. (2023). A toolbox to
evaluate the trustworthiness of published findings. Journal of
Business Research, 167, 114189. https://doi.org/10.1016/j.jbusres.2023.114189
Aguinis, H., & Solarino, A. M. (2019). Transparency and
replicability in qualitative research: The case of interviews with elite
informants. Strategic Management Journal, 40(8),
1291–1315. https://doi.org/10.1002/smj.3015
Ankel-Peters, J., Brodeur, A., Dreber, A., Johannesson, M., Neubauer,
F., & Rose, J. (2025). A protocol for structured robustness
reproductions and replicability assessments. Q Open, qoaf004.
https://doi.org/10.1093/qopen/qoaf004
Ankel-Peters, J., Fiala, N., & Neubauer, F. (2023). Do economists
replicate? Journal of Economic Behavior & Organization,
212, 219–232. https://doi.org/10.1016/j.jebo.2023.05.009
American Psychological Association. (n.d.). Journal article reporting
standards (JARS): Quantitative replications reporting table. https://apastyle.apa.org/jars/quant-table-6.pdf
Bakker, M., Dijk, A. van, & Wicherts, J. M. (2012). The rules of the
game called psychological science. Perspectives on Psychological
Science, 7(6), 543–554. https://doi.org/10.1177/1745691612459060
Bargh, J. A. (2006). What have we been priming all these years? On the
development, mechanisms, and ecology of nonconscious social behavior.
European Journal of Social Psychology, 36(2), 147–168.
https://doi.org/10.1002/ejsp.336
Bartoš, F., & Schimmack, U. (2022). Z-curve 2.0: Estimating
replication rates and discovery rates. Meta-Psychology,
6. https://doi.org/10.15626/MP.2021.2720
Baumeister, R. F., Tice, D. M., & Bushman, B. J. (2022). A review of
multisite replication projects in social psychology: Is it viable to
sustain any confidence in social psychology’s knowledge base?
Perspectives on Psychological Science, 18(4), 912–935.
https://doi.org/10.1177/17456916221121815
Baumeister, R. F., & Vohs, K. D. (2016). Misguided effort with
elusive implications. Perspectives on Psychological Science,
11(4), 574–575. https://doi.org/10.1177/1745691616652878
Bekkers, R. (2024). Replication value: A comment and
alternative. https://doi.org/10.31234/osf.io/uj5g7
Bennett, E. A. (2021). Open science from a qualitative, feminist
perspective: Epistemological dogmas and a call for critical examination.
Psychology of Women Quarterly, 45(4), 448–456. https://doi.org/10.1177/03616843211036460
Berinsky, A. J., Druckman, J. N., & Yamamoto, T. (2021). Publication
biases in replication studies. Political Analysis,
29(3), 370–384. https://doi.org/10.1017/pan.2020.34
Berkeley Initiative for Transparency in the Social Sciences. (2020).
Guide for advancing computational reproducibility in the social
sciences. https://bitss.github.io/ACRE/
Beyer, F., Flannery, J., Gau, R., Janssen, L., Schaare, L., Hartmann,
H., Nilsonne, G., Martin, S., Khalil, A., Lipp, I., Puhlmann, L.,
Heinrichs, H., Mohamed, A., Herholz, P., Sicorello, M., &
Panagoulas, E. (2021). A fMRI pre-registration template.
PsychArchives. https://doi.org/10.23668/PSYCHARCHIVES.5121
Block, J., & Kuckertz, A. (2018). Seven principles of effective
replication studies: Strengthening the evidence base of management
research. Management Review Quarterly, 68(4), 355–359.
https://doi.org/10.1007/s11301-018-0149-3
Borgstede, M., & Scholz, M. (2021). Quantitative and qualitative
approaches to generalization and replication–a representationalist view.
Frontiers in Psychology, 12, 605191. https://doi.org/10.3389/fpsyg.2021.605191
Bosco, F. A., Uggerslev, K. L., & Steel, P. (2017). MetaBUS as a
vehicle for facilitating meta-analysis. Human Resource Management
Review, 27(1), 237–254. https://doi.org/10.1016/j.hrmr.2016.09.013
Botvinik-Nezer, R., Holzmeister, F., Camerer, C. F., Dreber, A., Huber,
J., Johannesson, M., & Rieck, J. R. (2020). Variability in the
analysis of a single neuroimaging dataset by many teams.
Nature, 582(7810), 84–88.
Boyce, V., Prystawski, B., Abutto, A. B., Chen, E. M., Chen, Z., Chiu,
H., & Frank, M. C. (2024). Estimating the replicability of
psychology experiments after an initial failure to replicate. https://doi.org/10.31234/osf.io/an3yb
Brandt, M. J., IJzerman, H., Dijksterhuis, A., Farach, F. J., Geller,
J., Giner-Sorolla, R., & Van’t Veer, A. (2014). The replication
recipe: What makes for a convincing replication? Journal of
Experimental Social Psychology, 50, 217–224. https://doi.org/10.1016/j.jesp.2013.10.005
Brodeur, A., Cook, N. M., Hartley, J. S., & Heyes, A. (2024). Do
preregistration and preanalysis plans reduce p-hacking and publication
bias? Evidence from 15,992 test statistics and suggestions for
improvement. Journal of Political Economy Microeconomics,
2(3), 527–561. https://doi.org/10.1086/730455
Brodeur, A., Dreber, A., Hoces de la Guardia, F., & Miguel, E.
(2024). Reproduction and replication at scale. Nature Human
Behaviour, 8(1), 2–3. https://doi.org/10.1038/s41562-023-01807-2
Bryan, C. J., Yeager, D. S., & O’Brien, J. M. (2019). Replicator
degrees of freedom allow publication of misleading failures to
replicate. Proceedings of the National Academy of Sciences,
116(51), 25535–25545. https://doi.org/10.1073/pnas.1910951116
Buttliere, B. (2024). Was this registered report pilot tested?
Examination of Vaidis, Sleegers, van Leeuwen, DeMarree, ... &
Priolo, D. (2024). https://doi.org/10.31234/osf.io/c6r8x
Button, K., Ioannidis, J., Mokrysz, C., et al. (2013). Power failure:
Why small sample size undermines the reliability of neuroscience.
Nature Reviews Neuroscience, 14, 365–376. https://doi.org/10.1038/nrn3475
Calder, B. J., Phillips, L. W., & Tybout, A. M. (1981). Designing
research for application. Journal of Consumer Research,
8(2), 197–207. https://doi.org/10.1086/208856
Carter, E. C., Schönbrodt, F. D., Gervais, W. M., & Hilgard, J.
(2019). Correcting for bias in psychology: A comparison of meta-analytic
methods. Advances in Methods and Practices in Psychological
Science, 2(2), 115–144. https://doi.org/10.1177/2515245919847196
Chartrand, T. L., & Bargh, J. A. (1999). The chameleon effect: The
perception–behavior link and social interaction. Journal of
Personality and Social Psychology, 76(6), 893. https://doi.org/10.1037/0022-3514.76.6.893
Clark, C. J., Costello, T., Mitchell, G., & Tetlock, P. E. (2022).
Keep your enemies close: Adversarial collaborations will improve
behavioral science. Journal of Applied Research in Memory and
Cognition, 11(1), 1. https://doi.org/10.1037/mac0000004
Clarke, B., Lee, P. Y. (K.), Schiavone, S. R., Rhemtulla, M., &
Vazire, S. (2024). The prevalence of direct replication articles in
top-ranking psychology journals. American Psychologist. https://doi.org/10.1037/amp0001385
Cole, N. L., Ulpts, S., Bochynska, A., Kormann, E., Good, M., Leitner,
B., & Ross-Hellauer, T. (2024). Reproducibility and
replicability of qualitative research: An integrative review of
concepts, barriers and enablers. https://doi.org/10.31222/osf.io/n5zkw_v1
Coles, N. A., March, D. S., Marmolejo-Ramos, F., et al. (2022). A
multi-lab test of the facial feedback hypothesis by the Many Smiles
Collaboration. Nature Human Behaviour, 6, 1731–1742.
https://doi.org/10.1038/s41562-022-01458-9
Corcoran, A. W., Hohwy, J., & Friston, K. J. (2023). Accelerating
scientific progress through Bayesian adversarial collaboration.
Neuron, 111(22), 3505–3516. https://doi.org/10.1016/j.neuron.2023.08.027
Cortina, J. M., Köhler, T., & Aulisi, L. C. (2023). Current
reproducibility practices in management: What they are versus what they
could be. Journal of Management Scientific Reports,
1(3-4), 171–205. https://doi.org/10.1177/27550311231202696
Cowan, N., Belletier, C., Doherty, J. M., Jaroslawska, A. J., Rhodes,
S., Forsberg, A., Naveh-Benjamin, M., Barrouillet, P., Camos, V., &
Logie, R. H. (2020). How do scientific views change? Notes from an
extended adversarial collaboration. Perspectives on Psychological
Science, 15(4), 1011–1025. https://doi.org/10.1177/1745691620906415
DeBruine, L., & Lakens, D. (2025). Papercheck: Check scientific
papers for best practices. R package version 0.0.0.9053. https://github.com/scienceverse/papercheck
Dreber, A., & Johannesson, M. (2024). A framework for evaluating
reproducibility and replicability in economics. Economic
Inquiry. https://doi.org/10.1111/ecin.13244
Dunlap, K. (1926). The experimental methods of psychology. In C.
Murchison (Ed.), Psychologies of 1925 (pp. 331–351). Clark
University Press. https://doi.org/10.1037/11020-022
Ebersole, C. R., Mathur, M. B., Baranski, E., Bart-Plange, D. J.,
Buttrick, N. R., Chartier, C. R., & Szecsi, P. (2020). Many labs 5:
Testing pre-data-collection peer review as an intervention to increase
replicability. Advances in Methods and Practices in Psychological
Science, 3(3), 309–331. https://doi.org/10.1177/2515245920958687
Errington, T. M., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A.
(2021). Challenges for assessing replicability in preclinical cancer
biology. eLife, 10, e67995. https://doi.org/10.7554/eLife.67995
Errington, T. M., Mathur, M., Soderberg, C. K., Denis, A., Perfito, N.,
Iorns, E., & Nosek, B. A. (2021). Investigating the replicability of
preclinical cancer biology. eLife, 10, e71601. https://doi.org/10.7554/eLife.71601
Esteban, O., Markiewicz, C. J., Blair, R. W., Moodie, C. A., Isik, A.
I., Erramuzpe, A., & Gorgolewski, K. J. (2019). fMRIPrep: A robust
preprocessing pipeline for functional MRI. Nature Methods,
16(1), 111–116.
Fabrigar, L. R., Wegener, D. T., & Petty, R. E. (2020). A
validity-based framework for understanding replication in psychology.
Personality and Social Psychology Review, 24(4),
316–344. https://doi.org/10.1177/1088868320931366
Feldman, G. (2024). Registered report stage 1 manuscript
template. https://doi.org/10.17605/OSF.IO/YQXTP
Feldman, G. (2025). The value of replications goes beyond replicability
and is associated with the value of the research it replicates:
Commentary on Isager et al., 2021. Meta-Psychology, 9.
https://doi.org/10.15626/MP.2024.4326
Fiedler, K. (2011). Voodoo correlations are everywhere—not only in
neuroscience. Perspectives on Psychological Science,
6(2), 163–171. https://doi.org/10.1177/1745691611400237
Fiedler, K., McCaughey, L., & Prager, J. (2021). Quo vadis,
methodology? The key role of manipulation checks for validity control
and quality of science. Perspectives on Psychological Science,
16(4), 816–826. https://doi.org/10.1177/1745691620970602
Field, S. M., Hoekstra, R., Bringmann, L., & Ravenzwaaij, D. van.
(2019). When and why to replicate: As easy as 1, 2, 3? Collabra:
Psychology, 5(1), 46. https://doi.org/10.1525/collabra.218
Field, S. M., Volz, L., Kaznatcheev, A., & Dongen, N. van. (2024).
Can a good theory be built using bad ingredients? Computational
Brain & Behavior, 7, 608–615. https://doi.org/10.1007/s42113-024-00220-w
Flake, J. K., & Fried, E. I. (2020). Measurement schmeasurement:
Questionable measurement practices and how to avoid them. Advances
in Methods and Practices in Psychological Science, 3(4),
456–465. https://doi.org/10.1177/2515245920952393
Francis, G. (2012). Too good to be true: Publication bias in two
prominent studies from experimental psychology. Psychonomic Bulletin
& Review, 19, 151–156. https://doi.org/10.3758/s13423-012-0227-9
Freese, J., & Peterson, D. (2017). Replication in social science.
Annual Review of Sociology, 43(1), 147–165. https://doi.org/10.1146/annurev-soc-060116-053450
Friese, M., Loschelder, D. D., Gieseler, K., Frankenbach, J., &
Inzlicht, M. (2019). Is ego depletion real? An analysis of arguments.
Personality and Social Psychology Review, 23(2),
107–131. https://doi.org/10.1177/1088868318762183
Gignac, G. E., & Zajenkowski, M. (2020). The Dunning-Kruger effect
is (mostly) a statistical artefact: Valid approaches to testing the
hypothesis with individual differences data. Intelligence,
80, 101449. https://doi.org/10.1016/j.intell.2020.101449
Goltermann, J., & Altegoer, L. (2025). ReFiNe-MDD: Replicability
of findings in neuroimaging in depression. https://doi.org/10.17605/OSF.IO/N86Q9
Grant, S., Corker, K. S., Mellor, D. T., Stewart, S. L. K., Cashin, A.
G., Lagisz, M., & Nosek, B. A. (2024). TOP 2025: An update to
the transparency and openness promotion guidelines. https://doi.org/10.31222/osf.io/nmfs6
Hagger, M. S., Chatzisarantis, N. L. D., Alberts, H., Anggono, C. O.,
Batailler, C., Birt, A. R., Brand, R., Brandt, M. J., Brewer, G.,
Bruyneel, S., Calvillo, D. P., Campbell, W. K., Cannon, P. R., Carlucci,
M., Carruth, N. P., Cheung, T., Crowell, A., De Ridder, D. T. D.,
Dewitte, S., & Zwienenberg, M. (2016). A multilab preregistered
replication of the ego-depletion effect. Perspectives on
Psychological Science, 11(4), 546–573. https://doi.org/10.1177/1745691616652873
Han, H., Glenn, A. L., & Dawson, K. J. (2019). Evaluating
alternative correction methods for multiple comparison in functional
neuroimaging research. Brain Sciences, 9(8), 198. https://doi.org/10.3390/brainsci9080198
Hardwicke, T. E., & Wagenmakers, E.-J. (2023). Reducing bias,
increasing transparency and calibrating confidence with preregistration.
Nature Human Behaviour, 7(1), 15–26. https://doi.org/10.1038/s41562-022-01497-2
Hawkins, R. X., Smith, E. N., Au, C., Arias, J. M., Catapano, R.,
Hermann, E., & Frank, M. C. (2018). Improving the replicability of
psychological science through pedagogy. Advances in Methods and
Practices in Psychological Science, 1(1), 7–18. https://doi.org/10.1177/2515245917740427
Heathers, J. (2025). An introduction to forensic metascience.
https://doi.org/10.5281/zenodo.14871843
Heirene, R., LaPlante, D., Louderback, E., Keen, B., Bakker, M.,
Serafimovska, A., & Gainsbury, S. (2024). Preregistration
specificity and adherence: A review of preregistered gambling studies
and cross-disciplinary comparison. Meta-Psychology, 8.
https://doi.org/10.15626/MP.2021.2909
Held, L., Pawel, S., & Micheloud, C. (2024). The assessment of
replicability using the sum of p-values. Royal Society Open
Science, 11(8), 240149. https://doi.org/10.1098/rsos.240149
Henriques, S. O., Rzayeva, N., Pinfield, S., & Waltman, L. (2023).
Preprint review services: Disrupting the scholarly communication
landscape? https://doi.org/10.31235/osf.io/8c6xm
Heroux, M. A., Barba, L. A., Parashar, M., Stodden, V., & Taufer, M.
(2018). Toward a compatible reproducibility taxonomy for
computational and computing sciences. https://doi.org/10.2172/1481626
Heyard, R., Pawel, S., Frese, J., Voelkl, B., Würbel, H., McCann, S.,
& Zellers, S. (2025). A scoping review on metrics to quantify
reproducibility: A multitude of questions leads to a multitude of
metrics. Royal Society Open Science, 12(7), 242076. https://doi.org/10.1098/rsos.242076
Höffler, J. H. (2017). ReplicationWiki: Improving transparency in social
sciences research. D-Lib Magazine, 23(3), 1. https://doi.org/10.1045/march2017-hoeffler
Huang, F. L., & Huang, A. B. (2024). Replication studies using
secondary or nonexperimental datasets. School Psychology
Review, 1–15. https://doi.org/10.1080/2372966X.2024.2346781
Hüffmeier, J., Mazei, J., & Schultze, T. (2016). Reconceptualizing
replication as a sequence of different studies: A replication typology.
Journal of Experimental Social Psychology, 66, 81–92.
https://doi.org/10.1016/j.jesp.2015.09.009
Hummel, T., & Manner, J. (2024). A literature review on
reproducibility studies in computer science. Proceedings of the 16th
ZEUS Workshop on Services and Their Composition (ZEUS 2024), CEUR
Workshop Proceedings, 3673.
Ioannidis, J. P. (2005). Why most published research findings are false.
PLoS Medicine, 2(8), e124.
Isager, P. M., Aert, R. C. M. van, Bahník, Š., Brandt, M. J., DeSoto, K.
A., Giner-Sorolla, R., Krueger, J. I., Perugini, M., Ropovik, I., Veer,
A. E. van ’t, Vranka, M., & Lakens, D. (2023). Deciding what to
replicate: A decision model for replication study selection under
resource and knowledge constraints. Psychological Methods,
28(2), 438–451. https://doi.org/10.1037/met0000438
Isager, P. M., Veer, A. E. van’t, Lakens, D., et al. (2021). Replication
value as a function of citation impact and sample size.
MetaArXiv. https://doi.org/10.31222/osf.io/knjea
Jacowitz, K. E., & Kahneman, D. (1995). Measures of anchoring in
estimation tasks. Personality and Social Psychology Bulletin,
21(11), 1161–1166. https://doi.org/10.1177/01461672952111004
Janz, N., & Freese, J. (2021). Replicate others as you would like to
be replicated yourself. PS: Political Science & Politics,
54(2), 305–308. https://doi.org/10.1017/S1049096520000943
Jekel, M., Fiedler, S., Allstadt Torras, R., Mischkowski, D., Dorrough,
A. R., & Glöckner, A. (2020). How to teach open science principles
in the undergraduate curriculum—The Hagen Cumulative Science Project.
Psychology Learning & Teaching, 19(1), 91–106. https://doi.org/10.1177/1475725719868149
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the
prevalence of questionable research practices with incentives for truth
telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
Jwa, A. S., Koyejo, O., & Poldrack, R. A. (2024). Demystifying the
likelihood of reidentification in neuroimaging data: A technical and
regulatory analysis. Imaging Neuroscience, 2. https://doi.org/10.1162/imag_a_00111
Kamermans, K. L., Dudda, L., Daikoku, T., & Verheyen, S. (2025).
The is-ought problem in deciding what to replicate: Which motives
guide current replication practices? https://doi.org/10.31234/osf.io/6xdy2_v2
Kapitány, R., & Kavanagh, C. M. (2023). Best practices and
ethical considerations for crowd-sourced data in the behavioral
sciences. https://doi.org/10.31219/osf.io/sn5gh
Karhulahti, V., Martončik, M., & Adamkovic, M. (2024).
Pre-replication in meaningful science. https://doi.org/10.31234/osf.io/5gn7m
King, G. (1995). Replication, replication. PS: Political Science
& Politics, 28(3), 444–452. https://doi.org/10.2307/420301
Klein, R. A., Ratliff, K. A., Vianello, M., Adams Jr, R. B., Bahník, Š.,
Bernstein, M. J., & Nosek, B. A. (2014). Investigating variation in
replicability. Social Psychology. https://doi.org/10.1027/1864-9335/a000178
Köhler, T., & Cortina, J. M. (2021). Play it again, Sam! An analysis
of constructive replication in the organizational sciences. Journal
of Management, 47(2), 488–518. https://doi.org/10.1177/0149206319843985
Koole, S. L., & Lakens, D. (2012). Rewarding replications: A sure
and simple way to improve psychological science. Perspectives in
Psychological Science, 7, 608–614. https://doi.org/10.1177/1745691612462586
Kranz, S. (2025). Extensive database of economics studies with
available data. https://ejd.econ.mathematik.uni-ulm.de/
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How
difficulties in recognizing one’s own incompetence lead to inflated
self-assessments. Journal of Personality and Social Psychology,
77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121
Lakens, D. (2022). Sample size justification. Collabra:
Psychology, 8(1), 33267. https://doi.org/10.1525/collabra.33267
Lakens, D. (2024). When and how to deviate from a preregistration.
Collabra: Psychology, 10(1). https://doi.org/10.1525/collabra.117094
Lakens, D., & Etz, A. J. (2017). Too true to be bad: When sets of
studies with significant and nonsignificant findings are probably true.
Social Psychological and Personality Science, 8(8),
875–881. https://doi.org/10.1177/1948550617693058
Lakens, D., Scheel, A. M., & Isager, P. M. (2018). Equivalence
testing for psychological research: A tutorial. Advances in Methods
and Practices in Psychological Science, 1(2), 259–269. https://doi.org/10.1177/2515245918770963
Landy, J. F., Jia, M. L., Ding, I. L., Viganola, D., Tierney, W.,
Dreber, A., & Crowdsourcing Hypothesis Tests Collaboration. (2020). Crowdsourcing
hypothesis tests: Making transparent how design choices shape research
results. Psychological Bulletin, 146(5), 451. https://doi.org/10.1037/bul0000220
Lash, T. L., Collin, L. J., & Van Dyke, M. E. (2018). The
replication crisis in epidemiology: Snowball, snow job, or winter
solstice? Current Epidemiology Reports, 5, 175–183.
LeBel, E. P., McCarthy, R. J., Earp, B. D., Elson, M., & Vanpaemel,
W. (2018). A unified framework to quantify the credibility of scientific
findings. Advances in Methods and Practices in Psychological
Science, 1(3), 389–402. https://doi.org/10.1177/2515245918787489
Leehr, E. J., Seeger, F. R., Böhnlein, J., Gathmann, B., Straube, T.,
Roesmann, K., & Lueken, U. (2024). Association between resting-state
connectivity patterns in the defensive system network and treatment
response in spider phobia—a replication approach. Translational
Psychiatry, 14(1), 137.
Leibniz Institute for the Social Sciences. (2023). TOP factor: Open
data levels of social science journals. https://topfactor.org/journals?factor=Data+Transparency
Lynch, C. J., Elbau, I. G., Ng, T., et al. (2024). Frontostriatal
salience network expansion in individuals in depression.
Nature, 633, 624–633. https://doi.org/10.1038/s41586-024-07805-2
Mac Giolla, E., Karlsson, S., Neequaye, D. A., & Bergquist, M.
(2024). Evaluating the replicability of social priming studies.
Meta-Psychology, 8. https://doi.org/10.15626/MP.2022.3308
Mahoney, M. J. (1977). Publication prejudices: An experimental study of
confirmatory bias in the peer review system. Cognitive Therapy and
Research, 1, 161–175. https://doi.org/10.1007/BF01173636
Makel, M. C., Plucker, J. A., & Hegarty, B. (2012). Replications in
psychology research: How often do they really occur? Perspectives on
Psychological Science, 7(6), 537–542. https://doi.org/10.1177/1745691612460688
Marek, S., Tervo-Clemmens, B., Calabro, F. J., Montez, D. F., Kay, B.
P., Hatoum, A. S., & Dosenbach, N. U. (2022). Reproducible
brain-wide association studies require thousands of individuals.
Nature, 603(7902), 654–660.
Mazei, J., Hüffmeier, J., & Schultze, T. (2025). Specification curve
and reproducibility dashboards for social science research:
Recommendations for implementation. Advances in Methods and
Practices in Psychological Science.
McCarthy, R., Gervais, W., Aczel, B., Al-Kire, R. L., Aveyard, M.,
Marcella Baraldo, S., & Zogmaister, C. (2021). A multi-site
collaborative study of the hostile priming effect. Collabra:
Psychology, 7(1), 18738. https://doi.org/10.1525/collabra.18738
McManus, K. (2024). Replication studies in second language acquisition
research: Definitions, issues, resources, and future directions:
Introduction to the special issue. Studies in Second Language
Acquisition, 46(5), 1299–1319. https://doi.org/10.1017/S0272263124000652
McShane, B. B., & Böckenholt, U. (2017). Single-paper meta-analysis:
Benefits for study summary, theory testing, and replicability.
Journal of Consumer Research, 43(6), 1048–1063. https://doi.org/10.1093/jcr/ucw085
Micheloud, C., & Held, L. (2022). Power calculations for replication
studies. Statistical Science, 37(3), 369–379. https://doi.org/10.1214/21-STS828
Miłkowski, M., Hensel, W. M., & Hohol, M. (2018). Replicability or
reproducibility? On the replication crisis in computational neuroscience
and sharing only relevant detail. Journal of Computational
Neuroscience, 45(3), 163–172. https://doi.org/10.1007/s10827-018-0702-z
Moreau, D., & Wiebels, K. (2023). Ten simple rules for designing and
conducting undergraduate replication projects. PLOS Computational
Biology, 19(3), e1010957. https://doi.org/10.1371/journal.pcbi.1010957
Munafò, M. R., Chambers, C. D., Collins, A. M., Fortunato, L., &
Macleod, M. R. (2020). Research culture and reproducibility. Trends
in Cognitive Sciences, 24(2), 91–93. https://doi.org/10.1016/j.tics.2019.12.002
Muradchanian, J., Hoekstra, R., Kiers, H., & Ravenzwaaij, D. van.
(2021). How best to quantify replication success? A simulation study on
the comparison of replication success metrics. Royal Society Open
Science, 8(5), 201697. https://doi.org/10.1098/rsos.201697
Mussweiler, T., Strack, F., & Pfeiffer, T. (2000). Overcoming the
inevitable anchoring effect: Considering the opposite compensates for
selective accessibility. Personality and Social Psychology
Bulletin, 26(9), 1142–1150. https://doi.org/10.1177/01461672002611010
Nagy, T., Hergert, J., Elsherif, M. M., Wallrich, L., Schmidt, K.,
Waltzer, T., Payne, J. W., Gjoneska, B., Seetahul, Y., Wang, Y. A.,
Scharfenberg, D., Tyson, G., Yang, Y.-F., Skvortsova, A., Alarie, S.,
Graves, K., Sotola, L. K., Moreau, D., & Rubínová, E. (2025).
Bestiary of questionable research practices in psychology. Advances
in Methods and Practices in Psychological Science, 8(3).
https://doi.org/10.1177/25152459251348431
Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s
renaissance. Annual Review of Psychology, 69(1),
511–534. https://doi.org/10.1146/annurev-psych-122216-011836
Nosek, B. A., & Errington, T. M. (2020). What is replication?
PLoS Biology, 18(3), e3000691. https://doi.org/10.1371/journal.pbio.3000691
Nuijten, M. B., & Polanin, J. R. (2020). “Statcheck”:
Automatically detect statistical reporting inconsistencies to increase
reproducibility of meta‐analyses. Research Synthesis Methods,
11(5), 574–579. https://doi.org/10.1002/jrsm.1408
Nüst, D., & Eglen, S. J. (2021). CODECHECK: An open science
initiative for the independent execution of computations underlying
research articles during peer review to improve reproducibility.
F1000Research, 10, 253. https://doi.org/10.12688/f1000research.51738.2
Open Science Collaboration. (2015). Estimating the reproducibility of
psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
Patil, P., Peng, R. D., & Leek, J. T. (2016a). A statistical
definition for reproducibility and replicability. BioRxiv,
066803. https://doi.org/10.1101/066803
Patil, P., Peng, R. D., & Leek, J. T. (2016b). What should
researchers expect when they replicate studies? A statistical view of
replicability in psychological science. Perspectives on
Psychological Science, 11(4), 539–544. https://doi.org/10.1177/1745691616646366
Pawel, S., Consonni, G., & Held, L. (2023). Bayesian approaches to
designing replication studies. Psychological Methods. https://doi.org/10.1037/met0000604
Pennington, C. R. (2023). A student’s guide to open science: Using
the replication crisis to reform psychology. Open University Press.
Perry, T., Morris, R., & Lea, R. (2022). A decade of replication
study in education? A mapping review (2011–2020). Educational
Research and Evaluation, 27(1-2), 12–34. https://doi.org/10.1080/13803611.2021.2022315
Pittelkow, M. M., Field, S. M., Isager, P. M., Veer, A. E. van 't,
Anderson, T., Cole, S. N., & Ravenzwaaij, D. van. (2023). The process of
replication target selection in psychology: What to consider? Royal
Society Open Science, 10(2), 210586. https://doi.org/10.1098/rsos.210586
Pittelkow, M. M., Field, S. M., & Ravenzwaaij, D. van. (2025).
Thinking beyond RVCN: Addressing the complexity of replication
target selection. https://doi.org/10.31234/osf.io/6tmyx_v2
Pittelkow, M. M., Hoekstra, R., Karsten, J., & Ravenzwaaij, D. van.
(2021). Replication target selection in clinical psychology: A Bayesian
and qualitative reevaluation. Clinical Psychology: Science and
Practice, 28(2), 210. https://doi.org/10.1037/cps0000013
Powers, K. L., Brooks, P. J., Aldrich, N. J., Palladino, M. A., &
Alfieri, L. (2013). Effects of video-game play on information
processing: A meta-analytic investigation. Psychonomic Bulletin
& Review, 20(6), 1055–1079. https://doi.org/10.3758/s13423-013-0418-z
Pownall, M. (2022). Is replication possible for qualitative
research? https://doi.org/10.31234/osf.io/dwxeg
Protzko, J. (2018). Null-hacking, a lurking problem. https://doi.org/10.31234/osf.io/9y3mp
Puhlmann, L., Koppold, A., Feld, G., Lonsdorf, T., Hilger, K., Vogel,
S., & Hartmann, H. (2025). There is no research on a dead
planet–fostering ecologically sustainable open science practices in
neuroscience. OSF Preprint. https://doi.org/10.31219/osf.io/rju75_v1
Renton, A. I., Dao, T. T., Johnstone, T., Civier, O., Sullivan, R. P.,
White, D. J., Lyons, P., Slade, B. M., Abbott, D. F., Amos, T. J.,
Bollmann, S., Botting, A., Campbell, M. E. J., Chang, J., Close, T. G.,
Dörig, M., Eckstein, K., Egan, G. F., Evas, S., & Bollmann, S.
(2024). Neurodesk: An accessible, flexible and portable data analysis
environment for reproducible neuroimaging. Nature Methods,
21(5), 804–808. https://doi.org/10.1038/s41592-023-02145-x
Röseler, L., Hein, M., & Oppong Boakye, P. (2025). Standardized
reproduction and replication templates (StaRT). https://doi.org/10.17605/OSF.IO/BRXTD
Röseler, L., Kaiser, L., Doetsch, C., Klett, N., Seida, C., Schütz, A.,
& Zhang, Y. (2024). The replication database: Documenting the
replicability of psychological science. Journal of Open Psychology
Data, 12(1), 8. https://doi.org/10.5334/jopd.101
Röseler, L., Schütz, A., Blank, P. A., Dück, M., Fels, S., Kupfer, J.,
Scheelje, L., & Seida, C. (2021). Evidence against subliminal
anchoring: Two close, highly powered, preregistered, and failed
replication attempts. Journal of Experimental Social
Psychology, 92, 104066. https://doi.org/10.1016/j.jesp.2020.104066
Röseler, L., & Wallrich, L. (2024). FReD: Interfaces to the
FORRT replication database. http://forrt.org/FReD/
Rosenberg, M. D., & Finn, E. S. (2022). How to establish robust
brain–behavior relationships without thousands of individuals.
Nature Neuroscience, 25, 835–837. https://doi.org/10.1038/s41593-022-01110-9
Schauer, J. M., & Hedges, L. V. (2021). Reconsidering statistical
methods for assessing replication. Psychological Methods,
26(1), 127–139. https://doi.org/10.1037/met0000302
Schimmack, U. (2012). The ironic effect of significant results on the
credibility of multiple-study articles. Psychological Methods,
17(4), 551. https://doi.org/10.1037/a0029487
Schmidt, S. (2009). Shall we really do it again? The powerful concept of
replication is neglected in the social sciences. Review of General
Psychology, 13(2), 90–100. https://doi.org/10.1037/a0015108
Schöch, C. (2023). Repetitive research: A conceptual space and
terminology of replication, reproduction, revision, reanalysis,
reinvestigation and reuse in digital humanities. International
Journal of Digital Humanities, 5(2), 373–403. https://doi.org/10.1007/s42803-023-00073-y
Schultze, T., Gerlach, T. M., & Rittich, J. C. (2018). Some people
heed advice less than others: Agency (but not communion) predicts advice
taking. Journal of Behavioral Decision Making, 31(3),
430–445. https://doi.org/10.1002/bdm.2065
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011).
False-positive psychology: Undisclosed flexibility in data collection
and analysis allows presenting anything as significant.
Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632
Simons, D. J., Shoda, Y., & Lindsay, D. S. (2017). Constraints on
generality (COG): A proposed addition to all empirical papers.
Perspectives on Psychological Science, 12(6),
1123–1128. https://doi.org/10.1177/1745691617708630
Simonsohn, U. (2015). Small telescopes: Detectability and the evaluation
of replication results. Psychological Science, 26(5),
559–569. https://doi.org/10.1177/0956797614567341
Simonsohn, U., Simmons, J. P., & Nelson, L. D. (2020). Specification
curve analysis. Nature Human Behaviour, 4, 1208–1214.
https://doi.org/10.1038/s41562-020-0912-z
Soderberg, C. K., Errington, T. M., Schiavone, S. R., et al. (2021).
Initial evidence of research quality of registered reports compared with
the standard publishing model. Nature Human Behaviour,
5, 990–997. https://doi.org/10.1038/s41562-021-01142-4
Soto, C. J. (2019). How replicable are links between personality traits
and consequential life outcomes? The Life Outcomes of Personality
Replication project. Psychological Science, 30(5),
711–727. https://doi.org/10.1177/0956797619831612
Spisak, T., Bingel, U., & Wager, T. D. (2023). Multivariate BWAS can
be replicable with moderate sample sizes. Nature, 615,
E4–E7. https://doi.org/10.1038/s41586-023-05745-x
Srivastava, S. (2012). A Pottery Barn rule for scientific
journals. The Hardest Science blog. https://thehardestscience.com/2012/09/27/a-pottery-barn-rule-for-scientific-journals
Syed, M. (2023). Replication or generalizability? How flexible
inferences uphold unfounded universal claims. https://doi.org/10.31234/osf.io/znv5r
Taylor, P. A., Reynolds, R. C., Calhoun, V., Gonzalez-Castillo, J.,
Handwerker, D. A., Bandettini, P. A., & Chen, G. (2023). Highlight
results, don’t hide them: Enhance interpretation, reduce biases and
improve reproducibility. NeuroImage, 274, 120138.
The Turing Way Community. (2025). The Turing Way: A handbook for
reproducible, ethical and collaborative research (1.2.3 ed.).
Zenodo. https://doi.org/10.5281/zenodo.15213042
Tsang, E. W., & Kwan, K. M. (1999). Replication and theory
development in organizational science: A critical realist perspective.
Academy of Management Review, 24(4), 759–780. https://doi.org/10.2307/259353
Urminsky, O., & Dietvorst, B. J. (2024). Taking the full measure:
Integrating replication into research practice to assess
generalizability. Journal of Consumer Research, 51(1),
157–168. https://doi.org/10.1093/jcr/ucae007
Van Bavel, J. J., Mende-Siedlecki, P., Brady, W. J., & Reinero, D.
A. (2016). Contextual sensitivity in scientific reproducibility.
Proceedings of the National Academy of Sciences,
113(23), 6454–6459. https://doi.org/10.1073/pnas.1521897113
Vazire, S., Schiavone, S. R., & Bottesini, J. G. (2022). Credibility
beyond replicability: Improving the four validities in psychological
science. Current Directions in Psychological Science,
31(2), 162–168. https://doi.org/10.1177/09637214211067779
Voelkl, B., Heyard, R., Fanelli, D., Wever, K. E., Held, L., Maniadis,
Z., & Würbel, H. (2025). Defining reproducibility. https://doi.org/10.17605/OSF.IO/BR9SP
Vohs, K. D., Schmeichel, B. J., Lohmann, S., Gronau, Q. F., Finley, A.
J., Ainsworth, S. E., Alquist, J. L., Baker, M. D., Brizi, A., Bunyi,
A., Butschek, G. J., Campbell, C., Capaldi, J., Cau, C., Chambers, H.,
Chatzisarantis, N. L. D., Christensen, W. J., Clay, S. L., Curtis, J.,
& Albarracín, D. (2021). A multisite preregistered paradigmatic test
of the ego-depletion effect. Psychological Science,
32(10), 1566–1581. https://doi.org/10.1177/0956797621989733
Wagenmakers, E.-J., Beek, T., Dijkhoff, L., Gronau, Q. F., Acosta, A.,
Adams, R. B., Albohn, D. N., Allard, E. S., Benning, S. D.,
Blouin-Hudon, E.-M., Bulnes, L. C., Caldwell, T. L., Calin-Jageman, R.
J., Capaldi, C. A., Carfagno, N. S., Chasten, K. T., Cleeremans, A.,
Connell, L., DeCicco, J. M., & Zwaan, R. A. (2016). Registered
replication report: Strack, Martin, & Stepper (1988).
Perspectives on Psychological Science, 11(6), 917–928.
https://doi.org/10.1177/1745691616674458
Wagenmakers, E.-J., Gronau, Q. F., & Vandekerckhove, J. (2019).
Five Bayesian intuitions for the stopping rule principle. https://doi.org/10.31234/osf.io/5ntkd
Wallrich, L. (2025). Small telescopes for higher-power
replications. Personal blog. https://www.lukaswallrich.coffee/blog/small-telescopes-for-higher-power-replications/
Ward, M. K., & Meade, A. W. (2023). Dealing with careless responding
in survey data: Prevention, identification, and recommended best
practices. Annual Review of Psychology, 74(1),
577–596. https://doi.org/10.1146/annurev-psych-040422-045007
Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J., Appleton, G.,
Axton, M., Baak, A., & Mons, B. (2016). The FAIR guiding principles
for scientific data management and stewardship. Scientific
Data, 3(1), 1–9. https://doi.org/10.1038/sdata.2016.18
Willroth, E. C., & Atherton, O. E. (2024). Best laid plans: A guide
to reporting preregistration deviations. Advances in Methods and
Practices in Psychological Science, 7(1),
25152459231213802. https://doi.org/10.1177/25152459231213802
Winter, N. R., Leenings, R., Ernsting, J., Sarink, K., Fisch, L., Emden,
D., & Hahn, T. (2022). Quantifying deviations of brain structure and
function in major depressive disorder across neuroimaging modalities.
JAMA Psychiatry, 79(9), 879–888.
Yarkoni, T. (2013). What we can and can’t learn from the Many Labs
Replication Project. Talyarkoni.org/Blog.
Zhou, H., & Fishbach, A. (2016). The pitfall of experimenting on the
web: How unattended selective attrition leads to surprising (yet false)
research conclusions. Journal of Personality and Social
Psychology, 111(4), 493–504. https://doi.org/10.1037/pspa0000056
Zhou, X., Wu, R., Zeng, Y., Qi, Z., Ferraro, S., Xu, L., & Becker,
B. (2022). Choice of voxel-based morphometry processing pipeline drives
variability in the location of neuroanatomical brain markers.
Communications Biology, 5(1), 913.
Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2018).
Making replication mainstream. Behavioral and Brain Sciences,
41, e120. https://doi.org/10.1017/S0140525X17001972