5 Transparency and reproducibility in computation and analysis
5 sub-clusters · 54 references
Covers the how-to basics of reproducible reports and analyses, requiring students to move towards transparent, scripted analysis practices in quantitative research. Five sub-clusters further parse the learning and teaching process:
Analysis and reporting in qualitative research
Documenting and reporting research processes ensures transparency and credibility in qualitative research reports. Topics include agreed reporting standards, demonstrating methodological rigor, and recent calls to integrate qualitative methods into the open science movement. The emphasis is on making qualitative research as trustworthy and open as context permits, without forcing inappropriate replication models onto it.
- Open Stats Lab. (n.d.). OSL Open Stats Lab. Trinity Sites. https://sites.trinity.edu/osl/
- Software Carpentry. (n.d.). Software carpentry. https://software-carpentry.org/
- Project TIER. (n.d.). Teaching integrity in empirical research. https://www.projecttier.org/
- Clarke, V. (2021). Navigating the messy swamp of qualitative research: Are generic reporting standards the answer? [Review of the book Reporting qualitative research in psychology: How to meet APA Style Journal Article Reporting Standards, rev. ed., by H. M. Levitt]. Qualitative Research in Psychology, 19(4), 1004–1012. https://doi.org/10.1080/14780887.2021.1995555
- O’Brien, B. C., Harris, I. B., Beckman, T. J., Reed, D. A., & Cook, D. A. (2014). Standards for Reporting Qualitative Research. Academic Medicine, 89(9), 1245–1251. https://doi.org/10.1097/ACM.0000000000000388
- Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6), 349–357. https://doi.org/10.1093/intqhc/mzm042
Computational reproducibility
Ensuring that anyone can reproduce quantitative analyses through well-commented scripts, codebooks, version control, literate programming (e.g. Quarto), reproducible computational environments (containers, package managers), and automated data pipelines.
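The practices above can be sketched in a few lines. A minimal, illustrative example of a scripted analysis that records its environment and makes its output verifiable (the seed, structure, and field names are assumptions, not a prescribed workflow):

```python
# Minimal sketch of a scripted, reproducible analysis (illustrative only).
import hashlib
import json
import platform
import random
import sys

SEED = 2024  # fixing randomness makes reruns bit-for-bit identical
random.seed(SEED)

# 1. Record the computational environment alongside the results.
environment = {
    "python": sys.version.split()[0],
    "platform": platform.platform(),
    "seed": SEED,
}

# 2. The analysis itself: here, simulate data and compute a summary.
data = [random.gauss(0, 1) for _ in range(100)]
mean = sum(data) / len(data)

# 3. Fingerprint the data so others can verify an exact reproduction.
digest = hashlib.sha256(json.dumps(data).encode()).hexdigest()

print(json.dumps({"n": len(data), "mean": round(mean, 4), "sha256": digest, "env": environment}))
```

Re-running the script yields the same digest every time; the same idea scales up to version-controlled scripts, pinned package environments, and automated pipelines.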
- Chen, X., Dallmeier-Tiessen, S., Dasler, R., Feger, S., Fokianos, P., Gonzalez, J. B., Hirvonsalo, H., Kousidis, D., Lavasa, A., Mele, S., Rodriguez, D. R., Šimko, T., Smith, T., Trisovic, A., Trzcinska, A., Tsanaktsidis, I., Zimmermann, M., Cranmer, K., Heinrich, L., … Neubert, S. (2018). Open is not enough. Nature Physics, 15(2), 113–119. https://doi.org/10.1038/s41567-018-0342-2
- Creswell, J. W., & Miller, D. L. (2000). Determining Validity in Qualitative Inquiry. Theory Into Practice, 39(3), 124–130. https://doi.org/10.1207/s15430421tip3903_2
- Data Carpentry. (n.d.). Reproducible research in R workshop overview. https://datacarpentry.org/rr-workshop/
- Davies, D., & Dodd, J. (2002). Qualitative Research and the Question of Rigor. Qualitative Health Research, 12(2), 279–289. https://doi.org/10.1177/104973230201200211
- Fox, N. (2018). Writing reproducible scientific papers in R. YouTube. https://www.youtube.com/playlist?list=PLmvNihjFsoM5hpQdqoI7onL4oXDSQ0ym8
- Frohwirth, L., Karcher, S., & Lever, T. A. (2023). A Transparency Checklist for Qualitative Research. https://doi.org/10.31235/osf.io/wc35g
- Gandrud, C. (2016). Reproducible research with R and RStudio. CRC Press.
- Harmon-Jones, E., Harmon-Jones, C., Amodio, D. M., Gable, P. A., & Schmeichel, B. J. (2025). Valid replications require valid methods: Recommendations for best methodological practices with lab experiments. Motivation Science, 11(3), 235–245. https://doi.org/10.1037/mot0000398
- Harrison, P., Barugahare, A., & Tsyganov, K. (2020). Reproducible research in R. Monash Data Fluency. https://monashdatafluency.github.io/r-rep-res/index.html
- Humphreys, L., Lewis, N. A., Sender, K., & Won, A. S. (2021). Integrating Qualitative Methods and Open Science: Five Principles for More Trustworthy Research. Journal of Communication. https://doi.org/10.1093/joc/jqab026
- Ross-Hellauer, T., Klebel, T., Bannach-Brown, A., Horbach, S. P. J. M., Jabeen, H., Manola, N., Metodiev, T., Papageorgiou, H., Reczko, M., Sansone, S.-A., Schneider, J., Tijdink, J., & Vergoulis, T. (2022). TIER2: enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility. Research Ideas and Outcomes, 8. https://doi.org/10.3897/rio.8.e98457
- Siraji, M. A., & Rahman, M. (2023). Primer on Reproducible Research in R: Enhancing Transparency and Scientific Rigor. Clocks & Sleep, 6(1), 1–10. https://doi.org/10.3390/clockssleep6010001
- Johnson, J. L., Adkins, D., & Chauvin, S. (2020). A Review of the Quality Indicators of Rigor in Qualitative Research. American Journal of Pharmaceutical Education, 84(1), 7120. https://doi.org/10.5688/ajpe7120
- Kapiszewski, D., & Karcher, S. (2020). Transparency in Practice in Qualitative Research. PS: Political Science & Politics, 54(2), 285–291. https://doi.org/10.1017/S1049096520000955
- Levitt, H. M., Bamberg, M., Creswell, J. W., Frost, D. M., Josselson, R., & Suárez-Orozco, C. (2018). Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 26–46. https://doi.org/10.1037/amp0000151
- Levitt, H. M. (2020). Reporting qualitative research in psychology: How to meet APA style journal article reporting standards. American Psychological Association.
- Lincoln, Y. S., & Guba, E. G. (1986). But is it rigorous? Trustworthiness and authenticity in naturalistic evaluation. New Directions for Program Evaluation, 1986(30), 73–84. https://doi.org/10.1002/ev.1427
- Makel, M. C., Meyer, M. S., Simonsen, M. A., Roberts, A. M., & Plucker, J. A. (2022). Replication is relevant to qualitative research. Educational Research and Evaluation, 27(1–2), 215–219. https://doi.org/10.1080/13803611.2021.2022310
- Malterud, K. (2001). Qualitative research: standards, challenges, and guidelines. The Lancet, 358(9280), 483–488. https://doi.org/10.1016/S0140-6736(01)05627-6
- Merow, C., Serra-Diaz, J. M., Enquist, B. J., & Wilson, A. M. (2023). AI chatbots can boost scientific coding. Nature Ecology & Evolution, 7(7), 960–962. https://doi.org/10.1038/s41559-023-02063-3
- Mill, J. E., & Ogilvie, L. D. (2003). Establishing methodological rigour in international qualitative nursing research: a case study from Ghana. Journal of Advanced Nursing, 41(1), 80–87. https://doi.org/10.1046/j.1365-2648.2003.02509.x
- Denzin, N. K. (2020). https://doi.org/10.4135/9781526421036
- Navarro, D. (2019, January 11). Learning statistics with R: A tutorial for psychology students and other beginners (Version 0.6.1). https://learningstatisticswithr.com/book/
- Pratt, M. G., Kaplan, S., & Whittington, R. (2019). Editorial Essay: The Tumult over Transparency: Decoupling Transparency from Replication in Establishing Trustworthy Qualitative Research. Administrative Science Quarterly, 65(1), 1–19. https://doi.org/10.1177/0001839219887663
- Peng, R. D. (2011). Reproducible Research in Computational Science. Science, 334(6060), 1226–1227. https://doi.org/10.1126/science.1213847
- PsyTeachR. (2021). Data Skills for Reproducible Science. PsyTeachR. https://psyteachr.github.io/msc-data-skills/
- Ruggiano, N., & Perry, T. E. (2017). Conducting secondary analysis of qualitative data: Should we, can we, and how? Qualitative Social Work, 18(1), 81–97. https://doi.org/10.1177/1473325017700701
- Shaw, R. L., Bishop, F. L., Horwood, J., Chilcot, J., & Arden, M. A. (2019). Enhancing the quality and transparency of qualitative research methods in health psychology. British Journal of Health Psychology, 24(4), 739–745. https://doi.org/10.1111/bjhp.12393
- Stahl, N. A., & King, J. R. (2020). Expanding approaches for research: Understanding and using trustworthiness in qualitative research. Journal of Developmental Education, 44(1), 26–28. https://files.eric.ed.gov/fulltext/EJ1320570.pdf
- Steltenpohl, C. N., Lustick, H., Meyer, M. S., Lee, L. E., Stegenga, S. M., Standiford Reyes, L., & Renbarger, R. L. (2023). Rethinking Transparency and Rigor from a Qualitative Open Science Perspective. Journal of Trial and Error, 4(1), 47–59. https://doi.org/10.36850/mr7
- Stodden, V., & Miguez, S. (2014). Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research. Journal of Open Research Software, 2(1), e21. https://doi.org/10.5334/jors.ay
- Tamminen, K. A., Bundon, A., Smith, B., McDonough, M. H., Poucher, Z. A., & Atkinson, M. (2021). Considerations for making informed choices about engaging in open qualitative research. Qualitative Research in Sport, Exercise and Health, 13(5), 864–886. https://doi.org/10.1080/2159676X.2021.1901138
- The Turing Way Community, Arnold, B., Bowler, L., Gibson, S., Herterich, P., Higman, R., Krystalli, A., Morley, A., O'Reilly, M., & Whitaker, K. (2019). The Turing Way: A Handbook for Reproducible Data Science (Version v0.0.4) [Computer software]. Zenodo. https://doi.org/10.5281/zenodo.3233986
- Wilson, G., Bryan, J., Cranston, K., Kitzes, J., Nederbragt, L., & Teal, T. K. (2017). Good enough practices in scientific computing. PLOS Computational Biology, 13(6), e1005510. https://doi.org/10.1371/journal.pcbi.1005510
Free and open source software
Free and open source software is a foundation for reproducible research: open tooling lowers access barriers, enables community review, and supports longevity through transparent code, issue tracking, and forking.
- Chao, L. (2009). Utilizing open source tools for online teaching and learning Information Science. Information Science Reference.
- Huang, R. (2016). RQDA: R-based qualitative data analysis (Version 0.2-8) [Computer software]. R Project. http://rqda.r-forge.r-project.org/
- Morin, A., Urban, J., & Sliz, P. (2012). A Quick Guide to Software Licensing for the Scientist-Programmer. PLoS Computational Biology, 8(7), e1002598. https://doi.org/10.1371/journal.pcbi.1002598
- Nab, L., Schaffer, A. L., Hulme, W., DeVito, N. J., Dillingham, I., Wiedemann, M., Andrews, C. D., Curtis, H., Fisher, L., Green, A., Massey, J., Walters, C. E., Higgins, R., Cunningham, C., Morley, J., Mehrkar, A., Hart, L., Davy, S., Evans, D., … Goldacre, B. (2024). OpenSAFELY: A platform for analysing electronic health records designed for reproducible research. Pharmacoepidemiology and Drug Safety, 33(6). https://doi.org/10.1002/pds.5815
- Raymond, E. (1999). The cathedral and the bazaar. Knowledge, Technology & Policy, 12(3), 23–49. https://doi.org/10.1007/s12130-999-1026-0
Research software engineering
Covers the emerging role of Research Software Engineers (RSEs), professionals who develop software for research purposes. Emphasizes best practices in coding (testing, version control, documentation) as integral to research transparency. Also discusses how RSEs bridge the gap between traditional IT and academic science, ensuring that scientific software is reliable and sustainable.
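As a tiny illustration of the testing practice this cluster emphasizes, a unit test can pin down a function's intended behaviour, including its edge cases (the function and test below are hypothetical examples, not taken from any cited source):

```python
def standardise(values):
    """Return z-scores; refuse constant input rather than divide by zero."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    if var == 0:
        raise ValueError("cannot standardise a constant series")
    sd = var ** 0.5
    return [(v - mean) / sd for v in values]

def test_standardise():
    # Z-scores are centred on zero...
    z = standardise([1.0, 2.0, 3.0])
    assert abs(sum(z)) < 1e-9
    # ...and the edge case is documented and enforced, not left implicit.
    try:
        standardise([5.0, 5.0])
    except ValueError:
        pass
    else:
        raise AssertionError("constant input should raise")

test_standardise()
```

Tests like this, kept under version control and run automatically, are what make scientific software auditable by people other than its author.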
- Carver, J. C., Weber, N., Ram, K., Gesing, S., & Katz, D. S. (2022). A survey of the state of the practice for research software in the United States. PeerJ Computer Science, 8, e963. https://doi.org/10.7717/peerj-cs.963
- Lamprecht, A.-L., Garcia, L., Kuzak, M., Martinez, C., Arcila, R., Martin Del Pico, E., Dominguez Del Angel, V., van de Sandt, S., Ison, J., Martinez, P. A., McQuilton, P., Valencia, A., Harrow, J., Psomopoulos, F., Gelpi, J. Ll., Chue Hong, N., Goble, C., & Capella-Gutierrez, S. (2019). Towards FAIR principles for research software. Data Science, 3(1), 37–59. https://doi.org/10.3233/DS-190026
Tools to check yourself and others
Detecting errors in the published literature, and preventing them from entering it by checking your own work first. Includes tools such as statcheck (statcheck.io), GRIM, and SPRITE for detecting inconsistencies in reported statistics.
- Brown, N. J. L., & Heathers, J. A. J. (2016). The GRIM Test. Social Psychological and Personality Science, 8(4), 363–369. https://doi.org/10.1177/1948550616673876
- Baker, M. (2016). How quality control could save your science. Nature, 529(7587), 456–458. https://doi.org/10.1038/529456a
- Nuijten, M. B., Van Assen, M. A. L. M., Hartgerink, C. H. J., Epskamp, S., & Wicherts, J. M. (2017). The validity of the tool “statcheck” in discovering statistical reporting inconsistencies. https://psyarxiv.com/tcxaj/.
- Pejdo, D., Ursić, L., & Marušić, A. (2025). Primer on Reproducibility: Trends, Tools, and Some Tips and Tricks. Current Protocols, 6(1). https://doi.org/10.1002/cpz1.70299
- Strand, J. F. (2021). Error Tight: Exercises for Lab Groups to Prevent Research Mistakes. https://doi.org/10.31234/osf.io/rsn5y
- van der Zee, T., Anaya, J., & Brown, N. J. L. (2017). Statistical heartburn: an attempt to digest four pizza publications from the Cornell Food and Brand Lab. BMC Nutrition, 3(1). https://doi.org/10.1186/s40795-017-0167-x