
5 Transparency and reproducibility in computation and analysis

5 sub-clusters · 54 references

Covers the how-to basics of reproducible reports and analyses, requiring students to move towards transparent and scripted analysis practices in quantitative research. There are 5 sub-clusters which further parse the learning and teaching process:

Analysis and reporting in qualitative research 6 / 6

Documenting and reporting research processes ensures transparency and credibility in qualitative research reports. Topics include using agreed reporting standards, demonstrating methodological rigor, and recent calls to integrate qualitative methods into the open science movement. The emphasis is on making qualitative research as trustworthy and open as context permits, without forcing an inappropriate replication model.

Open Stats Lab. (n.d.). OSL Open Stats Lab. https://sites.trinity.edu/osl/
Software Carpentry. (n.d.). Software Carpentry. https://software-carpentry.org/
Project TIER. (n.d.). Teaching integrity in empirical research. https://www.projecttier.org/
practice/tools Paper
Navigating the messy swamp of qualitative research: Are generic reporting standards the answer? A review essay of the book Reporting Qualitative Research in Psychology: How to Meet APA Style Journal Article Reporting Standards, Revised Edition, by Heidi M. Levitt, Washington, DC, American Psychological Association, 2020, 173 pp., $29.99 (paperback), ISBN: 978-1-4338-3343-4
This article identifies and corrects ten common pitfalls in published reflexive thematic analysis research to clarify what constitutes high-quality practice. It specifically challenges the application of quantitative-style metrics like inter-rater reliability, arguing for quality standards that align with the qualitative and reflexive nature of the method.
O’Brien, B. C., Harris, I. B., Beckman, T. J., Reed, D. A., & Cook, D. A. (2014). Standards for Reporting Qualitative Research. Academic Medicine, 89(9), 1245–1251. https://doi.org/10.1097/ACM.0000000000000388
practice/tools Paper
Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups
This resource provides the COREQ checklist, a highly specific 32-item tool dedicated to improving the reporting quality of studies specifically using interviews and focus groups. It ensures researchers transparently document critical details about the research team, study design, and data analysis process to facilitate better appraisal and synthesis of qualitative work.
Computational reproducibility 35 / 35

Making sure anyone can reproduce quantitative analyses through practices such as well-commented scripts, written codebooks, version control, literate programming (e.g. Quarto), reproducible computational environments (containers, package managers), and automated data pipelines.
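A minimal sketch of what a scripted, reproducible analysis can look like in practice (the function names and provenance fields are illustrative, not drawn from any of the listed resources): fix a random seed, keep the analysis in a pure function, and record the environment plus a checksum of the result alongside the output.

```python
import hashlib
import json
import platform
import random
import sys

def run_analysis(data, seed=2024):
    """Toy scripted analysis: deterministic given the same data and seed."""
    rng = random.Random(seed)              # seeded RNG, not global state
    sample = sorted(rng.sample(data, k=3))
    return {"sample": sample, "mean": sum(sample) / len(sample)}

def provenance(result):
    """Record the environment and a checksum of the result for the archive."""
    blob = json.dumps(result, sort_keys=True).encode()
    return {
        "python": sys.version.split()[0],
        "platform": platform.system(),
        "result_sha256": hashlib.sha256(blob).hexdigest(),
    }

result = run_analysis(list(range(10)))
record = {"result": result, "provenance": provenance(result)}
```

Because the seed and environment are recorded, a second run on the same setup yields byte-identical results, and a differing checksum flags a divergence immediately.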

advocacy Paper
Open is not enough
This resource uses the experiences of the high-energy physics community to argue that simple data openness is insufficient for ensuring true computational reproducibility. It advocates for the adoption of more comprehensive practices, such as preserving the entire computational environment and workflow, to make research findings genuinely verifiable by others.
critique Paper
Determining Validity in Qualitative Inquiry
This review essay evaluates the American Psychological Association’s reporting standards for qualitative research in psychology. It critiques the utility of standardized, generic templates in a diverse methodological field, questioning whether such standards can accommodate the inherent complexity and messiness of qualitative inquiry.
Data Carpentry. (n.d.). Reproducible research in R workshop overview. https://datacarpentry.org/rr-workshop/
practice/tools Paper
Qualitative Research and the Question of Rigor
This resource provides a framework for selecting qualitative validity procedures by cross-referencing a researcher's philosophical lens with their adopted paradigm. It details nine specific strategies—such as triangulation, member checking, and thick description—to help researchers ensure and demonstrate the rigor of their findings.
Fox, N. (2018). Writing reproducible scientific papers in R. YouTube. https://www.youtube.com/playlist?list=PLmvNihjFsoM5hpQdqoI7onL4oXDSQ0ym8
practice/tools Preprint
A Transparency Checklist for Qualitative Research
This preprint proposes a transparency checklist tailored to qualitative research, offering authors a structured way to disclose key decisions about study design, data collection, and analysis. It aims to support openness in qualitative work without reducing transparency to quantitative-style replication.
Gandrud, C. (2016). Reproducible research with R and RStudio. CRC Press.
practice/tools Paper
Valid replications require valid methods: Recommendations for best methodological practices with lab experiments.
This resource provides actionable methodological recommendations for conducting lab experiments to ensure they serve as a solid foundation for valid replications. It highlights specific practices in experimental design and implementation that are essential for producing reliable and reproducible findings.
Harrison, P., Barugahare, A., & Tsyganov, K. (2020). Reproducible research in R. Monash Data Fluency. https://monashdatafluency.github.io/r-rep-res/index.html
practice/tools Paper
Integrating Qualitative Methods and Open Science: Five Principles for More Trustworthy Research*
This article proposes five principles for bringing qualitative methods into the open science movement in a way that respects their epistemological commitments. It argues that openness practices should be adapted to, rather than imposed on, qualitative research in order to make it more trustworthy.
overview Paper
TIER2: enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility
This resource describes TIER2, a European Commission-funded project designed to investigate and improve research reproducibility across the social, life, and computer sciences. It outlines a systematic approach to developing tools and guidelines for diverse stakeholders to foster a more robust and trustworthy scientific ecosystem.
practice/tools Paper
Primer on Reproducible Research in R: Enhancing Transparency and Scientific Rigor
This publication provides a practical tutorial on using the R programming language to achieve computational reproducibility, specifically targeting researchers who may have limited coding experience. It offers a step-by-step primer on documenting research procedures and contexts more comprehensively to overcome the transparency limitations inherent in traditional publication practices.
practice/tools Paper
Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research
This article synthesizes a formalized set of best practice recommendations for managing software infrastructure and computational environments to ensure research is reproducible and extensible. It provides a practical framework for computational scientists across disciplines to standardize the dissemination of their code and data.
overview Paper
A Review of the Quality Indicators of Rigor in Qualitative Research
This resource synthesizes key indicators of rigor and quality specifically tailored for qualitative research within the health professions education field. It maps out best practices across the entire research process—from question formulation to final reporting—to minimize researcher bias and enhance the overall trustworthiness of findings.
Kapiszewski, D., & Karcher, S. (2020). Transparency in Practice in Qualitative Research. PS: Political Science & Politics, 54(2), 285–291. https://doi.org/10.1017/S1049096520000955
overview Paper
Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report.
This report presents the APA Publications and Communications Board task force's journal article reporting standards for qualitative primary research, qualitative meta-analysis, and mixed methods research in psychology. It gives authors, reviewers, and editors a shared framework for what should be reported in qualitative and mixed methods articles.
Levitt, H. M. (2020). Reporting qualitative research in psychology: How to meet APA style journal article reporting standards. American Psychological Association.
overview Paper
But is it rigorous? Trustworthiness and authenticity in naturalistic evaluation
This resource introduces foundational criteria for rigor in naturalistic inquiry, specifically proposing the concepts of trustworthiness and authenticity as alternatives to traditional quantitative standards. It provides a conceptual framework that allows qualitative researchers to demonstrate the quality and integrity of their work while remaining consistent with the paradigm’s unique assumptions.
advocacy Paper
Replication is relevant to qualitative research
This paper argues for the relevance and value of replication within qualitative research, suggesting it can address issues of transparency and transferability. It seeks to promote the adoption of replication as a fundamental building block of scholarship even in methodologies where it has traditionally been ignored.
Malterud, K. (2001). Qualitative research: standards, challenges, and guidelines. The Lancet, 358(9280), 483–488. https://doi.org/10.1016/S0140-6736(01)05627-6
practice/tools Paper
AI chatbots can boost scientific coding
This article explores how AI chatbots can be integrated into scientific workflows to assist with programming tasks, debugging, and documentation. It provides actionable insights into how these tools can lower the barrier to entry for computational methods and enhance the reproducibility of research code.
overview Paper
Establishing methodological rigour in international qualitative nursing research: a case study from Ghana
This resource surveys the fundamental standards and common challenges inherent in conducting high-quality qualitative research across various fields. It provides general guidelines to help researchers navigate the complexities of qualitative methodology while maintaining rigor and transparency.
overview Book
Denzin, Norman
This resource offers an integrated account of how a researcher’s ethical obligations and epistemological commitments should shape their approach to research openness. It argues for a flexible understanding of transparency in political science that accounts for different ways of knowing and the specificities of research with human participants.
Navarro, D. (2019). Learning statistics with R: A tutorial for psychology students and other beginners (Version 0.6.1). https://learningstatisticswithr.com/book/
critique Editorial
Editorial Essay: The Tumult over Transparency: Decoupling Transparency from Replication in Establishing Trustworthy Qualitative Research
This editorial warns against the uncritical transfer of transparency and replication standards from psychology to qualitative management research. It argues for decoupling transparency from replication, suggesting that while transparency is necessary for trust, replication is often a poor fit for qualitative research goals.
overview Paper
Reproducible Research in Computational Science
This article proposes reproducibility as a necessary minimum standard for evaluating scientific claims within computational science, especially when independent replication is unfeasible. It highlights how the increasing complexity of computational work necessitates transparent sharing of code and data to ensure the validity of published findings.
PsyTeachR. (2021). Data Skills for Reproducible Science. PsyTeachR. https://psyteachr.github.io/msc-data-skills/
evidence Paper
Conducting secondary analysis of qualitative data: Should we, can we, and how?
This critical interpretive synthesis analyzes 71 published articles to provide empirical evidence on the current state and methodologies of qualitative secondary data analysis. It systematically maps how researchers navigate methodological and ethical concerns, offering a data-driven look at the prevalence and execution of these practices across disciplines.
teaching/training Editorial
Enhancing the quality and transparency of qualitative research methods in health psychology
This resource introduces a digital research environment built on Semantic MediaWiki designed to facilitate collaborative analysis using the method of objective hermeneutics. It specifically explores how this platform enhances transparency and supports student learning within research-based university seminars.
Stahl, N. A., & King, J. R. (2020). Expanding approaches for research: Understanding and using trustworthiness in qualitative research. Journal of Developmental Education, 44(1), 26–28. https://files.eric.ed.gov/fulltext/EJ1320570.pdf
critique Paper
Rethinking Transparency and Rigor from a Qualitative Open Science Perspective
This paper critiques the quantitative-centric definition of transparency in open science, arguing that current frameworks do not align with the epistemic goals of qualitative research. It proposes a broader perspective that emphasizes researcher reflexivity and contextual data interpretation as essential components of rigor.
Stodden, V., & Miguez, S. (2014). Best practices for computational science: Software infrastructure and environments for reproducible and extensible research. Journal of Open Research Software, 2(1), e21. https://openresearchsoftware.metajnl.com/articles/10.5334/jors.ay
Tamminen, K. A., Bundon, A., Smith, B., McDonough, M. H., Poucher, Z. A., & Atkinson, M. (2021). Considerations for making informed choices about engaging in open qualitative research. Qualitative Research in Sport, Exercise and Health, 13(5), 864–886. https://doi.org/10.1080/2159676X.2021.1901138
practice/tools Website
The Turing Way: A Handbook for Reproducible Data Science
This collaboratively written handbook serves as a comprehensive guide to reproducible, ethical, and collaborative data science practices throughout the research lifecycle. It offers an extensive collection of actionable tutorials, checklists, and templates designed to make open science practices achievable for researchers across all fields.
practice/tools Preprint
Good enough practices in scientific computing
This resource outlines a minimum set of practical, "good enough" computing habits designed to improve data management and analysis reproducibility for researchers across all scientific disciplines. It focuses on accessible entry points for scientists without formal computer science training, covering topics like file organization, documentation, and versioning.
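One of the "good enough" habits named above, knowing exactly which data files an analysis ran on, can be sketched as a checksum manifest (a hypothetical helper, not code from the paper):

```python
import hashlib
import pathlib

def manifest(directory):
    """Map every file under `directory` to its SHA-256 checksum, so
    collaborators can verify they are analysing exactly the same data."""
    checksums = {}
    for path in sorted(pathlib.Path(directory).rglob("*")):
        if path.is_file():
            rel = str(path.relative_to(directory))
            checksums[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return checksums
```

Committing such a manifest next to the analysis scripts makes silent changes to raw data detectable with a single comparison.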
Free and open source software 5 / 5

Free and open source software is a foundation for reproducible research: open tooling lowers access barriers, enables community review, and supports longevity through transparent code, issue tracking, and forking.

Chao, L. (2009). Utilizing open source tools for online teaching and learning. Information Science Reference.
Huang, R. (2016). RQDA: R-based qualitative data analysis (Version 0.2-8) [Computer software]. R Project. http://rqda.r-forge.r-project.org/
practice/tools Paper
A Quick Guide to Software Licensing for the Scientist-Programmer
This guide simplifies the complex landscape of software licensing for researchers who develop their own tools, helping them choose appropriate legal frameworks for their work. It provides practical advice on navigating institutional requirements to ensure that scientific software can be legally shared and reused.
practice/tools Paper
OpenSAFELY: A platform for analysing electronic health records designed for reproducible research
This paper introduces OpenSAFELY, a secure software platform designed to enable reproducible and transparent analysis of electronic health records (EHRs) at scale. It addresses technical and privacy barriers in health data research by providing a framework where analysis code is automatically shared and execution is standardized.
advocacy Book
The cathedral and the bazaar
This seminal essay contrasts two models of software development—the closed "cathedral" and the collaborative "bazaar"—to explain the success of the Linux kernel. It articulates the fundamental principles of peer review and decentralized community iteration that underpin the effectiveness of the open-source movement.
Research software engineering 2 / 2

Covers the emerging role of Research Software Engineers (RSEs), professionals who develop software for research purposes. Emphasizes best practices in coding (testing, version control, documentation) as integral to research transparency. Also discusses how RSEs bridge the gap between traditional IT and academic science, ensuring that scientific software is reliable and sustainable.
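Testing, one of the coding practices named above, can be as lightweight as pairing each function with an assertion-based check (a hypothetical example, not drawn from the surveyed resources):

```python
def zscore(values):
    """Standardize values (population standard deviation).  Raises
    ValueError on constant input, where a z-score is undefined."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    if variance == 0:
        raise ValueError("z-score undefined for constant input")
    sd = variance ** 0.5
    return [(v - mean) / sd for v in values]

def test_zscore_has_zero_mean_and_unit_sd():
    z = zscore([1, 2, 3, 4, 5])
    assert abs(sum(z)) < 1e-9                               # mean ~ 0
    assert abs(sum(v * v for v in z) / len(z) - 1) < 1e-9   # sd ~ 1
```

Even a single invariant check like this, run automatically under version control, catches the silent regressions that undermine reproducibility.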

evidence Paper
A survey of the state of the practice for research software in the United States
This study presents empirical data from a large-scale survey of researchers to identify the structural and technical challenges involved in maintaining sustainable research software. It highlights a widespread, unmet need for formalized software training and identifies critical gaps in institutional support for the researchers who build these tools.
Lamprecht, A.-L., Garcia, L., Kuzak, M., Martinez, C., Arcila, R., Martin Del Pico, E., Dominguez Del Angel, V., van de Sandt, S., Ison, J., Martinez, P. A., McQuilton, P., Valencia, A., Harrow, J., Psomopoulos, F., Gelpi, J. Ll., Chue Hong, N., Goble, C., & Capella-Gutierrez, S. (2019). Towards FAIR principles for research software. Data Science, 3(1), 37–59. https://doi.org/10.3233/DS-190026
Tools to check yourself and others 6 / 6

Detecting errors in the published literature, and preventing them from entering it by checking your own work. Includes tools such as statcheck.io, GRIM, and SPRITE to detect errors in the reporting of statistics.

practice/tools Paper
The GRIM Test
This paper introduces the Granularity-Related Inconsistency of Means (GRIM) test, a simple mathematical technique used to detect reporting errors in summary statistics derived from integer-based data like Likert scales. It provides a low-threshold method for researchers and peer reviewers to verify the mathematical consistency of means reported in psychological research.
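The GRIM logic described above can be sketched in a few lines (an illustrative implementation, not the authors' code; note that Python's `round` uses round-half-to-even, while published means may use round-half-up, so borderline cases deserve a manual look):

```python
def grim_consistent(mean, n, decimals=2):
    """GRIM check: could any integer sum of n integer responses
    round to the reported mean at the given precision?"""
    target = round(mean, decimals)
    centre = round(target * n)
    # any sum rounding to `target` lies within n * 0.5 * 10^-decimals of target * n
    radius = int(n * 0.5 * 10 ** -decimals) + 1
    return any(
        round(s / n, decimals) == target
        for s in range(centre - radius, centre + radius + 1)
    )

# e.g. M = 3.48 with n = 25 is attainable (sum 87);
# M = 5.19 with n = 28 has no integer sum that rounds to it
```

The appeal of the test is exactly this simplicity: it needs only the reported mean, the sample size, and the knowledge that the underlying responses were integers.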
practice/tools Paper
How quality control could save your science
This resource outlines practical quality control measures researchers can implement to prevent common errors and improve the reproducibility of their work. It highlights specific steps such as better documentation, blinding, and validating reagents to bolster scientific rigor.
Nuijten, M. B., van Assen, M. A. L. M., Hartgerink, C. H. J., Epskamp, S., & Wicherts, J. M. (2017). The validity of the tool "statcheck" in discovering statistical reporting inconsistencies. https://psyarxiv.com/tcxaj/
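statcheck recomputes p-values from reported test statistics and flags mismatches; the simplest case, a two-tailed z test, can be sketched with the standard library alone (an illustrative sketch, not statcheck's actual implementation):

```python
import math

def p_from_z(z):
    """Two-tailed p-value for a z statistic, via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2))

def consistent(z, reported_p, decimals=3):
    """True when the recomputed p matches the reported p up to rounding."""
    return abs(round(p_from_z(z), decimals) - reported_p) < 10 ** -decimals

# z = 1.96 recomputes to p ~ .050, so a reported p of .050 is consistent
```

The real tool handles t, F, chi-square, and r statistics extracted automatically from article text, but the underlying idea is this same recompute-and-compare step.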
overview Paper
Primer on Reproducibility: Trends, Tools, and Some Tips and Tricks
This primer provides a comprehensive introduction to the reproducibility crisis, detailing its multifaceted causes and implications for the scientific community. It offers a curated collection of trends and practical tools designed to help researchers enhance the reproducibility of their own work.
teaching/training Preprint
Error Tight: Exercises for Lab Groups to Prevent Research Mistakes
This resource introduces a structured set of exercises based on human factors research to help lab groups identify and prevent common procedural errors. It focuses on building a collaborative lab culture that prioritizes error detection through systematic workflow audits and collective accountability.
evidence Paper
Statistical heartburn: an attempt to digest four pizza publications from the Cornell Food and Brand Lab
This article demonstrates the application of simple statistical checks to identify inconsistencies and potential errors within a specific set of published research papers. By detailing a forensic analysis of these case studies, it provides a model for how researchers can independently verify the plausibility and accuracy of reported data.