Reproducibility vs. replicability across disciplines
This table compares how reproducibility and replicability are understood in different disciplines, highlighting the slightly different interpretations used by different communities. Barba (2018) surveys how the two terms are used across disciplines, authors, and stakeholder groups.
Area of knowledge | Reproducibility | Replicability | Additional readings |
---|---|---|---|
General definition | Obtaining the same results using the original data and exact methods | Obtaining consistent results when repeating a study with new data, under similar conditions, and/or with similar methods | |
Experimental sciences (physics, chemistry, biology) | Recreating a specific experiment using the same equipment and protocols | Repeating the experiment with different variables or subjects to verify that the results generalize | (Baker 2017), (Mesnard and Barba 2017) |
Computer science | Obtaining the same results when executing the same code on the same data | Obtaining consistent results by implementing the same algorithm in a different environment and/or running it on a different data set | (Peng 2011), (V. Stodden and Miguez 2014), (The Turing Way Community 2019) |
Social sciences (psychology, sociology) | Obtaining similar results when repeating a study with new participants | Repeating a study with different groups of participants or in different cultural contexts to evaluate the generalizability of the findings | (Baker 2015), (Open Science Collaboration 2015), (Nosek et al. 2022) |
Mathematical sciences | Obtaining the same results by following the same reasoning and methods | Repeating demonstrations and proofs in different mathematical contexts to confirm the validity and generality of theorems | (Victoria Stodden et al. 2013), (Donoho and Stodden 2015) |
Economics | Obtaining the same results when applying the same model and code to the original data set | Repeating economic analyses with data from different periods or regions to evaluate the robustness of the results | (Ioannidis, Stanley, and Doucouliagos 2017), (Vilhuber 2020) |
Education | Repeating a study using the same design and methods to confirm the results | Repeating the study in different schools or educational settings to examine the applicability of the findings | (National Science Foundation and Institute of Education Sciences 2019), (LeBeau, Ellison, and Aloe 2021), (Perry, Morris, and Lea 2022), (Karathanasis et al. 2022), (Pownall et al. 2023) |
Medicine and health sciences | Repeating a study using the same methods to validate the results | Repeating a clinical study with different patient groups or under different healthcare conditions to confirm the applicability of the findings | (Ioannidis 2005), (Button et al. 2013), (Niven et al. 2018), (Stupple, Singerman, and Celi 2019), (McDermott et al. 2021), (Moassefi et al. 2023), (Montgomery 2024) |
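The computer-science distinction in the table can be sketched in a few lines of code: rerunning the same code on the same data with a pinned random seed should give a bit-identical result (reproducibility), while an independent implementation of the same analysis should give a consistent, but not necessarily identical, answer (replicability). The bootstrap example, the toy data, and the tolerance below are illustrative assumptions, not taken from any of the cited works.

```python
import random

def run_analysis(data, seed=42):
    """Original analysis: a bootstrap estimate of the mean.
    Pinning the seed makes reruns on the same data bit-identical."""
    rng = random.Random(seed)
    n = len(data)
    # 1000 bootstrap resamples; each resample's mean, then their average.
    means = [sum(rng.choices(data, k=n)) / n for _ in range(1000)]
    return sum(means) / len(means)

data = [2.0, 3.5, 1.0, 4.2, 2.8]

# Reproducibility: same code, same data, same seed -> identical result.
r1 = run_analysis(data)
r2 = run_analysis(data)
assert r1 == r2

# Replicability: an independent implementation (here, the plain sample
# mean) answers the same question and should agree within tolerance,
# though not bit-for-bit.
independent_estimate = sum(data) / len(data)
assert abs(r1 - independent_estimate) < 0.5
```

The same contrast scales up to real pipelines: reproducibility is about capturing code, data, and environment exactly, while replicability tolerates differences in implementation and inputs as long as the conclusions hold.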
References
Baker, Monya. 2015. “First Results from Psychology’s Largest Reproducibility Test.” Nature News. https://doi.org/10.1038/nature.2015.17433.
———. 2017. “Check Your Chemistry.” Nature 548 (7668): 485–88. https://doi.org/10.1038/548485a.
Barba, Lorena A. 2018. “Terminologies for Reproducible Research.” arXiv Preprint arXiv:1802.03311. https://arxiv.org/abs/1802.03311.
Button, Katherine S., John P. A. Ioannidis, Claire Mokrysz, Brian A. Nosek, Jonathan Flint, Emma S. J. Robinson, and Marcus R. Munafò. 2013. “Power Failure: Why Small Sample Size Undermines the Reliability of Neuroscience.” Nature Reviews Neuroscience 14 (5): 365–76. https://doi.org/10.1038/nrn3475.
Donoho, David L, and Victoria Stodden. 2015. “Reproducible Research in the Mathematical Sciences.” In Princeton Companion to Applied Mathematics, 916–25. Princeton University Press. https://www.stodden.net/papers/PCAM_20140620-VCS.pdf.
National Science Foundation and Institute of Education Sciences, U.S. Department of Education. 2019. Companion Guidelines on Replication & Reproducibility in Education Research. The National Science Foundation. https://www.nsf.gov/pubs/2019/nsf19022/nsf19022.pdf.
Ioannidis, John P. A. 2005. “Why Most Published Research Findings Are False.” PLOS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124.
Ioannidis, John P. A., T. D. Stanley, and Hristos Doucouliagos. 2017. “The Power of Bias in Economics Research.” The Economic Journal 127 (605): F236–65. https://doi.org/10.1111/ecoj.12461.
Karathanasis, Nestoras, Daniel Hwang, Vibol Heng, Rimal Abhimannyu, Phillip Slogoff-Sevilla, Gina Buchel, Victoria Frisbie, Peiyao Li, Dafni Kryoneriti, and Isidore Rigoutsos. 2022. “Reproducibility Efforts as a Teaching Tool: A Pilot Study.” PLOS Computational Biology 18 (11): e1010615. https://doi.org/10.1371/journal.pcbi.1010615.
LeBeau, Brandon, Scott Ellison, and Ariel M. Aloe. 2021. “Reproducible Analyses in Education Research.” Review of Research in Education 45 (1): 195–222. https://doi.org/10.3102/0091732X20985076.
McDermott, Matthew B. A., Shirly Wang, Nikki Marinsek, Rajesh Ranganath, Luca Foschini, and Marzyeh Ghassemi. 2021. “Reproducibility in Machine Learning for Health Research: Still a Ways to Go.” Science Translational Medicine 13 (586): eabb1655. https://doi.org/10.1126/scitranslmed.abb1655.
Mesnard, Olivier, and Lorena A. Barba. 2017. “Reproducible and Replicable Computational Fluid Dynamics: It’s Harder Than You Think.” Computing in Science Engineering 19 (4): 44–55. https://doi.org/10.1109/MCSE.2017.3151254.
Moassefi, Mana, Pouria Rouzrokh, Gian Marco Conte, Sanaz Vahdati, Tianyuan Fu, Aylin Tahmasebi, Mira Younis, et al. 2023. “Reproducibility of Deep Learning Algorithms Developed for Medical Imaging Analysis: A Systematic Review.” Journal of Digital Imaging 36 (5): 2306–12. https://doi.org/10.1007/s10278-023-00870-5.
Montgomery, Erwin B. 2024. Reproducibility in Biomedical Research: Epistemological and Statistical Problems and the Future. 2nd ed. Academic Press.
Niven, Daniel J., T. Jared McCormick, Sharon E. Straus, Brenda R. Hemmelgarn, Lianne Jeffs, Tavish R. M. Barnes, and Henry T. Stelfox. 2018. “Reproducibility of Clinical Research in Critical Care: A Scoping Review.” BMC Medicine 16 (1). https://doi.org/10.1186/s12916-018-1018-6.
Nosek, Brian A., Tom E. Hardwicke, Hannah Moshontz, Aurélien Allard, Katherine S. Corker, Anna Dreber, Fiona Fidler, et al. 2022. “Replicability, Robustness, and Reproducibility in Psychological Science.” Annual Review of Psychology 73 (1): 719–48. https://doi.org/10.1146/annurev-psych-020821-114157.
Open Science Collaboration. 2015. “Estimating the Reproducibility of Psychological Science.” Science 349 (6251): aac4716. https://doi.org/10.1126/science.aac4716.
Peng, Roger D. 2011. “Reproducible Research in Computational Science.” Science 334 (6060): 1226–27. https://doi.org/10.1126/science.1213847.
Pownall, Madeleine, Flávio Azevedo, Laura M. König, Hannah R. Slack, Thomas Rhys Evans, Zoe Flack, Sandra Grinschgl, et al. 2023. “Teaching Open and Reproducible Scholarship: A Critical Review of the Evidence Base for Current Pedagogical Methods and Their Outcomes.” Royal Society Open Science 10 (5). https://doi.org/10.1098/rsos.221255.
Stodden, Victoria, David H. Bailey, Jonathan Borwein, R. J. LeVeque, W. Rider, and W. Stein. 2013. “Setting the Default to Reproducible: Reproducibility in Computational and Experimental Mathematics.” SIAM News 46 (5): 4–6. https://stodden.net/icerm_report.pdf.
Stodden, Victoria, and Sheila Miguez. 2014. “Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research.” Journal of Open Research Software 2 (1): e21. https://doi.org/10.5334/jors.ay.
Stupple, Aaron, David Singerman, and Leo Anthony Celi. 2019. “The Reproducibility Crisis in the Age of Digital Medicine.” npj Digital Medicine 2 (1). https://doi.org/10.1038/s41746-019-0079-z.
The Turing Way Community. 2019. “The Turing Way: A Handbook for Reproducible Data Science.” Zenodo. https://doi.org/10.5281/zenodo.3233986.
Perry, Thomas, Rebecca Morris, and Rosanna Lea. 2022. “A Decade of Replication Study in Education? A Mapping Review (2011–2020).” Educational Research and Evaluation 27 (1–2): 12–34. https://doi.org/10.1080/13803611.2021.2022315.
Vilhuber, Lars. 2020. “Reproducibility and Replicability in Economics.” Harvard Data Science Review 2 (4). https://doi.org/10.1162/99608f92.4f6b9e67.