Today's blog is from Joanna Diong. Jo is a faculty member at the University of Sydney, where she investigates muscle activity (EMG) in motor control. She is passionate about good science and regularly co-authors the blog scientificallysound.org as a resource for people interested in doing research well.
In recent years, large replication studies in psychology and cancer biology have highlighted the inability to reproduce key scientific findings, and continue to fuel the debate on the reproducibility crisis in research. These findings, along with similar results from other meta-research studies, have prompted researchers to look more closely at how research is done, and to develop ways to improve the transparency and conduct of research. Fellow ICECReam’ers have written about related issues such as the quality of clinical trials and the limitations of p values. What are some ways that we, as a community of early career researchers, could implement reproducible research practices in our own work? Here are some strategies, adapted from a paper by Sandve and colleagues:
1) For every result, keep track of how it was produced
The development of a research question or testing protocol often goes through many iterations before it is finalised for a study. Especially during such a tortuous process, record your thought processes, the rationale for testing procedures, and how pilot data were collected and analysed to obtain preliminary results. If you use a computer program to process and analyse data, record enough detail about program settings, parameters and manual procedures that your future self can reproduce the results a year from now.
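One lightweight way to keep such a record is to save the settings alongside the results every time an analysis runs. The sketch below does this in Python; the script name, filter settings and field names are all invented for illustration, not taken from any particular study.

```python
import json
from datetime import date

# A hypothetical record of how one result was produced. Every name and
# value here (script, filter settings, sampling rate) is illustrative.
analysis_log = {
    "date": str(date.today()),
    "script": "process_emg.py",  # the script that produced the result
    "filter": {"type": "butterworth", "order": 4, "cutoff_hz": 10},
    "sampling_rate_hz": 2000,
    "notes": "Pilot data; baseline removed before rectification.",
}

# Save the settings next to the results so they can be reproduced later.
with open("analysis_log.json", "w") as f:
    json.dump(analysis_log, f, indent=2)
```

A year later, opening `analysis_log.json` tells you exactly which settings produced the result, without relying on memory.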
2) Avoid manual data manipulation steps
Where possible, use an automated program (e.g. a programming script) rather than manual procedures (e.g. clicking through drop-down boxes) to clean or analyse data. Manual procedures are error-prone, inefficient, and difficult to reproduce.
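For example, rather than deleting rows with missing values by hand in a spreadsheet, a few lines of script do the same job and leave a reproducible record of the rule applied. The column names and values below are invented for illustration.

```python
import csv

# A minimal sketch of a scripted cleaning step: drop trials with a
# missing force value. "trial" and "force_N" are hypothetical columns.
rows = [
    {"trial": "1", "force_N": "12.3"},
    {"trial": "2", "force_N": ""},    # missing value, to be excluded
    {"trial": "3", "force_N": "15.1"},
]

# The cleaning rule is explicit and re-runnable, unlike manual deletion.
clean = [r for r in rows if r["force_N"] != ""]

with open("clean_trials.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["trial", "force_N"])
    writer.writeheader()
    writer.writerows(clean)
```

If the raw data change or a reviewer asks how rows were excluded, re-running the script answers the question; a sequence of spreadsheet clicks cannot.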
3) Version control programming scripts
When writing a programming script to clean or analyse data, small changes can have large consequences, so record versions of the script as it develops – this allows you to revert to a previous version if needed. Ideally, use an automated version control system (e.g. Git or Mercurial) to record script versions. Many free courses (online or face-to-face) teach these tools.
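The basic Git workflow is only a handful of commands. The sketch below records two versions of a (hypothetical) cleaning script in a throwaway folder; file names, commit messages and user details are illustrative, and Git must already be installed.

```shell
# A minimal sketch of version-controlling an analysis script with Git.
cd "$(mktemp -d)"                  # work in a throwaway folder for this demo
git init -q analysis && cd analysis
git config user.name "Jo Researcher"    # one-time setup (illustrative details)
git config user.email "jo@example.com"

echo 'print("clean data, version 1")' > clean_data.py
git add clean_data.py
git commit -q -m "Add first version of data cleaning script"

# Later, after editing the script, record the change:
echo 'print("clean data, version 2")' > clean_data.py
git commit -q -am "Refine cleaning steps"

git log --oneline                  # lists the recorded versions
```

Each commit is a snapshot you can return to, so an analysis can always be rolled back to the version that produced a given result.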
4) Provide public access to data, programming scripts, and results
Many journals are implementing measures to improve transparency in research. For example, the International Committee of Medical Journal Editors now requires data sharing statements for publication of clinical trials, and the Journal of Physiotherapy encourages publishing the code used to analyse the data. These measures are in keeping with the principles of open science, with the aim of improving transparency and reproducibility in research.
Sandve GK, Nekrutenko A, Taylor J, Hovig E (2013) Ten Simple Rules for Reproducible Computational Research. PLoS Comput Biol 9(10): e1003285. https://doi.org/10.1371/journal.pcbi.1003285