Great post out today on Simply Statistics. In it, the author critiques the claim that most research is false, finding that claims of a reproducibility crisis are probably overstated at this point, but concluding that the following steps are still necessary:
We need more statistical literacy
We need more computational literacy
We need to require that code be published
We need mechanisms of peer review that deal with code
We need a culture that doesn’t use reproducibility as a weapon
We need increased transparency in review and evaluation of papers
I’d agree with this. I find the points about culture particularly important. One of the sad things about the Course Signals situation was that the reaction showed how incapable the current system is of having a debate about *numbers*, with everyone immediately retreating to narrative (or to silence).
If you’re not willing to show your work, you’re not doing research. But if your response to a statistical or computational error is “Ha, ha, you’re wrong!” you’re not doing research either. Both tendencies are toxic, and they feed off one another in a culture that increasingly offers people no benefits for openness but exposes them to a lot of professional risk. The smart move for any researcher today — in climate science, education, or anything of social import — is to make it as difficult as possible to dig into the work, process, and numbers behind their results. (Don’t believe me? Just ask Michael Mann.)
If we believe educational research matters (and it does), we need to lose the combative attitude toward results that don’t support our views, and we need to open up the results that do to criticism. That requires a culture that takes joy in geeking out about the numbers before jumping to tried-and-true narratives; we seem to be getting further from that every day.