I thought the reproducibility crisis was limited to the social sciences and some areas of health/medicine; this is the first article I've seen that claims it is a general problem throughout all of academia.
The Wikipedia article on the replication crisis cites a Nature survey suggesting the issue is widespread across every discipline, including the hard sciences like physics and engineering: https://en.m.wikipedia.org/wiki/Replication_crisis#General
Oh, it's general. Apart from anything else, a majority of all scientific papers rely on some form of software, be it simple R statistical scripts or complex programs. None of those programs ever have bugs in them, right?
Then, to be fair to the authors, scientific papers follow a fairly fixed format, often with hard limits on paper length (cf. Nature, etc.). Fitting in all the detail necessary for replication, plus a decent literature review and an overview, is simply not possible.
And then there's the meta-problem, which is general: every aspect of science, from hiring to grant writing, to managing a PhD farm, to goofing off on ycombinator..., essentially works against anybody trying to do detailed, methodical, provable work, no matter how brilliant they are, because doing all aspects of science properly is incredibly time consuming in a world with an ever-shrinking attention span.
It is not a completely general problem. In the field I work in (computational physics) we very often reproduce results published by other research groups. It's very common in our field to see plots in a new paper overlaid with data from old papers showing where our results agree (where they disagree is not a reproducibility issue, but is instead due to our ability to do more accurate calculations).
> None of those programs ever have bugs in them, right?
On that note: I am currently enrolled in and working through the Udacity Self-Driving Car Engineer nanodegree.
We recently had a lesson with a coding quiz in which the writeup stated that our output might vary between one of two values. As far as I could see, the output for the code involved should have been consistent, not fluctuating. But there was this one small bug (?)...
IIRC, it was something in Numpy or Scikit-Learn. Now, that was for one small part of this class, and nothing critical - but imagine it was for something important...
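For what it's worth, here's a minimal sketch of the general failure mode (not the actual quiz code, which I don't have): any result that depends on an unseeded random number generator will drift between runs, while pinning the seed makes it reproducible. NumPy's `default_rng` is one way to do the pinning:

```python
import numpy as np

def noisy_estimate():
    # Unseeded: each run draws from a fresh RNG state, so the
    # "result" of the script changes every time it is executed.
    data = np.random.normal(loc=0.0, scale=1.0, size=100)
    return data.mean()

def reproducible_estimate(seed=42):
    # Seeded: fixing the RNG state makes the computation give the
    # exact same answer on every run, on any machine with the same
    # NumPy version.
    rng = np.random.default_rng(seed)
    data = rng.normal(loc=0.0, scale=1.0, size=100)
    return data.mean()

# Two seeded runs agree bit-for-bit; two unseeded runs almost surely don't.
assert reproducible_estimate() == reproducible_estimate()
```

Many scikit-learn estimators expose the same knob as a `random_state` parameter; leaving it unset is a common way papers end up with figures nobody can regenerate.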