It’s one of the biggest problems in science—and computers are part of the problem:
Nature Computational Science will champion the reproducibility of scientific outcomes, ensuring that articles meet the highest standards of reproducibility and transparency in reporting. – Elizabeth Hawkins, “A Dedicated Home for Computational Science” at Nature (April 21, 2020)
News, “From Nature: A new, topflight computer science journal” at Mind Matters News
Computers are part of the problem because, as Gary Smith explains,
Computer algorithms are terrible at identifying logical theories and selecting appropriate data to test these theories, but they are really, really good at rummaging through data for statistically significant relationships. The problem is that discovered patterns are usually coincidental. They vanish when tested with fresh data—a disappearing act which contributes to the replication crisis that is undermining the credibility of scientific research. A 2015 survey by Nature, one of the very best scientific journals, found that more than 70 percent of the researchers surveyed reported that they had tried and failed to reproduce another scientist’s experiment and more than half had tried and failed to reproduce some of their own studies!
GARY SMITH, “Computers Excel at Finding Temporary Patterns” at Mind Matters News
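Smith’s point is easy to demonstrate for yourself. Here is a minimal Python sketch (purely illustrative; the sample sizes, the 1,000 noise variables, and the 0.05 threshold are arbitrary assumptions, not anything from Smith’s work) that “discovers” statistically significant correlations in pure noise and then watches them vanish on fresh data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 1,000 candidate "predictor" variables, all pure noise,
# plus an outcome variable that is also pure noise.
n_samples, n_vars = 100, 1000
X = rng.normal(size=(n_samples, n_vars))
y = rng.normal(size=n_samples)

# "Rummage through the data": keep every predictor whose
# correlation with the outcome is significant at p < 0.05.
discoveries = [
    j for j in range(n_vars)
    if stats.pearsonr(X[:, j], y)[1] < 0.05
]
print(f"'Significant' patterns found in noise: {len(discoveries)}")
# By chance alone, roughly 5% (about 50) will pass the test.

# Now test those same "discoveries" against fresh data.
X_new = rng.normal(size=(n_samples, n_vars))
y_new = rng.normal(size=n_samples)
replicated = [
    j for j in discoveries
    if stats.pearsonr(X_new[:, j], y_new)[1] < 0.05
]
print(f"Patterns that survive replication: {len(replicated)}")
```

On a typical run, dozens of “significant” patterns turn up in the noise, and nearly all of them fail to replicate: exactly the disappearing act Smith describes.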
Then, of course, there’s also Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure.
See also: Why it’s so hard to reform peer review. Robert J. Marks: Reformers are battling numerical laws that govern how incentives work. Know your enemy!