Popper is dead. He has been dead since 1994, to be precise. But his philosophy, that a scientific idea needs to be falsifiable, is also dead.
And luckily so, because it was utterly impractical. In practice, scientists can't falsify theories, because any theory can be amended in hindsight so that it fits new data. Don't roll your eyes: updating your knowledge in response to new information is an entirely sound scientific procedure.
But she qualifies:
Even in his worst moments Popper never said a theory is scientific just because it’s falsifiable. That’s Popper upside-down and clearly nonsense. Unfortunately, upside-down Popper now drives theory-development, both in cosmology and in high energy physics.
It’s not hard to come up with theories that are falsifiable but not scientific. By scientific I mean the theory has a reasonable chance of accurately describing nature. (Strictly speaking it’s not an either/or criterion until one quantifies “reasonable chance” but it will suffice for the present purpose.)
She offers an example:
You’d think that scientists know better. But two years ago I sat in a talk by Professor Lisa Randall who spoke about how dark matter killed the dinosaurs. Srsly. This was when I realized the very same mistake befalls professional particle physicists. More.
Hossenfelder takes issue with Randall’s view that her thesis is scientific because it can be tested.
Yes, we remember Randall, the dark matter and the dinosaurs. Whatta film awaits…
Anyway, our physics color commentator Rob Sheldon offers to interpret:
If you delete the title, the remainder of the article is certainly both timely and apt. Sabine Hossenfelder has identified a very, very serious problem in physics, one that is costing taxpayers billions of dollars in unnecessary experiments, and she needed a catchy title. In the middle of the article she exonerates Popper, but not his followers.
What constitutes a good particle physics theory? Just because it uses empirical support doesn’t make it true.
She ties her concerns about physics to the larger concern about the psychology and pharmacology fields, whose results are largely unreproducible. As she says quite eloquently, you can't use the same data set to construct a theory and also to confirm it. But what she doesn't say is that this is exactly what theoretical physics is doing!
That is, they have a summary of all known particles and fields (I even downloaded it once). This data set is then inverted into a recipe that cranks out all those particles given a set of adjustable variables. The recipe has no more information in it than the original data, but at least it has no less. Theorists can then add terms in specific locations that don't disturb the existing particles and call the result a new theory. But as Sabine says, it isn't a new theory; it isn't even a new hat, just a gaudy hatpin. And there is a practically infinite number of such hatpins, all vying for federal research dollars. To date, about $1bn has been spent on ground-based dark matter hatpins, another $1bn on future space-based DM hatpins, and that isn't even counting the $10bn for CERN upgrades. As Sabine herself notes, this fetish for hatpins is very much like adding epicycles to Ptolemy's model. We can get as close to any answer we desire with enough epicycles, but it isn't a very satisfying solution.
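In statistical terms, Sheldon's "hatpin" complaint is the familiar problem of overfitting: a model with enough adjustable parameters can match the data used to build it almost perfectly while gaining no predictive power. A minimal toy sketch of that point (my illustration, not from the article) in Python:

```python
# Toy illustration of the "epicycles" point: with enough adjustable
# parameters, a model threads every data point it was built from,
# yet predicts fresh data from the same underlying law poorly.
import numpy as np

rng = np.random.default_rng(0)

# "Known" observations: a simple underlying law plus noise.
x_seen = np.linspace(0, 1, 10)
y_seen = np.sin(2 * np.pi * x_seen) + rng.normal(0, 0.1, x_seen.size)

# New observations from the same law, not used in the fit.
x_new = np.linspace(0.05, 0.95, 10)
y_new = np.sin(2 * np.pi * x_new) + rng.normal(0, 0.1, x_new.size)

# An "epicycle" model: a degree-9 polynomial has 10 free parameters,
# exactly enough to pass through all 10 seen points.
coeffs = np.polyfit(x_seen, y_seen, deg=9)

err_seen = np.max(np.abs(np.polyval(coeffs, x_seen) - y_seen))
err_new = np.max(np.abs(np.polyval(coeffs, x_new) - y_new))

print(f"error on the data the model was built from: {err_seen:.2e}")
print(f"error on new data from the same law:        {err_new:.2e}")
```

The fit error on the construction data is essentially zero, while the error on new points is orders of magnitude larger, which is why confirming a theory requires data it was not fitted to.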
She doesn’t say it explicitly, but theoretical physicists need to stop trying to add terms to the Standard Model to account for dark matter. Ditto for experimentalists. We have had five or six liquid-xenon experiments looking for dark matter; do we really need a seventh? And the justification often attributed to Edison, that “we now know 9900 materials that don’t work for light bulbs,” won’t work for particle physics: unlike filaments, there is an infinite number of candidate theories, so eliminating one more does not diminish our ignorance one iota!
I’d like to broaden her concern to include biology, linguistics, and chemistry: we are using a bad method for constructing new theories. Why should we accept the existing recipes as valid? Why are we so willing to accept the existing data sets as error-free? The transition from Ptolemy to Copernicus required a completely new approach that was initially less accurate than the original. Kepler then went further, undermining the metaphysical perfection of the circle by replacing circular orbits with ellipses. (“Right, Kepler, put one of those on your cart and drive it! And if you don’t like it, why should God?”) The point made by Thomas Kuhn (and many others) was that metaphysics drives our theorizing about theories. We have to be willing to trash the whole system, theology and all, if we want to find something better. Adding one more item to the recipe is not even good cooking, much less good metaphysics. Kuhn likened it to sweeping the floors of a metaphysical skyscraper when what we need is a new building.
And that is precisely what Sabine doesn’t know how to do. But that is what ID does. Surely if we can help biologists get past Darwin, we can help Sabine get past the Standard Model.
Our ID 800 number is sure to be online somewhere. 😉 No bots.
See also: Karl Popper on “adaptive” as a tautology
Dark matter killed off the dinosaurs
Question for multiverse theorists: To what can science appeal, if not evidence?