Uncommon Descent Serving The Intelligent Design Community

New find might “upend the Standard Model” in physics? Really? Rob Sheldon has the story


Yesterday, readers may have seen headlines like “Particle’s surprise mass threatens to upend the standard model” (Nature) and “Particle physics could be rewritten after shock W boson measurement” (New Scientist). We asked our physics color commentator Rob Sheldon what was really going on with the Standard Model, and here is his take:

As usual, the journo article is breathless: “if it holds true, it could lead to entirely new theories of physics.” But the actual results are pretty boring. The Standard Model (SM) predicts 80.357 +/- 0.008 GeV for the mass of the W-boson; the experimental result is 80.433 +/- 0.009 GeV. Yes, I reported that correctly: a 0.1% difference. Now, there’s no shortage of theories to replace the SM; they’ve been coming out at 1000/yr since 1980, when the SM was formulated. But 42 years is a mighty long run for a theory, and three generations of particle physicists have been champing at the bit to provide the next updated theory. They invested some 30 years of that time on a failed program called “Supersymmetry” (SUSY for short) and are grasping at straws.
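The numbers quoted above can be checked directly. A minimal sketch: the fractional shift really is about 0.1%, but because both quoted uncertainties are tiny, the gap is several standard deviations when the errors are combined in quadrature (the usual simplified way to state a discrepancy), which is what drove the headlines.

```python
import math

# Values quoted in the text (GeV)
sm_mass, sm_err = 80.357, 0.008    # Standard Model prediction
exp_mass, exp_err = 80.433, 0.009  # new measurement

diff = exp_mass - sm_mass
percent = 100 * diff / sm_mass
# combine the two uncertainties in quadrature
sigma = diff / math.hypot(sm_err, exp_err)

print(f"difference: {diff:.3f} GeV ({percent:.2f}%), about {sigma:.1f} sigma")
# prints: difference: 0.076 GeV (0.09%), about 6.3 sigma
```

So the same 0.076 GeV gap is simultaneously “a 0.1% difference” and a multi-sigma discrepancy, which is why one side calls it boring and the other calls it shocking.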

One of those straws is “dark matter”, which wasn’t even on their radar until President Clinton cancelled the Superconducting Super Collider in Texas, the Tevatron in Illinois closed, and 1,000 particle physicists went unemployed. At that point, they moved to astrophysics with the dubious claim that the “dark matter” of galaxies wasn’t black comets or black holes, but exotic particles not predicted by the Standard Model. This required particle physicists to build detectors and put them in abandoned gold mines and the like to search for these elusive particles. That kept them employed for 20 years, but the effort never paid off, and now most of those experiments are winding down. Hence the panic to find something wrong with the SM that requires a new experiment. As for gravity, it isn’t so much a problem with the SM per se as with Quantum Mechanics, of which the SM is a subset. And the anti-matter problem really is a problem for the “evolution” of the universe, not the SM, because something happened in the past that cannot be duplicated in our atom-smashers today. The SM was built to explain the data from atom-smashers, and while it is integrated into the Big Bang model, it has practically no say in the initial conditions and time-evolution of the universe. Anti-matter is a problem for the cosmologists, not the particle physicists.

But note the circularity. First they defined Dark Matter as something not in the SM, and then used that definition to prove there is something wrong with the SM. In actuality, Dark Matter might be exactly what Fritz Zwicky said it was in 1937: matter that is not emitting light and cannot be seen with telescopes. I’m sure that if Zwicky were alive today, he would scoff at the idea that it is an exotic particle, when gravel-sized dust, comets, small asteroids, and even small black holes can all explain the astronomical data with no change to particle physics.

So what is the significance of this W-boson measurement?

It is interesting, in the sense that large discrepancies from theory drive the field to take ever better measurements, but it is not unusual or even earth-shattering. A figure in the scientific paper shows nine measurements of the W-boson mass; three are higher than this one and five are lower. What is different is the error bars assigned to this measurement, which are much narrower than everyone else’s. So this measurement falls in the middle of the pack but claims to be highly accurate. This will undoubtedly drive the field to fund ten more measurements and see if they really are as accurate as they say. Again, if experience is a guide, they have probably underestimated their error bars, but there is no theory waiting in the wings for this data. And in fact, the theory is not in much better shape than the data: having only a perturbative “effective mean field” model, theorists sum up series expansions to hundreds and thousands of terms to get their accuracy. They may still get their theory to agree with experiment if they do another 10,000 terms in their sum, using up three more years of expensive supercomputer time. That would be the most likely outcome of this experiment, assuming, of course, that there are no errors in the analysis.
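The point about narrow error bars can be made concrete. When measurements are combined into a “world average”, the standard approach is inverse-variance weighting, so a single result with error bars a few times smaller than the others dominates the combination even if its central value sits mid-pack. A minimal sketch with illustrative numbers (not the actual published values):

```python
import math

# Hypothetical (mass GeV, uncertainty GeV) pairs; the last entry mimics a
# mid-pack central value with a far narrower error bar than the others.
measurements = [(80.370, 0.019), (80.440, 0.030), (80.376, 0.033), (80.433, 0.009)]

# inverse-variance weights: w_i = 1 / sigma_i^2
weights = [1.0 / err**2 for _, err in measurements]
mean = sum(w * m for (m, _), w in zip(measurements, weights)) / sum(weights)
err = 1.0 / math.sqrt(sum(weights))

print(f"combined: {mean:.3f} +/- {err:.3f} GeV")
print(f"weight share of the narrow-error entry: {weights[-1] / sum(weights):.0%}")
```

With these made-up inputs the narrow-error entry carries roughly 70% of the total weight, which is why a single measurement claiming small error bars can pull the average, and why checking whether those error bars are honest matters so much.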

But what this paper and the journo piece reveal is the desperation felt in the particle physics community. They so desperately need the Standard Model to fail.

The paper is open access.

Note: Rob Sheldon is the author of Genesis: The Long Ascent and The Long Ascent, Volume II.

Shrug, we wait and see as usual. kairosfocus
