Cosmology, Fine-tuning, Intelligent Design, Physics

Are the “redundant” particles of the universe evidence of fine-tuning?

The Big Bang timeline — a world with a beginning

McMaster University’s Physics and Astronomy faculty consider the Standard Model of our cosmos to be on the verge of failure. Here is one item they point to:

New particles — muons, pions and a horde of other new particles — continued to be discovered, initially through studies of the cosmic rays that continuously bombard our atmosphere from space. This temporarily led to a much more complicated picture, whose underlying simplicity did not emerge until the 1960s when many of the particles known by that point were themselves found to be made up of still smaller constituents. What then emerged as elementary particles remain so now: six species of ‘quarks’ (up, down, strange, charm, bottom and top), and six species of ‘leptons’ (electron, muon, tau and three species of neutrinos).

The resulting list of particles is strangely redundant. Essentially all of everyday matter is made only of electrons and up and down quarks (of which the last two make up the proton and neutron), which, together with a neutrino, make up what is called the ‘first generation’ of elementary particles. Remarkably Nature seems to come to us with two more ‘generations’ of particles, whose properties directly copy this first generation (i.e. the charm and top quarks resemble the up quark; the strange and bottom quarks resemble the down quark; the muon and tau are copies of the electron; and so on).

Department Of Physics & Astronomy, “Particle Physics At The Crossroads” at McMaster University

Our physics color commentator, experimental physicist Rob Sheldon, who is also the author of Genesis: The Long Ascent, offers some thoughts on why this redundancy is evidence of fine-tuning of the universe:


I looked over the McMaster website, and it is a pretty good summary of the state of the art in particle physics. I'm not sure why they still list the Higgs boson as unconfirmed, but the Standard Model is portrayed well.

The three columns in the Fermion section of the table are what the quoted paragraph discusses. The up and down quarks, the electron, and the electron neutrino make up some 99% of the matter in the universe. So why do we have the charm and strange quarks and the muon and muon neutrino (column 2), or the top and bottom quarks and the tau and tau neutrino (column 3)?
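For concreteness, the generation structure under discussion can be written out explicitly. A minimal sketch in Python, using rounded, approximate PDG masses; the exact values are illustrative only, and neutrino masses, being tiny and not precisely known, are left as None:

```python
# Three generations of Standard Model fermions, with approximate
# masses in MeV/c^2 (rounded PDG values; illustrative only).
GENERATIONS = [
    {"up-type quark": ("up", 2.2), "down-type quark": ("down", 4.7),
     "charged lepton": ("electron", 0.511), "neutrino": ("electron neutrino", None)},
    {"up-type quark": ("charm", 1_270), "down-type quark": ("strange", 95),
     "charged lepton": ("muon", 105.7), "neutrino": ("muon neutrino", None)},
    {"up-type quark": ("top", 173_000), "down-type quark": ("bottom", 4_180),
     "charged lepton": ("tau", 1_776.9), "neutrino": ("tau neutrino", None)},
]

# Each later generation copies the charges and quantum numbers of the
# first; only the masses differ.
for i, gen in enumerate(GENERATIONS, start=1):
    names = ", ".join(name for name, _ in gen.values())
    print(f"Generation {i}: {names}")
```

Everyday matter uses only the first entry of this list, which is exactly the "strangely redundant" structure the McMaster summary points to.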

By analogy to the appendix and the tonsils, or junk DNA, they don't seem to be doing very much, so what is the significance of families 2 and 3? Why not families 4 and 5? Couldn't creation have been easier on God if he had just left out this redundant stuff?


As George puts it, everything comes down to the theological argument “if I were God, I wouldn’t have done it this way.” The counter-argument is very simple, “just because we don’t know what it is doing doesn’t mean it is junk,” or more trenchantly, “it’s a good thing you aren’t God.”

As it turns out, the Big Bang Nucleosynthesis (BBN) model makes use of every bit of particle physics we know. The ratio of protons to neutrons in the very early universe determines the amount of each element that is created. That ratio depends on the strong interaction (quarks), the weak interaction (leptons), the temperature, the number of families (3), and all the different masses of each elementary subatomic particle. While you don’t hear this mentioned as often as the fine-tuning of the mass/expansion-rate ratio (which has to be accurate to 1:10^60), the number of fermion families is likewise fine-tuned. If columns 2 & 3 didn’t exist, the ratios in the BBN would be off, and the universe would look very different.
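The proton-to-neutron dependence can be illustrated with the textbook back-of-the-envelope estimate. A minimal sketch, assuming the standard round numbers (a weak-interaction freeze-out temperature of about 0.8 MeV and a post-decay n/p ratio of about 1/7) rather than a full BBN computation:

```python
import math

DELTA_M_MEV = 1.293   # neutron-proton mass difference, MeV
T_FREEZE_MEV = 0.8    # assumed weak-interaction freeze-out temperature, MeV

# Equilibrium neutron-to-proton ratio at freeze-out: n/p = exp(-dm / kT)
ratio_freeze = math.exp(-DELTA_M_MEV / T_FREEZE_MEV)

# Free-neutron decay before nucleosynthesis begins lowers this to roughly 1/7
ratio_bbn = 1.0 / 7.0

# Nearly all surviving neutrons are locked into helium-4, giving a
# primordial helium mass fraction Y_p = 2(n/p) / (1 + n/p)
helium_fraction = 2 * ratio_bbn / (1 + ratio_bbn)

print(f"n/p at freeze-out ~ {ratio_freeze:.2f}")            # ~0.20
print(f"helium mass fraction Y_p ~ {helium_fraction:.2f}")  # ~0.25
```

Shift either the quark-sector mass difference or the lepton-sector freeze-out temperature by a modest factor and Y_p swings far from the observed ~25%, which is the sense in which the particle content enters the fine-tuning.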

Given that particle physicists haven't been able to explain dark matter or dark energy, they really haven't devoted themselves to the fine-tuning of the three families of quarks and leptons. But it is my view (one I hope to see realized some day) that in the GeV era of the ultra-hot Big Bang, the neutrinos carry a charge or a current, which converts this era into a plasma universe. It is this plasma that produces a huge magnetic field, which decayed away over the first 3 billion years after the Big Bang and whose remnants are seen today as quasars, AGNs, and jets. The number of current-carrying species (families of quarks and leptons) determines how complex the plasma can be, and that complexity stores information. The huge gap in information between a butter-smooth hot Big Bang and the cold, galaxy- and planet-rich universe we see today is therefore explained by the information content of this GeV plasma.

So despite McMaster U. finding this odd, and believing in (hoping for?) a failure of the Standard Model, I see it as a necessary means of storing information in the hot Big Bang, and a demonstration of the ultimate fine-tuning of the cosmos.

See also: What becomes of science when the evidence does not matter?


2 Replies to “Are the “redundant” particles of the universe evidence of fine-tuning?”

  1. PaV says:

    Rob:

    I think the “not yet confirmed” status of the Higgs is simply due to an older image they copied to the article/webpage.

    I have an idea, a rather simple one, that can explain dark energy, dark matter, the three generations of electrons, and a whole lot more. But unless it's all put in stark mathematical language, there's not a physicist alive who wants to bother thinking it through.

    Here’s the simple idea: space is pixelated; the pixels emit tiny amounts of energy at regular intervals.* Quarks are geons (Wheeler); that is, tiny pockets of energy that have curved local space enough to form an asymptotically flat boundary (on their scale). Mass comes about because a spherical volume is bounded by quarks swirling in circular orbits, with quarks rotating in one direction and anti-quarks circling in the opposite direction. Two such pairs carve out a spherical volume (S2). Closest to the center of either a neutron or proton are the top and bottom quarks; at further radial distance out are the charm and strange, with the up and down at the periphery. The energy/mass associated with these different quarks depends on the amount of energy they constrain to their individual spherical volume.

    There is no fractional charge of the quarks. Instead, a neutron decays (as the energy scale of the universe decreases) into a proton and an electron. All ‘massive’ objects are a function of the strong force, with the electron being a broken quark/anti-quark pairing, and the anti-neutrino being the result of a transfer of strong force away from the down (or, perhaps, top) anti-quark, with the net result of a color charge being broken down into what we know as electrical charge.

    If the energies are sufficiently high, a charm or bottom can be knocked out. These quarks being more massive results in “electrons” that are more massive: three concentric ‘rings,’ and three different electron and neutrino masses.

    This means that the electron and neutrinos are ‘formed’ via the strong force even though they are leptons.

    The only troubling part of this is that I see bosons as simply being the composite of two chiral ‘currents,’ each being a fermion. Thus the neutron should be expected to behave as a boson, while tests show it is a fermion. However, these tests are statistical, and perhaps this is hiding something.

    Color charges connect the concentric spherical volumes, acting as forces connecting the various quarks/anti-quarks to one another. The ‘pixels,’ below the Planck scale, are coupled to one another via the gravitational force. Gravity occurs because of the potential created by the differential in energy density across the boundaries of the quark spherical volumes, across the nucleus, and finally across the atom. The energy of the EM field ‘flows’ from the protons and electrons (broken quarks), with the amount of energy NOT flowing in this way being the ultimate source of the gravitational force, thus uniting QM with GR.

    Dark Energy is simply the cosmological constant: the emitted pixel energy (infinitesimal amounts at infinitesimal intervals of time), which accumulates where “mass” does not obstruct its ‘flow.’ Dark Matter is the confluence of the Dark Energy of interstellar space and the constrained flow of pixel energy caused by stellar masses.

    When the energy density of matter (protons/neutrons, basically quarks) reaches a certain level, energy is almost completely constrained from flowing away, and this matter ‘collapses’ to form a black hole. The gravitational pull between the pixels then comes into equilibrium with the outward ‘force’ of this pixelated energy, which has to be at least 10^19 GeV to overcome it. Thus black holes are in equilibrium, and the pixelated energy eventually pushes itself out at an extremely low rate (Hawking radiation). Since this pixelated energy is the source of what we know as entropy, the energy content of any black hole must be related to its entropy content (Unruh).

    * These ‘pixels’ are gravitons composed of two connected bi-linear “gluon” fluxes. This ‘bi-linearity’ results in no CP violation within quark volumes.

    I could go on and on. There’s entanglement (and the delayed-choice experiment), which can be thought of along the lines I’ve laid out. There’s the dominance of matter. There’s spontaneous symmetry breaking. And more. All of these are amenable to an analysis based on the basics I’ve laid out.

    Einstein came up with ideas, and then found the mathematics to describe them. I leave all of this to those who can work out the maths.
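For reference, the standard Hawking-temperature and Bekenstein-Hawking-entropy relations that the black-hole remarks above allude to can be evaluated directly. This is textbook physics, not part of the pixel model sketched in the comment; the solar-mass example is purely illustrative:

```python
import math

# Physical constants (SI units, rounded CODATA values)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.0546e-34  # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
K_B = 1.381e-23    # Boltzmann constant, J/K

M_SUN = 1.989e30   # solar mass, kg

def hawking_temperature(m):
    """Hawking temperature T = hbar c^3 / (8 pi G M k_B)."""
    return HBAR * C**3 / (8 * math.pi * G * m * K_B)

def bekenstein_hawking_entropy(m):
    """Entropy S = 4 pi G M^2 k_B / (hbar c), i.e. k_B A / (4 l_p^2)."""
    return 4 * math.pi * G * m**2 * K_B / (HBAR * C)

print(f"T_H(1 M_sun) ~ {hawking_temperature(M_SUN):.2e} K")    # ~6e-8 K
print(f"S(1 M_sun)   ~ {bekenstein_hawking_entropy(M_SUN):.2e} J/K")
```

Note how the temperature falls and the entropy grows with mass: any proposal tying black-hole energy to entropy has to reproduce these scalings.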

  2. vmahuna says:

    Yeah, I’m gonna go with “don’t second guess God” on this. At least until we get copies of reports on the pre-production tests.
    For as the Bard teaches us: There are more things in heaven and earth, Horatio,
    Than are dreamt of in your philosophy.
