Readers will recall that Sal Cordova doesn’t agree with Granville Sewell’s doubts about Darwin based on the Second Law of Thermodynamics. His argument is here.
Sewell has replied here.
Now, Rob Sheldon weighs in:
As a physicist, I have a few more problems with the 2nd law, which, as Sal points out, is really a problem of definitions. I agree with Sal that few people understand either thermodynamics or entropy, and as a consequence they make a hash of both the pro and con arguments. For the sake of expediency, I think Sal is suggesting that it is better to avoid this topic because it invariably ends up in the weeds.
On the other side, Granville thinks that the weeds are still an interesting place to be. Surely if we avoided all difficult topics, we’d never make progress in anything. We can make progress if we are not afraid to plough the untilled turf, and so Granville’s work is both original and interesting (to quote Eugene Wigner).
Here’s the definitional challenge of the two meanings of 2nd Law:
1) Thermodynamics: deals with macroscopic states of things like hot bricks and boiling water. This was all worked out in the early 1800s, and is essential for the operation of everything from steam engines to nuclear power plants. Entropy was defined as a macroscopic state variable, expressed in terms of temperature and heat.
No chance of this being wrong, because if it were ever possible to beat the system, you’d have a perpetual motion machine and besides being filthy rich, you could take over the world.
This is where the phrase “2nd law” applies like an iron rule. Unfortunately, cells are not steam engines, and the origin of the first cell is not a problem related to steam engines, so Sal is pleading for prudence in using the 2nd law in this fashion.
2) Statistical Mechanics: Boltzmann and Gibbs redefined macroscopic states of matter into microscopic states of matter, where the entropy of a system is now based on counting microscopic states. As Sal has pointed out, the entropy is now defined in terms of how many different ways there are to arrange, say, nitrogen and carbon dioxide molecules. No “heat” is involved, merely statistical combinatorics. This ostensibly has little to do with heat and dynamics, so it is called “statistical mechanics”.
3) The Equivalence: Boltzmann’s famous equation S = k ln W (engraved on his tombstone) is merely an exchange-rate conversion. If W is lira and S is dollars, then k ln() is the conversion of the one to the other, which is empirically determined. Boltzmann’s constant “k” is a semi-empirical conversion number that made Gibbs’ “stat mech” definition agree with the earlier “thermo” definition of Lord Kelvin and co.
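As a toy illustration of this exchange rate (my own sketch, not part of the original exchange), here is the conversion applied to a system small enough to actually count:

```python
import math

# Boltzmann's constant in joules per kelvin: the "exchange rate"
# between a microstate count W and a thermodynamic entropy S.
k_B = 1.380649e-23

def boltzmann_entropy(W):
    """S = k ln W for a system with W equally likely microstates."""
    return k_B * math.log(W)

# Toy system: 4 coins, each heads or tails, so W = 2**4 = 16 microstates.
print(boltzmann_entropy(2 ** 4))  # ~3.83e-23 J/K
```

The answer comes out absurdly small because k is absurdly small; it takes Avogadro-scale numbers of microstates before S reaches the joules-per-kelvin magnitudes that steam-engine thermodynamics deals in.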
Despite this being something as simple as a conversion factor, you must realize how important it was to connect these two. When Einstein connected mass to energy with E = mc², we could suddenly talk about mass-energy conservation, atom bombs, and baby universes, whereas before Einstein mass and energy were totally different quantities. Likewise, by connecting thermodynamics and statistical mechanics, the hard rules derived from thermo can now be applied to the statistics of counting permutations.
This is where Granville derives the potency of his argument: a living organism certainly shows unusual permutations of its atoms, and thus has a stat mech entropy that, via Boltzmann, must obey the 2nd law. If life would otherwise violate that law, then it must not be lawfully possible for evolution to happen (without an input of work or information).
The one remaining problem is how to calculate it.
4) The Entropy problem
Boltzmann was working with ideal gases, and most of the entropy illustrations in physics books deal with either ideal gases or the energy states of an atom. Nobody but nobody wants to tackle the entropic calculation of a cell.
The problem isn’t just that the number of arrangements of the ~10^14 atoms in a cell is at least (10^14)!, that the arrangements show long-range ordering, and that they demonstrate constrained dynamics; frankly, we don’t know how to do the arithmetic.
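To get a feel for the scale, Stirling’s approximation, ln N! ≈ N ln N − N, gives the size of a factorial without computing it (a back-of-envelope sketch of my own, using the 10^14 atom count from above):

```python
import math

# Stirling's approximation: ln(N!) ~ N ln N - N.
# How many decimal digits does (10**14)! have?
N = 10 ** 14
ln_factorial = N * math.log(N) - N        # natural log of N!
digits = ln_factorial / math.log(10)      # convert to base-10 digit count
print(digits)  # ~1.36e15: the arrangement count itself has over a
               # thousand trillion digits
```

So the number of arrangements isn’t just large; merely writing it down would take more digits than there are atoms in the cell.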
The entropy merely has to show an increase, but there are so many orders of magnitude in this calculation that we can’t tell whether the entropy increased or decreased. In numerical computation, if you subtract two nearly equal large numbers, most of the significant digits vanish and you can be left with noise. In this case the numbers are so big that subtracting the “after” entropy from the “before” entropy gives nothing but noise.
It is impossible to calculate the Boltzmann entropy change of a cell unless we could write the number down to 100 trillion digits of precision and keep track of the last 100 or so of them.
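The cancellation problem is easy to demonstrate. Double-precision floats carry about 16 significant digits, so a true change sitting beyond that precision simply vanishes (a minimal sketch with made-up numbers, not a real entropy calculation):

```python
from decimal import Decimal, getcontext

# Double precision keeps ~16 significant digits, so a change of 1
# sitting 20 digits down is invisible:
before = 1.0e20
after = 1.0e20 + 1.0      # the true change is +1.0
print(after - before)     # 0.0 -- the change is lost in round-off

# Carry enough working digits, and the difference survives:
getcontext().prec = 30
print((Decimal("1e20") + Decimal(1)) - Decimal("1e20"))  # 1
```

For a cell, the “before” and “after” entropies differ somewhere around the 100-trillionth digit, so no conceivable arithmetic of this kind could resolve the change.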
My (Nobel-nominated) college professor used to ask a rhetorical question in his thermo class: “What is the entropy change of a cow after the butcher puts a .22-calibre bullet through its brain?” Yet this minuscule entropy change is supposed to tell us the difference between a “living” and a “dead” cow. From a physics viewpoint there is almost no change in disorder, yet from a biological viewpoint it is all the difference in the world. Physics just doesn’t know how to measure this, and doesn’t even know if it ever can measure this quantity.
So that’s the weeds. We have a recipe, but we can’t use it. Despite having this great definition for Boltzmann entropy, we don’t know how to apply it to life, and therefore we can’t tell if entropy is up or down until the critter rots and starts to turn into gas.
That means some other definition of entropy is employed that “approximates” the Boltzmann definition. Shannon information is often employed, as well as some thought-experiments involving black holes. These work fine on the obvious examples, but once again they fail to detect the difference between a live and a dead cow. So when Granville shows a picture of a tornado, we all know intuitively that the Boltzmann entropy is increasing, but there’s just no easy way to calculate it. It reminds me of the 30 years of lawsuits during which the tobacco companies said that no one had proven smoking causes cancer: we all knew it was true, but we didn’t have the proof.
Despite this intuitive use of the 2nd law not being mathematically robust, we can still learn a lot by using it. But if our opponents challenge us to prove it, we must be willing to go into the weeds.
Here are two weeds:
Response 1) The entropy of life balances out. Food in, waste out, entropy up.
Answer: Really? Can you show me your calculation? Your proof would be Nobel prize material!
Response 2) The earth is not a closed system.
Answer: Then how about the solar system? No? Then the galaxy? No? Then surely the universe is a closed system! Where’s the missing entropy? Show me where it went, and give me a rough idea of its magnitude; one or two orders of magnitude is sufficient.