
A Designed Object’s Entropy Must Increase for Its Design Complexity to Increase – Part 2


For a biological system to gain more biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it, contrary to many intuitions among creationists and IDists. This essay is Part II of a series that began with Part 1.

The physicist Fred Hoyle famously said:

The chance that higher life forms might have emerged in this way is comparable to the chance that a tornado sweeping through a junkyard might assemble a Boeing 747 from the materials therein.

I agree with that assertion, but that conclusion can’t be formally derived from the 2nd law of thermodynamics (at least not from those forms of the 2nd law stated in many physics and engineering textbooks and used in the majority of scientific and engineering journals). The 2nd law is generally expressed in two forms:

2nd Law of Thermodynamics (THE CLAUSIUS POSTULATE)
No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

or equivalently

2nd Law of Thermodynamics (THE KELVIN PLANCK POSTULATE)

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work

In Part 1, I explored the Shannon entropy of 500 coins. If the coins are made of copper or some other metal, the thermodynamic entropy can be calculated. But let’s have a little fun: how about the thermodynamic entropy of a 747? [Credit Mike Elzinga for the original idea, but I’m adding my own twist.]

The first step is to determine approximately how much matter we are dealing with. From the manufacturer’s website:

A 747-400 consists of 147,000 pounds (66,150 kg) of high-strength aluminum.

747 Fun Facts

Next we find the standard molar entropy of aluminum (symbol Al). From Enthalpy Entropy and Gibbs we find that the standard molar entropy of aluminum at 25 Celsius and 1 atmosphere is 28.3 joules per kelvin per mole (J/(K·mol)).

Thus a 747’s thermodynamic entropy based on the aluminum alone is:

S ≈ (66,150,000 g ÷ 26.98 g/mol) × 28.3 J/(K·mol) ≈ 2.45 × 10^6 mol × 28.3 J/(K·mol) ≈ 6.94 × 10^7 J/K

Suppose now that a tornado runs into the 747 and tears off pieces of the wings, tail, and engines such that the weight of aluminum in what’s left of the 747 is now only 50,000 kg. Using the same sort of calculation, the entropy of the broken and disordered 747 is:

S ≈ (50,000,000 g ÷ 26.98 g/mol) × 28.3 J/(K·mol) ≈ 1.85 × 10^6 mol × 28.3 J/(K·mol) ≈ 5.24 × 10^7 J/K

Hence the tornado lowers the entropy of the 747 by disordering and removing vital parts!
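On the figures above, that is a drop of roughly 1.7 × 10^7 J/K, simply because there is less aluminum left to count.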

And even supposing we recovered all the missing parts such that we have the original weight of the 747, the entropy calculation has nothing to say about the functionality of the 747. Hence the 2nd law, which inspired the notion of thermodynamic entropy, has little to say about the design and evolution of the aircraft, and by way of extension it has little to say about the emergence of life on planet Earth.

Perhaps an even more pointed criticism in light of the above calculations is that increasing mass in general will increase entropy (all other things being equal). Thus, as a system becomes more complex, on average it will have more thermodynamic entropy. For example, a simple empty soda can weighing 14 grams has (using a similar calculation) a thermodynamic entropy of 14.68 J/K, which implies a complex 747 has 4.7 million times the thermodynamic entropy of a simple soda can. A complex biological organism like an albatross has more thermodynamic entropy than a handful of dirt. Worse, when the albatross dies, it loses body heat and mass, and hence its thermodynamic entropy goes down after it dies!
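
For readers who want to check the arithmetic, here is a minimal Python sketch of the calculation behind these figures; the only inputs are the aluminum masses quoted above, the molar mass of aluminum (roughly 26.98 g/mol), and the 28.3 J/(K·mol) standard molar entropy.

# Rough sanity check of the entropy figures above, using only the
# standard molar entropy of aluminum: S = (mass / molar_mass) * S_molar.

MOLAR_MASS_AL_G_PER_MOL = 26.98   # molar mass of aluminum, g/mol
S_MOLAR_AL_J_PER_K_MOL = 28.3     # standard molar entropy of aluminum at 25 C, 1 atm

def aluminum_entropy(mass_grams):
    """Thermodynamic entropy (J/K) of a mass of aluminum via its standard molar entropy."""
    moles = mass_grams / MOLAR_MASS_AL_G_PER_MOL
    return moles * S_MOLAR_AL_J_PER_K_MOL

intact_747 = aluminum_entropy(66_150_000)    # 66,150 kg of aluminum
damaged_747 = aluminum_entropy(50_000_000)   # 50,000 kg left after the tornado
soda_can = aluminum_entropy(14)              # 14 g empty can

print(f"Intact 747:  {intact_747:.3e} J/K")              # ~6.94e+07 J/K
print(f"Damaged 747: {damaged_747:.3e} J/K")             # ~5.24e+07 J/K
print(f"Soda can:    {soda_can:.2f} J/K")                # ~14.7 J/K
print(f"747-to-can ratio: {intact_747 / soda_can:.2e}")  # ~4.7e+06

Running it reproduces roughly 6.94 × 10^7 J/K for the intact 747, 5.24 × 10^7 J/K for the damaged one, about 14.7 J/K for the can, and the 4.7-million-to-one ratio.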

So the major point of Part II is that a designed object’s thermodynamic entropy often increases with the increasing complexity of the design, for the simple reason that it has more parts and hence more mass. And as was shown in Part 1, the Shannon entropy also tends to increase with the complexity of the design. Hence, at least two notions of entropy (Shannon and thermodynamic) can increase with increased complexity of a design (be it man-made design, evolution-made design, or ….)

This concludes the most important points I wanted to get across. Below is merely an exploration of some of the fundamentals of thermodynamics for readers interested in some of the technical details of thermodynamics and statistical mechanics. The next section can be skipped at the reader’s discretion since it is mostly an appendix to this essay.
========================================================================
THERMODYNAMICS AND STATISTICAL MECHANICS BASICS

Classical thermodynamics can trace some of its roots to the work of Carnot in 1824 during his quest to improve the efficiency of steam engines. In 1865 we have a paper by Clausius that describes his conception of entropy. I will adapt his formula here:

ΔS = Q/T

Where S is entropy, Q is heat, and T is temperature. Perhaps to make the formula more accessible, let us suppose we have a 1000-watt heater running for 100 seconds that contributes to the boiling of water (already at 373.2 K). What is the entropy contribution due to this burst of energy from the heater? First I calculate the amount of heat energy input into the water:

Q = 1000 W × 100 s = 100,000 J

Using Clausius’ formula, and the fact that the process is isothermal, I then calculate the change of entropy in the water as:

ΔS = Q/T = 100,000 J ÷ 373.2 K ≈ 268 J/K

So how does all this relate to Boltzmann and statistical mechanics? There was an intuition among scientists that thermodynamics could be related to classical (Newtonian) mechanics. They suspected that what we perceive as heat and temperature could be explained in terms of the mechanical behavior of large numbers of particles, specifically the statistical aspects of that behavior, hence the name of the discipline: statistical mechanics.

A system of particles in physical space can be described in terms of the positions and momenta of the particles. The state of the entire system of particles can be expressed as a location in a conceptual phase space. We can slice up this conceptual phase space into a finite number of chunks because of the Liouville theorem. These sliced-up chunks correspond to the microstates in which the system can be found, and furthermore the probability of the system being in a given microstate is the same for each microstate (equiprobable). Boltzmann made the daring claim that the logarithm of the number of microstates is related to the entropy Clausius defined for thermodynamics. The modern form of Boltzmann’s daring assertion is:

S = kB ln Ω

where Ω is the number of microstates of the system, S is the entropy, and kB is Boltzmann’s constant. Using Boltzmann’s formula we can then compute the change of entropy between an initial and a final state:

ΔS = kB ln(Ω_final) - kB ln(Ω_initial) = kB ln(Ω_final / Ω_initial)

As I pointed out, Boltzmann’s equation looks hauntingly similar to Shannon’s entropy formula for the special case where the microstates of a Shannon information system are equiprobable.
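To make the resemblance concrete: for a system with W equiprobable microstates, Shannon’s formula reduces to H = log2(W) bits, while Boltzmann gives S = kB ln(W). The two differ only by a constant conversion factor, S = (kB ln 2) × H, roughly 9.57 × 10^-24 J/K per bit.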

Around 1877 Boltzmann published his paper connecting thermodynamics to statistical mechanics. This was the major breakthrough that finally bridged the heretofore disparate fields of thermodynamics and classical mechanics.

Under certain conditions we can relate Clausius’s notion of entropy to Boltzmann’s notion of entropy, and thus the formerly disparate fields of thermodynamics and classical mechanics are bridged. Here is how I describe symbolically the special case where Clausius’s notion of entropy agrees with Boltzmann’s notion of entropy:

ΔS = Q/T = kB ln(Ω_final / Ω_initial)

[It should be noted, the above equality will not always hold.]

Mike Elzinga and I had some heated disagreement on the effect of spatial configuration on entropy. Perhaps to clarify, the colloquial notion of disordering things does not change the thermodynamic entropy (like taking a 747 and disordering its parts; as long as we have the same matter, it has the same thermodynamic entropy). But that’s not to say that changes in volume (which is a change in spatial configuration) won’t affect the entropy calculations. This can be seen in the formula for the entropy of an ideal monoatomic gas (the Sackur-Tetrode equation), which in one standard form reads:

S = N kB [ ln( (V/N) × (mE / (3πℏ²N))^(3/2) ) + 5/2 ]

where
S is the entropy
N is the number of atoms
m is the mass of a single gas atom
kB is Boltzmann’s constant
V is the volume
E is the internal energy
ℏ is the Dirac constant (reduced Planck’s constant)
From this we can see that increasing the volume the gas occupies, the energy of the gas, or the number of particles in the gas will increase the entropy. Of course this must happen within reasonable limits, since if the volume is too large there cannot be energy exchange among the particles, notions of what defines equilibrium begin to get fuzzy, etc.
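As a quick illustration, consider a standard textbook case: if an ideal gas expands to twice its volume while N and E are held fixed, only the V term in the formula changes, so ΔS = N kB ln(V_final/V_initial) = N kB ln 2, which for one mole of gas is about 5.76 J/K.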

Nowhere in this calculation are notions of “order” explicitly or implicitly identified, and hence such notions are inessential and possibly misleading to the understanding of entropy.

How the Sackur-Tetrode formula is derived is complicated, but if one wants to see how entropy can be calculated for simpler systems, Mike Elzinga provided a pedagogical concept test in which the volume of the system is fixed and small enough that the particles can interact. The volume is not relevant in his examples, so the entropy calculations are simpler.

I went through a couple of iterations to solve the problems in his concept test. His test and my two iterations of answers (with help from Olegt on discrete math) are here:
Concept test attempt 1: Basic Statistical Mechanics

and
Concept test amendments: Purcell Pound

Acknowledgements
Mike Elzinga, Olegt, Elizabeth Liddle, Andy Jones, Rob Sheldon, Neil Rickert, the management, fellow authors and commenters at UD and Skeptical Zone.

[UPDATE 9/7/2012]
Boltzmann

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Comments
The rigorous treatment is pretty far in my past but order and disorder are always brought in to entropy, correct. Here's a thought - A 'designed' arrangement of particles (or anything at any scale) is brought to that configuration independently of the properties (energy etc etc) of the 'particles'. No, no laws are broken in so doing, but does this not enter into a strict accounting?
butifnot
September 5, 2012 at 3:06 PM PDT
Butifnot, see my comment here: Comment in Part I. Thank you for offering your thoughts. Sal
scordova
September 5, 2012 at 2:46 PM PDT
Sal, I think you're wanting to break new ground, but this is just basic stuff. A wide survey and a bold synthesis is needed; I believe many have advanced the concepts separately and their works are ripe for integration.
butifnot
September 5, 2012 at 2:36 PM PDT
This will be interesting. Remember, our formalisms are not the real thing. The 'real thing' here is fundamental and will relate order, info, and energy, I believe.
butifnot
September 5, 2012 at 2:19 PM PDT
