Uncommon Descent Serving The Intelligent Design Community

Forgotten Creationist/ID Book endorsed by Nobel Prize Winner in Physics


There is a forgotten creationist book, Origins and Destiny, by engineer and physicist Robert Gange, PhD, published in 1986. It is available for free online, but for how long I do not know. It was pioneering, anticipating arguments that would appear in ID for the next 27 years, and likely beyond.

Gange worked in the field of cryophysics, so it is no surprise he writes with incredible insight regarding thermodynamics. His is the only book written by a creationist that I agree with on the subject of thermodynamics, and he uses the so-called “New Generalized 2nd Law” to make his case. [The Kelvin-Planck version of the 2nd Law is a special case of the “New Generalized 2nd Law”.]

I’ve argued vigorously at UD that the Kelvin-Planck version (which measures thermal entropy in Joules/Kelvin) is insufficient to support ID arguments, but I provisionally support the “New Generalized 2nd Law” (which measures generalized entropy in bits) as a basis for ID arguments.

I’m posting this to get the discussion going, and I don’t expect resolution any time soon, but we have to start somewhere. I’m hopeful, as rigor and debate take place over Gange’s work, that novel ID arguments can be successfully formulated in the territory of physics (rather than just chemistry and biology).

The book was endorsed by Eugene Wigner, an ID-sympathetic Nobel Laureate in Physics:

“I was particularly pleased with Dr. Gange’s refusal of the idea of materialism, and the convincing arguments supporting that refusal. In fact, the book will be a welcome response to materialism. Good luck, for a good book!”

Eugene P. Wigner
Nobel prize in physics

Gange expresses my views about the Kelvin-Planck version of the 2nd Law, but then sketches out a very interesting argument in favor of ID from the Generalized version of the 2nd Law due to Jaynes. Unfortunately Gange did not put forward a rigorous argument, but making one rigorous might be a future research project for the ID community.

What Is the Second Law?

I’ve studied the Second Law for over twenty-five years and I am convinced that it’s the most misunderstood and misused law in all of science. Creationists (wrongly) use it to say that evolution is impossible, and evolutionists (wrongly) use it to show that natural processes are equivalent to the activity of intelligence. With the discovery of new knowledge, the Second Law underwent revision, and was replaced by the New Generalized Second Law of Thermodynamics. The Second Law is a special case of the New Generalized Second Law.

In the words of the New Generalized Second Law, an observer functioning within a closed system will lose, but never gain, information.


Entropy is this uncertainty. These simple words come from some rather deep physics, parts of which are outlined in simplified ways in other chapters. (For further discussion of the more technical aspects of entropy and its relationship to the Second Law, see Appendix 2.)

In summary, entropy is a mathematical expression that allows scientists to assign numbers measuring the degree to which energy deteriorates into less useful forms with the passage of time. This decay corresponds to the progressive decline of an observer’s ability to extract useful work from the system due to his growing uncertainty of the whereabouts of the physical objects described by him at an earlier time.

When rigorously formulated with all of the “bells and whistles” the observer must be included as part of the system that is described. The mathematical expression that assigns numbers is found to measure also the observer’s uncertainty of the location and velocity of every object within the system, and at each instant of time. The New Generalized Second Law is a formal mathematical statement showing that the decline on average of an observer’s information is an inevitable result of the passing of time, and that the total information available to an observer cannot exceed what exists in the system’s overall description.

These considerations impose important constraints on the theory of the chemical origin of life, and the commonly held belief that natural laws can energize biological structures into increasing levels of complexity. Virtually all attempts to justify these ideas have focused upon the thermodynamic flow of energy in an organism, rather than on the genetic blueprint by which the flow is controlled. This unfortunate confusion has created the false impression that living organisms sprang from dead chemicals, and that their progeny prospered under processes which, under any other circumstance, would be regarded as the inspired product of intellect.

http://www.ccel.us/gange.ch7us.html
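Gange's statement that generalized entropy is measured in bits can be illustrated with the standard Shannon formula (a minimal sketch of my own; the eight-state system is invented for the example). As the observer's probability distribution over a system's states spreads out, the entropy in bits grows, which is exactly the "loss of information" the passage describes:

```python
import math

def entropy_bits(p):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Observer initially certain: the system is in state 0 of 8 possible states.
certain = [1.0] + [0.0] * 7
# Later, the observer's uncertainty has spread uniformly over all 8 states.
uniform = [1 / 8] * 8

print(entropy_bits(certain))  # 0.0 bits: full information
print(entropy_bits(uniform))  # 3.0 bits: log2(8), information lost
```

The movement from 0 bits toward log2(N) bits is the "decline on average of an observer's information" in miniature.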

Here is my extrapolation of Gange’s argument. Life is made of software written in the universal language of physics for self-replicating machines. Only a vanishingly small fraction of alphabetic strings make syntactically correct English language sentences, and only a vanishingly small number of physical constructs will make syntactically correct self-replicating machines within the laws of physics. We intuitively know this — computers and software don’t spontaneously emerge, but proving it from the laws of physics isn’t quite so easy.
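The "vanishingly small fraction" claim is easy to make quantitative in a toy setting (my illustration; the eight-word "language" is a hypothetical stand-in, not real English coverage):

```python
# Tiny stand-in "language": the set of valid 3-letter strings is a small
# hard-coded word list (a hypothetical toy, not a real dictionary).
valid = {"cat", "dog", "sun", "ice", "gas", "law", "bit", "imp"}

total = 26 ** 3                  # all 3-letter strings over the alphabet
fraction = len(valid) / total
print(fraction)                  # already below 0.001 for this generous toy

# The fraction shrinks exponentially with length: even if valid strings of
# length n numbered a full billion, their share of the 26**n strings is:
for n in (10, 20, 40):
    print(n, 1e9 / 26 ** n)
```

The same exponential collapse is what makes "syntactically correct self-replicating machines" so rare among physical configurations, though proving the physical version rigorously is the hard part.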

If uncertainty increases in a closed system, the system will lose cohesive information over time, and hence the software of life inside it will deteriorate. Open systems are of no help, since the uncertainty is even higher for open systems: connecting a closed system to an open one only invites more uncertainty and more opportunities to scramble the software that was in the closed system. Hence, life will not spontaneously evolve into more extravagantly complex self-replicators, and the phrase “Clausius and Darwin don’t mix” actually might have more force than previously supposed. Unfortunately this argument isn’t rigorous yet, but it will be, God willing, someday. 🙂

It is not surprising Wigner showed interest in Gange’s book. Wigner used von Neumann’s proof of the 2nd law from Quantum Mechanics to attempt to argue for the existence of special laws of physics for living organisms since the organisms seemed to be in violation of the increasing generalized entropy that Gange refers to. Wigner’s arguments were referenced and criticized here by John Baez: Is Life Improbable?. I think Wigner was headed in the right direction, and so was Gange. The arguments just need a little reworking.

When thermal entropy is expressed in Joules/Kelvin it cannot be used to support ID; however, if, as Gange asserts, generalized entropy can be expressed in bits, then it is a whole new ballgame. There is thermal entropy (measured in Joules/Kelvin), and then, according to Gange, there is generalized entropy (measured in bits).
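For concreteness, the two units are related by a fixed constant: dividing thermodynamic entropy by k_B ln 2 gives entropy in bits (a standard identity, not specific to Gange). A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def jk_to_bits(s_joules_per_kelvin):
    """Convert thermodynamic entropy (J/K) into entropy in bits."""
    return s_joules_per_kelvin / (K_B * math.log(2))

# One Joule per Kelvin is an astronomical number of bits:
print(f"{jk_to_bits(1.0):.3e} bits")  # ~1.045e+23 bits
```

The enormous conversion factor is why everyday thermal entropy changes dwarf the information content of any genome, and why the two bookkeepings must be kept distinct.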

Though I do not believe the Kelvin-Planck version of the 2nd law supports ID, I believe the Jaynes version holds much promise. Not only does it have theoretical promise, but I believe there is already substantial field observation and lab evidence of Gange’s claims.

Here is another compelling passage of Gange’s book:

Maxwell’s Mysterious “Demon”

The organizational intricacies of protein reflect information on a scale that a Supreme Intelligence can produce, but that nature cannot. To see why this is true, let’s think about a small imp who became known as “Maxwell’s Demon.”3 We will allow the imp to control a tiny window that connects two adjoining compartments. In your mind’s eye, imagine two boxes joined by a common wall. In the middle of the wall, picture a tiny window that connects one box to the other. On one side of the window there’s a shelf where the imp is perched.

The imp is able to open or close the window at will, and without effort. Both boxes contain air and from time to time, as a result of this air, a gentle breeze blows against either side of the window. The imp is told, “Open the window if the breeze on your side is strong; otherwise keep it shut.”

Now this may seem like a simple request, but the question is, can the imp obey the instruction? Although it may seem like something he can do, it turns out that were he to successfully perform the required task, he would violate one of the most fundamental laws of science. It’s worthwhile to learn why this is so, because we will not only uncover a fascinating insight regarding the origin of life, but we will also discover the answer to something that stumped the whole world for over half a century regarding Maxwell’s demon.

We’ve said that each of the two boxes contains air. But air consists of tiny molecules which are atomic specks so small that about 10 thousand billion will fit onto the head of a pin. Furthermore, these miniscule dots are in a state of constant motion; we sense them each time we feel a breeze. A strong breeze means that they’re moving fast whereas no breeze means that they’re hardly moving.

Now suppose the imp opens the window each time a strong breeze occurs. If he consistently does this, all the fast-moving molecules will pass through the open window and into the box on the other side. But since he keeps the window closed when there is no breeze, the slow molecules will remain in the box where he’s standing. Thus, the imp has succeeded in separating the fast and slow air molecules, putting the fast ones into the one box, and keeping the slow ones in the adjoining box. From a scientific point of view, faster air molecules mean a higher temperature and an increase in pressure.

Therefore, our imp has created a pressure and temperature difference between the two boxes; i.e., he has created energy!

But how can Maxwell’s demon work? How can he create energy? This question baffled the world for many years, and no one was able to offer a satisfactory answer. Scientists asked, “Why can’t an imp open a window?” If he could, he certainly could separate the fast- and slow-moving air into separate compartments. The fast-moving molecules will travel to one side, and the slower ones will remain in the other.

No one questioned the fact that, at least in principle, the imp had created energy. Let’s see how we know that this is true. To show that the imp has created energy, we can wait until he’s collected all the fast-moving air on one side. When that is done, we’ll open the window, but this time keep it open. Air from the high pressure side (box with the fast-moving molecules) will rush through the opening and into the other side. If a generator wheel is located near the window during the time it’s open, the resulting gush of air can be made to turn the wheel of the generator and, thereby, make electricity. Therefore, the imp does create energy. But here’s our dilemma: It’s impossible to create energy in a closed box! So no one could figure out how the imp could do it!

Death of a Demon

Maxwell invented his demon in the 1800s, but not until 1929 did a scientist named Leo Szilard find the answer. The imp can’t create energy — not because he’s unable to open and close the window, but because he doesn’t know when to do so. In other words, he doesn’t have the information necessary to identify which air molecule is moving fast and which is moving slow. But what’s even more important, it costs him energy to acquire the information he needs! In fact, Szilard did a careful analysis showing that it costs more energy than the imp can recover.4 Simply put, the process of creating energy forces you to lose it! We can phrase it yet another way: Information is equivalent to energy in the sense that to have one means you can create the other.
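The imp's sorting can be sketched in a toy simulation (my illustration, not the book's; the speed distribution and the "strong breeze" threshold are invented for the example):

```python
import random

random.seed(1)

# Speed magnitudes of air molecules in the imp's box (arbitrary units).
speeds = [abs(random.gauss(0.0, 1.0)) for _ in range(10000)]
threshold = sorted(speeds)[len(speeds) // 2]  # imp's "strong breeze" cutoff

fast_box = [v for v in speeds if v > threshold]   # window opened
slow_box = [v for v in speeds if v <= threshold]  # window kept shut

# Mean kinetic energy (~ v**2) tracks temperature: one box is now hotter.
print(sum(v * v for v in fast_box) / len(fast_box))  # higher
print(sum(v * v for v in slow_box) / len(slow_box))  # lower

# Szilard's resolution: each open-or-shut decision requires at least one
# bit of information about a molecule's speed, and acquiring that bit costs
# the imp at least k*T*ln(2) of energy, more on average than the sorting
# lets him recover.
```

The simulation shows the sorting itself is trivial; the physics lives entirely in the unpaid-for measurement, which is Szilard's point.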

[cross-posted; originally at The Skeptical Zone: Entropy and Disorder]

NOTES:
The question may come up: “What about Granville Sewell’s claims?” He and I have had bitter disagreements on the topic, but here is where our views can be reconciled:

1. instead of Kelvin-Planck or the Asimov version, use the Jaynes “New Generalized 2nd Law”
2. use generalized entropy (measured in bits) instead of thermal entropy (Joules/Kelvin) to make ID arguments
3. instead of “order”, use the phrase “organization” (or some similar phrase since biological systems are organized but highly “disordered” in the Kolmogorov sense)

If these conventions are adopted, Dr. Sewell’s claims will use terminology that physicists and information engineers will be less able to equivocate over. I hope this post will help both Dr. Sewell and me to put forward even stronger arguments in favor of ID. We will surely need the help of some sharp physicists to make the arguments sketched out here more rigorous.

Comments
But then again, my understanding of negentropy may be faulty.

Mapou | December 8, 2013 at 01:10 PM PDT
I like the clear distinction being made between order and complexity. So-called "negentropy" creates islands of order at the expense of complexity. In this light, there is no way it can be used to support the claim that natural processes can give rise to living organisms. It's a powerful insight.

Mapou | December 8, 2013 at 01:09 PM PDT
To understand why I said there is still a lot of work to be done to make Jaynes’ Generalized law useful to ID, consider the following example:

A living lab rat is taken from room temperature to near absolute zero. It is now obviously dead. Calculate the change in entropy.

A chemist, a mechanical engineer, and a physicist will all say the rat’s entropy in terms of Joules/Kelvin went down. Here is a case where lowering entropy was lethal. Now, assuming we are using Jaynes’ Generalized entropy, I presume the number of bits of information actually went up, since there is less uncertainty about the state of the system, yet we know the rat is dead. Clearly the act of freezing the rat reduced uncertainty about the energy distribution (and reduced uncertainty implies increased information), but somehow the rat, in terms of its hardware/software, lost information. The challenge is to reconcile the numbers: what is the delta-S in terms of bits (general bits) vs. functional bits? Being able to make a rigorous argument where bit numbers can be stated is the next challenge for ID.

scordova | December 8, 2013 at 12:40 PM PDT
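To attach rough numbers to the frozen-rat example (a back-of-envelope sketch of my own, with invented parameters: the mass, a water-like specific heat treated as constant, which fails badly near absolute zero; it quantifies the thermal bits only, not the "functional bits" whose reconciliation the comment identifies as the open problem):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical toy numbers: a 0.3 kg rat with a roughly water-like specific
# heat of ~3500 J/(kg*K), cooled from 300 K to 4 K, heat capacity assumed
# constant over the whole range (an order-of-magnitude fiction).
mass, c, t_hot, t_cold = 0.3, 3500.0, 300.0, 4.0
delta_s = mass * c * math.log(t_hot / t_cold)  # J/K removed from the rat

bits = delta_s / (K_B * math.log(2))
print(f"{delta_s:.0f} J/K  ~  {bits:.2e} bits")
```

The thermal side comes out to thousands of J/K, on the order of 10^26 bits, while the rat's functional information is vastly smaller; putting both on one ledger is exactly the unresolved challenge.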
Also from the book:
Not only is entropy wrongly identified with disorder; the error has caused some people to introduce a nonsensical thing called “negentropy.”15 This idea assumes that negative entropy exists, and that it can be identified with order. It says that the onset of life was accompanied by a change in negentropy that just balanced the increase in entropy, and that this change explains the order found in life. However, the idea of negentropy is quite wrong because it is defective in several basic ways.16 Nevertheless, over the past twenty years a number of people have used this erroneous concept in an attempt to justify the creation of life by natural processes.17

To understand how they do this, picture water in an ice cube tray in a refrigerator. Let the tray represent the earth, the refrigerator represent the sun, and the transition of water into ice, the creation of life. The order that arises when water becomes ice is said to be balanced by the disorder that occurs when the liquid refrigerant changes into a gas (molecules are in greater disarray in a gas than a liquid). The spontaneous creation of life is then justified by saying that the increased order in life corresponds to negentropy that is offset by the greater increase in entropy (disorder).

But life is complex, not ordered; and the basic natural process that changes water into ice is counterproductive to the creation of life because it results in a loss of complexity. The reason that ice is ordered and not complex is that ice is made up of millions of tiny atomic units that are identical to each other. This means that if we describe one of them, we will have described all of them. When water changes into ice, we actually need less information to describe the ice because its molecules no longer behave in an independent manner. It is less complex because we need less information to describe it.
But unlike ice, life is vastly complex because we require staggering quantities of information to describe even the simplest of cells. Negentropy is thus an erroneous concept. …. The important truth for our discussion is that life isn’t ordered; it is complex. We saw that clearly in chapter 4 in our consideration of the materialistic world view. An increase in organization of a structure — from simple dust particles to the oriental rug to the vacuum cleaner to the house (to repeat an earlier metaphor) — requires the systematic increase of information, but information is not produced by natural processes in the magnitude necessary to explain the origin of life. Let’s summarize what’s been said thus far: 1. The Second Law requires entropy to increase. 2. Entropy cannot be identified with disorder. 3. Negentropy cannot be identified with order. 4. Ordered molecules (ice) present less information. 5. Living cells are not ordered — they are complex.
scordova | December 8, 2013 at 10:00 AM PDT
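Gange's point that ordered ice needs less description than complex life can be illustrated with a compression experiment (my sketch; compressed length is a crude stand-in for description length in the Kolmogorov sense):

```python
import random
import zlib

random.seed(0)

# "Ice": millions of identical units, so describing one describes them all.
ordered = b"H2O-" * 2500  # 10,000 bytes, highly ordered and repetitive

# Stand-in for complexity: an incompressible random string of equal length.
complex_ = bytes(random.randrange(256) for _ in range(10000))

print(len(zlib.compress(ordered)))   # tiny: a short description suffices
print(len(zlib.compress(complex_)))  # ~10,000: no shorter description exists
```

The repetitive "ice" string collapses to a few dozen bytes while the complex string barely shrinks at all, which is the "ordered molecules present less information" claim made concrete.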
From the book's appendix
“We noted earlier that entropy can be correlated, but not identified, with disorder. And we said, moreover, that this correlation is valid in only three cases: ideal gases, isotope mixtures, and crystals near zero degrees Kelvin. The truth of the matter is illustrated by considering the two chemically inert gases, helium and argon.(7) In our mind’s eye we imagine two balloons, one filled with helium and the other with argon. First, we lower the temperature of both balloons to zero degrees Kelvin. This makes all the gas molecules stop moving in either balloon. Next, we get the molecules moving by heating both balloons to 300 degrees Kelvin (room temperature). Were we to mathematically calculate the increase in entropy, we would find that it was 20 percent higher in the argon balloon than in the helium balloon (154 vs. 127 joules per mole per degree Kelvin). But since helium molecules are ten times lighter than argon molecules, they are moving three times faster and thus are more disordered. Here, then, is an example where higher entropy is accompanied by lower disorder, thereby demonstrating that we cannot identify one with the other. In the particular example cited, the greater argon entropy comes from the closer quantum translational energy levels identified with its greater molecular mass as described by the Sackur-Tetrode equation.

Let’s look at another example. Were we to continue dissolving salt in an isolated glass of water, we’d reach supersaturation, a point where the water could not dissolve any more salt. Under certain conditions, the dissolved salt can be made to separate from the water. When crystallization happens the entropy always increases. However, the temperature can go up or down, depending on the kind of salt used and the thermochemistry of the solution.(8) This means that the motion of the molecules, and therefore the disorder, can go up or down, whereas the entropy always goes up.

A less obvious example is the spontaneous freezing of supercooled water.(9) Again we see that the entropy must increase, whereas the disorder can go up or down.”
scordova | December 8, 2013 at 09:59 AM PDT
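The 154 vs. 127 figure quoted above can be checked against the Sackur-Tetrode equation for the molar entropy of a monatomic ideal gas at 300 K and 1 atm (my sketch using standard constants; the book's values match to within rounding):

```python
import math

# Physical constants (SI)
H = 6.62607015e-34    # Planck constant, J*s
K_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number, 1/mol
R = K_B * N_A         # gas constant, J/(mol*K)

def sackur_tetrode(molar_mass_kg, t=300.0, p=101325.0):
    """Molar entropy of a monatomic ideal gas, J/(mol*K)."""
    m = molar_mass_kg / N_A                          # mass per atom
    v_per_n = K_B * t / p                            # volume per atom
    lam = H / math.sqrt(2 * math.pi * m * K_B * t)   # thermal de Broglie wavelength
    return R * (math.log(v_per_n / lam ** 3) + 2.5)

s_he = sackur_tetrode(4.0026e-3)    # helium
s_ar = sackur_tetrode(39.948e-3)    # argon
print(f"He: {s_he:.0f}  Ar: {s_ar:.0f} J/(mol*K)")  # ~126 vs ~155
```

The heavier argon atom has more closely spaced translational levels (a larger mass inside the wavelength term), so its entropy comes out about 20 percent higher, just as the quoted passage says.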
