
BS Watch: How does life come from randomness?


David Kaplan explains at Quanta (scroll down for vid) how the law of increasing entropy could drive random bits of matter into the stable, orderly structures of life. More.

According to our favorite physicist Rob Sheldon, the guy’s hair is way more formidable than his ideas. Okay, Sheldon didn’t put it quite that way but here is what he did say:

There are a number of fallacies in this video, which, unfortunately, are like zombies and keep being resurrected. In addition, there's a rhetorical "strawman" argument used to deflect rightful critique. Let's address the strawman, and then the fallacies.

(1) Life is really, really, really different from non-life. So some highly simplistic feature of life is extracted – in this case, "structure". Then we show how non-life can sometimes have "structure" (snowflakes, etc.), and thus "prove" that non-life can have the same feature as life, implying that the other 99.9999% of differences are just like "structure".

(a) Life is a whole lot more than structure; indeed, "dynamics" is a much better definition of life than "structure", simply because life's structure is only analyzed when it is dead: pickled, dissected, photographed, etc.

(b) Even when non-life shows "structure", it is only superficially similar to life. A snowflake, for example, can be described by 3 or 4 simple parameters, whereas a cell can't be. Even in the best examples in this video, a simple graphics algorithm (based on fractal compression) can differentiate between life and non-life, as sketched below.
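Sheldon doesn't name the algorithm, but one standard image-complexity measure in this spirit is the box-counting fractal dimension. Here is a minimal sketch (a hypothetical illustration of the idea, not the specific method Sheldon has in mind):

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    # Estimate the box-counting (fractal) dimension of a binary 2-D image.
    # For each box size s, count the s-by-s boxes containing at least one
    # foreground pixel; the slope of log(count) vs. log(1/s) estimates the
    # dimension. Assumes img is at least as large as the biggest box size.
    counts = []
    for s in sizes:
        h = (img.shape[0] // s) * s  # trim so the image tiles evenly
        w = (img.shape[1] // s) * s
        boxes = img[:h, :w].reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# A thin line scores near 1, a filled region near 2; branching,
# self-similar shapes land in between.
img = np.zeros((64, 64), dtype=bool)
img[32, :] = True  # a one-pixel-wide horizontal line
print(box_counting_dimension(img))  # ~1.0
```

The idea, on Sheldon's telling, is that a snowflake and a cell would produce measurably different complexity profiles even though both count as "structured".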

(2) Non-equilibrium statistical mechanics is often invoked to explain how non-life can be structured (Bénard convection, for example). This was the pipe-dream of 1977 Nobel prizewinner Ilya Prigogine. Needless to say, his efforts to solve the origin-of-life (OOL) problem were found to be dead ends.

In the past 2 years, Jeremy England of MIT has been promoted as Prigogine’s successor–though no one ever mentions poor Ilya’s name, perhaps because of his total failure. Why was it a total failure?

Because the "structure" that non-equilibrium stat mech gives you is designed to maximize entropy production. Those cute patterns that spontaneously form in a boiling pot are designed to maximally conduct the flow of heat and energy, something a smooth and homogeneous system does poorly. This property is known as the Maximum Entropy Production Principle (MEPP): any system driven away from equilibrium does its best to maximize entropy production. What is entropy? The opposite of information. So unlike life, which sustains and makes information, MEPP destroys information. This is not the way to make life, as all the followers of Prigogine discovered to their dismay.
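To put a number on the entropy/information conversion this argument turns on: Shannon entropy measures uncertainty in bits, and the standard textbook bridge to thermodynamic entropy is k_B * ln(2) Joules/Kelvin per bit. A minimal sketch (the conventional conversion, not a formula from the video or from Sheldon):

```python
import math

def shannon_entropy(probs):
    # Shannon entropy H = -sum(p * log2(p)), in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: 1.0 bit of uncertainty
print(shannon_entropy([0.9, 0.1]))  # a biased coin: ~0.47 bits

# Standard bridge between information and thermodynamic entropy:
k_B = 1.380649e-23        # Boltzmann constant, J/K
print(k_B * math.log(2))  # ~9.57e-24 J/K per bit
```

Whether entropy counts as the "opposite" of information depends on which of these definitions is in play, a point taken up in the comments below.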

(3) What about the hoary “open thermodynamic system” argument, that as long as the system is “open” and energy flows through it (such as from the sun), we can expect entropy to decrease and spontaneous order to form, where the excess “disorder” is carried away by the flow?

(a) To begin with, this was a physicist's excuse as to why biology appears to violate the 2nd law of thermodynamics. ("The laws of thermodynamics only apply to closed systems", they explained.) This bug has now morphed into a "feature": instead of being unable to account for biology, suddenly we are told that biology is a product of this bug.

(b) This is false for 2 reasons:

i) Physicists can explain open systems. They do it all the time. Non-equilibrium stat mech is the formal description of such an open system, and its roots go back to the 1950s.

ii) The entropy flow of open systems can be measured, and sunlight has nowhere near the information needed to account for life on earth. Mathematician Granville Sewell does an excellent job computing this "information flow" equation, and shows that despite numerous Darwinian scientists invoking sunlight to account for life, there is mathematically no possible way for sunlight (or particles, or magnetic fields, or anything else from the sun) to have that much information. We're talking many, many orders of magnitude here, which Darwinian physicists refuse to calculate.

One of the few physicists willing to calculate it was Sir Fred Hoyle, who calculated 10^40,000 as the information in a cell. That's at least 10^39,960 times larger than the information in the entire history of sunlight.

So the entire video is a sales job: replace life with a snowflake, and then argue that physics can explain snowflakes with non-equilibrium stat mech. If you really want to irritate these Darwinians, read Sewell’s papers and start citing numbers.

See also: What we know and don’t know about the origin of life

Follow UD News at Twitter!

Comments
Rob, thanks for replying, but I don't think you've actually answered any of my criticisms.

- I argued that your criticism of Jeremy England's work was mistaken, and based on conflating different definitions of "information". You don't seem to have responded.

- I refuted your claim that "Darwinian physicists refuse to calculate" the relevant information flows, and showed that the actual information flow is far beyond anything evolution might require. No response. (Note: I suppose you could point out that I'm not actually a physicist, and therefore don't qualify as a "Darwinian physicist", but that would be quibbling. Dr. Emory F. Bunn -- an actual physicist -- has done a very similar calculation, just without the conversion to information units.)

- I pointed out that you'd failed to convert Sir Fred Hoyle's probability figure into the appropriate form before you cited it as information. Your response, apparently, is that I should read Hoyle so we'll "have something to talk about beyond dimensional analysis". Why should I bother? In the first place, I don't need to know anything about the details of his calculation to know how to convert it to information units (or to see that you didn't do the conversion). In the second place, he may have believed in evolution, but he doesn't seem to have understood it at all well; therefore I doubt his calculations have any actual relevance.

- Finally, I asked for a more specific reference to the "information flow" calculation you said Granville Sewell had done, and your reply seems to be "Sewell ~2010". That's not more specific. It doesn't even seem to be relevant, since the terse summary you give seems to be about information and entropy "at rest", not information flows. It also doesn't seem plausible, since what I've read from Sewell indicates that he does not understand or accept a quantitative connection between ln(Omega) for different types of entropy. For instance, in his 2011 paper "Poker Entropy and the Theory of Compensation" (announced here at UD, but apparently not online anymore) he mocks the connection:
Thus I would like to extend Styer and Bunn's results [essentially, applying S = k_B * ln(Omega) to things other than a thermodynamic microstate -GD] to the game of poker. The Boltzmann formula allows us to define the entropy of a poker hand as S = k_B × log(W), where k_B = 1.38 × 10^−23 Joules/degree Kelvin is the Boltzmann constant and W is the number of possible hands of a given type (number of "microstates", W = p × C(52, 5), so S = k_B × log(p) + Constant, where p is the probability of the hand). For example, there are 54912 possible "three of a kind" poker hands and 3744 hands that would represent a "full house," so if I am dealt a three of a kind hand, replace some cards and end up with a full house, the resulting entropy change is S2 − S1 = k_B × log(3744) − k_B × log(54912) = k_B × log(1/14.666) = −3.7 × 10^−23 Joules/degree. Of course, a decrease in probability by a factor of only 15 leads to a very small decrease in entropy, which is very easily compensated by the entropy increase in the cosmic microwave background, so there is certainly no problem with the second law here. There are some obvious problems. While one can certainly define a "poker entropy" as S_p = k_p × log(W) and have a nice formula for entropy which increases when probability increases, why should the constant k_p used be equal to the Boltzmann constant k_B? In fact, it is not clear why poker entropy should have units of Joules/degree Kelvin. In the case of thermal entropy, the constant is chosen so that the statistical definition of thermal entropy agrees with the standard macroscopic definition. But there is no standard definition for poker entropy to match, so the constant k_p can be chosen arbitrarily. If we do arbitrarily set k_p = k_B, so that the units match, it still does not make any sense to add poker entropy and thermal entropy changes to see if the result is positive or not.
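The arithmetic in Sewell's example is easy to verify (a minimal sketch using only the numbers he gives):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Sewell's figures: 54912 "three of a kind" hands, 3744 "full house" hands.
dS = k_B * math.log(3744 / 54912)  # natural log, as in S = k_B ln(W)
print(dS)            # ~ -3.7e-23 J/K, matching the figure quoted above
print(54912 / 3744)  # ~14.666, the probability ratio
```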
I assume your "Nobel prize ~1990" reference is to Pierre-Gilles de Gennes? I'm not significantly familiar with his work, but let me ask you one question: would de Gennes' work on phase transitions make any sense at all if the "k" for different types of entropy could be different, as Sewell seems to think? How could Sewell's work build on principles he doesn't understand or accept? BTW, your description of Shannon's work is also significantly inaccurate -- Shannon mostly used entropy as a measure of information, not the opposite. Depending on the situation, either may be appropriate. To oversimplify a bit, if you're using entropy to measure information that you have, it's clearly positive information; but if you're using it to measure information that you do not have (i.e. ignorance or uncertainty), it often makes sense to treat it as negative information.
Gordon Davisson
July 9, 2016, 11:29 PM PDT
Gordon,

information = negentropy (Claude Shannon, 1940)
entropy of an ideal gas: S = k ln(Omega) (Boltzmann, ~1890)
Omega_liquid-crystal >> Omega_ideal-gas (Nobel prize, ~1990)
Omega_life >>> Omega_liquid-crystal (Sewell ~2010, Hoyle ~1980)

Read Hoyle's "Mathematics of Evolution" because he believed in both. Then maybe we have something to talk about beyond dimensional analysis.
Robert Sheldon
July 9, 2016, 07:03 PM PDT
I was taught the "just add water" theory of the origin of life. Now it's the "just add heat" (and entropy!) theory. In light of the code, structure, and mechanisms of even the 'simplest' forms of living things, both of these ideas make the same amazing sense to me (that is, none at all)... but as a thorough-going member of the unwashed masses, I simply lack the requisite amount of credulity. Or perhaps the 'right' sort of education (the kind that eschews the questioning mind). IMHO, "BS Watch" is the right category for this thread...
leodp
July 9, 2016, 11:24 AM PDT
Entropy is only the opposite of information if you adopt a very unusual definition of information.
Entropy is only the opposite of information if you adopt a very unusual definition of entropy.
Mung
July 9, 2016, 11:12 AM PDT
Rob Sheldon's response is a complete mess. I'll pick out some of the worst bits here, and point out the problems:
(3) What about the hoary “open thermodynamic system” argument, that as long as the system is “open” and energy flows through it (such as from the sun), we can expect entropy to decrease and spontaneous order to form, where the excess “disorder” is carried away by the flow? (a) To begin with, this was a physicist’s excuse as to why biology violates the 2nd law of thermodynamics. (“The laws of thermodynamics only apply to closed systems”, they explained.) This bug has now morphed into a “feature”, so instead of being unable to account for biology, suddenly we are told that biology is a product of this bug.
This is a muddle of several related but different arguments. In the first place, entropy decrease and order formation are quite different.

The second law forbids entropy decreases in isolated systems (and open systems with net entropy influxes), but in open systems with net entropy effluxes (essentially, where there's more entropy leaving than entering), entropy decreases are not forbidden by the second law. Just because something isn't forbidden by the second law does not mean that "we can expect" it to happen; it just means the second law doesn't tell you whether it's going to happen or not. Thus, the claim that the second law forbids evolution is toast. Gone. Dead. The live question is whether other principles of thermodynamics (not the second law) require evolution or suggest that evolution is to be expected. Thermo does not rule out evolution, but does it rule it in? I think this is the "bug" vs. "feature" thing Rob is talking about. (BTW, my personal opinion is that the case for thermo ruling in evolution has not been made to my satisfaction. Essentially, thermo doesn't seem to be taking either side in the pro- vs anti-evolution war. More on this below...)

As for order formation, the situation is even more complicated; disequilibrium is pretty much a requirement for the sort of order we're talking about, but it's not the same thing. The second law implies that systems that are either isolated or open with equilibrium boundary conditions monotonically approach equilibrium. But open systems with non-equilibrium boundary conditions (like Earth) can move away from equilibrium. In open systems with far-from-equilibrium boundary conditions and a high energy flow, we tend to observe the formation of various forms of order. The Earth is exactly such a system, as it receives about 1.73e17 Watts of radiation from the Sun's photosphere at a temperature of about 6,000 Kelvin, and radiates about the same amount to deep space, which (mostly) has a temperature of under 3 Kelvin (although the emitted radiation is more like 300 Kelvin).

This is where Prigogine, England, et al (and the potential thermodynamic case in favor of evolution) come in. They've attempted to characterize spontaneous order formation in far-from-equilibrium systems.
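For rough scale, entropy flux can be estimated as power divided by temperature (a minimal sketch from the figures just given; the temperatures are approximate and the 4/3 radiative-entropy factor is ignored, so expect order-of-magnitude agreement only):

```python
P = 1.73e17          # W: solar radiation intercepted (and re-emitted) by Earth
T_sun = 6000.0       # K: approximate photosphere temperature
T_emit = 300.0       # K: approximate temperature of Earth's emitted radiation

S_in = P / T_sun     # ~2.9e13 W/K of entropy arriving with sunlight
S_out = P / T_emit   # ~5.8e14 W/K of entropy leaving as infrared
print(S_out - S_in)  # net entropy efflux, roughly 5.5e14 W/K
```

[Earlier in Rob's missive:]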
(2) Non-equilibrium statistical mechanics is often invoked to explain how non-life can be structured (Bénard convection, for example). This was the pipe-dream of 1977 Nobel prizewinner Ilya Prigogine. Needless to say, his efforts to solve the origin-of-life (OOL) problem were found to be dead ends. In the past 2 years, Jeremy England of MIT has been promoted as Prigogine's successor–though no one ever mentions poor Ilya's name, perhaps because of his total failure. Why was it a total failure? Because the "structure" that non-equilibrium stat mech gives you is designed to maximize entropy production. Those cute patterns that spontaneously form in a boiling pot are designed to maximally conduct the flow of heat and energy, something a smooth and homogeneous system does poorly. This property is known as the Maximum Entropy Production Principle (MEPP): any system driven away from equilibrium does its best to maximize entropy production. What is entropy? The opposite of information. So unlike life, which sustains and makes information, MEPP destroys information. This is not the way to make life, as all the followers of Prigogine discovered to their dismay.
Entropy is only the opposite of information if you adopt a very unusual definition of information. For example, if a gram of water freezes, its entropy decreases by 1.22 Joules/Kelvin. At the standard conversion rate of 1 bit = k_B * ln(2) = 9.57e-24 Joules/Kelvin of thermodynamic entropy, that would be an increase of 1.27e23 bits of information (or about 16 zettabytes). Is that really a relevant definition of information?
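That conversion is easy to check (a minimal sketch using only the numbers above):

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
bit = k_B * math.log(2)  # thermodynamic entropy per bit, ~9.57e-24 J/K

dS = 1.22                # J/K: entropy drop when a gram of water freezes
bits = dS / bit
print(bits)              # ~1.27e23 bits
print(bits / 8 / 1e21)   # ~16 zettabytes (1 ZB = 1e21 bytes)
```

Moreover, maximum entropy production is pretty much what England is claiming that life does. From "A New Physics Theory of Life" in Quanta magazine: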
From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat [and thus producing entropy -GD]. Jeremy England, a 31-year-old assistant professor at the Massachusetts Institute of Technology, has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life.
As I said, I'm not convinced England's theory is correct (although I'm not convinced it's wrong either -- I haven't looked at it closely enough, nor do I have enough expertise in the relevant physics, to have a real opinion). But it's clear that living organisms do produce lots of entropy, so this isn't a strike against England's theory. It does, however, destroy Rob's claims that 1) entropy is the opposite of information and 2) life "sustains and makes information". You cannot make both claims with the same definition of "information". Back to Rob:
ii) The entropy flow of open systems can be measured, and sunlight has nowhere near the information needed to account for life on earth. Mathematician Granville Sewell does an excellent job computing this “information flow” equation, and shows that despite numerous Darwinian scientists invoking sunlight to account for life, there is mathematically no possible way for sunlight (or particles, or magnetic fields, or anything else from the sun) to have that much information. We’re talking many, many orders of magnitude here, which Darwinian physicists refuse to calculate.
Completely wrong. I posted precisely that calculation last fall (based on my earlier entropy flux calculation), in response to an earlier posting of Rob's thoughts on the subject. Short summary: the entropy flux from Sun to Earth is about 3.83e13 W/K, from Earth to deep space is over 3.7e14 W/K, so Earth's net entropy efflux is over 3.3e14 W/K, which comes out to 3.4e37 bits per second of information. If that same rate held over Earth's 4.54e9-year history, that'd be over 4.8e54 bits. That's orders of magnitude more than evolution could possibly require. Rob cited Fred Hoyle to the contrary, but...
One of the few physicists willing to calculate it was Sir Fred Hoyle who calculated 10^40,000 as the information in a cell. That’s at least 10^39,960 times larger than the information in the entire history of sunlight.
Wrong again. He calculated 1 in 10^40,000 as the probability against the enzymes in a cell forming by chance. To convert that to information, you have to take the logarithm of the probability, giving -log2(1/1e40000) = 132,877 bits (or 1.3e5 if you prefer). As I calculated above, the Earth's net information-entropy flux per second is over 2e32 times that.

Finally, I've read a lot of what Granville Sewell has written about thermodynamics and evolution, and I'm not aware of any place where he computed anything like an "information flow" (or indeed much of anything about any real thermodynamic system). I don't see anything like that in the papers ("Entropy or Evolution" and "A Mathematician's View of Evolution") at the link you gave. Rob, can you be specific about what calculation you're talking about?
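For reference, here is the arithmetic of this comment in one place (a minimal sketch using only the figures quoted above; 3.156e7 is the number of seconds in a year):

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
bit = k_B * math.log(2)  # thermodynamic entropy per bit, J/K

influx = 3.83e13         # W/K: entropy flux, Sun -> Earth
efflux = 3.7e14          # W/K: entropy flux, Earth -> deep space
net = efflux - influx    # ~3.3e14 W/K net entropy efflux
print(net / bit)         # ~3.4e37 bits of information per second

seconds = 4.54e9 * 3.156e7   # Earth's history in seconds
print(net / bit * seconds)   # ~4.9e54 bits over Earth's history

hoyle_bits = 40000 * math.log2(10)  # 1 in 10^40,000, as information
print(hoyle_bits)                   # ~132,877 bits
print(net / bit / hoyle_bits)       # per-second ratio: ~2.6e32
```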
Gordon Davisson
July 8, 2016, 10:59 PM PDT
Truth @1: Peace & joy. 150 years ago, Evolution was a perfectly good Theory. It filled a gap between the Bible and the newly discovered fossils. And it took perhaps a century to get seriously wobbly. Communism, which emerged as a Theory of social organization about the same time, followed about the same track. And in both cases, the disciples continue to defend the theories in a very religious kinda way: I don't care what FACTS you have, I BELIEVE in (Evolution, Communism, etc.). I don't see Communism disappearing any time soon. The people at the tops of the pyramids enjoy a comfortable life, and the fanatic believers have a reason to live. I figure we'll evolve past Darwinism about the same time we evolve past Communism.
mahuna
July 8, 2016, 11:21 AM PDT
Wow! Darwinists are looking more and more ridiculous with each passing day. Future humans will look back and wonder how Darwinists were ever taken seriously. The whole misguided theory is (and always has been) based on wild speculation and philosophical assumptions. Happy to be alive to see this wretched theory exposed for what it is... a fraud.
Truth Will Set You Free
July 8, 2016, 10:19 AM PDT
