Uncommon Descent Serving The Intelligent Design Community

A Little Timeline on the Second Law Argument


A little timeline on the second law argument, as applied to evolution (see my BioComplexity article for more detail):

1. Scientists observed that the temperature distribution in an object always tends toward more uniformity, as heat flows from hot to cold regions, and defined a quantity called “entropy” to measure this randomness, or uniformity. The first formulations of the second law of thermodynamics stated that thermal “entropy” must always increase, or at least remain constant, in an isolated system.
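The arithmetic behind this earliest, purely thermal form of the law fits in a few lines (an illustrative sketch with made-up numbers, not taken from the article): when heat Q flows from a hot region to a colder one, the cold region gains more entropy (Q/T_cold) than the hot region loses (Q/T_hot), so the total always increases.

```python
# Illustrative numbers only (my own assumptions, not from the article):
# entropy bookkeeping for heat Q flowing from a hot region to a cold one.
Q = 100.0       # joules of heat transferred
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = -Q / T_hot     # entropy lost by the hot region
dS_cold = Q / T_cold    # entropy gained by the cold region
dS_total = dS_hot + dS_cold

print(f"dS_total = {dS_total:.4f} J/K")  # positive: +0.0833 J/K here
```

The total only vanishes in the limit T_hot = T_cold; for any genuine temperature difference it is strictly positive, which is why heat never flows spontaneously from cold to hot.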

2. It was realized that the reason temperature tends to become more uniformly (more randomly) distributed was purely statistical: a uniform distribution is more probable than a highly non-uniform distribution. Exactly the same argument, and even the same equations, apply to the distribution of anything else, such as carbon, that diffuses. In fact, one can define a “carbon entropy” in the same way as thermal entropy, and show, using the same equations, that carbon entropy must always increase, or remain constant, in an isolated system.
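A minimal numerical sketch (my own illustration; the grid size, diffusion coefficient, and starting distribution are arbitrary assumptions) makes the point concrete: the same update rule applies whether the diffusing quantity is heat or carbon, and a Boltzmann-style entropy of the distribution never decreases in an isolated system.

```python
import math

# Sketch: 1D diffusion on an isolated 8-cell grid with no-flux boundaries.
# The same rule models any diffusing quantity (heat, carbon, chromium...).

def entropy(c):
    """Shannon/Boltzmann-style entropy of the normalized distribution."""
    total = sum(c)
    p = [x / total for x in c]
    return -sum(x * math.log(x) for x in p if x > 0)

c = [1.0] + [0.0] * 7   # all the "carbon" starts in one cell
D = 0.2                 # diffusion coefficient (<= 0.5 for stability)

prev = entropy(c)       # starts at 0: a perfectly ordered state
for _ in range(200):
    padded = [c[0]] + c + [c[-1]]   # reflecting (isolated) boundaries
    c = [c[i] + D * (padded[i] - 2 * c[i] + padded[i + 2]) for i in range(len(c))]
    s = entropy(c)
    assert s >= prev - 1e-12        # entropy is monotone non-decreasing
    prev = s

print(f"final entropy {prev:.3f}, maximum possible {math.log(len(c)):.3f}")
```

The distribution relaxes toward uniformity and the entropy climbs toward its maximum, ln(8), exactly as the probabilistic argument predicts.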

3. Since the reason thermal and carbon (and chromium, etc) distributions become more uniform in an isolated system is that the laws of probability favor more random, more probable, states, some scientists generalized the second law with statements such as “In an isolated system, the direction of spontaneous change is from order to disorder.” For these more general statements, “entropy” was simply used as a synonym for “disorder” and many physics texts gave examples of irreversible “entropy” increases that had nothing to do with heat conduction or diffusion, such as tornados turning towns into rubble, explosions destroying buildings, or fires turning books into ashes.

4. Some people then asked: what could be a more spectacular increase in order, or decrease in “entropy”, than civilizations arising on a once-barren planet? And they argued that the claim that entirely natural causes could turn dust into computers was contrary to these more general statements of the second law.

5. The counter-argument offered by evolutionists was always: but the second law only says order cannot increase in an isolated system, and the Earth receives energy from the sun, so computers arising from dust here does not violate the second law, as long as the increases in order here are “compensated” by decreases outside our open system.

6. In several publications, beginning in a 2001 Mathematical Intelligencer letter, I showed that while it is true that thermal entropy can decrease in an open system, it cannot decrease faster than it is exported through the boundary, or stated in terms of “thermal order” (= the negative of thermal entropy), in an open system thermal order cannot increase faster than it is imported through the boundary, and likewise “carbon order” cannot increase faster than it is imported through the boundary, etc. (Though I was not the first to notice this, it seemed to be a very little known fact.) Then I argued that the more general statements of the second law could also be generalized to open systems, using the tautology that “if an increase in order is extremely improbable when a system is isolated, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable.” Thus the fact that order can increase in an open system does not mean that computers can appear on a barren planet as long as the planet receives solar energy: something must be entering which makes the appearance of computers not extremely improbable (for example: computers).

7. I’m sure that physics texts are still being written which apply the second law to tornados and explosions and fires, and still say evolution does not violate these more general statements of the second law because they only apply to isolated systems. But I have found that after reading my writings on the second law (for example, my withdrawn-at-the-last-minute Applied Mathematics Letters article) or my videos (see below), no one wants to talk about isolated and open systems; they ALL now say the second law of thermodynamics should only be applied to thermodynamics, that it is only about heat. “Entropy” never meant anything other than thermal entropy, and even when physics textbooks apply the second law to more general situations, they are really only talking about thermal entropy. Whether the second law still applies to carbon entropy, for example, where the equations are exactly the same, is not clear.

8. Of course you can still argue that the “second law of thermodynamics” should never have been generalized (by physics textbook writers; creationists were not the first to generalize it!) and so it has no relevance to evolution. But there is obviously SOME law of Nature that prevents tornados from turning rubble into houses and cars, and the same law prevents computers from arising on barren planets through unintelligent causes alone. And if it is not a generalization of the second law of thermodynamics, it is a law of Nature very closely related to the second law!

Note added later: as clearly stated in the BioComplexity article, the statements about “X-entropy”, where X = heat, carbon, chromium,…, in an isolated or open system, are assuming nothing is going on except diffusion, in which case they illustrate nicely the common sense conclusion (tautology, actually) that “if an increase in order is extremely improbable when a system is isolated, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable.” Thus just showing that the statements about X-entropy are not always valid in more general situations does not negate the general, common sense, conclusion, and allow you to argue that just because the Earth is an open system, civilizations can arise from dust here without violating the second law (or at least the fundamental natural principle behind the second law). At some point you are going to have to argue that energy from the sun makes the spontaneous rearrangement of atoms into computers and spaceships and iPhones not astronomically improbable; all the popular easy ways to avoid the obvious conclusion are now gone. (See Why Evolution is Different, excerpted—and somewhat updated—from Chapter 5 of my Discovery Institute Press book.)

[youtube 259r-iDckjQ]

Comments
Arthur Hunt #15 Your extrapolation from oil-aggregation-and-separation-from-water-in-a-cruet to evolution is in perfect evolutionist style, and in fact it doesn't work. You simply cannot equate a phenomenon due to standard chemical principles to the organization of a cell, a cybernetic factory where countless information-processing nano-machines work together on the advanced metabolic tasks necessary to life. Yes, your spontaneous oil aggregation and separation doesn't defy the 2nd law. But sparse molecules spontaneously self-organizing into a living cell would, as would a tornado that spontaneously constructs a 747. This is what Granville claims, and I perfectly agree with him that it is eminently relevant to the impossibility of evolution.niwrad
February 20, 2016, 01:04 PM PDT
From another forum a long time ago - as far as I can tell, the criticism is still valid, and it renders Sewell's ramblings as rather irrelevant: Professor Sewell propagates several incorrect notions, but one in particular is egregious, and has the happy property that the mistake can be seen (and corrected) on one's own kitchen countertop. Specifically, Sewell states: "It is a well-known prediction of the second law that, in a closed system, every type of order is unstable and must eventually decrease, as everything tends toward more probable (more random) states. Not only will carbon and temperature distributions become more disordered (more uniform), but the performance of all electronic devices will deteriorate, not improve. Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it." Anyone reading this can put, in a modest cruet, some salad oil and some water. The cruet can be capped and the mixture shaken vigorously. Obviously, what results is a highly disorganized mixture, as the tiny globules of oil are dispersed in the water. Now, if one were to take Sewell seriously, one would expect that the disorganized mess in the capped cruet (an isolated system) would never, ever become anything other than an even more disorganized mess. But, again, anyone reading this knows that, if one were to set the cruet aside, and do absolutely nothing, the oil would spontaneously aggregate and separate from the water, and in fact a highly-ordered, perfectly-separated two-phase system would come about entirely on its own accord. By now, many readers must be wondering "is it so easy to defy the Second Law of Thermodynamics that we can do so in our kitchens?" The answer, as a chemist would tell us, is NO. The remarkable ordering that occurs in our cruet is not a defiance of the Second Law, but rather an obedience of the Law. 
Without going into detail, the reality of this is that the relentless drive to increasing entropy plays out at the microscopic scale to cause oil and water to separate, in effect to produce dramatic and spontaneous macroscopic ordering. Similar processes are at work inside living cells, and are largely responsible for the degree of order and organization that we see in cells. Put another way, this spontaneous assumption of macroscopic order is not a defiance of the Second Law, but an inevitable consequence of the Law. When it comes to evolution, similar principles (if based on more extensive chemistries) apply. There is no "thermodynamic failure". A perspective (such as Sewell's) that so completely ignores basic chemical principles that it predicts that oil and water will not spontaneously separate will miss this simple truth.Arthur Hunt
February 20, 2016, 12:33 PM PDT
niwrad @7:
Think: countless physics textbooks for decades stated that entropy is disorder and tends to increase.
The second law only requires that entropy increase in isolated systems; in open systems it's entirely normal for entropy to decrease. The Earth has at least 3.3e14 W/K more entropy leaving than entering, so as far as the second law is concerned its entropy could be decreasing by up to 3.3e14 J/K per second (actually, somewhat more since that's a lower bound). Also, entropy can be thought of as a measure of disorder, but it's quite different from what we normally think of as disorder. For instance: which is more disordered, a well-shuffled deck of cards, or a sorted deck that's 1 degree warmer (and otherwise identical)? Most people would say the shuffled deck is more disordered, but the warmer deck has the higher entropy. (In more detail: shuffling a deck of 52 cards increases its entropy by k_B*ln(52!) = 2.16e-21 J/K, while heating a 100g deck with a specific heat of 0.2 cal/(g·K) from 300K (= 26.85 Celsius = 80.33 Fahrenheit) to 301K (= 27.85 Celsius = 82.13 Fahrenheit) would increase its entropy by 0.28 J/K. In this example, the entropy change due to the temperature difference is over 100 quintillion times bigger than the difference due to shuffling.)
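These deck-of-cards figures can be checked in a few lines (a sketch; reading the "specific heat of 0.2" as 0.2 cal/(g·K) is an assumption, but it is the reading that reproduces the quoted numbers):

```python
import math

k_B = 1.380649e-23                  # Boltzmann constant, J/K
S_shuffle = k_B * math.lgamma(53)   # k_B * ln(52!); lgamma(n+1) = ln(n!)

mass = 100.0                        # grams
c_heat = 0.2 * 4.184                # 0.2 cal/(g*K) converted to J/(g*K)
S_heat = mass * c_heat * math.log(301.0 / 300.0)

print(f"shuffling:   {S_shuffle:.2e} J/K")       # ~2.16e-21 J/K
print(f"heating 1 K: {S_heat:.2f} J/K")          # ~0.28 J/K
print(f"ratio:       {S_heat / S_shuffle:.1e}")  # ~1.3e20, over 100 quintillion
```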
You patiently try to explain that systems always tend to go toward probable states, and what do they reply? “No, the 2nd law applies only to heat”. My God, but why does heat pass from a hot body to a cold body? Because the state of uniform thermal distribution is more probable.
The "increasing probability" versions of the second law only apply to systems with equilibrium boundary conditions. The Earth does not have equilibrium boundary conditions (it's in thermal contact with both the sun's photosphere at a temperature of 6000K, and the microwave background at a temp of 3K). Without equilibrium boundary conditions, you can't even define the relevant probability distribution, much less say that the system will move to states of higher and higher probability under it. (Again, more detail: consider a system that only interacts with its surroundings by exchanging heat at a constant temperature T. If it were fluctuating around equilibrium, the probability that it'll be in a macrostate with energy E and entropy S is proportional to e^((S-E/T)/k_B) (this is a form of the Boltzmann distribution). If it starts in a nonequilibrium state, it'll move through a sequence of states with monotonically nondecreasing probability under this distribution. But this probability distribution depends on the temperature T; without a single well-defined temperature, the distribution becomes undefined.) If you're going to reason about the second law's implications, you really need to use a form of the second law that applies to the situation you're considering. But if you do that, you'll find there's no conflict between the second law and evolution. BTW, as the deck-of-cards example shows, thermodynamic entropy isn't only about heat, but it is mostly about heat.Gordon Davisson
February 20, 2016, 12:13 PM PDT
Gordon @ 4. Hello Gordon, if you would indulge some questions? In a thermodynamic system, what is the difference between configurational entropy and thermal entropy? When entropy changes, what exactly is it that is changed? Can't thermodynamic entropy be stated in statistical terms [Boltzmann], and can't that be stated in information terms [Shannon], and don't many scientists accept that interpretation of entropy as being equally valid? To put it another way, isn't thermodynamic entropy just a special case of a broader principle?Mung
February 20, 2016, 11:26 AM PDT
niwrad:
No “bait and switch”. Intelligent beings work by means of knowledge.
That is an extreme oversimplification with no testable operational definition for "Intelligent", which in this case should be provided as a computer model showing all of the main features of any "Intelligent" system. niwrad:
But actually the issue of this thread is whether the 2nd law has something to do with the ID/evo debate.
The question of whether the 2nd law has something to do with the ID/evo debate amounts to expecting the consumer to later somehow for themselves explain how "intelligent cause" works by arguing against another theory entirely where there are "evo" words galore to go in endless circles over. That explains why those who buy into or get stuck in the "debate" experience serious unexpected problems from what is being sold as a "theory". What I have for scientific models and theory (my name above hyperlinks to a page for them) was long ago rejected by BioComplexity. But that is expected from an organization where what is needed is only more switch, not the bait.GaryGaulin
February 20, 2016, 09:38 AM PDT
GaryGaulin #10 No “bait and switch”. Intelligent beings work by means of knowledge. But actually the issue of this thread is whether the 2nd law has something to do with the ID/evo debate.niwrad
February 20, 2016, 08:02 AM PDT
niwrad:
I think maybe the ID argument from the 2nd law has more to do with how “unintelligent cause” works.
Then instead of the advertised theory that is expected to explain how "intelligent cause" works:
The theory of intelligent design holds that certain features of the universe and of living things are best explained by an intelligent cause, not an undirected process such as natural selection.
The consumer gets another deceptive "bait and switch" that changes the subject to something else? https://en.wikipedia.org/wiki/Bait-and-switchGaryGaulin
February 20, 2016, 07:29 AM PDT
GaryGaulin #8 I think maybe the ID argument from the 2nd law has more to do with how “unintelligent cause” works. It works in the direction of ... un-work, so to speak, if by "work" we mean something organizational. Unintelligent forces are idle, laggard; they always prefer probable, easy tasks. They hate to organize ex novo: way too much effort.niwrad
February 20, 2016, 07:19 AM PDT
Then niwrad what does that explain about how “intelligent cause” works? Where is your cognitive model for us to test?GaryGaulin
February 20, 2016, 06:31 AM PDT
Hi Granville! After all, evolutionists are our best fun; we should thank them. Think: countless physics textbooks for decades stated that entropy is disorder and tends to increase. Since this counters evolution, today some evolutionists work hard to do an errata corrige on those darned books. You patiently try to explain that systems always tend to go toward probable states, and what do they reply? "No, the 2nd law applies only to heat". My God, but why does heat pass from a hot body to a cold body? Because the state of uniform thermal distribution is more probable. Therefore heat is indeed one example among many showing that systems tend to probable states. But evolutionists are not even aware they shoot themselves in the foot. Too funny.niwrad
February 20, 2016, 05:26 AM PDT
Gordon, I dealt with an objection similar to your second comment in the BioComplexity paper, quoting from it: Bob Lloyd’s primary criticism [7] of my approach was that my “X-entropies” (e.g., “chromium entropy”) are not always independent of each other. He showed that in certain experiments in liquids, thermal entropy changes can cause changes in the other X-entropies. Therefore, he concluded, “the separation of total entropy into different entropies...is invalid.” He wrote that the idea that my X-entropies are always independent of each other was “central to all of the versions of his argument.” Actually, I never claimed that: in scenarios A and B, using the standard models for diffusion and heat conduction, and assuming nothing else is going on, the thermal and chromium entropies are independent, and then statement 1b nicely illustrates the general statement 2b (though I’m not sure a tautology needs illustrating).Granville Sewell
February 20, 2016, 12:14 AM PDT
Gordon Davisson, In the BioComplexity article I stated several times that I was assuming nothing is going on but diffusion (or heat conduction, in the case of thermal entropy). In this simple case, the carbon order cannot increase faster than it is imported, so it illustrates nicely the tautology that "if an increase in order is extremely improbable when a system is isolated, it is still extremely improbable when the system is open, unless something is entering which makes it NOT extremely improbable," which was my main point. So it is not true that ANYTHING can happen in an open system without violating the second law (or at least the general principle behind this law), as long as the increases in order are "compensated" by decreases outside the open system. Anyway, I don't think a tautology really needs illustrating; it stands even without an example.Granville Sewell
February 19, 2016, 11:58 PM PDT
Now, if anyone reading this is interested in how the second law of thermodynamics actually applies to various "entropies", here's a short summary: You can call anything you want "entropy" (for example, Sewell's X-entropies, the Shannon entropy of information theory, etc), but naming things "entropy" doesn't magically make the second law apply to them. The second law applies directly to only one entropy. Unfortunately, different people use different terms for this one entropy; I tend to call it the "thermodynamic" or "total" entropy; some people call it "thermal" entropy, or even other things. The second law applies indirectly to some other entropies because they're related to the thermodynamic(/total/whatever) entropy. But as a rule, if you don't know how some other entropy is related to the thermodynamic entropy, you don't know how (or even if) the second law applies to it. Generally, the way other entropies are related to the thermodynamic entropy is that they're part of it; that is, the thermodynamic entropy is sometimes the total of various partial entropies (which is why I call it the "total" entropy). For instance, the entropy of a classical ideal gas can be broken down into thermal and configurational components (relating to the movement and arrangement of the gas molecules, respectively). Again, the terminology is confusing: in this case, thermodynamic entropy = thermal entropy + configurational entropy. For less idealized systems, you generally don't get such a clean breakdown of the entropy into different components, but you sometimes do get an approximate breakdown. Real gases, for example, are often close enough to ideal that the thermal + configurational breakdown is a good approximation. Liquids, on the other hand, are messier than that. Now, the important thing to realize is that the second law applies only to the thermodynamic (/total) entropy, and it doesn't place any limit on conversion of entropy between types.
For example, if an ideal (or near-ideal) gas is compressed, some of its configurational entropy will be converted into thermal entropy, and it will heat up. If it's allowed to expand, the reverse happens: some of its thermal entropy is converted to configurational entropy as it cools down. If the compression/expansion happens slowly (and close to equilibrium), the conversion efficiency can be arbitrarily close to 100%. That's what's happening in most of my examples above. Sewell's X-entropies are sort of like the configurational entropies (though not close enough to be of any actual use), and in my examples the configurational entropy of the carbon atoms decreases, coupled to an equal-or-larger increase in some other partial entropy, so the total entropy increases (or stays constant) and the actual second law is fully satisfied. Sewell has repeatedly denied that this sort of conversion between entropies makes sense. He doesn't understand how it could work, or even why the different entropies should all have the same units. But reality is not limited to what he understands, and the clear fact is that this sort of conversion happens all the time, all over the place (and if you use the correct formulas instead of his X-entropies, the units do match up). So, let's apply this to the Earth and evolution: the sunlight received by Earth carries about 3.8e13 W/K of entropy, and the thermal (mostly infrared) radiation leaving Earth carries at least 3.7e14 W/K, for a net flux of at least 3.3e14 W/K leaving Earth (see here). There are some other entropy fluxes, but I'm pretty sure they're too small to matter. So, as far as the second law is concerned, the total entropy of Earth could be decreasing at up to 3.3e14 J/K per second, and the second law places no restriction on what form that decrease might take. That's plenty. Mind you, just because something is allowed by the second law doesn't mean it's actually possible; it might well be impossible for some other reason.
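The compression example can be checked with the standard ideal-gas entropy formulas (a sketch with arbitrary numbers; the "configurational" part is nR·ln V and the "thermal" part is nC_v·ln T, per the ideal-gas breakdown above):

```python
import math

# Sketch (made-up numbers): reversible adiabatic compression of a monatomic
# ideal gas converts configurational entropy into thermal entropy with no
# change in the total, i.e. pure conversion between the two partial entropies.
R = 8.314                # gas constant, J/(mol*K)
n = 1.0                  # moles
Cv = 1.5 * R             # monatomic ideal gas
gamma = 5.0 / 3.0

V1, T1 = 1.0, 300.0
V2 = 0.5                                # compress to half the volume
T2 = T1 * (V1 / V2) ** (gamma - 1)      # adiabatic relation: T*V^(gamma-1) = const

dS_config = n * R * math.log(V2 / V1)   # configurational part (negative)
dS_thermal = n * Cv * math.log(T2 / T1) # thermal part (positive)

print(f"dS_config  = {dS_config:+.4f} J/K")
print(f"dS_thermal = {dS_thermal:+.4f} J/K")
print(f"total      = {dS_config + dS_thermal:+.1e} J/K")  # ~0: pure conversion
```

The two partial changes cancel to within floating-point error, so the total (thermodynamic) entropy is unchanged even though each component changed substantially.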
Sewell really really really wants there to be some law of nature that forbids any sort of spontaneous organization, but the second law certainly doesn't do that and he has yet to make a coherent argument for any other law that does either.Gordon Davisson
February 19, 2016, 11:46 PM PDT
Again? When are you going to admit this dog just won't hunt?
6. In several publications, beginning in a 2001 Mathematical Intelligencer letter, I showed that while it is true that thermal entropy can decrease in an open system, it cannot decrease faster than it is exported through the boundary, or stated in terms of “thermal order” (= the negative of thermal entropy), in an open system thermal order cannot increase faster than it is imported through the boundary, and likewise “carbon order” cannot increase faster than it is imported through the boundary, etc.
As I've pointed out a number of times already, your 2001 letter (and subsequent rehashes) only showed this in a single very specific circumstance (diffusion through a uniform solid), and provided no basis at all for the claim that it's true in general. And it's not true in general. In fact, it fails in even slightly different situations:
- Diffusion through a solid *in the presence of gravity*. In this case, lighter components(*) will diffuse toward the top, and denser ones toward the bottom, which can certainly decrease their X-entropies (despite there being no X-entropy flux into or out of the system). (* Actually, it's a little more complicated than that, but it's not worth worrying about here.)
- Similarly, if you had a carbon powder uniformly mixed with air and left it isolated, the powder would settle to the bottom, decreasing its carbon entropy (again, with no carbon-entropy flux).
- Start with a sealed container of propane gas, and cool it. When it gets cool enough, some of the propane will condense out as a liquid, leading to a less uniform distribution of carbon (and a decrease in the carbon entropy). Again, there's no carbon-entropy flux into or out of the system, although there is a thermal entropy flux in this case.
- Mix a magnesium sulfate solution with a solution of sodium carbonate, and isolate it. The two will react to form sodium sulfate and magnesium carbonate. Magnesium carbonate is insoluble, so it precipitates out as a solid... producing a less uniform distribution of carbon, and thus again a decrease in carbon entropy. In an isolated system.
- Start with a uniform block of graphite (or, actually, any other form of carbon), and isolate it completely. If some of the carbon is the C14 isotope, it'll spontaneously decay to nitrogen. Even though the distribution of carbon remains uniform, the carbon entropy will decrease due to the decreasing quantity of carbon.
- Start with a block of graphite (or any other form of carbon), and compress it slightly (say, by 0.01%). (You could increase the pressure, or cool it, or whatever; it doesn't matter for this example.) The carbon entropy change in this case is, at least as far as I can see, undefined. Depending on what volume you integrate over, you either get an (undefined) constant of integration that fails to cancel out, or a carbon-entropy density that diverges to -infinity (in the now-empty-of-carbon volume).
This is just a small sample of the many situations where your derivation completely fails to correspond to reality. You really need to stop trying to pass it off as an actual valid law. (BTW, even if your claim was correct, it wouldn't support intelligent design; it'd imply that creating life required various ordered elements entering and leaving Earth, whether or not there was any intelligence involved.) There are a bunch of other problems with your summary, but if you can't even admit the limitations of your derivation, there's no point in going into subtler issues.Gordon Davisson
February 19, 2016, 11:29 PM PDT
And what does that explain about how “intelligent cause” works? Let's assume that your post had no intelligent cause.Mung
February 19, 2016, 08:23 PM PDT
And what does that explain about how "intelligent cause" works?GaryGaulin
February 19, 2016, 02:04 PM PDT