Uncommon Descent Serving The Intelligent Design Community

Other Types of Entropy


If you look at university physics texts that discuss the second law, you will find examples of “entropy” increases cited, such as books burning, wine glasses breaking, bombs exploding, rabbits dying, automobiles crashing, buildings being demolished, and tornadoes tearing through a town (I have actually seen each of these cited). According to Sal, all of these “creationist” text writers are confused, because in most or all of these cases “entropy” is actually decreasing. When an albatross dies, or a tornado destroys a 747, entropy is actually decreasing, he says. Of course, Sal is talking about “thermal” entropy, since the only formulation of the second law he recognizes as valid is the early Clausius formulation, which deals with thermal entropy alone.

Well, no one is arguing that these examples result in thermal entropy increases; they are examples of “entropy” (disorder) increases of a more general nature. The reason tornadoes can turn a 747 into rubble, and not vice versa, is that of all the configurations its atoms could take, only a very small percentage could fly passengers safely across the country, and a very large percentage could not. Thus we can argue that the original 747 has lower “entropy” (more order) than the demolished machine. Another very confused “creationist,” Isaac Asimov, even wrote in the Smithsonian Magazine:

We have to work hard to straighten a room, but left to itself, it becomes a mess again very quickly and very easily…How difficult to maintain houses, and machinery, and our own bodies in perfect working order; how easy to let them deteriorate. In fact, all we have to do is nothing, and everything deteriorates, collapses, breaks down, wears out—all by itself—and that is what the second law is all about.

There are many formulations of the second law; the later ones recognize a more general principle. For example, Kenneth Ford, in Classical and Modern Physics, writes:

There are a variety of ways in which the second law of thermodynamics can be stated, and we have encountered two of them so far: (1) For an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability, (2) For an isolated system, the direction of spontaneous change is from order to disorder.

Sal says the second law has nothing to do with order or disorder; that is certainly not true of these more general, later formulations.
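Ford’s statistical phrasing is easy to make concrete with a toy model. The sketch below is only an illustration of the general idea, not taken from any text; the box-of-molecules setup and the value N = 100 are assumptions for the example. It counts arrangements of N gas molecules between the two halves of a box and shows why spontaneous change runs toward the more probable, more disordered arrangements:

```python
from math import comb

# Toy model: N identical molecules, each equally likely to be in the left
# or right half of a box, so there are 2^N equally likely microstates.
# The multiplicity of the macrostate "n molecules on the left" is C(N, n).
N = 100

def multiplicity(n: int) -> int:
    """Number of microstates with exactly n molecules in the left half."""
    return comb(N, n)

# Probability of a roughly even split (between 45 and 55 on the left)
p_near_even = sum(multiplicity(n) for n in range(45, 56)) / 2**N
# Probability of the perfectly "ordered" arrangement (all on the left)
p_all_left = multiplicity(N) / 2**N

print(f"P(45 to 55 on the left) ~ {p_near_even:.2f}")   # roughly 0.73
print(f"P(all 100 on the left)  ~ {p_all_left:.1e}")    # roughly 8e-31
```

Even with only 100 molecules the near-even (disordered) splits dominate overwhelmingly; with ~10^23 molecules the disparity is so extreme that “toward the more probable arrangement” and “toward disorder” amount to the same statement.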

OK, so Sal doesn’t like these examples, since they are difficult to quantify; he is not alone. Thermodynamics texts, as opposed to general physics texts, tend to shy away from them for that reason. Goodness knows, if we watch a video of a tornado tearing through a town, it is so difficult to quantify what we are seeing that we can never be sure whether the video is running forward or backward, or whether entropy is increasing or decreasing. But there are other types of “entropy” which are just as quantifiable as thermal entropy. For example, the “X-entropy” which measures disorder in the distribution of any diffusing component X is defined by essentially the same equations used to define thermal entropy, which measures disorder in the distribution of heat as it diffuses; X-entropy is certainly equally quantifiable. And X-entropy has little or nothing to do with thermal entropy (it does not even have the same units); one can increase while the other decreases in a given system. So why do people like Styer, Bunn, and Sal insist on treating all types of entropy as thermal entropy, and attempt to express the entropy associated with evolution, or the entropy of a 747, in units of Joules/degree Kelvin?
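To make the X-entropy idea concrete, here is a minimal numerical sketch. It is only an illustration of the general point; the discretization, the cell count, and the Gibbs-style functional -sum p_i ln p_i are assumptions for the example, not the exact equations of this post or of any paper. A component X diffuses in a closed 1-D domain, and a dimensionless “X-entropy” of the normalized concentration profile rises monotonically as X spreads out:

```python
import math

# Toy 1-D diffusion of a component X on a closed (no-flux) domain.
ncells, D, dt, dx = 50, 1.0, 0.1, 1.0   # assumed grid and diffusion parameters
c = [0.0] * ncells
c[0] = 1.0                               # all of X starts concentrated in one cell

def x_entropy(conc):
    """Gibbs/Shannon-style entropy, -sum p_i ln p_i, of the normalized profile."""
    total = sum(conc)
    s = 0.0
    for v in conc:
        if v > 0:
            p = v / total
            s -= p * math.log(p)
    return s

for step in range(2001):
    if step % 500 == 0:
        print(f"step {step:4d}   X-entropy = {x_entropy(c):.4f}")
    # explicit finite-difference diffusion step with reflecting boundaries
    new = c[:]
    for i in range(ncells):
        left = c[i - 1] if i > 0 else c[i]
        right = c[i + 1] if i < ncells - 1 else c[i]
        new[i] = c[i] + D * dt / dx**2 * (left - 2 * c[i] + right)
    c = new
# The printed values climb from 0 toward ln(50) ~ 3.91 as X spreads out.
```

The quantity has no joules or kelvins in it; it simply measures how spread out X is, which is the sense of “X-entropy” used above.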

If you insist on limiting the second law to applications involving thermal entropy, and insist that the only entropy is thermal entropy, then Sal is right that the second law has little to say about the emergence of life on Earth. But it is not just the “creationists” who apply it much more generally; many violent opponents of ID (including Asimov, Dawkins, Styer, and Bunn) agree that this emergence does represent a decrease in “entropy” in the more general sense. They just argue that the decrease is compensated by increases outside our open system, an argument so widely used that I created the video below, Evolution is a Natural Process Running Backward, to address it a few months ago.

Comments
It is fascinating to see how, as the theory of thermodynamics progressed, the focus of interest shifted from what it is possible for a system to do, to what it is possible for an observer to know about the system. - Jeremy Campbell, Grammatical Man
Mung
September 16, 2012 at 9:43 AM
PS: “Entropic force” is probably a loose description of the strong tendency of a system to move toward the predominant clusters of microstates.
kairosfocus
September 9, 2012 at 10:44 PM
Hi KD,

Glanced at the book; seems okay, though I would have strongly preferred that he did not use unusual symbols for probability -- the cup-cap, if he insisted, would have been good. As it is, the pipe, as in P(A|B), has a standard meaning -- conditional probability -- that becomes confusing. In addition, the premise of equal a priori probabilities in this context will come in for all sorts of objections. (I am not saying they are good, just that there will be endless demands for proof of a postulate of the theory justified in the end by its empirical success . . . ) The derivation of Boltzmann entropy is workable, though Nash's development is far more intuitive. Also, please, for decades now it has been Kelvins, "degrees" having been deprecated. L K Nash's Elements of Statistical Thermodynamics remains a classic teaching tool, and Robertson's Statistical Thermophysics lays out the informational approach that Ben-Naim is using to some extent. KF
kairosfocus
September 9, 2012 at 10:42 PM
Entropy has no power to do anything. It can neither build nor destroy.
Whence then the phrase "entropic force", which is commonplace in scientific literature? Several posters have remarked on the confusion surrounding the topic of entropy: terminology, concepts, etc. This is a case in point.
kdonnelly
September 9, 2012 at 5:06 PM
For those who might be interested, I've found discussions of entropy in the following textbook to be helpful: Thermodynamics & Statistical Mechanics: An intermediate level course, by Richard Fitzpatrick, Associate Professor of Physics, University of Texas at Austin. Freely available in PDF format here. And I second Mung's recommendation of Arieh Ben-Naim's Entropy Demystified. I'm not competent to judge its scientific merits, but the content is accessible and straightforward.
kdonnelly
September 9, 2012 at 4:38 PM
Mung,

Cf here for a case in point (melting of ice); and the OP will help. Entropy tends to disorder systems by degrading -- I would say diffusing, but that word already has a meaning to which entropy applies -- concentrations of energy, and by loosening constraints on configurations. That last part is where it eats up info, hashing it.

As the OP shows, there is a little switcheroo involved in Shannon vs Gibbs. Shannon's metric is average info communicated per symbol, which looks a lot like the other. After Jaynes et al were through with it, it looks like Gibbs entropy is looking at the average MISSING info on the specific microstate compatible with a macrostate given by lab-observable factors.

Where design issues come in is that if we observe a relevant life-functional state, that is macro-observable (in the relevant sense; looking at a few items of micron scale is not like looking at 10^18 - 10^26 molecules at sub-nanometre scale) AND it sharply constrains configs. Indeed, we can bring to bear knowledge of genomes, protein synthesis, etc. that MUST be going on. Compare that with a Humpty Dumpty exercise of pricking open a cell and decanting it into a mini test tube. No function, and the parts -- extra-large molecules -- diffuse all over the place through Brownian motion. Never to be seen back together again.

My nanobots-microjets thought exercise is about putting together a tiny functional system in the teeth of diffusion. The lesson is that there is a need for intelligently directed organising work [IDOW] that -- as S is a state variable -- can be broken into two parts: dS_clump, to bring parts together anyhow, and dS_config, to put them in functional order. Per the state function:

dS_tot = dS_clump + dS_config

And, for the parts, the entropy falls drastically twice, as you clump and then configure; when it is over, the flyable microjet is in a tightly constrained island of function in the space of possible configs of parts. As a crude estimate, let's have 100 * 100 * 100 = 10^6 1-micron location cells vs a 1-l cube with (10^4)^3 = 10^12 cells for diffused parts. So we would have 1 in 10^6 cells with something in it. To get them all next to one another is a task. Similarly, to configure the right part next to the right part in a 3-d array of nodes and arcs, where parts also need to be properly oriented, gives a huge search space again.

The chance-based search mechanism is obvious: diffusion, leading to overwhelming dominance of spread-out states. (Ever done the ink-drop-in-a-beaker exercise?) Even when clumped, non-functional states will dominate over functionally configured ones. And the IDOW to assemble 10^6 parts correctly is well beyond 1,000 bits of info; just the LIST, much less the nodes-and-arcs net, shows that. There is no good reason on the gamut of the observed cosmos to expect such to assemble spontaneously, and that holds even for sub-assemblies. (Sir Fred Hoyle's 747 in a junkyard is overkill; just to put together a D'Arsonval moving-coil instrument on its dashboard, or the instrument-panel clock -- with a nod to Paley -- would be more than enough. As another, imagine the task of our tornado assembling a self-replicating watch.)

Nope: absent something being built into the laws of the cosmos that forces assembly of something like a cell-based living form in ladderlike fashion from molecular components in Darwin's pond or the like, just diffusion alone, much less hydrolysis reactions and the need for gated encapsulation, puts the usual OOL scenarios on the death-watch list. Of course, such a front-loaded cosmos would SCREAM design.

So, entropy turns out to be very relevant, once we see the issue of configurations and macro-micro info gaps. As for types of entropy, I suspect that we are really discussing applications to specific contexts. However, there are also different metrics depending on how sophisticated a mathematical framework you want: Clausius --> Boltzmann --> Gibbs --> von Neumann --> Shannon, and onwards from there. (And yes, von Neumann crops up here again; that joke about the nest of Martians in and around the old Austro-Hungarian Empire has just enough bite to make it pinch.) KF
kairosfocus
September 9, 2012 at 2:00 PM
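The dS_clump part of the bookkeeping sketched in the comment above can be roughed out numerically. This is only an illustrative calculation using the cell counts quoted there (10^6 parts, 10^12 one-micron cells in the litre, 10^6 cells in the assembly region); it treats the parts as identical, so it ignores the further dS_config pruning for getting the right part, with the right orientation and connections, into each place:

```python
from math import lgamma

def ln_W(cells: int, parts: int) -> float:
    """ln of the number of ways to place `parts` identical parts into
    `cells` distinct location cells (at most one per cell): ln C(cells, parts),
    computed with log-gamma to avoid astronomically large integers."""
    return lgamma(cells + 1) - lgamma(parts + 1) - lgamma(cells - parts + 1)

parts = 10**6
dispersed = ln_W(10**12, parts)   # parts scattered through the whole volume
clumped = ln_W(10**6, parts)      # parts confined to the assembly region

print(f"ln W, dispersed: ~{dispersed:.2e}")           # about 1.5e7
print(f"ln W, clumped:   ~{clumped:.2e}")             # 0 -- every cell filled, one way
print(f"dS_clump / k:    ~{clumped - dispersed:.2e}") # a drop of about 1.5e7 k
```

In Boltzmann's units (S = k ln W), the clumping step alone costs on the order of 10^7 k of configurational entropy, before any of the functional-configuration constraints are counted.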
p.s. And if it is the *same* thing that is changing in each case, what does it mean to say that there are "other types of entropy"?
Mung
September 9, 2012 at 12:30 PM
PaV:
Sal’s formula had “delta” S; I didn’t want to bother looking up the code for the delta symbol.
Right you are. I was looking at a quote of something written by Clausius rather than the actual equation. So what does a change in entropy really mean? What is it *really* that is changing?
Mung
September 9, 2012 at 12:29 PM
F/N: Re entropy:
It is the logarithm of the number of energy microstates consistent with the macroscopic state of a thermodynamic system.
This is intended to sum up the Boltzmann expression S = k log W in words. W includes, however, position and momentum degrees of freedom of microparticles (up to six per particle, three each . . . which gets us into phase space) in the system in view. W is the number of ways mass and energy (which is in moving mass and can be partly stored in position with a vibrating particle, etc.) can be distributed consistent with the lab-observable macrostate. Gibbs' expression generalises this, and in so doing the bridge to Shannon's entropy opens up. Cf here. KF
kairosfocus
September 8, 2012 at 4:17 PM
Mung: Why do you refer to S as a change in entropy?
Sal's formula had "delta" S; I didn't want to bother looking up the code for the delta symbol.
PaV
September 8, 2012 at 4:01 PM
DGW: I spoke about the Shannon entropy -- average info per symbol, usually symbolised by H. (To make things even more confusing, it seems Boltzmann used H for entropy; I do not know what Gibbs used. The modern symbol for entropy in thermodynamic contexts is S.) The loose usage in which the term "entropy" is extended to include the info in a message of N symbols clouds the issue and the proper meaning of H. Notice how I have given a concrete example of Shannon's own usage in 1950/51 that makes the matter crystal clear:

The entropy is a statistical parameter which measures, in a certain sense, how much information is produced on the average for each letter of a text in the language. If the language is translated into binary digits (0 or 1) in the most efficient way, the entropy is the average number of binary digits required per letter of the original language. The redundancy, on the other hand, measures the amount of constraint imposed on a text in the language due to its statistical structure, e.g., in English the high frequency of the letter E, the strong tendency of H to follow T or of V to follow Q. It was estimated that when statistical effects extending over not more than eight letters are considered the entropy is roughly 2.3 bits per letter, the redundancy about 50 per cent.

As can be seen here, it is average info per symbol, aka Shannon entropy, which is directly related to the Gibbs entropy metric. The Gibbs entropy metric turns out to be a measure of the average MISSING information needed to specify the microstate actually taken up by a system, where what we know is the macrostate specified by lab-observable conditions and constraints. That is why I have now taken to speaking in terms of MmIG: the macro-micro info gap. This then implies that configuration is a relevant aspect of entropy, thus information. Going further, when we deal with something that exhibits FSCO/I, the observable function sharply constrains possible configs, i.e. we are in a low-entropy, high KNOWN-information context. The only empirically warranted way -- and this is backed up by the needle-in-the-haystack challenge -- to put something complex enough to exhibit FSCO/I into that state is IDOW -- intelligently directed organising work. AKA, Design. All of this brings us back to the significance of FSCO/I as an empirically reliable, tested sign of design. KF
kairosfocus
September 8, 2012 at 5:30 AM
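The "average information per letter" in the Shannon passage quoted above can be estimated for a toy sample. A minimal sketch, assuming an arbitrary sample string and counting only single-letter frequencies (so it ignores the longer-range correlations behind Shannon's ~2.3 bits/letter figure):

```python
from collections import Counter
from math import log2

def bits_per_letter(text: str) -> float:
    """Shannon entropy H = -sum p_i log2 p_i, estimated from the
    single-letter frequencies of the sample text (letters only)."""
    letters = [ch for ch in text.lower() if ch.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    return -sum((k / n) * log2(k / n) for k in counts.values())

sample = ("the entropy is a statistical parameter which measures how much "
          "information is produced on the average for each letter of a text")
print(f"H ~ {bits_per_letter(sample):.2f} bits per letter")
# Single-letter statistics of English give roughly 4.1 bits/letter; Shannon's
# ~2.3 bits/letter comes from also accounting for correlations over runs of
# up to eight letters, which this naive single-letter estimate ignores.
```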
Dr Sewell, you may find the remarks here helpful. KF
kairosfocus
September 8, 2012 at 5:16 AM
kf @29, Don't you mean that H is enthalpy and S is entropy?
dgw
September 7, 2012 at 9:40 PM
I agree with Gordon Davisson in that if it -- the entropy du jour -- does not apply to thermodynamics, then the 2nd law of thermodynamics does not apply to it. However, I still maintain that the fact that there is a 2nd law of thermodynamics is evidence for Intelligent Design.
Joe
September 7, 2012 at 6:03 PM
kf,
Please reserve entropy for H, as per Shannon.
Excellent point. One that I had intended to make (and may still) after a re-read of Sal's initial post (OP in a different thread). http://ada.evergreen.edu/~arunc/texts/cybernetics/weaver.pdf
Mung
September 7, 2012 at 4:41 PM
F/N: Since it is a very useful intro at a basic mathematical level and goes at Amazon for about US$10, L K Nash's Elements of Statistical Thermodynamics is a book I suggest for a basic read. Work through especially the discussion of the Boltzmann distribution in Ch 1. Harry S Robertson's Statistical Thermophysics is also useful on the informational school of thought, but is a stiffer read, and is much more expensive. Go look it up in a library. KF
kairosfocus
September 7, 2012 at 4:30 PM
But you can call anything “entropy”
Just ask Clausius!
Mung
September 7, 2012 at 4:25 PM
Folks:

Entropy has several linked expressions and contexts. The Clausius expression, dS >/= d'Q/T, is just one. Statistical thermodynamics gave us two more: one due to Boltzmann (S = k log W) and the other to Gibbs, which is a weighted average of the missing info required to specify a microstate, given a macrostate compatible with a set of microstates:

S = -k SUM_i [ p_i log p_i ]

Now it turns out that -log p_i is a "natural" metric of info, hence the discussions ever since Shannon put forth his theory and identified avg info per symbol. Jaynes et al have identified this much as I have described: entropy is in effect a metric of the additional -- thus missing -- info needed to specify a microstate, given the macrostate. Or, we could use more traditional terms about degrees of microscopic freedom consistent with a macrostate. Also, since we are missing that info, when we use a system in, say, a heat engine, we have to treat the system as random across the set of microstates -- a heat source, not a mechanical source. (Contrast how efficient a wind turbine can be with a heat engine using air at the same temperature as its hot working fluid.)

The relevance to FSCO/I is that cellular life is an example of macro-identifiable function that locks down state possibilities to a narrow pool rather than a much broader config space. Island of function. A low-entropy, high-information state. One that is not to be "simply" accounted for by making appeals to open systems and flows of energy and/or matter. Tornadoes reliably rip apart aircraft; they don't assemble them from parts. For reasons closely connected to the statistical-informational view of entropy.

The loose usage of "entropy" to denote the info in N symbols of average info per symbol H is potentially misleading. Please reserve entropy for H, as per Shannon. KF

PS: Mung, I have not fundamentally altered my views in any recent time.
kairosfocus
September 7, 2012 at 4:22 PM
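The chain of expressions mentioned in the comment above -- Clausius, Boltzmann, Gibbs, Shannon -- can be set side by side. These are the standard textbook forms, with k the Boltzmann constant and p_i the probability of microstate (or symbol) i; the final line is only the formal bridge between the Gibbs and Shannon measures, not a claim that the two are interchangeable in practice:

```latex
\begin{align*}
  dS &\ge \frac{\delta Q}{T}     && \text{Clausius (equality for a reversible change)}\\
  S  &= k \ln W                  && \text{Boltzmann ($W$ = microstates compatible with the macrostate)}\\
  S  &= -k \sum_i p_i \ln p_i    && \text{Gibbs (reduces to Boltzmann when all $p_i = 1/W$)}\\
  H  &= -\sum_i p_i \log_2 p_i   && \text{Shannon (average information per symbol, in bits)}\\
  S  &= (k \ln 2)\, H            && \text{formal bridge, when the $p_i$ describe microstates}
\end{align*}
```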
From the OP:
If you look at university physics texts which discuss the second law, you will find examples of “entropy” increases cited such as books burning, wine glasses breaking, bombs exploding, rabbits dying, automobiles crashing, buildings being demolished, and tornadoes tearing through a town (I have actually seen each of these cited).
"...there is a great deal of confusion between the description of what happens in a spontaneous process, and the interpretation of entropy. For example, in an expansion process of an ideal gas as described in Chapter 7, it is qualitatively correct to describe what happens by saying that: 1. The system has become more disordered. 2. The particles have spread from a smaller to a larger volume. 3. The total energy of the system has spread into a larger volume. 4. Part of the information we had on the location of the particles was lost in the process. All of these and many more (in particular the mixing in a mixing process depicted in Fig. 1.4) are valid descriptions of what happens in the spontaneous process. The tendency, in most textbooks on thermodynamics, is to use one of these descriptors of what happens in a spontaneous process, to either describe, or to interpret, or even define entropy. Clearly, in order to identify any of these descriptors with entropy, one must first show that the descriptor is a valid one for any spontaneous processes. Second, one must also show that one can relate quantitatively the change in one of these descriptors to changes in entropy. Unfortunately, this cannot be done for any of the listed descriptors." http://www.ariehbennaim.com/books/entropyd.htmlMung
September 7, 2012 at 4:17 PM
Joe:
And what Granville is saying is that [entropy, a mathematical expression] can also be applied to other things, like information, hence Shannon entropy, which does not involve a quantity of heat and the temperature.
Actually, it's the other way around, as kf says. It's Shannon entropy that is the more general of the two.
Mung
September 7, 2012 at 4:02 PM
kf:
in short, we are looking at something connected to loss of access to info about specific microstate given a macrostate.
hi kf, Have your views on the relationship between information and entropy changed? Regards
Mung
September 7, 2012 at 3:56 PM
Entropy Demystified: The Second Law Reduced to Plain Common Sense
Entropy and the Second Law: Interpretation and Misss-Interpretationsss
Mung
September 7, 2012 at 3:54 PM
niwrad, Dr. Sewell was describing the destructive power of a tornado.
Mung
September 7, 2012 at 3:50 PM
PaV:
He tells us that the change in entropy, S, is equal to the integral (sum) over the initial and final values of dQ/T. (N.B. the 'initial' value of dQ/T is always higher than the final value of dQ/T, revealing again, this hidden directionality.)
Potential nitpick. Why do you refer to S as a change in entropy? What was the entropy before, and how did you measure it?
Mung
September 7, 2012 at 3:42 PM
PaV:
This is the critical point: an entropy change always has a direction. This is exactly what the 2LofT tells us, but, we usually lose track of it. Nevertheless, it’s there.
"... the answer to the question of "Why" the process occurs in one direction is probabilistic." - Arieh Ben-NaimMung
September 7, 2012 at 3:36 PM
Entropy has no power to do anything. It can neither build nor destroy.
It depends on what one means by "power". We can say that "entropy destroys" just as we can say that "ignorance damages". Both are "negative" per se (ignorance is "non-knowledge", while entropy is "non-order" and "non-information"). They have no power if by "power" we mean something positive and constructive only. But this "non-knowledge" and "non-order" have results, negative and destructive ones. By "power" (in the negative sense) I had in mind exactly these results.
niwrad
September 7, 2012 at 2:20 PM
Entropy has no power to do anything. It can neither build nor destroy.
Hey, that's very similar to natural selection and neo-darwinian processes. So similar that only common descent can explain it.
Joe
September 7, 2012 at 2:17 PM
But there are other types of “entropy” which are as quantifiable as thermal entropy, the “X-entropy” which measures disorder in the distribution of any diffusing component X is defined by essentially the same equations as are used to define thermal entropy, which measures disorder in the distribution of heat as it diffuses, and X-entropy is certainly equally quantifiable.
But this provides no basis for thinking that the second law applies to "X-entropy". As far as I can see, you've provided only 4 reasons for thinking that the second law applies to these "X-entropies", none valid:

* They're called "entropy". But you can call anything "entropy", and that doesn't imply any connection at all to the second law.
* They're quantifiable. While I would argue that being quantifiable is necessary for the second law to apply, it's hardly sufficient.
* They have a similar mathematical form to thermal entropy (in at least some circumstances). This is suggestive, but hardly a real reason to think the second law applies to them.
* Under some circumstances (diffusion through a solid, as analyzed in your paper) they always increase (unless there's a flux through the boundaries of the system). Again, this is suggestive, but hardly a real argument. In the first place, showing that something holds under some circumstances doesn't show that it holds under all circumstances. In the second place, even if something does hold under all circumstances, it doesn't necessarily have anything to do with the second law.

While examples are not sufficient to establish a universal law, counterexamples are sufficient to refute a (claimed) universal law. As I've pointed out before, there are circumstances where X-entropies decrease despite there being no X-flux through the boundaries of the system. The example I gave in the linked article -- carbon settling to the bottom of a jar -- is hardly the only possible counterexample. In fact, if you'd taken gravity into account in your analysis, you would've found that heavier substances (*) tend to concentrate at the bottom and lighter ones at the top; in both cases, the X-entropy would undergo a spontaneous decrease without any X-flux, violating equation 5 of your paper. (* Actually, heavier vs. lighter isn't quite right -- substances that increase the overall density tend to settle, while those that decrease overall density will tend to rise.)

Do these counterexamples show violations of the second law? Of course not, because the second law (the real second law) does not apply to these X-entropies.
Gordon Davisson
September 7, 2012 at 2:16 PM
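The settling counterexample in the comment above can be illustrated with a back-of-envelope sketch. The numbers are assumptions for the example (20 height cells and a dimensionless gravitational parameter mgH/kT = 5, i.e. a heavy component): start the component uniformly distributed over height, let it relax to the equilibrium profile proportional to exp(-mgz/kT), and compare the Gibbs-style "X-entropy" of the two profiles:

```python
import math

# 20 height cells; dimensionless gravitational parameter a = m*g*H/(k*T) = 5
# (a heavy component, so settling is significant).
ncells, a = 20, 5.0
z = [(i + 0.5) / ncells for i in range(ncells)]       # cell midpoints, 0..1

uniform = [1.0 / ncells] * ncells                     # initial uniform profile
weights = [math.exp(-a * zi) for zi in z]             # equilibrium ~ exp(-m*g*z/kT)
total = sum(weights)
equilibrium = [w / total for w in weights]

def x_entropy(p):
    """Gibbs/Shannon-style entropy -sum p_i ln p_i of a normalized profile."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

print(f"X-entropy, uniform start:       {x_entropy(uniform):.3f}")      # ln(20) ~ 3.00
print(f"X-entropy, settled equilibrium: {x_entropy(equilibrium):.3f}")  # smaller
```

No X crosses the boundary, yet the X-entropy of the profile decreases, which is the comment's point: these X-entropies can decrease spontaneously, so the real second law cannot be a law about them.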
It is the destructive power of entropy, as Dr. Sewell clearly explained.
Entropy has no power to do anything. It can neither build nor destroy.
Mung
September 7, 2012 at 1:23 PM
Mike Elzinga says:
Remember, entropy is simply a name given to a mathematical expression involving the quantity of heat and the temperature.
And what Granville is saying is that it can also be applied to other things, like information -- hence Shannon entropy, which does not involve a quantity of heat and the temperature. Unless, of course, we are talking about superconductors (zero resistance).
It is the logarithm of the number of energy microstates consistent with the macroscopic state of a thermodynamic system.
That means by observing some phenomena we should be able to piece it all back together to see what it was.
Joe
September 7, 2012 at 1:16 PM