Uncommon Descent Serving The Intelligent Design Community

A Designed Object’s Entropy Must Increase for Its Design Complexity to Increase – Part 2


In order for a biological system to gain more biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it, contrary to the intuitions of many creationists and IDists. This essay is Part II of a series that began with Part 1.

The physicist Fred Hoyle famously said:

The chance that higher life forms might have emerged in this way is comparable to the chance that a tornado sweeping through a junkyard might assemble a Boeing 747 from the materials therein.

I agree with that assertion, but that conclusion can’t be formally derived from the 2nd law of thermodynamics (at least not from the forms of the 2nd law stated in many physics and engineering textbooks and used in the majority of scientific and engineering journals). The 2nd law is generally expressed in two forms:

2nd Law of Thermodynamics (THE CLAUSIUS POSTULATE)
No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

or equivalently

2nd Law of Thermodynamics (THE KELVIN PLANCK POSTULATE)

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work

In Part 1, I explored the Shannon entropy of 500 coins. If the coins are made of copper or some other metal, the thermodynamic entropy can be calculated. But let’s have a little fun: how about the thermodynamic entropy of a 747? [Credit Mike Elzinga for the original idea, but I’m adding my own twist.]

The first step is to determine about how much matter we are dealing with. From the manufacturer’s website:

A 747-400 consists of 147,000 pounds (66,150 kg) of high-strength aluminum.

747 Fun Facts

Next we find the standard molar entropy of aluminum (symbol Al). From Enthalpy Entropy and Gibbs we find that the standard entropy of aluminum at 25 degrees Celsius and 1 atmosphere is 28.3 Joules/Kelvin/mole (J/K/mol).

Thus a 747’s thermodynamic entropy based on the aluminum alone is:

S_747 = (66,150,000 g / 26.98 g/mol) * 28.3 J/K/mol ≈ 2.45 * 10^6 mol * 28.3 J/K/mol ≈ 6.94 * 10^7 J/K

Suppose now that a tornado runs into the 747 and tears off pieces of the wings, tail, and engines such that the weight of aluminum in what’s left of the 747 is now only 50,000 kg. Using the same sort of calculation, the entropy of the broken and disordered 747 is:

S_b747 = (50,000,000 g / 26.98 g/mol) * 28.3 J/K/mol ≈ 1.85 * 10^6 mol * 28.3 J/K/mol ≈ 5.24 * 10^7 J/K

Hence the tornado lowers the entropy of the 747 by disordering and removing vital parts!

And even supposing we recovered all the missing parts such that we have the original weight of the 747, the entropy calculation has nothing to say about the functionality of the 747. Hence, the 2nd law, which inspired the notion of thermodynamic entropy has little to say about the design and evolution of the aircraft, and by way of extension it has little to say about the emergence of life on planet earth.

Perhaps an even more pointed criticism in light of the above calculations is that increasing mass in general will increase entropy (all other things being equal). Thus as a system becomes more complex, on average it will have more thermodynamic entropy. For example, a simple empty soda can weighing 14 grams has (using a similar calculation) a thermodynamic entropy of 14.68 J/K, which implies a complex 747 has about 4.7 million times the thermodynamic entropy of a simple soda can. A complex biological organism like an albatross has more thermodynamic entropy than a handful of dirt. Worse, when the albatross dies, it loses body heat and mass, and hence its thermodynamic entropy goes down after it dies!
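
To make the arithmetic easy to check, here is a minimal Python sketch (my own illustration, not part of the original post) that reproduces the three standard-state estimates from the molar mass of aluminum (about 26.98 g/mol) and the 28.3 J/K/mol standard molar entropy; the masses are the ones assumed in the text:

```python
# Illustrative sketch: standard-state entropy of a lump of aluminum,
# S = (mass / molar mass) * standard molar entropy.
S_MOLAR_AL = 28.3    # J/(K*mol), standard molar entropy of Al at 25 C, 1 atm
M_AL = 26.98         # g/mol, molar mass of aluminum

def aluminum_entropy(mass_grams):
    """Standard-state thermodynamic entropy (J/K) of a given mass of aluminum."""
    return (mass_grams / M_AL) * S_MOLAR_AL

for label, grams in [("intact 747", 66_150_000),   # 66,150 kg of Al
                     ("broken 747", 50_000_000),   # 50,000 kg left after the tornado
                     ("soda can", 14)]:            # 14 g empty can
    print(f"{label:>10}: {aluminum_entropy(grams):.3g} J/K")

# Ratio quoted in the text: 747 vs. soda can
print("747 / soda can:", aluminum_entropy(66_150_000) / aluminum_entropy(14))
```

Running it gives roughly 6.94 * 10^7 J/K for the intact 747, 5.24 * 10^7 J/K for the broken one, about 14.7 J/K for the can, and a ratio of about 4.7 million, matching the figures above.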

So the major point of Part II is that a designed object’s thermodynamic entropy often increases with the increasing complexity of the design, for the simple reason that it has more parts and hence more mass. And as was shown in Part 1, the Shannon entropy also tends to increase with the complexity of the design. Hence, at least two notions of entropy (Shannon and thermodynamic) can increase with increased complexity of a design (be it a man-made design, an evolution-made design, or ….)

This concludes the most important points I wanted to get across. Below is merely an exploration of some of the fundamentals of thermodynamics for readers interested in some of the technical details of thermodynamics and statistical mechanics. The next section can be skipped at the reader’s discretion since it is mostly an appendix to this essay.
========================================================================
THERMODYNAMICS AND STATISTICAL MECHANICS BASICS

Classical thermodynamics can trace some of its roots to the work of Carnot in 1824 during his quest to improve the efficiency of steam engines. In 1865 we have a paper by Clausius that describes his conception of entropy. I will adapt his formula here:

ΔS = Q / T (for heat Q transferred reversibly at a constant temperature T)

Where S is entropy, Q is heat, and T is temperature. Perhaps to make the formula more accessible, let us suppose we have a 1000 watt heater running for 100 seconds that contributes to the boiling of water (already at 373.2ᵒK). What is the entropy contribution due to this burst of energy from the heater? First I calculate the amount of heat energy input into the water:

Q = 1000 watts * 100 seconds = 100,000 Joules

Using Clausius’ formula, and the fact that the process is isothermal, I then calculate the change of entropy in the water as:

ΔS = Q / T = 100,000 J / 373.2 K ≈ 268 J/K
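
As a quick check of that bookkeeping, here is a tiny Python sketch (an illustration only, assuming all of the heater’s output ends up in the water):

```python
# Clausius bookkeeping for the boiling-water example: dS = Q / T.
P = 1000.0   # watts (heater power)
t = 100.0    # seconds (running time)
T = 373.2    # kelvins (boiling water, held isothermal)

Q = P * t    # heat delivered to the water, in joules
dS = Q / T   # entropy added to the water, in J/K

print(Q)     # 100000.0 J
print(dS)    # ~268 J/K
```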

So how does all this relate to Boltzmann and statistical mechanics? There was the intuition among scientists that thermodynamics could be related to classical (Newtonian) mechanics. They suspected that what we perceived as heat and temperature could be explained in terms of mechanical behaviors of large numbers of particles, specifically the statistical aspects of these behaviors, hence the name of the discipline is statistical mechanics.

A system of particles in physical space can be described in terms of the positions and momenta of the particles. The state of the entire system of particles can be expressed as a location in a conceptual Phase Space. We can slice up this conceptual phase space into a finite number of chunks because of the Liouville Theorem. These sliced-up chunks correspond to the microstates in which the system can be found, and furthermore the probability of the system being in a given microstate is the same for each microstate (equiprobable). Boltzmann made the daring claim that taking the logarithm of the number of microstates is related to the entropy Clausius defined for thermodynamics. The modern form of Boltzmann’s daring assertion is:

S = kB ln Ω

where Ω is the number of microstates of the system, S is the entropy, and kB is Boltzmann’s constant. Using Boltzmann’s formula we can then compute the change of entropy:

ΔS = S_final - S_initial = kB ln Ω_final - kB ln Ω_initial = kB ln(Ω_final / Ω_initial)

As I pointed out, Boltzmann’s equation looks hauntingly similar to Shannon’s entropy formula for the special case where the microstates of a Shannon information system are equiprobable.
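
To illustrate that similarity with a concrete number (my own sketch, not from the essay): for the 500 coins of Part 1, all 2^500 microstates are equiprobable, and the Shannon and Boltzmann expressions differ only by Boltzmann’s constant and the base of the logarithm:

```python
import math

k_B = 1.380649e-23    # J/K, Boltzmann's constant
omega = 2 ** 500      # number of equiprobable microstates for 500 two-state coins

H_shannon = math.log2(omega)          # Shannon entropy in bits -> 500.0
S_boltzmann = k_B * math.log(omega)   # Boltzmann entropy in J/K -> ~4.78e-21

print(H_shannon)
print(S_boltzmann)
print(k_B * math.log(2) * H_shannon)  # same as S_boltzmann: k_B * ln(2) per bit
```

In other words, each bit of Shannon entropy corresponds to kB ln 2 ≈ 9.57 * 10^-24 J/K in Boltzmann’s units.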

Around 1877 Boltzmann published his paper connecting thermodynamics to statistical mechanics. This was the major breakthrough that finally bridged the heretofore disparate fields of thermodynamics and classical mechanics.

Under certain conditions we can relate Clausius’ notion of entropy to Boltzmann’s notion of entropy, and thus the formerly disparate fields of thermodynamics and classical mechanics are bridged. Here is how I describe symbolically the special case where Clausius’ notion of entropy agrees with Boltzmann’s notion of entropy:

ΔS = Q / T = kB ln(Ω_final / Ω_initial)

[It should be noted, the above equality will not always hold.]

Mike Elzinga and I had some heated disagreement on the effect of spatial configuration on entropy. Perhaps to clarify: the colloquial notion of disordering things does not change the thermodynamic entropy (like taking a 747 and disordering its parts; as long as we have the same matter, it has the same thermodynamic entropy). But that’s not to say that changes in volume (which is a change in spatial configuration) won’t affect the entropy calculations. This can be seen in the formula for the entropy of an ideal monoatomic gas (the Sackur-Tetrode equation):

S = kB N [ ln( (V/N) * (m E / (3 π ℏ^2 N))^(3/2) ) + 5/2 ]

where
S is the entropy
N is the number of atoms
m is the mass of a single atom
kB is Boltzmann’s constant
V is the volume
E is the internal energy
ℏ = Dirac Constant (reduced Planck’s constant)

From this we can see that increasing the volume the gas occupies, the energy of the gas, or the number of particles in the gas will (other things being equal) increase the entropy. Of course this holds only within reasonable limits: if the volume is too large the particles can no longer exchange energy with one another, notions of what defines equilibrium begin to get fuzzy, etc.

Nowhere in this calculation are notions of “order” explicitly or implicitly identified, and hence such notions are inessential and possibly misleading to the understanding of entropy.
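
For readers who want to see those dependencies numerically, here is a small sketch (my own, with argon assumed as the gas) that evaluates the Sackur-Tetrode equation for one mole at room temperature and shows the entropy rising when the volume or the internal energy is increased:

```python
import math

k_B = 1.380649e-23       # J/K, Boltzmann's constant
hbar = 1.054571817e-34   # J*s, Dirac constant (reduced Planck's constant)
N_A = 6.02214076e23      # atoms per mole
m_Ar = 39.948e-3 / N_A   # kg, mass of one argon atom (argon is an assumption for illustration)

def sackur_tetrode(N, V, E, m):
    """Entropy (J/K) of N monatomic ideal-gas atoms of mass m in volume V (m^3) with internal energy E (J)."""
    h = 2 * math.pi * hbar   # ordinary Planck constant
    return N * k_B * (math.log((V / N) * (4 * math.pi * m * E / (3 * N * h**2)) ** 1.5) + 2.5)

N = N_A                        # one mole of atoms
T = 298.15                     # K
V = N * k_B * T / 101325.0     # m^3, ideal-gas volume at 1 atm
E = 1.5 * N * k_B * T          # J, internal energy of a monatomic ideal gas

print(sackur_tetrode(N, V, E, m_Ar))       # ~155 J/K, near argon's tabulated standard molar entropy
print(sackur_tetrode(N, 2 * V, E, m_Ar))   # larger volume -> entropy up by N * k_B * ln 2
print(sackur_tetrode(N, V, 2 * E, m_Ar))   # larger energy -> entropy up as well
```

The first value comes out near 155 J/K, close to the tabulated standard molar entropy of argon, which is a reassuring sanity check on the formula as written above.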

How the Sackur-Tetrode formula is derived is complicated, but if one wants to see how entropy can be calculated for simpler systems, Mike Elzinga provided a pedagogical concept test where the volume of the system is fixed and small enough that the particles are close enough to interact. The volume is not relevant in his examples, so the entropy calculations are simpler.

I went through a couple of iterations to solve the problems in his concept test. His test and my two iterations of answers (with help from Olegt on discrete math) are here:
Concept test attempt 1: Basic Statistical Mechanics

and
Concept test amendments: Purcell Pound

Acknowledgements
Mike Elzinga, Olegt, Elizabeth Liddle, Andy Jones, Rob Sheldon, Neil Rickert, the management, fellow authors and commenters at UD and Skeptical Zone.

[UPDATE 9/7/2012]
Boltzmann

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Comments
SC: Please, pause and see what your entropy estimate is based on, so much Al at a given P, T etc.
So are my numbers correct given P, T, etc.? I'm not asking about CSI, IC, FSCO/I, IDOW, SFOD-D, MmIG, WMDs, MIGs, BUFFs, AWACS, VLSI, DicNavAb, etc. I'm asking about the standard state entropy of the Aluminum content of: A. 747 B. Broken 747 C. Soda can A simple yes or no would be helpful to everyone. You've been very verbose, and I'm not asking you to print more than 3 characters for a response of "yes", 2 characters for a response of "no", and 12 characters to say "I don't know". You don't have to print a dissertation that doesn't answer the question I pose. If you don't want to answer the question, say so. "I don't want to answer the question. I want to talk about something else." (that would be 73 characters for a response). I'll accept that but please offer me the courtesy of saying that you prefer to talk about something else rather than answering a question I've posed more than a few times in this discussion.scordova
September 6, 2012 at 08:50 AM PDT
SC: Please, pause and see what your entropy estimate is based on, so much Al at a given P, T etc. It is only telling one part of the story for a 747 or a broken 747 or an equal mass of soda cans, etc, indeed while useful for chem eng, the values are not addressing a serious associated issue. Where IDOW is also a highly relevant consideration in how we get TO a 747. That is what I am highlighting. And I have taken time to discuss the information issues tied to the Gibbs entropy metric, which is the context of Shannon entropy. A half-story can be doubly misleading precisely because so far as it goes it tells a compelling tale. But, we need to hear the rest of the story, which is where Jaynes et al (including Brillouin and Szilard) come in. BTW, did Jayne come up in your earlier discussions? If so, how and if not, why not; given that the info-entropy bridge is at the pivot of the matter? In addition, I think there is a crucially distorting loose usage of "entropy" in how you are arguing. Kindly note the way Shannon used it, and how specifically that usage ties to the Gibbs metric. KFkairosfocus
September 6, 2012 at 08:26 AM PDT
Pardon a text chunk, but there is a very bad habit of failing to read linked materials:
Feel free to post what you feel is important. Thank you for contributing. But do you have a different set of thermodynamic entropy numbers than the ones I posted for: 1. 747 2. Broken 747 3. soda can If you don't think you can arrive at them using all the materials you've provided, please say so. I surely can't seem to get a different set of numbers based on what you said. Do you agree with my numbers for these three objects? A simple: A. Yes B. No C. Don't know would suffice rather than large chunks of text.scordova
September 6, 2012 at 07:21 AM PDT
All systems of equal weight of aluminum could not possibly have the same entropy
For the sake of simplicity, let us assume the same material was used to configure completely different structures. Let us assume we are using a billion aluminum coins (perhaps with serial numbers to make them distict). We can configure the coins to correspond to the ascii representation of some passage in literature or we could configure the coins to a random sequence. Thermodynamic entropy would be the same in each case. Even though one configuration (that which corresponded to a literary passage) would be obviously designed. All I'm saying is one should not use thermodynamics to try to infer design, it's the wrong tool. One needs a different set of lenses (figuratively speaking) to see design, not thermodynamics. Even Shannon entropy in and of itself is insufficient. What one needs are independent specifications to discern design. We intuitively carry some of these specifications in our minds (like the specification of "all heads"), but some specifications are more subtle. But the point of my essays was to help readers understand what entropy really is. The fact that it may or may not help us make design inference is actually secondary to the essay. If you come away from this essay and realize that the 2nd law of thermodynamics or that thermodynamic entropy won't help us make design inferences, then I feel I've succeeded in communicating my point. My point is we have to use other means to infer design than looking at thermodynamic entropy numbers or using the 2nd law of thermodynamics.scordova
September 6, 2012 at 07:16 AM PDT
KF at 24, I think that is what I was trying to say.butifnot
September 6, 2012 at 07:09 AM PDT
And even supposing we recovered all the missing parts such that we have the original weight of the 747, the entropy calculation has nothing to say about the functionality of the 747. Hence, the 2nd law, which inspired the notion of thermodynamic entropy has little to say about the design and evolution of the aircraft, and by way of extension it has little to say about the emergence of life on planet earth.
Sal your approach appears incorrect, for me. It is reminiscent of "a royal flush is no more unlikely than any other hand". There is something different about a lump of aluminum and a 747, surely the entropy, being the result of the work done and information imparted to it. The system of interest is the aluminum plus the source of the work and information which brought it to its current state. All systems of equal weight of aluminum could not possibly have the same entropy. The entropy has not been calculated correctly, terms are missing and not accounted for. Perhaps the boundary should be different. Also the relevance of Shannon information is questionable, in my opinion.butifnot
September 6, 2012 at 06:52 AM PDT
Um, you're completely ignoring the fact that the system can include its own error checking and self-correction. 747s do a lot of that. Soda cans do not. It's one of the reasons that complex systems are worth the trouble and expense over cheap, simple systems. But I guess I'm missing the point of discussing a non-living system (e.g., pennies in a box) as a means of understanding living systems. Biologic systems do a whole lot of "healing" as soon as they detect that "something's gone out of skew on treadle". What features of the penny-box system "heal" the pennies?mahuna
September 6, 2012 at 06:19 AM PDT
F/N: 504 bits is also the amount of info storable in a D/RNA chain of 252 bases, corresponding to 84 3-letter codons, for a string of AA's after transcription processing, translation and assembly. If we include a stop codon and the fact that as standard the start puts up Methionine, we are back to 82 variable AAs in a chain of 83.kairosfocus
September 6, 2012 at 06:03 AM PDT
F/N 2: To make it clear what sort of thing we are talking about, the following from the above post is 72 characters or 504 ASCII bits, including spaces:
F/N: I was just thinking about R/DNA strands, let’s go to 504 bit string
PS: 72 7-bit characters is 504 bits.kairosfocus
September 6, 2012 at 05:57 AM PDT
F/N: I was just thinking about R/DNA strands, let's go to 504 bit strings for BB as that is also 7 bits * 72 characters, and let me represent the push-button on the LH too: ))--> || BLACK BOX || --> 504 bit stringkairosfocus
September 6, 2012 at 05:50 AM PDT
Pardon a text chunk, but there is a very bad habit of failing to read linked materials:
NANOBOTS AND MICRO-JETS i] Consider the assembly of a Jumbo Jet, which requires intelligently designed, physical work in all actual observed cases. That is, orderly motions were impressed by forces on selected, sorted parts, in accordance with a complex specification. (I have already contrasted the case of a tornado in a junkyard that it is logically and physically possible can do the same, but the functional configuration[s] are so rare relative to non-functional ones that random search strategies are maximally unlikely to create a flyable jet, i.e. we see here the logic of the 2nd Law of Thermodynamics, statistical thermodynamics form, at work. [Intuitively, since functional configurations are rather isolated in the space of possible configurations, we are maximally likely to exhaust available probabilistic resources long before arriving at such a functional configuration or "island" of such configurations (which would be required before hill-climbing through competitive functional selection, a la Darwinian natural Selection could take over . . . ); if we start from an arbitrary initial configuration and proceed by a random walk.]) ii] Now, let us shrink the Hoylean example, to a micro-jet so small [~ 1 cm or even smaller] that the parts are susceptible to Brownian motion, i.e they are of about micron scale [for convenience] and act as "large molecules." (Cf. "materialism-leaning 'prof' Wiki's" blowing-up of Brownian motion to macro-scale by thought expt, here; indeed, this sort of scaling-up thought experiment was just what the late, great Sir Fred was doing in his original discussion of 747's.) Let's say there are about a million of them, some the same, some different etc. In principle, possible: a key criterion for a successful thought experiment. Next, do the same for a car, a boat and a submarine, etc. iii] In several vats of "a convenient fluid," each of volume about a cubic metre, decant examples of the differing mixed sets of nano-parts; so that the particles can then move about at random, diffusing through the liquids as they undergo random thermal agitation. iv] In the control vat, we simply leave nature to its course. Q: Will a car, a boat a sub or a jet, etc, or some novel nanotech emerge at random? [Here, we imagine the parts can cling to each other if they get close enough, in some unspecified way, similar to molecular bonding; but that the clinging force is not strong enough at appreciable distances [say 10 microns or more] for them to immediately clump and precipitate instead of diffusing through the medium.] ANS: Logically and physically possible (i.e. this is subtler than having an overt physical force or potential energy barrier blocking the way!) but the equilibrium state will on statistical thermodynamics grounds overwhelmingly dominate — high disorder. Q: Why? A: Because there are so many more accessible scattered state microstates than there are clumped-at -random state ones, or even moreso, functionally configured flyable jet ones. (To explore this concept in more details, cf the overviews here [by Prof Bertrand of U of Missouri, Rolla], and here -- a well done research term paper by a group of students at Singapore's NUS. I have extensively discussed on this case with a contributer to the ARN known as Pixie, here. Pixie: Appreciation for the time & effort expended, though of course you and I have reached very different conclusions.) 
v] Now, pour in a cooperative army of nanobots into one vat, capable of recognising jet parts and clumping them together haphazardly. [This is of course, work, and it replicates bonding at random. "Work" is done when forces move their points of application along their lines of action. Thus in addition to the quantity of energy expended, there is also a specificity of resulting spatial rearrangement depending on the cluster of forces that have done the work. This of course reflects the link between "work" in the physical sense and "work" in the economic sense; thence, also the energy intensity of an economy with a given state of technology: energy per unit GDP tends to cluster tightly while a given state of technology and general level of economic activity prevail. (Current estimate for Montserrat: 1.6 lbs CO2 emitted per EC$ 1 of GDP, reflecting an energy intensity of 6 MJ/EC$, and the observation that burning one US Gallon of gasoline or diesel emits about 20 lbs of that gas. Thereby, too, lies suspended much of the debate over responses to feared climate trends (and the ironies shown in the 1997, Clinton era Byrd-Hagel 95-0 Senate resolution that unless certain key "developing" nations also made the sacrifice, the US would not sign to the Kyoto protocol [they refused to amend the draft to include non-Annex I countries, and the US has refused to sign; signatories then have gone on to bust the required emissions cuts . . .], but that bit of internationalist "folly-tricks" and spin-doctoring is off topic, though illuminating on the concept of work and how it brings the significance of intelligent direction to bear on energy flows once we get to the level of building complicated things that have to function . . .)] Q: After a time, will we be likely to get a flyable nano jet? A: Overwhelmingly, on probability, no. (For, the vat has ~ [10^6]^3 = 10^18 one-micron locational cells, and a million parts or so can be distributed across them in vastly more ways than they could be across say 1 cm or so for an assembled jet etc or even just a clumped together cluster of micro-parts. [a 1 cm cube has in it [10^4]^3 = 10^12 cells, and to confine the nano-parts to that volume obviously sharply reduces the number of accessible cells consistent with the new clumped macrostate.] But also, since the configuration is constrained, i.e. the mass in the microjet parts is confined as to accessible volume by clumping, the number of ways the parts may be arranged has fallen sharply relative to the number of ways that the parts could be distributed among the 10^18 cells in the scattered state. (That is, we have here used the nanobots to essentially undo diffusion of the micro-jet parts.) The resulting constraint on spatial distribution of the parts has reduced their entropy of configuration. For, where W is the number of ways that the components may be arranged consistent with an observable macrostate, and since by Boltzmann, entropy, s = k ln W, we see that W has fallen so S too falls on moving from the scattered to the clumped state. vi] For this vat, next remove the random cluster nanobots, and send in the jet assembler nanobots. These recognise the clumped parts, and rearrange them to form a jet, doing configuration work. (What this means is that within the cluster of cells for a clumped state, we now move and confine the parts to those sites consistent with a flyable jet emerging. That is, we are constraining the volume in which the relevant individual parts may be found, even further.) 
A flyable jet results — a macrostate with a much smaller statistical weight of microstates. We can see that of course there are vastly fewer clumped configurations that are flyable than those that are simply clumped at random, and thus we see that the number of microstates accessible due to the change, [a] scattered --> clumped and now [b] onward --> functionally configured macrostates has fallen sharply, twice in succession. Thus, by Boltzmann's result s = k ln W, we also have seen that the entropy has fallen in succession as we moved from one state to the next, involving a fall in s on clumping, and a further fall on configuring to a functional state; dS tot = dSclump + dS config. [Of course to do that work in any reasonable time or with any reasonable reliability, the nanobots will have to search and exert directed forces in accord with a program, i.e this is by no means a spontaneous change, and it is credible that it is accompanied by a compensating rise in the entropy of the vat as a whole and its surroundings. This thought experiment is by no means a challenge to the second law. But, it does illustrate the implications of the probabilistic reasoning involved in the microscopic view of that law, where we see sharply configured states emerging from much less constrained ones.] vii] In another vat we put in an army of clumping and assembling nanobots, so we go straight to making a jet based on the algorithms that control the nanobots. Since entropy is a state function, we see here that direct assembly is equivalent to clumping and then reassembling from a random “macromolecule” to a configured functional one. That is: dS tot (direct) = dSclump + dS config. viii] Now, let us go back to the vat. For a large collection of vats, let us now use direct microjet assembly nanobots, but in each case we let the control programs vary at random a few bits at a time -– say hit them with noise bits generated by a process tied to a zener noise source. We put the resulting products in competition with the original ones, and if there is an improvement, we allow replacement. Iterate, many, many times. Q: Given the complexity of the relevant software, will we be likely to for instance come up with a hyperspace-capable spacecraft or some other sophisticated and un-anticipated technology? (Justify your answer on probabilistic grounds.) My prediction: we will have to wait longer than the universe exists to get a change that requires information generation (as opposed to information and/or functionality loss) on the scale of 500 – 1000 or more bits. [See the info-generation issue over macroevolution by RM + NS?] ix] Try again, this time to get to even the initial assembly program by chance, starting with random noise on the storage medium. See the abiogenesis/ origin of life issue? x] The micro-jet is of course an energy converting device which exhibits FSCI, and we see from this thought expt why it is that it is utterly improbable on the same grounds as we base the statistical view of the 2nd law of thermodynamics, that it should originate spontaneously by chance and necessity only, without agency. xi] Extending to the case of origin of life, we have cells that use sophisticated machinery to assemble the working macromolecules, direct them to where they should go, and put them to work in a self-replicating, self-maintaining automaton. 
Clumping work [if you prefer that to TBO’s term chemical work, fine], and configuring work can be identified and applied to the shift in entropy through the same s = k ln W equation. For, first we move from scattered at random in the proposed prebiotic soup, to chained in a macromolecule, then onwards to having particular monomers in specified locations along the chain -- constraining accessible volume again and again, and that in order to access observably bio-functional macrostates. Also, s = k ln W, through Brillouin, TBO link to information, viewed as "negentropy," citing as well Yockey-Wicken’s work and noting on their similar definition of information; i.e this is a natural outcome of the OOL work in the early 1980's, not a "suspect innovation" of the design thinkers in particular. BTW, the concept complex, specified information is also similarly a product of the work in the OOL field at that time, it is not at all a "suspect innovation" devised by Mr Dembski et al, though of course he has provided a mathematical model for it. [ I have also just above pointed to Robertson, on why this link from entropy to information makes sense — and BTW, it also shows why energy converters that use additional knowledge can couple energy in ways that go beyond the Carnot efficiency limit for heat engines.]
So, it is evident that the micro level analytical view is capturing something that the macro view is not, something that is highly relevant if we are concerned to accurately understand what is going on. In particular, configuration is important, and there is such a thing as organisation that is functional that requires to be accounted for, in the face of the overwhelming number of possible states of component parts. if these components are left to the blind forces of mechanical necessity and chance contingencies, the sheer scope of the space of possible configs is so large that such a blind sampling process -- on the gamut of a vat, of a planet or a solar system or the observed cosmos -- cannot be reasonably expected to turn up anything but the overwhelming bulk of the distributions of possible states. In particular, Al is normally found in the form of bright red-purple earth in my native land, Jamaica, that proverbially cannot grow grass, only airplanes. (There is a famous cable on the results of the analysis of the soil sample as to why certain land was so poor for growing grass to feed cattle.) Left to natural forces of geology at planetary scale over eons, the predictable outcome would be more of same: unproductive soil, eventually washed into the sea and diffused or settling as sedimentary deposits therein. And perhaps reforming much of the same through tectonic forces. To get the Jumbo Jet, IDOW on the massive scale had to be injected. Bauxite mines, railroads, Alumina refineries to get an intermediate product, more railroads and a Port Kaiser, shipping, onward Al refineries tuned to the particular ore from Jamaica, etc etc, then processing into Alloys and component parts, through an advanced economy. At every stage there is work going on, and that work is shifting the configurations of the Al atoms that will eventually be part of that Jumbo Jet. Work, dominated by intelligence towards complex function. Please do not tell me that that work of configuration is not relevant to the overall thermodynamics account, in the teeth of the above case. And, the scaled down case of parts to make a micro-jet through diffusion vs nanobots helps us see why that is a reasonable insistence. Simply because for particular purposes, we focus on specific aspects of the entropy account and set system boundaries for convenience, does not change that overall picture. Yes, the analysis has been indicated only in outline, and there is no prospect at this stage of generating a numerical value for the total entropy change involved in assembling a Jumbo Jet, but it is nonetheless real and it is reasonable to sketch in a rough outline that allows us to see more clearly cases where the same issues are far more important to our main concerns. (The same BTW, routinely happens in Economics, and it is the failure of reckoning with the overall picture that so often leads to economic fallacies and costly policy blunders.) I trust that his will help us clarify our thinking. KFkairosfocus
September 6, 2012 at 04:03 AM PDT
Folks: The first key thing to note from the above and the previous post by SC, is that there is a debate on the nature of several thermodynamics and information theory concepts. One that is not well known outside physics, and one that in part is not even well known inside physics. IIRC, Robertson in his preface comments on how there are circles in the schools of thought, and they often do not communicate one to the other. The next thing to realise is that because these issues are closely connected to the debates on design, we are going to have partisan tactics coming in. One of these is the insistence on trying to decouple thermodynamics and information issues. In reply, I have noted back to Gilbert N Lewis -- yes, THAT G N Lewis -- it has been understood that entropy of a system, from a statistical thermodynamics perspective, is a measure of missing info on the specific microstate if what one has is the info on the observable macrostate. This leads to an "absolute" definition of entropy that is connected to information. Hence the significance of the Boltzmann and Gibbs formulations as opposed to the classical change of system state measures that are used relative to an initial state. Here is Lewis, again, as I have clipped a couple of times over the past day or so:
"Gain in entropy always means loss of information, and nothing more" [this can be given analytical teeth in the following terms: " in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate." Where also such a case is in effect WLOG as we can reduce analogue to digital/discrete state]]
It also -- as the 500 coins in a string black box example shows -- highlights why there is a reason to connect a high entropy state to chaos or disorder. An initially arranged string of coins allowed to change state at random or spontaneously, through shaking or the equivalent, will -- per the balance of the distribution of possibilities -- strongly tend to a state near 50:50 H/T in no particular meaningful order. Now, too, we know that a string structure, with certain rules of arrangement, can be used to store the description of any material whatsoever. For, the string case (with room for a long enough string and associated rules of interpretation) can be used to set up a description on nodes and arcs, and how they are arranged to form a whole. This is essentially what something like AutoCad does. For onward discussion, it would be helpful to adjust the coins in a box case to a similar BB with a red button that on being pressed sends out a digital string of 502 bits:
|| BLACK BOX || --> 502 bit string
Here, our access to the internal state is the emitted string, which we can observe on pressing the button. Initially, say it emits 1010 . . . , then after a time we find it decaying away from this and eventually consistently emitting bits that fit the usual binomial distribution with 50/50 probability of H/T. We may reasonably state that the BB has undergone a rise in entropy, consistent with moving from an internally ordered state to a random one. Then, we come back again and find that this system emits the first 72 or so ASCII characters for the words of this post. We would find the explanation that this is a matter of simple chance incredible and would ascribe the behaviour to intelligence. That is, somehow there has evidently been intelligently directed organising work, not mere ordering similar to crystallisation. Our BB has emitted three classes of string:
[a] ORDER -- 101010 . . . [b] RANDOM -- 1's and 0's in no particular order, consistent with an applicable random distribution [c] ORGANISATION -- 1's and 0's consistent with an intelligent message
Since it is a BB, we have no independent access to the internal state, but we have reason to see that there were three clusters of possible states, which would require distinct explanations on what could be going on in BB. Order is associated with states like a crystal forming on mechanical necessity (and which as we see can be simply described, i.e compressed), randomness with decay of order or with disorder (which will indeed be hard to compress from description but by citing the actual string), and there is a third possibility, organisation that is functional and specific as well as complex, which is resistant to compression but not quite as resistant as sheer disorder would be. APPLYING TO THE 747 DESTROYED BY TORNADO EXAMPLE, we can see an interesting divergence in perspectives and system definition that becomes highly relevant. Let us clip SC from the OP:
Suppose now that a tornado runs into 747 and tears of pieces of the wings, tail, and engines such that the weight of aluminum in what’s left of the 747 is now only 50,000 kg. Using the same sort of calculation, the entropy of the broken and disordered 747 is: S_b747 = 5.24 * 10^7 J/K Hence the tornado lowers the entropy of the 747 by disordering and removing vital parts! And even supposing we recovered all the missing parts such that we have the original weight of the 747, the entropy calculation has nothing to say about the functionality of the 747. Hence, the 2nd law, which inspired the notion of thermodynamic entropy has little to say about the design and evolution of the aircraft, and by way of extension it has little to say about the emergence of life on planet earth.
There is a fundamental error in this, driven by failing to assess the significance of micro vs macro descriptions and the gap between classical and statistical formulations of thermodynamics concepts. Now, in the case of the example of the 747 disarranged by the tornado, we are seeing that something has been left out of the reckoning, because in the calculation, the rearrangement of components away from a functional state was missed out in the discussion. This was done by allowing the exclusion of issues connected to arrangement into a functional whole, and by the error of allowing the calculation to set a system boundary that would miss the loss of parts. In particular, we should note that the metric used, which gathers the component of entropy relative to a defined initial state used in relevant tables that is connected tot he average random thermal vibrations of the Al atoms and the like, was shifted not by what was happening to this component of the entropy account, but by simply failing to keep balanced books. if the Al was of the same temp throughout, the Al would have the same value throughout, or at any rate at the beginning and end of the story. For, the Al belonging to the 747 was the same quantity and at the same temp. But this is not the only relevant component. TO BUILD A 747 OUT OF ITS COMPONENT PARTS, A LOT OF IDOW HAD TO BE PUT IN AND THERE IS NO PLAUSIBLE REASON THAT THIS END COULD HAVE BEEN EFFECTED BY SIMPLE INJECTION OF ENERGY. That is, Sir Fred's example of the utter implausibility of a tornado assembling a 747 out of parts, is obvious. Likewise, those who have to pay Boeing's assembly plant power and fuel bills know that a lot of energy went into building the jumbo, energy that was intelligently directed according to a plan, and which shifted the configuration of the Al etc atoms to a config E_747x, that was recognisably functional in a highly specific way, in a special zone of function T from a space of possible configs of the same atoms W. Also, the designers know full well that given inevitable tolerances etc, there is no one state for a functional 747, there is a zone of related configs, T that will fulfill the function, and something out of that narrow zone will not fly. But equally, raw energy -- as opposed to IDOW -- tends to rip apart and disorganise the complex, functionally specific system. That is what the imaginary tornado did. And assuming the same ambient conditions, the entropy metric for the Al will be the same throughout, per SC's calc: 6.94*10^7 J/K. (BTW, the degree symbol for Kelvins is archaic usage.) That is also why my own thought exercise in my always linked note is relevant, on microjets in a vat with parts small enough to suffer diffusion. [ . . . ]kairosfocus
September 6, 2012 at 04:02 AM PDT
EDTA: I think all that Sal says is quite correct (except maybe for the second law of thermodinamics, on which I will briefly comment later), but perhaps the form in which he says it can confound somebody here. Just to try to help, I would go back to the famous 500 coins, and I will try to simplify as much as possible: a) The problem is not about the coins themselves. What we must consider for any discussion about digital information is a system of 500 coins in definite linear order, each of which can have one of two states. Let's call this "a material system that can be read as a binary string". b) There is no doubt that, in itself, such a system has no information, if we define "information" in a semiotic way, that is as something that has a specific meaning for a conscious observer. I would defionitrly suggest that we use the word "information" only in a semiotic sense, that is in the sense of "meaningful information". At the same time, I would strongly suggest that we use information as an abstract concept, and not as something that is really in any material system. c) At the same time, a material system such as the one we described has an intrinsic property that can be quantified, and that is very simply the answer to the question: how many bits can we "write" in this material system? Let's call that the "potential complexity" of the material system, or if we want its "Shannon entropy" (although the two things are probably not exactly the same thing). In this sense, the potential complexity of a given material system is a property that does not change, whatever information or lack of information can be found in the present state of that system. So, the potential complexity of our system of 500 coins is, without doubt, 500 bits. It always remains 500 bits, either the present state of the system derives from random tossing, and can be read as a truly random string (semiotic information absent), or it derives from design, and conveys the code for a very efficient algorithm for my computer (semiotic information present). The brute complexity of the physical system is always 500 bits. d) Now, I think that the main point of Sal's discourse is the following: if we have to write a designed string, if our designed string is less complex (shorter) we can use a less complex material system (with less Shannon entropy). If my program is 100 bit long, I can write it with only 100 coins. But if my program is 500 bit long, I need 500 coins. So, the Shannon entropy, or potential complexity, of the material system needs to be higher to allow a more complex design. It's very simple, and I don't understand why that apparently causes confusion here. e) At the same time, the design becomes more improbable. If we assume, for simplicity, that our design needs one specific string to be functional, andt does not allow any change, not even at one bit level, then a design of 100 bits will have a probability of arising in a random system by chance (for instance, by coin tossing) of 1: 2^100. That can be expressed, ala Shannon, as a 100 bit improbability. On the other hand, a program 500 bit long will have a probability of 1:2^500, that is a 500 bit improbability. Well, more on that later.gpuccio
September 6, 2012 at 01:40 AM PDT
F/N: The 500 pennies are a good example of what is at stake. ... But if we left them on the table in the box and came back to see them neatly lined up giving the ASCII codes for the first 72 or so characters of this post, we would strongly suspect, indeed with high certainty we would infer that the coins had been deliberately arranged.
But the Shannon entropy of the system is still 500 bits (technically speaking, prior to observation). Ordering has nothing to do with the Shannon entropy. The fact that 500 coins provide 500 bits of Shannon entropy makes the design inference possible in the first place.scordova
September 5, 2012 at 11:02 PM PDT
The rigorous treatment is pretty far in my past but order and disorder are always brought in to entropy, correct.
Not quite. That was the point of my essays. Some engineering, chemistry, and physics books use the word "disorder" to describe thermodynamic entropy -- yet other books do not. The texts I learned from do not use the word "disorder" to describe entropy. My textbook was: Statistical Mechanics by Pathria and Beale When it actually comes to calculating entropy, it is taking the logarithm of the number of microstates. Mike Elzinga has protested the use of the notion "disorder" to describe entropy, and I agree with him. The examples I've provided are partly rooted in his work. The calculations I've provided were to illustrate the absence of using "disorder" to calculate entropy (Shannon and Thermodynamic). Notice the 747 example being hit by a tornado and having lower entropy, not higher, after being hit. A highly ordered system can have extremely high Shannon and Thermodynamic entropy. Example:
1 billion coins all heads
High Shannon Entropy (1 billion bits)
High Thermodynamic Entropy (similar calculations to the thermodynamic entropy of a 747)
Low Algorithmic Entropy (Kolmogorov Complexity)
Inference: designed
NOTE: the high or low thermodynamic entropy has little to do with making a design inference. I've argued the 2nd law is inappropriate for making design inferences, but other IDists and creationists disagree.scordova
September 5, 2012 at 10:47 PM PDT
Also, how does this integrate with the work of Granville Sewell regarding entropy, open systems and design detection?
1. It disagrees with Granville's work.
2. It doesn't have much to say about open or closed systems, but I showed that an open system can reduce the entropy of a 747 :-)
3. A design is recognized as being one state from a space of large possibilities. Dembski requires the following of designs: A. Improbable B. Specified. High improbability implies high Shannon entropy.
Once shaken, the disorder has increased, the number of bits needed to describe the arrangement of the coins has increased (i.e., the Chaitin/Kolmogorov complexity), the amount of surprise per coin (Shannon entropy) has increased (because the initial predictable pattern has been ruined), while the ability to detect the initially designed arrangement has decreased.
Not quite.
1. Shannon Entropy is the same whether the coins are ordered or not.
2. Algorithmic entropy (Kolmogorov Complexity) rises if the coins go from ordered to disordered.
3. Thermodynamic entropy goes up if the temperature goes up, and down if the temperature goes down.
4. If the original design were 500 coins heads, then the disordering has erased the design.
5. If the original design is 500 coins heads, its Shannon entropy is still 500 bits. After a tornado hits it and disorders it, its Shannon entropy is still 500 bits.
Compare this to Bill's statement.
Thus, a probability of one-eighths, which corresponds to tossing three heads in a row with a fair coin, corresponds to three bits, which is the negative logarithm to the base two of one-eighths
Shannon entropy doesn't necessarily imply disorder, and neither does thermodynamic entropy. That was the point of these two essays: to correct misconceptions of what entropy is. High Shannon entropy allows the possibility of disorder; it doesn't make it inevitable, just highly probable in many cases, as is the case with 500 coins.scordova
September 5, 2012 at 10:34 PM PDT
My understanding of entropy may be lacking here, but take the above example of coins put into a black box in a certain patterned arrangement: Once shaken, the disorder has increased, the number of bits needed to describe the arrangement of the coins has increased (i.e., the Chaitin/Kolmogorov complexity), the amount of surprise per coin (Shannon entropy) has increased (because the initial predictable pattern has been ruined), while the ability to detect the initially designed arrangement has decreased. So here, increasing entropy in its various forms appears to go with decreasing "designedness". Is this just a matter of interpretation and/or terminology? Also, how does this integrate with the work of Granville Sewell regarding entropy, open systems and design detection?EDTA
September 5, 2012 at 08:46 PM PDT
PS: Now, clump the coins in pairs and reduce the 4-state elements in the chain to molecular size -- here we are looking at an informational equivalent to a D/RNA chain of 250 elements. Has that changed the issue significantly? Do you see here how the work of clumping vs configuring can then lead to two successive thermodynamic entropy reductions that correspond directly to the information fed into the chain by organising the elements into a meaningful and functionally specific message? (BTW, I would prefer 502 coins . . . ]kairosfocus
September 5, 2012 at 06:08 PM PDT
F/N: The 500 pennies are a good example of what is at stake. Put them in a long skinny black box with little slots like a covered ice cube tray, shake and toss. What do you know about their state given the macro-picture provided by the BB? Ans, you just know they are likely to be in a peaked distribution centred on 50:50 HT in no particular order, and very very sharply peaked indeed. If they start in a far from 50:50 state and get shaken, they are very likely to move towards that 50:50 state, i.e. we see the time's arrow theme emerging. But if we left them on the table in the box and came back to see them neatly lined up giving the ASCII codes for the first 72 or so characters of this post, we would strongly suspect, indeed with high certainty we would infer that the coins had been deliberately arranged. Such an arrangement in the teeth of the cluster of accessible states and the config space, would be FSCI, and it is a sign of intelligently directed ordering work [IDOW], that is of design. That is because the sampling resources are so dwarfed by the space of possible configs -- with a search on the gamut of our solar system across its conventional lifespan, 1 straw to a cubical hay bale 1,000 LY on the side -- that an unintelligent sample of the configs is maximally unlikely to pick up anything but the bulk of the distribution: near 50:50, in no particular order. In short the special zone, T is too isolated to be credibly sampled by chance. But IDOW would easily explain it. Design. KFkairosfocus
September 5, 2012 at 06:03 PM PDT
SC: Pardon, but I am highlighting a gap that has been underscored by the informational school of thought on thermodynamics. Hence my already linked remarks on the significance of MmIG. Here is Shannon from his 1950/1 paper, prediction and Entropy of Printed English:
The entropy is a statistical parameter which measures, in a certain sense, how much information is produced on the average for each letter of a text in the language. If the language is translated into binary digits (0 or 1) in the most efficient way, the entropy is the average number of binary digits required per letter of the original language. The redundancy, on the other hand, measures the amount of constraint imposed on a text in the language due to its statistical structure, e.g., in English the high frequency of the letter E, the strong tendency of H to follow T or of V to follow Q. It was estimated that when statistical effects extending over not more than eight letters are considered the entropy is roughly 2.3 bits per letter, the redundancy about 50 per cent.
This should serve to underscore the summary I made from Connor. H is a measure of average information per symbol, especially in a context where symbols are not equiprobable. I again draw your attention to the observation that in the statistical thermodynamics context, we see a macrostate, and then have a challenge of an info gap -- or, degrees of freedom [a dual way to look at the same issue] -- to the specific microstate, as there is a quite large in general number of possible but distinct microstates possible that are consistent with the macrostate. Such is the missing information on the microstate that defines its entropy. That degree of freedom is of course a measure of degree of want of constraint on configuration (and momentum) that leads to the inference to a higher degree of entropy entails a higher degree of disorder. In contexts relevant to the wider concerns on design theory, the point is that when you have molecules that have tightly constrained configurations, and these are arranged in tightly constrained ways, that are detectable form the simple fact of cell based life, such have far less freedom to take up varying states than an equivalent mass of atoms or monomers etc. But of course to produce such organisation, work has to be done in the usual case -- statistical miracles beyond the search capacity of the observed cosmos being not credible -- and that will be associated with the export of waste heat elsewhere. (Cf thought exercise here. And this is actually about a micro-jet assembled from its properly arranged parts.) But that is going a bit far down the road just now. Our more direct concern is the link from the informational theory metric and the thermodynamic one. I have pointed to Robertson for more details and have given excerpts. But this clip from Wiki may help clear the air until you can go visit a library (the Amazon price is stiffish, but then I gather textbooks are now at outrageous prices all around):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
In short, we are looking at the macro-micro information gap, MmIG. And BTW, what happens when a body of gas undergoes free expansion is that its molecules are put into a much less constrained state; hence we can see why there are now more ways for energy and mass to be arranged and distributed at the micro level in the freely expanded state. I hope this helps. KF
kairosfocus
September 5, 2012 at 05:39 PM PDT
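As a concrete illustration of the macro-micro information gap and the free-expansion remark above, here is a minimal sketch in Python (my own toy example, not from the comment). It treats the W microstates consistent with a macrostate as equiprobable, so the missing information is log2(W) bits and the Boltzmann entropy is S = k ln W; on free expansion into double the volume, each of N molecules gains roughly one binary positional degree of freedom (left half vs right half), multiplying W by 2^N. The molecule count N = 100 is an arbitrary illustrative value.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def missing_info_bits(W):
    """Shannon missing information (in bits) when W microstates are equiprobable."""
    return math.log2(W)

def boltzmann_entropy(W):
    """Thermodynamic entropy S = k ln W for W equiprobable microstates."""
    return k_B * math.log(W)

# Toy picture of free expansion into double the volume: each of N molecules
# gains one extra binary positional degree of freedom (left half vs right half),
# so the number of accessible microstates is multiplied by 2**N.
N = 100                     # illustrative molecule count (assumed)
W_before = 1                # take the fully constrained state as the reference
W_after = W_before * 2**N

print(missing_info_bits(W_after) - missing_info_bits(W_before), "extra bits of missing information")
print(boltzmann_entropy(W_after) - boltzmann_entropy(W_before), "J/K increase in entropy")
# Equivalently: delta_S = N * k_B * ln(2), i.e. one bit's worth of entropy per molecule.
```

On this coarse-grained picture the informational and thermodynamic bookkeeping track each other exactly: every extra bit of missing microstate information corresponds to k ln 2 of thermodynamic entropy.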
Elizabeth Liddle redux: she who thought 100 pennies contain 100 bits of Shannon information. When asked what the information was about -- well, it was all downhill from there.
Mung
September 5, 2012 at 05:37 PM PDT
Given a 747, there must exist information to arrange the 747. Now there are many (infinitely many?) paths to arrange matter to arrive at the 747 that don't violate any physical laws, whereas there are many likely ways to wreck a 747 and arrive at the same state. So now there is the 747 AND (+) some information somewhere -- by necessity? -- compared to a pile of the same materials + nothing. What is the entropy?
butifnot
September 5, 2012 at 05:12 PM PDT
Sal, it immediately becomes apparent that correct application is all that matters; the calculation just comes out of it trivially.
butifnot
September 5, 2012 at 05:02 PM PDT
KF, I'm having difficulty understanding what you said. To help clarify, can you post your entropy numbers for:
747
broken 747
soda can
If my numbers are wrong, please indicate the correct figures. That would be helpful to everyone concerned. Sal
scordova
September 5, 2012 at 04:46 PM PDT
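Since the request above is for concrete figures, here is a minimal sketch (my own illustration, in Python) of the standard-molar-entropy arithmetic behind such numbers: S is roughly (m / M) x S°m, using M of about 26.98 g/mol for aluminum and S°m of about 28.3 J/(mol*K) at 298 K and 1 atm, with the aluminum masses already used in this thread for the 747, the tornado-damaged 747, and the empty can. The lump-of-pure-aluminum idealization is the thread's own simplification; the helper name and layout are mine.

```python
MOLAR_MASS_AL = 26.98   # g/mol, molar mass of aluminum
S_MOLAR_AL   = 28.3     # J/(mol*K), standard molar entropy of Al at 298 K, 1 atm

def entropy_of_aluminum(mass_grams):
    """Absolute (third-law) entropy of a lump of aluminum at standard conditions."""
    moles = mass_grams / MOLAR_MASS_AL
    return moles * S_MOLAR_AL   # J/K

# Masses (in grams) as used in this thread; treating each object as pure aluminum.
objects = {
    "747 (66,150 kg of Al)":         66_150_000,
    "damaged 747 (50,000 kg of Al)": 50_000_000,
    "empty soda can (14 g of Al)":   14,
}

for name, grams in objects.items():
    print(f"{name}: {entropy_of_aluminum(grams):,.1f} J/K")

# The 747-to-can ratio follows directly from the mass ratio:
print("747 / can ratio:", entropy_of_aluminum(66_150_000) / entropy_of_aluminum(14))
```

Run as written, this reproduces the roughly 14.7 J/K figure for the can and a ratio in the millions for the intact 747, since on this simplification the entropy scales linearly with mass and says nothing about arrangement or function.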
Sal: classical entropy numbers are relative, as noted. The link in view is on the micro-state picture. I again suggest a read of Robertson. KF
kairosfocus
September 5, 2012 at 04:40 PM PDT
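A small sketch of the point that classical entropy figures are relative: classical thermodynamics delivers entropy changes via dS = dQ_rev / T, so heating a body with (approximately) constant specific heat gives delta_S = m * c * ln(T2 / T1). The specific heat and temperature interval below are illustrative assumptions, not figures from the thread; only the 14 g can mass is taken from the discussion.

```python
import math

c_AL = 0.897           # J/(g*K), specific heat of aluminum (approximate, assumed constant)
mass_g = 14.0          # g, the soda-can mass used in the thread (illustrative object)
T1, T2 = 298.0, 350.0  # K, an arbitrary heating interval chosen for illustration

# Classical (relative) entropy change: integral of dQ_rev / T = m * c * ln(T2 / T1)
delta_S = mass_g * c_AL * math.log(T2 / T1)
print(f"Entropy change on heating: {delta_S:.2f} J/K")

# This is a difference between two states; only with the third law (S -> 0 as T -> 0)
# do absolute standard molar entropies like those quoted upthread become meaningful.
```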
Will, or should, the entropy associated with arranging the 747 enter into the calculation? What is the extent of the appropriate system for a relevant calculation?
butifnot
September 5, 2012 at 04:04 PM PDT
KF,
"I trust this will help provide some balancing points."
In light of what you've written, is there something wrong with my calculation of the 747's thermodynamic entropy relative to an empty soda can? I've given several examples where I provided entropy numbers (in Joules/Kelvin). You're welcome to provide your alternate set of numbers for the reader, based on the sources you are quoting. I'd appreciate it if we could just deal with the numbers as engineers would -- not as ID proponents or Darwinists or whatever, but simply as engineers reporting numbers. Sal
scordova
September 5, 2012 at 03:50 PM PDT
Fascinating stuff, KF!
butifnot
September 5, 2012 at 03:24 PM PDT
Nowhere in this calculation are notions of “order” explicitly or implicitly identified, and hence such notions are inessential and possibly misleading to the understanding of entropy.
The fundamental, first-principles-level thing we are dealing with will emerge in many places -- 'order' included.
butifnot
September 5, 2012 at 03:22 PM PDT
SC: I have already highlighted a comment this morning, here. Observe especially the point that in statistical thermodynamics contexts, entropy is a measure of the missing information on the specific microstate given the macrostate, i.e. a measure of an info gap, which I have abbreviated MmIG.* (Classical thermodynamics gives relative values that are useful in chemical and physical changes associated with thermodynamic variables.) In information systems, the informational entropy is assessed in a context of signal reception and the degree of surprise given by the state indicated by the signal as received. KF
*PS: I repeat my comment that there is an informational school of thermodynamics, which has some things to say that are relevant. Let me clip from Robertson, on just one main point:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . . . [pp. vii - viii] . . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
I trust this will help provide some balancing points.
kairosfocus
September 5, 2012 at 03:19 PM PDT
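To make the Robertson/Jaynes correspondence in the excerpt concrete: with p_i = e^(-beta*e_i) / Z, Z = sum_i e^(-beta*e_i), and beta = 1/kT, the information entropy H = -sum_i p_i ln p_i becomes the thermodynamic entropy once multiplied by k. The following sketch (my own illustration) computes Z, the p_i, and S = -k sum_i p_i ln p_i for a made-up three-level system at 300 K; the energy levels are arbitrary illustrative values, not data for any particular material.

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0             # temperature, K (illustrative)
beta = 1.0 / (k_B * T)

# Arbitrary illustrative energy levels (in joules) for a toy three-level system.
energies = [0.0, 1.0e-21, 2.0e-21]

# Partition function: Z = sum_i exp(-beta * e_i)
Z = sum(math.exp(-beta * e) for e in energies)

# Boltzmann probabilities: p_i = exp(-beta * e_i) / Z
p = [math.exp(-beta * e) / Z for e in energies]

# Gibbs/Shannon entropy with C = k_B:  S = -k_B * sum_i p_i * ln(p_i)
S = -k_B * sum(pi * math.log(pi) for pi in p)

print("Z =", Z)
print("p_i =", p)
print("S =", S, "J/K   (equivalently H =", S / k_B, "nats)")
```

The same H formula with equiprobable outcomes and base-2 logarithms gives the coin-toss bit counts discussed earlier in the thread, which is the sense in which the informational and thermodynamic entropies are one framework applied at different scales.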