
A Designed Object’s Entropy Must Increase for Its Design Complexity to Increase – Part 2


For a biological system to gain more biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it, contrary to the intuitions of many creationists and IDists. This essay is Part II of a series that began with Part 1.

The physicist Fred Hoyle famously said:

The chance that higher life forms might have emerged in this way is comparable to the chance that a tornado sweeping through a junkyard might assemble a Boeing 747 from the materials therein.

I agree with that assertion, but the conclusion can't be formally derived from the 2nd law of thermodynamics (at least not from those forms of the 2nd law stated in many physics and engineering textbooks and used in the majority of scientific and engineering journals). The 2nd law is generally expressed in two forms:

2nd Law of Thermodynamics (THE CLAUSIUS POSTULATE)
No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

or equivalently

2nd Law of Thermodynamics (THE KELVIN-PLANCK POSTULATE)

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work

In Part 1, I explored the Shannon entropy of 500 coins. If the coins are made of copper or some other metal, their thermodynamic entropy can be calculated. But let's have a little fun: how about the thermodynamic entropy of a 747? [Credit to Mike Elzinga for the original idea, but I'm adding my own twist.]

The first step is to determine roughly how much matter we are dealing with. From the manufacturer's website:

A 747-400 consists of 147,000 pounds (66,150 kg) of high-strength aluminum.

747 Fun Facts

Next we find the standard molar entropy of aluminum (symbol Al). From Enthalpy, Entropy and Gibbs we find that the standard molar entropy of aluminum at 25 Celsius and 1 atmosphere is 28.3 Joules/Kelvin/Mole (J/(K*mol)).

Thus a 747's thermodynamic entropy, based on the aluminum alone, is approximately:

S ≈ (66,150,000 g / 26.98 g/mol) * 28.3 J/(K*mol) ≈ 2.45 * 10^6 mol * 28.3 J/(K*mol) ≈ 6.9 * 10^7 J/K

Suppose now that a tornado runs into the 747 and tears off pieces of the wings, tail, and engines such that the weight of aluminum in what's left of the 747 is now only 50,000 kg. Using the same sort of calculation, the entropy of the broken and disordered 747 is:

S ≈ (50,000,000 g / 26.98 g/mol) * 28.3 J/(K*mol) ≈ 1.85 * 10^6 mol * 28.3 J/(K*mol) ≈ 5.2 * 10^7 J/K

Hence the tornado lowers the entropy of the 747 by disordering and removing vital parts!

And even supposing we recovered all the missing parts such that we have the original weight of the 747, the entropy calculation has nothing to say about the functionality of the 747. Hence the 2nd law, which inspired the notion of thermodynamic entropy, has little to say about the design and evolution of the aircraft, and by extension it has little to say about the emergence of life on planet Earth.

Perhaps an even more pointed criticism, in light of the above calculations, is that increasing mass in general will increase entropy (all other things being equal). Thus, as a system becomes more complex, on average it will have more thermodynamic entropy. For example, a simple empty soda can weighing 14 grams has (by a similar calculation) a thermodynamic entropy of about 14.68 J/K, which implies a complex 747 has roughly 4.7 million times the thermodynamic entropy of a simple soda can. A complex biological organism like an albatross has more thermodynamic entropy than a handful of dirt. Worse, when the albatross dies, it loses body heat and mass, and hence its thermodynamic entropy goes down after it dies!
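For readers who want to check the arithmetic, here is a minimal Python sketch of the standard-molar-entropy estimate used above. It assumes only the quoted masses, a molar mass of 26.98 g/mol for aluminum, and the tabulated standard molar entropy of 28.3 J/(K*mol); the function name and structure are mine, for illustration only.

# Rough standard-state entropy of a lump of aluminum: S = (m / M) * S_molar.
# This ignores everything except the bulk aluminum, as in the post.
MOLAR_MASS_AL = 26.98   # g/mol
S_MOLAR_AL = 28.3       # J/(K*mol), standard molar entropy of Al at 25 C, 1 atm

def aluminum_entropy(mass_grams):
    """Approximate thermodynamic entropy (J/K) of a mass of aluminum at standard conditions."""
    return (mass_grams / MOLAR_MASS_AL) * S_MOLAR_AL

print(aluminum_entropy(66_150_000))  # intact 747: about 6.9e7 J/K
print(aluminum_entropy(50_000_000))  # tornado-damaged 747: about 5.2e7 J/K
print(aluminum_entropy(14))          # empty soda can: about 14.7 J/K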

So the major point of Part II is that a designed object's thermodynamic entropy often increases with the increasing complexity of the design, for the simple reason that it has more parts and hence more mass. And as was shown in Part 1, the Shannon entropy also tends to increase with the complexity of the design. Hence, at least two notions of entropy (Shannon and thermodynamic) can increase with the increased complexity of a design (be it a man-made design, an evolution-made design, or ….)

This concludes the most important points I wanted to get across. Below is merely an exploration of some of the fundamentals of thermodynamics for readers interested in some of the technical details of thermodynamics and statistical mechanics. The next section can be skipped at the reader's discretion since it is mostly an appendix to this essay.
========================================================================
THERMODYNAMICS AND STATISTICAL MECHANICS BASICS

Classical thermodynamics can trace some of its roots to the work of Carnot in 1824 during his quest to improve the efficiency of steam engines. In 1865 we have a paper by Clausius that describes his conception of entropy. I will adapt his formula here:

ΔS = Q / T

Where S is entropy, Q is heat, and T is temperature. Perhaps to make the formula more accessible, let us suppose we have a 1000-watt heater running for 100 seconds that contributes to the boiling of water (already at 373.2 K). What is the entropy contribution due to this burst of energy from the heater? First I calculate the amount of heat energy input into the water:

Q = 1000 W * 100 s = 100,000 J

Using Clausius' formula, and the fact that the process is isothermal, I then calculate the change of entropy in the water as:

ΔS = Q / T = 100,000 J / 373.2 K ≈ 268 J/K

So how does all this relate to Boltzmann and statistical mechanics? There was the intuition among scientists that thermodynamics could be related to classical (Newtonian) mechanics. They suspected that what we perceived as heat and temperature could be explained in terms of mechanical behaviors of large numbers of particles, specifically the statistical aspects of these behaviors, hence the name of the discipline is statistical mechanics.

A system of particles in physical space can be described in terms of the positions and momenta of the particles. The state of the entire system of particles can be expressed as a location in a conceptual phase space. We can slice up this conceptual phase space into a finite number of chunks because of the Liouville theorem. These sliced-up chunks correspond to the microstates in which the system can be found, and furthermore the probability of the system being in a given microstate is the same for each microstate (equiprobable). Boltzmann made the daring claim that the logarithm of the number of microstates is related to the entropy Clausius defined for thermodynamics. The modern form of Boltzmann's daring assertion is:

S = kB ln Ω

where Ω is the number of microstates of the system, S is the entropy, and kB is Boltzmann's constant. Using Boltzmann's formula we can then compute the change of entropy:

ΔS = kB ln Ω_final − kB ln Ω_initial = kB ln (Ω_final / Ω_initial)

As I pointed out, Boltzmann's equation looks hauntingly similar to Shannon's entropy formula for the special case where the microstates of a Shannon information system are equiprobable.
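To make the resemblance concrete, the two formulas can be set side by side (my own restatement; the Shannon form below assumes all Ω outcomes are equiprobable, so each p_i = 1/Ω):

S = k_B \ln \Omega \quad \text{(Boltzmann)} \qquad \text{vs.} \qquad H = -\sum_{i=1}^{\Omega} p_i \log_2 p_i = \log_2 \Omega \ \text{(Shannon, each } p_i = 1/\Omega\text{)}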

Around 1877 Boltzmann published his paper connecting thermodynamics to statistical mechanics. This was the major breakthrough that finally bridged the heretofore disparate fields of thermodynamics and classical mechanics.

Under certain conditions we can relate Clausius' notion of entropy to Boltzmann's, and thus the formerly disparate fields of thermodynamics and classical mechanics are bridged. Here is how I describe symbolically the special case where Clausius' notion of entropy agrees with Boltzmann's:

ΔS = Q / T = kB ln (Ω_final / Ω_initial)

[It should be noted, the above equality will not always hold.]

Mike Elzinga and I had some heated disagreement on the effect of spatial configuration on entropy. Perhaps to clarify: the colloquial notion of disordering things does not change the thermodynamic entropy (like taking a 747 and disordering its parts; as long as we have the same matter, it has the same thermodynamic entropy). But that's not to say that changes in volume (which is a change in spatial configuration) won't affect the entropy calculations. This can be seen in the formula for the entropy of an ideal monoatomic gas (the Sackur-Tetrode equation):

S = N kB [ ln( (V/N) * ( m E / (3 π N ℏ^2) )^(3/2) ) + 5/2 ]

where
S is the entropy
N is the number of atoms
kB is Boltzmann’s constant
V is the volume
E is the internal energy
m is the mass of a single gas atom
ℏ is the Dirac constant (reduced Planck's constant)

From this we can see that increasing the volume the gas occupies, the energy of the gas, or the number of particles in the gas will increase the entropy. Of course this must happen within reasonable limits, since if the volume is too large there cannot be energy exchange among the particles, notions of what defines equilibrium begin to get fuzzy, etc.
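As a quick numerical check of that claim, here is a small Python sketch of the Sackur-Tetrode formula as written above. The particle mass and conditions are made-up, argon-like values chosen only to show the direction of the dependence; none of these numbers come from the post.

import math

K_B = 1.380649e-23      # J/K, Boltzmann constant
HBAR = 1.054571817e-34  # J*s, reduced Planck constant

def sackur_tetrode(N, V, E, m):
    """Entropy (J/K) of N ideal monoatomic gas atoms of mass m (kg) in volume V (m^3) with internal energy E (J)."""
    return N * K_B * (math.log((V / N) * (m * E / (3 * math.pi * N * HBAR**2))**1.5) + 2.5)

# Roughly one mole of an argon-like gas near room temperature (illustrative only).
N = 6.022e23
m = 6.63e-26                 # kg per atom
E = 1.5 * N * K_B * 300.0    # E = (3/2) N kB T at T = 300 K
print(sackur_tetrode(N, 0.0224, E, m))      # baseline, about 154 J/K
print(sackur_tetrode(N, 0.0448, E, m))      # doubling the volume raises S
print(sackur_tetrode(N, 0.0224, 2 * E, m))  # doubling the energy raises S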

Nowhere in this calculation are notions of “order” explicitly or implicitly identified, and hence such notions are inessential and possibly misleading to the understanding of entropy.

How the Sackur-Tetrode formula is derived is complicated, but if one wants to see how entropy can be calculated for simpler systems, Mike Elzinga provided a pedagogical concept test where the volume of the system is fixed and small enough that the particles are close enough to interact. The volume is not relevant in his examples, so the entropy calculations are simpler.

I went through a couple of iterations to solve the problems in his concept test. His test and my two iterations of answers (with help from Olegt on discrete math) are here:
Concept test attempt 1: Basic Statistical Mechanics

and
Concept test amendments: Purcell Pound

Acknowledgements
Mike Elzinga, Olegt, Elizabeth Liddle, Andy Jones, Rob Sheldon, Neil Rickert, the management, fellow authors and commenters at UD and Skeptical Zone.

[UPDATE 9/7/2012]
Boltzmann

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two of more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Comments
It seems very obvious to me that there is a creative intelligence (or force or agency) which is the source of anything new. Otherwise, why would there be new species or individuals? There is no reason or law in science which says 'new things must happen'. Natural philosophy safely assumes that there is a constant source of new things - the legendary cornucopia - but it can't, and actually doesn't need to explain it. The only glimmer of bad news is that Christians don't necessarily have a patent on it.

jeeprs, September 12, 2012 at 05:27 AM PDT
Mung: I think the root issue is to understand the difference between average info per symbol sent and the MmIG that Gibbs' metric points to. When a complex -- many part, high info storing potential -- system is constrained to be in a high info functional state, there is high known info, at macro level. This can be seen by using the black box (BB) model, where one figuratiely pushes the button to see emission of a string of symbols that reveal its info state:
))-|| BB || --> N symbols of info on functional state
Such a system may or may not have high info per symbol or component involved, but in aggregate its info content is high and constrained by the overall function that is observable. (Where, there is a sloppy error of transferring the proper usage of Shannon's H as avg info per symbol to the info content of a message with N elements, N * H.) On heating the 747, we have indeed increased its entropy, as today is the 11th anniversary of the horrific proof of: we are moving it towards the point where its Al atoms have enough random energy to be free to engage in reactions with the O2 etc and burn. Or we could simply melt the plane. In so doing we would trigger a phase change and in addition destroy the functional config. The issue of confusion of entropy with complexity is the same one about the loose use of entropy to mean N*H. And I keep on pointing out just how often entropy shows up in the sense of time's arrow: the direction of spontneous change to access clusters of microstates that are dominant numerically, once the constraints that keep the system in other states are relieved. The 504 coin tray that starts out in HTHT . . . HT, then is shaken up and gradually reverts to near-50:50 H & T in no particular order is a good example in point. This system has moved from a high order to a low order config, one that is readily seen as being part of the predominant cluster of microstates. We can imagine a BB that has a coin tray in it and a coin reader that on pushing the button, emits the 504 bit coin state. Now, go away for a time and pick back up the same coin tray BB. Push the button again, and now we see the first 72 ASCII characters from this post. On many grounds, the best explanation for this organised state is IDOW, i.e. design by intelligently directed organising work. And such a system is in a tightly constrained functional state, though one distinct from the original ordered form, it is in an organised, functional info-rich state. That is the essential form of the design inference on tested, reliable sign that objectors to ID are so loathe to acknowledge. It also extends to the point that just flying either the 747 or the microjet of the other thought exercise, is telling us a lot about a constrained, specific and functional, complex, internal state. One tied to configuration work, IDOW. KFkairosfocus
kairosfocus, September 10, 2012 at 11:14 PM PDT
Entropy is also sometimes confused with complexity, the idea being that a more complex system must have a higher entropy. In fact, that is in all likelihood the opposite of reality. A system in a highly complex state is probably far from equilibrium and in a low entropy (improbable) state, where the equilibrium state would be simpler, less complex, and higher entropy.

What is Entropy?

Mung, September 10, 2012 at 10:27 PM PDT
A tornado that rips the wings off a 747 changes not one whit how much information is required to construct a 747. And if your zone of interest is how to build a 747, the amount of missing information isn't changed by a tornado either. In fact, it's most difficult to ascertain just what Sal's point is throughout this entire exercise. On the one hand he seems to be claiming that entropy and intelligent design are not connected. But then he also states that "a designed object's entropy must increase for its design complexity to increase." Now that's just a bizarre statement. What does it even mean? If we load a 100k block of aluminum on a 747 we have increased its entropy, but doing so has not increased the design complexity of the 747; rather, it has only made it possible to increase the design complexity of the 747? How so? If we heat the 747, does that increase its entropy? Does that then make it possible to increase the design complexity of the 747 in some way that not heating it would not have allowed?

Mung, September 8, 2012 at 02:58 PM PDT
PS: It should be noted that the true focal issue is the bridge between information and entropy in light of the Gibbs analysis. That, I have focussed on, noting as necessary over the past several days that a partial entropy calc on the state of a mass of Al at a given temp is a side track from that.

kairosfocus, September 8, 2012 at 05:11 AM PDT
Pardon typo: 3.136 * 10^5 J

kairosfocus, September 8, 2012 at 04:57 AM PDT
SC: I am sorry, it is time to move beyond snip and snipe rhetorical games. Entropy issues are too intricate and connected for that sort of rhetorical resort to be used, whether willfully or because we do not appreciate the relevance of context. (To use an easily understood illustration, proof texting Bible verse hopscotch with disregard to immediate and broader contexts is not helpful in Bible study. The same error is even less effective with thermodynamics, especially where statistical thermodynamics is relevant.) You will see my main summary response here. Now, you will notice above that I did not bring out details on the issue of the internal energy in an atom of Al due to its having a binding energy per nucleon of about 8.3 MeV. That is strictly a part of its energy content, enthalpy and so entropy too. But we do not talk of such in general, because this part of the strictly correct account is not generally relevant unless we are in a supermassive star that is running down to having an Iron core. Where, Enthalpy, H = U + PV, and onward: d H(S, p) = T dS + Vdp Or if dp = 0, i.e. under constant pressure (just to highlight): dH(S) = TdS Similarly, to raise 1 kg of H2O 100 m, we use 1 kg * 9.8 N/kg * 100 m = 980 J of energy And to heat the same kg from 25 to 100 degrees, but not to actually boil it would take: 1 kg * 4,181 J/kg.K * 75 deg = 3.136 * 10^6 J Now, in describing energy involved in mechanical matters, it would be a distraction to point to how much more energy is bound up in the heat content of the water or in heating it up or cooling it down. Even if the water's temp (and perhaps even phase) is changing during the process of lifting it. That is a part of the wider energy account, but it is not relevant to the focal issue of gravitational potential energy involved in lifting the body of water. You have, unfortunately, done the like of that error. Right from the outset I pointed out (a) that the entropy component tied to being Al at a certain temp is irrelevant to the issue of the formation of that Al into a system such as a 747, and (b) that to go on to play system boundary games such that if a tornado twists and tears off sufficient mass, that part left in the boundary imposed will have less mass, is (c ) a further error of assignment of system boundary that distracts from -- indeed strawmannises -- the configurational issue. Yes, we can find entropy numbers for Al at a given temp in tables. These are useful for say the folks who want to use the enthalpy locked away in the Al to make it into rocket fuel. (I am fairly sure bin Laden did not do any precise calculation, all he needed to know is that jet fuel is much like kerosine and that Al immersed in a hot enough fire would burn like HMS Sheffield did when the fuel in the Exocet missile that hit it set its Al superstructure on fire off the Falkland islands. And, that Al is a component of the rocket fuel formerly used in the Shuttle's boosters.) But, such is irrelevant to the focal questions as to:
1 --> whether entropy as understood in Gibbs' terms is linked to the Shannon metric H of average info per symbol, and 2 --> thus grounds a point that Gibbs entropy -- a generalisation of Boltzmann entropy which in turn can be directly connected to entropy changes measured and used in classical thermodynamics -- is linked to the average missing info on specific microstate configuration that is consistent with the macrostate specified in terms of P, V, T, etc.
Once that bridge is established, regardless of those who do not like it, configurational and informational matters are a part of the entropy story. Which has been done since the early efforts of Brillouin and with more elaboration since Jaynes et al. I have outlined the way that becomes relevant to the concerns linked to design theory here, earlier this morning. My underlying point is that (d) we should and normally do discuss the relevant aspect of a thermodynamic quantity, and should not allow ourselves to be distracted by something that is irrelevant to the matter under consideration, which is (e) that component of entropy which is tied to configuration and which is in turn linked to information. The characteristic concern of those looking at disorder vs organisation or even order, is matters of configuration. And so, it is those issues that must be focal KFkairosfocus
kairosfocus, September 8, 2012 at 04:55 AM PDT
KF finally admits it, the more aluminum in the 747, the more thermodynamic entropy:
the more Al the more entropy
and Kf writes:
Which is not relevant save to certain chem rxns and associated enthalpy results.
Not relevant to what? Design. If so, thank you. Thermodynamic entropy is not relevant to design. Thanks for proving my point. Thus, the Clausius postulate and the Kelvin-Planck postulate (the two primary statements of the 2nd law of thermodynamics) are irrelevant to design. Thank you very much. You could have been a little more succinct in saying so.
the more Al the more entropy
So my calculations stand and so does the inference: thermodynamic entropy increases with increasing amounts of matter. Therefore a fully functioning 747 has more thermodynamic entropy than a 747 that had its wings ripped off by a tornado. Hence a tornado reduces the thermodynamic entropy of a 747 in that case, it does not increase it. Thank you very much. And if you want to insist on Shannon and symbols, the 747 with its diverse organization also has substantially more Shannon entropy than a soda can. So if you actually followed the consequences of Jaynes, you'd come to the same conclusion: designs (be they man-made or whatever) often require more entropy, not less! Your insistence that I read Jaynes is tiresome since you don't actually admit the consequences of Jaynes: the more parts required for a functioning system, the higher its entropy, which is exactly the claim of this two-part series. Nay, YOU need to read Jaynes and understand it, not me. Also, for the love of PZ Myers, can't you be a little more focused in your writing style? Your handle implies focus, not circumlocution. Perhaps the remedy for the dissonance between the writing style and the handle is to rename the handle as KairosCircumlocution.

scordova, September 8, 2012 at 12:33 AM PDT
. . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.
One thing I like about the Ben-Naim books is that he starts with very simple cases, like a game of 20 questions, or marbles, or dice, and builds on them. So at first he might deal with two outcomes of equal probability, then perhaps more outcomes but the distribution remains the same, but then adds additional factors such as a non-equiprobable distribution.

Mung, September 10, 2012 at 05:36 PM PDT
Sackur and Tetrode calculated the number of quantum mechanical states of an ideal gas (W). Using the Boltzmann definition of entropy (S), they then calculated the entropy of an ideal gas. This approach provides a method of calculating the entropy of an ideal gas, but does not provide any interpretation of entropy. My approach in this book does not calculate W, nor does it use Boltzmann's definition of entropy. Instead, I start with Shannon's measure of information (SMI) and apply it to the distribution of locations and momenta of particles in an ideal gas. This leads directly to the entropy of an ideal gas - bypassing the calculation of W. This procedure provides an interpretation for the entropy conferred by whatever interpretation you accept for hte SMI. This approach is radically different from the Sackur-Tetrode calculation. - Arieh Ben-Naim
http://www.ariehbennaim.com/books/entropy2ndlaw.html

Mung, September 7, 2012 at 05:21 PM PDT
SC: Insofar as, the more Al, the more entropy bound up with being Al at that temp, the more Al the more entropy. Which is not relevant save to certain chem rxns and associated enthalpy results. Yes Al makes a nice intense fire [HMS Sheffield or Airplanes in the WTC, anyone . . . ] and even rocket fuel. That -- as you have now had ample opportunity to notice -- is utterly irrelevant to the issue that Entropy in information systems has to do with average info per symbol, and that through Jaynes et al, it has been noticed that this can be linked to the Macro-micro state info gap giving a scaled metric on average missing info to specify microstate per microstate consistent with a given macrostate. That is, there IS a credible link from entropy to information, which makes relevant considerations on functionally constrained configs. Never mind distractions on what could happen given the heat content [enthalpy] of Al. and, it is a relevant issue to reflect on the issue of possible configs and the constraint on such to be in a specifically functional organised state. Even, if the numbers will be lower than the thermal ones. (Ever contrasted the energy to raise 1 kg of H2O 100 m and that to heat it from say 25 deg C to boiling? The two are both energy, but it is worth considering on the effects under different heads on energy but in connexion with different energy states. And we have not touched on the related issues of nuclear binding energies, which are part of the energy story, too.) So, my point is that similar reasoning is relevant to the cases and should be used. KFkairosfocus
kairosfocus, September 7, 2012 at 03:53 PM PDT
PS: Trying to rule the long known link between high entropy states, degradation of energy resources available to do work, and disorder as in effect not to be mentioned, also is not good enough. To give you an idea of some of the links, let me snip a discussion on a marbles and pistons model. And, I am pretty well satisfied that we don't only need to be able to do number calcs, but we must also understand what those numbers are about (BTW, one of the reasons L K Nash is an excellent intro to Stat thermo-D). I clip my always linked note, APP 1: _________ >> Let us reflect on a few remarks on the link from thermodynamics to information: 1] TMLO: In 1984, this well-received work provided the breakthrough critical review on the origin of life that led to the modern design school of thought in science. The three online chapters, as just linked, should be carefully read to understand why design thinkers think that the origin of FSCI in biology is a significant and unmet challenge to neo-darwinian thought. (Cf also Klyce's relatively serious and balanced assessment, from a panspermia advocate. Sewell's remarks here are also worth reading. So is Sarfati's discussion of Dawkins' Mt Improbable.) 2] But open systems can increase their order: This is the "standard" dismissal argument on thermodynamics, but it is both fallacious and often resorted to by those who should know better. My own note on why this argument should be abandoned is: a] Clausius is the founder of the 2nd law, and the first standard example of an isolated system -- one that allows neither energy nor matter to flow in or out -- is instructive, given the "closed" subsystems [i.e. allowing energy to pass in or out] in it. Pardon the substitute for a real diagram, for now: Isol System: | | (A, at Thot) --> d'Q, heat --> (B, at T cold) | | b] Now, we introduce entropy change dS >/= d'Q/T . . . "Eqn" A.1 c] So, dSa >/= -d'Q/Th, and dSb >/= +d'Q/Tc, where Th > Tc d] That is, for system, dStot >/= dSa + dSb >/= 0, as Th > Tc . . . "Eqn" A.2 e] But, observe: the subsystems A and B are open to energy inflows and outflows, and the entropy of B RISES DUE TO THE IMPORTATION OF RAW ENERGY. f] The key point is that when raw energy enters a body, it tends to make its entropy rise. This can be envisioned on a simple model of a gas-filled box with piston-ends at the left and the right: ================================= ||::::::::::::::::::::::::::::::::::::::::::|| ||::::::::::::::::::::::::::::::::::::::::::||=== ||::::::::::::::::::::::::::::::::::::::::::|| ================================= 1: Consider a box as above, filled with tiny perfectly hard marbles [so collisions will be elastic], scattered similar to a raisin-filled Christmas pudding (pardon how the textual elements give the impression of a regular grid, think of them as scattered more or less hap-hazardly as would happen in a cake). 2: Now, let the marbles all be at rest to begin with. 3: Then, imagine that a layer of them up against the leftmost wall were given a sudden, quite, quite hard push to the right [the left and right ends are pistons]. 4: Simply on Newtonian physics, the moving balls would begin to collide with the marbles to their right, and in this model perfectly elastically. So, as they hit, the other marbles would be set in motion in succession. A wave of motion would begin, rippling from left to right 5:As the glancing angles on collision will vary at random, the marbles hit and the original marbles would soon begin to bounce in all sorts of directions. 
Then, they would also deflect off the walls, bouncing back into the body of the box and other marbles, causing the motion to continue indefinitely. 6: Soon, the marbles will be continually moving in all sorts of directions, with varying speeds, forming what is called the Maxwell-Boltzmann distribution, a bell-shaped curve. 7: And, this pattern would emerge independent of the specific initial arrangement or how we impart motion to it, i.e. this is an attractor in the phase space: once the marbles are set in motion somehow, and move around and interact, they will soon enough settle into the M-B pattern. E.g. the same would happen if a small charge of explosive were set off in the middle of the box, pushing our the balls there into the rest, and so on. And once the M-B pattern sets in, it will strongly tend to continue. (That is, the process is ergodic.) 8: A pressure would be exerted on the walls of the box by the average force per unit area from collisions of marbles bouncing off the walls, and this would be increased by pushing in the left or right walls (which would do work to push in against the pressure, naturally increasing the speed of the marbles just like a ball has its speed increased when it is hit by a bat going the other way, whether cricket or baseball). Pressure rises, if volume goes down due to compression. (Also, volume of a gas body is not fixed.) 9: Temperature emerges as a measure of the average random kinetic energy of the marbles in any given direction, left, right, to us or away from us. Compressing the model gas does work on it, so the internal energy rises, as the average random kinetic energy per degree of freedom rises. Compression will tend to raise temperature. (We could actually deduce the classical — empirical — P, V, T gas laws [and variants] from this sort of model.) 10: Thus, from the implications of classical, Newtonian physics, we soon see the hard little marbles moving at random, and how that randomness gives rise to gas-like behaviour. It also shows how there is a natural tendency for systems to move from more orderly to more disorderly states, i.e. we see the outlines of the second law of thermodynamics. [ --> This can be elaborated on the number of accessible microstates consistent with a given macrostate and how there is a sharply peaked distribution around the equilibrium value] 11: Is the motion really random? First, we define randomness in the relevant sense: In probability and statistics, a random process is a repeating process whose outcomes follow no describable deterministic pattern, but follow a probability distribution, such that the relative probability of the occurrence of each outcome can be approximated or calculated. For example, the rolling of a fair six-sided die in neutral conditions may be said to produce random results, because one cannot know, before a roll, what number will show up. However, the probability of rolling any one of the six rollable numbers can be calculated. 12: This can be seen by the extension of the thought experiment of imagining a large collection of more or less identically set up boxes, each given the same push at the same time, as closely as we can make it. At first, the marbles in the boxes will behave very much alike, but soon, they will begin to diverge as to path. The same overall pattern of M-B statistics will happen, but each box will soon be going its own way. That is, the distribution pattern is the same but the specific behaviour in each case will be dramatically different. 13: Q: Why? 
14: A: This is because tiny, tiny differences between the boxes, and the differences in the vibrating atoms in the walls and pistons, as well as tiny irregularities too small to notice in the walls and pistons will make small differences in initial and intervening states -- perfectly smooth boxes and pistons are an unattainable ideal. Since the system is extremely nonlinear, such small differences will be amplified, making the behaviour diverge as time unfolds. A chaotic system is not predictable in the long term. So, while we can deduce a probabilistic distribution, we cannot predict the behaviour in detail, across time. Laplace's demon who hoped to predict the future of the universe from the covering laws and the initial conditions, is out of a job. 15: To see diffusion in action, imagine that at the beginning, the balls in the right half were red, and those in the left half were black. After a little while, as they bounce and move, the balls would naturally mix up, and it would be very unlikely indeed — through logically possible — for them to spontaneously un-mix, as the number of possible combinations of position, speed and direction where the balls are mixed up is vastly more than those where they are all red to the right, all alack to the left or something similar. (This can be calculated, by breaking the box up into tiny little cells such that they would have at most one ball in them, and we can analyse each cell on occupancy, colour, location, speed and direction of motion. thus, we have defined a phase or state space, going beyond a mere configuration space that just looks at locations.) 16: So, from the orderly arrangement of laws and patterns of initial motion, we see how randomness emerges through the sensitive dependence of the behaviour on initial and intervening conditions. There would be no specific, traceable deterministic pattern that one could follow or predict for the behaviour of the marbles, through we could work out an overall statistical distribution, and could identify overall parameters such as volume, pressure and temperature . . . >> _________ That is just foundational discussion, so we can get a picture of what is going on. With suitable models and quantities plus come clever algebra and calculus, we can build up various specific thermodynamics models, but these mathematical models will not undo the basic points already seen. In particular we will see some illustrations on why rise in entropy is seen as tied to increased disorder as it is based on more ways that energy and mass at micro-levels can be distributed consistent with a given macro-state. More constrained and special states or clusters of states will be more orderly and less entropic. Regardless of who does not like this qualitative usage. In the context of FSCO/I, we are dealing with organised states that are recognisable as highly specific -- notice that! -- based on function. Organised not merely orderly, as in say a crystal, we have an information rich arrangement that is structured around a function and once the function is there, a knowing observer can know a lot about just how tightly constrained the space of possible configs is here. When that organisation is deranged function vanishes and the components are nowhere so tightly constrained from the macro-picture. It is reasonable to speak about disorganisation and to associate it with a rise in entropy, in the MmIG sense. Notice, we are here measuring an info gap, equivalent to measuring degrees of freedom and energetic implications. 
Just as, if we have a block of ice at melting point and feed in latent heat, that heat goes into disordering -- notice the shift in terms -- the molecules and appears in the higher entropy of water at melting point than in ice at the same temp. We hardly need to underscore that the molecules of ice have a greater degree of freedom than those of the ice that has not yet melted. (Indeed the marble model can be extended to a model of melting, with slight adjustments.) Okay, enough for the moment. KFkairosfocus
kairosfocus, September 7, 2012 at 01:25 PM PDT
That is why I have called your attention to Jaynes repeatedly,
And will Jaynes contradict this claim?
entropy increases with the number of particles (other things being equal)
No. Thus your citation actually favors my claims in these two essays. I demonstrated it in a rather pointed way using your own example of an ideal monoatomic gas.
Of course, entropy is an extensive state function.
So why don't you decode what that means? It means the more particles, the higher the entropy! Hence a 747 has substantially more entropy (on the order of 4.7 million times more) than a soda can. Something you seem unwilling to admit directly except in the most hard-to-understand language.

scordova, September 7, 2012 at 01:25 PM PDT
SC: Pardon, but I have the distinct impression that you are beginning to indulge snip-snipe tactics. In a matter as involved and integrated as this, that is not good enough. Of course, entropy is an extensive state function. That is what you exploited to do your calcs for the mass of Al in a 747, but those partial-story calcs -- good enough for certain chem engg' tasks -- do not differentiate between Al in ingots, Al in sheets, etc and Al in aircraft. They are in effect based on degrees of freedom of atoms, and temp-related energy distributions leading to numbers of states of the material; so, they simply are not looking at the whole entropy-relevant story. That is why the underlying Gibbs -- Shannon analysis and its link to the missing info on microstate given macrostate become relevant. That is why I have called your attention to Jaynes repeatedly, and it is why I took time to highlight the correct usage of H, average info per symbol. Also to draw out the parallel to thermodynamics systems through the Gibbs relationship. As in what I have abbreviated as MmIG, the macro-state micro-state info gap. And where also I pointed out that a system that is in the sort of constrained state involved in a functionally specific entity, has a very different degree of freedom consistent with that state than just raw materials lying around. The sort of state where I have repeatedly pointed out, the observed way to get there -- given the issue of tight zones in a space of possibilities otherwise and the needle in the haystack problem -- is IDOW, intelligently directed organising work. Design. I trust this will help you focus attention on the points of concern. KFkairosfocus
kairosfocus, September 7, 2012 at 12:59 PM PDT
[UPDATE 9/7/2012] Boltzmann
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two of more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they? Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.” There is no basis in physical science for interpreting entropy change as involving order and disorder.
scordova, September 7, 2012 at 08:32 AM PDT
From KairosFocus himself:
S_sys = k_B [SUM over i's] p_i * I_i .... an estimate of what it would take to describe the state of 1 cc of monoatomic ideal gas at 760 mm HG and 0 degrees C, i.e. 2.687 * 10^19 particles with 6 degrees of positional and momentum freedom would help us. Let us devote 32 bits — 16 bits to get 4 hex sig figs, and a sign bit plus 15 bits for the binary exponent to each of the (x, y, z) and (P_x, P-y and P-z) co-ordinates in the phase space. We are talking about: 2.687 * 10^19 particles x 32 bits per degree of freedom x 6 degrees of freedom each _____________ 5.159 * 10^21 bits of info
That calculation suggests: THE MORE PARTICLES THERE ARE, THE HIGHER THE ENTROPY! Which corresponds to the Sackur-Tetrode equation I posted in the OP for monoatomic ideal gases!

scordova, September 7, 2012 at 06:42 AM PDT
PS: Ch 13 of the part of Motion Mountain here may also be helpful.

kairosfocus, September 7, 2012 at 01:04 AM PDT
EDTA: Shannon entropy is about the average information per symbol in a communication, where symbols may not be equiprobable (as is the usual case, e.g. ~ 1/8 of normal English text is E or e). As I have summarised above and in the other thread, when Shannon deduced the metric, it was seen to have the same mathematical shape as the Gibbs expression for the statistical mechanics entropy of a body where for a given Macro-observable state, various micro-states y_i are possible with different probabilities p_i. Where microstates are different arrangements of mass and energy at ultra-microscopic level. The obvious debate over the parallel occurred, and the upshot of it, per JAYNES et al, is that we do face a communication situation, where we seek to infer concerning what we do not directly see, the specific distribution of mass and energy at micro-level, from what we can see, the major lab-observable characteristics. Thus, it can be shown that Gibbs Entropy is a scaled measure of the average missing information per such microstate given what we do observe, the macrostate. In effect the more random and large the number of such possibilities, the larger is the quantum of missing info. I suggest you may find it useful to refer to what I said in the other thread, especially at 2, 5, 7, and 56 (notice how Shannon himself uses the term "entropy" in an informational context), with 15 - 16 and 23 - 25 above in this thread. I clip here from 56 in that other thread:
what happens when you have a message of N elements? In the case of a system of complexity N elements, then the cumulative, Shannon metric based information — notice how I am shifting terms to avoid ambiguity — is, logically, H + H + . . . H N times over, or N * H. And, as was repeatedly highlighted, in the case of the entropy of systems that are in clusters of microstates consistent with a macrostate, the thermodynamic entropy is usefully measured by and understood on terms of the Macro-micro information gap (MmIG], not on a per state or per particle basis but a cumulative basis: we know macro quantities, not the specific position and momentum of each particle, from moment to moment, which given chaos theory we could not keep track of anyway. A useful estimate per the Gibbs weighed probability sum entropy metric — which is where Shannon reputedly got the term he used from in the first place, on a suggestion from von Neumann — is:
>>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate. >>
Where, Wiki gives a useful summary:
The macroscopic state of the system is defined by a distribution on the microstates that are accessible to a system in the course of its thermal fluctuations. So the entropy is defined over two different levels of description of the given system. The entropy is given by the Gibbs entropy formula, named after J. Willard Gibbs. For a classical system (i.e., a collection of classical particles) with a discrete set of microstates, if E_i is the energy of microstate i [--> Notice, summation is going to be over MICROSTATES . . . ], and p_i is its probability that it occurs during the system’s fluctuations, then the entropy of the system is S_sys = – k_B [SUM over i's] P_i log p_i
Also, {- log p_i} is an information metric, I_i, i.e the information we would learn on actually coming to know that the system is in microstate i. Thus, we are taking a scaled info metric on the probabilistically weighted summmation of info in each microstate. Let us adjust:
S_sys = k_B [SUM over i's] p_i * I_i
This is the weighted average info per possible microstate, scaled by k_B. (Which of course is where the Joules per Kelvin come from.) In effect the system is giving us a message, its macrostate, but that message is ambiguous over the specific microstate in it. After a bit of mathematical huffing and puffing, we are seeing that the entropy is linked to the average info per possible microstate. Where this is going is of course that when a system is in a state with many possible microstates, it has enormous freedom of being in possible configs, but if the macro signals lock us down to specific states in small clusters, we need to account for how it could be in such clusters, when under reasonable conditions and circumstances, it could be easily in states that are far less specific. In turn that raises issues over IDOW. Which then points onward to FSCO/I being a sign of intelligent design.
In 57, I went on to make an estimate:
an estimate of what it would take to describe the state of 1 cc of monoatomic ideal gas at 760 mm HG and 0 degrees C, i.e. 2.687 * 10^19 particles with 6 degrees of positional and momentum freedom would help us. Let us devote 32 bits — 16 bits to get 4 hex sig figs, and a sign bit plus 15 bits for the binary exponent to each of the (x, y, z) and (P_x, P-y and P-z) co-ordinates in the phase space. We are talking about: 2.687 * 10^19 particles x 32 bits per degree of freedom x 6 degrees of freedom each _____________ 5.159 * 10^21 bits of info That is, to describe the state of the system at a given instant, we would need 5.159 * 10^21 bits, or 644.9 * 10^18 bytes. That is how many yes/no quest5ions, in the correct order, would have to be answered and processed every clock tick we update. And with 10^-14 s as a reasonable chemical reaction rate, we are seeing a huge amount of required processing to keep track. As to how that would be done, that is anybody’s guess.
Now, all of this is complex and in parts quite hard to follow, where the literature is complex and in my opinion usually not put together in the best way to introduce the puzzled student. But that is unfortunately typical in several fields of physics once things get highly mathematical. That is why I normally do focus on the information origination challenge directly, but in the end due to the underlying physics and chemistry linked to the OOL issue, thermodynamics is a connected issue, and one that is hotly contested. So, when it is raised directly as in the current set of threads here at UD and apparently an earlier set at TSZ, it has to be dealt with. One of the issues is that in Physics, there are diverse schools of thought on statistical mechanics/ or statistical thermodynamics/ or statistical thermophysics, and these schools diverge in views significantly on the links between thermodynamics and information theory. However in recent years the momentum has been shifting towards the acceptance of some of the insights of Jaynes et al. (That, BTW, is why I have asked whether those who have been discussing this at more technical level have been calling that name, it is a test of whether the informational school of thought has been reckoned with.) I should let Jaynes speak for himself, though a clip from Harry S Robertson in Statistical Thermophysics, Prentice:
Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." [p. 36]
Other than this, I think I find the summary here about as helpful as this field gets. After this, the Wiki article here -- which lacks adequate illustrations and introductory level cases -- may be also helpful. Leonard Nash's Elements of Statistical Thermodynamics may be a classic introduction (I am ashamed to have to admit, by a Chemist; Physicists have simply not done a very good job here I find), one that is blessedly short and uses apt illustrations and cases. After all the issues have been sorted out -- including loose use of the Shannon avg info per symbol term Entropy, which is explicitly an average per symbol to mean the cumulative info communicated by a string of N elements, which is N*H -- we come back to this: once we see a living cell, we know that at molecular level, it is in a tightly confined set of possibilities relative to the possible arrangements of the component molecules and atoms. That is, we have moved to a distinct macro-observable state that allows us to know a lot more about the molecules than we would if we were to simply take such a cell and prick it, then decant its components into a test tube, where they would be subject to diffusion etc. This sort of Humpty-Dumpty exercise has been done and the result is predictable: the cell never spontaneously reassembles. The message there is that the best explanation we have for the sort of relevant functionally specific, complex organisation and information we deal with, is that we have intelligently designed organising work -- design -- as its only observed and (in light of the needle in the haystack challenge) analytically plausible cause. This is controversial on OOL etc, not because that is not true but because there is a dominant ideology that thinks that such IDOW was not possible at the origin of cell based life, namely evolutionary materialism. they even wave the flag of science as the flag of their party. KFkairosfocus
kairosfocus, September 7, 2012 at 12:33 AM PDT
EDTA: The concept of Shannon entropy is useful in evaluating the functional information in a protein. That has been used by Durston. Indeed, there is a difference between the potential entropy of a string of a given length, that is simply the search space for that string, and the improbability of some specific functional state of that string. In that case, which is what really we are interested in in ID theory, we must calculate the probability of a functional state (for a specifically defined, measurable function), and that is made by dividing the number of functional states by the total number of states (the target space by the search space). For proteins, the Durston method assigns a Shannon value to each aminoacid in a specific protein molecule, according to the variation of that AA in the known proteome. IOWs, an aminoacid site that is free to change in a practically random way will have the maximum uncertainty. A functional constraint at that aminoacid site will result in some conservation throughout the proteome, and therefore in a reduction of uncertainty. Therefore, the functional constraint in a protein sequence correspond to the total reduction of uncertainty determined by function, versus the maximum uncertainty of a totally random, non functionally constrained sequence. This is a simple, indirect way of approximating the functional information in a protein family. So, to sum up: a) The total search space (potential improbability) of a string state depends only on its length and alphabet. For a protein sequence, it is 20^[protein length in aminoacids]. b) The improbability of a given functional state is the ratio between the functional space and the search space. A simple way of approximating that for protein families is by the application of a concept derived from Shannon entropy to the known proteome.gpuccio
gpuccio, September 6, 2012 at 11:49 PM PDT
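To make the per-site uncertainty-reduction idea that gpuccio describes concrete, here is a toy Python sketch. The four five-residue sequences are invented for illustration; this shows only the bare concept, not Durston's actual procedure or data.

import math
from collections import Counter

# Hypothetical aligned sequences from one imaginary protein family.
alignment = ["MKVLA", "MKVLG", "MRVLA", "MKILA"]

MAX_BITS_PER_SITE = math.log2(20)  # maximum uncertainty for an unconstrained amino-acid site

def site_entropy(column):
    """Shannon entropy (bits) of one aligned position."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

functional_bits = 0.0
for i in range(len(alignment[0])):
    column = [seq[i] for seq in alignment]
    functional_bits += MAX_BITS_PER_SITE - site_entropy(column)  # reduction of uncertainty at site i

print(round(functional_bits, 2), "bits of functional constraint in this toy alignment")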
To bolster the case that Shannon entropy might not apply to finite observations, consider this: to be able to calculate entropy, you have to have probabilities. If the probabilities are those of a finite string of symbols (heads/tails, e.g.) that came from the source, then you have to know the whole string. (If you can't see the whole finite string, then your calculation won't be worth anything, as the remaining symbols might deviate from the initial symbols' probabilities.) But if you know the whole string of symbols, then you already have full knowledge of the internals of the system in question. So there is no surprise remaining in the system. If you just intend to describe information storage capacity, perhaps that terminology would be more straightforward.EDTA
EDTA, September 6, 2012 at 07:56 PM PDT
I see where I misunderstood, and it may bring out one or two points that will help. Gpuccio said,
But if my program is 500 bit long, I need 500 coins. So, the Shannon entropy, or potential complexity, of the material system needs to be higher to allow a more complex design.
So for the purposes here, Shannon entropy or potential complexity is the storage capacity of the medium in question. I was taking Shannon entropy to be the amount of information/surprise gained by revealing the arrangement of the coins. In the former case, the maximum potential entropy is high, since each coin can be in either state. In the latter case, if we know the arrangement, a uniform arrangement (all heads, e.g.) is uninformative, and so the entropy would be low. Context is everything. But there's more to the 500 coin example. In the texts I have read, Shannon entropy was defined as the average bits per symbol emitted by a "source", calculated using Shannon's formula. As such, the length of the output is not given, nor is it needed. The entropy is thus a property of the source. But is Shannon entropy even defined for a finite string obtained from a source? Oh sure, you can apply his formula using the measured probabilities from the finite string, and get some number. But those measured probabilities are only approximations to the source's actual symbol probabilities because your sample is finite in length. And it always depends on what you know about the source or system in advance. Perhaps Shannon entropy is best left out of situations where a finite size/length system is being discussed.EDTA
EDTA, September 6, 2012 at 07:44 PM PDT
SC: I have just come from a fireworks of a public meeting on education issues, and have little time and less inclination to go over the problem again. You gave an entropy calc that gives a number for a mass of Al under given temp and pressure relative to a baseline, and which runs into incorrect system boundary issues in trying to address the difference between a flyable 747 and one torn up by a tornado. The calc you gave yields a relative number that is about Al being Al. It simply does not address the other aspects. And I have already said enough for those who need to see that there is more, for instance on the link between Shannon and Gibbs via Jaynes. KFkairosfocus
kairosfocus, September 6, 2012 at 06:31 PM PDT
Entropy is as simple to understand as "one if by land, two if by sea."

Mung, September 6, 2012 at 05:06 PM PDT
Entropy is much 'harder' to apprehend, let alone comprehend, than a simple computation. I believe your analysis is incorrect and incomplete, giving wrong entropy values.

butifnot, September 6, 2012 at 02:38 PM PDT
SC: The numbers are materially incomplete and the issue of system boundary is material to addressing the real change in entropy of the 747 damaged by a hurricane. The loss of Al because of how you did entropy accounting and set a system boundary is not the material shift. KF

So what are the correct entropy numbers? You're invited to provide them:

A. 747
B. Broken 747
C. Soda can

They don't need to be that exact, but maybe some ballpark numbers in Joules/Kelvin, and you can provide justification as to how you arrived at your figures. There are engineers reading this blog; they are entitled to an attempt at an answer. "I don't have an answer" is an acceptable answer. I don't think this is too much to ask if one is trying to make an inference about the evolution of a system.

scordova, September 6, 2012 at 12:36 PM PDT
SC: The numbers are materially incomplete and the issue of system boundary is material to addressing the real change in entropy of the 747 damaged by a hurricane. The loss of Al because of how you did entropy accounting and set a system boundary is not the material shift. KF

kairosfocus, September 6, 2012 at 12:09 PM PDT
Harry S Robertson, Statistical Thermophysics, Prentice. More later.

kairosfocus, September 6, 2012 at 12:06 PM PDT
Salvador:
But the point of my essays was to help readers understand what entropy really is.
So what *IS* entropy, really?
Arieh Ben-Naim is a modern antagonist of the term entropy. He advocates abandoning the word entropy altogether, and replacing it with missing information.
http://en.wikipedia.org/wiki/Arieh_Ben-Naim
http://www.amazon.com/dp/981437489X
http://www.ariehbennaim.com/books/entropyf.html

Mung, September 6, 2012 at 09:48 AM PDT
hi kf, What is the Robertson text you're referring to? As for Jaynes, are you referring to the 1957 paper? Information theory and statistical mechanics.
http://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes
http://en.wikipedia.org/wiki/Maximum_entropy_thermodynamics

Mung, September 6, 2012 at 09:12 AM PDT