Uncommon Descent Serving The Intelligent Design Community

Granville Sewell’s important contribution to physics: Entropy-X


Abstract:   In Sewell's discussion of entropy flow, he defines "Entropy-X", a very useful concept that should clarify misconceptions about entropy promulgated by Wikipedia and by scientists unfamiliar with thermodynamics. Sewell's important contribution is to argue that one can and should reduce the "atom-entropy" into subsets of mutually exclusive "Entropy-X". Mathematically, this is like factoring an N x M matrix into block-diagonal form by showing that the cross terms between blocks do not contribute to the total. Each block on the diagonal then corresponds to a separately computed entropy, or "Entropy-X". This contribution not only clarifies many of the misunderstandings of laypersons, it also provides a way for physicists to overcome their confusion regarding biology.

– o –

Introduction:     Entropy was initially discussed in terms of thermodynamics, as a quantity that came out of the energy, temperature, and work formulas.  Ludwig Boltzmann found a way to relate billiard-ball counting statistics to this thermodynamic quantity, with the formula later inscribed on his tombstone:  S = k ln(Ω). The right-hand side of this equation contains the logarithm of the number of possible ways to arrange the atoms. The left-hand side is the usual thermodynamic quantity. Relating these two different worlds of counting and heat is the constant "k", now called the "Boltzmann constant".
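As a purely illustrative aside (the particle and cell counts below are arbitrary choices of mine, not values from anything above), here is a minimal sketch of the counting side of that formula: tally the ways N indistinguishable "billiard balls" can be distributed over M cells, then let k convert the count into thermodynamic units.

```python
# Toy sketch of S = k ln(Omega): count arrangements of N indistinguishable
# "billiard balls" in M cells, then convert the count to J/K with Boltzmann's k.
from math import lgamma

K_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_omega(n, m):
    """ln of the number of ways to place n indistinguishable particles in m cells
    (stars-and-bars counting, done with log-Gamma to avoid overflow)."""
    return lgamma(n + m) - lgamma(n + 1) - lgamma(m)

def boltzmann_entropy(n, m):
    """S = k ln(Omega), in joules per kelvin."""
    return K_B * ln_omega(n, m)

print(boltzmann_entropy(6.02e23, 1.0e26))  # a mole of particles, arbitrary cell count
```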

Now for the shocking part. There is no theory that predicts its value. It is a conversion constant that is experimentally determined. It works best when the real physical system approximates billiard balls, such as the noble gases. The correspondence gets progressively worse, or needs more adjustments, as the gas becomes N2 (diatomic) or CO2 (triatomic). Going from gas to liquid introduces even more corrections, and by the time we get to solids we use a completely different formula.

For example, a 1991 Nobel Prize was awarded for studying how long, oily molecules move around in a liquid, because not every state of rearrangement is accessible to tangled strings.  So 100 years after Boltzmann, we are only now tackling liquids and gels and trying to understand their entropy.

Does that mean we don’t know what entropy is?

No, it means that we don't have a neat little Boltzmann factor for relating thermodynamic S to counting statistics. We still believe that it is conserved; we just can't compute the number very easily.  This is why Granville Sewell uses "X-entropy" to describe all the various forms of order in the system. We know they must be individually conserved, barring some conversion between the various types of entropy in the system, but we can't compute them very well.

Nor is it simply that the computation gets too large. For example, in a 100-atom molecule, the entropy is NOT computed by looking at all 100! permutations of the atoms, simply because many of those arrangements are energetically impossible.  Remember, when Boltzmann wrote "ln(Ω)" he was counting the possible states of the system. If a state is too energetic, it isn't accessible without a huge amount of energy.  In particle physics, this limitation is known as "spontaneous symmetry breaking", and it is responsible for all the variation we see in our universe today.

So rather than counting "atom states", we assemble atoms into molecules and form new entities that act as complete units, and then the entropy consists of counting "states of the molecule"–a much smaller number than "states of the atoms of the molecules".  Molecules form complexes, and then we compute "states of the molecular complexes". Complexes form structures, such as membranes, and then we compute "states of the structures". This process continues as we approach the size of a cell, and then we have to talk about organs and complete organisms, and then ecologies and Gaia. The point is that our "unit" of calculation is getting larger and larger as the systems display larger and larger coherence.
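Here is a toy sketch of that shrinking bookkeeping, with made-up numbers rather than anything computed from real molecules: if each independently moving unit can occupy any of M position bins, ln(Ω) scales with the number of units, so binding atoms into larger units directly lowers the count.

```python
# Illustrative sketch (toy numbers of my own): grouping atoms into rigid units
# shrinks ln(Omega). If each independent unit can sit in any of M cells, then
# ln(Omega) = (number of units) * ln(M), so fewer, larger units mean less entropy.
from math import log

M_CELLS = 1.0e6            # hypothetical number of position bins
N_ATOMS = 3.0e4            # hypothetical atom count

ln_omega_atoms     = N_ATOMS * log(M_CELLS)          # every atom free to roam
ln_omega_molecules = (N_ATOMS / 3.0) * log(M_CELLS)  # atoms bound into triatomic units

print(ln_omega_atoms, ln_omega_molecules)  # the molecular count is three times smaller
```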

Therefore it is completely wrong to talk about single-atom entropy and the entropy of sunlight when we are discussing structural entropy, for as Granville and previous commentators have said, the flow of energy in sunlight with a coherence length of one angstrom cannot explain the decameter coherence of a building.

So, from consideration of the physics, it is possible to construct a hierarchical treatment of entropy that enables entropy to address the cell; but in 1991 we had barely made it to the oily-liquid stage of development. So on the one hand, contrary to what many commentators imagine, physicists don't know how to compute an equivalent "Boltzmann equation" for the entropy of life; but on the other hand Granville is also right that we don't need to compute the Boltzmann entropy to show that it must be conserved.

– o –

Mathematical Discussion:   Sewell's contribution is to recognize that there must be a hierarchical arrangement of atoms that permits the intractable problem of calculating the entropy to be treated as a sum of mutually exclusive sub-entropies.

This contribution can be seen in the difference between "Entropy-chromium" and "Entropy-heat" that he introduces in his paper, where Entropy-chromium is the displacement of chromium atoms in a matrix of iron holding the velocity of the atoms constant, and Entropy-heat considers a variation in velocities while holding the positions constant.  These two types of entropy are separated by a large energy barrier, on the order of several eV per atom, that prevents them from interconverting. At sufficiently high temperature–say, the temperature at which the iron-chromium alloy was poured–the chromium atoms have sufficient mobility to overcome the energy barrier and move around. But at room temperature they are immobile.  So in the creation event of the chromium bar, the entropy was calculated for both position and velocity, but as it cooled, "spontaneous symmetry breaking" produced two smaller independent entropies from the single larger one.
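A rough numerical sketch of why the barrier freezes the chromium in place; the barrier height and the two temperatures below are illustrative assumptions on my part (the text only says "several eV" and "room temperature"), not measured values.

```python
# Rough sketch: the Boltzmann factor exp(-E/kT) shows why chromium atoms can hop
# between lattice sites in the melt but are effectively frozen at room temperature.
from math import exp

K_B_EV = 8.617e-5          # Boltzmann constant in eV/K
E_BARRIER = 3.0            # assumed migration barrier, eV per atom ("several eV")

def hop_probability(temp_kelvin):
    """Relative probability that a thermal fluctuation clears the barrier."""
    return exp(-E_BARRIER / (K_B_EV * temp_kelvin))

print(hop_probability(1800.0))  # near pouring temperature: ~4e-9 per attempt, so with
                                # ~1e13 attempts per second the atoms stay mobile
print(hop_probability(300.0))   # room temperature: ~1e-50 per attempt, effectively never
```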

Now the beginning of this calculation is the full-up atom entropy, where everything is accessible. This "big-bang" entropy gives at least 7 degrees of freedom for each atom. That is, the number of bins available for each atom is at least 7: one for the species, 3 that give the position in x, y, z, and 3 that give the velocity in Vx, Vy, Vz.  We could subdivide species into all the different quantum numbers that define atoms, but for the moment we'll ignore that nuance.  In addition, the quantization of space and velocity into "Planck"-scale units (about 10^-35 meters for length) means that our bins do not hold arbitrary real numbers, but a quantized length or velocity specified by an integer number of Planck units.  But again, the operative quantization is that atoms don't overlap, so we can use a much coarser quantization of 10^-10 meters, or angstrom atomic lengths. The reason this is important is that we are reducing ln(Ω) by restricting the number of states of the system that we need to consider.
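To get a feel for how much the coarser graining matters, here is a quick sketch using an assumed 1 cm box (my own illustrative choice, not a number from the text): the count of position bins per atom, and hence the contribution to ln(Ω), drops enormously in going from Planck-scale to angstrom-scale bins.

```python
# Quick sketch: how the number of position bins per atom depends on the coarseness
# of the quantisation, for a hypothetical 1 cm cube of space.
from math import log

BOX = 1e-2                                     # metres, assumed 1 cm cube
for label, dx in [("Planck-scale bins (~1e-35 m)", 1e-35),
                  ("angstrom bins (1e-10 m)", 1e-10)]:
    ln_bins = 3 * log(BOX / dx)                # ln of (BOX/dx)**3 position bins
    print(label, "-> ln(position bins per atom) =", round(ln_bins, 1))
```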

But if S = k ln(Ω), then this means we are mathematically throwing away entropy! How is that fair?

There is a curious result in quantum mechanics which says that if we can't distinguish two particles, then there is absolutely no difference if they are swapped. This is another way of saying that their position entropy is zero. So if we have two states of a system, separated by a Planck length, but can't tell the difference, that distinction doesn't contribute to the entropy.  Now this isn't to say that we can't invent a system that can tell the difference, but since resolving a Planck length would require photons far more energetic than gamma rays, we really have to go back to the Big Bang to find a time when this entropy mattered.   This reconfirms our assumption that as a system cools, it loses entropy and has fewer and fewer states available.

But even this angstrom coarse-graining in position, represented by "Entropy-Chromium", is still too fine for the real world, because biology is not made of noble gases bouncing in a chamber. Instead, life exists in a matrix of water, an H2O molecule of sub-nanometer size. Just as we cannot tell the difference if we swap the two hydrogen atoms in the water molecule around, we can't tell the difference if we swap two water molecules around. So the quantization gets even coarser, and the number of states of the system shrinks, simply because the atoms form molecules.

A very similar argument holds for the velocities. A hydrogen atom can't have every velocity possible, because it is attached to an oxygen atom. That chemical bond means that the entire system has to move as a unit. But QM tells us that as the mass of a system goes up, its de Broglie wavelength goes down, which is to say, the number of velocities we have to consider in our binning is reduced as the system becomes more massive. Therefore the velocity entropy drops as the system becomes more chemically bound.
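For completeness, here is a neutral numeric sketch of the relation invoked above, λ = h/(mv); the speeds are round illustrative values of mine, not derived from the text.

```python
# Sketch of the de Broglie relation lambda = h / (m v): the wavelength shrinks
# as the mass of the moving unit grows.
H_PLANCK = 6.626e-34        # Planck constant, J*s
AMU      = 1.661e-27        # atomic mass unit, kg

def de_broglie(mass_kg, speed_m_s):
    return H_PLANCK / (mass_kg * speed_m_s)

print(de_broglie(1.0 * AMU, 1500.0))    # a lone hydrogen atom at a thermal-ish speed
print(de_broglie(18.0 * AMU, 600.0))    # a whole H2O molecule moving as one unit
```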

And of course, life is mostly made out of polymers of 100’s or 1000’s of nanometers in extent, which have even more constraints as they get tangled around each other and attach or detach from water molecules. That was what the 1991 Nobel prize was about.

Mathematically, we can write the "Big Bang" entropy as an N x M matrix, where N is the number of particles and M the number of bins. As the system cools and becomes more organized, the entropy is reduced and the matrix becomes "block-diagonal", where the blocks correspond to molecules, polymer chains, cell structures, organelles, cells, etc.
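Here is a minimal sketch of that block-diagonal bookkeeping, with arbitrary toy state counts: when the cross-terms between blocks are inaccessible, the accessible states factorise, so ln(Ω_total) is just the sum of the per-block ln(Ω) terms, one "Entropy-X" per block.

```python
# Sketch of the block-diagonal bookkeeping (state counts are arbitrary toy values):
# with cross-terms inaccessible, Omega_total is the product of the per-block Omegas,
# so the total entropy is the sum of independently computed sub-entropies.
from math import log

block_state_counts = [10**6, 10**4, 10**9]             # accessible states in each block

sub_entropies  = [log(w) for w in block_state_counts]  # ln(Omega) per block, one "Entropy-X" each
ln_omega_total = sum(sub_entropies)                    # ln of the product over blocks

print(sub_entropies, ln_omega_total)
```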

Now here is the key point.

If we only considered the atom positions, and not these molecular and macro-molecular structures, the matrix would not be in block-diagonal form. Then, when we computed the Boltzmann entropy ln(Ω), we would have a huge number of states available. But in fact biology forms so much structure that ln(Ω) is greatly reduced.  All those cross-terms in the matrix are empty, because they are energetically inaccessible or topologically inaccessible (see the 1991 Nobel prize). Sewell is correct: there is a tremendous amount of order (or reduced entropy) that is improperly calculated if life is treated as a ball of noble gas.

Let me say this one more time. Sewell is not only correct about the proper way to calculate entropy, he has made a huge contribution in arguing for the presence of (nearly) non-convertible forms of sub-entropy.

– o –

Thermodynamic Comment:   Some have argued that this "sub-entropy" of Sewell's can be explained by some sort of spontaneous symmetry breaking due to the influx of energy. We have talked about "cooling" causing spontaneous symmetry breaking, which is consistent with the idea that higher temperatures have higher entropy, but the idea that "heating" can also drive symmetry breaking and lowered entropy is incoherent. This is simply because, thermodynamically, dS = dQ/T: an influx of energy brings entropy along with it, with the temperature setting the rate of exchange.

Let's look at this a bit closer and apply it to the Earth.  The Sun's surface is at about 5500 K, and therefore its energy arrives principally in the form of visible (yellow) photons.  The Earth's global temperature averages out to about 300 K, so it emits infrared photons.   In steady state, the energy entering has to balance the energy leaving (or the Earth would heat up!).  Since the photons hitting the Earth carry almost twenty times less entropy than the photons leaving it, the Earth must be a net source of entropy. Ergo, sunlight should be randomizing every molecule on the Earth and preventing life from happening.
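A back-of-envelope sketch of that photon budget, under the standard estimate that thermal radiation carries entropy of roughly (4/3)/T per unit energy; the temperatures are the ones quoted in the text.

```python
# Sketch of the Earth's photon entropy budget: with energy in balancing energy out,
# the ratio of outgoing to incoming entropy is roughly T_sun / T_earth.
T_SUN, T_EARTH = 5500.0, 300.0                  # kelvin, as quoted above

s_per_joule_in  = (4.0 / 3.0) / T_SUN           # entropy per joule of incoming sunlight
s_per_joule_out = (4.0 / 3.0) / T_EARTH         # entropy per joule of outgoing infrared

print(s_per_joule_out / s_per_joule_in)         # ~18: the Earth is a net entropy exporter
```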

Does this make sense? I mean, everybody and their brother says that entropy can decrease if you have a heat engine in the system. Energy comes into your refrigerator as low-entropy electricity. Energy billows out of the coils in the back as high-entropy heat. But inside the fridge is a low-entropy freezer.  Couldn't this apply to Earth? (E.g., the compensation argument.)

Well, if the Earth had a solar-powered refrigerator, and some insulation, and some fancy piping, yes. But of course all of that is an even lower-entropy system than the electricity, so we are invoking bigger and bigger systems to get the entropy in the small freezer to go down. The more realistic comparison is without all that special pleading. Imagine you leave your freezer door open accidentally and leave for the weekend. What will you find? A melted freezer and a very hot house, because your fridge would run continuously trying to cool something down that warmed even faster. In fact, this is how dehumidifiers function. So your house would be hot and dry, with a large puddle of coolish water in the kitchen. Some fridges dump that water into a pan under the heating coils, which would evaporate the water, and we would then be back to our original state but at a higher temperature. Would the entropy of the kitchen be greater or smaller than if you had unplugged the fridge for the weekend? Greater, because of all that heat.

And in fact, this is the basic argument for why objects that sit in space for long periods of time have surfaces that are highly randomized.  This is why Mars can’t keep methane in its atmosphere. This is why Titan has aerosols in its atmosphere. This is why the Moon has a sterile surface.

If the Earth is unique, there is more than simply energy flow from the Sun that is responsible.

Comments
F/N: Thaxton et al use E for internal energy, not U. Also beware of the divergent terminologies on isolated, closed and open systems; also there are different sign conventions for the direction of work flows relative to a system, work on the system being positive or negative depending on the author. And more.
kairosfocus
July 11, 2013 02:28 AM PDT
KS: With all reasonable respect, it is high time to stop the ideological posturing, especially red herrings led away to strawmen and soaked in ad hominems then set alight to cloud, confuse, poison and polarise. Serious matters are on the table and they are being addressed seriously, hopefully this will not have to get a lot more complex beyond this point. KF PS: With aid of Wiki etc, TMLO here [an HTML excerpt] 7 and then 8, from chs 7 - 9 especially, may be a very useful primer on the sort of issues that lurk here.
kairosfocus
July 11, 2013 02:25 AM PDT
OOPS: N_A = 6.02 * 10^23
kairosfocus
July 11, 2013 02:16 AM PDT
EL (attn RS): Pardon a comment on your exchange with RS. A fundamental point of phenomena in our world is that things have a temperature, which is a metric of average random energy per molecule (or similar entity) per degree of accessible freedom in light of relevant barriers of access. This thermal energy is usually dispersed in a 3-d pattern, across translation [3 degrees corresponding to the three orthogonal axes], rotation, vibration modes etc. Consequently, heat capacity exists as a measure of how much added heat it takes to increase the temp of a given substance. "Surprise," it varies with temperature itself in many key cases, i.e. there is a phenomenon known as freezing out of degrees of freedom. A theoretical gateway into understanding this is from the gases, and our knowledge that bodies in thermal equilibrium A and B, and B and C are such that A and C will also be in thermal equilibrium, i.e. temp, T, is an equivalence relationship. We can go to our marbles in a piston model and work out that for Maxwell Boltzmann statistics for ideal gas -- dilute noble gas in effect [though we can usually get away with using thin enough air etc] -- molecules, the molar gas constant R (8.314 J/K mol), divided down by number of molecules in a mole N_A [6.02 * 10^123 or so) and with some suitable fractional parameter to adjust for number of modes, will give us a measure of the energy accessible per particle on simply being at a given temp. k_B = R/N_A is therefore a useful constant, and 1/2 * kT energy per degree of accessible freedom is a useful estimate that can be worked out from the above. Basically, if your potential wells or accessible modes can move in increments of 1/2 * kT, then they are accessible at T. Going quantum brings to bear steps of energy and an irremovable zero point energy. As of now we have accessible energy from the random energy per degree of freedom from having a temp. This is available locally to move things around in possibility space, leading to a natural tendency to explore the space of possibilities for energy and mass to be distributed at molecular or comparable level, in a blind, random walk. This makes Boltzmann's special case for entropy, where number of possible distributions, W, is flat random accessible, a very important value indeed: S = k * Log (W) Gibbs worked out a different approach, which brings to bear that here may be potential barriers as hinted at by RS in the initial post, so that different modes are more or less accessible on a case by case basis, essentially S = [SUM on i] p_i * log (P_i) Complication. In the quantum world, often the relevant one at this stage, potential hills that have to be surmounted to move from one valley to another -- the source of metastability of various configs of entities, including in chem rxns and the like -- are porous, i.e. there is a finite probability of tunnelling that depends on accessible energy and strength of the barrier. Then come the many special cases, even oil-water mixing and fluid-fluid mixing more generally. In these cases, Physicists and Chemists tend to combine laws and derive thermodynamic potentials that evaluate drivers, Gibbs -- same fellow -- free energy being maybe the most useful. Wiki has a nice description:
Just as in mechanics, where potential energy is defined as capacity to do work, similarly different potentials have different meanings. The Gibbs free energy is the maximum amount of non-expansion work that can be extracted from a closed system; this maximum can be attained only in a completely reversible process. When a system changes from a well-defined initial state to a well-defined final state, the Gibbs free energy delta-G equals the work exchanged by the system with its surroundings, minus the work of the pressure forces, during a reversible transformation of the system from the same initial state to the same final state.[2] Gibbs energy (also referred to as [delta-]G) is also the chemical potential that is minimized when a system reaches equilibrium at constant pressure and temperature. Its derivative with respect to the reaction coordinate of the system vanishes at the equilibrium point. As such, it is a convenient criterion of spontaneity for processes with constant pressure and temperature . . . . The Gibbs free energy is defined as: G(p,T) = U + pV - TS which is the same as: G(p,T) = H - TS [--> i.e. enthalpy, roughly heat content, H = U + PV, a sort of generalisation of internal energy plus pressure-volume energy] where: U is the internal energy (SI unit: joule) p is pressure (SI unit: pascal) V is volume (SI unit: m3) T is the temperature (SI unit: kelvin) S is the entropy (SI unit: joule per kelvin) H is the enthalpy (SI unit: joule)
Essentially, if delta_G is negative, a spontaneous process is favoured, and if positive, it is not. At 0, we are at equilibrium and tend to be there. This embeds entropy and involves especially changes at molecular levels. In effect mix 1st and 2nd laws to give the TdS equation, and bring to bear factors. That is, Gibbs free energy consideration are materially based on entropy considerations, and are effectively entropy enabled/disabled in key parts. As a result of these considerations, there are ways in which counting possible states is related to energy and system behaviour, via the existence of temperature. Similarly, we have seen why energetically uphill cases tend to be disfavoured, we would have to inject energy to move uphill, energy not easily accessible in the system from temperature or easy rearrangements. Catalysis works by lowering potential barriers through providing alternative paths, and enzymes and other biological nano-machine process units also use ATP as an energy enabler that injects the jump to make the leap. Enzymes are of course proteins, ribosomes are assemblies of proteins and RNA, etc. Each of these is complex and built up from highly endothermic molecules that are highly functionally specific and complex. To set up such internal process units in a sort of nano scale chem engineering system, we have encapsulation with smart gating, organised metabolic pathways of extreme complexity and integration strongly reminiscent of a refinery network (recall those wall-sized cell biochem pathways charts?), and code based von Neumann self replication, all of which perform important and necessary functions. Essentially, none of this is such that diffusion-like mechanisms, would favour formation of such a complex organised entity. RS, has raised issues of long range interactions in response to such. This (if I understand him right . . . and note his remark on sloppy expression of what he really means) is essentially an appeal to self-organisation based on structures within a system, and perhaps to ordering based on boundary conditions. This is of course back to the point that ordering is not the same as wiring diagram based functional organisation in the face of high contingency. Where, information holding capacity is directly linked to the requisite of high contingency. However, obviously, if there are barriers too high for relevant accessible thermal energy to surmount [about 1/40 electron Volt at "room temp" is a longstanding rule of thumb [I usually remember this from the rule of thumb energy of thermal neutrons in nuclear interactions . . . ], where red light photons hold about 2 eV each and blue light might be about 4 eV), accessible state counting and assumption of random walks across so accessible states on assumption of thermal energy accessibility are going to break down. This is going to partition up access to entropy. As a useful example, a major part of why oil and water don't mix is that the water would kill off access to many ways to interact among the oil molecules so the statistics lock us up into separate phases. This is part of what is being exploited in the lipid bilayer membranes used by cells, but cell membranes are not just layers, they enfold sensors, sophisticated gating and more, integrated into the wider cell system. This brings us to the significance of assembly by organised shaft work, i.e. constructive work, leading to things that exhibit functionally specific, complex organisation and associated information [FSCO/I]. 
It is underscored that diffusion and similar forces do not point that way, but away from it. It is underscored that long ranger interactions and potential barriers etc, point to mechanically necessary ordering as opposed to highly contingent functionally specific organisation. It is underscored that the empirically observed, reliably known source of FSCO/I is design. Where also the living cell, including the simplest we see or can conceive, will be chock full of such FSCO/I. (Start with geometry based nanomachines dependent on homochiral molecules to work, embedding essentially 1 bit per monomer just from this.) This brings us back home to the home base of design thinking in the world of life, OOL. Thence the observation that OOL is the necessary root of the Darwinist tree of life, so we see that no roots, no shoot, no tree. So also, we see that the best empirically and analytically warranted account of the origin of FSCO/I --- a major component of any reasonably conceivable first cell based life form -- is design, serving to co-ordinate constructive work through shaft work producing energy conversion devices. OOL is the pivot. KFkairosfocus
July 11, 2013 02:13 AM PDT
Robert:
#26, #37 EBL No, there are many 2nd laws, as Granville says. There is one for thermo entropy S, and one for chromium entropy S1, one for socks entropy S2 … Only in S=k ln(W) where W=noble gas statistics, is it possible to convert thermo entropy to noble-gas-statistical entropy. This one success, however, suggests that S1 = ln(W1), S2=ln(W2)…are all conserved, even if we don’t know “k”, the conversion constant to thermal entropy S. The reason it is SLoT (emphasizing T), is because entropy was defined in the context of heat engines, Carnot cycles, and steam power, long before Ludwig Boltzmann converted it to statistics. Boltzmann’s discovery enabled it to be exactly computable from statistics of ordering. This was considered a “deep” result, having far more explanatory power than simply heat engines, since just about anything could be discussed in terms of ordering.
Well, he didn't "convert it to statistics". He expressed it in terms of the probability of microstates, but his formula has units of energy – hence the constant. You can use it purely statistically, as in Shannon's entropy, but then the SLoT wouldn't necessarily apply. That's why I keep saying that it's important to say what a p value is the probability of, under what conditions. In Boltzmann's formula, the p value is the probability of microstates, which are defined in terms of energy. The constant gives you the answer in joules/kelvin.
#68 EBL You are confusing “energy” with “entropy”. The 1st law was proven by a British spy in the Revolutionary War, who nonetheless got an elementary school named after him in Woburn, Massachusetts–Count Rumford. He showed that work could be converted to heat by a precise ratio. They were convertible. The same could not be said for entropy, the topic of the 2nd law. Granville (and all physicists) have no trouble with converting energy and work, we just have trouble with people who convert thermal entropy into life, or Shannon entropy, or chromium entropy, or socks entropy… Your comment about “entropy not being conserved,.. but decreasing” is mistaken. The SLoT prohibits thermal entropy of closed systems from decreasing–it is either conserved and constant, or increasing. Likewise your comment that SLoT “is about energy” is also mistaken. The appropriate equation is dS = dQ/T. Entropy is about the ratio of energy over temperature. Quite another sort of bird altogether.
Point taken – actually I mistyped (meant “increasing”). I was thinking “order”. And yes I probably was conflating energy with entropy (joules vs joules per kelvin). However, that doesn’t actually mean that the 2LoT is not about energy: dS = dQ/T is an equation with an energy term. So how can it not be about energy? It seems to be you and Granville who think that “sock entropy” has something to do with the 2LoT!
Again, I wasn't discussing energy gradients, but entropy gradients. So your comments aren't relevant to my discussion. Nevertheless, atmospheric scientists refer to tornadoes as "entropy-driven" systems because of the way vorticity plays a part in their creation–NOT energy gradients!
Well, internal energy gradients, surely? But your point is taken. I shall take more care with my units!
#70 EBL Thermodynamics is NOT about energy primarily, look at the word–”thermo”= heat, “dynamics” = motion. Its about heat flow. Or more precisely dS = dQ/T, but I already said that.
But both heat and motion are about energy! That’s like saying that velocity is not about distance travelled!
#72, 73, 76, 86, 89 etc Lizzie, neither dust devils nor tornadoes violate SLoT, but they neither are they spontaneous. They are driven by gradients of entropy (not gradients of energy). Your Wikipedia reference to convection layers is missing a big piece of the physics, because if simply convection were enough, you should see tornadoes in every pot of water that is boiling on the stove.
Well, you sometimes do. And yes, of course they do not violate the SLoT, but they are perfectly “spontaneous” in the sense of not being designed. In other words, contra Granville’s claim, low entropy systems do emerge, regularly, on earth, without violating the 2LoT, and without being designed. A dust devil is precisely the kind of refrigerator you seemed to be implying required something not available on earth without some kind of intervention.
But I’m glad you brought them up–they are a good example of how to generate order using entropy gradients.
Yes, they are indeed.
Shannon entropy has different units from thermal entropy. They are as far from each other as fish and bicycles.
Yes, I know. That was my point. You can't just take a law that applies to fish, and then express concern when it is violated by bicycles.
Elizabeth B Liddle
July 11, 2013 12:52 AM PDT
Timaeus, I want to thank you again for suggesting that I take a closer look at Robert Sheldon's OP and comments. The whole situation is just fantastic. You try to defend Granville's wretched paper, saying that he doesn't claim that evolution violates the second law. But you refer me to Robert, a PhD physicist who not only thinks that evolution violates the second law, he even thinks that a tree violates the second law every time it sprouts a new leaf! And who casually dismisses violations of the second law:
Don’t have a cow Keith, our eternal destiny doesn’t depend on keeping the 2nd law, fortunately.
Well, if Robert defends Granville's paper, it must be good. ID "science" is in beautiful shape. More, please! Do you have any other recommendations? Anyone else whose writings I should take a closer look at? :D P.S. I hope you've learned your lesson about credentials.
keiths
July 11, 2013 12:42 AM PDT
Robert Sheldon:
#32 CS3 However, the mistake of Styer, Bunn, and Lloyd, is that they think entropy is convertible, so for example, chromium entropy can be “compensated” by thermal entropy. But this is the one thing we don’t know, how to convert S1, S2 into S and thereby add or subtract from the sum. This is why Sewell conserves them seperately.
I completely agree. Just to be clear, my comment was definitely not directed at you (it was primarily for EL). While I have not yet fully digested your initial post, I certainly also appreciate Sewell's paper. Thanks for your interesting post!
CS3
July 10, 2013 07:35 PM PDT
EL: It seems you want for me to try to prove over and over again that you and ilk are setting up and knocking over strawmen, the better to dismiss such and repeat falsehoods over and over again as though I am guilty by your assertions. (This is similar to how you hosted slander, then denied for months then when it was shown beyond dispute tried to defend it, in part on what J'cans call "a nuh nutten.") Has it as yet registered with you that: (a) The Clausius situation I began my analysis c. 2006 - 2008 with, has local entropy variations that are positive and negative, but are connected in particular ways by heat transfer and work transfers leading to the 2nd law? (which BTW then sets up relations that apply this law to other case4s that are not isolated.) Let me refresh your memory:
Heat transfer in Isolated system: || A (at T_a) –> d’Q –> B (at T_b) || dS >/= (-d’Q/T_a) + (+d’Q/T_b), Lex 2, Th in a common form [--> FYI, A here has a local entropy reduction, linked to the rise in B such that Lex 2 th emerges] Heat engine, leaving off the isolation of the whole [--> case 2, extending to the heat engine and with contexts for energy converters in general]: A –> d’Q_a –> B’ =====> D (shaft work) Where also, B’ –> d’Q_b –> C, heat disposal to a heat sink Where of course the sum of heat inflows d’Q_a will equal the sum of rise in internal energy of B dU_b, work done on D, dW, and heat transferred onwards to C, d’Q_b. [--> Shaft work is what in our observation performs constructive work, from turning the drive shaft of an engine to powering a robot's limbs etc to do constructive work, even "bio-robots" i.e. our limbs.]
(b) Notice this, from GS's Second Thoughts on the second Law article, from about a decade ago:
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative. [--> he highlights how heat transfers follow diffusion-like laws] The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur . . . . What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in "Can ANYTHING Happen in an Open System?", "order can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door [--> or be imposed by bulk laws such as that heated bodies of fluids, e.g. air, expand and so lower their density, leading to upthrust and floatation in teh wider atmsophere, which will tend to draw in air from elsewhere, leading to a convection loop, thence winds and related phenomena including vortices etc.] ... If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth's atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here." Evolution is a movie running backward, that is what makes it special. THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn't, that atoms would rearrange themselves into spaceships and computers and TV sets . . .
(C) Returning to what I argued, the challenge then goes to what diffusion-like processes and the like can credibly do (given available atomic and temporal resources on solar system or cosmos as a whole, constraining scope of sampling of large config spaces of beyond astronomical scale), vs what organised shaft work is easily observed to do, when constructive work issuing in FSCO/I occurs. More from 6 above:
The pivotal questions are, the performance of constructive work by dW, the possibility that this results in FSCO/I, the possibility that B itself shows FSCO/I enabling it to couple energy and do shaft work, and the suggested ideas that d’Q’s can spontaneously lead to shaft work constructing FSCO/I as an informational free lunch. [--> notice, what I set out to address in a context where the matter of local shifts in entropies of interacting bodies had been addressed already.] By overwhelming statistical weights of accessible microstates consistent with the gross flows outlined, this is not credibly observable on the gamut of our solar system or the observed cosmos. [--> Notice, what I am saying, why, and why this implies that the attempts to twist me into denying lex 2 th are strawman tactics] There is but one empirically, reliably observed source for FSCO/I, design. The analysis on statistical weights of microstates and random forces such as diffusion and the like, shows why. [--> the nanobots and microjets thought exercise, clipped at 44 above after several posts in which KS's strawman tactics are exposed and corrected on lognstanding record, draws out why] Which brings us to the point Hoyle was making: letting his tornado stand in for an agitating energy source triggering configs at random, tornadoes passing through junkyards don’t build jumbo jets or even instruments on their panels. But similar intelligently directed expenditures, through precise patterns of work, routinely do build jumbo jets.
(d) In short, I have at no point tried to suggest that when we have coupled changes in energy in local subsystems, there are no local entropy decreases or increases. Just the opposite, laid out in diagrams and algebra several times in this thread alone. So, to pretend or suggest otherwise by setting up and knocking over a strawman, is a false accusation by implication. (e) Similarly, at no point that I am aware of has GS tried to imply that a local decrease of entropy of an interacting subsystem is impossible. he is pointing out that when the manner of interaction is by diffusion like forces, it by probability reasons linked to clusters of microstates and the occurrence of utterly dominant scattered states due to sheer statistical weight, typical entities that manifest FSCO/I are not credible products. (f) That is, what I am doing is bringing to bear an explicit Clausius law and shaft work organised to carry out constructive work analysis that draws out why this is so. (g) When therefore you suggest vortices as cases of local order emerging by thermomynamic forces feeding into fluid dynamics, you are NOT answering to the relevant case but are setting up and knocking over a strawman. Just as KS's attempt with freezing ponds was doing the same. (h) Do I need to clip here my discussion of hurricanes and snowflakes again, from 42 above to highlight that I have in fact addressed both vortices and crystallisation as claimed counter examples, by highlighting just how these are not FSCO/I and wiring diagram organisation, but cases of order fed by heat flows? (Or, should I let it suffice to state that I specifically said that I was clipping this in anticipation of objections using such cases? If you doubt me just click the just linked.) (i) In short, I have abundant reason to point out that that which KS sneeringly dismissed as spamming the thread, was on target and substantially relevant. Pardon some fairly direct words: had he and you simply been willing to recognise that someone on the design side may actually know a bit of what they were talking about, you would have saved us all a lot of going in circles of responding to strawman tactics. KFkairosfocus
July 10, 2013 07:06 PM PDT
Robert:
#64–66 Axel Seems to be channelling KS while under the influence. Read above rebuttals to KS.
Lol. Axel is on your side, Robert, though you probably wish he weren't.
keiths
July 10, 2013 06:36 PM PDT
Elizabeth:
But that air was still before it got into motion and became a dust devil (let’s ignore the tornado for now, as the dust devil energy accounting is simpler)
That's not a given- that the air was still.
And the reason the air moves (upwards) is that it is heated by the hot ground.
And the air above that hot ground is cooler. It is a simple event.
Joe
July 10, 2013 06:25 PM PDT
This is a long comment thread, and I was chastised for having abandonned it too soon. I will try to answer some of the questions raised about my meanings and definitions. #26, #37 EBL No, there are many 2nd laws, as Granville says. There is one for thermo entropy S, and one for chromium entropy S1, one for socks entropy S2 ... Only in S=k ln(W) where W=noble gas statistics, is it possible to convert thermo entropy to noble-gas-statistical entropy. This one success, however, suggests that S1 = ln(W1), S2=ln(W2)...are all conserved, even if we don't know "k", the conversion constant to thermal entropy S. The reason it is SLoT (emphasizing T), is because entropy was defined in the context of heat engines, Carnot cycles, and steam power, long before Ludwig Boltzmann converted it to statistics. Boltzmann's discovery enabled it to be exactly computable from statistics of ordering. This was considered a "deep" result, having far more explanatory power than simply heat engines, since just about anything could be discussed in terms of ordering. #32 CS3 However, the mistake of Styer, Bunn, and Lloyd, is that they think entropy is convertible, so for example, chromium entropy can be "compensated" by thermal entropy. But this is the one thing we don't know, how to convert S1, S2 into S and thereby add or subtract from the sum. This is why Sewell conserves them seperately. #27 collin ---> has the gist of the argument. We can have conservation laws of things without quantifying the things. E.g, most of the time we don't know "S0", the additive offset in thermodynamic entropy calculations. #39 KF Has supported me and GS, but I would point out that diffusion ASSUMES local forces only. Once again, the theory of diffusion that is used by GS to discuss SLoT, assumes that heat acts randomly like noble-gasses diffusing. When long-range forces dominate, as in long, oily molecules, proteins, DNA, or fully ionized plasmas, then diffusion doesn't behave the way it is expected, the conversion constant "k" is unknown, and the system is said to be "open". #40, #46, #57, #61 KS Premise 2 is wrong. S(C) =/= S(A) + S(B). a)Tallis discusses the possibility of "non-extensive" entropy that violates this assumption even for heat. b)Every physics book has the trick problem of a gas expanding into a vacuum with the trick question, what is the final temperature? It turns out that the sum is undefined because the entropy is "space-dependent". c)But even more importantly, only if A and B are EXACTLY the same entropy, say, thermal entropy, is it even likely that they can be added. And even then, many other conditions have to be met. They have to have the same kind of "atom", otherwise like mixing water and alcohol, the mixture is less than the sum. All of which Granville explains clearly, and you confuse. #45, 52 GD No, Granville did compensation properly. You assumed that all forms of entropy are convertible. They aren't, or at least, if they are then somebody needs to go collect their Nobel Prize as they did in 1991 for oily liquids. Therefore molecular entropy lost in biological growth CANNOT be converted to thermal entropy from the Sun, and thus cannot be "compensated", as you mistakenly aver. #49-51 Yes, I am defending Granville's paper. It's very clear and nicely written. One can refuse to engage the defenses, but of course, that opens one up to being out-flanked. #53 KS (and #54, #63 T) Don't have a cow Keith, our eternal destiny doesn't depend on keeping the 2nd law, fortunately. 
As far as "thermodynamics" is concerned, the cell is a heat engine, consuming fuel in the form of glucose, and putting out waste in the form of CO2 and H2O, as well as intermediate sized acids. This does not violate SLoT, nor could it, or else we wouldn't have to eat to live. But it also says nothing about where the heat engine came from. Clausius or Carnot or Boltzmann said nothing about where the steam engine originated, they only said they could describe its use of coal and water and its output of work. The Origin of Life (OOL), the origin of chromium entropy, the origin of socks entropy VIOLATES the 2nd Law of Thermodynamics when interpreted by Boltzmann as the statistical order of these objects. This is easily explained, it is because all these things mentioned are in an "open" system, their origin is external to the system. And that was the whole point of Granville's paper. One more time: if we restrict ourselves to thermal entropy alone, then the cell strictly obeys the 2nd Law. If we allow ourselves the luxury of applying Boltzmann's definition, S=k ln(W), then the cell no longer obeys this formulation, as best as we can estimate W. #58 KS You keep asking about what people think. Are you religious? If so, then what faith? Some religions put emphasis on epistemology (knowing the truth), others on metaphysics (being the truth), while others on ethics (obeying the truth). From your posts, I would assume you are a category 2 believer. Which may be why Granville's paper bothers you. #59 KF I am notoriously sloppy in my wording, but I'm sure we could clarify our respective positions and find agreement. #64--66 Axel Seems to be channelling KS while under the influence. Read above rebuttals to KS. #68 EBL You are confusing "energy" with "entropy". The 1st law was proven by a British spy in the Revolutionary War, who nonetheless got an elementary school named after him in Woburn, Massachusetts--Count Rumford. He showed that work could be converted to heat by a precise ratio. They were convertible. The same could not be said for entropy, the topic of the 2nd law. Granville (and all physicists) have no trouble with converting energy and work, we just have trouble with people who convert thermal entropy into life, or Shannon entropy, or chromium entropy, or socks entropy... Your comment about "entropy not being conserved,.. but decreasing" is mistaken. The SLoT prohibits thermal entropy of closed systems from decreasing--it is either conserved and constant, or increasing. Likewise your comment that SLoT "is about energy" is also mistaken. The appropriate equation is dS = dQ/T. Entropy is about the ratio of energy over temperature. Quite another sort of bird altogether. Again, I wasn't discussing energy gradients, but entropy gradients. So your comments aren't relevant to my discussion. Nevertheless, atmospheric scientists refer to tornadoes as "entropy-driven" systems because of the way vorticity plays a part in their creation--NOT energy gradients! #70 EBL Thermodynamics is NOT about energy primarily, look at the word--"thermo"= heat, "dynamics" = motion. Its about heat flow. Or more precisely dS = dQ/T, but I already said that. #72, 73, 76, 86, 89 etc Lizzie, neither dust devils nor tornadoes violate SLoT, but they neither are they spontaneous. They are driven by gradients of entropy (not gradients of energy). 
Your Wikipedia reference to convection layers is missing a big piece of the physics, because if simply convection were enough, you should see tornadoes in every pot of water that is boiling on the stove. But I'm glad you brought them up--they are a good example of how to generate order using entropy gradients. #82 EBL Shannon entropy has different units from thermal entropy. They are as far from each other as fish and bicycles.Robert Sheldon
July 10, 2013 05:24 PM PDT
EL: Weather is a manifestation of order and chance, on planetary scale.
Yes it is, and not just on a planetary scale. On very small scales too.
Convection — an orderly pattern resulting from a differential heating and a means by which heat is dispersed — leads to wind systems, water vapour content and variation with height, pressure etc leads to precipitation, clouds and more. Dust devils are vortices with entrained dust. Vortices being a characteristic orderly pattern in fluids where rotation is injected. Remember, the issue is that we see necessity, chance and choice at work in our world.
Yes indeed. And all those vortices you mention represent local entropy reductions. None of them violate the 2nd Law of thermodynamics. They can raise things to a higher energy potential, and undiffuse what was diffused. They render the most probable states ones in which there are gradients, rather than ones in which there are not.
And the argument is not about order but organisation that is functionally specific, and complex enough to be beyond credible chance contingency (by diffusion etc) is commonly and only seen as the product of choice. For reasons closely connected to why chance is not a good explanation for FSCO/I — sampling of the config space where one would have to credibly capture quite rare and isolated zones.
Exactly. So the issue has nothing to do with order in the sense that -entropy is order, but with organisation. Therefore it has nothing to do with the 2nd Law of thermodynamics. Granville's claim simply boils down to the same argument as Dembski's and the 2nd Law is irrelevant to it.
Dust devils, tornadoes and hurricanes are simply not to be compared to proteins formed in ribosomes on coded instructions. to attempt such verges on a strawman. KF
Yes indeed, but the straw man is of Granville's making. Local entropy reduction, on a very powerful scale, is perfectly possible on earth, so the idea that because life represents reduced entropy, therefore evolution can't be right, is a straw man. Reduced entropy may be necessary for life, but it isn't sufficient. Simple logic therefore tells us that saying that entropy reduction is forbidden by the 2nd Law, therefore no evolution, isn't valid. There may be reasons evolution can't happen, but the idea that the required entropy reduction would violate the 2nd Law isn't one of them. If it did, so would any vortex.Elizabeth B Liddle
July 10, 2013 03:01 PM PDT
EL: Weather is a manifestation of order and chance, on planetary scale. Convection -- an orderly pattern resulting from a differential heating and a means by which heat is dispersed -- leads to wind systems, water vapour content and variation with height, pressure etc leads to precipitation, clouds and more. Dust devils are vortices with entrained dust. Vortices being a characteristic orderly pattern in fluids where rotation is injected. Remember, the issue is that we see necessity, chance and choice at work in our world. And the argument is not about order but organisation that is functionally specific, and complex enough to be beyond credible chance contingency (by diffusion etc) is commonly and only seen as the product of choice. For reasons closely connected to why chance is not a good explanation for FSCO/I -- sampling of the config space where one would have to credibly capture quite rare and isolated zones.Dust devils, tornadoes and hurricanes are simply not to be compared to proteins formed in ribosomes on coded instructions. to attempt such verges on a strawman. KFkairosfocus
July 10, 2013 12:36 PM PDT
Some notes: 1: wind systems are often manifestations of convective effects, ulimately at planetary scale. Tehndencies to form vortices come about by various means, including Coriolis effects, and shearing differently directed winds creating a bass for rotation. 2 --> Wind systems are manifestations of fluid dynami cs, with aid of some thermodynamics. Of course the systems are extremely non linear and sensitive to initial conditions. 3 --> Once we see that entropy is also about MISSING info on microstate on knowing macrostate variables such as pressure, temp etc, we can see the degrees of freedom present, so we see a lower relative quantum of info, which under relevant circumstances of energy to change states etc, can be converted into work, at least in part. (States, include motion of massive objects, however tiny and this is immediately an energy storage mechanism. Translation, rotation, vibration.) 4 --> Constructive work imparts forced ordered motion to components that in accord with some plan, makes them go to a specific functional state. The classic instance is forming a protein by chaining monomers and forcing peptide reactions in accord with a stored code. 5 --> Such a protein then folds based on its config of elements in its chain, sometimes on its own, often with aid of a Chaperone molecule (prions are yet lower energy state folds that are non functional and indeed are implicated in serious diseases.) 6 --> Again, both information and energy are involved, and functional specificity is a tightly constrained outcome, thus low entropy, paid for elsewhere. The natural tendency once the metastability is broken, is for proteins to denature, and break down. DNA is also notoriously metastable at best. 7 --> In short, information is involved in constructive work issuing in FSCO/I, and it is intimately bound up in the high energy states built up based on instructions and put to work thereafter. These are energetically very much uphill, endothermic molecules. 8 --> Knowing that something is in a functional state is a very useful thing, and can often be used to perform desired work in itself. It also confines you to a very narrow zone in a config space. Finding such a cluster by chance or blind processes, will be difficult indeed. Starting with OOL. 9 --> When it comes to tornadoes and 747's Hoyle's argument was in effect that this is a macro analogue, crudely, of diffusion. GS is speaking about this sort of thing. because diffusion processes nad the like tend so strongly to move to dominant clusters of states -- and away from FSCO/I -- they are not feasible as means to do constructive work. So, there is no free lunch here to get constructive work for "free" from chance factors. Thus the issue of analysing how work gets done. 10 --> Constructive work requires organised forced ordered motion. This yields entities that are wired according to a functional plan, whether a Jumbo jet or a string of characters in a post like this, or a protein chain or a DNA chain. Such requires entities capable of the required "shaft work," whether at micro or macro level. Thus we see men and machines expending a similar amount of energy and building the Jumbo Jet. In the cell, mRNA, tRNA, ribosomes, etc work together to build proteins. 11 --> FSCO/I is in common, and the only empirically warranted originating source of FSCO/I is design. Certainly not diffusion or the like, whether in isolated, closed or open systems. KFkairosfocus
July 10, 2013 12:29 PM PDT
Obviously the air has to be in motion for them to form, Joe, because they consist of air in motion. But that air was still before it got into motion and became a dust devil (let's ignore the tornado for now, as the dust devil energy accounting is simpler) And the reason the air moves (upwards) is that it is heated by the hot ground. And as a result, the column of still air that preceded the dust devil, and which had high entropy (was in a low energy state) now rises and turns and becomes a low entropy system called a dust devil. This pumps hot air from the ground and cools it. Eventually, after the dust devil has vacuumed around and got rid of all the hot air, it goes back to being still air again. It's no big deal, but it's the simplest example I know of a non living system that effectively behaves like a fridge, and forms spontaneously, i.e. no-one designs it, or plugs it in, or does anything at all. As I said, they form on Mars, and so far there is no sign of designers (or life) on Mars.Elizabeth B Liddle
July 10, 2013 11:31 AM PDT
Look at the videos, Joe.
Oh boy, back to deception and misdirection. Which video shows a TORNADO spontaneously arising from still air? I saw videos of dust/dirt devils, and even wiki says they require the air to be in motion in order to form.
Joe
July 10, 2013 10:12 AM PDT
KF:
GS is arguing that diffusion-like processes characterise systems, whether isolated or open, and so do I.
They certainly characterise systems on which no work is being done. But clearly, diffusion is not the only spontaneous direction of rearrangement of matter. If it were, how would you explain weather?
Elizabeth B Liddle
July 10, 2013 09:41 AM PDT
Joe
Tornadoes spontaneously arising from still air? Pulllease
Oh boy, back to Cantor's infinities. Look at the videos, Joe.
Elizabeth B Liddle
July 10, 2013 09:39 AM PDT
Thanks for the extract, KF. Yes indeed, Shannon entropy and thermodynamic entropy are mathematically almost identical, and not by coincidence. And indeed, you could interpret the thermodynamic entropy (measured in joules) as representing the Shannon information of the system (in bits) if the number of possible microstates represented the number of possible messages that could be transmittted by the system. You could further directly relate the two by saying that the thermodynamic entropy (in joules) is a measure of how many bits would be needed to specify the microstate. Therefore the greater the thermodynamic entropy (the more joules) the less information the description of the macrostate (how many joules) tells you about the microstate (i.e the more possible microstates fit that description). Because of course, as you are well aware, the quantity of Shannon entropy in a message tells you nothing about the message it contains. It just means that there are more possible messages that it could contain. So how do we relate this to Granville's argument? It comes back to specification, as I'm sure you will agree. A Boeing 747 is a highly "specified" description of an arrangement of junkyard parts, while a "A junkyard of parts" can describe a vast number of of arrangements. But in Shannon terms, both have identical Shannon entropy (unless some of the parts have gone missing in the tornado, or were not included in the Boeing, in which case one or the other may have less). Just as "KAIROSFOCUS" has exactly the same Shannon entropy as SCAOSFIUORK, even thought one has meaning and the other does not. So it is not true to say that the reason a tornado will not make a Boeing 747 in a junkyard is not that a Boeing has much less Shannon entropy (in bits) than a junkyard, nor is it that a Boeing has much less thermodynamic entropy (in joules) than a junkyard (it may have more, it may have less). So what ever the reason is that a tornado cannot build a Boeing 747 in a junkyard is not only nothing to do with the 2nd Law of thermodynamics, but it has nothing to do with entropy, whether thermal or Shannon. Which is not to say that a tornado can build a 747. It can't. It's just that Granville's argument that it can't is incorrect. However, when it comes to living things, it is perfectly true that living things have less entropy than the things they are composed of, just as a tornado (or a dust devil) has less entropy than still air. So can entropy decrease spontaneously? Yes, it can. Convection currents would not be possible if it couldn't, nor would saltpans. And seeds, whether designed or not, would not grow into trees and convert low entropy carbon dioxide and water into sugar and starch. So yet again, Granville's argument fails. Life does not violate the 2nd Law, because the 2nd Law does not prevent the spontaneous development of systems of entropy. It just says that this only happens if work is done by a system external to the low entropy system, which, as a result will experience an increase of entropy at least equal to the local decrease.Elizabeth B Liddle
Elizabeth B Liddle
July 10, 2013 at 09:34 AM PDT
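The point that a meaningful word and its scrambled anagram carry identical Shannon entropy is easy to check numerically. Below is a minimal Python sketch (illustrative only; it estimates per-symbol entropy from the empirical letter frequencies of each string, which is one common convention):

```python
from collections import Counter
from math import log2

def shannon_entropy(msg: str) -> float:
    """Average information per symbol, H = -sum(p_i * log2 p_i), in bits/symbol,
    using the empirical symbol frequencies of the message."""
    counts = Counter(msg)
    n = len(msg)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# The two strings are anagrams of each other, so their per-symbol Shannon
# entropies are identical, even though only one of them is "specified".
print(shannon_entropy("KAIROSFOCUS"))   # ~3.10 bits/symbol
print(shannon_entropy("SCAOSFIUORK"))   # same value
```

Both strings contain the same multiset of letters, so the computed entropies match exactly; the measure says nothing about which arrangement is meaningful.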
KF, it's all about the evidence and people have noticed that their position doesn't have any. They can talk and talk but they cannot show that the rubber meets the road. Tornadoes spontaneously arising from still air? Pulllease
Joe
July 10, 2013 at 09:03 AM PDT
Joe, the pretence or suggestion above that anyone -- other than RS, it seems, if his qualifications are not meant to be his main point -- is arguing that the 2nd law fails is a strawman caricature. GS is arguing that diffusion-like processes characterise systems, whether isolated or open, and so do I. My own analysis begins from Clausius' example and examines the fate of energy importing bodies, which are parallel to what would have happened with earth. The bottom line is that functionally specific complex organisation and associated implied information are not empirically gotten for free, and certainly are not credibly gotten from diffusion or the like. The strawman tactic looks to be set up to distract attention from this problem, which is a serious one, as a lot of very specific info, some of it even coded, is in effect being pulled out of noise. KF
kairosfocus
July 10, 2013 at 08:58 AM PDT
Dr Liddle: Are you familiar with the perspective of Jaynes et al down to Robertson et al? And, with the impact they have at length had on understanding what entropy is? Can you specifically show where they are wrong when in effect they summarise -- I have clipped on this in material that is being dismissed as spamming by those who have no intention of addressing matters on the merits -- more or less as follows:
the entropy of a body or system can be seen as the average missing information to specify the microstate of a system, given its macro state, suitably referred to energy by a conversion factor.
Let me again clip for your convenience the summary from Robertson:
. . . we may average the information per symbol in [a] communication system thusly (giving in terms of -H to make the additive relationships clearer): - H = p1 log p1 + p2 log p2 + . . . + pn log pn or, H = - SUM [pi log pi] . . . Eqn 5 H, the average information per symbol transmitted [usually, measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more detail (pp. 3 - 6, 7, 36; cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Samura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously "plausible" primordial "soups." In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale . . .
This outlines a good part of my reason for taking this view seriously, and as the Wiki clip acknowledges, it is increasingly seen as a reasonable perspective or school of thought. (Remember there are several schools of thought on quantum physics also.) Taking this back to the matters in hand, diffusive forces and the like overwhelmingly move systems to microstate clusters where the bulk of possibilities for a system lie, as I showed in brief in the toy example for diffusion. In short, let me clip from no 6 again, as it is obvious you have not attended to the point:
For instance (following a useful simple model of diffusion in Yavorsky and Pinski's nice elementary Physics), if we have ten each of white and black marbles in two rows in a container: ||**********|| ||0000000000|| There is but one way to be as shown, but over 63,000 ways to be 5 B/5 W in each row, and the 6:4 and 4:6 cases are close to such numbers: 44,000+. So, if the system is sufficiently agitated for balls to swap positions at random on a regular basis, it soon moves towards a dominant cluster near the 5:5 peak and "forgets" the initial state. This basic pattern works for heat, diffusion of ink drops in beakers of water, C atoms in a bar of steel, and more. The basic reason for such does not go away if the system is opened up to energy and mass flows, and the point that rare, specifically and simply describable states will seldom be revisited or found, for enough complexity -- 500 - 1,000 bits -- soon becomes that such states are beyond the reach of the solar system's or the observed cosmos' search capacity. RS' point that there are states that can be locked away from interaction so that it is reasonable to partition entropy accounting, is also quite useful. My own emphasis is that we need to see the difference between what diffusion-like factors/forces will strongly tend to do and what produces shaft work thence constructive work ending in FSCO/I.
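The marble counts in that clip are easy to verify. Here is a minimal Python sketch (illustrative only; it just evaluates the binomial coefficients for two rows of ten positions):

```python
from math import comb

# Two rows of 10 positions; 10 black and 10 white marbles in total.
# Arrangements with k black marbles in the top row (and 10 - k in the bottom row):
#   C(10, k) * C(10, 10 - k) = C(10, k) ** 2
for k in range(11):
    print(f"{k} black in top row: {comb(10, k) ** 2:,} arrangements")

# k = 0 or k = 10 (the fully sorted start): exactly 1 arrangement each
# k = 5 (5 B / 5 W per row): 252 ** 2 = 63,504 arrangements
# k = 4 or k = 6: 210 ** 2 = 44,100 arrangements
```

The 5:5 cluster and its near neighbours dominate the count, which is the sense in which random agitation "forgets" the sorted starting state.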
To assume or hope that such a type of effect will perform organised shaft work that constructs an entity manifesting FSCO/I is empirically futile. There is a logical possibility, but the overwhelming balance of statistical weights of microstate clusters will push the system towards states that are anything but what FSCO/I requires. As has been repeatedly pointed out, but ignored and derided or twisted into a strawman and dismissed. KF
kairosfocus
July 10, 2013 at 08:53 AM PDT
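For readers who want the informational definition of entropy quoted above in concrete terms, here is a minimal Python sketch (illustrative only; the two-level system, its energy gap and its temperature are invented round values chosen purely to exercise the formulas H = -C SUM p_i ln p_i with C = k, and p_i = exp(-E_i/kT)/Z):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_distribution(energies, T):
    """p_i = exp(-E_i / kT) / Z for a set of energy levels at temperature T."""
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

def entropy(probs, C=k_B):
    """S = -C * sum(p_i ln p_i); C = k_B gives J/K, C = 1/ln(2) gives bits."""
    return -C * sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical two-level system: energy gap of 1e-21 J, at 300 K.
p = boltzmann_distribution([0.0, 1e-21], 300.0)
print(entropy(p))                    # thermodynamic-style entropy in J/K
print(entropy(p, C=1/math.log(2)))   # the same uncertainty expressed in bits

# Certainty about the microstate gives zero entropy; a 50/50 spread gives the maximum.
print(entropy([1.0, 0.0], C=1/math.log(2)))   # 0.0 bits: no missing information
print(entropy([0.5, 0.5], C=1/math.log(2)))   # 1.0 bit: maximal missing information
```

This is only a toy restatement of the Robertson/Jaynes point clipped above: the same H measures "missing information" in bits or, scaled by k, thermodynamic entropy in J/K.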
Elizabeth:
Dust devils (and tornadoes) are winds, Joe!
I know! However, the winds that form tornadoes are not themselves tornadoes. And you said the air was still - that is incorrect.
Winds are convection currents. In the case of dust devils (check the wiki) they actually only form in “light or no wind”.
They need the movement of air in order to form. Even wiki says that.
This is because they are rising convection currents from a layer of hot air heated by hot ground which in turn is heated by solar radiation.
Right, the air is NOT still. You said it was still. And true, hot air flowing into cool air is not a violation of the law.
Joe
July 10, 2013 at 08:45 AM PDT
keiths:
If you asked 100 highly-trained physicists whether they think that life violates the second law, what do you think they would say?
No one cares what they say. People care what they can demonstrate. And it is a given that they cannot demonstrate that blind and undirected chemical processes can produce a living organism from non-living matter.
Joe
July 10, 2013 at 08:40 AM PDT
Joe
Elizabeth, Tornadoes form during thunderstorms. The air is hardly still. There are updrafts and downdrafts. Look at tornado alley- cold winds coming down from the north collide with warm winds coming up from the Gulf of Mexico. Winds mean the air is moving, Liz. And dust devils also require wind.
Dust devils (and tornadoes) are winds, Joe! Winds are convection currents. In the case of dust devils (check the wiki) they actually only form in "light or no wind". This is because they are rising convection currents from a layer of hot air heated by hot ground, which in turn is heated by solar radiation. On windy days they don't form because that layer of hot air keeps getting whisked away. Sometimes (because the ground is non-uniformly heat absorbent) a particular patch of ground will get hotter than the rest, and the hot air will start to rise (i.e. become a vertical wind) and turn. This forms a chimney which acts exactly like a fridge: it pumps hot air from near the ground out of the top of the chimney. No law is violated when they do so. There are some awesome pictures of dust devils on Mars too.
Elizabeth B Liddle
July 10, 2013 at 08:38 AM PDT
JWT: KS has played a strawman distortion game, which I have repeatedly corrected. At this point, I suggest you acquaint yourself with the substantial point, that he and others of his ilk are trying to find some subterfuge to get away with pretending or suggesting that diffusion and the like -- overwhelmingly dispersive forces -- can reasonably perform constructive work leading to FSCO/I. The focus on the point that, per the 2nd law, a body that loses heat will reduce its entropy, the heat having to go to something at a lower temperature which then increases its entropy by more than was lost, is a red herring, led away to the strawman that GS and I have denied or implied a denial of the 2nd law. In fact the relevant entity, in the diagrams and discussions I have put up, is B or B' . . . an energy IMPORTING entity, especially when B' performs shaft work on importing energy, perhaps by heat. Or otherwise. The statistical underpinnings of the 2nd law lead directly to the issue that diffusion etc. will overwhelmingly lead away from constructive work: systems gravitate to configuration clusters that have overwhelming statistical weight, and on the relevant gamut for OOL etc., the resources of the observed cosmos are not enough to sample enough of the phase space at random in the cosmic lifespan to credibly ever hit on the sort of special, isolated and rare clusters of configs implied by the FSCO/I of life. These, and more on related points -- there is a fairly large body of related issues -- have been outlined and explained for those who really do want to understand. (And on the whole, I have dodged the math, which gets hairy real fast. A more mathematical analysis that leads to the same essential point is in TMLO chs 7 & 8, here. It could be updated a bit, but is essentially on target as far as it goes. The appendix 1 of my always linked gives a bit more too.) KF.
kairosfocus
July 10, 2013 at 08:29 AM PDT
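The "search resources" claim in the comment above rests on rough arithmetic that can be laid out explicitly. A minimal Python sketch follows (illustrative only; the 10^80 atoms, 10^45 state changes per second and 10^17 seconds are the round, generous figures typically cited in these threads, not measured quantities):

```python
# Rough upper bound on the number of states the observed cosmos could examine,
# using deliberately generous round numbers (assumptions, not measurements).
atoms = 10 ** 80        # rough count of atoms in the observed cosmos
rate = 10 ** 45         # assumed state changes per atom per second (Planck-scale rate)
seconds = 10 ** 17      # rough age of the cosmos in seconds

search_capacity = atoms * rate * seconds      # ~1e142 states examined
config_space_500_bits = 2 ** 500              # ~3.3e150 configurations

fraction_sampled = search_capacity / config_space_500_bits
print(f"{fraction_sampled:.1e}")   # ~3e-9: a vanishing fraction of a 500-bit space
```

Whatever one makes of the inference drawn from it, this is the arithmetic behind the claim that a 500 - 1,000 bit configuration space cannot be meaningfully sampled at random.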
Elizabeth, tornadoes form during thunderstorms. The air is hardly still. There are updrafts and downdrafts. Look at tornado alley - cold winds coming down from the north collide with warm winds coming up from the Gulf of Mexico. Winds mean the air is moving, Liz. And dust devils also require wind.
Joe
July 10, 2013 at 08:00 AM PDT
In fact here's some nice footage that makes my point really well. You can see how still the air is from the glassy surface of the puddles, and the fact that no dust is being lifted anywhere other than where the dust devil is. From Wikipedia:
Dust devils form when hot air near the surface rises quickly through a small pocket of cooler, low-pressure air above it. If conditions are just right, the air may begin to rotate. As the air rapidly rises, the column of hot air is stretched vertically, thereby moving mass closer to the axis of rotation, which causes intensification of the spinning effect by conservation of angular momentum. The secondary flow in the dust devil causes other hot air to speed horizontally inward to the bottom of the newly forming vortex. As more hot air rushes in toward the developing vortex to replace the air that is rising, the spinning effect becomes further intensified and self-sustaining. A dust devil, fully formed, is a funnel-like chimney through which hot air moves, both upwards and in a circle.

As the hot air rises, it cools, loses its buoyancy and eventually ceases to rise. As it rises, it displaces air which descends outside the core of the vortex. This cool air returning acts as a balance against the spinning hot-air outer wall and keeps the system stable.[4] The spinning effect, along with surface friction, usually will produce a forward momentum. The dust devil is able to sustain itself longer by moving over nearby sources of hot surface air. As available extreme hot air near the surface is channelled up the dust devil, eventually surrounding cooler air will be sucked in. Once this occurs, the effect is dramatic, and the dust devil dissipates in seconds. Usually this occurs when the dust devil is not moving fast enough (depletion) or begins to enter a terrain where the surface temperatures are cooler, causing unbalance.[5]

Certain conditions increase the likelihood of dust devil formation:

Flat barren terrain, desert or tarmac: Flat conditions increase the likelihood of the hot-air "fuel" being a near constant. Dusty or sandy conditions will cause particles to become caught up in the vortex, making the dust devil easily visible.

Clear skies or lightly cloudy conditions: The surface needs to absorb significant amounts of solar energy to heat the air near the surface and create ideal dust devil conditions.

Light or no wind and cool atmospheric temperature: The underlying factor for sustainability of a dust devil is the extreme difference in temperature between the near-surface air and the atmosphere. Windy conditions will destabilize the spinning effect (like a tornado) of a dust devil.
And in fact this directly falsifies Robert's claim in the OP:
Does this make sense? I mean everybody and their brother say that entropy can decrease if you have a heat engine in the system. Energy comes into your refrigerator as low entropy energy. Energy billows out of the coils in the back as high entropy heat. But inside the fridge is a low-entropy freezer. Couldn’t this apply to Earth? (E.g., compensation argument.)
Yes, it could, and does, Robert. These dust devils are heat engines. They pump hot air from near the ground, cooling it. They are low entropy "freezers". Hot air billows out of the top of the "chimney" as high entropy "heat". And they form spontaneously, yet do not violate the 2nd Law of thermodynamics. Or do you think they do?
Elizabeth B Liddle
July 10, 2013 at 07:59 AM PDT
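The "compensation" bookkeeping behind that exchange is easy to show with numbers. Here is a minimal Python sketch (illustrative only; the heat quantity and the two temperatures are invented round values, and the dust devil is idealised as a simple heat engine moving heat from hot ground-level air to cooler air aloft):

```python
# Entropy bookkeeping for heat Q flowing from a hot reservoir to a cold one.
Q = 1000.0      # joules of heat moved (assumed value)
T_hot = 320.0   # near-surface air temperature, K (assumed)
T_cold = 280.0  # cooler air aloft, K (assumed)

dS_hot = -Q / T_hot     # the hot reservoir loses entropy:  -3.125 J/K
dS_cold = +Q / T_cold   # the cold reservoir gains entropy: +3.571 J/K
dS_total = dS_hot + dS_cold

print(f"hot: {dS_hot:+.3f} J/K, cold: {dS_cold:+.3f} J/K, total: {dS_total:+.3f} J/K")
# The total is positive, so a local, low-entropy structure (the vortex) can form
# spontaneously without the 2nd Law being violated anywhere.
```

Run as-is this prints a net entropy increase of about +0.45 J/K, which is the sense in which a local decrease can be "paid for" by the surroundings.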
Joe
Except tornadoes do NOT form out of “uniform still air”. I would think that would be a violation of some law.
Well, they do, Joe. The air molecules going round and round in a tornado are the same ones that were sitting quietly doing their knitting a few moments before, just as the debris going round and round with them was sitting quietly being someone's house a few moments earlier. And have you never seen a dust devil form on a still hot day?
Elizabeth B Liddle
July 10, 2013 at 07:27 AM PDT
Elizabeth:
William, science requires very precise and specific definitions. If you do not define your terms – including your units, you will find that your argument doesn’t work.
And that is exactly why evolutionism and materialism do not work and are not science.
Joe
July 10, 2013 at 07:17 AM PDT