Uncommon Descent Serving The Intelligent Design Community

# Granville Sewell’s important contribution to physics: Entropy-X


Abstract:   In Sewell’s discussion of entropy flow, he defines “Entropy-X”, a very useful concept that should clarify misconceptions of entropy promulgated by Wikipedia and by scientists unfamiliar with thermodynamics. Sewell’s important contribution is to argue that one can and should reduce the “atom-entropy” into subsets of mutually exclusive “entropy-X”. Mathematically, this is like factoring an N x M matrix into block-diagonal form by showing that the cross terms between blocks do not contribute to the total. Each block on the diagonal then corresponds to a separately computed entropy, or “Entropy-X”.  This contribution not only clarifies many of the misunderstandings of laypersons, it also provides a way for physicists to overcome their confusion regarding biology.

– o –

Introduction:     Entropy was initially discussed in terms of thermodynamics, as a quantity that came out of the energy, temperature, and work formulas.  Ludwig Boltzmann found a way to relate billiard-ball counting statistics to this thermodynamic quantity, with a formula he had inscribed on his tombstone:  S = k ln(Ω). The right-hand side of this equation contains a logarithm of the number of possible ways to arrange the atoms. The left-hand side is the usual thermodynamic quantity. Relating the two different worlds of counting and heat is the constant “k”, now called the “Boltzmann constant”.
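
As a toy numerical illustration (the microstate count below is invented), the formula converts a pure count of arrangements into thermodynamic units via k:

```python
import math

# Boltzmann constant in J/K (exact under the 2019 SI redefinition)
k_B = 1.380649e-23

def boltzmann_entropy(omega):
    """Thermodynamic entropy S = k * ln(Omega) for Omega microstates."""
    return k_B * math.log(omega)

# Toy example: a system with 10^4 equally accessible microstates
print(f"S = {boltzmann_entropy(10 ** 4):.3e} J/K")
```

Note how tiny the result is in joules per kelvin: the conversion constant k is what bridges the enormous counting numbers and the modest thermodynamic ones.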

Now for the shocking part. There is no theory that predicts its value. It is a conversion constant that is experimentally determined. It works best when the real physical system approximates billiard balls–such as noble gases. The fit gets progressively worse, or needs more adjustments, if the gas is diatomic (N2) or triatomic (CO2). Going from gas to liquid introduces even more corrections, and by the time we get to solids we use a completely different formula.

For example, a 1991 Nobel prize was awarded for studying how a long oily molecule moves around in a liquid, because not every state of rearrangement is accessible for tangled strings.  So 100 years after Boltzmann, we are just now tackling liquids and gels and trying to understand their entropy.

Does that mean we don’t know what entropy is?

No, it means that we don’t have a neat little Boltzmann factor for relating thermodynamic-S to counting statistics. We still believe that it is conserved; we just can’t compute the number very easily.  This is why Granville Sewell uses “X-entropy” to describe all the various forms of order in the system. We know they must be individually conserved, barring some conversion between the various types of entropy in the system, but we can’t compute it very well.

Nor is it simply that the computation gets too large. For example, in a 100-atom molecule, the entropy is NOT computed by looking at all 100! permutations of the atoms, simply because many of those arrangements are energetically impossible.  Remember, when Boltzmann wrote “ln(Ω)” he was counting the possible states of the system. If a state is too energetic, it isn’t accessible without a huge amount of energy.  In particle physics, this limitation is known as “spontaneous symmetry breaking”, and it is responsible for all the variation we see in our universe today.
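
A toy enumeration makes the point concrete (the “energy” function and cutoff here are invented purely for illustration): an energy restriction shrinks the accessible-state count well below the naive count of all permutations.

```python
import itertools
import math

# Toy model (all of it hypothetical): 4 labelled atoms on 4 sites; an
# arrangement's "energy" is its total displacement from a home configuration,
# and only low-energy arrangements count as accessible states.
def arrangement_energy(perm):
    return sum(abs(atom - site) for site, atom in enumerate(perm))

all_perms = list(itertools.permutations(range(4)))
accessible = [p for p in all_perms if arrangement_energy(p) <= 2]

print("naive count:     ", len(all_perms))   # 4! = 24 permutations
print("accessible count:", len(accessible))  # only 4 survive the energy cutoff
print("ln(Omega) drops by", math.log(len(all_perms) / len(accessible)))
```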

So rather than counting “atom states”, we assemble atoms into molecules and form new entities that act as complete units, as “molecules”, and then the entropy consists of counting “states of the molecule”–a much smaller number than “states of the atoms of the molecules”.  Molecules form complexes, and then we compute “states of the molecular complexes”. And complexes form structures, such as membranes. Then we compute “states of the structures”. This process continues to build as we approach the size of a cell, and then we have to talk about organs and complete organisms, and then ecologies and Gaia. The point is that our “unit” of calculation is getting larger and larger as the systems display larger and larger coherence.
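
The direction of this hierarchical regrouping can be sketched with invented numbers; the bin and orientation counts below are hypothetical, chosen only to show which way the state count moves:

```python
import math

# Invented counts, for direction only:
B = 1000  # position bins available to one free atom (hypothetical)
R = 10    # orientation states of one rigid molecule (hypothetical)

# 6 free atoms: every atom bins independently.
omega_atoms = B ** 6
# The same 6 atoms bound into 2 rigid molecules: only the molecules'
# positions and orientations remain as independent choices.
omega_molecules = (B * R) ** 2

print("ln(Omega), free atoms:     ", math.log(omega_atoms))
print("ln(Omega), bound molecules:", math.log(omega_molecules))
```

Counting “states of the molecule” instead of “states of the atoms” cuts ln(Ω) sharply, which is the pattern the paragraph above describes at every level of the hierarchy.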

Therefore it is completely wrong to talk about single-atom entropy and the entropy of sunlight when we are discussing structural entropy, for as Granville and previous commentators have said, the flow of energy in sunlight with a coherence length of one angstrom cannot explain the decameter coherence of a building.

So from consideration of the physics, it is possible to construct a hierarchical treatment of entropy which enables entropy to address the cell, but in 1991 we had barely made it to the oily-liquid stage of development. So on the one hand, contrary to what many commentators imagine, physicists don’t know how to compute an equivalent “Boltzmann equation” for the entropy of life; but on the other hand Granville is also right: we don’t need to compute the Boltzmann entropy to show that it must be conserved.

– o –

Mathematical Discussion:   Sewell’s contribution is to recognize that there must be a hierarchical arrangement of atoms that permits the intractable problem of calculating the entropy to be treated as a sum of mutually exclusive sub-entropies.

This contribution can be seen in the difference between “Entropy-chromium” and “Entropy-heat” that he introduces in his paper, where Entropy-chromium is the displacement of chromium atoms in a matrix of iron holding the velocity of the atoms constant, and Entropy-heat considers a variation in velocities while holding the positions constant.  These two types of entropy have a large energy barrier separating them, on the order of several eV per atom, that prevents them from interconverting. At sufficiently high temperature–say, the temperature at which the iron-chrome alloy was poured–the chromium atoms have sufficient mobility to overcome the energy barrier and move around. But at room temperature they are immobile.  So in the creation event of the chromium bar, the entropy was calculated for both position and velocity, but as it cooled, “spontaneous symmetry breaking” produced two smaller independent entropies from the single larger one.

Now the beginning of this calculation is the full-up atom-entropy where everything is accessible. This “big-bang” entropy gives at least 7 degrees of freedom for each atom. That is, the number of bins available for each atom is at least 7–one for the species, 3 that give the position in x,y,z and 3 that give the velocity in Vx,Vy,Vz.  We could subdivide species into all the different quantum numbers that define atoms, but for the moment we’ll ignore that nuance.  In addition, the quantization of space into “Planck” sizes of about 10^-35 meters means that our bins do not always hold a real number, but a quantized length or velocity specified by an integer number of Planck sizes.  But again, the practical quantization is that atoms don’t overlap, so we can use a much coarser quantization of 10^-10 meters, or angstrom, atomic lengths. The reason this is important is that we are reducing ln(Ω) by restricting the number of states of the system that we need to consider.
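
The coarse-graining argument can be put in numbers. Comparing the approximate Planck length against an angstrom bin, the per-axis contribution of a single particle to ln(Ω) drops from roughly 80 to roughly 23:

```python
import math

PLANCK_LENGTH = 1.6e-35  # meters (approximate)
ANGSTROM = 1e-10         # meters, a typical atomic length

# Number of position bins along one meter, per axis, at each scale:
bins_planck = 1.0 / PLANCK_LENGTH
bins_atomic = 1.0 / ANGSTROM

print("ln(bins), Planck scale:", math.log(bins_planck))  # ~80
print("ln(bins), atomic scale:", math.log(bins_atomic))  # ~23
```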

But if S = k ln(Ω), then this means we are mathematically throwing away entropy! How is that fair?

There is a curious result in quantum mechanics which says that if we can’t distinguish two particles, then there is absolutely no difference if they are swapped. This is another way of saying that their position entropy is zero. So if we have two states of a system, separated by a Planck length, but can’t tell the difference, the pair doesn’t contribute to the entropy.  Now this isn’t to say that we can’t invent a system that can tell the difference, but since a Planck length corresponds to light of far higher energy than even gamma rays, we really have to go back to the Big Bang to find a time when this entropy mattered.   This reconfirms our assumption that as a system cools, it loses entropy and has fewer and fewer states available.

But even this angstrom coarse-graining in position, represented by “Entropy-Chromium”, is still too fine for the real world, because biology is not made out of noble gases bouncing in a chamber. Instead, life exists in a matrix of water, an H2O molecule of sub-nanometer size. Just as we cannot tell the difference if we swap the two hydrogen atoms in a water molecule, we can’t tell the difference if we swap two water molecules around. So the quantization entropy gets even coarser, and the number of states of the system shrinks, simply because the atoms form molecules.

A very similar argument holds for the velocities. A hydrogen atom can’t have every velocity possible because it is attached to an oxygen atom. That chemical bond means that the entire system has to move as a unit. But QM tells us that as the mass of a system goes up, the wavelength goes down, which is to say, the number of velocities we have to consider in our binning is reduced as we have a more massive system. Therefore the velocity entropy drops as the system becomes more chemically bound.
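
This mass dependence can be illustrated with the standard thermal de Broglie wavelength, λ = h/√(2πmkT). The comparison below (a lone hydrogen atom versus hydrogen bound into an 18-amu water molecule) is only a sketch of the direction of the effect:

```python
import math

h = 6.62607015e-34       # Planck constant, J*s (exact)
k_B = 1.380649e-23       # Boltzmann constant, J/K (exact)
AMU = 1.66053906660e-27  # atomic mass unit, kg

def thermal_wavelength(mass_kg, temperature_k):
    """Thermal de Broglie wavelength: lambda = h / sqrt(2*pi*m*k*T)."""
    return h / math.sqrt(2 * math.pi * mass_kg * k_B * temperature_k)

T = 300.0                                  # room temperature, K
lam_H = thermal_wavelength(1 * AMU, T)     # free hydrogen atom
lam_H2O = thermal_wavelength(18 * AMU, T)  # hydrogen bound into H2O

print(f"free H: {lam_H:.3e} m, bound in H2O: {lam_H2O:.3e} m")
```

The wavelength scales as 1/√m, so binding the hydrogen into the heavier unit shortens it by a factor of √18, consistent with the paragraph’s claim that chemical binding coarsens the velocity counting.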

And of course, life is mostly made out of polymers 100s or 1000s of nanometers in extent, which have even more constraints as they get tangled around each other and attach or detach from water molecules. That was what the 1991 Nobel prize was about.

Mathematically, we can write the “Big Bang” entropy as a N x M matrix, where N is the number of particles and M the number of bins. As the system cools and becomes more organized, the entropy is reduced, and the system becomes “block-diagonal”, where blocks can correspond to molecules, polymer chains, cell structures, organelles, cells, etc.

Now here is the key point.

If we only considered the atom positions, and not these molecular and macro-molecular structures, the matrix would not be in block diagonal form. Then when we computed the Boltzmann entropy, ln(Ω), we would have a huge number of states available. But in fact, biology forms so much structure, that ln(Ω) is greatly reduced.  All those cross-terms in the matrix are empty, because they are energetically inaccessible, or topologically inaccessible (see the 1991 Nobel prize). Sewell is correct, there is a tremendous amount of order (or reduced entropy) that is improperly calculated if life is considered as a ball of noble gas.
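
The block-diagonal claim has a simple arithmetic core: if the accessible states factor into independent blocks, Ω is the product of the block counts, and ln(Ω) is the sum of the block entropies. A sketch with invented block sizes:

```python
import math

# Hypothetical per-block state counts (invented): molecules, complexes, etc.
omega_blocks = [50, 20, 7]
omega_total = math.prod(omega_blocks)  # states multiply across independent blocks

# Entropies (ln Omega) add across independent blocks:
ln_total = math.log(omega_total)
ln_sum = sum(math.log(w) for w in omega_blocks)
print(ln_total, ln_sum)  # equal up to rounding: ln(prod) == sum(ln)

# Versus a hypothetical unstructured count where cross-terms were allowed:
omega_unstructured = 10 ** 6
print("ln(Omega) removed by structure:", math.log(omega_unstructured) - ln_total)
```

Zeroing the cross-terms is exactly what removes the difference between the unstructured count and the block product; that difference is the “order” the paragraph says is improperly calculated if life is treated as a ball of noble gas.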

Let me say this one more time. Sewell is not only correct about the proper way to calculate entropy, he has made a huge contribution in arguing for the presence of (nearly) non-convertible forms of sub-entropy.

– o –

Thermodynamic Comment:   Some have argued that this “sub-entropy” of Sewell’s can be explained by some sort of spontaneous symmetry breaking due to the influx of energy. We have talked about “cooling” causing spontaneous symmetry breaking, which is consistent with the idea that higher temperatures have higher entropy, but the idea that “heating” can also drive symmetry breaking and lowered entropy is incoherent. This is simply because, thermodynamically, dS = dQ/T, which is to say, an influx of heat energy necessarily brings entropy with it.
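
The bookkeeping behind dS = dQ/T can be checked with a toy heat flow between two reservoirs (the temperatures and heat quantity are invented for illustration):

```python
# Toy heat flow (figures invented): dQ joules pass from hot to cold.
dQ = 100.0      # J
T_hot = 500.0   # K
T_cold = 300.0  # K

dS_hot = -dQ / T_hot    # entropy lost by the hot body
dS_cold = dQ / T_cold   # entropy gained by the cold body, which is larger
dS_total = dS_hot + dS_cold
print(f"dS_total = {dS_total:+.4f} J/K")  # positive: entropy rode in with the heat
```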

Let’s look at this a bit closer and apply it to the Earth.  The Sun is at 5500K, and therefore its energy arrives principally in the form of yellow photons.  The Earth’s global temperature averages out to about 300K, so it emits infrared photons.   In steady state, the energy entering has to balance the energy leaving (or the Earth would heat up!).  Since the photons hitting the Earth carry almost twenty times less entropy per joule than the photons leaving the Earth, the Earth must be a net source of entropy. Ergo, sunlight should be randomizing every molecule in the Earth and preventing life from happening.
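
The “almost twenty times” figure follows from the fact that the entropy carried per joule of blackbody radiation scales as 1/T (for thermal radiation, S/E = 4/(3T)), so the ratio reduces to the temperature ratio:

```python
# For blackbody radiation, entropy per unit energy is S/E = 4/(3T), so the
# entropy-per-joule ratio of outgoing to incoming photons is T_sun / T_earth.
T_sun = 5500.0   # K (effective solar surface temperature)
T_earth = 300.0  # K (rough global average, as in the text)

entropy_ratio = T_sun / T_earth
print(f"outgoing/incoming entropy per joule: {entropy_ratio:.1f}")  # ~18.3
```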

Does this make sense? I mean everybody and their brother say that entropy can decrease if you have a heat engine in the system. Energy comes into your refrigerator as low entropy energy. Energy billows out of the coils in the back as high entropy heat. But inside the fridge is a low-entropy freezer.  Couldn’t this apply to Earth? (E.g., compensation argument.)

Well, if the Earth had a solar-powered refrigerator, and some insulation, and some fancy piping, yes. But of course, all that is an even more low-entropy system than the electricity, so we are invoking bigger and bigger systems to get the entropy in the small freezer to go down. The more realistic comparison is without all that special pleading. Imagine you leave your freezer door open accidentally and leave for the weekend. What will you find? A melted freezer and a very hot house, because your fridge would run continuously trying to cool something down that warmed even faster. In fact, this is how dehumidifiers function. So your house would be hot and dry, with a large puddle of coolish water in the kitchen. Some fridges dump that water into a pan under the heating coils, which would evaporate the water and we would then be back to our original state but at higher temperature. Would the entropy of the kitchen be greater or smaller than if you had unplugged the fridge for the weekend? Greater, because of all that heat.

And in fact, this is the basic argument for why objects that sit in space for long periods of time have surfaces that are highly randomized.  This is why Mars can’t keep methane in its atmosphere. This is why Titan has aerosols in its atmosphere. This is why the Moon has a sterile surface.

If the Earth is unique, there is more than simply energy flow from the Sun that is responsible.

That's good news, Lizzie. I'm glad it went well. keiths
btw My tooth came out so easily, I didn't have time to figure out the answer to cantor's challenge. Although in the end I simply remembered the formula, and reverse engineered the logic (googled my brain as it were). So I cheated. Good to have done though, and thanks for the challenge, cantor! (And recovery is going well - mild analgesics seem to be adequate, and porridge + bananas isn't too bad a diet) Elizabeth B Liddle
CS3, If I'm reading you correctly, you're asking whether it's possible in principle for us to discover violations of what we think are fundamental physical laws. The answer is yes, of course. It's possible, but they are considered laws for a reason, and so the evidence that they have been violated has to be very strong in order to convince us. Recall the intense scrutiny that followed the announcement by the OPERA team that they had observed neutrinos apparently moving faster than the speed of light. Yet here we have Granville casually asserting that there are thousands of different kinds of entropy, with a different second law for each. We have Robert backing him up, and even claiming that the second law is violated constantly by growing plants. Bizarre! In short, we have crank science of a high order. I do not understand why ID supporters embrace crank science so readily. I suppose it has something to do with the fact that they think that they're right and that the vast majority of scientists are wrong. If scientists are deluded about evolution, the ID supporter thinks, then they may be deluded about thermodynamics, or climate change, or the age of the earth. In the case of Granville and Robert, they seem to sincerely believe that Granville has stumbled upon something of great value to physics -- something that everyone else is just too blind to see. It's ludicrous, especially since Granville knows very little about thermodynamics and makes a raft of embarrassing errors in his paper. If ID wants to be taken seriously by science, it needs to start by taking science seriously. Indiscriminately embracing cranks is not the way to do that. keiths
KF, You're missing the point. If the universe has zero net energy, as the WMAP measurements suggest (to within 0.4%!), then the Big Bang cannot have violated the first law, regardless of whether it was the result of a fluctuation or some other mechanism. x + 0 = x, for all x. keiths
5for: Tornadoes do not perform constructive work in accord with a Wicken wiring diagram, yielding entities with FSCO/I such as Jumbo Jets; nor, going to diffusion and the like at micro scale, do forces that scatter concentrations out in accordance with random exploration of accessible microstates credibly get us to the living cell out of Darwin's warm little pond or the like. Again, the notion that design advocates are arguing that there are no subsystems that undergo entropy reduction is a strawman. (e.g. a hot body interacting with a cold one by passing d'Q of energy loses numbers of ways that energy and mass may be distributed at micro level, in a way that the receiving sub system then in aggregate increases the number of ways to such an extent that net there is at least conservation of entropy.) What is being pointed out is that the very cluster of forces and mechanisms, such as diffusion and the like, that lead to this are such that we are not at all credibly going to get from them to constructive shaft work in accord with a Wicken wiring diagram issuing in FSCO/I-rich subsystems. And where the heat engine or energy conversion device that performs the sort of constructive work we are discussing -- such as assembling a protein step by step -- exhibits FSCO/I in itself, that too needs to be explained. The only empirically grounded, analytically credible cause of FSCO/I is design. Attempts to evade this by appealing to spontaneous order tracing to convection and boundary conditions leading to vortices, or to crystallisation on freezing by extracting the energy that prevents polarised molecules from forming an ordered structure, a crystal, reflect a fundamental confusion between randomness, order and specified complexity, especially functionally specified complexity. 
The attempt to pull the rabbit of a living, metabolising, encapsulated, gated, von Neumann, code-based self-replicating cell out of such hoped-for lucky noise, is a mark of the bankruptcy of evolutionary materialism right from the root of the tree of life. So, we may properly point out: no roots, no shoots, branches or twigs. Design sits at the table as of right of well-warranted induction backed up by the statistical analysis that undergirds thermodynamics, right from the root of the tree of life. KF kairosfocus
Onlookers: KS is too educated not to know that a pre-existing gravity field in which our observed cosmos forms at an instant and starts to expand, is an underlying assumed pre-universe. He is also too intelligent and educated not to know that such a fluctuation would be contingent and thus either chance or choice. It being known that he is evolutionary materialist or a fellow traveller, it is not hard to see which is his choice. Just, he wishes to get something for and from nothing, without reckoning what a real nothing is -- non being. Cf recent discussion of such poorly thought through metaphysics presented in a lab coat, here at UD -- on pulling a cosmos out of a non-existent hat. KF kairosfocus
So 5for, you are cosigning tornadoes?? someone seems to have no shame! bornagain77
Niwrad @109 - you are the priceless one. Lizzie did not say tornadoes do not destroy. She said no they do not represent the destructive nature of entropy increase. Your attempt to lampoon what she said by misquoting her was noticed by this onlooker. For shame! 5for
as to
It’s pretty easy (to explain the Big Bang), because the universe’s positive energy is offset by the negative energy of gravitation. Most cosmologists therefore think that the net energy of the universe is zero.
Seems somebody is channeling Peter Atkins
William Lane Craig vs Peter Atkins (HQ) 4a/11 http://www.youtube.com/watch?v=wMR8auaJK0o
Craig's rejoinder to Atkin's argument, at the 50:00 minute of this following video, is classic:
William Lane Craig vs Peter Atkins: "Does God Exist?", University of Manchester, October 2011 http://www.youtube.com/watch?v=Ssq-S5M8wsY&feature=player_detailpage#t=3009s
At least Krauss's argument from nothing is not quite as absurd as Atkin's is/was:
Why Atheism Is Nonsense - Part 2 "Something is Nothing" http://www.youtube.com/watch?v=-9m8P6UC1EI
Please note that Krauss refers at the 2:20 mark that the energy of empty space is not zero (i.e. does not balance to zero as Atkins holds) but is a 'gazillion times the energy we see'. I believe what Krauss is referring to is this fact:
Vacuum energy Excerpt: Vacuum energy is an underlying background energy that exists in space even when the space is devoid of matter (free space). (Vacuum energy has a postulated) value of 10^113 Joules per cubic meter. and: (10^113 joules) per (cubic meter) = 10^113 pascals (Pa) and 10^113 Pa approx = 4.6×10^113 Pa = 6.7×10^109 psi; Of note: The Planck pressure, (4.63x10^108 bar), is/was not reached except shortly after the Big Bang or in a black hole. http://en.wikipedia.org/wiki/Orders_of_magnitude_%28pressure%29
Of course the problem for Krauss was that he tried to redefine nothing as this empty space that is boiling with these virtual particles. The problem with all that of course is that space-time is also shown to have come into existence at the Big Bang, thus Krauss does not even have space-time filled with virtual particles to appeal to since even space-time did not exist. But I guess all that is just a small detail that could be overlooked if your primary goal is to deny Genesis in the first place: Notes:
bornagain77
It’s pretty easy, because the universe’s positive energy is offset by the negative energy of gravitation. Most cosmologists therefore think that the net energy of the universe is zero.
Even assuming this is true, would it not be fair to say that represents, as I said, a "violation" of what was, at least at one point, "the normal understanding of a fundamental natural law"? In fact, many scientists argued against the Big Bang because it violated their notion of the fundamentals of science:
"Some prominent scientists began to feel the same irritation over the expanding universe that Einstein had expressed earlier. Eddington wrote in 1931, 'I have no ax to grind in this discussion, but the notion of a beginning is repugnant to me. The expanding universe is preposterous...incredible, it leaves me cold.' The German chemist Walter Nernst wrote 'To deny the infinite duration of time would be to betray the very foundation of science.'"
So, is it not possible that a conflict could be found between, say, the current materialist view of the origin and development of life and the current understanding of what the fundamental laws of nature are? Maybe there is something missing in one of these two current understandings. Maybe that something is "natural" (like "negative energy of gravitation"), or maybe it is not. CS3
KF, I didn't say anything about a multiverse or a fluctuation. I explained to CS3 that the Big Bang is not an example of a first law violation, because the net energy of the universe appears to be zero:
It’s pretty easy, because the universe’s positive energy is offset by the negative energy of gravitation. Most cosmologists therefore think that the net energy of the universe is zero.
You disputed that, and you were wrong. Again. keiths
Onlookers, yet another KS strawman, duly laced with ad hominems and set alight to cloud the issue, poison and polarise the atmosphere. There is no evidence observed of a multiverse or another wider universe that threw this one up as a fluctuation; all is admitted speculation. KS cites evidence pointing to some aspects of fine tuning, if anything, and then pretends to find his underlying sub-cosmos to bubble us up as if it were empirically observed rather than a highly speculative model. KF kairosfocus
EL:
just as certain physical constraints make a dust devil probable on a cool sunny day in a dry desert, in other words, make a local entropy decrease probable, so we have no reason a priori to think that there may also have been physical constraints that made self-replication probable on earth 3 or 4 billion years ago.
1 --> whirlwinds are observed, speculated self replicating molecules simply are not and have not. Why are you trying to suggest the second is as empirical as the other? Or, that we must answer to air castle molecules with speculated properties as though they were realities? Show them real first, then we can do science, as in based on observations. Absent that you are talking materialist fantasies. 2 --> The only self replicating life observed uses gated encapsulation, metabolism using nanotech molecular machines and a code based replication facility following von Neumann's kinematic self replicator architecture. Such is chock-full of FSCO/I, manifests oodles of constructive work using FSCO/I rich machines, and we know but one empirically substantiated, analytically credible source for FSCO/I. 3 --> Worse, dust devils are a manifestation of mechanical necessity riding on convection processes. Like hurricanes, they reflect order, not organisation on a Wicken wiring diagram, i.e. there is a category confusion linked to your refusal to acknowledge the logic of the design explanatory filter's first step, namely that necessity leading to regularity is distinct from high contingency tracing to chance or choice where also complex functional organisation has but one empirically warranted explanation, design. 4 --> Notice Wiki:
Dust devils form when hot air near the surface rises quickly through a small pocket of cooler, low-pressure air above it. If conditions are just right, the air may begin to rotate. As the air rapidly rises, the column of hot air is stretched vertically, thereby moving mass closer to the axis of rotation, which causes intensification of the spinning effect by conservation of angular momentum. The secondary flow in the dust devil causes other hot air to speed horizontally inward to the bottom of the newly forming vortex. As more hot air rushes in toward the developing vortex to replace the air that is rising, the spinning effect becomes further intensified and self-sustaining. A dust devil, fully formed, is a funnel-like chimney through which hot air moves, both upwards and in a circle. As the hot air rises, it cools, loses its buoyancy and eventually ceases to rise. As it rises, it displaces air which descends outside the core of the vortex. This cool air returning acts as a balance against the spinning hot-air outer wall and keeps the system stable.[4] The spinning effect, along with surface friction, usually will produce a forward momentum. The dust devil is able to sustain itself longer by moving over nearby sources of hot surface air. As available extreme hot air near the surface is channelled up the dust devil, eventually surrounding cooler air will be sucked in. Once this occurs, the effect is dramatic, and the dust devil dissipates in seconds. Usually this occurs when the dust devil is not moving fast enough (depletion) or begins to enter a terrain where the surface temperatures are cooler, causing unbalance.
5 --> Order, shaped by initial conditions, boundary conditions and mechanical laws. 6 --> Contrast Wicken and Orgel, as has repeatedly been brought to your attention and just as repeatedly ignored the better to repeat long since cogently answered assertions -- yes, talking points repeated in an ad nauseam drumbeat regardless of cogent response or correction -- to the point of willfulness. Let me clip from point 8 at 41, dismissed by KS as spam, reflecting his own refusal to attend to cogent material:
WICKEN, 1979: >> ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and common- place initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [[“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65. (Emphases and notes added. Nb: “originally” is added to highlight that for self-replicating systems, the blue print can be built-in.)] >> ORGEL, 1973: >> . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity. [[The Origins of Life (John Wiley, 1973), p. 189.]
7 --> Here is a key part of what I noted on vortices and crystals in my clipped remarks on hurricanes and snowflakes, in 42 above:
A tropical cyclone is by and large shaped by convective and Coriolis forces acting on a planetary scale over a warm tropical ocean whose surface waters are at or above about 80 degrees F. That is, it is a matter of chance + necessity leading to order under appropriate boundary conditions, rather than to complex, functionally specified information. Similarly, the hexagonal, crystalline symmetry of snowflakes is driven by the implications of the electrical polarisation in the H-O-H (water) molecule — which is linked to its kinked geometry, and resulting hexagonal close packing.
8 --> Now you tried to make much of how forming an ordered vortex implies a local reduction in entropy, as if voila, reductions in entropy resulting in constructive work creating FSCO/I rich systems can be bought so cheaply. 9 --> There is a basis for the structure of a crystal or vortex that reflects order. The properties of molecules, atoms or ions, the specifics of convectional circumstances. None of these has anything to do with aperiodic systems built up in accordance with wiring diagrams or coded complex specifications such as we see with say proteins. 10 --> In short, FSCO/I patently cannot be had on the cheap. Let me cite Thaxton et al on the matter, TMLO ch 8:
Only recently has it been appreciated that the distinguishing feature of living systems is complexity rather than order.4 This distinction has come from the observation that the essential ingredients for a replicating system---enzymes and nucleic acids---are all information-bearing molecules. In contrast, consider crystals. They are very orderly, spatially periodic arrangements of atoms (or molecules) but they carry very little information. Nylon is another example of an orderly, periodic polymer (a polyamide) which carries little information. Nucleic acids and protein are aperiodic polymers, and this aperiodicity is what makes them able to carry much more information. By definition then, a periodic structure has order. An aperiodic structure has complexity. In terms of information, periodic polymers (like nylon) and crystals are analogous to a book in which the same sentence is repeated throughout. The arrangement of "letters" in the book is highly ordered, but the book contains little information since the information presented---the single word or sentence---is highly redundant. It should be noted that aperiodic polypeptides or polynucleotides do not necessarily represent meaningful information or biologically useful functions. A random arrangement of letters in a book is aperiodic but contains little if any useful information since it is devoid of meaning. [NOTE: H.P. Yockey, personal communication, 9/29/82. Meaning is extraneous to the sequence, arbitrary, and depends on some symbol convention. For example, the word "gift," which in English means a present and in German poison, in French is meaningless]. Only certain sequences of letters correspond to sentences, and only certain sequences of sentences correspond to paragraphs, etc. In the same way only certain sequences of amino acids in polypeptides and bases along polynucleotide chains correspond to useful biological functions. 
Thus, informational macro-molecules may be described as being in a specified sequence . . . . Three sets of letter arrangements show nicely the difference between order and complexity in relation to information: 1. [Class 1:] An ordered (periodic) and therefore specified arrangement: THE END THE END THE END THE END Example: Nylon, or a crystal . . . . 2. [Class 2:] A complex (aperiodic) unspecified arrangement: AGDCBFE GBCAFED ACEDFBG Example: Random polymers (polypeptides). 3. [Class 3:] A complex (aperiodic) specified arrangement: THIS SEQUENCE OF LETTERS CONTAINS A MESSAGE! Example: DNA, protein. Yockey7 and Wickens5 develop the same distinction, that "order" is a statistical concept referring to regularity such as could characterize a series of digits in a number, or the ions of an inorganic crystal. On the other hand, "organization" refers to physical systems and the specific set of spatio-temporal and functional relationships among their parts. Yockey and Wickens note that informational macromolecules have a low degree of order but a high degree of specified complexity. In short, the redundant order of crystals cannot give rise to specified complexity of the kind or magnitude found in biological organization; attempts to relate the two have little future.
________ The category confusion is duly corrected. At least for the benefit of onlookers, for, on long track record, you simply will not listen or respond appropriately. (I wish that at length you would show me wrong, but every evidence points to my being right.) KF kairosfocus
KF:
For which the empirical evidence is nil.
KF, you crack me up. Recent WMAP measurements show that the universe is flat to within 0.4%. Flat means zero energy. You are terrible at bluffing. keiths
F/N:
It’s pretty easy, because the universe’s positive energy is offset by the negative energy of gravitation. Most cosmologists therefore think that the net energy of the universe is zero.
This is a very poorly, popularly worded way of speculating that we are in effect a fluctuation of a wider universe as a whole. For which the empirical evidence is nil. This is speculative metaphysics, not science, and it is often very badly done -- e.g. something from nothing (non-being) -- because those who do it are not particularly qualified in metaphysics. KF PS: Onlookers, it has long since been pointed out that there are good reasons to partition entropy; KS is just playing the usual irresponsible strawman talking-point tactic games he seems to have played for years. The aim is to swarm down by drumbeat repetition, wearing down. (Americans expect knockout quick wins. This is a case where ideologues will need to know they are only succeeding in showing how unreasonable, irresponsible and outright uncivil they are, all warning flags.) kairosfocus
KS: I think you need to read here on then go look yourself in the eye in a mirror. Your tactics are straight out of the rulebook. Alinsky's rulebook. KF kairosfocus
CS3,
How do you reconcile the First Law with the Big Bang?
It's pretty easy, because the universe's positive energy is offset by the negative energy of gravitation. Most cosmologists therefore think that the net energy of the universe is zero. And sure, what we refer to as laws aren't necessarily sacrosanct and inviolable. The first law can be violated if you "repay" the debt fast enough -- this is what vacuum fluctuations are about -- and the second law is actually expected to be violated if enough time elapses or if the system involved is small enough, since it is a statistical law, not an absolute one. However, to suggest as Robert does that the second law is continually being violated all over the world every time a plant grows, is rather extreme even by crank standards. As is suggesting that Granville has made "an important contribution to physics" with his "X-entropy." keiths
F/N 2: It is always a sign that the objections are breaking down when there is the attempt to resort to multiverses and some version or other of the anthropic principle. The problem is that this is ad hoc, first, and has utterly zero actual empirical warrant. Second, even if there were a multiverse, the problem is that the observed cosmos sits at a LOCALLY fine-tuned operating point suited for life. That is, in John Leslie's example, it is like a long wall with a local zone in which there is just one fly. Swat, it is hit by a bullet. Even if there are other parts of the wall elsewhere that positively are carpeted with flies, so that a bullet hitting anywhere would smash a fly, the reasonable person, on seeing this case, would infer to a good marksman wielding a tack-driving rifle, which is itself a serious challenge. Onlookers will want to look here and onwards linked for more. kairosfocus
Lizzie:
And, Kairosfocus, this is not a “talking point”: it is simply a point which I would be very pleased to see you address.
Lizzie, You don't understand. Every point made by an Alinskyite evo mat propagandist such as you is a drumbeat strawman talking point, soaked in oil of ad hominem and set alight to cloud, poison, and polarise the atmosphere, and also to homosexualise the sacred institution of marriage, as Plato warned. Why do you persist in the teeth of correction? Whereas KF's points are straight from the mouths of angels and always linked to boot. keiths
EL: if you will take time to notice, you will see that, first, Robertson is using the Gibbs approach, which directly reckons with varied probabilities of accessing modes. The Gibbs formulation reduces to Boltzmann's on a flat random distribution but is not locked down to it. However, the Boltzmann approach helps us see what is going on by using an instructive simple case.

Second, the info measure of entropy is similarly based on that same approach, i.e. the flat random probability is only a special case and is not a main part of the analysis. In the case of Durston et al, the shift from null state to ground state to functional state on empirical observation reckons with that too.

Finally, in the FSCO/I threshold the metric is NOT -- repeat, NOT -- built around probabilities but something far more blunt yet effective for a threshold: sampling. 500 bits worth of complexity swallows up the search resources of our solar system to the proportion of a blind sample of 1 straw to a cubical haystack 1,000 light years thick. That is, if you were to superpose the haystack on our galactic neighbourhood and pull a straw-sized sample, regardless of the many thousands of star systems in it, with all but absolute certainty you will only pull a straw, because the bulk so utterly dominates. This is the same point in thermodynamics.

So, as has been said to you umpteen times and studiously ignored in haste to find any dodge, the matter is settled simply on the relative rarity of FSCO/I, such that we have no reason to expect so small a relative sample, if blind, to capture it. Where the req't of multi-part matching, arrangement and coupling to yield function guarantees that we are dealing with rather special and unusual arrangements. But then, by now I need to recognise that you will probably never acknowledge this point, so this is a note for onlookers, in absence of a reasonable responsiveness on your part.
KF PS: As has been explained, thermodynamics, especially the statistical form, is concerned with a lot more than heat flows. "Energy flows" is not a good term, as work, for just one instance, is not a flow. There are a great many linked phenomena that are not really about heat and heat engines, e.g. viscosity, diffusion, etc. All of this is part of why I gave that pistons-and-marbles thought exercise to help see this. But instead of accepting that this has some significance in helping the onlooker, it was derided uncivilly as "spamming." Translation: we hold you in contempt and have not the slightest intent to listen to or even put up with what you have to say -- much less acknowledge that you just may have some knowledge here. After all, design thinkers and the like are all obviously ignorant, stupid, insane or wicked. kairosfocus
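[Editor's note: the claim above that the Gibbs formulation reduces to the Boltzmann form on a flat (equiprobable) distribution is easy to verify numerically. A minimal sketch, not part of the thread; the 1024-state system is an arbitrary illustration, and the constant k is set to 1:]

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs entropy S = -k * sum(p_i * ln p_i) over accessible microstates."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A flat (equiprobable) distribution over Omega microstates reduces
# to the Boltzmann form S = k * ln(Omega).
omega = 1024
flat = [1.0 / omega] * omega
assert abs(gibbs_entropy(flat) - math.log(omega)) < 1e-9

# A non-flat distribution over the same states has strictly lower entropy:
biased = [0.5] + [0.5 / (omega - 1)] * (omega - 1)
assert gibbs_entropy(biased) < gibbs_entropy(flat)
```

With p_i = 1/Ω for every state, -k Σ p_i ln p_i collapses term by term to k ln Ω, i.e. the S = k ln(Ω) of the post's introduction.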
keithS: How do you reconcile the First Law with the Big Bang? Could there then not have also been one (or perhaps more) creative events (whether due to intelligence or not) in the history of the origin and/or development of life which also "violated" at least the normal understanding of a fundamental natural law? At least in theory, even if you do not believe the evidence currently indicates such? On a related note, is there any reason that multiverses and the anthropic principle can be used to explain why we are in a universe with laws finely-tuned for the development of life, but could not be used, at least in theory, to explain one or more extremely improbable events in the origin and/or development of life (other than that allowing such a possibility might give some credibility to those who have claimed all along that materialist origin of life and evolutionary theories are insufficient)? CS3
Elizabeth B Liddle
No they do not [tornados do not destroy]. A tornado often rearranges things so that the arrangement has less entropy than it did before, such as piling things up, or depositing things in high places (a tornado lifts a sofa into a tree).
Elizabeth, you are priceless. You (Evolutionists & Co.) should be here showing us how spontaneous organization (evolution) arises instead of following the general towards-disorder trend of the SLoT, and what do you offer? A tornado that lifts a sofa into a tree! :)
And in the statistical mechanics sense it is about energy. And the only things that the 2nd Law of Thermodynamics applies to is the distribution of energy states. You can get the “entropy” of other things of course, and we do. But the 2nd Law won’t necessarily hold true for those things.
Disagree. The SLoT of thermodynamics/statistical mechanics deals with matter and energy and microstates and macrostates and entropy and order and information and organization and CSI... niwrad
I do think that the major contributor to misunderstanding in these discussions is lack of clarity regarding what a probability is. A probability is just a number between 0 and 1, and can be converted into "bits" by taking the negative log. It tells us absolutely nothing on its own, and is not the property of a pattern, but may be the probability of a pattern (or of one of a class of patterns) under given conditions. Robert makes this clear, pointing out that 1 over the number of possible permutations of a pattern is not the same as the probability of each permutation, because there may be physical constraints that favour some more than others.

And that is absolutely key. If we see a rather special-looking pattern, we cannot simply say: this is improbable, it must have been designed. We must also ask: what physical constraints might have made this pattern probable? And just as certain physical constraints make a dust devil probable on a cool sunny day in a dry desert, in other words, make a local entropy decrease probable, so we have no reason a priori to rule out physical constraints that made self-replication probable on earth 3 or 4 billion years ago. And certainly, no reason to infer that there were none simply because the probability under random walk is prohibitively low.

And, Kairosfocus, this is not a "talking point": it is simply a point which I would be very pleased to see you address. Elizabeth B Liddle
KF
I have taken time to show how and why thermodynamics is not simplicitas about flow of heat,
but it IS about the flow of energy.
but takes in a LOT of molecular and similar scale forces and phenomena, in the first instance. It extends to macro phenomena
Indeed. As in my example of a tornado that lifts a sofa into a tree.
and as a classic case in point, diffusion is not about heat but about molecular randomness.
And that randomness is about the kinetic energy of the molecules, and thus about heat. Elizabeth B Liddle
PS: And BTW, the inference that we are dealing with shaft work carrying out constructive work resulting in FSCO/I, not diffusion-like forces, has nothing to do with the usual false dichotomy of objectors to the design inference, that it is about a contrast of natural vs supernatural. As has been stated, cited and explained, and routinely willfully ignored in haste to get back to strawman talking points, from at least Plato on the record, the proper contrast is nature = chance + necessity acting spontaneously (in the Gibbs sense) vs art acting by constructive work. If you will not acknowledge by now that design theorists, from Thaxton et al on, openly and freely acknowledge that design inferences on the observed world of life do not allow us to infer from that to designers within or beyond the cosmos, it reflects sustained willful misrepresentation on your side's part, not honest discussion or disagreement. Have at minimum the honesty to acknowledge that we have openly said that for nigh on 30 years. kairosfocus
EL: I have taken time to show how and why thermodynamics is not simplicitas about flow of heat, but takes in a LOT of molecular and similar-scale forces and phenomena, in the first instance. It extends to macro phenomena, and, as a classic case in point, diffusion is not about heat but about molecular randomness. Through issues on conservation and degradation of energy it has a lot to say about macro-scale phenomena. Through the connexion to information and to config spaces, it speaks much to relevant phenomena on information and organisation. As for the latest attempt to dismiss complex specified info, the materially relevant part is functionally specific complex info, often implicit in wiring-diagram [nodes and arcs] organisation. Describing something that can be observed to work or not work in a relevant context [functional specificity], and is dependent on putting many matched parts [complexity] together in a narrow cluster of ways [specific organisation] from the set of possible ways to clump or scatter such across our planet or solar system or observed cosmos, is plainly objective. Indeed, by describing something as functional in a particular way, we have a quick and dirty way to specify it without having to list off its parts, arrangement and coupling in detail. Your post and this one are both ASCII-coded posts in English in the context of a thread, and exhibit FSCO/I. It is time that you accepted reality instead of trying to rhetorically brush it off. KF kairosfocus