Uncommon Descent Serving The Intelligent Design Community

Failure of the “compensation argument” and implausibility of evolution

Categories: Biophysics, Intelligent Design

Granville Sewell and Daniel Styer have one thing in common: both wrote an article with the same title, “Entropy and evolution”. But they reach opposite conclusions on a fundamental question: Styer says that the evolutionist “compensation argument” (henceforth “ECA”) works; Sewell says it doesn’t. Here I briefly explain why I fully agree with Granville. The ECA is an argument that tries to resolve the problems the 2nd law of statistical mechanics (henceforth “2nd_law_SM”) poses for unguided evolution. I adopt Styer’s article as the ECA archetype because he also offers calculations, which make its failure clearer.

The 2nd_law_SM as a problem for evolution.

The 2nd_law_SM says that an isolated system moves toward its more probable macrostates. In this diagram the arrow represents the rightward trend/direction of the 2nd_law_SM:

organization … improbable_states … systems ====>>> probable_states

Sewell says:

“The second law is all about using probability at the microscopic level to predict macroscopic change. […] This statement of the second law, or at least of the fundamental principle behind the second law, is the one that should be applied to evolution.”

The physical evolution of an isolated system passes spontaneously through macrostates of increasing probability until it arrives at equilibrium (the most probable macrostate). Since organization is highly improbable, a corollary of the 2nd_law_SM is that isolated systems don’t self-organize. That is the opposite of what biological evolution claims.
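This statistical drift toward the most probable macrostate can be illustrated with a toy model. The sketch below is my own illustration (not from Sewell or Styer): the classic Ehrenfest urn, an isolated two-compartment box where a randomly chosen particle hops sides at each step. However the system starts, the count drifts toward the 50/50 region, simply because vastly more microstates correspond to it.

```python
import random

def ehrenfest(n_particles=100, steps=10_000, seed=0):
    """Isolated two-compartment box: each step one randomly chosen
    particle hops to the other side. Returns the final count in side A."""
    rng = random.Random(seed)
    in_a = n_particles          # start in an extreme, improbable macrostate: all in A
    for _ in range(steps):
        # the chosen particle is in A with probability in_a / n_particles
        if rng.random() < in_a / n_particles:
            in_a -= 1           # it hops A -> B
        else:
            in_a += 1           # it hops B -> A
    return in_a

print(ehrenfest())  # ends near 50, the most probable macrostate
```

Run with any seed, the final count fluctuates only a few particles around 50: the model never spontaneously returns to the "all in A" arrangement, which is the statistical content of the 2nd_law_SM.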


Styer’s ECA.

Since the 2nd_law_SM applies to isolated systems, the ECA says: the Earth E is not an isolated system, so its entropy can decrease thanks to an entropy increase (compensation) in the surroundings S (with respect to the energy coming from the Sun). Unfortunately, considering the system open doesn’t help because, as Sewell puts it:

“If an increase in order is extremely improbable when a system is closed, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable.”

Here is how Styer applies the ECA to show that “evolution is consistent with the 2nd law”.
Suppose that, due to evolution, each individual organism is 1000 times more improbable than the corresponding individual was 100 years ago (Emory Bunn says 1000 times is incorrect, it should be 10^25 times, but this is a detail). If Wi is the number of microstates consistent with the specification of an initial organism I 100 years ago, and Wf is the number of microstates consistent with the specification of today’s improved and less probable organism F, then

Wf = Wi / 1000

At this point he uses Boltzmann’s formula:

S = k * ln (W)

where S = entropy, W = number of microstates, k = 1.38 x 10^-23 joules per kelvin (Boltzmann’s constant), and ln is the natural logarithm.

Then he calculates the entropy change over 100 years, and finally the entropy decrease per second:

Sf – Si = -3.02 x 10^-30 joules/degrees

By considering all individuals of all species he gets the change in entropy of the biosphere each second: -302 joules/degrees. Since he knows that the Earth’s physical entropy throughput (due to energy from the Sun) each second is: 420 x 10^12 joules/degrees he concludes: “at a minimum the Earth is bathed in about one trillion times the amount of entropy flux required to support the rate of evolution assumed here”, then evolution is largely consistent with the 2nd law.
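The arithmetic quoted above can be checked in a few lines. The sketch below is my reconstruction: it applies S = k ln W to a 1000-fold probability decrease (Wf = Wi / 1000) and then takes the ratio between the solar entropy throughput and the biosphere figure as reported in the post; both of those totals are Styer’s own assumptions, not something the code derives.

```python
import math

k = 1.380649e-23                 # Boltzmann's constant in J/K (the post rounds to 1.38e-23)

# Entropy change per organism for a 1000-fold decrease in probability:
# delta_S = k ln(Wf) - k ln(Wi) = k ln(Wf/Wi) = k ln(1/1000)
delta_s = k * math.log(1 / 1000)
print(delta_s)                   # about -9.54e-23 J/K per organism

# Styer's comparison, using the per-second figures quoted in the post:
throughput = 420e12              # Earth's entropy throughput from the Sun, J/K per second
biosphere = 302                  # magnitude of the biosphere's entropy decrease, J/K per second
print(throughput / biosphere)    # about 1.4e12 -- "about one trillion times"
```

The ratio is indeed of order 10^12, matching Styer’s “one trillion times” remark; the dispute in what follows is not with this arithmetic but with what the units mean.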

The problem in Styer’s argument (and in general in the ECA).

Although it might seem an innocent issue of measurement units, the introduction of Boltzmann’s formula with k = 1.38 x 10^-23 joules/degrees in this context is a conceptual error. With that formula the ECA transforms a difficult problem of probability (connected with the arising of ultra-complex organized systems) into a simple issue of energy (the “joule” is a unit of energy, work, or amount of heat). This assumes a priori that energy is able to organize organisms from sparse atoms. But that assumption is entirely gratuitous and unproved. That energy can do so is exactly what the ECA should prove in the first place. So Styer’s ECA begs the question.

Similarly Andy McIntosh (cited by Sewell) says:

Both Styer and Bunn calculate by slightly different routes a statistical upper bound on the total entropy reduction necessary to ‘achieve’ life on earth. This is then compared to the total entropy received by the Earth for a given period of time. However, all these authors are making the same assumption—viz. that all one needs is sufficient energy flow into a [non-isolated] system and this will be the means of increasing the probability of life developing in complexity and new machinery evolving. But as stated earlier this begs the question…

The Boltzmann formula in the ECA, with its introduction of joules of energy, establishes a bridge between probabilities and the joules coming from the Sun. Unfortunately this link is unsubstantiated here, because no one has proved that joules cause biological organization. On the contrary, in my previous post “The illusion of organizing energy” I explained why no kind of energy per se can create organization in principle. A fortiori, thermal energy is unequal to the task. In fact, heat is the most degraded and disordered kind of energy, the one with maximum entropy. So the ECA would also contain an internal contradiction: by importing entropy into E one decreases the entropy of E!

The problem with Boltzmann’s formula, as used in the ECA, is then that it “buys” a probability bonus with energy “money”. Sewell expresses the same concept in different words:

The compensation argument is predicated on the idea […] that the universal currency for entropy is thermal entropy.

That conversion/compensation is not allowed unless one has proved at the outset a direct causal role of energy in producing the effect, biological organization, which lies in the direction opposite to the 2nd_law_SM’s rightward arrow (at the extreme left of the diagram above). In a sense the ECA conflates two different planes. This wrong conflation is like saying that a roulette wheel placed inside a refrigerated room can easily output one million “black” results in a row because its entropy is lower than outside.
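The roulette analogy can be made quantitative. In this sketch the numbers are my own assumptions (a European wheel with 18 black pockets out of 37): the probability of a million blacks in a row is computed in log space, and nothing in the calculation involves the room’s temperature.

```python
import math

p_black = 18 / 37                     # European wheel: 18 black pockets of 37
n = 1_000_000                         # one million consecutive black outcomes

# The probability underflows a float, so work with its base-10 logarithm:
log10_p = n * math.log10(p_black)
print(f"P(one million blacks) = 10^({log10_p:.0f})")
```

The exponent comes out around minus three hundred thousand; refrigerating the room changes the thermal entropy of the wheel but not a single term in this product of probabilities.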

Note that evolution doesn’t imply a single small deviation from the trend; quite differently, it implies countless highly improbable processes happening continually in countless organisms over billions of years. Would those who claim that evolution doesn’t violate the 2nd_law_SM doubt a violation if countless tornados always turned rubble into houses, cars and computers for billions of years? Sewell asks (the backward tornado is the metaphor he uses most). In conclusion Roger Caillois is right: “Clausius and Darwin cannot both be right.”

Implausibility of evolution.

Styer’s paper is also an opportunity to see the problem of evolution from a probabilistic viewpoint. You will note the huge difference in difficulty between the probabilistic scenario and the enthusiastic thermal-entropy scenario above, with its room for 1,000,000,000,000 times the required evolution!
In Appendix #2 he proposes a problem for students: “How much improved and less probable would each organism be, relative to its (possibly single-celled) ancestor at the beginning of the Cambrian explosion? (Answer: 10 raised to the 1.8 x 10^22 times)”. Call this monster number “a”, Wi the initial microstates, Wf the final microstates, and W the total microstates. According to Styer’s answer (which is correct as a calculation) we have:

Wf = Wi / a

The probability of the initial macrostate is Wi / W. The probability of the final macrostate is Wf / W. Suppose Wf = 1; then Wi = a. W must be equal to or greater than a, otherwise (Wi / W) would be greater than 1 (impossible). Therefore the probability that the final macrostate occurs is:

(Wf / W) ≤ (1 / a)

This is the probability of evolution of a single individual organism in the Cambrian:

1 in 10 raised to the 1.8 x 10^22

a number with more than 10^22 digits (10 trillion billion digits). This miraculous event had to occur 10^18 times, once for each of the other organisms.

Dembski’s “universal probability bound” is:

1 / 10^150

1 in a number with “only” 150 digits. Therefore evolution is far beyond the plausibility threshold. In conclusion: the ECA fails to prove that “evolution is consistent with the 2nd law”, and we also have a proof of the implausibility of evolution based on probability.
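Numbers like 10^(1.8 x 10^22) overflow any floating-point type, so the comparison with Dembski’s bound has to be done on the base-10 exponents. A minimal sketch, using the two exponents quoted above:

```python
# Work with base-10 exponents; the probabilities themselves underflow any float.
exp_evolution = 1.8e22   # P = 1 / 10^(1.8e22), the Appendix #2 answer quoted above
exp_dembski   = 150      # Dembski's universal probability bound: 1 / 10^150

# The event falls below the bound iff its exponent exceeds the bound's exponent.
print(exp_evolution > exp_dembski)   # the evolution probability is far below the bound
print(exp_evolution / exp_dembski)   # the exponent is about 1.2e20 times larger
```

Comparing exponents rather than the probabilities themselves is the standard trick for magnitudes of this size; any log-space arithmetic would do the same job.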

Some could object: “you cannot have it both ways: if the ECA is wrong then Appendix #2 is wrong too, because it uses the same method, so the evolution probability is not correct”.
Answer: the method is biased toward evolution both in the ECA and in Appendix #2. This means the evolution probability is even worse than that, and the implausibility of evolution holds a fortiori.

Comments
Also, my worldview doesn't hold that particles are ultimate. It holds that fields are ultimate, unless we discover that they aren't! In any case, the fact that toilets are made up of particles doesn't make it incoherent to speak of toilets, just as the fact that societies are made up of people doesn't make it incoherent to speak about the United States. For anyone, including materialists. You may want to slow down and think this through, fifth.phoenix
April 4, 2015, 09:33 PM PDT
Phoenix In a sense a toilet is ultimate. Materialism can't even account for where the materials for the toilet comes from even less it's arrangement to make an actual toilet. Give it a go .. good luck chump.Andre
April 4, 2015, 09:32 PM PDT
Phoenix, My take on it is that Box is trying to get his conversation partners to deal with the logical extension of their worldview. I am quite certain that both sides will agree that gravity acting on a fluid will form reservoirs on an uneven earth in accordance with thermodynamic law, but again, it is irrelevant to the larger issue – except to the extent that it provides a point of comparison to organization that comes into being only by the translation of a medium whose arrangements are inert to local thermodynamics.Upright BiPed
April 4, 2015, 09:32 PM PDT
fifth, Your worldview holds that toilets are ultimate? And that when you separate the tank from the bowl, that "ultimate" toilet ceases to exist?phoenix
April 4, 2015, 09:25 PM PDT
phoenix says, If there is no need for such an entity, then materialists, by your own admission, can speak coherently of macroscopic things. I say, Geez Are you really this slow?? Unlike my worldview Materialism holds that particles are ultimate. Therefore it behooves the materialist to explain how given materialism that particles can yield whole things. Since I don't hold that particles are ultimate I don't need to posit an explanation for how the whole arises. Materialism can't borrow my worldview's conclusions when it denies it's premises. again geez Peacefifthmonarchyman
April 4, 2015, 09:22 PM PDT
KF, Do you believe that evolution and/or OOL violate the second law? Straight answer, please.phoenix
April 4, 2015, 09:16 PM PDT
fifthmonarchyman,
Particles and wholes are equally ultimate according to my worldview. There is no need for an entity (immaterial or otherwise) to unite them.
You're still not thinking this through. If there is no need for such an entity, then materialists, by your own admission, can speak coherently of macroscopic things. You and Box got it wrong.phoenix
April 4, 2015, 09:11 PM PDT
phoenix: Dismissive not substantial. I have pointed out what the evidence warrants. Lucky noise and irrelevant energy flows allegedly substituting for relevant information flows and organising constructive work, is what is truly bizarre. KFkairosfocus
April 4, 2015, 09:10 PM PDT
Upright Biped,
Pheonix, the organization that Box and Fifth are trying to get Piotr and Zachriel to address (in earnest) only comes into being through the translation of an informational medium. Your counterexample, the formation of a lake, is irrelevant to the conversation.
No, it's quite relevant. Here's Box:
Why are materialists – like Piotr and Zachriel – arguing that under materialism there is more than particles in motion, that “things” and “objects” exist on their own?
A lake is a "thing", is it not? PS How's the website coming? Has EugeneS joined your "board of directors"?phoenix
April 4, 2015, 09:06 PM PDT
phoenix asks, What is the immaterial entity that unites all the rock particles into a rock whole? I say, third time Particles and wholes are equally ultimate according to my worldview. There is no need for an entity (immaterial or otherwise) to unite the particles. If you still don't understand I can't help you peacefifthmonarchyman
April 4, 2015, 09:05 PM PDT
CJYman, Thank you. I am glad to see your comments appearing here again. There are comments of yours that I have had bookmarked for years now.Upright BiPed
April 4, 2015, 09:01 PM PDT
Pheonix, the organization that Box and Fifth are trying to get Piotr and Zachriel to address (in earnest) only comes into being through the translation of an informational medium. Your counterexample, the formation of a lake, is irrelevant to the conversation.Upright BiPed
April 4, 2015, 08:57 PM PDT
kairosfocus, I am quite comfortable thinking of entropy in informational terms, but the fact that it can be viewed that way hardly justifies the bizarre notions of Sewell and others regarding the Second Law.phoenix
April 4, 2015, 08:56 PM PDT
Fifthmonarchyman, If it's so obvious, then tell us. How can a non-materialist speak coherently of lakes, rocks, iPhones and toilets if a materialist cannot? What is the immaterial entity that unites all the rock particles into a rock whole? And what happens to that entity when the rock cracks in two? You and Box didn't think this through.phoenix
April 4, 2015, 08:45 PM PDT
CJYman,
P3. If configuration entropy is apparently violated, then statistical thermodynamics is apparently violated.
What does that mean? Entropy isn't a law that can be violated.phoenix
April 4, 2015, 08:39 PM PDT
Hey phoenix, Do you honestly believe that holding the one and the many to be equally ultimate means I believe in toilet souls or are you just Trolling? Peacefifthmonarchyman
April 4, 2015, 08:38 PM PDT
phoenix says, Nothing in your #323 addresses my question: I say, please pass the duct tape me head she is fixing to blow. Is it really possible that someone could completely miss something when it is right in front of their face? peacefifthmonarchyman
April 4, 2015, 08:33 PM PDT
fifthmonarchyman:
I would venture to bet that their presuppositions were similar to mine and when they formulated the second law they never dreamed of a universe in which “whole” things were nothing but arraignments of particles in motion.
You think they believed in toilet souls, or some other non-material entity that unites a toilet's particles into a unified whole? What happens when you pull the tank off the bowl? Does the toilet soul vanish, to be replaced by tank and bowl souls?phoenix
April 4, 2015, 08:32 PM PDT
assorted lord Kelvin quotes, "The assumption of atoms can explain no property of body which has not previously been attributed to the atoms themselves." and "I need scarcely say that the beginning and maintenance of life on earth is absolutely and infinitely beyond the range of all sound speculation in dynamical science. The only contribution of dynamics to theoretical biology is absolute negation of automatic commencement or automatic maintenance of life." and "...Creative Power is the only feasible answer to the origin of life from a scientific perspective." and "Overwhelming strong proofs of intelligent and benevolent design lie around us." end quote: ;-) peacefifthmonarchyman
April 4, 2015, 08:28 PM PDT
fifthmonarchyman, Nothing in your #323 addresses my question:
If it’s illegitimate for a materialist to speak of macroscopic things — a lake, for instance — then why doesn’t a dualist [or any non-materialist] face the same problem? Does an immaterial “lake soul” unite the particles into a coherent “lake whole”?
You and Box goofed. You thought you were scoring a point against materialism, but you didn't realize that your argument, if valid, would undermine your own position -- unless you can defend the existence of immaterial lake souls, rock souls, iPhone souls, toilet souls, etc.phoenix
April 4, 2015, 08:27 PM PDT
phoenix says, Piotr and Zachriel are discussing the second law of thermodynamics. God knows (so to speak) what you, Box, et al are talking about. I say, Kelvin was a devote Christian and Rudolf Clausius' father was a Protestant Pastor. I would venture to bet that their presuppositions were similar to mine and when they formulated the second law they never dreamed of a universe in which "whole" things were nothing but arraignments of particles in motion. peacefifthmonarchyman
April 4, 2015, 08:06 PM PDT
First, Upright Biped, I enjoy reading through your posts. Your discussions of information & protocols are bang on. Please continue. The disconnect between information and physical law is one of the more powerful arguments for ID. scordova: "NO! J/K is dimensionless! It only indicates the method used to count the energy microstates." Then you should have no problems with anything I laid out in my last post #216. My next reply will be to Keiths at TSZ, and work is very busy this weekend. The conversation is really just starting to take off so I'm looking forward to continuing where I left off discussing organization of energy flow. I'll be joining back in as soon as I can. Zachriel, again referencing your snowstorm example, improbable things happen everyday but that is only because of relevant compensation. IOW, probability is always contextual. Probabilities require 'givens' to be calculated. Some things may appear to be very improbable, but given specific circumstances, they can become almost guaranteed within a certain timeframe. Furthermore, this discussion is less about absolute entropy or probability measurement than a direction of entropy (multiplicity from low to high probability) and what it takes to locally reverse that direction. Finally this discussion is about the specific circumstances that allow this reversibility based on the macrostate that we are measuring. In the cases that we are dealing with temperature as a macrostate, what we need is a sufficiently uneven heat flow from a lower entropy area and also to a higher entropy area to locally and temporarily experience a negative change in entropy. But of course, remember that is when measuring temperature as a macrostate. 
Did you already forget our exchange earlier where I stated: "The ‘imbalance in energy, causing energy flow’ or ‘the earth is an open system and the sun provides all the energy we need’ compensation argument that has been flogged to death works perfectly fine in this case, both theoretically and experimentally. I fail to see your point as it relates to my 2 scenarios and further clarification and resulting point relating to 2LOT. Oh, and your response to my question seems to indicate that you think snowstorms are formed at seeming insurmountable odds. I’m not quite sure I follow. Could you please clarify." IOW, snowstorms are formed at seeming insurmountable odds given ... what conditions ... This relates very well to what you are discussing re: probability, entropy, life and snowstorms. I'll post a small tidbit here of what I am going to be posting for Keiths at TSZ: "I am already well aware of at least some of the assumptions required to discuss 2LOT. For one, an enabler such as 'motional energy' is required; for another 'restraints' must be taken into consideration to determine whether a higher level of entropy will indeed be actualized. Entropy always increases unless there exists what I and others have been calling 'compensation.' -- the "change, connected therewith, occurring at the same time" according to Rudolf Clausius. That has been half of the main point during this whole multi-thread discussion. The other half has to do with the connection between entropy, specific macrostates not defined in J/K terms, and the compensation required to 'build' these relatively low multiplicity macrostates. Then finally, how is this 'non J/K' entropy related to 2LOT? Again, does anyone have a problem with the following premises and conclusions? P1. Configuration multiplicity provides the basis for statistical thermodynamics. P2. Statistical thermodynamics provides the rationale behind why 2LOT exists. P3. 
If configuration entropy is apparently violated, then statistical thermodynamics is apparently violated. P4. If statistical thermodynamics is apparently violated then the foundation of our understanding of why 2LOT is true is apparently violated. C1. Therefore, a violation of configuration entropy would be a violation of the foundation of 2LOT. C2. If the 'change in J/K' measurements of 2LOT are to remain correct, a re-write of the connection between statistical thermodynamics and 2LOT would be required. "The whole point from the very beginning of the tornado vs. city block example was that certain macrostates require very specific conditions beyond simply uneven heat transfer for any sort of reversibility and are indeed irreversible even under conditions of mere open system heat flow. Simply put: a compensation factor is required for certain configuration macrostates (Tornado vs. city block or sun vs. doghouse examples) to be reversible and anyone who states otherwise is promoting an apparent violation of the very foundations of the operation of 2LOT. When we finally have that understanding settled, we can carry on with a discussion of what is the required compensation — mere heat flow in an open system or a prior thermodynamic system of lower configuration entropy or something else?"CJYman
April 4, 2015, 07:54 PM PDT
Phoenix: Oh, this . . . the underpinnings of 2LOT, inextricable from it for some 100+ years since Gibbs, Boltzmann et al, and the more recent link to an informational view: _____________ Clip, App I my BN: >> . . . 3] So far we have worked out of a more or less classical view of the subject. But, to explore such a question further, we need to look more deeply at the microscopic level. Happily, there is a link from macroscopic thermodynamic concepts to the microscopic, molecular view of matter, as worked out by Boltzmann and others, leading to the key equation: s = k ln W . . . Eqn.A.3 That is, entropy of a specified macrostate [in effect, macroscopic description or specification] is a constant times a log measure of the number of ways matter and energy can be distributed at the micro-level consistent with that state [i.e. the number of associated microstates; aka "the statistical weight of the macrostate," aka "thermodynamic probability"]. The point is, that there are as a rule a great many ways for energy and matter to be arranged at micro level relative to a given observable macro-state. That is, there is a "loss of information" issue here on going from specific microstate to a macro-level description, with which many microstates may be equally compatible. Thence, we can see that if we do not know the microstates specifically enough, we have to more or less treat the micro-distributions of matter and energy as random, leading to acting as though they are disordered. Or, as Leon Brillouin, one of the foundational workers in modern information theory, put it in his 1962 Science and Information Theory, Second Edition:
How is it possible to formulate a scientific theory of information? The first requirement is to start from a precise definition. . . . . We consider a problem involving a certain number of possible answers, if we have no special information on the actual situation. When we happen to be in possession of some information on the problem, the number of possible answers is reduced, and complete information may even leave us with only one possible answer. Information is a function of the ratio of the number of possible answers before and after, and we choose a logarithmic law in order to insure additivity of the information contained in independent situations [as seen above in the main body, section A] . . . . Physics enters the picture when we discover a remarkable likeness between information and entropy. This similarity was noticed long ago by L. Szilard, in an old paper of 1929, which was the forerunner of the present theory. In this paper, Szilard was really pioneering in the unknown territory which we are now exploring in all directions. He investigated the problem of Maxwell's demon, and this is one of the important subjects discussed in this book. The connection between information and entropy was rediscovered by C. Shannon in a different class of problems, and we devote many chapters to this comparison. We prove that information must be considered as a negative term in the entropy of a system; in short, information is negentropy. The entropy of a physical system has often been described as a measure of randomness in the structure of the system. We can now state this result in a slightly different way: Every physical system is incompletely defined. We only know the values of some macroscopic variables, and we are unable to specify the exact positions and velocities of all the molecules contained in a system. We have only scanty, partial information on the system, and most of the information on the detailed structure is missing. 
Entropy measures the lack of information; it gives us the total amount of missing information on the ultramicroscopic structure of the system. This point of view is defined as the negentropy principle of information [added links: cf. explanation here and "onward" discussion here -- noting on the brief, dismissive critique of Brillouin there, that you never get away from the need to provide information -- there is "no free lunch," as Dembski has pointed out ; ->) ], and it leads directly to a generalization of the second principle of thermodynamics, since entropy and information must, be discussed together and cannot be treated separately. This negentropy principle of information will be justified by a variety of examples ranging from theoretical physics to everyday life. The essential point is to show that any observation or experiment made on a physical system automatically results in an increase of the entropy of the laboratory. It is then possible to compare the loss of negentropy (increase of entropy) with the amount of information obtained. The efficiency of an experiment can be defined as the ratio of information obtained to the associated increase in entropy. This efficiency is always smaller than unity, according to the generalized Carnot principle. Examples show that the efficiency can be nearly unity in some special examples, but may also be extremely low in other cases. This line of discussion is very useful in a comparison of fundamental experiments used in science, more particularly in physics. It leads to a new investigation of the efficiency of different methods of observation, as well as their accuracy and reliability . . . . [From an online excerpt of the Dover Reprint edition, here. Emphases, links and bracketed comment added.]
4] Yavorski and Pinski, in the textbook Physics, Vol I [MIR, USSR, 1974, pp. 279 ff.], summarise the key implication of the macro-state and micro-state view well: as we consider a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various different possible arrangements at random, and spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random, towards the more thermodynamically probable macrostates, i.e the ones with higher statistical weights. So "[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state." [p. 284.] This is in effect the statistical form of the 2nd law of thermodynamics. Thus, too, the behaviour of the Clausius isolated system above is readily understood: importing d'Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B, that the resulting rise in B's entropy swamps the fall in A's entropy. 
Moreover, given that FSCI-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe. (Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e W], we instead move to entropy, through s = k ln W. Part of how this is done can be seen by imagining a system in which there are W ways accessible, and imagining a partition into parts 1 and 2. W = W1*W2, as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa, but it is far more convenient to have an additive measure, i.e we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant and is in effect the universal gas constant, R, on a per molecule basis, i.e we divide R by the Avogadro Number, NA, to get: k = R/NA. The two approaches to entropy, by Clausius, and Boltzmann, of course, correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate, that the classical observation that entropy naturally tends to increase, is readily apparent.) 5] The above sort of thinking has also led to the rise of a school of thought in Physics -- note, much spoken against in some quarters, but I think they clearly have a point -- that ties information and thermodynamics together. Robertson presents their case [--> in his Statistical Thermophysics, Prentice, 1993]; in summary:
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . .

A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy; cf. also Sarfati's discussion of debates and open systems . . . ]

S({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn A.4]

[where [SUM over i] pi = 1, and we can also define parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z, Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics] . . . . [pp. 3 - 6]

S, called the information entropy, . . . correspond[s] to the thermodynamic entropy, with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . .

A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context [p. 7] . . . .

Jaynes' [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [p. 36.]

[Robertson, Statistical Thermophysics, Prentice Hall, 1993. (NB: Sorry for the math and the use of text for symbolism. However, it should be clear enough that Robertson first summarises how Shannon derived his informational entropy [though Robertson uses S rather than the usual H for that information-theory variable, average information per symbol], then ties it to entropy in the thermodynamic sense using another relation that is tied to the Boltzmann relationship above. This context gives us a basis for looking at the issues that surface in prebiotic-soup or similar models as we try to move from relatively easy-to-form monomers to the more energy- and information-rich, far more complex biofunctional molecules.)] >>
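As an aside, the relations quoted above (S = k ln W, its additivity under a partition where W = W1*W2, and the Shannon-style formula S({pi}) = -C [SUM] pi ln pi with C = k) can be checked numerically. A minimal Python sketch follows; the function names are illustrative, not Robertson's:

```python
import math

# Boltzmann constant from the gas constant R and the Avogadro number,
# as in the text: k = R / NA.
R = 8.314462618        # J/(mol*K)
N_A = 6.02214076e23    # 1/mol
k = R / N_A            # ~1.38e-23 J/K

def boltzmann_entropy(W):
    """Entropy from statistical weight W: S = k ln W."""
    return k * math.log(W)

# Additivity under partition: if W = W1 * W2, then
# ln W = ln W1 + ln W2, so S = S1 + S2 (why we "go to logs").
W1, W2 = 1e10, 1e12
assert math.isclose(boltzmann_entropy(W1 * W2),
                    boltzmann_entropy(W1) + boltzmann_entropy(W2))

def info_entropy(probs, C=k):
    """Shannon-style entropy S({p_i}) = -C * sum_i p_i ln p_i."""
    return -C * sum(p * math.log(p) for p in probs if p > 0)

# With C = k and a uniform distribution p_i = 1/W over W microstates,
# the information entropy reduces to k ln W, matching Boltzmann.
W = 1000
uniform = [1.0 / W] * W
assert math.isclose(info_entropy(uniform), boltzmann_entropy(W))

# A certain outcome (one p_i = 1, the rest zero) carries zero
# entropy, i.e. complete information, as the quote says.
assert info_entropy([1.0, 0.0, 0.0]) == 0.0
```

The uniform case is the maximum-uncertainty case; any sharper distribution over the same W states gives a smaller value.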
_____________

The consequence of this is that the configurational work to create a functionally specific, complex organised entity demands adequate explanation, e.g. at OOL or, onwards, at the origin of body plans. We readily observe that wiring-diagram-based functional specificity beyond a modest description length of 500 - 1,000 bits (the latter is 143 ASCII characters) is not plausibly accounted for on a blind needle-in-haystack "search" based on chance and mechanical necessity. Instead, it comes from energy, mass and information flows connected to energy converters that generate shaft work and/or ordered flows, associated constructors that use information to assemble FSCO/I-rich configurations, and appropriate exhaust of waste, with algorithmic halting. The protein-synthesis process is an apt example, as is the cellular metabolic network; we can also point to how a zygote becomes a fetus. Thus, the suggestion that raw mass or energy flows, without credible information flows and appropriate converter-constructor entities, can somehow originate such entities by what is in effect lucky noise is highly dubious. Or in Sewell's words:
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy.

The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur.

The discovery that life on Earth developed through evolutionary "steps," coupled with the observation that mutations and natural selection -- like other natural forces -- can cause (minor) change, is widely accepted in the scientific world as proof that natural selection -- alone among all natural forces -- can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic.

In a recent Mathematical Intelligencer article ["A Mathematician's View of Evolution," The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way . . . .
What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in "Can ANYTHING Happen in an Open System?", "order can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door.... If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth's atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here." Evolution is a movie running backward, that is what makes it special. THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn't, that atoms would rearrange themselves into spaceships and computers and TV sets . . . [NB: Emphases added. I have also substituted in isolated system terminology as GS uses a different terminology.]
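For what it is worth, the arithmetic behind the 500 - 1,000 bit description-length threshold cited earlier (143 ASCII characters at 7 bits each) is easy to verify, along with the size of the corresponding configuration space. A short sketch; the ~10^80 figure is the commonly cited estimate for the number of atoms in the observable universe:

```python
import math

# 7-bit ASCII: 143 characters come to ~1,000 bits of description
# length, the upper threshold cited in the comment above.
bits_per_ascii_char = 7
chars = 143
assert bits_per_ascii_char * chars == 1001  # ~1,000 bits

# Number of distinct 1,000-bit configurations:
n_configs = 2 ** 1000
print(f"2^1000 ~ 10^{math.floor(math.log10(n_configs))}")  # 2^1000 ~ 10^301

# For comparison, a commonly cited estimate of the number of atoms
# in the observable universe is ~10^80, vastly smaller than the
# configuration space being "searched".
atoms_estimate = 10 ** 80
assert n_configs > atoms_estimate ** 3
```

This is only the counting step, of course; the argument itself turns on how little of that space is functionally specific.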
Nothing much . . . KF
kairosfocus
April 4, 2015 at 07:47 PM PDT
phoenix says: If it’s illegitimate for a materialist to speak of macroscopic things — a lake, for instance — then why doesn’t a dualist face the same problem?

I say: I for one am a Christian, so I really have no idea how a "dualist" would handle the problem. But as a Christian I would point out that since both the one and the many are equally ultimate in the triune God, and since temporal unity and plurality are the creation of the same God, logically neither the "particle" nor the "whole" can demand the sacrifice of the other to itself. It's Christianity 101.

peace
fifthmonarchyman
April 4, 2015 at 07:42 PM PDT
fifthmonarchyman,
It should be obvious by now that when it comes to the second law the two sides are not even discussing the same things.
That's for sure. Piotr and Zachriel are discussing the second law of thermodynamics. God knows (so to speak) what you, Box, et al. are talking about.
phoenix
April 4, 2015 at 06:55 PM PDT
Box,
Why this tedious discussion about materialism?
Because you brought it up, Box. If it's illegitimate for a materialist to speak of macroscopic things -- a lake, for instance -- then why doesn't a dualist face the same problem? Does an immaterial "lake soul" unite the particles into a coherent "lake whole"?
phoenix
April 4, 2015 at 06:49 PM PDT
Box says: Why this tedious discussion about materialism? Why are materialists – like Piotr and Zachriel – arguing that under materialism there is more than particles in motion, that “things” and “objects” exist on their own?

I say: We must keep in mind that our materialist friends were given the opportunity to justify their claims way back at 255:

quote: If you think that “whole” physical entities genuinely exist it is incumbent on you to explain how mere particles in motion can give rise to whole things given a materialistic framework. end quote

All we got was a mocking dodge, followed by crickets. Box, you have done a great job of keeping the absurdity of materialistic presuppositions front and center. It should be obvious by now that when it comes to the second law the two sides are not even discussing the same things.

peace
fifthmonarchyman
April 4, 2015 at 05:50 PM PDT
Why this tedious discussion about materialism? Why are materialists - like Piotr and Zachriel - arguing that under materialism there is more than particles in motion, that "things" and "objects" exist on their own? Because they cannot deal with the argument presented in #214
Given materialism there are no organisms – there are just particles in motion. And these particles in motion don’t give a hoot about some (non-existent) organism. These particles don’t form a whole, they are all doing their own thing, blissfully unaware of something bigger than them. And one cannot explain the coherence we see in organisms from blind uninterested parts. So your basis of reasoning is a cause – the particles in motion – that is insufficient to explain the effect – a coherent whole, the organism.
Box
April 4, 2015 at 04:11 PM PDT
Zach: Categorizations and functions are theories about the object, not properties [of the object].
Exactly my simple point!
Zach: The object still exists (...).
Not in any meaningful way, since the "object" is nothing but particles in motion.
Box: Terming a rock a paperweight does not endow a rock with new causal power, it does not change a rock in any way –
Zach: No, but it’s still a rock.
Which is nothing over and above particles in motion.
Box
April 4, 2015 at 03:28 PM PDT
UB: If the dynamic properties of matter do not explain the origin of representationalism, then what does?

Piotr: Physics does not have to explain why GGN encodes “glycine” during translation.

punt
Upright BiPed
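A side note on the genetic-code point just quoted: it is a textbook fact that all four codons of the form GGN are translated as glycine, i.e. the third base is redundant for this amino acid. A few lines confirm it; the table fragment below is illustrative, covering only a handful of codons:

```python
# A fragment of the standard RNA codon table (textbook values):
codon_table = {
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",  # GGN -> glycine
    "UUU": "Phe", "AUG": "Met", "UGG": "Trp",
}

# Every codon of the form GGN in the table translates to glycine:
ggn = [c for c in codon_table if c.startswith("GG")]
assert len(ggn) == 4
assert all(codon_table[c] == "Gly" for c in ggn)
```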
April 4, 2015 at 03:23 PM PDT