Uncommon Descent Serving The Intelligent Design Community

Are Darwinian claims for evolution consistent with the 2nd law of thermodynamics?


A friend wrote to ask because he came across a 2001 paper, "Entropy and Self-Organization in Multi-Agent Systems" by H. Van Dyke Parunak and Sven Brueckner, Proceedings of the International Conference on Autonomous Agents (Agents 2001), 124-130:

Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy “sink,” permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired. We make this metaphor precise by constructing a simple example of pheromone-based coordination, defining a way to measure the Shannon entropy at the macro (agent) and micro (pheromone) levels, and exhibiting an entropy-based view of the coordination.
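For a concrete sense of the entropy bookkeeping the abstract describes, here is a minimal sketch in Python (my own illustration, not the authors' code; both probability distributions are invented for the purpose) of how Shannon entropy can fall at the macro level while rising at the micro level:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical macro (agent) level: agents start spread over 4 sites,
# then coordinate onto mostly one site, so entropy drops.
agents_before = [0.25, 0.25, 0.25, 0.25]
agents_after = [0.85, 0.05, 0.05, 0.05]

# Hypothetical micro (pheromone) level: pheromone molecules diffuse
# and evaporate from 2 sites out over 16, so entropy rises.
pheromone_before = [0.5, 0.5]
pheromone_after = [1 / 16] * 16

dH_macro = shannon_entropy(agents_after) - shannon_entropy(agents_before)
dH_micro = shannon_entropy(pheromone_after) - shannon_entropy(pheromone_before)

print(f"macro (agent) entropy change:     {dH_macro:+.2f} bits")  # about -1.15
print(f"micro (pheromone) entropy change: {dH_micro:+.2f} bits")  # +3.00
print(f"net entropy change:               {dH_macro + dH_micro:+.2f} bits")
```

On these made-up numbers the micro level gains more entropy than the macro level loses, which is the "entropy sink" picture the paper then makes precise with an actual pheromone model.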

The thought seems to be that entropy decreases here but somehow increases somewhere we can’t see it.

I’ve (O’Leary for News) always thought that a fishy explanation, especially because I soon discovered that even raising the question is considered presumptive evidence of unsound loyalties, the sort of response I am long accustomed to hearing from authoritarians covering up a scandal.

So not only do I not believe it, but after that sort of experience I get the sense I shouldn’t believe it. Depending on where I am working, I might need to parrot it to keep my job, of course, but it would be best not to actually believe it.


Rob Sheldon told us both,

What you read is the “standard” physics response. It is misleading on many levels.

a) Physicists really, really can’t explain what goes on in biology. Neither their definition of entropy nor their definition of information (Shannon, etc.) works. Rather than admit that they don’t know what is going on, they simply extrapolate what they do know (ideal gases) to biology and make pronouncements.

b) While it is true that “open” systems may allow energy and matter to flow through them, which would change the information in the system, this does not and cannot explain biology. The best treatment of this is Granville Sewell’s articles on different types of entropy. Truly excellent. It explains why sunlight does not carry enough information to create life out of precursor molecules. And people who claim this are either: (i) deluded that physics entropy = biology entropy, or (ii) equivocating on the use of the word “entropy”, or (iii) unable to handle basic math, or, most likely, (iv) all of the above.

c) This paper suggests that the cell has machinery for converting sunlight to information, e.g., photosynthesis. While true, this machinery must be even more complicated than the carbohydrates it produces. Ditto for self-replicating machinery, etc. So if we permit some high level of information to enter the system, then low-level information can be created from energy sources. This argument really is indistinguishable from ID, though they may not realize it.

In conclusion, the violation of the 2nd Law remains true for biology, and there still is no good physics explanation for it.

It’s a good thing they didn’t realize it. They won’t have to issue some embarrassing repudiation of their work.

And I don’t have to believe something for which we have no evidence just to protect the tenurebots’ theory.

Follow UD News at Twitter!

Comments
Box: Please retract yourself from this thread and read up. This discussion is not about heat distribution. Please read up. This discussion has the title of "Are Darwinian claims for evolution consistent with the 2nd law of thermodynamics?"

Box (quoting): Thus unless we are willing to argue that the influx of solar energy into the Earth makes the appearance of spaceships, computers and the Internet not extremely improbable, we have to conclude that at least the basic principle behind the second law has in fact been violated here.

Those processes do not violate the 2nd law of thermodynamics. Nothing humans have done or can do violates the 2nd law of thermodynamics. That's rather the whole point.

Zachriel
March 5, 2015 at 07:36 AM PDT
Hangonasec, Humans, cars, high-speed computers, libraries full of science texts and encyclopedias, TV sets, airplanes and spaceships.
Hangonasec: Which simply means that all apparent ordering is accompanied by the shedding of free energy.
"Apparent ordering" ... sure. - On the free energy from the sun ...
Granville Sewell: Thus unless we are willing to argue that the influx of solar energy into the Earth makes the appearance of spaceships, computers and the Internet not extremely improbable, we have to conclude that at least the basic principle behind the second law has in fact been violated here.
... which boils down to the "compensation" argument:
Granville Sewell: It is widely argued that the spectacular local decreases in entropy that occurred on Earth as a result of the origin and evolution of life and the development of human intelligence are not inconsistent with the second law of thermodynamics, because the Earth is an open system and entropy can decrease in an open system, provided the decrease is compensated by entropy increases outside the system. I refer to this as the compensation argument, and I argue that it is without logical merit, amounting to little more than an attempt to avoid the extraordinary probabilistic difficulties posed by the assertion that life has originated and evolved by spontaneous processes. To claim that what has happened on Earth does not violate the fundamental natural principle behind the second law, one must instead make a more direct and difficult argument. (...) Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of "compensating" events elsewhere. According to this reasoning, the second law does not prevent scrap metal from reorganizing itself into a computer in one room, as long as two computers in the next room are rusting into scrap metal -- and the door is open. (Or the thermal entropy in the next room is increasing, though I am not sure how fast it has to increase to compensate for computer construction!)
Box
March 5, 2015 at 07:27 AM PDT
H, chemical issues and Gibbs free energy are significant, but I would add the points in 72 above; there is more to thermodynamics once informational issues come to bear. Gotta go. KF

kairosfocus
March 5, 2015 at 07:13 AM PDT
a) Physicists really, really can’t explain what goes on in biology. Neither their definition of entropy, nor their definition of information (Shannon, etc) work. Rather than admit that they don’t know what is going on, they simply extrapolate what they do know (ideal gasses) to biology and make pronouncements.
I'd say biology owes more to concepts of chemical thermodynamics, which is of course rooted in physics, but nothing to do with ideal gases. Free energy can be measured, and no movement is found against a free energy gradient, granted that there is frequent coupling to give a net 'downhill' slope. Simply: systems that can shed energy do so. In doing so, entropy increases.

But entropy increase can be associated with an increase in apparent 'order', and this is the source of frequent confusion. Is molecular hydrogen more 'ordered' than atomic hydrogen? Kind of, but in strict entropic terms, the energy has to be accounted for as well. Two atoms of hydrogen locally tethered experience force. Allow that system to equilibrate, and energy is released, unrecoverably. The system, with its quanta of 'lost' energy, is less ordered, even though the part we detect is not.

Biology does no more. Electrons pass down gradients of serial electronegativity. Sunlight is but one mechanism of elevating electrons to enable them to descend such a chain. Among the atomic species found on earth, some are furnished with more electronegativity and some with less. Those with less can act as electron donors, those with more as electron acceptors. There are entire ecosystems which receive not one joule of energy from the sun. Of course, like any entropic system, they are slowly running down, moving towards equilibrium. But none of it violates the 2nd Law of thermodynamics. Which simply means that all apparent ordering is accompanied by the shedding of free energy.

Hangonasec
March 5, 2015 at 06:50 AM PDT
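Hangonasec's hydrogen example can be put in rough numbers. Here is a minimal sketch using rounded textbook values (assumed: H-H bond enthalpy about 436 kJ/mol; standard molar entropies about 114.7 J/mol·K for atomic H and 130.7 J/mol·K for H2 at 298 K):

```python
# Thermodynamics of 2 H(g) -> H2(g) at 298 K, from rounded textbook values.
T = 298.15                  # temperature, K
dH = -436_000.0             # J/mol: enthalpy released forming the H-H bond
S_H, S_H2 = 114.7, 130.7    # J/(mol K): standard molar entropies
dS_system = S_H2 - 2 * S_H  # negative: two particles become one

dG = dH - T * dS_system     # Gibbs free energy change
dS_surroundings = -dH / T   # entropy of the heat dumped into the surroundings
dS_total = dS_system + dS_surroundings

print(f"dS(system)       = {dS_system:+8.1f} J/(mol K)  (local 'ordering')")
print(f"dG               = {dG / 1000:+8.1f} kJ/mol     (negative: spontaneous)")
print(f"dS(total)        = {dS_total:+8.1f} J/(mol K)  (positive: 2nd law holds)")
```

The system's entropy falls, but the bond energy shed as heat raises the surroundings' entropy far more, which is exactly the "apparent ordering accompanied by the shedding of free energy" described above.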
Zach, please retract yourself from this thread and read up. This discussion is not about heat distribution.

Box
March 5, 2015 at 06:44 AM PDT
Box: However it’s totally irrelevant, because who said they are? You did.
Zachriel: Drawing ten straight flushes in a row from a fair, well-shuffled deck is not plausible

Box: And why is it not plausible? Exactly because of the second law

Sewell: While the first formulations were all about heat, it is now universally recognized that the second law of thermodynamics can be used, in a quantifiable way, in many other applications.

Probability theory dates to the 17th century. How is Sewell's conflation "universally recognized"? What "other applications"?

Zachriel
March 5, 2015 at 06:35 AM PDT
Zach: The odds of a royal flush are not an example of thermodynamics.
True. However, it's totally irrelevant, because who said they are? Are you playing stupid with me now, or what?
Sewell: While the first formulations were all about heat, it is now universally recognized that the second law of thermodynamics can be used, in a quantifiable way, in many other applications.
Box
March 5, 2015 at 06:21 AM PDT
Box: Indeed. And why is it not plausible? Exactly because of the second law, which is all about probability.

Um, no. The odds of a royal flush are not an example of thermodynamics. The caloric cost of the shuffling and dealing and playing of cards is paid for by the alcohol.

Zachriel
March 5, 2015 at 06:06 AM PDT
Z, kindly cf. 72 above. KF

kairosfocus
March 5, 2015 at 06:04 AM PDT
Zachriel: That is incorrect.
Indeed. "Anything is possible in an open system", often proclaimed by your ilk, is totally nonsensical. Thank you for the admission.
Zachriel: Drawing ten straight flushes in a row from a fair, well-shuffled deck is not plausible (...)
Indeed. And why is it not plausible? Exactly because of the second law, which is all about probability.

Box
March 5, 2015 at 06:02 AM PDT
Timaeus: But that, in itself, doesn’t make Sewell’s argument wrong. It just means that his choice of words creates some confusion about what exactly he is arguing.

From the paper Box cited: "It is widely argued that the spectacular local decreases in entropy that occurred on Earth as a result of the origin and evolution of life and the development of human intelligence are not inconsistent with the second law of thermodynamics, because the Earth is an open system and entropy can decrease in an open system, provided the decrease is compensated by entropy increases outside the system. I refer to this as the compensation argument, and I argue that it is without logical merit" http://bio-complexity.org/ojs/index.php/main/article/view/70

There is no ambiguity in that statement. Sewell is wrong about a fundamental finding in physics. And that's just the very first lines!

Box: This boils down to the “compensation argument” aka ‘anything is possible in an open system’.

That is incorrect. There are all sorts of physical restrictions on what is possible and what is plausible. The 2nd law merely restricts in one manner what is possible.

Granville Sewell: Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of “compensating” events elsewhere.

Just because something is consistent with the 2nd law of thermodynamics doesn't mean it is possible or plausible. Drawing ten straight flushes in a row from a fair, well-shuffled deck is not plausible, but is consistent with the 2nd law of thermodynamics.

Zachriel
March 5, 2015 at 05:48 AM PDT
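The card arithmetic in Zachriel's example is easy to check. A minimal sketch (standard 5-card poker combinatorics; the straight-flush count of 40 includes the four royal flushes):

```python
from math import comb

hands = comb(52, 5)      # 2,598,960 possible 5-card hands
straight_flushes = 40    # 10 straight ranks x 4 suits, royals included

p_one = straight_flushes / hands
p_ten = p_one ** 10      # ten independent deals from a reshuffled deck

print(f"P(straight flush) = {p_one:.3e}")  # ~1.5e-05
print(f"P(ten in a row)   = {p_ten:.3e}")  # ~7.5e-49
```

Nothing thermodynamic enters the calculation; the implausibility is pure combinatorics, which is the point of the example.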
Gordon Davisson: There’s a direct relationship between entropy and probability only in the case of an isolated system fluctuating around thermodynamic equilibrium. In open systems fluctuating around equilibrium there’s a more complicated relationship, and for non-equilibrium systems (e.g. pretty much everything relating to life) the relationship breaks down entirely.
This boils down to the “compensation argument”, aka ‘anything is possible in an open system’.

Peter Urone: "it is always possible for the entropy of one part of the universe to decrease, provided the total change in entropy of the universe increases."
Granville Sewell: Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of "compensating" events elsewhere. According to this reasoning, the second law does not prevent scrap metal from reorganizing itself into a computer in one room, as long as two computers in the next room are rusting into scrap metal--and the door is open.
Box
March 5, 2015 at 05:06 AM PDT
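For reference, the textbook "compensation" bookkeeping that Urone states and Sewell disputes can be made concrete. A minimal sketch with rounded values for one mole of supercooled water freezing at -10 °C, the stock example of a local entropy decrease paid for by the surroundings:

```python
# Entropy bookkeeping for 1 mol of supercooled water freezing at 263 K.
# Rounded textbook values; heat of fusion treated as constant near 273 K.
dH_fus = 6_010.0                 # J/mol: heat released on freezing
T_fus, T_surr = 273.15, 263.15   # K

dS_system = -dH_fus / T_fus        # ice is more ordered: entropy falls
dS_surroundings = dH_fus / T_surr  # released heat raises surroundings' entropy
dS_total = dS_system + dS_surroundings

print(f"dS(system)       = {dS_system:+6.2f} J/(mol K)")
print(f"dS(surroundings) = {dS_surroundings:+6.2f} J/(mol K)")
print(f"dS(total)        = {dS_total:+6.2f} J/(mol K)  (> 0: freezing proceeds)")
```

This is the accounting behind Urone's statement; Sewell's objection, quoted above, is that such bookkeeping says nothing about the probability of any particular organized configuration arising.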
F/N: Thaxton et al at the end of Ch 7, TMLO, as they were about to bridge to the discussion on spontaneous origin of proteins and D/RNA (an onward discussion that has of course been further developed in the past 30 years):
While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The "evolution" from biomonomers to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme, namely, the formation of protein and DNA from their precursors. It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today. [11] Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered . . .
Again, the focus on functionally specific interactive organisation and linked information in the context of high complexity should be plain. KF

PS: I could not but notice how Rational Wiki popped up in the Google search and tried to dismiss a technical, scientific discussion as a "religious" book. This speaks volumes and belies the "rational" part of the title. TMLO is one of the foundational technical works behind design theory, period.

kairosfocus
March 5, 2015 at 04:20 AM PDT
Me_Think #74, you are right. Every 12 years or so I make a tiny mistake, and this must be it. I should have linked to "Entropy and evolution" by Granville Sewell (2013). The quote I provided in post #70 is on page 4 of that paper.

Box
March 5, 2015 at 03:45 AM PDT
Box @ 70, you probably linked the wrong paper. What you quoted is not there in the paper.

Me_Think
March 4, 2015 at 07:03 PM PDT
PS: The underlying roots of my emphasis on config spaces, and the implication that COMPLEX, INTERACTIVE AND SPECIFIC FUNCTION SHARPLY CONSTRAINS THE POSSIBLE CONFIGS TO ISLANDS OF FUNCTION (LEADING TO NEEDLE-IN-HAYSTACK SEARCH CHALLENGES), should be clear.

kairosfocus
March 4, 2015 at 06:47 PM PDT
F/N: As I have pointed out above several times, Sewell is looking at the phase and/or configuration space, informational view of thermodynamics. I link and clip my always linked note . . . at minimum so the onlooker will not be carried away by dismissive remarks above and elsewhere: _______________ http://www.angelfire.com/pro/kairosfocus/resources/Info_design_and_science.htm#shnn_info >> . . . we may average the information per symbol in [a] communication system thusly (giving in terms of -H to make the additive relationships clearer): - H = p1 log p1 + p2 log p2 + . . . + pn log pn or, H = - SUM [pi log pi] . . . Eqn 5 H, the average information per symbol transmitted [usually, measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more detail (pp. 3 - 6, 7, 36; cf. Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Samura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously "plausible" primordial "soups." In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale. By many orders of magnitude, we don't get to even one molecule each of the required polymers per planet, much less bringing them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics . . . >> ________________ So, in fact, there is a legitimate physical view on thermodynamics that brings out the force of Sewell's point. Which, we may clip (yet again, one of the all too typical Darwinist debate tactics is to ignore or "forget" clarifications or corrective information so that the same strawman caricatures keep on coming up):
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative [--> a clear context . . . ]. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur. [--> again, a clear context, the statistical weights of clusters of possible microstates define probabilities to all relevant intents]

The discovery that life on Earth developed through evolutionary "steps," coupled with the observation that mutations and natural selection -- like other natural forces -- can cause (minor) change, is widely accepted in the scientific world as proof that natural selection -- alone among all natural forces -- can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic. In a recent Mathematical Intelligencer article ["A Mathematician's View of Evolution," The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way. [1] . . . .

What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in "Can ANYTHING Happen in an Open System?", "order [--> including organisation, and from context, FSCO/I] can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door.... If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth's atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here."

Evolution is a movie running backward, that is what makes it special. THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn't, that atoms would rearrange themselves into spaceships and computers and TV sets . . .
The point is a serious one, and the first level it must be addressed at is the claimed spontaneous organisation and origin of cell-based life in Darwin's warm pond or the like. KF

kairosfocus
March 4, 2015 at 06:42 PM PDT
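The formal correspondence described in the clip above, Shannon's H becoming thermodynamic entropy when the constant C is set to k, is easy to exhibit numerically. A minimal sketch (my illustration, not taken from the linked note): for W equally probable microstates, H = -SUM p_i ln p_i reduces to ln W, so k_B * H reproduces Boltzmann's s = k ln w:

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant

def H_nats(probs):
    """Shannon entropy H = -sum(p * ln(p)), in nats (natural log)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 1_000_000                    # number of equally probable microstates
H_uniform = H_nats([1 / W] * W)

print(math.isclose(H_uniform, math.log(W)))           # True: H = ln W
print(f"s = k_B * H    = {k_B * H_uniform:.3e} J/K")
print(f"s = k_B * ln W = {k_B * math.log(W):.3e} J/K")

# Any non-uniform distribution over the same W microstates gives lower H:
# knowing that some microstates are likelier is information, and it
# lowers the entropy, as in the Robertson excerpt above.
biased = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(H_nats(biased) < H_uniform)                     # True
```

With C = k_B this is exactly the link the Robertson excerpt asserts between informational and thermodynamic entropy.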
Timaeus: Gordon Davisson has no problem distinguishing between the original notion and Sewell’s “extended” notion.
Probably due to the fact that your idea of Sewell having an innovative "extended notion" of the second law is nonsense. The existence of different kinds of entropy finds universal acceptance these days.
Sewell: While the first formulations were all about heat, it is now universally recognized that the second law of thermodynamics can be used, in a quantifiable way, in many other applications.
Box
March 4, 2015 at 06:33 PM PDT
Gordon Davisson: Sewell claims that the second law applies separately to each different kind of entropy, but this is not true in general.
Sewell does NOT claim that AT ALL. In fact he explicitly states that, in general, even in solids different kinds of entropy can affect each other. From the paper (p. 4):
G. Sewell: He [Bob Lloyd] wrote that the idea that my X-entropies are always independent of each other was "central to all of the versions of his argument." Actually, I never claimed that: (….) But even in solids, the different X-entropies can affect each other under more general assumptions. Simple definitions of entropy are only useful in simple contexts. [My Emphasis]
Box
March 4, 2015 at 06:11 PM PDT
Timaeus, what Sewell has presented is a hunch. That's fine for what it is, but without some way to test his idea, it's no better than string theory.

rhampton7
March 4, 2015 at 06:03 PM PDT
Timaeus at 55, thanks for expressing something I have been thinking for a while. I feel like there is some principle that science just hasn't fully discovered yet. Here is an analogy: In the Middle Ages, scientists could tell you quite a bit about how light worked. Everyone could see the difference between colors and brightness/shadow. But no one could tell you what "red" was or what "color" is. Later it was discovered that colors are light waves moving at different frequencies. People knew that light existed, but no one could say what light was, scientifically, prior to the 19th century. Here I think there is an intuitive feel that the "order" or "organizing principle" found in cells, computers, pocket watches, eyeballs and brains does not come about without forethought and abstract reasoning. This "order" is not to be confused with the term "order" in the 2nd law, but it is "something" that we can detect. CSI, irreducible complexity, Sewell's argument, they all "get at" this thing. Maybe we should just call it "the Word."

Collin
March 4, 2015 at 05:10 PM PDT
Timaeus: But that, in itself, doesn’t make Sewell’s argument wrong. It just means that his choice of words creates some confusion about what exactly he is arguing.

How many legs does a dog have if you call the tail a leg?

Zachriel
March 4, 2015 at 05:06 PM PDT
And wanting them to get next week's winning lotto numbers falls under psychic experiences, not under life after death or veridical NDEs. They do have people that have brought back new information as far as science is concerned, but again that falls under another category. And mos

wallstreeter43
March 4, 2015 at 04:18 PM PDT
Zachriel: "Then it’s not the 2nd law of thermodynamics." Nolo contendere. But that, in itself, doesn't make Sewell's argument wrong. It just means that his choice of words creates some confusion about what exactly he is arguing. But once one gets by the title, and reads his explanation why the 2nd law needs to be "extended" outside its original sphere, one can understand what he is driving at. Gordon Davisson has no problem distinguishing between the original notion and Sewell's "extended" notion. And Gordon then does the intellectually appropriate thing, i.e., to grant for the moment Sewell's special use of terms, but show that even the extended principle which Sewell employs -- whether it's called the 2nd law or something else -- does not contradict evolution. I do think that Sewell has some important things to say about the creation of order; however, as a point of literary strategy, I would not have chosen to use the term "second law of thermodynamics" in my argument. To write as if you are reviving or restating the second law argument calls up past arguments which have been soundly defeated, and then puts you in the position of having to explain "Well, my argument isn't based on thermodynamics as the term is most commonly understood..." He would have been better to stay away from that terminology and invent his own term for the type of order and disorder that he was talking about. And if he felt it absolutely necessary to point out some relationship between his new principle and the original second law, he could have done that in a footnote at the end of the piece, or in a speculative "aftermath" section in his conclusion, leaving the notion of thermodynamics out of the title and the main argument of the article. The way he wrote it up, he ended up having to defend two very large theses: (1) that the second law is actually merely an instance of a more general law; (2) that this more general law forbids evolution. It's always better if you can restrict yourself to one thesis per article; it's not only easier for the reader to tell what you are talking about -- it's easier to defend one radical thesis than two.Timaeus
March 4, 2015
March
03
Mar
4
04
2015
03:15 PM
3
03
15
PM
PDT
SA @42
ID opponent: What do you mean by the terms “intelligent,” “design,” “chance,” “complex,” “functional,” “specified” and “information.” These terms are so vague as to render your argument meaningless. One can be certain that DDD is being employed when a person involved in a debate displays a convenient lapse of understanding of even the most common terms. In extreme cases ID opponents have even claimed that a term they themselves injected into the debate has no clear meaning.
Cuts both ways, I must say.

Hangonasec
March 4, 2015 at 02:36 PM PDT
Joe
Me: Fine, GAs don’t model unguided evolution AND they demonstrate that unguided evolution leads to genetic entropy.

Joe: Wrong. They CAN be used to demonstrate unguided evolution leads to genetic entropy.

Do you understand what the word 'wrong' means, Joe? Examine my sentence again, and yours.

Hangonasec
March 4, 2015 at 02:24 PM PDT
""I want them to have knowledge they couldn’t have had otherwise. That seems pretty reasonable to me"" The 57 year old social worker did have knowledge that he couldn't have had before. This is the whole point of the study since strict controls were taken. The patient had no way to access the info of his veridical nde that he eerie maces . Ur just in denial my friendwallstreeter43
March 4, 2015
March
03
Mar
4
04
2015
02:11 PM
2
02
11
PM
PDT
Box: Here is a thought experiment for you: try to imagine a more spectacular violation than what has happened on our planet.

Those processes do not violate the 2nd law of thermodynamics. Nothing humans have done or can do violates the 2nd law of thermodynamics.

Zachriel
March 4, 2015 at 01:48 PM PDT
So, how does the spontaneous rearrangement of matter on a rocky, barren, planet into human brains and spaceships and jet airplanes and nuclear power plants and libraries full of science texts and novels, and super computers running partial differential equation solving software, represent a less obvious or less spectacular violation of the second law -- or at least of the fundamental natural principle behind this law -- than tornados turning rubble into houses and cars? Here is a thought experiment for you: try to imagine a more spectacular violation than what has happened on our planet.
[Sewell]

Box
March 4, 2015 at 01:43 PM PDT
@ Gordon Davisson: Sorry, but your view on the problem of decreasing entropy in open systems is not very persuasive. In fact Sewell has already considered your little but defunct thought experiment, because your experiment (your jar) is actually open to an ordering principle with respect to weight: gravity. Put your jar into a gravity-free environment and nothing will happen. That's exactly the principle which Sewell is discussing. Something has to transgress the border of the open system to create an X-entropy-lowering effect. Next thought experiment, please.

halloschlaf
March 4, 2015 at 01:35 PM PDT