
2nd Law of Thermodynamics — an argument Creationists and ID Proponents should NOT use


ID proponents and creationists should not use the 2nd Law of Thermodynamics to support ID. Appropriate for Independence Day in the USA is my declaration of independence from, and disavowal of, 2nd Law arguments in support of ID and creation theory. Any student of statistical mechanics and thermodynamics will likely find Granville Sewell's argument, and similar arguments, inconsistent with the textbook understanding of these subjects, and wrong on many levels. With regret for my dissent from colleagues (like Granville Sewell) and friends in the ID and creationist communities, I offer this essay. I do so because saying nothing would be a disservice to the ID and creationist community of which I am a part.

[Granville Sewell responds to Sal Cordova here.]

I've said it before, and I'll say it again: I don't think Granville Sewell's 2nd law arguments are correct. An author of the founding book of ID, The Mystery of Life's Origin, agrees with me:

“Strictly speaking, the earth is an open system, and thus the Second Law of Thermodynamics cannot be used to preclude a naturalistic origin of life.”

Walter Bradley, Thermodynamics and the Origin of Life

To begin, it must be noted that there are several versions of the 2nd Law. The versions are a consequence of the evolution and usage of theories of thermodynamics, from classical thermodynamics to modern statistical mechanics. Here are textbook definitions of the 2nd Law of Thermodynamics, starting with the more straightforward version, the "Clausius Postulate":

No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

and the more modern but equivalent "Kelvin-Planck Postulate":

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work
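For reference, both postulates are commonly compressed into a single compact mathematical statement, the Clausius inequality (standard textbook material, added here for completeness rather than being part of either postulate's wording):

\oint \frac{\delta Q}{T} \le 0

which holds for any cyclic process, with equality for reversible cycles. Entropy is then defined by dS = \delta Q_{rev}/T, so that for an isolated system \Delta S \ge 0.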

How then can such statements be distorted into defending Intelligent Design? I argue ID does not follow from these postulates and ID proponents and creationists do not serve their cause well by making appeals to the 2nd law.

I will give illustrations first from classical thermodynamics and then from the more modern versions of statistical thermodynamics.

The notion of "entropy" was inspired by the 2nd law. In classical thermodynamics, the notion of order wasn't even mentioned as part of the definition of entropy. I also note that some physicists dislike the usage of the term "order" to describe entropy:

Let us dispense with at least one popular myth: “Entropy is disorder” is a common enough assertion, but commonality does not make it right. Entropy is not “disorder”, although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see Insight into entropy by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased “order”, quite impossible in the entropy is disorder worldview. And also keep in mind that “order” is a subjective term, and as such it is subject to the whims of interpretation. This too mitigates against the idea that entropy and “disorder” are always the same, a fact well illustrated by Canadian physicist Doug Craigen, in his online essay “Entropy, God and Evolution”.

What is Entropy? by Tim Thompson

From classical thermodynamics, consider the heating and cooling of a brick. If you heat the brick it gains entropy, and if you let it cool it loses entropy. Thus entropy can spontaneously be reduced in local objects even if entropy in the universe is increasing.

Consider the hot brick with a heat capacity C. When the brick cools from its initial hot temperature TH to the final cold temperature TM, the change in entropy ΔS is

\Delta S = \int_{T_H}^{T_M} \frac{C \, dT}{T} = C \ln\!\left(\frac{T_M}{T_H}\right)

Since the hot temperature TH is higher than the final cold temperature TM, ΔS is NEGATIVE, and thus a spontaneous reduction of entropy in the hot brick results!

The following weblink shows the rather simple calculation of how a cold brick, when put in contact with a hot brick, spontaneously reduces the entropy of the hot brick even though the joint entropy of the two bricks increases. See: Massachusetts Institute of Technology: Calculation of Entropy Change in Some Basic Processes
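To put numbers on this, here is a minimal sketch of that two-brick calculation in Python (the heat capacity and temperatures are illustrative assumptions, not values taken from the MIT notes):

import math

# Two identical bricks with constant heat capacity C, one hot at TH and
# one cold at TC; they equilibrate at the mean temperature TM.
C = 1000.0             # heat capacity in J/K (illustrative value)
TH, TC = 400.0, 200.0  # initial temperatures in kelvin
TM = (TH + TC) / 2.0   # final common temperature for equal bricks

dS_hot = C * math.log(TM / TH)   # negative: the hot brick loses entropy
dS_cold = C * math.log(TM / TC)  # positive: the cold brick gains more
dS_total = dS_hot + dS_cold      # net positive, as the 2nd law requires

print(f"hot brick:  {dS_hot:+.1f} J/K")   # about -287.7 J/K
print(f"cold brick: {dS_cold:+.1f} J/K")  # about +405.5 J/K
print(f"total:      {dS_total:+.1f} J/K") # about +117.8 J/K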

So it is true that even if universal entropy is increasing on average, local reductions of entropy spontaneously happen all the time.

Now one may argue that I have used only notions of thermal entropy, not the larger notion of entropy as defined by later advances in statistical mechanics and information theory. But even granting that, I've provided a counterexample to claims that entropy cannot spontaneously be reduced. Any 1st-semester student of thermodynamics will make the calculation I just made, and thus it ought to be obvious to him that nature is rich with examples of entropy spontaneously being reduced!

But to humor those who want a more statistical flavor to entropy rather than classical notions of entropy, I will provide examples. But first a little history. The discipline of classical thermodynamics was driven in part by the desire to understand the conversion of heat into mechanical work. Steam engines were quite the topic of interest….

Later, there was a desire to describe thermodynamics in terms of classical (Newtonian-Lagrangian-Hamiltonian) mechanics, whereby heat and entropy are merely statistical properties of large numbers of moving particles. Thus the goal was to demonstrate that thermodynamics was merely an extension of Newtonian mechanics applied to large sets of particles. This more or less worked when Josiah Gibbs published his landmark treatise Elementary Principles in Statistical Mechanics in 1902, but it then had to be amended in light of quantum mechanics.

The development of statistical mechanics led to the extension of entropy to include statistical properties of particles. This has possibly led to confusion over what entropy really means. Boltzmann tied the classical notions of entropy (in terms of heat and temperature) to the statistical properties of particles. This was formally stated by Planck for the first time, but the equation is nevertheless named "Boltzmann's entropy formula":

S = k_B \ln W

where W (often written Ω, omega) is the number of microstates (a microstate is, roughly, a complete specification of the positions and momenta of the particles in classical mechanics; its meaning is more nuanced in quantum mechanics). So one can see that the notion of "entropy" has evolved in the physics literature over time….

To give a flavor for why this extension of entropy is important, I'll give an illustration with colored marbles that shows an increase in the statistical notion of entropy even when no heat is involved (as in classical thermodynamics). Consider a box with a partition in the middle. On the left side are all blue marbles, on the right side are all red marbles. In a sense one can clearly see the arrangement is highly ordered, since marbles of the same color are segregated. Now suppose we remove the partition and shake the box up so that the red and blue marbles mix. The process has caused the "entropy" of the system to increase, and only with some difficulty can the original ordering be restored. Notice that we can do this little exercise with no reference to temperature and heat, as is done in classical thermodynamics. It was for situations like this that the notion of entropy had to be extended beyond notions of heat and temperature. And in such cases the term "thermodynamics" seems a little forced, even though entropy is involved. No such problem exists if we simply generalize to the larger framework of statistical mechanics, which encompasses parts of classical thermodynamics.
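As a toy check of this intuition, here is a minimal counting sketch in Python (my own illustration, assuming 10 blue and 10 red marbles on a 20-cell grid) using Boltzmann's S = k ln W, with entropy expressed in units of k:

from math import comb, log

# 10 blue and 10 red marbles occupying 20 cells; "W" counts the
# distinguishable color arrangements.
n_blue, n_red = 10, 10
cells = n_blue + n_red

# Partitioned box: blue confined to the left half, red to the right.
# Only one distinguishable color arrangement exists.
W_separated = 1

# Partition removed: any 10 of the 20 cells may hold the blue marbles.
W_mixed = comb(cells, n_blue)  # 184,756 arrangements

print(log(W_separated))  # 0.0
print(log(W_mixed))      # about 12.1 -> entropy (in units of k) increases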

The marble illustration is analogous to the mixing of two distinguishable gases (like carbon dioxide and nitrogen). The notion is similar to the marble illustration: it doesn't involve heat, but it involves an increase in entropy. Though it is not necessary to go into the exact meaning of the equation, for the sake of completeness I post it here. Notice there is no heat term "Q" for this sort of entropy increase:

\Delta S_{mix} = -nR \sum_i x_i \ln x_i

where R is the gas constant, n the total number of moles, x_i the mole fraction of component i, and ΔS_mix is the change in entropy due to mixing.
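Here is a minimal sketch of this mixing formula in Python, assuming an equimolar two-component mixture (the mole numbers are illustrative):

import math

# Ideal entropy of mixing, dS_mix = -n R sum_i x_i ln(x_i),
# for an equimolar CO2/N2 mixture.
R = 8.314       # gas constant, J/(mol K)
n = 2.0         # total moles in the mixture
x = [0.5, 0.5]  # mole fractions of the two components

dS_mix = -n * R * sum(xi * math.log(xi) for xi in x)
print(f"dS_mix = {dS_mix:.2f} J/K")  # about +11.53 J/K, with no heat term Q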

But here is an important question: can mixed gases, unlike mixed marbles, spontaneously separate into localized compartments? That is, if mixed red and blue marbles won't spontaneously order themselves back into compartments of all blue and all red (and thus reduce entropy), why should we expect gases to do so? This would seem impossible for marbles (short of a computer or intelligent agent doing the sorting), but it is a piece of cake for nature, even though there are zillions of gas particles mixed together. The solution is simple. If the mixed gases are brought to a temperature below about -78.5 Celsius (where carbon dioxide condenses out of the gas phase at atmospheric pressure) but above -195.8 Celsius (the boiling point of nitrogen), the carbon dioxide will condense but the nitrogen will not. Thus the two species spontaneously separate, order spontaneously re-emerges, and the entropy of the local system is spontaneously reduced!
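And a toy sketch of the separation-by-cooling step (the condensation temperatures, at roughly atmospheric pressure, are assumptions taken from standard reference values):

# CO2 condenses out as a solid near -78.5 C at about 1 atm, while N2
# stays gaseous down to its boiling point of -195.8 C.
CONDENSATION_POINT_C = {"CO2": -78.5, "N2": -195.8}

def phases_at(temperature_c):
    """Which species condense out of a CO2/N2 mixture at this temperature?"""
    return {gas: ("condensed" if temperature_c < point else "gas")
            for gas, point in CONDENSATION_POINT_C.items()}

# Cool the mixture to -100 C: CO2 condenses while N2 remains a gas,
# so the two species spontaneously un-mix.
print(phases_at(-100.0))  # {'CO2': 'condensed', 'N2': 'gas'}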

Conclusion: ID proponents and creationists should not use the 2nd Law to defend their claims. If ID-friendly dabblers in basic thermodynamics can raise such objections as I have raised, how much more will professionals in physics, chemistry, and information theory? If ID proponents and creationists want to argue that the 2nd Law supports their claims but have no background in these topics, I strongly recommend further study of statistical mechanics and thermodynamics before they take sides on the issue. I think more scientific education will cast doubt on evolutionism, but I don't think more education will make one think that 2nd Law arguments are good arguments in favor of ID and the creation hypothesis.

UPDATE:
Dr. Sewell has been kind enough to respond. He pointed out my oversight of not linking to his papers. My public apologies to him. His response contains links to his papers.
Response to scordova

UPDATE:
At the request of Patrick at SkepticalZone, I am providing a link to his post, which provides links to other discussions of Dr. Sewell's work. I have not read these other critiques, and they may well not attempt much civility. But because Skeptical Zone has been kind enough to publish my writings, I have some obligation to reciprocate. Again, I have not read those critiques. What I have written was purely in response to the Evolution News and Views postings regarding the topic of thermodynamics.

Patrick’s Links to other critiques

Comments
Dr. Sewell has been kind enough to respond. I also updated the original post to provide links to his papers. My sincere apologies for the oversight of not linking to them earlier. See: https://uncommondescent.com/intelligent-design/response-to-scordova/
scordova
July 4, 2012 at 3:49 PM PDT
SC: Without going into a debate, you may find the discussion of a diffusion model from Yavorski and Pinsky, in my always linked, App I, helpful. Practising physicists usually do not make overmuch of the specifics of history. The macro picture is rooted in the micro one. And the key to why entropy increases in Clausius' first example has to do with micro-level issues and moving to clusters of configs of higher relative statistical weight if free to do so. BTW, if a system is opened up to receive energy, as that same example shows, entropy tends to INCREASE. KF
kairosfocus
July 4, 2012 at 3:38 PM PDT
I think that the best argument from the 2nd law relates not to biological evolution but to the low entropy in the early universe. Oxford physicist Roger Penrose has computed that the initial conditions of the universe were extremely unlikely: about 1 chance in 10^(10^123). This clearly points to design, much more than any argument related to bio-evolution. It is exceedingly surprising that the entire universe is not so dominated by black holes that no life whatsoever could exist…
ahainli
July 4, 2012 at 3:09 PM PDT
Assuming there is no 2nd law violation, what law does the spontaneous generation of functional complexity violate, exactly? It takes energy to purposefully order and organize arrangements of matter into information. Taking for example magnetic letters on a board, distributed haphazardly, suppose I wanted to arrange the letters into a specific phrase. I would either need to input the phrase manually, or leave it to law and chance. What law says that I'm required to impart information into the arrangement of letters that need to be organized specifically, if I don't wish to wait?
Chance Ratcliff
July 4, 2012 at 3:06 PM PDT
I'm not going to take sides in this but will suggest that the first equation in the post be corrected to include the differential dT. I actually found myself wondering, because of the omission, if the OP has an analysis background.

I see a dT, as in CdT. So you misread; you didn't see it, and your attempted ad hominem isn't appreciated. Further, it had no basis, since you obviously can't see that the CdT has a dT!
scordova
July 4, 2012 at 2:56 PM PDT
I trust this will prove helpful. KF
I agree with most of what is in Mystery of Life's Origin, and I quoted one of its authors' (Walter Bradley's) thoughts on the matter of the second law. But the 2nd law proceeds from the microstates, not the other way around! The first formulations of the 2nd law had no model whatsoever of microstates. Hence, the Clausius postulate is inappropriate for describing the modern notions of entropy (like mixing, or things like configuration and organization). A DNA strand like AATG….. etc. is, in the sense of its effect on thermodynamic entropy, not really different from another DNA strand of the same length but with different characters like TTTG……, but in terms of organization, this could make all the difference between life and death. The Clausius postulate (the first formal statement of the 2nd Law) has little if anything to say about such matters; that's why we should not use the 2nd law.

Anyway, I posted this because I think there is disagreement over the matter in the ID community, and I wanted others in the ID community who held my reservations to know they weren't alone in their doubts. If I have these reservations, I'm sure other ID sympathizers will share them. But thank you for your comments. This could be a very educational discussion, and such input is valuable for clarifying the issues.

In the book you mention, use was made of the notion of "configurational entropy", which measures the level of disorganization (not disorder). That is not at all a part of the Clausius postulate. If one has to rely on a certain version of the 2nd Law, rather than making a claim that can be justified by all widely-accepted versions of the 2nd law, then one is going into difficult territory, and I wish to let my ID comrades know they will find it challenging to defend ID with such arguments, because their opponents will simply keep throwing up the Clausius postulate and say, "how does the Clausius postulate disprove the naturalistic origin of life?" The Clausius postulate:
No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body
Here is a more accessible way to argue against naturalistic origins of life, courtesy of Jonathan Wells (who was an undergraduate physics major before getting his doctorate in biology):
So here is an interesting proposition by Dr. Jonathan Wells and aptly named “The Humpty Dumpty” solution. Take a sterile medium and in this medium insert a single-celled animal. Then simply rupture the outer membrane of the animal (or cell) and let all the contents of the animal freely float in the solution. Now, in this environment you have all the available components (according to reductionists) necessary for life. In fact, you have many times more available organic compounds than have been ever produced in any experiment that attempts to simulate the early formation of life as did the Miller-Urey experiment. All the components for life are right there waiting for you. What sane biologist in Ann Arbor or anywhere would say that you can create life from this “primordial soup?” The old nursery rhyme is true: you can’t put Humpty Dumpty together again.
That's the way to argue against naturalistic origins without using the Clausius postulate.
scordova
July 4, 2012 at 2:51 PM PDT
I'm not going to take sides in this but will suggest that the first equation in the post be corrected to include the differential dT. I actually found myself wondering, because of the omission, if the OP has an analysis background.
groovamos
July 4, 2012 at 2:46 PM PDT
Just a little test, because my comment doesn't show up. It's the first time I'm commenting.
hallowach
July 4, 2012 at 2:27 PM PDT
Hi SC: Pardon, but I do not agree. I suggest you read here, here [esp chs 7 & 8], and also here. In this last, please note the discussion of the parts of a micro-jet moving about by forces of diffusion in a vat. These links will give my reasons in outline. I particularly note that, by the nature of the case, functionally specific, complex organised states will be deeply isolated in the space of possible configs, and will be maximally elusive to blind searches. In short, on the accessible atomic resources and time, the spontaneous origin of such states will be maximally implausible, similar to monkeys at keyboards typing significant passages from Shakespeare or the like. This is closely related to the analytical grounds of the statistical form of the 2nd law. And, as I note here, while indeed there is a distinction between order and organisation -- the latter being a special type of order that is information-rich -- there is also a bridge between thermodynamics, entropy and information, citing Wiki speaking against interest:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Last but not least, an excerpt from Thaxton et al, at the end of Ch 7 of TMLO, will, I believe, help to further frame the discussion:
While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The "evolution" from biomonomers to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme namely, the formation of protein and DNA from their precursors. It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today.11 Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered . . .
I trust this will prove helpful. KF
kairosfocus
July 4, 2012 at 2:09 PM PDT
I'm really sorry if my English is not good enough. I hope you can understand my thoughts anyway. I think, Mr Scordova, you made a mistake. That is quite obvious if you have a second look at your gas example. The separation effect of the two gases is due to the presence of the gravitational field, which you introduced without noting it. Actually, you won't be able to construct your little experiment in a setting where gravitation is missing. So in fact your gases are not separating because of the reduction of temperature but because of the presence of a directed gravitational field. Because the condensing CO2 molecules are heavier than the N2 molecules, they sink in the direction of the directed gravitational field. This field was present throughout the whole time of the experiment but is only able to show its presence and force when temperatures are reduced. This sort of directed field is exactly what Mr Sewell proposes must enter the border of non-isolated systems to get order. Imagine you created a strong gravitational field of the right strength in your first scenario (where the gases are still mixed), for instance by putting the whole apparatus in a centrifuge. Voila! The separation effect will show up as well, but nobody would propose that no special ordering force had entered the border of the non-isolated system. It's the same as if you introduced a sieve into your marble container, where the clumping red marbles (clumping happens because of temperature reduction) can't pass the sieve but the blue ones can. Without a doubt the blue ones will collect at the bottom of the container. I'm sorry, but the arguments of Mr Sewell are in my opinion still valid.
hallowach
July 4, 2012 at 2:07 PM PDT
Though my opinion on this is probably worthless, I will nonetheless go on record as disagreeing with you, scordova, and as agreeing with Dr. Sewell. Namely, you cite this example:
"Nitrogen will not, and thus the two species will spontaneously separate and order spontaneously re-emerges and entropy of the local system spontaneously reduces!"
And Yet,
Three subsets of sequence complexity and their relevance to biopolymeric information - Abel, Trevors Excerpt: Three qualitative kinds of sequence complexity exist: random (RSC), ordered (OSC), and functional (FSC).,,, Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC of OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC).,,, Testable hypotheses about FSC What testable empirical hypotheses can we make about FSC that might allow us to identify when FSC exists? In any of the following null hypotheses [137], demonstrating a single exception would allow falsification. We invite assistance in the falsification of any of the following null hypotheses: Null hypothesis #1 Stochastic ensembles of physical units cannot program algorithmic/cybernetic function. Null hypothesis #2 Dynamically-ordered sequences of individual physical units (physicality patterned by natural law causation) cannot program algorithmic/cybernetic function. Null hypothesis #3 Statistically weighted means (e.g., increased availability of certain units in the polymerization environment) giving rise to patterned (compressible) sequences of units cannot program algorithmic/cybernetic function. Null hypothesis #4 Computationally successful configurable switches cannot be set by chance, necessity, or any combination of the two, even over large periods of time. We repeat that a single incident of nontrivial algorithmic programming success achieved without selection for fitness at the decision-node programming level would falsify any of these null hypotheses. This renders each of these hypotheses scientifically testable. We offer the prediction that none of these four hypotheses will be falsified. http://www.tbiomed.com/content/2/1/29
Scordova, in your example purporting to show why Dr. Sewell is wrong, I still see no movement towards viable functional sequence complexity (functional information), and it seems very reasonable to me that you have in fact moved away from viable functional sequence complexity. That is, in regards to functional information, I would hold that the entropy increased when you moved from RSC to OSC in your example. Thus entropy, the second law, holds as valid as far as the generation of complex functional information is concerned, which is the context in which Dr. Sewell used it! Further note:
“Is there a real connection between entropy in physics and the entropy of information? ….The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental…” Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin] “Bertalanffy (1968) called the relation between irreversible thermodynamics and information theory one of the most fundamental unsolved problems in biology.” Charles J. Smith – Biosystems, Vol.1, p259. “Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis – preeminent Chemist of the first half of last century "Klimontovich’s S-theorem, an analogue of Boltzmann’s entropy for open systems, explains why the further an open system gets from the equilibrium, the less entropy becomes. So entropy-wise, in open systems there is nothing wrong about the Second Law. S-theorem demonstrates that spontaneous emergence of regular structures in a continuum is possible.,,, The hard bit though is emergence of cybernetic control (which is assumed by self-organisation theories and which has not been observed anywhere yet). In contrast to the assumptions, observations suggest that between Regularity and Cybernetic Systems there is a vast Cut which cannot be crossed spontaneously. In practice, it can be crossed by intelligent integration and guidance of systems through a sequence of states towards better utility. No observations exist that would warrant a guess that apart from intelligence it can be done by anything else." Eugene S – UD Blogger https://uncommondescent.com/genetics/id-foundations-15c-a-faq-on-front-loading-thanks-to-genomicus/comment-page-1/#comment-418185
bornagain77
July 4, 2012 at 1:34 PM PDT
A few things to ponder: by definition, a closed system should be considered evidence of ID. That a decrease in localized entropy happens in nature, and that things renew in nature, is evidence of ID. The laws of nature are themselves a product of ID by the One designer/creator Hashem (G-d). Entropy/the 2nd law in an open system (the universe, and the earth to some extent as it interacts with the sun) is an argument against the old-universe fable, as the longer a continuously life-sustaining atmosphere persists in any location (in our knowledge base, that being earth), the lower the odds and the greater the miracle. See the recent complex creation, pearlman cta
Pearlman CTA
July 4, 2012 at 1:03 PM PDT
Regarding the following quote and the concept it expresses: "Strictly speaking, the earth is an open system, and thus the Second Law of Thermodynamics cannot be used to preclude a naturalistic origin of life." I've heard this argument before, but it seems to have a deep flaw. The Earth may well be an "open system", but the UNIVERSE is not. And life didn't just originate on the Earth, but rather on the Earth WITHIN this universe. As it happens, I don't bother with 2nd law arguments much myself… I actually think there are more "accessible" arguments to be made. But I do think 2nd law arguments are valid.
TRoutMac
July 4, 2012 at 12:30 PM PDT
There further needs to be a distinction between ORDER and ORGANIZATION. Microsoft Windows, or any binary computer program, is highly DIS-ordered in the classical sense, but highly organized in the conceptual sense. To the extent that entropy deals with "disorder" but is not relevant to organization, it is not an appropriate argument for ID. A ZIP file will exhibit levels of disorder comparable to random coin flips, but this does not imply a ZIP file is disorganized. More attention should be paid to these nuances. The problem for evolution and the origin of life is one of ORGANIZATION, not ORDER. Hence, 2nd law arguments are not appropriate. Possibly arguments that borrow from statistics (which is part of statistical mechanics) and information science are appropriate.
scordova
July 4, 2012 at 12:23 PM PDT
Thank you for debunking Dr Sewell. He continues to put out gibberish about how evolution is related to the second law and entropy, while studiously avoiding a coherent definition of either.

The second law can be stated by first defining a Stable State: a system is in a Stable State if it is, at most, hopelessly improbable that it can attain another state, different by a finite amount, without a finite and permanent change in the environment. The phrase "hopelessly improbable" accounts for the remote probability that the entropy of an isolated system will spontaneously decrease. Using that definition, the first and second laws are given by the Creationist Law of Stable Equilibrium: "In the absence of supernatural forces, a bounded system can attain one and only one stable state." The "one and only one" is the first law. It applies to all systems: black holes, quantum relativistic, Newtonian, systems without matter, the works.

In regard to entropy, its definition is straightforward and needs no integrals.
Entropy: a property of a system equal to… the "Lost Work" divided by the temperature of the reservoir used to define the lost work.
Lost Work: a property of a system equal to… the difference between the Energy and the "Available Energy".
Available Energy: a property of a system equal to… the maximum work that can be extracted from the system while coming to equilibrium with a heat reservoir.

Although entropy is not necessarily statistical, Boltzmann's entropy S = k ln W falls under this definition.
chris haynes
July 4, 2012 at 11:56 AM PDT