Uncommon Descent Serving The Intelligent Design Community

Granville Sewell on the 2nd Law


The 2nd Law of Thermodynamics has never been a friend of materialistic evolution. Granville Sewell’s arguments concerning it at the following two links are worth pondering:

Link 1: from the book IN THE BEGINNING

Link 2: video presentation “A Mathematician’s View of Evolution”

Correct me if I'm wrong, but the 2nd law isn't directional with respect to time. That is (to state it very basically), at this point in time the system we are in is moving from a low-entropy state to a high-entropy one. But that must also mean that high entropy preceded low entropy when we consider moving back in time, correct? Of course, this may have more to do with our concept of what time is than with the second law itself. Winston Macchi
10 --> In short, if all the matter of the universe were converted into robot monkeys and typewriters (with paper and supplies to keep the system going), and the monkeys were to type away for the lifespan of the universe, they would be utterly unlikely to produce a single meaningful English paragraph of reasonable length.

11 --> And in case you still have trouble seeing the problem with "open systems," let me excerpt my remarks in APP 1 of my always-linked, on the implications of the basic Clausius expression for the second law:
______________
>> Isol System:
| | (A, at T_hot) --> d'Q, heat --> (B, at T_cold) | |

b] Now, we introduce entropy change dS >/= d'Q/T . . . "Eqn" A.1

c] So, dSa >/= -d'Q/Th, and dSb >/= +d'Q/Tc, where Th > Tc

d] That is, for the system, dStot >/= dSa + dSb >/= 0, as Th > Tc . . . "Eqn" A.2

e] But, observe: the subsystems A and B are open to energy inflows and outflows, and the entropy of B RISES DUE TO THE IMPORTATION OF RAW ENERGY.

f] The key point is that when raw energy enters a body, it tends to make its entropy rise. For the injection of energy to instead do something useful, it needs to be coupled to an energy conversion device.

g] When such devices, as in the cell, exhibit FSCI, the question of their origin becomes material; and in that context, their spontaneous origin is strictly logically possible but negligibly different from zero probability on the gamut of the observed cosmos. (And, kindly note: the cell is an energy importer with an internal energy converter. That is, the appropriate entity in the model is B, and onward B' below. Presumably the prebiotic soup would also have been energy-importing, so materialistic chemical evolutionary scenarios therefore have the challenge to credibly account for the origin of the FSCI-rich energy-converting mechanisms in the cell relative to Monod's "chance + necessity" [cf. also Plato's remarks] only.)
h] Now, as just mentioned, certain bodies have in them energy conversion devices: they COUPLE input energy to subsystems that harvest some of the energy to do work, exhausting sufficient waste energy to a heat sink that the overall entropy of the system is increased. Illustratively, for heat engines -- and (in light of exchanges with email correspondents circa March 2008) let us note: a good slice of classical thermodynamics arose in the context of studying, idealising and generalising from steam engines [which exhibit organised, functional complexity, i.e. FSCI; they are of course artifacts of intelligent design and also exhibit step-by-step problem-solving processes (even including "do-always" looping!)]:

| | (A, heat source: Th): d'Qi --> (B', heat engine, Te): --> d'W [work done on, say, D] + d'Qo --> (C, sink at Tc) | |

i] A's entropy: dSa >/= -d'Qi/Th

j] C's entropy: dSc >/= +d'Qo/Tc

k] The rise in entropy in B, C and in the object on which the work is done, D, compensates for that lost from A. The second law -- unsurprisingly, given the studies on steam engines that lie at its roots -- holds for heat engines.

l] However, B, since it now couples energy into work and exhausts waste heat, does not necessarily undergo a rise in entropy having imported d'Qi. [The problem is to explain the origin of the heat engine -- or more generally, energy converter -- that does this, if it exhibits FSCI.]

m] There is also a material difference between the sort of heat engine [an instance of the energy conversion device mentioned] that forms spontaneously, as in a hurricane [directly driven by boundary conditions in a convective system on the planetary scale, i.e. an example of order], and the sort of complex, organised, algorithm-implementing energy conversion device found in living cells [the DNA-RNA-ribosome-enzyme system, which exhibits massive FSCI].
n] In short, the decisive problem is the [im]plausibility of the ORIGIN of such an FSCI-based energy converter through causal mechanisms traceable only to chance conditions and undirected [non-purposive] natural forces. This problem yields a conundrum for chem evo scenarios, such that inference to agency as the probable cause of such FSCI -- on the direct import of the many cases where we do directly know the causal story of FSCI -- becomes the better explanation. >>
_______________

12 --> In short, unless input energy is coupled to a system in an ORGANISED way, it will be utterly more likely to add to the forces of disorganisation, i.e. entropy. To see what this means, consider the contrast between light falling on a living leaf and on a dead one, which will simply dry up in the sun.
===============
Sir Fred Hoyle knew what he was talking about. GEM of TKI kairosfocus
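The Clausius bookkeeping quoted in the excerpt above ("Eqns" A.1 and A.2) is easy to check numerically. A minimal sketch in Python, with the temperatures and heat quantity chosen purely for illustration:

```python
# Numeric check of the two-body entropy balance quoted above:
# heat d'Q flows from hot body A (T_h) to cold body B (T_c),
# assuming reversible transfer at each boundary. Values are hypothetical.
T_h = 500.0   # temperature of A, kelvin (illustrative)
T_c = 300.0   # temperature of B, kelvin (illustrative)
dQ = 1000.0   # heat transferred, joules (illustrative)

dS_a = -dQ / T_h        # A loses entropy
dS_b = +dQ / T_c        # B gains more, since T_c < T_h
dS_total = dS_a + dS_b  # net change for the isolated whole

print(f"dS_a = {dS_a:.3f} J/K, dS_b = {dS_b:.3f} J/K")
print(f"dS_total = {dS_total:.3f} J/K (>= 0, as Eqn A.2 requires)")
```

Whichever values with Th > Tc are chosen, dS_total = d'Q(1/Tc - 1/Th) comes out positive, which is the content of "Eqn" A.2.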
SO: Remember, Sir Fred -- holder of a Nobel-equivalent prize for work in astrophysics -- was an expert on thermodynamics. The basic issue is that functional organisation [which is macro-observable] is a complex, relatively rare state in the relevant space of configurations. So much so that it is swamped out, like a small isolated island or archipelago in the midst of a sea of non-function. So, on the gamut of the observed cosmos, it is utterly implausible [note the peer-reviewed status of Abel's discussion] for a random walk starting from an arbitrary initial arrangement of parts to get to a complex functional one. And, without initial relevant function, which includes self-replicating capacity, we cannot begin to discuss chance variation and culling out of less effective functioning variant sub-populations as a relevant means of moving to optimal function.

So, let us take a leaf from Paley's Ch. 2, and apply it to Sir Fred's 747. Suppose we saw a 747 that, in the course of its workings, produced another 747 once a warehouse of parts was accessible.

1 --> Would that be more or less complex and functional than the familiar jumbo? Plainly, more. (And even so, a 747 would be less complex than a typical living cell.)

2 --> Would it be more or less plausible that such a jumbo would have originated by chance, e.g. through a tornado passing through a junkyard in Seattle? (The tornado provides energy input of presumably adequate scale, and could even fly in fresh parts, let us suppose from Topeka, Kansas; i.e. we have here an open system in the full sense.)

3 --> Now, we may see from von Neumann what a self-replicating system that fulfills a function of its own [not just replicating itself] requires:
(i) an underlying storable code to record the required information to create not only (a) the primary functional machine [here, a flyable 747] but also (b) the self-replicating facility; and, that (c) can express step-by-step finite procedures for using the facility;

(ii) a coded blueprint/tape record of such specifications and (explicit or implicit) instructions; together with

(iii) a tape reader [called "the constructor" by von Neumann] that reads and interprets the coded specifications and associated instructions, thus controlling:

(iv) position-arm implementing machines with "tool tips" controlled by the tape reader and used to carry out the action-steps for the specified replication (including replication of the constructor itself); backed up by

(v) either: (1) a pre-existing reservoir of required parts and energy sources, or (2) associated "metabolic" machines carrying out activities that, as a part of their function, can provide required specific materials/parts and forms of energy for the replication facility, by using the generic resources in the surrounding environment.
4 --> Parts (ii), (iii) and (iv) are each necessary for, and together are jointly sufficient to implement, a self-replicating machine with an integral von Neumann universal constructor. That is, we see here an irreducibly complex set of core components that must all be present in a properly organised fashion for a successful self-replicating machine to exist. [Take just one core part out, and self-replicating functionality ceases: the self-replicating machine is irreducibly complex (IC).]

5 --> This irreducible complexity is compounded by the requirement (i) for codes, requiring organised symbols and rules to specify both steps to take and formats for storing information, and (v) for appropriate material resources and energy sources.

6 --> Such a self-replicating 747, or even an ordinary one, plainly is not a plausible product of a tornado in a junkyard. (Cf. my discussion here on a quasi-molecular-scale version.)

7 --> Similarly, suppose we have the self-replicating 747. Is it plausible that by random changes in the coded tape, step by step, we would be able to get a series of flyable aircraft -- let alone superior flyable aircraft -- all the way to, say, a Space Shuttle?

8 --> In short, the key problem is that functional specifying information for any complicated entity is both specific and complex, thus deeply isolated in the relevant configuration space.

9 --> Just 125 bytes of information specifies 1.07 * 10^301 possible configurations, vastly beyond the number of states our observed universe can scan through in its thermodynamically credible lifespan. [ . . . ] kairosfocus
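The figure in point 9 can be verified directly: 125 bytes is 1,000 bits, and 2^1000 is approximately 1.07 * 10^301. A quick check in Python:

```python
# Verify the configuration count for 125 bytes of information.
n_bits = 125 * 8                 # 1000 bits
n_configs = 2 ** n_bits          # exact integer: 2^1000
digits = str(n_configs)

print(f"2^{n_bits} has {len(digits)} digits")   # 302 digits, i.e. ~10^301
print(f"leading digits: {digits[:3]}")          # 107 -> 1.07 * 10^301
```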
second opinion: Well, they are obviously both second law arguments. My only contribution to this debate is pointing out that the laws of probability are not suspended just because a system is open. Natural forces do not do macroscopically describable things which are extremely improbable from the microscopic point of view, whether a system is closed or open -- in an open system you just have to take into account the boundary conditions before deciding if an increase in order is extremely improbable or not. Granville Sewell
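The improbability estimates traded in this thread (e.g. the robot-monkeys illustration above) are easy to put on a log scale. A rough sketch, assuming a 27-symbol alphabet (26 letters plus space) and a specific 1,000-character target paragraph; both figures are my assumptions, for illustration only:

```python
import math

# Probability that one random 1000-character string matches one
# specific target paragraph over a 27-symbol alphabet.
alphabet_size = 27   # assumed: 26 letters + space
length = 1000        # assumed paragraph length in characters

log10_p = -length * math.log10(alphabet_size)
print(f"P(single trial) ~ 10^{log10_p:.0f}")          # ~10^-1431

# Even an extravagant 10^150 trials barely moves the exponent:
print(f"P(10^150 trials) ~ 10^{log10_p + 150:.0f}")   # ~10^-1281
```

The point of working in logarithms is that multiplying the number of trials only adds to the exponent, so exponents this large dominate any plausible trial count.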
Dr. Sewell, may I ask how your argument is fundamentally different from Hoyle's junkyard tornado? second opinion
Thanks for the link, Morris; definitely a good reference. I especially like the last part:

"It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate."

I would love to see the entire paper, to see which transcendent equations (information) he refers to that are constraining the local thermodynamics:

The Underlying Mathematical Foundation Of The Universe - Walter Bradley - video
http://www.metacafe.com/watch/4491491

The Five Foundational Equations of the Universe and Brief Descriptions of Each:
http://docs.google.com/Doc?docid=0AYmaSrBPNEmGZGM4ejY3d3pfNDdnc3E4bmhkZg&hl=en
(I know Boltzmann's is probably one of them)

Here is a 2008 video of Dr. McIntosh:
Design, Information and The Word of God
http://edinburghcreationgroup.org/digwog.xml

Here is the part of the video that deals with entropy in biology (McIntosh is commenting on Sanford's work):
Evolution vs. Genetic Entropy - video
http://www.metacafe.com/watch/4028086

Here is another paper I found from McIntosh, from 2006:

Functional Information and Entropy in living systems
Conclusions: In this paper we have considered the concept of logical entropy as a parallel to the Boltzmann probability formula for system states. We have then considered the role of information in reducing at a fundamental level the logical entropy, and concluded that rather than regarding negative entropy as being a source of information at the fundamental level, it is far more self-consistent to regard the information as defined in terms of a source from which negative logical entropy is derived at the molecular level, and which can be quantified using Shannon principles.
It has often been asserted that the logical entropy of an open system could reduce through chance exchanges of that system with its environment. By considering the Gibbs free energy connecting two possible states, it is evident that this involves thermodynamic hurdles which effectively demand a different physics. Self-organisation (so called) only takes place when existing information is already inherent in the system, and not vice versa. In an open system, energy (such as from the sun) may increase the local temperature difference (and thus increase the potential for useful work that can be done locally), but without a machine (that is, a device which is made or programmed to use the available energy), there is still no possibility of the self-organisation of matter. There has to be previously written information or order (often termed "teleonomy") for passive, non-living chemicals to respond and become active. Thus the following summary statement applies to all known systems:

Energy + Information equals locally reduced entropy (increase of order, or teleonomy)

with the corollary:

Matter and Energy alone does not equal a decrease in entropy

http://www.heveliusforum.org/Artykuly/Func_Information.pdf
bornagain77
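The McIntosh conclusions quoted above invoke Shannon principles for quantifying information. For readers who want to see what a Shannon measure looks like in practice, here is a minimal sketch (my own illustration, not code from the paper) computing the entropy of a symbol string:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy of a string, in bits per symbol."""
    n = len(s)
    counts = Counter(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A highly ordered (repetitive) string carries minimal entropy per
# symbol; a varied one carries more:
print(shannon_entropy("aaaaaaaa"))   # 0.0 (one symbol, no uncertainty)
print(shannon_entropy("abcdabcd"))   # 2.0 (four equiprobable symbols)
```

Note this measures only statistical surprisal of the symbol stream; it says nothing by itself about function or meaning, which is exactly the distinction the thread is arguing over.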
Someone else who has been doing first-class work to show how Darwinism violates the second law is Andrew McIntosh at Leeds University in England. He has just had a review article on the topic published in a secular journal, Design & Nature and Ecodynamics: http://journals.witpress.com/pages/papers.asp?iID=47&in=4&vn=4&jID=19 (for some reason I can't get the link to embed properly). You have to pay $30 for the full paper, but here is the abstract:
Information and entropy -- top-down or bottom-up development in living systems?
A.C. McIntosh

Abstract: This paper deals with the fundamental and challenging question of the ultimate origin of genetic information from a thermodynamic perspective. The theory of evolution postulates that random mutations and natural selection can increase genetic information over successive generations. It is often argued from an evolutionary perspective that this does not violate the second law of thermodynamics, because it is proposed that the entropy of a non-isolated system could reduce due to energy input from an outside source, especially the sun when considering the earth as a biotic system. By this it is proposed that a particular system can become organised at the expense of an increase in entropy elsewhere. However, whilst this argument works for structures such as snowflakes that are formed by natural forces, it does not work for genetic information, because the information system is composed of machinery which requires precise and non-spontaneous raised free energy levels -- and crystals like snowflakes have zero free energy once the phase transition occurs. The functional machinery of biological systems such as DNA, RNA and proteins requires that precise, non-spontaneous raised free energies be formed in the molecular bonds, which are maintained in a far-from-equilibrium state. Furthermore, biological structures contain coded instructions which, as is shown in this paper, are not defined by the matter and energy of the molecules carrying this information. Thus, the specified complexity cannot be created by natural forces even in conditions far from equilibrium. The genetic information needed to code for complex structures like proteins actually requires information which organises the natural forces surrounding it, and not the other way around -- the information is crucially not defined by the material on which it sits.
The information system locally requires the free energies of the molecular machinery to be raised in order for the information to be stored. Consequently, the fundamental laws of thermodynamics show that entropy reduction which can occur naturally in non-isolated systems is not a sufficient argument to explain the origin of either biological machinery or genetic information that is inextricably intertwined with it. This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate.
Stephen Morris
Dr. Sewell, here is the link:

Can ANYTHING Happen in an Open System?
http://www.youtube.com/watch?v=dyAjvOJiOes

Here is an excerpt of a podcast of yours that I enjoyed:
Finely Tuned Big Bang, Elvis In The Multiverse, and the Schroedinger Equation - Granville Sewell - video
http://www.metacafe.com/watch/4233012

Here is another video, by Dr. Thomas Kindell:
Evolution Vs. Thermodynamics - Open System Refutation - Thomas Kindell
http://www.metacafe.com/watch/4143014 (entire video)

The Physics of the Small and Large: What is the Bridge Between Them? - Roger Penrose
Excerpt: "The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature of the big bang (to a precision of about 1 in 10^10^123, in terms of phase-space volume) can be identified as the 'source' of the Second Law (Entropy)."
http://www.pul.it/irafs/CD%20IRAFS%2702/texts/Penrose.pdf

How special was the big bang? - Roger Penrose
Excerpt: "This now tells us how precise the Creator's aim must have been: namely to an accuracy of one part in 10^10^123."
http://www.ws5.com/Penrose/

This 1 in 10^10^123 number, for the time-asymmetry of the initial entropy state (or order) of the universe, also lends strong support for "highly specified infinite information" creating the universe, since the universe as a whole has been losing order ever since the big bang; "Gain in entropy always means loss of information, and nothing more." - Gilbert Newton Lewis

The Thermodynamic Argument Against Materialism and Evolution - Thomas Kindell - video
http://www.metacafe.com/watch/4168488

Does God Exist? The End Of Christianity - Finding a Good God in an Evil World - video
http://www.metacafe.com/watch/4007708

Switchfoot - Dare You To Move
http://www.youtube.com/watch?v=iOTcr9wKC-o

bornagain77
Well, that didn't work! I guess you can't embed videos in the comments. Funny, the embedded icon showed up in the "preview". Granville Sewell
Here is the Youtube (shortened) version of the video: Granville Sewell
It may be of interest that Marxists in the former Soviet bloc also criticised Darwinism and "scientism" on this point. Marxists claimed that evolution clearly contradicts the law of entropy, and that in fact we are witnessing negentropic processes. The Marxist branch of atheism had no problem articulating and accepting such claims, because they supposed that the biological world has its own laws which cannot be reduced to those of physics (criticising "reductionism" in this way). I summarized it on my blog; see "Marxistic critique of Darwinism". http://cadra.wordpress.com/ VMartin
Dr. Sewell, your book was very informative and a pleasure to read. Semi off-topic: if you have not seen these short articles, I think you might enjoy them:

Alain Aspect and Anton Zeilinger - by Richard Conn Henry
Excerpt: "Quantum mechanics makes no mention of reality (Figure 1). Indeed, quantum mechanics proclaims, 'We have no need of that hypothesis.'"
http://henry.pha.jhu.edu/aspect.html

THE REAL SCANDAL OF QUANTUM MECHANICS - Richard Conn Henry
Excerpt: "We know for a fact that the universe is not 'made of' anything. Get it through your heads, physicists! It is sometimes said that the only thing that is real are the observations, but even that is not true: observations are not real either. They, and everything else, are purely mental."
http://henry.pha.jhu.edu/scandal.pdf

Richard Conn Henry is Professor in the Henry A. Rowland Department of Physics and Astronomy at The Johns Hopkins University. bornagain77
