
Rob Sheldon: Increasing the heat energy leads to decreasing the information

Rob Sheldon

Further to Origin of complex cells: Can energy create information? (Lane seems to think that energy can create or substitute for huge amounts of information. This seems wrong, but it is apparently acceptable to The Scientist.) Rob Sheldon, noting that readers’ thoughts were solicited, writes to say:

In thermodynamics, we have the fundamental thermodynamic relation or defining equation dU = dQ + dW = TdS – PdV, where U = internal energy, Q = heat, W = work, T = temperature, S = entropy, P = pressure, V = volume, and “d” means “the change of”. In a closed system that is “reversible” (no eddies, turbulence, etc.) and whose volume doesn’t change much (incompressible, like water), we can eliminate the work term and get the equation dQ = TdS, which is to say, the change in heat energy of our system is equal to the temperature times the change in entropy of the system. Or we can rewrite this as dQ/T = dS.
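A quick numerical sketch of dQ = TdS (the numbers here are my own illustration, not measurements): for reversible, isothermal heat transfer, the entropy delivered by a fixed amount of heat falls as the temperature rises.

```python
# dQ = T dS for reversible, isothermal heat transfer, so dS = dQ / T.

def entropy_change(heat_joules, temp_kelvin):
    """Entropy delivered to a system by reversible heat transfer at fixed T."""
    return heat_joules / temp_kelvin

# The same 100 J of heat carries less entropy at higher temperature:
print(entropy_change(100.0, 300.0))   # ~0.333 J/K at 300 K
print(entropy_change(100.0, 1000.0))  # 0.1 J/K at 1000 K
```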

What does this mean?

Well, since S is the entropy, and using Shannon’s definition, -S = Information, then dQ/T = -d(Info)

So addressing Lane’s book, heat energy is not information. Increasing the heat energy leads to decreasing the information. The same amount of heat energy at higher temperature has more information than the same heat at lower temperature. For life to extract information from a heat source, it must be a heat engine, extracting high-temperature energy and excreting low-temperature energy. Heat engines accomplish this feat by incorporating carefully constructed linkages and machinery to prevent the work from vanishing in the turbulent, diffusive, entropic working fluid. If the machinery has to be made out of the same material as the working fluid, then it is like saying “a high-information-density state can process heat energy to extract information”.

Well, doesn’t this process produce more information than at the beginning? Wouldn’t this allow for an infinitely great amount of information inside the system, the “Borg” of cellular automata?

No, because entropy is a fluid property, and it wants to diffuse away. The more information that gets concentrated, the greater the gradients, and the more work expended keeping those gradients. Therefore for a fixed energy flow, there is a maximum entropy gradient, a maximum entropy density, where all the effort is expended just staying alive. For heat engines like your car, this is given by the maximum temperature the engine can run at before it melts, and Sadi Carnot used the formula (Ti-Tf)/Ti to describe this “Carnot efficiency”. For cells, we’ll call this maximum achievable entropy gradient “the life efficiency”, and I think it is fair to argue that the more efficient a heat engine, the more information it must contain. (Have you looked at the number of sensor wires under the hood of a late model car and compared it to, say, a VW bug engine?)
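Carnot’s formula from the paragraph above can be put in one line of code (the temperatures below are my own illustrative choices, not data for any real engine):

```python
# Carnot efficiency between a hot reservoir at Ti and a cold one at Tf,
# using absolute temperatures in kelvin: (Ti - Tf) / Ti.

def carnot_efficiency(t_hot_k, t_cold_k):
    return (t_hot_k - t_cold_k) / t_hot_k

# The hotter the engine can run before it melts, the higher the ceiling:
print(carnot_efficiency(600.0, 300.0))  # 0.5
print(carnot_efficiency(900.0, 300.0))  # ~0.667
```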

But one thing it cannot do, cannot allow, is “a low information state can process heat energy to spontaneously become a high information state that processes heat energy”. This is the proverbial “perpetual motion machine” now operating on entropy; it is the “perpetual information machine”. For just as Clausius and Kelvin showed that heat engines that produced too much work (too efficient) could be linked together to produce a perpetual motion machine, so also fluid “life” machines that produce too much information can be linked together to produce perpetual motion. This proves that no such machine is possible, or else biology would long ago have become a perpetual motion machine that never eats.

So why do these erroneous views keep propagating, having spontaneous information arising from energy gradients?

Because they fudge the bookkeeping. Entropy is notoriously hard to measure, and so for example, they might underestimate the information in the cellular machinery and think that a temperature gradient has more than enough entropy to create the cellular machinery. Or as Granville Sewell argues, they have an open system that allows entropy to arrive and disappear without proper accounting, so that information accumulates inside the cell, which they then misattribute to temperature gradients. But if any of these papers were even within hailing distance of being correct, then perpetual motion would be commonplace by now, and you and I would spend our days wondering what to do with all our free time.

Other readers’ thoughts?

See also: Granville Sewell’s important contribution to physics: Entropy-X


7 Replies to “Rob Sheldon: Increasing the heat energy leads to decreasing the information”

  1.
    Upright BiPed says:

    Other readers’ thoughts?

    facepalm

  2.
    Gordon Davisson says:

    The relationship (if any) between energy and information depends a great deal on what definition of information one chooses; there are many definitions available, and which one you should pick depends on what you’re actually trying to do. (Compare this with picking a definition of “size”: depending on what you’re trying to figure out, you might use length, height, depth, circumference, volume, surface area, mass, etc… or maybe several of them at once.)

    Unfortunately, the definition Dr. Sheldon chose has pretty serious problems for almost any application.

    Well, since S is the entropy, and using Shannon’s definition, -S = Information, then dQ/T = -d(Info)

    First, a minor quibble: correct me if I’m wrong, but I don’t think Shannon ever used this definition. I’m not an expert on the history here, but Norbert Wiener was the first I know of to identify information with a decrease in Shannon entropy, but not thermodynamic entropy. AIUI Leon Brillouin was the first to identify it with negative thermodynamic entropy.

    But there are real problems here as well. For one thing, entropy (both Shannon and Boltzmann/Gibbs) is always positive (or at least nonnegative), and hence by this definition information is always negative (or at least never positive).

    Now, normally this is dealt with by only looking at changes in entropy, and identifying information with decreases in entropy. But that still winds up being pretty seriously problematic. For instance, the entropy of a typical human is far far higher than that of any bacterium (mostly because of size: entropy is, other things being equal, proportional to size, and humans are far far bigger than bacteria); this would mean that turning a bacterium into a human would be a massive decrease of information. Does that make any sense to anyone at all?

    (Side note: I used “size” above without clarifying which measure I meant. In this case, the number of atoms or molecules would probably be the most directly relevant measure.)

    But it’s even worse than that, because according to this definition, simply cooling something off increases its information by a huge amount. Consider cooling one cc (about a thimbleful) of water off by one degree Centigrade (=1.8 degrees Fahrenheit) from, say, 27° C to 26° C (absolute temperatures of 300.15 K and 299.15 K respectively). The amount of heat removed is (almost by definition) about 1 calorie, so ΔQ = -1 cal, and ΔInformation = -ΔQ/T ~= – (-1 cal) / 300 K = +3.33e-3 cal/K. To convert that from thermodynamic units to information units (bits), we need to divide by Boltzmann’s constant times the natural log of 2; that’s k_B * ln(2) = 3.298e-24 cal/K * 0.6931 = 2.286e-24 cal/K. Dividing that into the entropy change we got above gives an information increase of …wait for it… 1.46e21 bits.
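    The arithmetic above can be reproduced in a few lines (using the standard value of Boltzmann’s constant):

```python
import math

# Cool 1 cc of water by 1 degree C from 300.15 K: about 1 calorie removed.
CAL_TO_J = 4.184
K_B = 1.380649e-23            # Boltzmann's constant, J/K

delta_Q = -1.0 * CAL_TO_J     # heat removed, in joules
T = 300.15                    # kelvin; approximate dQ/T at the initial T

delta_S = delta_Q / T                              # entropy change, J/K
delta_info_bits = -delta_S / (K_B * math.log(2))   # "information" gained, bits

print(f"{delta_info_bits:.3g} bits")  # ~1.46e+21 bits
```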

    That’s over a thousand billion billion bits of information just because a little water cooled off slightly.

    I’m going to go ahead and claim that this definition has almost nothing to do with what most people mean by “information”.

    BTW, I’ve discussed this here before; see this earlier comment for a treatment of the relation between information and Shannon entropy.

    Also, setting aside the question of “information” for the moment:

    So why do these erroneous views keep propagating, having spontaneous information arising from energy gradients?

    Because they fudge the bookkeeping. Entropy is notoriously hard to measure, and so for example, they might underestimate the information in the cellular machinery and think that a temperature gradient has more than enough entropy to create the cellular machinery. Or as Granville Sewell argues, they have an open system that allows entropy to arrive and disappear without proper accounting, so that information accumulates inside the cell, which they then misattribute to temperature gradients. But if any of these papers were even within hailing distance of being correct, then perpetual motion would be commonplace by now, and you and I would spend our days wondering what to do with all our free time.

    I’d agree that the analysis in Daniel Styer’s “Entropy and Evolution” is significantly wrong, but Emory Bunn’s “Evolution and the second law of thermodynamics” has, as far as I know, only one minor error: he assumed the entropy associated with thermal radiation was E/T (where T is the temperature of the emitting body). It’s actually more than that in general (and 4E/3T for the special case of blackbody radiation). I did a version of the entropy flux calculation (quoted here) that takes this into account and got a lower bound of 3.3e14 W/K (compare with Styer’s 4.2e14 W/K). Can Dr. Sheldon point out any other errors in Styer’s analysis, or any at all in my analysis?

    BTW, if you plug that value into -S = Information (with appropriate conversions), you get 3.4e37 bits per second of information increase.
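    For anyone who wants to check that conversion:

```python
import math

# Convert the entropy-flux lower bound (3.3e14 W/K) into an information
# rate by dividing by k_B * ln(2), the same conversion used for the water.
K_B = 1.380649e-23   # Boltzmann's constant, J/K

entropy_flux = 3.3e14                                # (J/K) per second
bits_per_second = entropy_flux / (K_B * math.log(2))

print(f"{bits_per_second:.2g} bits/s")  # ~3.4e+37 bits/s
```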

    Finally, News (Denyse, I presume?) mentioned Sewell’s X-Entropy. As I’ve pointed out before, Sewell’s X-Entropy calculation is only valid in the specific case of diffusion through a solid (and even then, only in the absence of gravity). In many other situations, it falls apart completely (I give two examples here; I can give more).

    The X-Entropy argument has been refuted. It’s done. Give it up.

  3.
    tarmaras says:

    Excellent choice of topic, Rob Sheldon.

    I found this section from A. Dalela’s Moral Materialism, in which his semantic/informational theory of matter is applied to the problems in statistical mechanics and thermodynamics, to be relevant to the discussion. Main point: he proposes that energy is information, and it exists on a continuum, from abstract (concepts) to contingent (objects). I highlighted the points I found most salient.

    Statistical Mechanics and Information

    We earlier saw how statistical mechanics posits the existence of a priori real particles, and the distribution of energy in the ensemble is the distribution of total energy into the momentum of each particle. Classically, such a system must always be in a deterministic state, even if we don’t know that state. Since a classical system is always linear and reversible, such a deterministic theory could not explain why a thermal system is non-linear and irreversible. To address this problem, statistical mechanics posits that the system is simultaneously in multiple different states. This is of course predictively correct, but it leads to the problem of understanding how a classically definite system could at once be in many distinct states.

    The solution to this problem is possible if we can discard the idea that there is a fixed set of particles whose state is uncertain. Rather, we can say that the total number of particles in an ensemble is equal to the number of particles whose state is certain. The particles are thus not a priori real. Rather particles are created when the system becomes more and more certain in its state. How does a system become certain? We can postulate that a system becomes certain when information is added to that system. The uncertainty in a system is not randomness in the system. It is rather the total amount of information that could be added to a system, before the system runs out of its information storing capacity. A system into which more information cannot be added has a net zero uncertainty. Such a system can be likened to a classically definite object. Since the addition of information and the reduction of uncertainty create ever more particles, a classically definite system must have the maximum number of particles that can be packed in a given region.

    To adopt this viewpoint, we will have to discard the idea that nature comprises of a priori real particles. The particles would rather be created only when some information is added to the ensemble to convert the energy in the ensemble into a definite kind of material distribution. Therefore, the particles are not a priori real. They are rather byproducts of adding information into an ensemble.

    The point of inflexion in statistical mechanics is that a classically definite state implies a definite energy in a particle but a definite energy in the ensemble does not imply classically definite states. If we suppose that the number of particles remains the same even if the system state is uncertain, then the system must exist in multiple such states simultaneously. This problem, however, does not arise if the total number of particles equals the definiteness in state. As information is added, the state becomes more definite and the total number of particles (which must exist in definite states) increases.

    Classical physics studied individual particles which have both definite state and hence definite energy. A canonical example of such a particle is a billiard ball which has both a definite energy and a definite state. The successes of classical physics led us to believe that all matter exists as particles in definite state. Essentially, the universe was modeled as a collection of very small billiard balls. But this idea is inconsistent with thermal phenomena and we cannot claim that there is in fact a classically definite reality in the ensemble. We would still like to retain classical intuitions in the case of billiard balls. So, how do we reconcile the ideas that particles in an ensemble are not in a definite state but the particle as a billiard ball is in a definite state? This is possible if we acknowledge that particles are created by adding information into an ensemble. The billiard ball is a particle created by adding information. However, in an ensemble much of the information is missing. Therefore, the ensemble must only have much fewer particles in definite state. This means that the ensemble has a much larger potential to create new particles by adding information as compared to the billiard ball. These definite particles would be created only when the ensemble has been encoded with its maximum possible information capacity.

    The informational viewpoint explains why an ordered system becomes disordered when it performs ‘work.’ The ‘work’ in this case is the act of passing information from the source to the destination. In this interaction, the transfer of information decreases the order in the source and increases the order in the destination. Since order and disorder represent the number of particles, this viewpoint implies that some particles disappear in the source system and appear in the destination system. However, the exchange of particles does not involve classical motion because in thermal phenomena there may be physical barriers that prevent the motion of particles in a classical sense. The transfer of energy however represents a transfer of information and since particles are created when information is added, the transfer of information denotes a transfer of particles. This notion of material transfer requires a new physical theory.
    In this theory, the primary entity being transferred is information, which is currently described as energy. A system with some information exists as a concept with some level of definiteness; for instance, it can encode the idea of a car. When information is extracted from this system, the system becomes less definite and will now encode more abstract information. For instance, the ensemble would now encode the idea of a vehicle, rather than that of a car. The reduction in definiteness in the system state means that there must be fewer particles in a definite state. These particles encode more abstract information, which cannot be exchanged with the environment because the environment may already carry that kind of information. For instance, we cannot transfer the idea of a vehicle to an object which is already a specific BMW car. The reason is that the idea of a car already includes the idea of a vehicle. We can transfer abstract information between systems only when these systems live on different branches of the semantic tree and the idea being transferred is not already part of the receiving object. An idea cannot be transferred if it already exists in the receiving system.

    Therefore, as more energy is extracted from a system, it becomes harder to extract any further energy. This is totally unintuitive from the standpoint of classical physics where a moving object X can collide with a static object Y and transfer all its energy to Y. After the collision, X would be static and Y would be moving. In this interaction, all the energy in X has been transferred into Y. Thermal systems seem to be incapable of doing so. The thermal system retains some energy and can never transfer all its energy. To explain why some energy is not transferrable we must say that that energy does not exist as motion but as information. When information is concretized, it appears as physical things and their motion. However, when the information is removed, it exists as abstract ideas and cannot be transferred as would be the case if it were motion. To correctly describe matter, we should now describe motion as information exchange rather than information exchange as motion. The specific change in the case of thermal systems is that all of motion can be transferred but the information cannot all be transferred.

    Unlike classical physics, where the work is performed by a transfer of energy, now the work is performed by the transfer of information. Therefore, the capacity for work in a system is not its total energy, but the total amount of information that can actually be transferred. While all of motion in a system can be transferred, all of information cannot be. The information system therefore does not behave like a classical system because of this basic difference. The transfer of information depends on the object with which the information is being exchanged. The more informationally similar the two objects, the less information would they exchange. Conversely, two objects that are informationally dissimilar would exchange significantly more information. Eventually, the information exchange stops when the most abstract type of information that a system can transfer is already present in the receiving system.

    A disordered system also has energy but it does not have transferable information and thus its capacity to do work is reduced. The fact that work cannot be done by the energy unless information can be exchanged implies that work is actually done by information and not by energy. This is a dramatic revision of the idea of work in classical physics where work is done by the transfer of energy and the order of the system transferring this energy plays no role in the amount of possible work. The problems of statistical mechanics indicate that measuring information as energy is like measuring a book on a weighing scale. Lots of books may weigh the same, but they would not be equally useful to us in terms of information. Essentially, when we reduce information to energy, we lose the type differences between the semantic particles. Some type differences represent information that can be transferred, while other types denote information which is similar to the environment and therefore cannot be exchanged. This also means that if we were to bring a system very conceptually different from the source of the latent energy, the system would be able to exchange the energy with it.
    The problem therefore isn’t that the energy is not inherently extractable. The problem is that we are not using the correct procedure to extract the energy. Two conceptually different systems can exchange information so long as they are conceptually different. The information exchange ends when the receiving system already contains all the information that the sender has to provide. There may still be a lot of information in the sender, but it would not be exchanged because the receiver already has everything the sender can provide. To extract this information, a new kind of receiver has to be brought into the picture which can extract the information. This fact is clearly observed in the use of a thermometer which can measure that the system has heat, even though it cannot perform work.

    The inability in the sender to transfer information to a receiver—when the receiver already has the information—cannot be understood in a physical theory. For instance, one computer can keep sending the same type of information to the other computer and this transfer of (physical) information can never be prevented. The receiver can mark all this redundant information as ‘spam’ which could mean that the received information is discarded. But there is no way for the receiver to prevent the sender from sending the spam. In fact, the spammer can find more ways to obviate the physical rules that the receiver uses to detect spam. For instance, the spammer could send the same information from a different address and the receiver would not know. Only in a semantic theory can the presence of some information in the receiver prevent the sender from transmitting that information. The inability to send information is a semantic effect, but that effect is visible within matter.

    When Claude Shannon formulated communication theory, he used probabilities to describe the expectation of some symbols. The idea was that the sender should not send what is already expected at the receiver. Sending what the receiver already expects (and can therefore predict) is a veritable waste of the communication channel. Ideally, therefore, the sender is only supposed to send what the receiver cannot expect. But there is no way in Shannon’s theory to prevent the sending of information which the receiver expects. A sender may send the same information over and over again and the information would be transmitted, received, and discarded. In a semantic communication system, however, it is possible to prevent the transmission of redundant information. There is an inherent sense in which the expectation of the receiver is correlated with the prediction of the sender and the sender will not transmit what the receiver already knows. Only new ideas would be transmitted.

    The perceived non-linearity of the thermal system is a byproduct of the correlation between senders and receivers. These systems are by no means non-linear. However, they are correlated, which allows unique information to be transmitted but the non-unique information to be held back. Ideally, a complete physical theory of nature would have to predict this sender-receiver correlation. If quantum theory were a complete theory of nature, it would predict this kind of correlation, but it does not. This is because quantum theory still treats the senders and receivers physically. The information being sent out is still treated as a quantum of energy (called photon) but this photon does not represent a semantic type. A quantum system—according to quantum theory—thus can keep transmitting information to another quantum system, producing the same kind of prediction that was earlier true in classical physics: i.e. a system can completely transfer all its energy to another system. Current quantum theory therefore violates the basic inability in a system to transmit information beyond a certain point, and theories of ‘quantum thermodynamics’ have to be separately formulated to explain the observed non-linearity. This I consider as a flaw in current quantum theory, which can be rectified only when the theory treats the sent and received information as semantic symbols.

    The hierarchical space-time theory described previously explains and predicts the observed lack of transmission because in this theory the sender and receiver systems are correlated objects produced by splitting an abstract object into contingent objects. The transmitter in this case sends particles (or anti-particles) and the receiver receives these particles (or anti-particles). However, these acts of sending and receiving are not arbitrary. That is, the sender does not send the information and let the receiver figure out if the information is actually useful. Rather, the sender sends only useful information. If the information is not useful, it cannot be sent.

    The emission and absorption of quantum particles is therefore not arbitrary as in current quantum theory (in quantum theory, the absorption and emission events cannot be predicted). Rather, quantum systems are correlated in a way that thermodynamics requires although current quantum theory does not yet predict or explain. This is further profoundly related to the nature of light as current quantum theory describes it. Light, in the current theory, is emitted as photons which ‘travel’ at the speed of light to eventually impinge upon some quantum object which absorbs this light. In principle, therefore, the sender and receiver of this photon have not a priori agreed to exchange information. Rather, the sender is just emitting information and the receiver just happens to be on the path of that information. This is a classical picture of communication as it involves the idea that light ‘moves’ in space-time like other material objects and it can collide with an arbitrary object. In a quantum theory compatible with thermodynamics, the photon would not be emitted unless the photon carries a unique type of information which is currently absent in the receiver. In other words, the quantum communication will not exchange redundant information.

    Accordingly, the light we receive from the sun is not just the outcome of a nuclear reaction which happens independently of our receiving it. Rather, the nuclear reaction occurs because a specific object has to receive that information. In a sense, the sun is not just “pushing” out light. Rather, the receiver is also “pulling” in light. Unless the sender and receiver are correlated, and the communication carries some unique information missing in the receiver, the information cannot be sent. The hierarchical space-time view predicts this but current quantum theory does not. To the extent that quantum theory is incompatible with thermodynamics—the former is linear and the latter is non-linear—the hierarchical space-time view can be seen as an advance over current quantum theory. But this advance also involves discarding many ideas that current quantum theory carries. Specifically, quantum objects have to be viewed as symbols of information not just lumps of energy, mass, or charge.

    Current statistical mechanics is a byproduct of the incorrect classical assumption that the world is individuated into a priori real particles. We suppose that each particle must be in a definite state, although we cannot know which state the particles are in. The idea, therefore, that there indeed exist real individual particles is a theoretical and metaphysical assumption. The assumption is consistent with classical physics, but it is inconsistent with observation. In the informational picture, the particles do not exist until information is added. Someone might object: we are able to observe the atoms in the ensemble by doing diffraction experiments. How can we suppose that the thermal system does not have all these particles? The answer to this problem is that the diffraction experiment adds information to the system. By probing the system using atomic information, the information is added into the system and the outcome is a byproduct of the information that was added in the experiment. This is further related to the problem of measurement interaction in quantum theory as I will shortly discuss. However, the key point here is that to observe microscopic parts of an ensemble, information must be added to divide the ensemble into microscopic parts that can be observed. The observed outcome of such an experiment is a byproduct of the act of adding information. The classical idea therefore that there are indeed real particles is incorrect.

    So long as particles can be added to an ensemble, there is room to add information in the ensemble. These additions will incrementally fix the uncertainty in the state by packing more information. A collection of material particles therefore is not always in an uncertain state. Rather, this uncertainty is a byproduct of a shortfall in information in a system. In classical physics, each particle is already in a definite state and further information cannot be added, and this corresponds to the state of maximal information in an ensemble.

  4.
    bornagain77 says:

    While I don’t know enough about the equations of thermodynamics to say one way or the other whether Dr. Sheldon’s claim, i.e. ‘Increasing the heat energy leads to decreasing the information’, is right from the mathematical, i.e. theoretical, perspective, I do know that his claim is right from the empirical perspective.

    Simply put, just pouring raw energy into a system (as with the sun pouring energy onto the earth, or as with boiling water) actually increases the disorder of a prebiotic system, i.e., less biological information.

    Nick Lane Takes on the Origin of Life and DNA – Jonathan McLatchie – July 2010
    Excerpt: numerous problems abound for the hydrothermal vent hypothesis for the origin of life,,,, For example, as Stanley Miller has pointed out, the polymers are “too unstable to exist in a hot prebiotic environment.” Miller has also noted that the RNA bases are destroyed very quickly in water when the water boils. Intense heating also has the tendency to degrade amino acids such as serine and threonine. A more damning problem lies in the fact that the homochirality of the amino acids is destroyed by heating.
    Of course, accounting for the required building blocks is an interesting problem, but from the vantage of ID proponents, it is only one of many problems facing materialistic accounts of the origin of life. After all, it is the sequential arrangement of the chemical constituents — whether that happens to be amino acids in proteins, or nucleotides in DNA or RNA — to form complex specified information (a process which requires the production of specified irregularity), which compellingly points toward the activity of rational deliberation (Intelligence).
    http://www.evolutionnews.org/2.....36101.html

    Refutation Of Hyperthermophile Origin Of Life scenario
    Excerpt: While life, if appropriately designed, can survive under extreme physical and chemical conditions, it cannot originate under those conditions. High temperatures are especially catastrophic for evolutionary models. The higher the temperature climbs, the shorter the half-life for all the crucial building block molecules,
    http://www.reasons.org/LateHea.....iginofLife

    The origin of life–did it occur at high temperatures?
    Excerpt: Prebiotic chemistry points to a low-temperature origin because most biochemicals decompose rather rapidly at temperatures of 100 degrees C (e.g., half-lives are 73 min for ribose, 21 days for cytosine, and 204 days for adenine).
    http://www.ncbi.nlm.nih.gov/pubmed/11539558

    “Accordingly, Abelson(1966), Hull(1960), Sillen(1965), and many others have criticized the hypothesis that the primitive ocean, unlike the contemporary ocean, was a “thick soup” containing all of the micromolecules required for the next stage of molecular evolution. The concept of a primitive “thick soup” or “primordial broth” is one of the most persistent ideas at the same time that is most strongly contraindicated by thermodynamic reasoning and by lack of experimental support.”
    – Sidney Fox, Klaus Dose on page 37 in Molecular Evolution and the Origin of Life.

    “Despite bioenergetic and thermodynamic failings the 80-year-old concept of primordial soup remains central to mainstream thinking on the origin of life. But soup has no capacity for producing the energy vital for life.”
    William Martin – an evolutionary biologist – New Research Rejects 80-Year Theory of ‘Primordial Soup’ as the Origin of Life – Feb. 2010

    Chemist explores the membranous origins of the first living cell:
    Excerpt: Conditions in geothermal springs and similar extreme environments just do not favor membrane formation, which is inhibited or disrupted by acidity, dissolved salts, high temperatures, and calcium, iron, and magnesium ions. Furthermore, mineral surfaces in these clay-lined pools tend to remove phosphates and organic chemicals from the solution. “We have to face up to the biophysical facts of life,” Deamer said. “Hot, acidic hydrothermal systems are not conducive to self-assembly processes.”
    http://currents.ucsc.edu/05-06/04-03/deamer.asp

    Besides the implausibility of ‘hot’ origin-of-life scenarios, some Darwinists have instead promoted a ‘cold’ origin-of-life scenario in ice. Yet the cold scenario is also found to be implausible, for several reasons.

    The Peculiar Properties of Ice – August 2012
    Excerpt: As we reported here, there are some fundamental problems that still need to be addressed for an ice-packed origin of life to be feasible. A cold origin-of-life scenario may seem to solve some difficulties with molecular stability and nucleotide concentration, but the tradeoffs, including a slower reaction rate, make this scenario untenable.
    http://www.evolutionnews.org/2.....62861.html

    Nick Lane himself notes a ‘significant conceptual flaw’ in some origin of life research regarding ‘equilibrium’.

    Origin-of-Life Theorists Fail to Explain Chemical Signatures in the Cell – Casey Luskin – February 15, 2012
    Excerpt: (Nick) Lane also notes that the study has a significant conceptual flaw.
    “To suggest that the ionic composition of primordial cells should reflect the composition of the oceans is to suggest that cells are in equilibrium with their medium, which is close to saying that they are not alive,” Lane says. “Cells require dynamic disequilibrium — that is what being alive is all about.” …
    Our uniform experience affirms that specified information-whether inscribed hieroglyphics, written in a book, encoded in a radio signal, or produced in a simulation experiment-always arises from an intelligent source, from a mind and not a strictly material process.
    (Stephen Meyer – Signature in the Cell, p. 347)
    Per Evolution News and Views

    In this regard Nick Lane is right. Biological life is dramatically characterized by its thermodynamic disequilibrium with the environment, not by its equilibrium.

    Professor Harold Morowitz has shown that the Origin of Life ‘problem’ escalates dramatically over the oft-quoted 1 in 10^40,000 figure when working from a thermodynamic perspective:

    “The probability for the chance of formation of the smallest, simplest form of living organism known is 1 in 10^340,000,000. This number is 10 to the 340 millionth power! The size of this figure is truly staggering since there is only supposed to be approximately 10^80 (10 to the 80th power) electrons in the whole universe!”
    (Professor Harold Morowitz, Energy Flow In Biology pg. 99, Biophysicist of George Mason University)

    Dr. Morowitz did another probability calculation, working from the thermodynamic perspective with an already existing cell, and came up with this number:

    DID LIFE START BY CHANCE?
    Excerpt: Molecular biophysicist Harold Morowitz (Yale University) calculated the odds of life beginning under natural conditions (spontaneous generation). He calculated that, if one were to take the simplest living cell and break every chemical bond within it, the odds that the cell would reassemble under ideal natural conditions (the best possible chemical environment) would be one chance in 10^100,000,000,000. You will probably have trouble imagining a number so large, so Hugh Ross provides us with the following example. If all the matter in the Universe were converted into building blocks of life, and if assembly of these building blocks were attempted once a microsecond for the entire age of the universe, then instead of the odds being 1 in 10^100,000,000,000, they would be 1 in 10^99,999,999,916 (also of note: a 1 with 100 billion zeros following would fill approx. 20,000 encyclopedias).
    http://members.tripod.com/~Black_J/chance.html
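    As a sanity check on the Hugh Ross illustration quoted above, the exponent arithmetic can be reproduced in a few lines. The ~10^84 trial count is an assumption inferred from the quoted figures themselves, since 100,000,000,000 - 84 = 99,999,999,916:

    ```python
    # Sketch of the exponent arithmetic in the excerpt (assumption: the quoted
    # numbers imply roughly 10^84 total assembly attempts).
    odds_exponent = 100_000_000_000   # Morowitz's odds: 1 in 10^100,000,000,000
    trials_exponent = 84              # ~10^84 attempts (all matter, one try per microsecond)

    # N independent trials improve the odds of at least one success by a factor
    # of about N, so the base-10 exponent shrinks by log10(N) = 84.
    improved_exponent = odds_exponent - trials_exponent
    print(improved_exponent)  # 99999999916, the figure quoted in the excerpt
    ```

    Working in exponents rather than the numbers themselves is the only practical route here, since 10^100,000,000,000 cannot be represented directly.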

    Punctured cell will never reassemble – Jonathan Wells – 2:40 mark of video
    http://www.youtube.com/watch?v=WKoiivfe_mo

    Also of related interest is the information content derived for a ‘simple’ cell when working from a thermodynamic perspective:

    “Is there a real connection between entropy in physics and the entropy of information? ….The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental…”
    Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin]

    Biophysics – Information theory. Relation between information and entropy: – Setlow-Pollard, Ed. Addison Wesley
    Excerpt: Linschitz gave the figure 9.3 x 10^-12 cal/deg or 9.3 x 10^-12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz’ deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures.
    https://docs.google.com/document/d/18hO1bteXTPOqQtd2H12PI5wFFoTjwg8uBAU5N0nEQIE/edit
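    The Setlow-Pollard conversion quoted above can be checked with a minimal sketch, taking the quoted entropy figure (9.3 x 10^-12 cal/deg for one bacterial cell) and cal-to-joule factor (4.2) at face value:

    ```python
    import math

    # Convert a thermodynamic entropy S into Shannon bits via H = S / (k ln 2).
    K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
    CAL_TO_J = 4.2              # cal-to-joule factor used in the excerpt

    def entropy_to_bits(s_cal_per_deg):
        """Convert an entropy given in cal/deg into information content in bits."""
        s_joules_per_k = s_cal_per_deg * CAL_TO_J
        return s_joules_per_k / (K_BOLTZMANN * math.log(2))

    # Linschitz's figure for a single bacterial cell reproduces the excerpt's
    # ~4 x 10^12 bits.
    print(f"{entropy_to_bits(9.3e-12):.1e} bits")  # ≈ 4.1e+12 bits
    ```

    Note that the exponent must be negative (10^-12, a single cell's entropy is tiny); with 10^+12 cal/deg the formula would give roughly 10^36 bits, not the 4 x 10^12 bits the excerpt reports.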

    “The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopedia Britannica.”
    Carl Sagan, “Life” in Encyclopedia Britannica: Macropaedia (1974 ed.), pp. 893-894

    HISTORY OF EVOLUTIONARY THEORY – WISTAR DESTROYS EVOLUTION
    Excerpt: A number of mathematicians, familiar with the biological problems, spoke at that 1966 Wistar Institute symposium. For example, Murray Eden showed that it would be impossible for even a single ordered pair of genes to be produced by DNA mutations in the bacterium E. coli, even with 5 billion years in which to produce it! His estimate was based on 5 trillion tons of the bacteria covering the planet to a depth of nearly an inch during that 5 billion years. He then explained that E. coli contain(s) over a trillion (10^12) bits of data. That is a 1 followed by 12 zeros. Eden then showed the mathematical impossibility of protein forming by chance.
    http://www.pathlights.com/ce_e.....hist12.htm

    Thus, to suggest that there is no conflict between thermodynamics and the increase of non-trivial biological information is simply to be out of touch with the empirical realities of the situation.

    “To get a range on the enormous challenges involved in bridging the gaping chasm between non-life and life, consider the following: ‘The difference between a mixture of simple chemicals and a bacterium is much more profound than the gulf between a bacterium and an elephant.’”
    (Dr. Robert Shapiro, Professor Emeritus of Chemistry, NYU) – The Theist holds the Intellectual High-Ground – March 2011

  5.
    Seversky says:

    So what is the definition of “information” in these cases? It’s being bandied about as if it’s some sort of widely-accepted unitary concept but it sounds nothing like what I would recognize as information.

  6.
    bornagain77 says:

    Seversky, I believe that Dr. Sheldon and Gordon Davisson are talking about Shannon information whereas I’m looking for an increase in non-trivial functional biological information.

    Three subsets of sequence complexity and their relevance to biopolymeric information – Abel, Trevors
    Excerpt: Three qualitative kinds of sequence complexity exist: random (RSC), ordered (OSC), and functional (FSC)… Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC)…
    http://www.tbiomed.com/content/2/1/29

    Measuring the functional sequence complexity of proteins – Kirk K Durston, David KY Chiu, David L Abel and Jack T Trevors – 2007
    Excerpt: We have extended Shannon uncertainty by incorporating the data variable with a functionality variable. The resulting measured unit, which we call Functional bit (Fit), is calculated from the sequence data jointly with the defined functionality variable. To demonstrate the relevance to functional bioinformatics, a method to measure functional sequence complexity was developed and applied to 35 protein families.,,,
    http://www.tbiomed.com/content/4/1/47
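    The “Fit” measure in the Durston et al. excerpt can be illustrated with a toy sketch. This is my own simplified reading, on the assumption that functional bits are computed as the drop in per-site Shannon uncertainty between a uniform null state (all 20 amino acids equiprobable) and the observed residue distribution at each aligned site of a functional protein family:

    ```python
    import math
    from collections import Counter

    # The 20 standard amino acids define the uniform null state.
    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def site_uncertainty(column):
        """Shannon uncertainty (bits) of one aligned column of residues."""
        counts = Counter(column)
        total = len(column)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    def functional_bits(alignment):
        """Sum over sites of (null uncertainty - observed uncertainty)."""
        h_null = math.log2(len(AMINO_ACIDS))  # ~4.32 bits per site
        return sum(h_null - site_uncertainty(col) for col in zip(*alignment))

    # Toy alignment of four sequences: site 1 fully conserved, site 2 variable.
    alignment = ["MK", "MK", "MR", "MK"]
    print(round(functional_bits(alignment), 2))  # ≈ 7.83 bits
    ```

    The fully conserved site contributes the maximum ~4.32 bits, while the variable site contributes less; a site free to take any residue would contribute zero, which is the sense in which the measure tracks functional constraint rather than mere sequence length.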

    Mutations, epigenetics and the question of information
    Excerpt: By definition, a mutation in a gene results in a new allele. There is no question that mutation (defined as any change in the DNA sequence) can increase variety in a population. However, it is not obvious that this necessarily means there is an increase in genomic information. If one attempts to apply Shannon’s theory of information, then this can be viewed as an increase. However, Shannon’s theory was not developed to address biological information. It is entirely unsuitable for this since an increase of information by Shannon’s definition can easily be lethal.
    http://creation.com/mutations-.....nformation

    The Law of Physicodynamic Incompleteness – David L. Abel – 2011
    Excerpt: “If decision-node programming selections are made randomly or by law rather than with purposeful intent, no non-trivial (sophisticated) function will spontaneously arise.”
    If only one exception to this null hypothesis were published, the hypothesis would be falsified. Falsification would require an experiment devoid of behind-the-scenes steering. Any artificial selection hidden in the experimental design would disqualify the experimental falsification. After ten years of continual republication of the null hypothesis with appeals for falsification, no falsification has been provided.
    The time has come to extend this null hypothesis into a formal scientific prediction:
    “No non trivial algorithmic/computational utility will ever arise from chance and/or necessity alone.”
    https://www.academia.edu/11759341/Physicodynamic_Incompleteness_-_Scirus_Sci-Topic_Page

  7.
    EDTA says:

    The same amount of heat energy at higher temperature has more information than the same heat at lower temperature.

    Huh? The same amount of heat energy at a higher temperature??? Did I miss something there?
