
UD Guest Post: Dr Eugen S on the second law of thermodynamics (plus . . . ) vs. “evolution”


Our Physicist and Computer Scientist from Russia — and each element of that balance is very relevant — is back, with more.  MOAR, in fact. This time, he tackles the “terror-fitted depths” of thermodynamics and biosemiotics. (NB: Those needing a backgrounder may find an old UD post here and a more recent one here, helpful.)

More rich food for thought for the science-hungry masses, red hot off the press:


>>On the Second Law of Thermodynamics in the context of the origin of life


Rudolf Clausius (1822-1888)

This note was motivated by my discussions with Russian interlocutors. One of UD readers here has asked me to produce a summary of those discussions, which I am happy to do now. I hope it will be interesting to all UD readers.

Unfortunately, one sometimes sees erroneous interpretations of thermodynamic results, especially those concerning the famous second law. In discussions with my Russian readers I have encountered rather forthright statements about the alleged sufficiency of thermodynamics by itself to prove the existence of God, and even claims that the concept of thermodynamic fluctuations of a macrostate is all wrong. One anonymous reader put it like this: thermodynamics has been polluted with fluctuations.

To show that such claims are incorrect, I will briefly quote various open sources on thermodynamics and elaborate on them a bit.

At the same time, I will also try to show that argumentation from thermodynamics alone is insufficient to demonstrate the incorrectness of the materialist view on the origin of life (which view is necessarily reductionist). Even though I will focus my attention only on thermodynamics, I believe this holds for any exclusively physicalist view of life as, to put it in the words of Howard Pattee, motion of the particles of matter. I think that in order to show the inadequacy of reductionist views of the origin of life, one should take a system-level view. Only at a system level can one fully appreciate the irreducible complexity of life, as I attempt to argue below.

While in the first part of my note I expose some frequent errors in interpreting thermodynamics, made mainly by theists, the second part is aimed at dismantling some materialist objections. I do not claim to be an expert in thermodynamics, so any constructive criticism of my views here is welcome.

I apologize in advance that some of my links are to Russian sources.

[–> actually, this is an asset here at UD; the ID debate in the Russian language zone of cyberspace demonstrates beyond responsible doubt, that the issue is not just an over-spilling of American politics post some 1980’s USSC decision. NCSE, et al, kindly, take due note; KF]

They are there mainly because they exist in the original Russian version of this note.

Classical thermodynamics and fluctuations of state

Let us start with a quote from here (translation mine):

In contrast to thermodynamics, statistical mechanics deals with a specific class of processes: fluctuations, whereby a system changes its state from a more probable to a less probable one, which consequently reduces its entropy. The presence of fluctuations demonstrates that the second law, which states that in an isolated thermodynamic system entropy does not decrease, holds only statistically, i.e. on average over large enough intervals of time.

So? Of course the second law holds. Eventually all entropy differentials in an isolated thermodynamic system will be evened out, and its entropy will reach its maximum, in full accordance with the second law. However, there are two things I want to comment on.

1. An extrapolation of the results of classical thermodynamics onto the entire universe is unwarranted.

We cannot really extend the logic of classical thermodynamics to the entire universe (see here and here). Rudolf Clausius came to his conclusion about the heat death of the universe based on the specific assumptions of his theory, including the assumption that the universe is an isolated thermodynamic system. However, the universe is not a thermodynamic system at all, since assumptions such as the existence of thermodynamic equilibrium, the additivity of energy and ergodicity do not hold for it as a whole (see here for details). Contemporary physics regards the heat death of the universe as only a hypothesis, one valid only within the classical theory due to Clausius. We know today that thermodynamic systems may depart very far from equilibrium, so far that applying the classical theory to such processes is not valid.

[A bit of context may help; here, the Hertzsprung-Russell view of H-rich gas balls:]

[This is generally seen as a context in which, based on the balance of gravity and radiation pressure, stars form, unfold across a life cycle and die, some with a supernova bang, others with a white dwarf whimper. So, across time the observed cosmos moves to ever more degradation of energy, plausibly (but not unquestionably) ending in a heat death scenario:]

The Big Bang timeline — a world with a beginning

[Issues over fluctuations and cycles etc as ES raises below then come to bear.]

2. Even though the second law is universal and fundamental, local short-term fluctuations of state do occur, since the second law is formulated statistically, for large enough numbers of molecules and long enough periods of time.

As one textbook stated, the second law does not only admit fluctuations but in fact predicts them.

Small local fluctuations of macrostate are consequently a reality, which does not go contrary to the universality of the second law. For example, we can observe fluctuations of temperature, density or entropy. This is a manifestation of the law of large numbers. Locally, there can and will be departures from the mathematical expectations, but on the whole the law of large numbers levels off all such local departures from the theoretical mean. As a result, the sample mean will tend to its theoretical value as the sample is made more and more representative. The same happens with entropic fluctuations. Even in an isolated system whose total entropy is at its maximum, entropy may fluctuate locally. When we integrate over the entire volume, the influence of those fluctuations is found to be negligible, while the entropy of the whole behaves in accordance with the second law.
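[A quick numerical sketch of this law-of-large-numbers point may help. The toy Python script below is illustrative only (the random "particle energies" and sample sizes are assumptions, not figures from the OP): it measures how much the mean of N random values wobbles from snapshot to snapshot, and shows that the wobble shrinks roughly as 1/sqrt(N), just as local fluctuations vanish when averaged over a large volume:]

```python
import random
import statistics

def mean_fluctuation(n_particles, trials=200, seed=1):
    """Std. deviation of the sample-mean 'energy' across many snapshots.

    Each snapshot draws n_particles uniform random 'energies' and takes
    their mean; the spread of those means shrinks like 1/sqrt(N).
    """
    rng = random.Random(seed)
    means = [statistics.fmean(rng.random() for _ in range(n_particles))
             for _ in range(trials)]
    return statistics.stdev(means)

for n in (10, 100, 1000, 10000):
    print(n, mean_fluctuation(n))
```

Each tenfold increase in N shrinks the fluctuation of the mean by roughly a factor of sqrt(10); for anything near Avogadro-scale numbers of particles the relative fluctuation is utterly negligible.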

Here lies one of the frequent misunderstandings between us and materialists, at least in the Russian segment of the internet. They flag up frequent incorrect interpretations of the second law, and rightly so. Unfortunately, it must be stated that on the basis of thermodynamics alone one cannot show that their reductionist claim is wrong. E.g. we cannot say that evolution as such is contrary to the second law.

So, can there be fluctuations? Yes! But materialists go further and consider life to be such a fluctuation. It is persistent due to replication, but its persistence will eventually deteriorate. Yes, materialists say, there will be a time when all life eventually ceases to exist (i.e. when there is no Sun around anymore). That is all in fine accord with the second law. And yet their conclusion about life is fallacious. However, the right argument with which to prove them wrong is not found at the level of thermodynamics.

Life viewed at a system level

The bad news is that in order to get our heads around the origin of life, at least conceptually, physics is not enough. If I pierce a computer network cable with voltage-measuring equipment, I will, of course, detect voltage jumps. What I will not be able to do, without bringing to the table all that the computer network is doing, is understand the meaning of these voltage jumps. Without the system-level knowledge, the computer will remain a black box for me:


Materialists are wrong not in saying state fluctuations occur. They (at least those of them who subscribe to the reductionist picture of life as exclusively chemistry) are wrong in considering life’s origin a spontaneous fluctuation. They get inside the cell and say: look, it is all chemistry! Of course, who said it isn’t chemistry?! Living cells are material objects! They, as any other material body, consist of particles of matter subject to natural regularities.

But the point is that life is not only physics or chemistry! Life is about functional organization (not mere order) whereby its parts are integrated into a whole and act in such a way that the whole persists long enough without disintegration by maintaining its autonomy, metabolizing, replicating and responding to stimuli. In order to see that materialists commit a fallacy (or knowingly play word games) it is necessary to walk up a level of scientific abstraction and examine what kind of ‘fluctuation’ life is.

[Cf. a brief discussion of fluctuations, here. This shows a basic result (bear in mind the Avogadro number is 6.02 x 10^23 molecular-scale particles), which we may clip per fair use:]

It turns out that this fluctuation is indeed special even to the point of putting away as inadequate the language of statistical mechanics! Here is why. In order to act against the tendency of entropy to increase, life must be organized so that it replicates with minimal information losses. This means, according to Shannon’s theorem for a noisy channel, that, provided the channel capacity is not exceeded during transfer, minimal losses are achieved when information is encoded using a discrete digital code: the genetic code meets that requirement!
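[The appeal to Shannon's noisy-channel theorem can be made concrete with the standard textbook example of a binary symmetric channel. The sketch below (standard information theory, offered only as an illustration of the claim above) computes its capacity C = 1 - H(p); the theorem guarantees that any transmission rate below C can be achieved with arbitrarily small error probability by a suitable discrete code:]

```python
from math import log2

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p.

    Shannon's noisy-channel coding theorem: any rate below this capacity
    is achievable with arbitrarily small error using a discrete code.
    """
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per symbol
print(bsc_capacity(0.1))   # ~0.531 bits per symbol despite 10% bit flips
```

The point being leaned on in the OP is the converse direction: reliable, low-loss copying in the presence of noise is exactly what discrete digital codes are good at, and what analog encodings are not.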



Yockey’s analysis of protein synthesis as a code-based communication process

[Protein Synthesis:]

Protein Synthesis (HT: Wiki Media)

Furthermore, using a code means this ‘fluctuation’ must not only write its own description into memory but must also be able to encode memory itself.

In other words, to replicate, a non-homogeneous system (in contrast to homogeneous systems such as crystals) must be able to read/write its own symbolic description from/to memory. This means that a living cell must have a system for information translation. What a spontaneous fluctuation! Clever indeed…

Fluctuations do not build adapters!

As [longtime UD commenter and now occasional contributor] Mung once aptly put it (alluding to Francis Crick’s famous phrase that life is a ‘frozen accident’), accidents do not build adaptors! Mung was referring to the adaptor hypothesis, formulated by Crick in 1955 and so brilliantly validated afterwards. During genetic code translation, life uses adaptors responsible for maintaining the unambiguous correspondence between the nucleotide instructions written on messenger RNAs and their amino acid meanings expressed in synthesized protein macromolecules.

Imagine loads of nanometre-scale plugs being inserted into their respective sockets at a high rate (variable, with a maximum of about 20 amino acid residues attached to the polypeptide per second), such that a ‘wrong’ amino acid, one which does not correspond to the mRNA codon being processed, will not be inserted into the protein molecule.

Step by step protein synthesis in action, in the ribosome, based on the sequence of codes in the mRNA control tape (Courtesy, Wikipedia and LadyofHats)

This means only one thing: engineering foresight must have been at play here, over and above the mere regularities of nature. Indeed, it was necessary to ensure that the formal correspondence, also known as the genetic code, between a token (an mRNA codon) and its referent amino acid be maintained. This correspondence carries absolutely no bias towards or against any physical aspect of its implementation! The only possibility for this to have happened is by intelligence. No other way! Non-living matter is void of any foresight or decision-making capability and consequently cannot account for the organization of life!
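[The arbitrariness of the token-to-referent correspondence can be illustrated with a toy translator. The few codon assignments below are real entries of the standard genetic code, but the code itself is presented here only as a lookup table, which is exactly the point: chemistry alone would be equally consistent with any other table, and the mapping is maintained by the tRNA adaptors rather than dictated by physical attraction:]

```python
# A few real entries of the standard genetic code (codon -> amino acid);
# the full table has 64 codons.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "GCU": "Ala",
    "UGG": "Trp", "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(mrna):
    """Read an mRNA string three letters at a time, mapping each codon to
    its amino acid via the table -- a software analogue of the adaptor
    role played by tRNAs in the ribosome."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

Swapping any two values in the dictionary yields a different but equally "physical" code, which is the sense in which the correspondence is symbolic rather than chemically determined.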

In the whole of the observed universe, things similar to life are achievable only by human intelligence. Examples of sign-processing systems are limited to mathematics, art, languages, ciphers, the motorway code and other human-made information processing systems, on the one hand, and … biological codes, on the other. Nothing else…

Consequently, we can hypothesize that biological codes also have intelligent origin.

Even today the whole of human technology collectively cannot match the grand design and implementation of life. So far, we have only been able to reverse-engineer some parts of this true piece of art and wonderfully organized artifact.

To create life, from the point of view of systems theory and information theory, it was necessary to organize the complex {code + protocol + translator}, the whole of it and at once, simply because a code without its complementary translator is useless rubbish and, likewise, a translator without a code is a meaningless pile of junk. Furthermore, a protocol is a set of rules, i.e. a non-physical thing, a logical correspondence between material entities that have no physical or chemical attraction or bias of any sort towards one another. How could that arise by purely physicalist means?! Total nonsense!

No physical fluctuation of macrostate, of entropy, density, pressure or temperature, can achieve it. The first ever living organism was already not reducible to its constituents. A gradual path to its assembly starting from a spontaneous fluctuation of state is non-existent in this world. It is a myth. Nature only permits the creation of information processing systems while being itself indifferent to information processing; indifferent in the same sense as a spherical mass at equilibrium on a frictionless horizontal plane.

If we take a look at how life is organized, we shall see that its organization is purposefully directed at counteracting entropic increase. Death is the unavoidable end of this wrestling as far as an individual organism is concerned. Before dying, though, an organism passes on life to the next generation, which is organized in the same way. “The fight against the second law” is realized as replication. Although every new organism and even the replication mechanism itself are subject to degradation over time (the latter at a much slower rate than the former), the heart of life, so to speak, is in replication!

[Cf. the von Neumann kinematic self replicator (c. 1948), vNSR:]

[Also, Mignea on cell-based self-replication (2012):]

Nothing of the sort is observed in inanimate nature as a whole complex. Various individual elements of this complex are observed, such as crystal growth/replication for instance. However, for a crystal to grow, information translation and codes are not necessary. Crystals grow mechanically, as matrix bulk-copying of layer upon layer of lattice, in line with the minimum total potential energy principle (see my previous OP). In contrast, the genetic code cannot be explained solely by this principle, since the essence of translation is the processing of tokens that only evoke physical effects without determining what those effects should be! Tokens impose boundary conditions on the system dynamics.

As a matter of fact, biosemiotics, the discipline that describes sign processing in biological systems, views the token, not the gene, as the unit of life. According to biosemiotics (cf. here and here), life is physics coupled with specific symbolic boundary conditions, which include organized behaviour for working with symbolic memory.

This is where our materialist interlocutors are in error or decidedly playing word games.

Some of them, educated enough to appreciate the predicament of their worldview, find nothing better than to question the objectivity of information as a scientifically detectable phenomenon. That is understandable: if you think about it, they have no other option. Playing the fool is a lot more comfortable than acknowledging the obvious, i.e. that life has an intelligent origin!

So, non-living matter is missing the important ingredient that is key to the organization of life: it is absolutely void of any information translation. That must be the emphasis in our discussions with naturalists. Unfortunately, from the point of view of mere thermodynamics, life is indistinguishable from non-life: in both there are physical interactions of particles; in both non-living and living systems energy is spent and entropy tends to a maximum. After all, both life and non-life use the same chemical elements from the same periodic table.>>


As promised, rich food for thought.

Let us now discuss what our Russian colleagues have to say. END

16 Replies to “UD Guest Post: Dr Eugen S on the second law of thermodynamics (plus . . . ) vs. “evolution””

  1. 1
    kairosfocus says:

    More — nay, MOAR — from Dr Eugen S of Russia. This time he tackles thermodynamics and information systems. Serious food for thought from the Russian language ID debate.

  2. 2

    Congratulations again, Evgeny

  3. 3
  4. 4
    EugeneS says:

    KF, UB,

    Thanks very much.

    What I also wanted to discuss originally is Zermelo-Poincare recurrence but I removed references to it from this posted version.

    I have a good friend who graduated from the same university as I did. He is an Orthodox Christian priest, like myself, and holds a doctorate in theology. He and I often discuss physics from a theological perspective. His opinion is invaluable to me in many respects, in particular as that of an expert in theology.

    Recently, we discussed the recurrence paradox. And I would really like to hear people’s opinions on it.

    I also post on some Russian blogs where the bulk of the audience are theists with no scientific background. One reader said something like this: Zermelo-Poincaré recurrence is rubbish; a drop of ink dissolved in water will never reassemble into a drop by itself.

    Poincaré’s theorem states that, given a delta of state, systems with certain properties (essentially, measure-preserving dynamics on a bounded state space) will eventually pass through a state within that delta of the initial state. The time required is called the Poincaré cycle. The original theorem was formulated in celestial mechanics; Zermelo adapted it to statistical mechanics. This is a paradox because we never observe recurrent behaviour in real large systems: entropy stays at its maximum once it gets there.

    In one source, I saw an interpretation of this paradox: the Poincaré cycle for any practical system is so large that we will never be able to observe it, so there is no real paradox. And yes, recurrence does exist and is observed in small systems.

    The question is, does it suggest that cyclicity is somehow ingrained in nature? In that case it poses a theological problem, since eternal cyclicity is an atomistic concept originating in pagan philosophy. E.g. Boltzmann viewed the entire universe as a gigantic fluctuation. Add cyclicity to it and you get ancient mythology 🙂 Do people see where I am coming from?
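[The small-system side of the recurrence point can be illustrated with the Ehrenfest urn model, a standard statistical-mechanics toy not mentioned in the comment: N balls sit in two urns, and at each step one randomly chosen ball changes urns. By Kac's lemma the mean recurrence time of the all-balls-in-one-urn state is 2^N steps, negligible for a handful of balls but astronomically long for anything macroscopic. A minimal sketch, with an assumed seed for reproducibility:]

```python
import random

def ehrenfest_recurrence(n_balls, seed=0, max_steps=10**7):
    """Ehrenfest urn model: count steps until the initial state
    (all balls in urn A) recurs. Returns None if not seen in time."""
    random.seed(seed)
    in_a = n_balls  # all balls start in urn A
    for step in range(1, max_steps + 1):
        # pick a uniformly random ball; it jumps to the other urn
        if random.randrange(n_balls) < in_a:
            in_a -= 1
        else:
            in_a += 1
        if in_a == n_balls:
            return step
    return None

print(ehrenfest_recurrence(4))   # a small system recurs within a few steps
print(2**100)                    # mean-recurrence scale for just 100 'molecules'
```

For four balls the recurrence arrives almost immediately; for even a hundred "molecules" the expected wait is about 10^30 steps, which is the sense in which the paradox has no practical force for macroscopic systems.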

  5. 5
    kairosfocus says:


    Walker and Davies:

    In physics, particularly in statistical mechanics, we base many of our calculations on the assumption of metric transitivity, which asserts that a system’s trajectory will eventually [–> given “enough time and search resources”] explore the entirety of its state space – thus everything that is physically possible will eventually happen. It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense.

    We are left in a bit of a conundrum with respect to the problem of specifying the initial conditions necessary to explain our world. A key point is that if we require specialness in our initial state (such that we observe the current state of the world and not any other state) metric transitivity cannot hold true, as it blurs any dependency on initial conditions – that is, it makes little sense for us to single out any particular state as special by calling it the ’initial’ state. If we instead relax the assumption of metric transitivity (which seems more realistic for many real world physical systems – including life), then our phase space will consist of isolated pocket regions and it is not necessarily possible to get to any other physically possible state (see e.g. Fig. 1 for a cellular automata example).

    [–> or, there may not be “enough” time and/or resources for the relevant exploration, i.e. we see the 500 – 1,000 bit complexity threshold at work vs 10^57 – 10^80 atoms with fast rxn rates at about 10^-13 to 10^-15 s leading to inability to explore more than a vanishingly small fraction of the gamut of the Sol system or observed cosmos . . . the only actually, credibly observed cosmos]

    Thus the initial state must be tuned to be in the region of phase space in which we find ourselves [–> notice, fine tuning], and there are regions of the configuration space our physical universe would be excluded from accessing, even if those states may be equally consistent and permissible under the microscopic laws of physics (starting from a different initial state). Thus according to the standard picture, we require special initial conditions to explain the complexity of the world, but also have a sense that we should not be on a particularly special trajectory to get here (or anywhere else) as it would be a sign of fine–tuning of the initial conditions. [ –> notice, the “loading”] Stated most simply, a potential problem with the way we currently formulate physics is that you can’t necessarily get everywhere from anywhere (see Walker [31] for discussion). [“The “Hard Problem” of Life,” June 23, 2016, a discussion by Sara Imari Walker and Paul C.W. Davies at Arxiv.]
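[The bracketed arithmetic on the 500-bit threshold can be checked directly. The round numbers below (roughly 10^80 atoms in the observed cosmos, roughly 10^14 state changes per second, roughly 10^17 seconds of cosmic history) are the assumptions stated in the interleaved note, not exact figures:]

```python
# Rough check of the search-resources arithmetic in the bracketed note.
atoms = 1e80        # assumed atom count of the observed cosmos
rate = 1e14         # assumed fast state changes per atom per second
seconds = 1e17      # assumed age of the cosmos in seconds

total_events = atoms * rate * seconds   # ~1e111 elementary 'search' events
config_space = 2.0 ** 500               # states spanned by 500 bits, ~3.3e150

print(total_events)
print(config_space)
print(total_events / config_space)      # explorable fraction, ~3e-40
```

On these assumptions the whole cosmos acting as a search engine could sample only about one part in 10^39 of a 500-bit configuration space, which is the sense in which the threshold is invoked.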

    I add: plausibly, cosmological run-down is also implicit in our circumstances. So the cycle becomes like the mathematical point that a circle with an infinite radius of curvature is indistinguishable from a straight line. A mathematical consequence of collection-of-particles dynamics.

    We do not live in a Stoic world of cycles.

    Which cycles, each being of in-principle finite duration, would impose an infinite succession of finite stages. We cannot traverse that, and there is no infinite past. Looking ahead, this space-time domain may be potentially infinite, but it will not traverse an actual infinity in a stepwise succession of cycles.

    We are forced to seek a finitely remote past origin, and we face a root of reality as a being of different order, a necessary being capable of causing a world.


  6. 6
    EugeneS says:


    Greatly appreciated!

  7. 7
    kairosfocus says:

    ES, I augmented the OP a bit so those needing a spot of help can get a leg up. KF

  8. 8
    johnnyb says:

    Just to point out, you can read Mignea’s discussion of self-replication in this book. It and the other papers in the book are well worth reading.

  9. 9
    bornagain77 says:

    as to this comment on ‘fluctuations’:

    Materialists are wrong not in saying state fluctuations occur. They (at least those of them who subscribe to the reductionist picture of life as exclusively chemistry) are wrong in considering life’s origin a spontaneous fluctuation. They get inside the cell and say: look, it is all chemistry! Of course, who said it isn’t chemistry?! Living cells are material objects! They, as any other material body, consist of particles of matter subject to natural regularities.
    But the point is that life is not only physics or chemistry! Life is about functional organization (not mere order) whereby its parts are integrated into a whole and act in such a way that the whole persists long enough without disintegration by maintaining its autonomy, metabolizing, replicating and responding to stimuli. In order to see that materialists commit a fallacy (or knowingly play word games) it is necessary to walk up a level of scientific abstraction and examine what kind of ‘fluctuation’ life is.,,,
    It turns out that this fluctuation is indeed special even to the point of putting away as inadequate the language of statistical mechanics! Here is why. In order to act against the tendency of entropy to increase, life must be organized so that it replicates with minimal information losses. This means, according to Shannon’s theorem for a noisy channel, that, provided the channel capacity is not exceeded during transfer, minimal losses are achieved when information is encoded using a discrete digital code: the genetic code meets that requirement!

    Although the recognition that a ‘discrete digital code’ is necessary to explain why life ‘replicates with minimal information losses’, and thus why life is able to ‘act against the tendency of entropy to increase’, is correct, in my honest opinion it does not go nearly far enough in explaining life’s unique relationship with entropy.

    I hold that in order to fully explain life’s relationship with entropy, it is necessary to recognize that information is its own distinct physical entity, separate from matter and energy, and that information is not merely to be defined as a particular arrangement of matter and energy.

    In clarifying this important point, it is first important to note that “the equations of information theory and the second law are the same”:

    “Is there a real connection between entropy in physics and the entropy of information? ….The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental…”
    Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin]

    Second it is important, using those equations, to note just how far out of thermodynamic equilibrium life actually is from an information perspective,

    Biophysics – Information theory. Relation between information and entropy: – Setlow-Pollard, Ed. Addison Wesley
    Excerpt: Linschitz gave the figure 9.3 x 10^-12 cal/deg, or 9.3 x 10^-12 x 4.2 joules/deg, for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz’s deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures.
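[The Setlow-Pollard arithmetic can be verified in a few lines. The inputs below are Linschitz's figure of 9.3 x 10^-12 cal/deg for the entropy of a bacterial cell and the 4.2 J/cal conversion used in the excerpt:]

```python
from math import log

k_B = 1.380649e-23           # Boltzmann constant, J/K
S = 9.3e-12 * 4.2            # cell entropy: 9.3e-12 cal/deg times 4.2 J/cal
H = S / (k_B * log(2))       # H = S / (k ln 2), information content in bits

print(H)                     # ~4e12 bits, matching the quoted figure
```

The result is about 4.1 x 10^12 bits, which is the "4 x 10^12 bits" of the excerpt and the ~10^12-bit ballpark quoted by Wysong and Sagan below.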

    “a one-celled bacterium, E. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information-science jargon, this would be the same as 10^12 bits of information. In comparison, the total writings from classical Greek civilization amount to only 10^9 bits, and the largest libraries in the world – the British Museum, Oxford’s Bodleian Library, the New York Public Library, Harvard’s Widener Library, and the Moscow Lenin Library – have about 10 million volumes, or 10^12 bits.”
    – R. C. Wysong

    ‘The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopedia Britannica.”
    – Carl Sagan, “Life” in Encyclopedia Britannica: Macropaedia (1974 ed.), pp. 893-894

    Andy McIntosh, professor of thermodynamics and combustion theory at the University of Leeds, holds that it is non-material information that is constraining the cell to be so far out of thermodynamic equilibrium. More importantly, Dr. McIntosh holds that regarding information as independent of energy and matter ‘resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions’.

    Information and Thermodynamics in Living Systems – Andy C. McIntosh – 2013
    Excerpt: ,,, information is in fact non-material and that the coded information systems (such as, but not restricted to the coding of DNA in all living systems) is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which despite the efforts from alternative paradigms has not given a satisfactory explanation of the way information in systems operates.,,,

    As to Dr. McIntosh’s contention that it must be information, independent of matter and energy, which constrains life to be so far out of equilibrium: information has now been experimentally shown to have a ‘thermodynamic content’

    Maxwell’s demon demonstration (knowledge of a particle’s position) turns information into energy – November 2010
    Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the “Maxwell demon” thought experiment devised in 1867.,,, In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a “spiral-staircase-like” potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information.

    Demonic device converts information to energy – 2010
    Excerpt: “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski.

    Information: From Maxwell’s demon to Landauer’s eraser – Lutz and Ciliberto – Oct. 25, 2015 – Physics Today
    Excerpt: The above examples of gedanken-turned-real experiments provide a firm empirical foundation for the physics of information and tangible evidence of the intimate connection between information and energy. They have been followed by additional experiments and simulations along similar lines.12 (See, for example, Physics Today, August 2014, page 60.) Collectively, that body of experimental work further demonstrates the equivalence of information and thermodynamic entropies at thermal equilibrium.,,,
    (2008) Sagawa and Ueda’s (theoretical) result extends the second law to explicitly incorporate information; it shows that information, entropy, and energy should be treated on equal footings.
    J. Parrondo, J. Horowitz, and T. Sagawa. Thermodynamics of information.
    Nature Physics, 11:131-139, 2015.
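The information-to-energy conversion discussed in these excerpts can be made quantitative via Landauer's bound, which sets the minimum heat dissipated when one bit of information is erased. The following is a minimal illustrative calculation (the room-temperature value of 300 K and the 1 GB example are my own choices, not from the quoted papers):

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) of heat.
k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI redefinition)
T = 300.0           # room temperature in kelvin (illustrative choice)

e_bit = k_B * T * math.log(2)  # minimum energy cost per erased bit, in joules
print(f"Landauer limit at {T:.0f} K: {e_bit:.3e} J per bit")

# For scale: erasing one gigabyte (8e9 bits) at the Landauer limit
bits = 8e9
print(f"Erasing 1 GB at the limit: {bits * e_bit:.3e} J")
```

At room temperature this works out to roughly 3 x 10^-21 J per bit, which is the scale at which the Sano group's experiment and Jarzynski's equality operate.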

    Further clarifying the physical reality of information as distinct from matter and energy, it is important to note that in quantum mechanics it is quantum information that is primarily conserved, not matter and energy as in classical mechanics:

    Quantum no-hiding theorem experimentally confirmed for first time – 2011
    Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed. This concept stems from two fundamental theorems of quantum mechanics: the no-cloning theorem and the no-deleting theorem. A third and related theorem, called the no-hiding theorem, addresses information loss in the quantum world. According to the no-hiding theorem, if information is missing from one system (which may happen when the system interacts with the environment), then the information is simply residing somewhere else in the Universe; in other words, the missing information cannot be hidden in the correlations between a system and its environment.

    Quantum no-deleting theorem
    Excerpt: A stronger version of the no-cloning theorem and the no-deleting theorem provide permanence to quantum information. To create a copy one must import the information from some part of the universe and to delete a state one needs to export it to another part of the universe where it will continue to exist.

    In fact, in quantum mechanics, quantum entanglement is a ‘physical resource’ that can be used as a ‘quantum information channel’

    Quantum Entanglement and Information
    Quantum entanglement is a physical resource, like energy, associated with the peculiar nonclassical correlations that are possible between separated quantum systems. Entanglement can be measured, transformed, and purified. A pair of quantum systems in an entangled state can be used as a quantum information channel to perform computational and cryptographic tasks that are impossible for classical systems. The general study of the information-processing capabilities of quantum systems is the subject of quantum information theory.

  10.
    bornagain77 says:

    Quantum teleportation clearly demonstrates that quantum information is its own distinct physical entity, separate from matter and energy:

    Quantum Teleportation Enters the Real World – September 19, 2016
    Excerpt: Two separate teams of scientists have taken quantum teleportation from the lab into the real world.
    Researchers working in Calgary, Canada and Hefei, China, used existing fiber optics networks to transmit small units of information across cities via quantum entanglement — Einstein’s “spooky action at a distance.”,,,
    This isn’t teleportation in the “Star Trek” sense — the photons aren’t disappearing from one place and appearing in another. Instead, it’s the information that’s being teleported through quantum entanglement.,,,
    ,,, it is only the information that gets teleported from one place to another.

    Moreover, classical ‘digital’ information is found to be a subset of ‘non-local’, (i.e. beyond space and time), quantum entanglement/information by the following method:

    Quantum knowledge cools computers: New understanding of entropy – June 2011
    Excerpt: No heat, even a cooling effect;
    In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.”

    The researchers behind the preceding paper rightly claim that it demonstrates that ‘information is physical’, i.e. that information is a distinct physical entity separate from matter and energy:

    Scientists show how to erase information without using energy – January 2011
    Excerpt: Until now, scientists have thought that the process of erasing information requires energy. But a new study shows that, theoretically, information can be erased without using any energy at all.,,, “Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it (information) is physical has a broader context than that.”, Vaccaro explained.

    The claim that ‘information is physical’ in the preceding paper has now been experimentally verified:

    New Scientist astounds: Information is physical – May 13, 2016
    Excerpt: Recently came the most startling demonstration yet: a tiny machine powered purely by information, which chilled metal through the power of its knowledge. This seemingly magical device could put us on the road to new, more efficient nanoscale machines, a better understanding of the workings of life, and a more complete picture of perhaps our most fundamental theory of the physical world.

    Matter, energy… knowledge: – May 11, 2016
    Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?

    It is also important to note how far off base materialists have been in their ‘fluctuations are primary’ assumption about life. In the following video, the materialistic claim, by Carl Zimmer, that life is merely ‘barely constrained randomness’ is addressed and refuted:

    Molecular Biology – 19th Century Materialism meets 21st Century Quantum Mechanics – video

    The following paper came to my attention after I made the preceding video; it further exposes the fact that the materialistic assumption of ‘barely constrained randomness’ is wrong:

    Finding a lack of ‘random’ collisions in a crowded cell is a ‘counterintuitive surprise’ for researchers:
    Proteins put up with the roar of the crowd – June 23, 2016
    Excerpt: It gets mighty crowded around your DNA, but don’t worry: According to Rice University researchers, your proteins are nimble enough to find what they need.
    Rice theoretical scientists studying the mechanisms of protein-DNA interactions in live cells showed that crowding in cells doesn’t hamper protein binding as much as they thought it did.,,,
    If DNA can be likened to a library, it surely is a busy one. Molecules roam everywhere, floating in the cytoplasm and sticking to the tightly wound double helix. “People know that almost 90 percent of DNA is covered with proteins, such as polymerases, nucleosomes that compact two meters into one micron, and other protein molecules,” Kolomeisky said.,,,
    That makes it seem that proteins sliding along the strand would have a tough time binding, and it’s possible they sometimes get blocked. But the Rice team’s theory and simulations indicated that crowding agents usually move just as rapidly, sprinting out of the way.
    “If they move at the same speed, the molecules don’t bother each other,” Kolomeisky said. “Even if they’re covering a region, the blockers move away quickly so your protein can bind.”
    In previous research, the team determined that stationary obstacles sometimes help quicken a protein’s search for its target by limiting options. This time, the researchers sought to define how crowding both along DNA and in the cytoplasm influenced the process.
    “We may think everything’s fixed and frozen in cells, but it’s not,” Kolomeisky said. “Everything is moving.”,,,
    Floating proteins appear to find their targets quickly as well. “This was a surprise,” he said. “It’s counterintuitive, because one would think collisions between a protein and other molecules on DNA would slow it down. But the system is so dynamic (and so well designed?), it doesn’t appear to be an issue.”

    With regard to quantum mechanics, Jim Al-Khalili states that ‘Biologists have got off lightly in my view’:

    Jim Al-Khalili, at the 2:30 minute mark of the following video states,
    “,,and Physicists and Chemists have had a long time to try and get used to it (Quantum Mechanics). Biologists, on the other hand, have got off lightly in my view. They are very happy with their ball-and-stick models of molecules. The balls are the atoms. The sticks are the bonds between the atoms. And when they can’t build them physically in the lab, nowadays they have very powerful computers that will simulate a huge molecule.,, It doesn’t really require much in the way of quantum mechanics to explain it.”
    At the 6:52 minute mark of the video, Jim Al-Khalili goes on to state:
    “To paraphrase, (Erwin Schrödinger in his book “What Is Life”), he says at the molecular level living organisms have a certain order. A structure to them that’s very different from the random thermodynamic jostling of atoms and molecules in inanimate matter of the same complexity. In fact, living matter seems to behave in its order and its structure just like inanimate matter cooled down to near absolute zero. Where quantum effects play a very important role. There is something special about the structure, about the order, inside a living cell. So Schrödinger speculated that maybe quantum mechanics plays a role in life”.
    Jim Al-Khalili – Quantum biology – video

  11.
    bornagain77 says:

    As mentioned in the “19th Century Materialism meets 21st Century Quantum Mechanics” video, Quantum information is now found in a wide variety of biological molecules. Here are a few papers verifying that claim:

    “What happens is this classical information (of DNA) is embedded, sandwiched, into the quantum information (of DNA). And most likely this classical information is never accessed because it is inside all the quantum information. You can only access the quantum information or the electron clouds and the protons. So mathematically you can describe that as a quantum/classical state.”
    Elisabeth Rieper – Classical and Quantum Information in DNA – video (Longitudinal Quantum Information resides along the entire length of DNA discussed at the 19:30 minute mark; at 24:00 minute mark Dr Rieper remarks that practically the whole DNA molecule can be viewed as quantum information with classical information embedded within it)

    Classical and Quantum Information Channels in Protein Chain – Dj. Koruga, A. Tomić, Z. Ratkaj, L. Matija – 2006
    Abstract: Investigation of the properties of peptide plane in protein chain from both classical and quantum approach is presented. We calculated interatomic force constants for peptide plane and hydrogen bonds between peptide planes in protein chain. On the basis of force constants, displacements of each atom in peptide plane, and time of action we found that the value of the peptide plane action is close to the Planck constant. This indicates that peptide plane from the energy viewpoint possesses synergetic classical/quantum properties. Consideration of peptide planes in protein chain from information viewpoint also shows that protein chain possesses classical and quantum properties. So, it appears that protein chain behaves as a triple dual system: (1) structural – amino acids and peptide planes, (2) energy – classical and quantum state, and (3) information – classical and quantum coding. Based on experimental facts of protein chain, we proposed from the structure-energy-information viewpoint its synergetic code system.

    Quantum coherent-like state observed in a biological protein for the first time – October 13, 2015
    Excerpt: If you take certain atoms and make them almost as cold as they possibly can be, the atoms will fuse into a collective low-energy quantum state called a Bose-Einstein condensate. In 1968 physicist Herbert Fröhlich predicted that a similar process at a much higher temperature could concentrate all of the vibrational energy in a biological protein into its lowest-frequency vibrational mode. Now scientists in Sweden and Germany have the first experimental evidence of such so-called Fröhlich condensation (in proteins).,,,
    The real-world support for Fröhlich’s theory (for proteins) took so long to obtain because of the technical challenges of the experiment, Katona said.

    Quantum criticality in a wide range of important biomolecules
    Excerpt: “Most of the molecules taking part actively in biochemical processes are tuned exactly to the transition point and are critical conductors,” they say.
    That’s a discovery that is as important as it is unexpected. “These findings suggest an entirely new and universal mechanism of conductance in biology very different from the one used in electrical circuits.”
    The permutations of possible energy levels of biomolecules is huge so the possibility of finding even one that is in the quantum critical state by accident is mind-bogglingly small and, to all intents and purposes, impossible.,, of the order of 10^-50 of possible small biomolecules and even less for proteins,”,,,
    “what exactly is the advantage that criticality confers?”

    It is also important to reiterate that quantum information is a ‘non-local’ entity, beyond space and time, that simply refuses to be reduced to any conceivable explanation within space-time, matter, and energy:

    Looking beyond space and time to cope with quantum theory – 29 October 2012
    Excerpt: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,”

    Physicists find extreme violation of local realism in quantum hypergraph states – Lisa Zyga – March 4, 2016
    Excerpt: Many quantum technologies rely on quantum states that violate local realism, which means that they either violate locality (such as when entangled particles influence each other from far away) or realism (the assumption that quantum states have well-defined properties, independent of measurement), or possibly both. Violation of local realism is one of the many counterintuitive, yet experimentally supported, characteristics of the quantum world.
    Determining whether or not multiparticle quantum states violate local realism can be challenging. Now in a new paper, physicists have shown that a large family of multiparticle quantum states called hypergraph states violates local realism in many ways. The results suggest that these states may serve as useful resources for quantum technologies, such as quantum computers and detecting gravitational waves.,,,
    The physicists also showed that the greater the number of particles in a quantum hypergraph state, the more strongly it violates local realism, with the strength increasing exponentially with the number of particles. In addition, even if a quantum hypergraph state loses one of its particles, it continues to violate local realism. This robustness to particle loss is in stark contrast to other types of quantum states, which no longer violate local realism if they lose a particle. This property is particularly appealing for applications, since it might allow for more noise in experiments.

    Quantum correlations do not imply instant causation – August 12, 2016
    Excerpt: A research team led by a Heriot-Watt scientist has shown that the universe is even weirder than had previously been thought.
    In 2015 the universe was officially proven to be weird. After many decades of research, a series of experiments showed that distant, entangled objects can seemingly interact with each other through what Albert Einstein famously dismissed as “Spooky action at a distance”.
    A new experiment by an international team led by Heriot-Watt’s Dr Alessandro Fedrizzi has now found that the universe is even weirder than that: entangled objects do not cause each other to behave the way they do.

    Experimental test of nonlocal causality – August 10, 2016
    Previous work on causal explanations beyond local hidden-variable models focused on testing Leggett’s crypto-nonlocality (7, 42, 43), a class of models with a very specific choice of hidden variable that is unrelated to Bell’s local causality (44). In contrast, we make no assumptions on the form of the hidden variable and test all models ,,,
    Our results demonstrate that a causal influence from one measurement outcome to the other, which may be subluminal, superluminal, or even instantaneous, cannot explain the observed correlations.,,,

    Most importantly, besides providing direct empirical falsification of neo-Darwinian claims that information is emergent from a material basis, the implication of finding ‘non-local’ (beyond space and time) and ‘conserved’ quantum information in molecular biology on such a massive scale, in every DNA and protein molecule, is fairly, and pleasantly, obvious.
    That pleasant implication, of course, is that we now have fairly strong physical evidence suggesting that we do indeed have an eternal soul that lives beyond the death of our material bodies.

    “Let’s say the heart stops beating. The blood stops flowing. The microtubules lose their quantum state. But the quantum information, which is in the microtubules, isn’t destroyed. It can’t be destroyed. It just distributes and dissipates to the universe at large. If a patient is resuscitated, revived, this quantum information can go back into the microtubules and the patient says, “I had a near death experience. I saw a white light. I saw a tunnel. I saw my dead relatives.,,” Now if they’re not revived and the patient dies, then it’s possible that this quantum information can exist outside the body. Perhaps indefinitely as a soul.”
    – Stuart Hameroff – Quantum Entangled Consciousness – Life After Death – video (5:00 minute mark)


    John 1:1-4
    In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things were made through Him, and without Him nothing was made that was made. In Him was life, and the life was the light of men.

    Mark 8:37
    “Is anything worth more than your soul?”

  12.
    EugeneS says:


    Sure. It looks more thorough now 🙂

  13.
    EugeneS says:


    Thanks for pointing that out. I have got this book; it is well worth having another look.

  14.
    EugeneS says:

    I have spoken to a friend of mine, who is a specialist in statistical mechanics and is lecturing at Moscow Physics University. I have asked him how he reconciles the recurrence paradox with the second law.

    His point is very clear and simple (and, consequently, very likely to be true). Recurrence, he said, is predicted by the underlying mathematical model. Where we can keep as close to its assumptions as possible, it will be observed (given enough time, etc.), for example, where we can make sure that the system is isolated, which is hard to achieve in practice. As Eugene Wigner pointed out in his Nobel lecture, the laws of physics are conditional mathematical models. We will observe them in practice only if we make sure their assumptions hold.
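    This reconciliation of recurrence with the second law can be sketched numerically with the Ehrenfest urn model, a standard toy model of an isolated gas (my own illustrative choice, not part of the discussion above). With a handful of "molecules" the initial low-entropy macrostate recurs routinely; the mean recurrence time grows as 2^n, which is why no one ever observes a macroscopic recurrence even though the model predicts it:

```python
import random

def ehrenfest(n_balls=10, steps=200000, seed=1):
    """Ehrenfest urn model: at each step one ball, chosen uniformly at
    random, jumps between two urns. Starting from the low-entropy state
    (all balls in the left urn), count full recurrences of that state."""
    random.seed(seed)
    left = n_balls      # initial macrostate: every ball in the left urn
    returns = 0
    for _ in range(steps):
        # the chosen ball is in the left urn with probability left/n_balls
        if random.random() < left / n_balls:
            left -= 1   # it jumps right
        else:
            left += 1   # it jumps left
        if left == n_balls:
            returns += 1  # the initial macrostate has fully recurred
    return returns

# For n_balls=10 the mean recurrence time is 2**10 = 1024 steps, so
# recurrences are frequent; for n ~ 10**23 they never happen in practice.
print(ehrenfest())
```

The system relaxes toward the equal-occupation macrostate (second-law behaviour) yet still recurs, exactly because the model's assumption of perfect isolation is enforced by construction.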

  15.
    OLV says:

    Interrelationship Between Fractal Ornament and Multilevel Selection Theory

    Interdisciplinarity is one of the features of modern science, defined as blurring the boundaries of disciplines and overcoming their limitations or excessive specialization by borrowing methods from one discipline into another, integrating different theoretical assumptions, and using the same concepts and terms. Often, theoretical knowledge of one discipline and technological advances of another are combined within an interdisciplinary science, and new branches or disciplines may also emerge. Biosemiotics, a field that arose at the crossroads of biology, semiotics, linguistics, and philosophy, enables scientists to borrow theoretical assumptions from semiotics and extend them to different biological theories. The latter applies especially to extended synthesis, wherein culture is viewed as one of the factors influencing evolution. In the present research, the semiotic system of Ukrainian folk ornament is analyzed through the theory of fractals, key features of which are recursion and self-similarity. As a result, an assumption is made about the fractal structure of culture and social life on a conceptual level. What follows is a discussion of how this assumption can contribute to the multilevel selection theory, one of the foundations of extended synthesis, which employs the concept of self-similarity at all levels of the biological hierarchy.

  16.
    PeterA says:

    OLV @15:

    What is that? 🙂

    Anyway, glad you brought back this interesting thread.
