
Bill Dembski on the primacy of information for science


Recently, Bill Dembski, founder of Uncommon Descent, got together with Fred Skiff, chair of the physics department at the University of Iowa, to discuss themes from his book, Being as Communion:

Jon Garvey on William Dembski’s Being as Communion

Goodreads quotes from Being as Communion

The conversation examines why information is the most basic object of study in science and how Conservation of Information naturally leads to the conclusion that intelligence is the ultimate source of information.

Bill Dembski, “Conversation about the Primacy of Information for Science” at BillDembski.com

2 Replies to “Bill Dembski on the primacy of information for science”

  1. bornagain77 says:

    For me personally, I think that Dr. Dembski, in the video entitled “The Primacy of Information for Science,” could have taken the principle of “Conservation of Information” much further: from merely a mathematical proof into physics.

    For instance, in the video Dr. Dembski defines information as such:

    “Information is the reduction of possibilities within a reference class”
    William Dembski – The Primacy of Information for Science: – 29:45 minute mark – video
    https://youtu.be/PQQXjxdvuGs?t=1751
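Dembski’s definition lines up with the standard information-theoretic measure: reducing a reference class of N equally likely possibilities to M yields log2(N/M) bits. A minimal sketch (my own illustration, not from the video):

```python
import math

def info_bits(before: int, after: int) -> float:
    """Information gained, in bits, when a reference class of
    `before` equally likely possibilities is reduced to `after`."""
    return math.log2(before / after)

# Reducing a reference class of 8 possibilities to 1 yields 3 bits.
print(info_bits(8, 1))  # 3.0
```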

    And in relating information, i.e. “the reduction of possibilities within a reference class”, to physics, we find that in quantum physics the quantum wave function, prior to collapse, is mathematically defined as being in an ‘infinite-dimensional’ state:

    The Unreasonable Effectiveness of Mathematics in the Natural Sciences – Eugene Wigner – 1960
    Excerpt: We now have, in physics, two theories of great power and interest: the theory of quantum phenomena and the theory of relativity.,,, The two theories operate with different mathematical concepts: the four dimensional Riemann space and the infinite dimensional Hilbert space,
    http://www.dartmouth.edu/~matc.....igner.html

    Wave function
    Excerpt “wave functions form an abstract vector space”,,, This vector space is infinite-dimensional, because there is no finite set of functions which can be added together in various combinations to create every possible function.
    http://en.wikipedia.org/wiki/W.....ctor_space

    Moreover, we find that it takes an infinite amount of information to describe the quantum wave function prior to its collapse,

    Explaining Information Transfer in Quantum Teleportation: Armond Duwell †‡ University of Pittsburgh
    Excerpt: In contrast to a classical bit, the description of a (quantum) qubit requires an infinite amount of information. The amount of information is infinite because two real numbers are required in the expansion of the state vector of a two state quantum system (Jozsa 1997, 1)
    http://www.cas.umt.edu/phil/fa.....lPSA2K.pdf

    Quantum Computing – Stanford Encyclopedia
    Excerpt: Theoretically, a single qubit can store an infinite amount of information, yet when measured (and thus collapsing the Quantum Wave state) it yields only the classical result (0 or 1),,,
    http://plato.stanford.edu/entr.....tcomp/#2.1
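    The point in the Duwell and Stanford excerpts can be illustrated with a toy sketch (my own, not from the quoted sources): a pure qubit state is fixed by two real numbers, each a continuum requiring unlimited precision to specify exactly, yet measurement yields only a single classical bit.

```python
import math
import random

# A pure qubit state a|0> + b|1> (up to global phase) is fixed by two
# real numbers, e.g. the Bloch angles theta and phi -- each a continuum,
# hence the "infinite information" needed to specify the state exactly.
theta, phi = 1.2345678, 0.8765432
a = math.cos(theta / 2)                                          # amplitude of |0>
b = complex(math.cos(phi), math.sin(phi)) * math.sin(theta / 2)  # amplitude of |1>

assert abs(abs(a)**2 + abs(b)**2 - 1.0) < 1e-12  # state is normalized

def measure() -> int:
    """Measurement 'collapses' the continuum of amplitudes to one bit."""
    return 0 if random.random() < abs(a)**2 else 1

print(measure())  # 0 or 1 -- one classical bit, however precisely the state was set
```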

    Needless to say, the collapse of a wave function from an infinite-dimensional, infinite-information state to a single-bit state within the four-dimensional space-time of this universe is about the most dramatic example of a “reduction of possibilities within a reference class” that is possible.

    In further relating information, i.e. “the reduction of possibilities within a reference class”, to physics, we also find that, as far as entropy is concerned, the universe itself is a stark example of “the reduction of possibilities within a reference class.”

    Entropy is also, by a wide margin, the most finely tuned of the initial conditions of the Big Bang, tuned to an almost incomprehensible degree of precision: 1 part in 10 to the 10 to the 123rd power. As Roger Penrose himself stated, “This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123.”

    “This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123.”
    Roger Penrose – How special was the big bang? – (from the Emperor’s New Mind, Penrose, pp 339-345 – 1989)

    To give us a clue just how much of a “reduction of possibilities within a reference class” 1 in 10^10^123 actually is, in the following video Dr. Bruce Gordon states, “you would need a hundred million, trillion, trillion, trillion, universes our size, with a zero on every proton and neutron in all of those universes just to write out this number. That is how fine tuned the initial entropy of our universe is.”

    “An explosion you think of as kind of a messy event. And this is the point about entropy. The explosion in which our universe began was not a messy event. And if you talk about how messy it could have been, this is what the Penrose calculation is all about essentially. It looks at the observed statistical entropy in our universe. The entropy per baryon. And he calculates that out and he arrives at a certain figure. And then he calculates using the Bekenstein-Hawking formula for Black-Hole entropy what the,,, (what sort of entropy could have been associated with,,, the singularity that would have constituted the beginning of the universe). So you’ve got the numerator, the observed entropy, and the denominator, how big it (the entropy) could have been. And that fraction turns out to be,,, 1 over 10 to the 10 to the 123rd power. Let me just emphasize how big that denominator is so you can gain a real appreciation for how small that probability is. So there are 10^80th baryons in the universe. Protons and neutrons. Now suppose we put a zero on every one of those. OK, how many zeros is that? That is 10^80th zeros. This number has 10^123rd zeros. OK, so you would need a hundred million, trillion, trillion, trillion, universes our size, with a zero on every proton and neutron in all of those universes just to write out this number. That is how fine tuned the initial entropy of our universe is. And if there were a pre-Big Bang state and you had some bounces, then that fine tuning (for entropy) gets even finer as you go backwards if you can even imagine such a thing.”
    Dr Bruce Gordon – Contemporary Physics and God Part 2 – video – 1:50 minute mark – video
    https://youtu.be/ff_sNyGNSko?t=110
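    The exponent arithmetic behind Gordon’s illustration can be checked directly; a rough order-of-magnitude sketch of my own:

```python
# Order-of-magnitude check of Gordon's illustration.
# Writing out 10^(10^123) in decimal takes ~10^123 zeros; with one zero
# written on each baryon, and ~10^80 baryons per universe-sized volume,
# the number of universes needed is 10^(123 - 80).
zeros_needed = 123          # exponent: ~10^123 zeros to write out
baryons_per_universe = 80   # exponent: ~10^80 baryons per universe

universes = zeros_needed - baryons_per_universe
print(f"about 10^{universes} universes")  # about 10^43 universes,
# the same ballpark as Gordon's "hundred million, trillion, trillion, trillion"
```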

    Needless to say, entropy itself is certainly one heck of a “reduction of possibilities within a reference class”, and thus entropy, on top of quantum physics, also gives us compelling evidence, via William Dembski’s definition of information, that the universe must be ‘information theoretic’ in its foundational basis.

    In further solidifying the connection between information, entropy, and quantum mechanics, Tom Siegfried states that “The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental…”

    “Is there a real connection between entropy in physics and the entropy of information? ….The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental…”
    – Tom Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin]

    And now it has been demonstrated that ‘information has a thermodynamic content’.

    In the following 2010 experimental realization of Maxwell’s demon thought experiment, it was demonstrated that knowledge of a particle’s position can be converted into energy.

    Maxwell’s demon demonstration turns information into energy – November 2010
    Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the “Maxwell demon” thought experiment devised in 1867.,,, In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a “spiral-staircase-like” potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information.
    http://www.physorg.com/news/20.....nergy.html

    And as the following 2010 article stated about the preceding experiment, “This is a beautiful experimental demonstration that information has a thermodynamic content,”

    Demonic device converts information to energy – 2010
    Excerpt: “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski.
    http://www.scientificamerican......rts-inform
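    The theoretical ceiling for energy extracted (or erased) per bit of information is the Szilard/Landauer figure k_B·T·ln 2; a quick calculation of my own showing the scale at room temperature:

```python
import math

# Szilard/Landauer bound: maximum energy associated with one bit of
# information at temperature T is k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # room temperature, K

energy_per_bit = k_B * T * math.log(2)  # joules per bit
print(f"{energy_per_bit:.3e} J per bit")  # ~2.9e-21 J per bit
```

The figure is tiny, which is why the Sano experiment had to work at the nano scale to measure the conversion at all.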

    Moreover, the following 2011 paper goes even further and states that the entropy of a system is always dependent on how much knowledge an ‘observer’ has of the system. Specifically, “an object does not have a certain amount of entropy per se, instead an object’s entropy is always dependent on the observer. Applied to the example of deleting data, this means that if two individuals delete data in a memory and one has more knowledge of this data, she perceives the memory to have lower entropy and can then delete the memory using less energy.,,,”

    Quantum knowledge cools computers: New understanding of entropy – June 1, 2011
    Excerpt: Recent research by a team of physicists,,, describe,,, how the deletion of data, under certain conditions, can create a cooling effect instead of generating heat. The cooling effect appears when the strange quantum phenomenon of entanglement is invoked.,,,
    The new study revisits Landauer’s principle for cases when the values of the bits to be deleted may be known. When the memory content is known, it should be possible to delete the bits in such a manner that it is theoretically possible to re-create them. It has previously been shown that such reversible deletion would generate no heat. In the new paper, the researchers go a step further. They show that when the bits to be deleted are quantum-mechanically entangled with the state of an observer, then the observer could even withdraw heat from the system while deleting the bits. Entanglement links the observer’s state to that of the computer in such a way that they know more about the memory than is possible in classical physics.,,,
    In measuring entropy, one should bear in mind that an object does not have a certain amount of entropy per se, instead an object’s entropy is always dependent on the observer. Applied to the example of deleting data, this means that if two individuals delete data in a memory and one has more knowledge of this data, she perceives the memory to have lower entropy and can then delete the memory using less energy.,,,
    No heat, even a cooling effect;
    In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy.
    Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.”
    http://www.sciencedaily.com/re.....134300.htm

  2. bornagain77 says:

    And as the following 2017 article states, as James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.”,,,
    quantum information theory,,, describes the spread of information through quantum systems.,,,
    Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in (quantum) information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”,,,

    The Quantum Thermodynamics Revolution – May 2017
    Excerpt: the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.”
    In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory — “a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology — a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year — is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply.
    They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information — the abstract 1s and 0s by which physical states are distinguished and knowledge is measured.,,,
    Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in (quantum) information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”,,,
    https://www.quantamagazine.org/quantum-thermodynamics-revolution/

    What is so interesting about finding that entropy is not “a property of a system, but a property of an observer who describes a system” is that the information content found in a simple one-cell bacterium, when working from the thermodynamic perspective, is around 10^12 bits,,,

    Biophysics – Information theory. Relation between information and entropy: – Setlow-Pollard, Ed. Addison Wesley
    Excerpt: Linschitz gave the figure 9.3 x 10^-12 cal/deg or 9.3 x 10^-12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz’ deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures.
    http://www.astroscu.unam.mx/~a.....ecular.htm
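    The Setlow-Pollard arithmetic can be checked directly; a sketch of my own, using H = S/(k ln 2) with Linschitz’s entropy figure (reading the exponent as 10^-12, since a positive exponent would miss the quoted 4 x 10^12 bits by roughly 24 orders of magnitude):

```python
import math

# Check of the Setlow-Pollard relation H = S / (k ln 2), using
# Linschitz's figure for the entropy of a bacterial cell.
k = 1.380649e-23            # Boltzmann constant, J/K
S = 9.3e-12 * 4.184         # entropy: 9.3e-12 cal/deg converted to J/K

H = S / (k * math.log(2))   # information content in bits
print(f"{H:.1e} bits")      # ~4e+12 bits, matching the excerpt
```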

    ,,, which is the equivalent of about 100 million pages of Encyclopedia Britannica. In comparison, “the largest libraries in the world,,, have about 10 million volumes or 10^12 bits.”

    “a one-celled bacterium, e. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information in science jargon, this would be the same as 10^12 bits of information. In comparison, the total writings from classical Greek Civilization is only 10^9 bits, and the largest libraries in the world – The British Museum, Oxford Bodleian Library, New York Public Library, Harvard Widenier Library, and the Moscow Lenin Library – have about 10 million volumes or 10^12 bits.”
    – R. C. Wysong – The Creation-evolution Controversy

    And thus, since bacterial cells are about 10 times smaller than most plant and animal cells,

    Size Comparisons of Bacteria, Amoeba, Animal & Plant Cells
    Excerpt: Bacterial cells are very small – about 10 times smaller than most plant and animal cells.
    https://education.seattlepi.com/size-comparisons-bacteria-amoeba-animal-plant-cells-4966.html

    And since there are conservatively estimated to be around 30 trillion cells within the average human body,

    Revised Estimates for the Number of Human and Bacteria Cells in the Body – 2016
    Abstract: Reported values in the literature on the number of cells in the body differ by orders of magnitude and are very seldom supported by any measurements or calculations. Here, we integrate the most up-to-date information on the number of human and bacterial cells in the body. We estimate the total number of bacteria in the 70 kg “reference man” to be 3.8·10^13. For human cells, we identify the dominant role of the hematopoietic lineage to the total count (?90%) and revise past estimates to 3.0·10^13 human cells. Our analysis also updates the widely-cited 10:1 ratio, showing that the number of bacteria in the body is actually of the same order as the number of human cells, and their total mass is about 0.2 kg.
    https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002533

    Then that gives us a rough ballpark estimate of around 300 trillion times 100 million pages of Encyclopedia Britannica. Or about 300 trillion times the information content contained within the books of all the largest libraries in the world.
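    That ballpark can be sketched numerically (my own arithmetic; the 10x factor takes the size ratio quoted above as a proxy for the information ratio):

```python
# Rough check of the ballpark estimate above.
human_cells = 3.0e13   # revised estimate of human cells in the body (2016 paper)
size_factor = 10       # plant/animal cells ~10x larger than bacteria (assumed
                       # here to carry ~10x the thermodynamic information)

bacterium_equivalents = human_cells * size_factor
print(f"{bacterium_equivalents:.0e}")  # 3e+14, i.e. ~300 trillion
# ... times the ~10^12 bits (~100 million Britannica pages) per bacterium
```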

    Needless to say, that is a massive amount of positional information.

    As the following article states, the information to build a human infant, atom by atom, would take up the equivalent of enough thumb drives to fill the Titanic, multiplied by 2,000.

    In a TED Talk, (the Question You May Not Ask,,, Where did the information come from?) – November 29, 2017
    Excerpt: Sabatini is charming.,,, he deploys some memorable images. He points out that the information to build a human infant, atom by atom, would take up the equivalent of enough thumb drives to fill the Titanic, multiplied by 2,000. Later he wheels out the entire genome, in printed form, of a human being,,,,:
    [F]or the first time in history, this is the genome of a specific human, printed page-by-page, letter-by-letter: 262,000 pages of information, 450 kilograms.,,,
    https://evolutionnews.org/2017/11/in-a-ted-talk-heres-the-question-you-may-not-ask/
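    The 262,000-page figure is easy to sanity-check; a sketch of my own (the letters-per-page density is an assumption, not a figure from the article):

```python
# Plausibility check of the printed-genome figure: a haploid human
# genome is ~3.2 billion letters; at a dense, small-print ~12,000
# letters per page, that is roughly the 262,000 pages cited.
genome_letters = 3.2e9      # ~3.2 billion base pairs, one letter each
letters_per_page = 12_000   # assumed dense small-print page

pages = genome_letters / letters_per_page
print(f"{pages:.0f} pages")  # ~267,000 pages -- same ballpark as 262,000
```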

    On top of that we can add the fairly recent findings that demonstrate that quantum information is ubiquitous within biology. As Dr. Rieper remarks in the following video, “practically the whole DNA molecule can be viewed as quantum information with classical information embedded within it.”

    “What happens is this classical information (of DNA) is embedded, sandwiched, into the quantum information (of DNA). And most likely this classical information is never accessed because it is inside all the quantum information. You can only access the quantum information or the electron clouds and the protons. So mathematically you can describe that as a quantum/classical state.”
    Elisabeth Rieper – Classical and Quantum Information in DNA – video (Longitudinal Quantum Information resides along the entire length of DNA discussed at the 19:30 minute mark; at 24:00 minute mark Dr Rieper remarks that practically the whole DNA molecule can be viewed as quantum information with classical information embedded within it)
    https://youtu.be/2nqHOnVTxJE?t=1176

    The interesting thing about quantum information is that it has been experimentally shown to be “non-local” as well as “conserved”.

    As the following paper, entitled “Looking beyond space and time to cope with quantum theory”, stated: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them.”

    Looking beyond space and time to cope with quantum theory – 29 October 2012
    Excerpt: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,”
    http://www.quantumlah.org/high.....uences.php

    And as the following article states, “In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed.”

    Quantum no-hiding theorem experimentally confirmed for first time – 2011
    Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed. This concept stems from two fundamental theorems of quantum mechanics: the no-cloning theorem and the no-deleting theorem. A third and related theorem, called the no-hiding theorem, addresses information loss in the quantum world. According to the no-hiding theorem, if information is missing from one system (which may happen when the system interacts with the environment), then the information is simply residing somewhere else in the Universe; in other words, the missing information cannot be hidden in the correlations between a system and its environment.
    http://www.physorg.com/news/20.....tally.html

    The implication of finding ‘non-local’ (beyond space and time) and ‘conserved’ quantum information in molecular biology on such a massive scale, in every important biomolecule in our bodies, is fairly, and pleasantly, obvious.
    That pleasant implication, of course, is that we now have very strong empirical evidence suggesting that we do indeed have an eternal soul that is capable of living beyond the death of our material bodies. As Stuart Hameroff states in the following article, “the quantum information,,, isn’t destroyed. It can’t be destroyed.,,, it’s possible that this quantum information can exist outside the body. Perhaps indefinitely as a soul.”

    Leading Scientists Say Consciousness Cannot Die It Goes Back To The Universe – Oct. 19, 2017 – Spiritual
    Excerpt: “Let’s say the heart stops beating. The blood stops flowing. The microtubules lose their quantum state. But the quantum information, which is in the microtubules, isn’t destroyed. It can’t be destroyed. It just distributes and dissipates to the universe at large. If a patient is resuscitated, revived, this quantum information can go back into the microtubules and the patient says, “I had a near death experience. I saw a white light. I saw a tunnel. I saw my dead relatives.,,” Now if they’re not revived and the patient dies, then it’s possible that this quantum information can exist outside the body. Perhaps indefinitely as a soul.”
    – Stuart Hameroff – Quantum Entangled Consciousness – Life After Death – video (5:00 minute mark)
    https://www.disclose.tv/leading-scientists-say-consciousness-cannot-die-it-goes-back-to-the-universe-315604

    Verse:

    Mark 8:37
    Is anything worth more than your soul?

    Thus in conclusion, the principle of ‘Conservation of Information’ extends far deeper into physics, and even into quantum biology, than, IMHO, Dr. Dembski has thus far elucidated mathematically.

    Myself, I think it is well worth exploring this issue of ‘Conservation of Information’ in full as to how it relates to quantum physics, and to quantum biology in particular, and not just exploring it mathematically (no matter how mathematically rigorous that exploration by Dembski and company may be).

    After all, is not science equally dependent on both mathematical models and the experimental confirmation of those models?

    Verses

    1 Thessalonians 5:21
    but test all things. Hold fast to what is good.

    John 1:1-4
    In the beginning was the Word, and the Word was with God, and the Word was God. He was with God in the beginning. Through him all things were made; without him nothing was made that has been made. In him was life, and that life was the light of all mankind.
