Uncommon Descent Serving The Intelligent Design Community

What is information anyway? Some proposed answers



From Casey Luskin’s talk at Evolution News & Views:

Information is not always easy to define, but it often involves a measure of the degree of randomness. The fundamental intuition behind information is a reduction in possibilities: the more possibilities you rule out, the more information you’ve conveyed.
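This "ruling out possibilities" intuition can be made quantitative with the standard logarithmic measure: the information gained is log2 of the ratio of possibilities before to possibilities after. A minimal Python sketch (the coin and card examples are illustrative, not taken from Luskin's talk):

```python
import math

def information_bits(before, after):
    """Bits of information gained when 'before' equally likely
    possibilities are narrowed down to 'after' possibilities."""
    return math.log2(before / after)

# A fair coin flip rules out one of two possibilities: 1 bit.
print(information_bits(2, 1))    # 1.0

# Learning a card's suit narrows 52 possibilities to 13: 2 bits.
print(information_bits(52, 13))  # 2.0

# Ruling out nothing conveys no information.
print(information_bits(8, 8))    # 0.0
```

The more possibilities ruled out, the larger the ratio and the more bits conveyed.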

Nature can produce “information” under certain definitions. Intelligent agents also produce information (certain types, at least). As Henry Quastler observed, “The creation of new information is habitually associated with conscious activity.”

To put it another way: The reduction in uncertainty could occur by an intelligent agent, or through a physical occurrence.

Types of information and examples follow.


To summarize, information can be understood and defined in different ways. Some are useful for detecting design and/or measuring bioinformation; some are not. Semantic information is useful for detecting design, but it’s not the only way to detect design. It’s a special case of design, not the general case. Semantic information as a class falls within complex and specified information, which is a more general mode of design detection. The genetic code is a form of syntactic, semantic, and complex specified information. More.

See also: Life as “self-perpetuating information strings”? At least Adami is on the right track in focusing on understanding information, not chemistry, as the key driver.


One reason I'm reluctant to jump into these conversations is that my schedule does not permit me to respond in real time. It helps me appreciate the time and effort Bornagain and others are able to put into their informative comments. I just wanted to make one distinction regarding Wiener's work. He did not assert that information could be generated by a random process (such as a random number generator). Those are only good at making gibberish. Instead, he observed that many information sources could be statistically modeled by a random process, for which a rich set of analytical tools was available. If you compare text with a zipped file of the same text, you can often predict what the next letter will be in the cleartext file, but predicting the next byte in the zip file is hopeless...it seems like gibberish even though we know we can unzip it into something meaningful. Wiener observed that streams rich in information statistically resembled random data, and could be mathematically treated as such. I don't think he ever claimed a random process could actually generate information. I hope everyone understands this distinction. GBDixon
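GBDixon's cleartext-versus-zip observation is easy to reproduce: redundant English text compresses well precisely because its next letter is predictable, while the compressor's output statistically resembles random data and resists further compression. A minimal Python sketch (the sample text is an arbitrary illustration):

```python
import zlib

# Ordinary English text is highly predictable (redundant)...
text = ("the quick brown fox jumps over the lazy dog; "
        "information is information, not matter or energy. ") * 40
raw = text.encode()

once = zlib.compress(raw)    # redundancy squeezed out
twice = zlib.compress(once)  # the output already looks like noise

print(len(raw), len(once), len(twice))
# The first pass shrinks the text dramatically; the second pass
# gains almost nothing, because the compressed stream has no
# predictable structure left for the compressor to exploit.
```

This is the sense in which an information-rich stream "resembles" random data: by the time the redundancy is removed, what remains is statistically indistinguishable from noise, even though it decodes to something meaningful.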
...informational concepts have a long time ago been included in the set of accepted biological categories. Coding and self-description, however, are not just additions to biochemical and biophysical causality. Rather, the semiotic character of life has staged the physical and chemical processes of the world in ways, which are simply unbelievable from the point of view of mere causality. Causality, then, should be studied in the light of semiosis, not vice versa. Due to the uninformed concept of information in present biology, however, the power of nature's semiosis is still largely hidden. A total reframing of biological theory is needed. - Biosemiotics: Information, Codes and Signs in Living Systems
BA, LTI systems are particularly tractable mathematically allowing superposition and in effect an assumed stability of responsiveness to inputs. Likely, you were quietly using the premise and it was not made a fuss of. Real systems of consequence, strictly, are seldom linear and never actually time invariant. And servosystems -- control of position, speed, etc -- are amenable to essentially the same analysis as process control systems. KF kairosfocus
groovamos, as a tech at the chemical factory where I worked, we never used 'linear time-invariant (LTI) systems'; I have never even heard of such things, although I was just a tech. We were reliant on proportional–integral–derivative controllers (PID controllers) for just about everything that was required to be brought back into balance after an unpredictable input, i.e. to "modify an unstable system to be stable". From what I can gather, many of the types of processes we were controlling in the factory were much more akin to Wiener's work on the automatic aiming and firing of anti-aircraft guns than they were to what could be termed a 'linear time invariant system'. All that aside, due to your concerns, and seeing how many people were involved, I have scrubbed the claim that Wiener was "THE" founder from my notes, and have instead inserted his work during WWII and at MIT as qualifiers for his authority to speak on the subject of information. bornagain
While Nyquist is a graphical technique, it only provides a limited amount of intuition for why a system is stable or unstable, or how to modify an unstable system to be stable. LTI systems are fundamental to engineering and to systems theory almost to the exclusion of everything else in undergraduate curricula, especially in electrical communications. Where this is not true in communications, e.g. multipliers, Fourier transform properties can be used such as time or frequency shifting. Pole-zero plots in the complex frequency plane can indicate the possibility of instability, with poles near or on the vertical axis at risk of crossing into the right half plane. This happens for instance when a sound system with microphones begins to feed back, and analysis would show that the distance from the origin of a conjugate pair of poles in the plane indicates the sinusoidal frequency of the instability in such a system. Moving the mic around, playing with gains, placing objects or faces close to the mic changes the position of the complex poles on the pole-zero plot in a didactic analysis. The Nyquist plot provides more information as to the risk of instability and was a huge innovation over pole-zero plots, and is one reason I say Wiener should not be given sole credit for the founding of any field, giant as he was. BTW there is no set way to measure stability of systems, but for ubiquitous LTI systems, such as audio, and analog design with op-amps, the Nyquist criterion is by far the best approach for people with only undergraduate training in control theory like myself. Using Bode plots for this can be better in cases but it is all still based on Nyquist. To say the Nyquist criterion is limited is like saying anything in mathematics is limited and it is a pretty dang moot point. groovamos
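The pole-position picture groovamos describes can be checked numerically: for a lightly damped second-order denominator s^2 + 2*zeta*wn*s + wn^2, the conjugate pair sits at -zeta*wn ± j*wn*sqrt(1 - zeta^2); its distance from the origin is wn, and its imaginary part sets the ringing frequency of the feedback howl. A minimal sketch (the 440 Hz resonance is an assumed example value, not one from the thread):

```python
import cmath
import math

# Lightly damped resonance near 440 Hz (an assumed example value).
zeta = 0.05               # damping ratio
wn = 2 * math.pi * 440.0  # natural frequency, rad/s

# Roots of s^2 + 2*zeta*wn*s + wn^2 via the quadratic formula;
# the discriminant is negative, so cmath gives a conjugate pair.
disc = cmath.sqrt((2 * zeta * wn) ** 2 - 4 * wn ** 2)
p1 = (-2 * zeta * wn + disc) / 2
p2 = (-2 * zeta * wn - disc) / 2

# Distance from the origin recovers wn; the imaginary part gives
# the damped ringing frequency heard as the howl.
print(abs(p1) / (2 * math.pi))   # ~440 Hz
print(p1.imag / (2 * math.pi))   # ~439.4 Hz (damped frequency)
print(p1.real)                   # negative: still in the left half plane
```

Raising the gain or cupping a hand near the mic pushes the real part toward zero; once it goes positive the pair has crossed into the right half plane and the howl grows instead of decaying.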
as to modifying "an unstable system to be stable", the following tidbit from biology is interesting: Proteins have now been shown to have a ‘Cruise Control’ mechanism, which works to ‘self-correct’ the integrity of the protein structure from any random mutations imposed on them.
Proteins with cruise control provide new perspective: "A mathematical analysis of the experiments showed that the proteins themselves acted to correct any imbalance imposed on them through artificial mutations and restored the chain to working order." http://www.princeton.edu/main/news/archive/S22/60/95O56/
'Cruise Control' permeating the whole of the protein structure? This is an absolutely fascinating discovery. The equations of calculus involved in achieving even a simple process control loop, such as a dynamic cruise control loop, are very complex.
PID controller A proportional–integral–derivative controller (PID controller) is a generic control loop feedback mechanism (controller) widely used in industrial control systems. A PID controller attempts to correct the error between a measured process variable and a desired setpoint by calculating and then outputting a corrective action that can adjust the process accordingly and rapidly, to keep the error minimal. http://en.wikipedia.org/wiki/PID_controller
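The Wikipedia description translates almost directly into code. A minimal sketch of a PID loop steering a simple first-order process toward a setpoint (the plant model and gains are illustrative assumptions, not a tuned industrial loop):

```python
# Minimal PID loop on an assumed first-order plant dx/dt = -x + u.
dt = 0.01
Kp, Ki, Kd = 2.0, 1.0, 0.1   # illustrative gains
setpoint = 1.0

x = 0.0                      # measured process variable
integral = 0.0
prev_error = setpoint - x

for _ in range(5000):        # 50 s of simulated time
    error = setpoint - x
    integral += error * dt
    derivative = (error - prev_error) / dt
    # Corrective action: proportional + integral + derivative terms.
    u = Kp * error + Ki * integral + Kd * derivative
    prev_error = error
    x += dt * (-x + u)       # Euler step of the plant

print(round(x, 3))           # settles at the setpoint
```

The integral term is what removes the steady-state offset a proportional-only controller would leave, which is why PID loops dominate in process control of the kind bornagain describes.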
It is in realizing the staggering level of engineering that must be dealt with to achieve ‘cruise control’ for each individual protein, along the entirety of the protein structure, that it becomes apparent that information must reside along the entire protein structure, and that even Axe’s 1 in 10^77 estimate for the rarity of finding a functional protein within sequence space is far, far too generous.
Quantum criticality in a wide range of important biomolecules Excerpt: “Most of the molecules taking part actively in biochemical processes are tuned exactly to the transition point and are critical conductors,” they say. That’s a discovery that is as important as it is unexpected. “These findings suggest an entirely new and universal mechanism of conductance in biology very different from the one used in electrical circuits.” The permutations of possible energy levels of biomolecules is huge so the possibility of finding even one that is in the quantum critical state by accident is mind-bogglingly small and, to all intents and purposes, impossible.,, of the order of 10^-50 of possible small biomolecules and even less for proteins,”,,, “what exactly is the advantage that criticality confers?” https://medium.com/the-physics-arxiv-blog/the-origin-of-life-and-the-hidden-role-of-quantum-criticality-ca4707924552 (A Reply To PZ Myers) Estimating the Probability of Functional Biological Proteins? Kirk Durston , Ph.D. Biophysics – 2012 Excerpt (Page 4): The Probabilities Get Worse This measure of functional information (for the RecA protein) is good as a first pass estimate, but the situation is actually far worse for an evolutionary search. In the method described above and as noted in our paper, each site in an amino acid protein sequence is assumed to be independent of all other sites in the sequence. In reality, we know that this is not the case. There are numerous sites in the sequence that are mutually interdependent with other sites somewhere else in the sequence. A more recent paper shows how these interdependencies can be located within multiple sequence alignments.[6] These interdependencies greatly reduce the number of possible functional protein sequences by many orders of magnitude which, in turn, reduce the probabilities by many orders of magnitude as well. 
In other words, the numbers we obtained for RecA above are exceedingly generous; the actual situation is far worse for an evolutionary search. http://powertochange.com/wp-content/uploads/2012/11/Devious-Distortions-Durston-or-Myers_.pdf Physicists Discover Quantum Law of Protein Folding – February 22, 2011 Quantum mechanics finally explains why protein folding depends on temperature in such a strange way. Excerpt: First, a little background on protein folding. Proteins are long chains of amino acids that become biologically active only when they fold into specific, highly complex shapes. The puzzle is how proteins do this so quickly when they have so many possible configurations to choose from. To put this in perspective, a relatively small protein of only 100 amino acids can take some 10^100 different configurations. If it tried these shapes at the rate of 100 billion a second, it would take longer than the age of the universe to find the correct one. Just how these molecules do the job in nanoseconds, nobody knows.,,, Their astonishing result is that this quantum transition model fits the folding curves of 15 different proteins and even explains the difference in folding and unfolding rates of the same proteins. That's a significant breakthrough. Luo and Lo's equations amount to the first universal laws of protein folding. That’s the equivalent in biology to something like the thermodynamic laws in physics. http://www.technologyreview.com/view/423087/physicists-discover-quantum-law-of-protein/ Classical and Quantum Information Channels in Protein Chain – Dj. Koruga, A. Tomi?, Z. Ratkaj, L. Matija – 2006 Abstract: Investigation of the properties of peptide plane in protein chain from both classical and quantum approach is presented. We calculated interatomic force constants for peptide plane and hydrogen bonds between peptide planes in protein chain. 
On the basis of force constants, displacements of each atom in peptide plane, and time of action we found that the value of the peptide plane action is close to the Planck constant. This indicates that peptide plane from the energy viewpoint possesses synergetic classical/quantum properties. Consideration of peptide planes in protein chain from information viewpoint also shows that protein chain possesses classical and quantum properties. So, it appears that protein chain behaves as a triple dual system: (1) structural – amino acids and peptide planes, (2) energy – classical and quantum state, and (3) information – classical and quantum coding. Based on experimental facts of protein chain, we proposed from the structure-energy-information viewpoint its synergetic code system. http://www.scientific.net/MSF.518.491
In fact since quantum entanglement/information falsified reductive materialism and/or local realism in the first place (Alain Aspect; Anton Zeilinger) then finding quantum entanglement/information to be ‘protein specific’ is absolutely shattering to any rational hope that materialists had, in whatever slim probabilities they had for finding a functional protein sequence by neo-Darwinian processes, since a ‘transcendent’, ‘non-local’, cause must be supplied which is specific to each unique protein structure. Reductive materialism, which is the basis of neo-Darwinian thought, is simply at a complete loss to supply such a ‘non-local’ transcendent cause for quantum information-entanglement.
John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. The same was in the beginning with God. All things were made by Him, and without Him was not anything made that was made. In Him was life, and that life was the Light of men.
The Nyquist criterion is widely used in electronics and control system engineering, as well as other fields, for designing and analyzing systems with feedback. While Nyquist is one of the most general stability tests, it is still restricted to linear, time-invariant (LTI) systems. Non-linear systems must use more complex stability criteria, such as Lyapunov or the circle criterion. While Nyquist is a graphical technique, it only provides a limited amount of intuition for why a system is stable or unstable, or how to modify an unstable system to be stable. Techniques like Bode plots, while less general, are sometimes a more useful design tool. https://en.wikipedia.org/wiki/Nyquist_stability_criterion During World War II, his work on the automatic aiming and firing of anti-aircraft guns caused Wiener to investigate information theory independently of Claude Shannon and to invent the Wiener filter. (To him is due the now standard practice of modeling an information source as a random process—in other words, as a variety of noise.) His anti-aircraft work eventually led him to formulate cybernetics.[10] After the war, his fame helped MIT to recruit a research team in cognitive science, composed of researchers in neuropsychology and the mathematics and biophysics of the nervous system, including Warren Sturgis McCulloch and Walter Pitts. These men later made pioneering contributions to computer science and artificial intelligence. https://en.wikipedia.org/wiki/Norbert_Wiener#During_and_after_World_War_II bornagain
Norbert Wiener created the modern field of control and communication systems, utilizing concepts like negative feedback. His seminal 1948 book Cybernetics both defined and named the new field. In the book, Wiener helped define the new quantitative concept of information coming out of the work of John von Neumann, Alan Turing, and Claude Shannon on computers and communications. As Leo Szilard had done twenty years earlier, Wiener emphasized the information gained in choices between two equiprobable alternatives, which produce "bits" (binary digits) of information. "One and all, time series [of experimental data] and the apparatus to deal with them, whether in the computing laboratory or in the telephone circuit, have to deal with the recording, preservation, transmission, and use of information. What is this information, and how is it measured? One of the simplest, most unitary forms of information is the recording of a choice between two equally probable simple alternatives, one or the other of which is bound to happen — a choice, for example, between heads and tails in the tossing of a coin. We shall call a single choice of this sort a decision. If then we ask for the amount of information in the perfectly precise measurement of a quantity known to lie between A and B, which may with uniform a priori probability lie anywhere in this range... We may conceive this in the following way: we know a priori that a variable lies between 0 and 1, and a posteriori that it lies on the interval (a, b) inside (0, 1). Then the amount of information we have from our a posteriori knowledge is -log2 (measure of (a, b) / measure of (0, 1)) The quantity we here define as amount of information is the negative of the quantity usually defined as entropy in similar situations. The definition here given is not the one given by R. A. Fisher for statistical problems, although it is a statistical definition; and can be used to replace Fisher's definition in the technique of statistics. 
(Cybernetics, 2nd edition, pp.61-62) etc.. etc.. "The mechanical brain does not secrete thought "as the liver does bile," as the earlier materialists claimed, nor does it put it out in the form of energy, as the muscle puts out its activity. Information is information, not matter or energy. No materialism which does not admit this can survive at the present day. " http://www.informationphilosopher.com/solutions/scientists/wiener/
BA77, unattributed quote: Norbert Wiener created the modern field of control and communication systems, utilizing concepts like negative feedback. His seminal 1948 book Cybernetics both defined and named the new field. Classical LTI (linear time-invariant) control systems were made possible by engineer Harry Nyquist, who established the famous Nyquist criterion - absolutely the fundamental criterion for stable control systems. Stated as follows: The open loop gain of an LTI system must decrease at (nearly) -6 dB per octave at the 0 dB crossover frequency (gain = 1) and the phase angle (lag) controlled typically within -35~-45 degrees at the 0 dB crossover: https://en.wikipedia.org/wiki/Nyquist_stability_criterion http://faculty.mercer.edu/jenkins_he/documents/GainMarginandPhaseMargin.pdf I believe that Nyquist based his work on earlier work by Wiener on system functions and Laplace transformation. It is a misunderstanding to give Wiener credit for the founding of the field of classical control theory when such a fundamental property of stability was laid out by Nyquist, who also originated the fundamental tool of the Nyquist plot: https://en.wikipedia.org/wiki/Harry_Nyquist Norbert Wiener contributed to communications theory (e.g. the famous Wiener filter which simulates cross-correlation for radar); his contributions are not as well known to non-specialists as those of Nyquist, Shannon and Hartley. However, two important theorems of Fourier analysis bear his name and have important implications for electrical communications and many other fields (the Paley-Wiener theorem and the Wiener-Khinchin theorem). The first of these is the crux of the proof of the impossibility of so-called brick wall, or ideal, filters. It is misleading to give credit to Wiener as THE founder of either of these fields of engineering although it may seem fashionable. Cybernetics is not a term used in engineering education today. groovamos
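The gain-margin/phase-margin reading of the Nyquist criterion that groovamos sketches can be computed by scanning the open-loop frequency response. A minimal sketch using the textbook open loop L(s) = 1/(s(s+1)(s+2)) (an illustrative transfer function, not one from the thread):

```python
import cmath
import math

def L(w):
    """Open-loop frequency response L(jw) = 1 / (jw (jw+1) (jw+2))."""
    s = 1j * w
    return 1.0 / (s * (s + 1) * (s + 2))

# Log-spaced frequency grid from 0.01 to 100 rad/s.
ws = [10 ** (k / 2000.0) for k in range(-4000, 4000)]

# Gain crossover: |L| = 1 -> phase margin = 180 deg + phase there.
w_gc = min(ws, key=lambda w: abs(abs(L(w)) - 1.0))
pm = 180.0 + math.degrees(cmath.phase(L(w_gc)))

# Phase crossover: phase = -180 deg -> gain margin = 1 / |L| there.
w_pc = min(ws, key=lambda w: abs(math.degrees(cmath.phase(L(w))) + 180.0))
gm = 1.0 / abs(L(w_pc))

print(round(w_gc, 3), round(pm, 1))  # ~0.446 rad/s, ~53.4 deg
print(round(w_pc, 3), round(gm, 2))  # ~1.414 rad/s, ~6.0
```

For this loop the phase crossover lands at w = sqrt(2) rad/s with a gain margin of about 6 (roughly 15.6 dB), and the phase margin is about 53 degrees, so the closed loop is comfortably stable by the margin rules groovamos describes.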
GBDixon, Thanks for your comments. Mung
OT: Lesia, reply to request:
Quantum Consciousness – Time Flies Backwards? – Stuart Hameroff MD Excerpt: Dean Radin and Dick Bierman have performed a number of experiments of emotional response in human subjects. The subjects view a computer screen on which appear (at randomly varying intervals) a series of images, some of which are emotionally neutral, and some of which are highly emotional (violent, sexual….). In Radin and Bierman’s early studies, skin conductance of a finger was used to measure physiological response They found that subjects responded strongly to emotional images compared to neutral images, and that the emotional response occurred between a fraction of a second to several seconds BEFORE the image appeared! Recently Professor Bierman (University of Amsterdam) repeated these experiments with subjects in an fMRI brain imager and found emotional responses in brain activity up to 4 seconds before the stimuli. Moreover he looked at raw data from other laboratories and found similar emotional responses before stimuli appeared. http://www.quantumconsciousness.org/views/TimeFlies.html Morality: physical reality of objective morality: https://docs.google.com/document/d/13shZ_ui7a5MwCLP6CZd3B29gs7LH9XEP1Vb1hNeZ9u8/edit and https://docs.google.com/document/d/1BhgtPC364n2iAwrO4jhyfW1SOZT6qwoEFrA2J7PNnHk/edit
as to:
"I have no idea how you apply these information concepts to the theoretical (and experimental) physics Bornagain discusses."
a few notes:
Exposing the impotence of the Neo-Darwinian theory - Intelligent Design By Dick Peterson - Jan. 2, 2015 Excerpt: Among the most important developments in evolutionary biology is the undeniable role information plays in the intricate mechanisms of living cells. Mathematician, philosopher, and theologian William Dembski argues that matter is not real in and of itself, but things become real insofar as they exchange information with each other. In contrast to a materialist view that everything is built out of matter, Dembski proposes that information is the fundamental stuff of the world and the outworking of an intelligence. “As a Christian, I hold that things are real because they relate to God,” he said. Don’t expect the greater scientific community to embrace Dembski’s belief anytime soon, if ever. But a growing body of scientific research could soon expose the impotence of the Neo-Darwinian theory of evolution. http://www.worldmag.com/2015/01/exposing_the_impotence_of_the_neo_darwinian_theory?fb_action_ids=997300716951950&fb_action_types=og.likes&fb_ref=.VKdtVJMtktE.like “The thesis of my book ‘Being as Communion’ is that the fundamental stuff of the world is information. That things are real because they exchange information one with another.” Conversations with William Dembski–The Thesis of Being as Communion – video https://www.youtube.com/watch?v=cYAsaU9IvnI "What I call the theological worldview is the idea that the world and everything in it has meaning and reason, and in particular a good and indubitable meaning. It follows immediately that our worldly existence, since it has in itself at most a very dubious meaning, can only be means to the end of another existence. The idea that everything in the world has a meaning [reason] is an exact analogue of the principle that everything has a cause, on which rests all of science." Kurt Gödel - Hao Wang’s biography "Reflections on Kurt Gödel", MIT Press, 1987
Seversky, you make an interesting distinction: information can be subjective and changing. Imagine Shannon's list of messages, one of which is 'launch rocket A'. It initially may be the most important message in the list but can actually be removed from the list entirely once sent, since it will never make sense to send it again, unless, as you said, you are not sure the message arrived the first time. The symbols themselves do not seem to be the information since the actual information can be translated from one set of symbols to another. We represent DNA sequences with letters, the cell with chemicals. In this sense, information is abstract as Mapau asserted (I should not have used the word 'abstract' in my original post. Probably 'esoteric' would have been better...conveying the idea of something difficult to apply practically). An intended receiver or interpreting machine is always part of information transfer, and he or it usually decides what is information. We have written texts in unknown languages. To us the information content is low until we figure it out or find someone who can read them; then they become very valuable. But to the person who could read the text, the information was always there. We often simplify the problem by considering thrown dice or other trivial problems, but the cell is so complex that the actual information content probably changes with each generation, and to us it even changes according to how we look at it (does 'junk DNA', much of which serves some function, have the same number of bits of information per codon as protein-coding DNA? Probably not). For DNA, probably the best we can do is estimate a reasonable lower bound on information content in the cell. I have no idea how you apply these information concepts to the theoretical physics Bornagain discusses. How do you discuss information in the event horizon of a black hole when there will never be anyone to read or anything to interpret it?
Why is the information content of something changeable according to who receives it, while the physicists seem to be able to reduce information to a constant? I personally think we are conflating different things and calling them both information, but I'm not qualified to comment on this. So, we do often talk past each other because there seem to be many different definitions of information. I do feel the simpler definitions are the ones we should be considering in biology, and that we are not discussing the right thing until we include the role of the receiver. Discussing the fancier definitions seems to lead us in circles. GBDixon
As well, as if that was not 'spooky enough', information, not material, is found to be foundational to physical reality:
"it from bit” Every “it”— every particle, every field of force, even the space-time continuum itself derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. “It from bit” symbolizes the idea that every item of the physical world has a bottom—a very deep bottom, in most instances, an immaterial source and explanation, that which we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment—evoked responses, in short all matter and all things physical are information-theoretic in origin and this is a participatory universe." – Princeton University physicist John Wheeler (1911–2008) (Wheeler, John A. (1990), “Information, physics, quantum: The search for links”, in W. Zurek, Complexity, Entropy, and the Physics of Information (Redwood City, California: Addison-Wesley)) Why the Quantum? It from Bit? A Participatory Universe? Excerpt: In conclusion, it may very well be said that information is the irreducible kernel from which everything else flows. Thence the question why nature appears quantized is simply a consequence of the fact that information itself is quantized by necessity. It might even be fair to observe that the concept that information is fundamental is very old knowledge of humanity, witness for example the beginning of gospel according to John: "In the beginning was the Word." Anton Zeilinger - a leading expert in quantum teleportation: http://www.metanexus.net/archive/ultimate_reality/zeilinger.pdf Quantum physics just got less complicated - Dec. 
19, 2014 Excerpt: Patrick Coles, Jedrzej Kaniewski, and Stephanie Wehner,,, found that 'wave-particle duality' is simply the quantum 'uncertainty principle' in disguise, reducing two mysteries to one.,,, "The connection between uncertainty and wave-particle duality comes out very naturally when you consider them as questions about what information you can gain about a system. Our result highlights the power of thinking about physics from the perspective of information,",,, http://phys.org/news/2014-12-quantum-physics-complicated.html John Lennox at Rice University: Christianity Gave Us Science - Sept. 28, 2015 53:00 minute mark - mass-energy is derivative from information (i.e. It from bit) and life is based on information. https://youtu.be/PSq4KLjMSlI?t=3182
It is hard to imagine a more convincing proof that we are made 'in the image of God' than finding that both the universe and life itself are 'information theoretic' in their basis, and that we, of all the creatures on earth, uniquely possess the ability to understand and create information. I guess more convincing evidence would be that God Himself became a man, defeated death on a cross, and then rose from the dead to prove that He was God. But who has ever heard of such overwhelming evidence as that?
Turin Shroud Quantum Hologram Reveals The Words 'The Lamb' on a Solid Oval Object Under The Beard - video http://www.godtube.com/watch/?v=J21MECNU Solid Oval Object Under The Beard http://shroud3d.com/findings/solid-oval-object-under-the-beard
Verses and Music:
Genesis 1:26 And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth. John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. The same was in the beginning with God. All things were made by Him, and without Him was not anything made that was made. In Him was life, and that life was the Light of men. Casting Crowns - The Word Is Alive https://www.youtube.com/watch?v=X9itgOBAxSc
As a further note to posts 2 and 3: besides information being independent of matter and energy, and being shown to be physically real instead of merely emergent from a material basis as is held in Darwinian/materialistic thought, the one thing that most drastically demarcates humans from other animals is our ability to use information.
Origin of the Mind: Marc Hauser - Scientific American - April 2009 Excerpt: "Researchers have found some of the building blocks of human cognition in other species. But these building blocks make up only the cement footprint of the skyscraper that is the human mind",,, http://www.wjh.harvard.edu/~mnkylab/publications/recent/mindSciAm.pdf Evolution of the Genus Homo – Annual Review of Earth and Planetary Sciences – Ian Tattersall, Jeffrey H. Schwartz, May 2009 Excerpt: “Unusual though Homo sapiens may be morphologically, it is undoubtedly our remarkable cognitive qualities that most strikingly demarcate us from all other extant species. They are certainly what give us our strong subjective sense of being qualitatively different. And they are all ultimately traceable to our symbolic capacity. Human beings alone, it seems, mentally dissect the world into a multitude of discrete symbols, and combine and recombine those symbols in their minds to produce hypotheses of alternative possibilities. When exactly Homo sapiens acquired this unusual ability is the subject of debate.” http://www.annualreviews.org/doi/abs/10.1146/annurev.earth.031208.100202 Leading Evolutionary Scientists Admit We Have No Evolutionary Explanation of Human Language - December 19, 2014 Excerpt: Understanding the evolution of language requires evidence regarding origins and processes that led to change. In the last 40 years, there has been an explosion of research on this problem as well as a sense that considerable progress has been made. We argue instead that the richness of ideas is accompanied by a poverty of evidence, with essentially no explanation of how and why our linguistic computations and representations evolved.,,, (Marc Hauser, Charles Yang, Robert Berwick, Ian Tattersall, Michael J. Ryan, Jeffrey Watumull, Noam Chomsky and Richard C. Lewontin, "The mystery of language evolution," Frontiers in Psychology, Vol 5:401 (May 7, 2014).)
Luskin adds: It's difficult to imagine much stronger words from a more prestigious collection of experts. http://www.evolutionnews.org/2014/12/leading_evoluti092141.html
Michael Egnor states, in regard to mental powers, that "We are more different from apes than apes are from viruses":
The Fundamental Difference Between Humans and Nonhuman Animals - Michael Egnor - November 5, 2015 Excerpt: Human beings have mental powers that include the material mental powers of animals but in addition entail a profoundly different kind of thinking. Human beings think abstractly, and nonhuman animals do not. Human beings have the power to contemplate universals, which are concepts that have no material instantiation. Human beings think about mathematics, literature, art, language, justice, mercy, and an endless library of abstract concepts. Human beings are rational animals. Human rationality is not merely a highly evolved kind of animal perception. Human rationality is qualitatively different -- ontologically different -- from animal perception. Human rationality is different because it is immaterial. Contemplation of universals cannot have material instantiation, because universals themselves are not material and cannot be instantiated in matter.,,, It is a radical difference -- an immeasurable qualitative difference, not a quantitative difference. We are more different from apes than apes are from viruses.,,, http://www.evolutionnews.org/2015/11/the_fundamental_2100661.html
In fact, it was because of such a drastic difference in the mental powers of animals as compared to the mental powers of man that Alfred Wallace, co-discoverer of natural selection, believed that man had a soul:
"Nothing in evolution can account for the soul of man. The difference between man and the other animals is unbridgeable. Mathematics is alone sufficient to prove in man the possession of a faculty unexistent in other creatures. Then you have music and the artistic faculty. No, the soul was a separate creation." Alfred Russel Wallace - An interview by Harold Begbie printed on page four of The Daily Chronicle (London) issues of 3 November and 4 November 1910.
More interesting still, the three Rs (reading, writing, and arithmetic), i.e. the unique ability to process information inherent to man, are the very first things taught to children when they enter elementary school. And yet it is this very information processing, i.e. reading, writing, and arithmetic, that is found to be foundational to life:
Information Enigma (Where did the information come from?) - video https://www.youtube.com/watch?v=aA-FcnLsF1g

Signature in the Cell by Stephen Meyer - video clip https://www.youtube.com/watch?v=TVkdQhNdzHU

Complex grammar of the genomic language – November 9, 2015 Excerpt: The ‘grammar’ of the human genetic code is more complex than that of even the most intricately constructed spoken languages in the world. The findings explain why the human genome is so difficult to decipher –,,, ,,, in their recent study in Nature, the Taipale team examines the binding preferences of pairs of transcription factors, and systematically maps the compound DNA words they bind to. Their analysis reveals that the grammar of the genetic code is much more complex than that of even the most complex human languages. Instead of simply joining two words together by deleting a space, the individual words that are joined together in compound DNA words are altered, leading to a large number of completely new words. http://www.sciencedaily.com/releases/2015/11/151109140252.htm

John Lennox - Semiotic Information - video https://www.youtube.com/watch?v=F6rd4HEdffw
Is information the code or what the code conveys? Is information the alphabet, vocabulary, syntax, etc. of a language, or what is said or written using that language? If information is a decrease in uncertainty, then a message from me to a friend announcing "Arriving New York 10:30 pm" is information because the friend did not know that before. His uncertainty has been reduced by that amount. If I was unsure whether or not the message had been received and sent it again, is it still information, given that there is no further reduction in my friend's uncertainty? In other words, we have two messages exactly the same, except in one case there is information and in the second there is not. So what is information? Do the various definitions of information have anything in common? Are they, in fact, referring to the same thing, or are they referring to different things, and is calling them all information just confusing the issue? Seversky
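Seversky's repeated-telegram puzzle maps directly onto Shannon's measure of self-information, where the information in an event of probability p is log2(1/p) bits. A minimal illustrative sketch in Python (the 16 possible arrival times are a hypothetical assumption, not from the comment):

```python
import math

def self_information(p: float) -> float:
    """Bits of information gained when an event of probability p
    occurs: log2(1/p). Rare events carry more information."""
    return math.log2(1 / p)

# Suppose the friend considered 16 equally likely arrival times.
# The first telegram rules out 15 of them:
first = self_information(1 / 16)   # 4 bits

# Once received, the arrival time is certain (p = 1), so an
# identical second telegram resolves no remaining uncertainty:
second = self_information(1.0)     # 0 bits

print(first, second)
```

On this accounting the two physically identical messages really do differ: the first reduces the receiver's uncertainty, the second arrives when p is already 1 and so carries zero bits.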
If we accept the Shannon definition of information as a code agreed upon between sender and receiver (a very sensible definition, IMO), then information is necessarily an abstract entity. And if so, information is not physical, i.e., it can neither emerge from physics nor even exist in a physical universe. The fact that we, humans, can create information and can recognize it in nature is a sign that we are more than just physical beings. Therefore, any evidence of information in nature (such as genetic sequences) is strong evidence, if not proof, that one or more conscious and intelligent beings had something to do with it. It is also evidence that consciousness requires more than just a physical brain. Mapou
Poor Dr. Shannon gets short shrift from Luskin and nearly everyone else who discusses the nature of information. Shannon's definition of information is not what the so-called experts say it is. It is much simpler. Shannon's formal definition of information was a list of valid messages agreed upon by a sender and a receiver (simple, eh?). The problem he solved was how to encode these messages in the most efficient manner (this resulted in the famous entropy equation) and how to send the messages in such a way that, when they are corrupted by noise, they can still be discerned from the much larger set of received, possibly corrupted messages.

We can extend his practical and simple definition of information to many areas (this agrees pretty much with Mung's definition): information is any message that is useful to someone or something. By 'message' we usually mean something using symbols syntactically arranged, such as alphabet letters, bits, or DNA sequences. By 'useful' we mean it means something, has value, or does something useful when interpreted (by a computer or a molecular machine in a cell, for example).

Shannon certainly knew what information was even if those who profess expertise in his theory do not. Shannon never viewed information as some abstract thing that was difficult to apply practically: information was messages with meaning. The whole 'Shannon information' thing came later and is a result of only examining a subset of the problem, and even then it is misinterpreted.

Those here who discuss the nature of information make it way too complicated. We know there is a set of all DNA sequences that do useful things. We know there is a much, much larger set of DNA sequences that are gibberish and have no use or value. The first set is information. The second set isn't. Simple.

(The review pane for this comment is showing my name as 'anonymous'. I am GBDixon) GBDixon
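The "famous entropy equation" GBDixon mentions is H = -Σ p·log2(p), the minimum average number of bits per message when encoding a source. A small illustrative sketch in Python (the four messages and their probabilities are hypothetical values, chosen only to make the arithmetic clean):

```python
import math

def shannon_entropy(probabilities):
    """Shannon's entropy H = -sum(p * log2 p): the minimum average
    number of bits needed to encode one message from the source."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four valid messages agreed upon by sender and receiver, with
# unequal (hypothetical) frequencies of use:
probs = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(probs))  # 1.75 bits per message on average
```

A source that always sends the same message has entropy 0, which echoes GBDixon's zip-file observation: predictable text compresses well, while a stream already packed with information is statistically indistinguishable from random data.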
In fact an entire human can, theoretically, be reduced to quantum information and teleported to another location in the universe:
Quantum Teleportation Of A Human? – video https://www.youtube.com/watch?v=yfePpMTbFYY

Will Human Teleportation Ever Be Possible? As experiments in relocating particles advance, will we be able to say, "Beam me up, Scotty" one day soon? By Corey S. Powell - Monday, June 16, 2014 Excerpt: Note a fascinating common thread through all these possibilities. Whether you regard yourself as a pile of atoms, a DNA sequence, a series of sensory inputs or an elaborate computer file, in all of these interpretations you are nothing but a stack of data. According to the principle of unitarity, quantum information is never lost. Put them together, and those two statements lead to a staggering corollary: At the most fundamental level, the laws of physics say you are immortal. http://discovermagazine.com/2014/julyaug/20-the-ups-and-downs-of-teleportation
Moreover, this quantum entanglement/information, by which both classical information and energy are reduced to quantum information, is now found in molecular biology on a massive scale, in every DNA and protein molecule.
Classical and Quantum Information in DNA – Elisabeth Rieper – video (Longitudinal Quantum Information along the entire length of DNA discussed at the 19:30 minute mark; at 24:00 minute mark Dr Rieper remarks that practically the whole DNA molecule can be viewed as quantum information with classical information embedded within it) https://youtu.be/2nqHOnVTxJE?t=1176

Classical and Quantum Information Channels in Protein Chain - Dj. Koruga, A. Tomić, Z. Ratkaj, L. Matija - 2006 Abstract: Investigation of the properties of peptide plane in protein chain from both classical and quantum approach is presented. We calculated interatomic force constants for peptide plane and hydrogen bonds between peptide planes in protein chain. On the basis of force constants, displacements of each atom in peptide plane, and time of action we found that the value of the peptide plane action is close to the Planck constant. This indicates that peptide plane from the energy viewpoint possesses synergetic classical/quantum properties. Consideration of peptide planes in protein chain from information viewpoint also shows that protein chain possesses classical and quantum properties. So, it appears that protein chain behaves as a triple dual system: (1) structural - amino acids and peptide planes, (2) energy - classical and quantum state, and (3) information - classical and quantum coding. Based on experimental facts of protein chain, we proposed from the structure-energy-information viewpoint its synergetic code system. http://www.scientific.net/MSF.518.491

Quantum coherent-like state observed in a biological protein for the first time - October 13, 2015 Excerpt: If you take certain atoms and make them almost as cold as they possibly can be, the atoms will fuse into a collective low-energy quantum state called a Bose-Einstein condensate. In 1968 physicist Herbert Fröhlich predicted that a similar process at a much higher temperature could concentrate all of the vibrational energy in a biological protein into its lowest-frequency vibrational mode. Now scientists in Sweden and Germany have the first experimental evidence of such so-called Fröhlich condensation (in proteins).,,, The real-world support for Fröhlich's theory (for proteins) took so long to obtain because of the technical challenges of the experiment, Katona said. per physorg
And, unlike classical physics where it is matter and energy that are conserved, in quantum mechanics it is quantum information that is conserved.
Quantum no-hiding theorem experimentally confirmed for first time Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed. This concept stems from two fundamental theorems of quantum mechanics: the no-cloning theorem and the no-deleting theorem. A third and related theorem, called the no-hiding theorem, addresses information loss in the quantum world. According to the no-hiding theorem, if information is missing from one system (which may happen when the system interacts with the environment), then the information is simply residing somewhere else in the Universe; in other words, the missing information cannot be hidden in the correlations between a system and its environment. http://www.physorg.com/news/2011-03-quantum-no-hiding-theorem-experimentally.html

Quantum no-deleting theorem Excerpt: A stronger version of the no-cloning theorem and the no-deleting theorem provide permanence to quantum information. To create a copy one must import the information from some part of the universe and to delete a state one needs to export it to another part of the universe where it will continue to exist. per wikipedia
Besides providing direct empirical falsification of neo-Darwinian claims as to the generation of information from a material basis (i.e., that information is merely 'emergent' in Darwinian thinking), the implication of finding 'non-local', beyond space and time, and 'conserved' quantum information in molecular biology on a massive scale is fairly, and pleasantly, obvious:
Does Quantum Biology Support A Quantum Soul? – Stuart Hameroff - video https://www.youtube.com/watch?v=iIyEjh6ef_8

Quantum Entangled Consciousness - Life After Death - Stuart Hameroff - video https://www.youtube.com/watch?v=jjpEc98o_Oo

The Case for the Soul (Near-Death Experiences) - video (Quantum Entangled consciousness and conservation of quantum information discussed at 9:00 minute mark) https://www.youtube.com/watch?v=rlBO0Y9GJhk
Verse and Music:
John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things were made through Him, and without Him nothing was made that was made. In Him was life, and the life was the light of men.

Moriah Peters - You Carry Me - music https://www.youtube.com/watch?v=x2H-zQjgurQ
What is information anyway? Although precisely defining what type of information you are dealing with is certainly very important, I hold that showing information to be physically real is more important as far as physical science itself is concerned.
"Information is information, not matter or energy. No materialism which does not admit this can survive at the present day." Norbert Wiener - MIT Mathematician - (Cybernetics, 2nd edition, p.132) Norbert Wiener created the modern field of control and communication systems, utilizing concepts like negative feedback. His seminal 1948 book Cybernetics both defined and named the new field.
Classical information was shown to have a 'thermodynamic content' in the following experiment.
Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy - November 2010 Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the "Maxwell demon" thought experiment devised in 1867.,,, In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html

Demonic device converts information to energy - 2010 Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform
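The standard theoretical bound behind experiments like this (the Landauer/Szilard limit, a textbook result rather than a figure from the Sano paper itself) is that one bit of information is worth at most k_B·T·ln 2 joules of extractable work at temperature T. A quick back-of-the-envelope sketch in Python:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def energy_per_bit(temperature_kelvin: float) -> float:
    """Maximum work extractable from one bit of information
    at the given temperature: k_B * T * ln(2) joules."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) one bit is worth roughly 2.87e-21 J:
print(energy_per_bit(300.0))
```

The tiny magnitude of this number is why, as the excerpt notes, the conversion could only be demonstrated in a carefully controlled nano-scale experiment.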
Dr. Andy C. McIntosh, who is the Professor of Thermodynamics Combustion Theory at the University of Leeds (the highest teaching/research rank in U.K. university hierarchy), has written a peer-reviewed paper in which he holds that it is 'non-material information' which is constraining the local thermodynamics of a cell to be in such an extremely high non-equilibrium state:
Information and entropy – top-down or bottom-up development in living systems? Excerpt: This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate. A.C. McINTOSH - Dr Andy C. McIntosh is the Professor of Thermodynamics Combustion Theory at the University of Leeds. (the highest teaching/research rank in U.K. university hierarchy) http://journals.witpress.com/paperinfo.asp?pid=420
Moreover, Dr. McIntosh holds that regarding information as independent of energy and matter 'resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions'.
Information and Thermodynamics in Living Systems - Andy C. McIntosh - 2013 Excerpt: ,,, information is in fact non-material and that the coded information systems (such as, but not restricted to the coding of DNA in all living systems) is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which despite the efforts from alternative paradigms has not given a satisfactory explanation of the way information in systems operates.,,, http://www.worldscientific.com/doi/abs/10.1142/9789814508728_0008
Here is a recent video by Dr. Giem, that gets the main points of Dr. McIntosh’s paper over very well for the layperson:
Biological Information – Information and Thermodynamics in Living Systems 11-22-2014 by Paul Giem (A. McIntosh) – video https://www.youtube.com/watch?v=IR_r6mFdwQM
Moreover, this classical information, which is shown to possess a 'thermodynamic content' and is thus shown to be physically real, is also shown to be a subset of Quantum Information. First, it is important to learn that Quantum Entanglement and Quantum Information are two sides of the same coin:
Quantum Entanglement and Information Quantum entanglement is a physical resource, like energy, associated with the peculiar nonclassical correlations that are possible between separated quantum systems. Entanglement can be measured, transformed, and purified. A pair of quantum systems in an entangled state can be used as a quantum information channel to perform computational and cryptographic tasks that are impossible for classical systems. The general study of the information-processing capabilities of quantum systems is the subject of quantum information theory. http://plato.stanford.edu/entries/qt-entangle/
And classical information, by the use of quantum entanglement, can be reduced to quantum information:
Quantum knowledge cools computers: New understanding of entropy – June 2011 Excerpt: No heat, even a cooling effect; In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.” http://www.sciencedaily.com/releases/2011/06/110601134300.htm
It should be noted that, besides showing classical information to be a subset of quantum information, the preceding study also falsified the late Rolf Landauer's decades-old contention that the information encoded in a computer is merely 'physical' (merely 'emergent' from a material basis), since he believed it always required energy to erase classical information from a computer. Moreover, as if that were not enough to prove the physical reality of information independent of matter and energy, physicists have now reduced matter itself, via quantum teleportation, to quantum information. (Of note: energy is completely reduced to quantum information, whereas matter is only semi-completely reduced, with the caveat that matter can be reduced to energy via e=mc2.)
How Teleportation Will Work - Excerpt: In 1993, the idea of teleportation moved out of the realm of science fiction and into the world of theoretical possibility. It was then that physicist Charles Bennett and a team of researchers at IBM confirmed that quantum teleportation was possible, but only if the original object being teleported was destroyed. — As predicted, the original photon no longer existed once the replica was made. http://science.howstuffworks.com/science-vs-myth/everyday-myths/teleportation1.htm

Quantum Teleportation – IBM Research Page Excerpt: “it would destroy the original (photon) in the process,,” http://researcher.ibm.com/view_project.php?id=2862
Some observations...

1. Information is always information about something.
2. Meaningless information is an oxymoron.

Mung
