
The First Gene: An information theory look at the origin of life


Here is The First Gene: The Birth of Programming, Messaging and Formal Control, edited by David Abel:

“The First Gene: The Birth of Programming, Messaging and Formal Control” is a peer-reviewed anthology of papers that focuses, for the first time, entirely on the following difficult scientific questions:

* How did physics and chemistry write the first genetic instructions?
* How could a prebiotic (pre-life, inanimate) environment consisting of nothing but chance and necessity have programmed logic gates, decision nodes, configurable-switch settings, and prescriptive information using a symbolic system of codons (three nucleotides per unit/block of code)? The codon table is formal, not physical. It has also been shown to be conceptually ideal.
* How did primordial nature know how to write in redundancy codes that maximally protect information?
* How did mere physics encode and decode linear digital instructions that are not determined by physical interactions?

All known life is networked and cybernetic. “Cybernetics” is the study of various means of steering, organizing and controlling objects and events toward producing utility. The constraints of initial conditions and the physical laws themselves are blind and indifferent to functional success. Only controls, not constraints, steer events toward the goal of usefulness (e.g., becoming alive or staying alive). Life-origin science cannot advance until first answering these questions:

1. How does nonphysical programming arise out of physicality to then establish control over that physicality?
2. How did inanimate nature give rise to a formally-directed, linear, digital, symbol-based and cybernetic-rich life?
3. What are the necessary and sufficient conditions for turning physics and chemistry into formal controls, regulation, organization, engineering, and computational feats?

“The First Gene” directly addresses these questions.
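To make the blurb's "symbolic system of codons" concrete: translation can be modeled as a lookup table from three-letter codons to amino acids. Here is a minimal Python sketch (the table is a hand-picked fragment of the standard genetic code, included for illustration only):

```python
# A fragment of the standard genetic code: each three-letter codon
# is a symbol that maps to an amino acid (or a stop signal).
CODON_TABLE = {
    "ATG": "Met",  # also the usual start codon
    "TGG": "Trp",
    "GAA": "Glu", "GAG": "Glu",   # redundancy: two codons, one amino acid
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna: str) -> list[str]:
    """Read a DNA string three letters at a time and decode each codon."""
    codons = [dna[i:i + 3] for i in range(0, len(dna) - 2, 3)]
    return [CODON_TABLE.get(c, "???") for c in codons]

print(translate("ATGGAATGGTAA"))  # ['Met', 'Glu', 'Trp', 'STOP']
```

The point the book presses is that the mapping itself is a convention carried by the translation machinery; this is what the blurb means by "the codon table is formal, not physical."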

As we write, it is #2 in biophysics, and the trolls haven’t even got there yet.

Here’s Casey Luskin’s review:

Materialists Beware: The First Gene Defends a Strictly Scientific, Non-Materialist Conception of Biological Origins:

The First Gene investigates a number of different types of information that we find in nature, including prescriptive information, semantic information, and Shannon information. Prescriptive information is what directs our choices, and it is a form of semantic information — which is a type of functional information. In contrast, Shannon information, according to Abel, shouldn’t even be called “information” because it’s really a measure of a reduction in uncertainty, and by itself cannot do anything to “prescribe or generate formal function.” (p. 11) Making arguments similar to those embodied in Dembski’s law of conservation of information, Abel argues that “Shannon uncertainty cannot progress to becoming [Functional Information] without smuggling in positive information from an external source.” (p. 12) The highest form of information, however, is prescriptive information.
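Abel's usage point is easy to check computationally: Shannon's measure quantifies the statistical uncertainty of a symbol stream and is indifferent to what, if anything, a sequence does. A minimal Python sketch (my own illustration, not from the book):

```python
import math
from collections import Counter

def shannon_entropy(seq: str) -> float:
    """Empirical Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

print(shannon_entropy("ACGTACGTACGTACGT"))  # 2.0: maximal for a 4-symbol alphabet
print(shannon_entropy("AAAAAAAAAAAAAAAA"))  # zero bits: no uncertainty at all
# The number reflects only the symbol statistics; it cannot say whether
# either string "prescribes or generates formal function."
```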

Comments
Joseph, you are exactly right, which is why I don't have a problem with calling it "Shannon information." I think what Abel is getting at, however, is that the general public, including many materialists who are attempting to attack ID, use the concept of Shannon "information" precisely in that ordinary usage sense. This demonstrates that they don't understand what Shannon information is, but the term is still a problematic rhetorical hatchet when used wrongly.

My guess is that Abel thinks we'd be better off continuing to use the word "information" the way most people think of it and, therefore, coming up with some other term for Shannon "information." Based on what we know about information today, I have to agree: Shannon probably should not have referred to it as "information," as that just confuses the issue and makes it seem that his theory covers more than it does.

Or we can continue to call it "Shannon information" and then at every turn explain to people (as Weaver had to do) that we really aren't talking about information in the sense that anybody typically uses the word. Oh, well. I don't suppose there is any possibility of changing the term now, so we'll just have to keep using it and educate people that it doesn't mean what they might think at first.

Eric Anderson
November 18, 2011 at 08:34 AM PDT
Further note:
Three subsets of sequence complexity and their relevance to biopolymeric information - Abel, Trevors
Excerpt: Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC). ...

Testable hypotheses about FSC: What testable empirical hypotheses can we make about FSC that might allow us to identify when FSC exists? In any of the following null hypotheses [137], demonstrating a single exception would allow falsification. We invite assistance in the falsification of any of the following null hypotheses:

* Null hypothesis #1: Stochastic ensembles of physical units cannot program algorithmic/cybernetic function.
* Null hypothesis #2: Dynamically-ordered sequences of individual physical units (physicality patterned by natural law causation) cannot program algorithmic/cybernetic function.
* Null hypothesis #3: Statistically weighted means (e.g., increased availability of certain units in the polymerization environment) giving rise to patterned (compressible) sequences of units cannot program algorithmic/cybernetic function.
* Null hypothesis #4: Computationally successful configurable switches cannot be set by chance, necessity, or any combination of the two, even over large periods of time.

We repeat that a single incident of nontrivial algorithmic programming success achieved without selection for fitness at the decision-node programming level would falsify any of these null hypotheses. This renders each of these hypotheses scientifically testable. We offer the prediction that none of these four hypotheses will be falsified.
http://www.tbiomed.com/content/2/1/29
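To make the RSC/OSC terminology concrete, here is a small Python sketch (my own illustration, not from the Abel-Trevors paper). An ordered, law-like sequence and a random one can share the same single-symbol Shannon entropy yet differ sharply in compressibility, and neither measure says anything about function (FSC):

```python
import math
import random
import zlib
from collections import Counter

def entropy(seq: str) -> float:
    """Empirical Shannon entropy in bits per symbol."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

random.seed(0)
ordered = "ACGT" * 250                                         # OSC-like: law-like repetition
rand = "".join(random.choice("ACGT") for _ in range(1000))     # RSC-like: stochastic

for name, seq in (("ordered (OSC-like)", ordered), ("random (RSC-like)", rand)):
    print(f"{name}: {entropy(seq):.2f} bits/symbol, "
          f"compressed to {len(zlib.compress(seq.encode()))} bytes")
# Both score about 2.00 bits/symbol, but the ordered string compresses to a
# handful of bytes while the random one does not. Neither statistic tests
# whether a sequence encodes a working instruction set.
```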
One part of Shannon information theory is nonetheless useful for Intelligent Design. Claude Shannon's work on the 'communication of information' actually supports Intelligent Design, since the first 'optimal' DNA code had to be at least as complex as the present 'optimal' DNA code we find in life, as the following video and quotes illustrate:
Shannon Information - Channel Capacity - Perry Marshall - video
http://www.metacafe.com/watch/5457552/

“Because of Shannon channel capacity that previous (first) codon alphabet had to be at least as complex as the current codon alphabet (DNA code), otherwise transferring the information from the simpler alphabet into the current alphabet would have been mathematically impossible” - Donald E. Johnson, Bioinformatics: The Information in Life

Deciphering Design in the Genetic Code
Excerpt: When researchers calculated the error-minimization capacity of one million randomly generated genetic codes, they discovered that the error-minimization values formed a distribution where the naturally occurring genetic code's capacity occurred outside the distribution. Researchers estimate the existence of 10^18 possible genetic codes possessing the same type and degree of redundancy as the universal genetic code. All of these codes fall within the error-minimization distribution. This finding means that of the 10^18 possible genetic codes, few, if any, have an error-minimization capacity that approaches the code found universally in nature.
http://www.reasons.org/biology/biochemical-design/fyi-id-dna-deciphering-design-genetic-code
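A back-of-the-envelope version of the capacity point (my own arithmetic, not from Johnson's book): in the noiseless case a single symbol can carry at most log2(alphabet size) bits, so a message coded at a given rate cannot be re-expressed one-for-one in an alphabet with lower per-symbol capacity:

```python
import math

# 4 nucleotides read in triplets give 4**3 = 64 codon symbols;
# a hypothetical simpler doublet code would have 4**2 = 16 symbols.
codon_bits = math.log2(64)    # 6.0 bits per codon, maximum
doublet_bits = math.log2(16)  # 4.0 bits per doublet, maximum

print(codon_bits, doublet_bits)
# Lossless translation from the 6-bit alphabet into the 4-bit alphabet
# needs at least 6/4 = 1.5 target symbols per source symbol; the smaller
# alphabet cannot carry the same information per symbol.
print(codon_bits / doublet_bits)  # 1.5
```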
Perhaps it's time for Richard Dawkins to call on his extraterrestrial designers? Video and music:
Richard Dawkins admits to Intelligent Design - video
http://www.youtube.com/watch?v=BoncJBrrdQ8

AWOLNATION - "SAIL" (Official Video)
http://www.youtube.com/watch?v=PPtSKimbjOU
bornagain77
November 18, 2011 at 07:45 AM PDT
Notes:
Programming of Life - Information - Shannon, Functional & Prescriptive - video
http://www.youtube.com/user/Programmingoflife#p/c/AFDF33F11E2FB840/1/h3s1BXfZ-3w

The Capabilities of Chaos and Complexity: David L. Abel - Null Hypothesis For Information Generation - 2009
To focus the scientific community's attention on its own tendencies toward overzealous metaphysical imagination bordering on "wish-fulfillment," we propose the following readily falsifiable null hypothesis, and invite rigorous experimental attempts to falsify it: "Physicodynamics cannot spontaneously traverse The Cybernetic Cut: physicodynamics alone cannot organize itself into formally functional systems requiring algorithmic optimization, computational halting, and circuit integration." A single exception of nontrivial, unaided spontaneous optimization of formal function by truly natural process would falsify this null hypothesis.
http://www.mdpi.com/1422-0067/10/1/247/pdf

Can We Falsify Any Of The Following Null Hypotheses (For Information Generation)?
1) Mathematical Logic
2) Algorithmic Optimization
3) Cybernetic Programming
4) Computational Halting
5) Integrated Circuits
6) Organization (e.g. homeostatic optimization far from equilibrium)
7) Material Symbol Systems (e.g. genetics)
8) Any Goal Oriented bona fide system
9) Language
10) Formal function of any kind
11) Utilitarian work
http://mdpi.com/1422-0067/10/1/247/ag

The Law of Physicodynamic Insufficiency - Dr David L. Abel - November 2010
Excerpt: "If decision-node programming selections are made randomly or by law rather than with purposeful intent, no non-trivial (sophisticated) function will spontaneously arise." ... After ten years of continual republication of the null hypothesis with appeals for falsification, no falsification has been provided. The time has come to extend this null hypothesis into a formal scientific prediction: "No non-trivial algorithmic/computational utility will ever arise from chance and/or necessity alone."
http://www-qa.scitopics.com/The_Law_of_Physicodynamic_Insufficiency.html

The Law of Physicodynamic Incompleteness - David L. Abel - August 2011
Summary: "The Law of Physicodynamic Incompleteness" states that inanimate physicodynamics is completely inadequate to generate, or even explain, the mathematical nature of physical interactions (the purely formal laws of physics and chemistry). The Law further states that physicodynamic factors cannot cause formal processes and procedures leading to sophisticated function. Chance and necessity alone cannot steer, program or optimize algorithmic/computational success to provide desired non-trivial utility.
http://www.scitopics.com/The_Law_of_Physicodynamic_Incompleteness.html
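As a toy illustration of the "decision-node" and "configurable switch" language in these null hypotheses (my own sketch, not Abel's): if a function depends on one specific setting of n independent binary switches, blind sampling hits it with probability 2^-n per trial, so the expected search time doubles with every added switch:

```python
import random

def random_search(target: tuple, max_trials: int) -> int | None:
    """Blindly set n binary switches until the target configuration appears.
    Returns the number of trials used, or None if the budget runs out."""
    n = len(target)
    for trial in range(1, max_trials + 1):
        guess = tuple(random.randrange(2) for _ in range(n))
        if guess == target:
            return trial
    return None

random.seed(1)
target = tuple(random.randrange(2) for _ in range(20))  # n = 20 switches
print(random_search(target, max_trials=5_000_000))
# Expected trial count is about 2**20, roughly one million.
```

Doubling n from 20 to 40 multiplies the expected trial count by about a million; at n = 100 it is roughly 1.3e30. Whether this toy model is a fair picture of prebiotic chemistry is, of course, the question the null hypotheses put on the table.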
bornagain77
November 18, 2011 at 07:31 AM PDT
In contrast, Shannon information, according to Abel, shouldn't even be called "information" because it's really a measure of a reduction in uncertainty, and by itself cannot do anything to "prescribe or generate formal function":
"The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning." - Warren Weaver, one of Shannon's collaborators
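Weaver's distinction can be demonstrated directly: shuffling the characters of a sentence leaves its symbol frequencies, and therefore its per-character Shannon entropy, exactly unchanged while destroying the meaning. A quick Python sketch (my own illustration):

```python
import random
from collections import Counter

sentence = "information must not be confused with meaning"
chars = list(sentence)
random.seed(0)
random.shuffle(chars)             # same characters, new order
shuffled = "".join(chars)

# Identical symbol frequencies imply identical Shannon entropy, by definition:
print(Counter(sentence) == Counter(shuffled))  # True
print(shuffled)                                # gibberish: the meaning is gone
```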
Joseph
November 18, 2011 at 07:18 AM PDT