Uncommon Descent Serving The Intelligent Design Community

Lee Spetner on evolution and information

From Lee Spetner, author of The Evolution Revolution at True Origin:

Many years ago I published a paper that pointed out that the evolutionary process can be characterized as a transmission of information from the environment to the genome (Spetner 1964). Somewhat later I wrote that there is no known random mutation that adds information to the genome (Spetner 1997). This statement in one form or another has found its way into the debate on evolution versus creation. Evolutionists have distorted the statement to attack it, as in Claim CB102, where Isaak has written his target for attack as, ‘Mutations are random noise; they do not add information. Evolution cannot cause an increase in information.’ Perhaps something like this statement was indeed made in an argument by someone, but Isaak has distorted its meaning. For his ‘refutation’ he writes the following 4 points (the references are his citations): More.

See also: Lee Spetner answers his critics

Comments
When it comes to nucleotides, they're not too far away from equiprobability. One can adjust for this lack of equiprobability if one chooses. But using Hartley's formula gets you close enough to what you need to know, unless you're not interested in 'knowing.' Newton's formula for gravitation is not entirely correct, but it's a great approximation, and it saves one from the tedium of using Einstein's equations to solve the same problem. They don't use General Relativity when sending probes out into space, but simple Newtonian gravity. It's close enough. Same here. Again, unless you don't want to concede a thing.

PaV
August 12, 2017 at 07:45 AM PDT
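To put a number on the approximation PaV describes, here is a minimal Python sketch (the nucleotide frequencies are illustrative assumptions, slightly skewed from 25% each) comparing Hartley's equiprobable measure with the exact Shannon entropy:

```python
import math

def hartley_bits(num_symbols):
    """Hartley's measure: log2 of the alphabet size, assuming equiprobable symbols."""
    return math.log2(num_symbols)

def shannon_bits(probs):
    """Shannon entropy in bits: -sum(p * log2 p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative (assumed) nucleotide frequencies for A, T, G, C:
freqs = [0.30, 0.30, 0.20, 0.20]

print(hartley_bits(4))      # 2.0 bits per nucleotide (the approximation)
print(shannon_bits(freqs))  # ~1.97 bits per nucleotide (the exact figure)
```

For frequencies this close to equiprobable, the Hartley figure is within about 1.5% of the exact one, which is the sense in which it is 'close enough.'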
PaV @ 157: letters in the English language are one example.

Bob O'H
August 12, 2017 at 02:56 AM PDT
PaV:
How, where, and in what ways is 'equiprobability' lost?
When events are not equally probable.

Mung
August 11, 2017 at 11:30 PM PDT
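A minimal sketch of Mung's point, with made-up probabilities: when the four symbols are equiprobable, Shannon's entropy reproduces Hartley's log2(4) = 2 bits, and when they are heavily skewed it does not:

```python
import math

def shannon_bits(probs):
    """Shannon entropy in bits: -sum(p * log2 p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equiprobable: agrees with Hartley's log2(4) = 2 bits per symbol.
print(shannon_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Heavily skewed: Hartley would still say 2 bits, but the average
# information per symbol collapses.
print(shannon_bits([0.97, 0.01, 0.01, 0.01]))  # ~0.24
```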
ET:
Why not? We can use Shannon’s methodology to calculate the number of bits it contains.
Shannon specifically cites his reliance on Hartley's measure of information. Shannon's measure is useful, no doubt, but it is principally concerned with electronic communication and is not built upon the notions of human communication per se; whereas Hartley's starting point is always 'human': he basically uses the alphabet as the source of 'symbols' and the telegraph operator as the 'selector.'

PaV
August 11, 2017 at 07:44 PM PDT
Bob O'H:
As long as you have equiprobability. Once you lose that, I’m not sure if it’s useful.
How, where, and in what ways is 'equiprobability' lost?

PaV
August 11, 2017 at 07:39 PM PDT
Mung:
Hartley information, Shannon information, Dembski information, Spetner information, Gitt information, ET information... Will it ever end!
You forgot Mung misinformation...

ET
August 11, 2017 at 08:08 AM PDT
Mung:
It just keeps getting better and better. IIRC, Gitt defined seven different kinds of information. Shannon's was only one of them and most certainly does not apply to all of them.
Evidence, please; try making your case. Anything that can be broken down into bits means Shannon's methodology applies.

ET
August 11, 2017 at 07:45 AM PDT
PaV:
Shannon's measure is geared directly to electronic communication, and it is defective in the sense that it doesn't adapt readily to what we, as humans, know to be information (as in, e.g., this sentence I just wrote).
Why not? We can use Shannon's methodology to calculate the number of bits it contains.

ET
August 11, 2017 at 07:43 AM PDT
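As one hedged illustration of ET's claim: a zeroth-order Shannon-style estimate that computes per-character entropy from a sentence's own character frequencies. Different source models give different totals, so this is an estimate, not the unique bit count:

```python
import math
from collections import Counter

def empirical_entropy_bits(text):
    """Per-character entropy estimated from the text's own character frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

sentence = "We can use Shannon's methodology to calculate the number of bits it contains."
h = empirical_entropy_bits(sentence)
print(f"{h:.2f} bits/char, ~{h * len(sentence):.0f} bits total")
```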
Hartley information, Shannon information, Dembski information, Spetner information, Gitt information, ET information... Will it ever end! :)

Mung
August 11, 2017 at 06:58 AM PDT
PaV @ 150 -
The beauty of Hartley's derivation of his "quantitative measure of information" is that it is simple and widely applicable.
As long as you have equiprobability. Once you lose that, I'm not sure if it's useful.

Bob O'H
August 11, 2017 at 03:30 AM PDT
ET:
Using Shannon's methodology we can measure the number of bits in any message, i.e., Gitt information.
It just keeps getting better and better. IIRC, Gitt defined seven different kinds of information. Shannon's was only one of them and most certainly does not apply to all of them.

Mung
August 11, 2017 at 03:01 AM PDT
ET: The beauty of Hartley's derivation of his "quantitative measure of information" is that it is simple and widely applicable. Shannon's measure is geared directly to electronic communication, and it is defective in the sense that it doesn't adapt readily to what we, as humans, know to be information (as in, e.g., this sentence I just wrote). That Hartley's derivation predates Shannon's lends it a certain authority. Further, it is a definition that was given its meaning before there ever was a creationist debate, and before 'information' was ever considered an attack on neo-Darwinism, which makes it a good and sufficient definition of information. Here at UD, we would be wise to find an effective way of using it. It is a more basic and intuitive measure of information.

PaV
August 11, 2017 at 01:14 AM PDT
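For readers who want the derivation PaV praises made concrete: a minimal sketch of Hartley's measure and of the additivity property that motivates the logarithm:

```python
import math

# Hartley (1928): a message of n symbols drawn from an alphabet of s symbols
# has s**n possibilities. Taking the logarithm makes the measure additive in
# message length: H = n * log2(s).

def hartley_information(n_symbols, alphabet_size):
    return n_symbols * math.log2(alphabet_size)

print(hartley_information(10, 26))     # one 10-letter message: ~47.0 bits
print(2 * hartley_information(5, 26))  # two 5-letter messages: the same total
```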
The point, of course, is that the genetic information we can measure is material: DNA, RNAs, proteins. Materialists think they have a monopoly on material things. Being able to measure the amount of functional, material, genetic information required for the most basic living organism body-slams that notion. And a new slogan is born: "Intelligent Design: using material information to defeat materialism, one bit at a time."

ET
August 10, 2017 at 09:55 PM PDT
ET, Yes, that's a valid point.

Dionisio
August 10, 2017 at 08:56 PM PDT
Agreed, but we measure what we can.

ET
August 10, 2017 at 08:44 PM PDT
ET: There is more complex functionally specified information beyond the genome. Genes are just part of the whole enchilada.

Dionisio
August 10, 2017 at 08:28 PM PDT
Dionisio:
Isn't the complex functionally specified information seen in biological systems beyond the scope of the Shannon information concept?
No. We can calculate the number of bits in any given gene using Shannon's methodology. We then compare that to the number of possible sequences that achieve the same end. From there we check the similarities of those sequences. Then we recalculate the probabilities given the possible number of sequences of that length versus the number that give you protein x for function y.

ET
August 10, 2017 at 06:41 PM PDT
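A rough sketch of the procedure ET outlines, with loudly hypothetical numbers: the gene length is arbitrary, and the count of functional sequences is an assumption for illustration, not data:

```python
import math

gene_length = 300                 # nucleotides; a short, hypothetical gene
capacity_bits = 2 * gene_length   # 4 bases -> log2(4) = 2 bits per position

total_sequences = 4 ** gene_length
# Hypothetical count of sequences of this length that yield protein x
# with function y (an assumed figure, purely for illustration):
functional_sequences = 10 ** 40

p_functional = functional_sequences / total_sequences
print(capacity_bits)             # 600 bits of carrying capacity
print(-math.log2(p_functional))  # ~467 bits implied by the functional fraction
```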
Mung, Using Shannon's methodology we can measure the number of bits in any message, i.e., Gitt information. Meyer's point about information carrying capacity is that even if we don't know the meaning, we can still quantify what's there. That is, even if we are receiving (seemingly) random characters we can still quantify them, and then at least we know the maximum length of any possible Gitt information they contain. That said, if we do know the meaning, the quantification process is already started: you know the maximum possible number of bits of Gitt information present.

Does information theory give us a quantitative measure of information, or does it give us a quantitative way to measure information carrying capacity?

Again, the two are not mutually exclusive. We look at a sequence of DNA and we can calculate its information carrying capacity. From there we see whether said DNA codes for a protein. If so, then we know we are working with Gitt information, but we can still use Shannon to quantify it. From there you can narrow it down via data compression and redundancies in sequences. So what kind of information does DNA contain, Shannon information or specified information [colloquial use]? Mere complexity or specified complexity? The answer is: both. Say you have an important Gitt message to send but a limited number of bits. You would use Shannon's methodology to see how long your message was, so you could tell whether it fit within the transmission limit. The machines don't care about the meaning, just the length. The sender and receiver care about the meaning.

ET
August 10, 2017 at 06:32 PM PDT
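A minimal sketch of the 'narrow it down via data compression' step ET mentions: the raw length of a message gives its carrying capacity, while its compressed size gives an upper bound on how much of that capacity is actually used:

```python
import os
import zlib

repetitive = b"ATATATATAT" * 100  # 1000 bytes, highly redundant
random_ish = os.urandom(1000)     # 1000 bytes, essentially incompressible

for label, msg in [("repetitive", repetitive), ("random", random_ish)]:
    capacity_bits = len(msg) * 8                      # raw carrying capacity
    compressed_bits = len(zlib.compress(msg, 9)) * 8  # upper bound on content
    print(label, capacity_bits, compressed_bits)
```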
> Information theory provided a way to measure information carrying capacity (Meyer).

I do wish you would make up your mind, ET. Does information theory give us a quantitative measure of information, or does it give us a quantitative way to measure information carrying capacity? The two are not the same.

Mung
August 10, 2017 at 03:53 PM PDT
> Do you know what the word "and" means, Mung?

Like the word 'information', it probably has more than one meaning.

Mung
August 10, 2017 at 03:50 PM PDT
Not everything observed in scientific research is quantifiable/measurable by the currently available methods. Newer measuring approaches must be realized, or we must humbly accept the fact that we won't be able to rationally measure all the processes we observe. Certain things may remain unmeasurable/unquantifiable by our intellectual capacity.

Dionisio
August 10, 2017 at 02:40 PM PDT
Isn't the complex functionally specified information seen in biological systems beyond the scope of the Shannon information concept? Aren't those different categories of informational complexity associated with procedural functionality? Isn't it like mixing classical and quantum mechanics? Or maybe it's even worse than that? Perhaps one can measure certain aspects of biological information, as GP has done so well in his recent OPs about the functional information jumps in proteins. But that's not the whole enchilada in biology. How do we measure the complex functionally specified informational complexity associated with morphogenesis, gastrulation, neurulation? Are you guys joking? Perhaps it's tempting to measure things when taking a reductionist, bottom-up, reverse-engineering approach to research. But that's just a very limited descriptive measurement, which leaves complex functionality out of the big picture.

Dionisio
August 10, 2017 at 02:15 PM PDT
Do you know what the word "and" means, Mung? Information theory provided a way to measure information carrying capacity (Meyer). Functionality provides us with meaning, i.e., information in the normal sense. "So what kind of information does DNA contain, Shannon information or specified information [colloquial use]? Mere complexity or specified complexity? The answer is: both." Do you understand what he means by that? You have IT on one hand, and everything else Spetner says about information on the other. And then you combine the two.

ET
August 10, 2017 at 01:55 PM PDT
> And in the Preface he introduced Information Theory as a quantification process.

We already know that the information theory definition is one definition of information. So we haven't gained any information by reading Spetner.

Mung
August 10, 2017 at 01:33 PM PDT
"Not By Chance" chapter 2 Information and Life, covers it, Mung. And in the Preface he introduced Information Theory as a quantification process. "the Evolution Revolution", chapter 1 Evolution and Information covers it also.ET
August 10, 2017
August
08
Aug
10
10
2017
12:02 PM
12
12
02
PM
PDT
Even in his books Spetner does not define the sense in which he is using the term. So it's no surprise that he doesn't do so in his article either.

Mung
August 10, 2017 at 10:12 AM PDT
Mung: Yes, Bob O'H was right that Spetner did not define "information" in his linked article. However, Spetner did define it in his books. He even mentioned Shannon's link to genetic information. Also, Meyer's quote proves that you were wrong about me. But don't worry, I don't expect you to apologize for that; I know that is not your style.

ET
August 10, 2017 at 08:24 AM PDT
Yes, that quote from Meyer is great. It highlights the fact that if you're going to talk about 'information' and DNA, you need to be clear about which definition of information you are using. So Bob O'H was right all along.

Mung
August 10, 2017 at 07:46 AM PDT
So what kind of information does DNA contain, Shannon information or specified information [colloquial use]? Mere complexity or specified complexity? The answer is: both. First, DNA certainly does have a quantifiable amount of information carrying capacity as measured by Shannon's theory. ... Thus molecular biologists beginning with Francis Crick have equated biological information not only with improbability (or complexity), but also with "specificity", where "specificity" or "specified" has meant "necessary to function." Thus, in addition to a quantifiable amount of Shannon information (or complexity), DNA also contains information in the sense of Webster's second definition: it contains "alternative sequences or arrangements of something that produce a specific effect." ...

That's from Meyer, Stephen C., "Signature in the Cell", pages 108-109. And that is what Spetner was talking about: Shannon gave us a methodology to measure it, and function is an instance of information in its colloquial use.

ET
August 9, 2017 at 03:41 PM PDT
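A toy calculation of the first half of Meyer's distinction (the protein length is a hypothetical choice): the Shannon carrying capacity of a protein sequence follows directly from the alphabet size, while the second, 'specified' half is an experimental question that sequence statistics alone cannot settle:

```python
import math

# Carrying capacity in the Shannon sense, at the protein level:
# 20 possible amino acids per position -> log2(20) ~ 4.32 bits each.
protein_length = 100                   # hypothetical short protein
print(protein_length * math.log2(20))  # ~432 bits of carrying capacity

# The "specified" sense (whether the sequence is necessary to function)
# cannot be read off this number: a random 100-residue sequence has the
# same carrying capacity as a functional one.
```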
ET, I see your point too. Thanks. BTW, where are the other 8 types? :)

Dionisio
August 9, 2017 at 11:30 AM PDT