Uncommon Descent Serving The Intelligent Design Community

Lee Spetner on evolution and information

From Lee Spetner, author of The Evolution Revolution at True Origin:

Many years ago I published a paper that pointed out that the evolutionary process can be characterized as a transmission of information from the environment to the genome (Spetner 1964). Somewhat later I wrote that there is no known random mutation that adds information to the genome (Spetner 1997). This statement in one form or another has found its way into the debate on evolution versus creation. Evolutionists have distorted the statement to attack it, as in Claim CB102, where Isaak has written his target for attack as, ‘Mutations are random noise; they do not add information. Evolution cannot cause an increase in information.’ Perhaps something like this statement was indeed made in an argument by someone, but Isaak has distorted its meaning. For his ‘refutation’ he writes the following 4 points (the references are his citations): More.

See also: Lee Spetner answers his critics

Comments
"There are 10 type of people in this world- those who understand binary and those who don't" - anonymous "nobody" from the ARN forumET
August 9, 2017
August
08
Aug
9
09
2017
08:55 AM
8
08
55
AM
PDT
mike1962 made a valid point: In most cases symbols represent certain information if they are set according to the same protocol used by the producer/sender/transmitter and the receiver of the symbols. In Mung's example @128, the symbols 1 and 0 represent tiny electric impulses of predetermined amplitudes. Those impulses are used to activate and operate the microprocessor logic, which is designed around a binary code, not around the actual physical value of the electric impulses. However, the amplitude and frequency of the electric impulses must match the circuit parameters that correspond to the activation and/or resetting of the gates that form the logic tables, which map the outputs resulting from the different possible input cases handled by the designed circuits. In embryonic development, the fascinating morphogenesis seems like a process where molecular signaling profiles --aka morphogen gradients-- are formed following a complex choreography that guarantees the correct interpretation by the receiving cells, thus determining their individual fate. The localization/delocalization of the morphogen sources, their secretion rate, the type of morphogen, their different modes of transportation, their degradation rate: all of that affects the resulting organs or tissues. Complex functionally specified informational complexity on steroids. When those of us who have spent years working on software development for complex engineering design projects watch in awe the marvelous cellular and molecular choreographies orchestrated within biological systems, we are humbled by the realization that the work we used to be so proud of suddenly looks like LEGO toys for toddlers.
Dionisio
August 9, 2017 at 08:51 AM PDT

Without information, symbols are merely marks or noises/sounds. When symbols are created they are endowed with the information agreed upon by the designer and the person or people who will be using them. Mung's 0 and 1 only represent the making of a choice if that was already agreed upon. Otherwise they just represent the numbers zero and one. And carrying information about which choice was made is definitely information in the normal sense of the word.
ET
August 9, 2017 at 08:00 AM PDT

Just think of all the information these two lowly symbols contain! 0 1 If you really stop to think about it, these are not so much carriers of information; rather, they represent the making of a choice. In representing which choice was made, in that way, they could be said to carry information (about which choice was made). But that's certainly not information in the normal sense of the word.
Mung
August 9, 2017 at 07:51 AM PDT

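The idea that a binary symbol records a choice can be made quantitative: in information theory, one symbol chosen from two equally likely alternatives carries log2(2) = 1 bit, and a string of n such symbols can distinguish 2^n distinct messages. A minimal Python sketch of that arithmetic (the example values are illustrative, not from the thread):

```python
import math

# One binary symbol resolves a choice between two equally likely
# alternatives: log2(2) = 1 bit.
bits_per_symbol = math.log2(2)

# A string of n symbols drawn from an alphabet of s symbols can
# distinguish s**n messages, so its capacity is n * log2(s) bits.
def capacity_bits(n_symbols, alphabet_size=2):
    return n_symbols * math.log2(alphabet_size)

print(bits_per_symbol)   # 1.0
print(capacity_bits(8))  # 8.0 (one byte distinguishes 256 messages)
```

Note that this measures carrying capacity only; it says nothing about what the choice means to a receiver, which is the point under dispute in the thread.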
They require a sufficient receiving context for actual information transfer.
They aren't symbols without it. And seeing that symbols did not invent nor define themselves, clearly their information is not self-contained. As for DNA, well, DNA is nothing without the genetic code and everything else to transcribe, edit, process and translate it. As for sending someone a sandwich, that too requires information. You need to know whether the person receiving the sandwich has any allergies or a specific diet. You don't want to send a BLT to an Islamic cleric, or else the meaning may be misconstrued.
ET
August 8, 2017 at 01:08 PM PDT

ET, We can quibble about the terms, and in a certain sense I would say that symbols "contain" or "carry" information, but there is a distinction I'm trying to make. And the distinction is: context. If I have an intention for you to have a sandwich, I can communicate that intention in two ways. I can send you a recipe for how to build a sandwich, or I can send you a sandwich. In the first case, you won't be able to use my recipe if you don't already have a pre-loaded context in which to interpret the symbols. We have to share the same language and experience sufficiently for you to properly interpret the symbols in order to produce my intention. In the second case, no contextual interpretation is necessary. You simply possess the sandwich I sent you. Symbols certainly can be arranged by the sender with intention for a certain outcome, but the symbols are not self-containers of information. They require a sufficient receiving context for actual information transfer. So maybe "self-container" vs. "container" is my quibble. Symbols "contain" information within a context but do not "self-contain" information. I would put DNA in the first category. I would put an entire cell in the second category.
mike1962
August 8, 2017 at 12:27 PM PDT

Hi mike1962- Each symbol contains the information of what it represents. And yes, sometimes symbols can be combined such that the information contained in that combination is greater than that of each symbol. How about this: Symbols trigger a response because they carry/contain information. :cool:
ET
August 8, 2017 at 12:14 PM PDT

ET, What information does this contain?: OBDITMOTNTDBGUTFBTBTFEODTSASEOTDPHTNAGUASTTDBIYDBTLITJATBMHSIT? Clearly these are symbols, but what information is this string of symbols carrying? What about this sentence: Harry is on fire. What information is that string of symbols carrying?
mike1962
August 8, 2017 at 11:47 AM PDT

Symbols contain information. That is what makes them symbols. If a Chinese person sends a message of Kanji characters to me, my curiosity would be triggered and I would try to find out what information, if any, they contained.
ET
August 8, 2017 at 08:54 AM PDT

ET,
So the mRNA (codons) triggers a polypeptide response from the ribosome? Or does the mRNA carry a message to the ribosome to produce this polypeptide?
I would say the former.
Messengers carry messages vs messages trigger responses. The first does seem redundant while the second is a little vague- what (type of) response? Is it the same for everyone? What if the triggered response isn’t the one expected?
If a Chinese person sends a message of Kanji characters to me, nothing will be triggered, because I don't understand Kanji. Nothing was "carried." Only a triggering in me was attempted (and it failed). The success of the message depends on my prior knowledge of Kanji. Without that prior knowledge, no meaning was "carried" to me.
Who expected some kid dressed up like a zombie to answer “I like turtles” to a question about his costume? But hey it got him an appearance on Tosh.0.
I don't understand the reference.
The more important question would be, under your analysis, what triggered that response given it had nothing to do with what the sender intended [sic]?
Symbols can cause unintended triggering, or no triggering at all -- that is, an effect unintended by the sender -- because they can be misunderstood, or not understood at all, by the receiver. The reason is that symbols "carry" nothing. They depend on the receiver's prior knowledge for there to be the proper utilization intended by the sender. Something that actually "carries" information (or anything else) is no longer merely a symbol, but a "container." If I send someone a picture of a sandwich, and they have no idea what a sandwich is, no information, beyond the mere picture itself, is transferred. However, if I send someone a lunch box with a sandwich in it, the receiver will receive the sandwich (and might even eat it, if she is smart enough to figure out that it's food), regardless of the receiver's prior understanding or experience of what a sandwich is. The lunch box is not merely a symbol; it is a container. Codons seem to me to be symbols and not containers.
mike1962
August 8, 2017 at 08:31 AM PDT

Yes D, symbols represent something. That is because they carry information. ;)
ET
August 8, 2017 at 07:37 AM PDT

ET, mike1962: Carry, trigger,...? Interesting... What about represent? Or all of the above in different proportions? Or something else? :) Do traffic lights carry, trigger, or represent information? Stop signs? Morphogen gradients? mRNA? How are the proteins SATB1 and SATB2 in GP's excellent recent thread formed? What are the gene sequences associated with those proteins? How does it go from the gene to the mRNA in those two cases? How does it go from the mRNA to the actual proteins? Perhaps GP could comment on this after he comes back from summer vacation. UB could give us a hand with this too.
Dionisio
August 8, 2017 at 03:12 AM PDT

So the mRNA (codons) triggers a polypeptide response from the ribosome? Or does the mRNA carry a message to the ribosome to produce this polypeptide? Messengers carry messages vs. messages trigger responses. The first does seem redundant, while the second is a little vague- what (type of) response? Is it the same for everyone? What if the triggered response isn't the one expected? Who expected some kid dressed up like a zombie to answer "I like turtles" to a question about his costume? But hey, it got him an appearance on Tosh.0. The more important question would be: under your analysis, what triggered that response, given it had nothing to do with what the sender intended?
ET
August 7, 2017 at 08:46 PM PDT

I would say symbols "carry" nothing. Rather, they trigger a response in the receiver because the receiver knows how to map the symbol to specific meaning intended by the sender. Maybe it's a quibble, but I think "trigger" is a better term.
mike1962
August 7, 2017 at 08:22 PM PDT

Mung:
Symbols can carry information...
Umm, a symbol that doesn't carry any information is not a symbol. If it's a symbol it carries information.
But symbols are not information ...
And yet they are not symbols without being endowed with it. Information, communication and symbols: it's the trinity of information theory. You can't have one without the others. Well, maybe Mung can...
ET
August 7, 2017 at 07:06 PM PDT

In the case of the genetic code, ie genetic information, the symbols carry the information. And we can measure it, thanks to IT.
ET
August 7, 2017 at 02:21 PM PDT

Trucks can carry cymbals and cymbals can be part of a drum set. Trucks can carry drum sets and you can beat on a truck like a drum set. You may even find a part that can be used as a cymbal. Images of trucks can also be used as symbols. And if that image is on a thin sheet of metal it can be a cymbal symbol. "Information Theory" provided us with a way to quantify information. And that is why Spetner said what he did in the preface. We can now quantify it- genetic information- (thanks to IT), which allows us to see if random mutations add new information and if the proposed process is even capable. Or maybe that was Spetner's inference.
ET
August 7, 2017 at 02:19 PM PDT

Symbols can carry information, and trucks can carry televisions. But symbols are not information and trucks are not televisions. Most people manage to grasp the difference between the medium and the message. "Information Theory" is interested only in the medium.
Mung
August 7, 2017 at 01:25 PM PDT

Bob O'H:
PaV – three comments on Hartley: 1. he assumes the symbols are equiprobable (an assumption Shannon relaxed).
That's because they are equiprobable. He's talking about produced electrical currents of different voltages. They are equiprobable: the telegrapher can use any of the three (or more) he chooses.
2. you can’t get away from entropy, because both entropy and Hartley’s approach are based on a multinomial likelihood, so his approach also has an entropic interpretation (equivalently, entropy has a multinomial interpretation).
Just because I'm using a combinatorial approach doesn't make it "entropic." Shannon's argument, IIRC, has nothing at all to do with entropy. It's simply that the formulas he uses to describe his notion of information resemble the Boltzmann equation from statistical mechanics. Further, entropy has to do with the 'degrees of freedom' of matter. Generally each atom has three. You can't associate 26 degrees of freedom with atoms, as you can with the English language.
3. Hartley’s approach to information is pretty much the same as Shannon’s (for obvious historical reasons). So I don’t see how it helps in this discussion.
Hartley develops his idea of an information measure in a very different way than Shannon does. Shannon's formula involves 'probabilities.' Hartley's does not. It involves the number of symbols/characters that are being used, and the number of selections that are made. They're entirely different. And Hartley's definition came first--that is, it is the most basic and intuitive, the easiest to understand. Think here of Occam's Razor. Shannon developed a 'theory of communication.' He uses the symbol 'H' for the amount of information from a "source." This is the very symbol Hartley uses for the total information 'received' from the 'sender' = source. If you want to use Shannon's definition and theory, then it applies to the 'fidelity' of the transcription process---which is the process by which the information received from DNA is faithfully translated into RNA. Wikipedia: "genetic information" = nucleic acid sequence. Hartley's definition of information directly applies. Why do you make this so difficult? Just found this:
In summary, although it is true that Shannon’s theory is not interested in individual amounts of information, this does not mean that those quantities cannot be defined. In fact, in the first paragraph of his paper, Shannon (1948) explicitly says that his own proposal aims at extending the work of Ralph Hartley (1928), where a logarithmic function is introduced as a measure of uncertainty in the case of equiprobability; the Hartley function can be viewed as measuring an individual entropy. What is Shannon Information.
BTW, in the second part of Hartley's paper, which I simply skimmed through, he deals with 'varying,' not 'fixed,' currents of electricity. And in this case, an averaging is required. At that point, a statistical average is used (entropy). This corresponds to population genetics throughout deep time, not to the transcription and translation that take place in living cells, where these processes are extremely 'hi-fi,' and which rely on the constancy of 'fixed,' individual codons--not on statistical averages. Hence, Hartley's original idea of information--which directly applies not only to DNA, but to computer codes themselves--is not replaced, but extended, by Shannon; and that extension is not needed in analyzing the information content of DNA.
PaV
August 7, 2017 at 10:34 AM PDT

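The relationship being debated here can be checked numerically. Hartley's measure, H = n log2(s) for n selections from an alphabet of s symbols, coincides with Shannon's entropy-based total exactly when the symbols are equiprobable, and falls below it otherwise. A small Python sketch (the DNA strings are made-up examples):

```python
import math
from collections import Counter

def hartley_bits(n_selections, n_symbols):
    # Hartley (1928): H = n * log2(s), n selections from s symbols
    return n_selections * math.log2(n_symbols)

def shannon_bits_per_symbol(seq):
    # Shannon (1948): H = -sum(p_i * log2(p_i)) over observed frequencies
    counts = Counter(seq)
    total = len(seq)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# All four bases equally frequent: the two measures agree exactly.
seq = "ACGTACGTACGTACGT"
print(hartley_bits(len(seq), 4))                # 32.0 bits total
print(shannon_bits_per_symbol(seq) * len(seq))  # 32.0 bits as well

# A skewed sequence: Shannon's total drops below Hartley's capacity.
skewed = "AAAAAAAAAAAAACGT"
print(shannon_bits_per_symbol(skewed) * len(skewed) < hartley_bits(16, 4))  # True
```

This is only the quantitative side of the disagreement; whether the equiprobable assumption fits DNA is exactly what the two commenters dispute.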
PaV - three comments on Hartley: 1. he assumes the symbols are equiprobable (an assumption Shannon relaxed). 2. you can't get away from entropy, because both entropy and Hartley's approach are based on a multinomial likelihood, so his approach also has an entropic interpretation (equivalently, entropy has a multinomial interpretation). 3. Hartley's approach to information is pretty much the same as Shannon's (for obvious historical reasons). So I don't see how it helps in this discussion.
Bob O'H
August 7, 2017 at 06:28 AM PDT

PaV, Yes, I understand. For me that was all put to rest with "Signature in the Cell," in which Meyer defined exactly what he was talking about by 'information'- see my comment in 75 above.
ET
August 7, 2017 at 06:25 AM PDT

Bob O'H:
are you saying that Spetner doesn’t use the concept of information in the article cited in the OP that he uses in Not By Chance?
No. I was just correcting Mung, as you were talking about the article and not his books. Had you read his books you wouldn't be asking what Spetner meant when he talked about 'information'.
ET
August 7, 2017 at 06:23 AM PDT

ET: That's always been the argument here at UD for over ten years: information is not defined properly; or specified complexity is not defined correctly; or fill in the blank. My point in posting Hartley's "quantitative measure of information" is two-fold: to put it in digital form, and to illustrate that the concept is straightforward. Hartley actually addresses the issue of personal communication, which is what information is all about. And his definition doesn't lend itself to 'entropy' arguments, as does Shannon's. Hartley's is not a statistical-mechanics argument. The problem here is this: whenever you speak with someone of the liberal persuasion, you'll find out that to win an argument they're losing, they will at some point ask you: "What do you mean by ................?" You can simply fill in the blank. Remember, everything depends on "what the definition of is is." Some people don't want to know the Truth.
PaV
August 7, 2017 at 04:49 AM PDT

ET - are you saying that Spetner doesn't use the concept of information in the article cited in the OP that he uses in Not By Chance?
Bob O'H
August 7, 2017 at 03:21 AM PDT

And Bob O'H wasn't talking about Not By Chance- he was talking about the article linked to in the OP, which never defined 'information'. And Mung doesn't even know what my claims are, as he doesn't seem to be able to comprehend what he reads. My claims are supported by Spetner himself. Write to him, Mung.
ET
August 6, 2017 at 11:49 PM PDT

LoL! The preface sets the context, and in Not By Chance he makes it clear that it is the actual sequence of symbols of DNA that is the information:
The information in a cell plays a role much like that played by information in a factory. The production file in a factory contains a set of instructions that tell what each worker has to do at each stage. The production file is information carried by printed symbols; the developmental instructions in the cell are information carried by molecular symbols.
The DNA is the starting point of the genetic code.
ET
August 6, 2017 at 11:46 PM PDT

Let's take a look at Spetner's book, The Evolution Revolution.
The information of life is presently thought to reside to a large extent in the molecules of deoxyribonucleic acid (DNA) that are the constituents of the genome (the set of genes and their controls) in every living cell, and is therefore called genetic information.
Doesn't define information. Tells us where it's "thought to reside." Clearly, it's not the actual sequence of symbols of DNA that are the information.
Mung
August 6, 2017 at 06:20 PM PDT

ET:
Bob O’H is just saying that Spetner never clearly defines the word ‘information’.
Bob O'H is correct. Spetner, in Not By Chance, never clearly defines the word ‘information’. ET:
My point is that Spetner is using Shannon’s methodology applied to genetic information...
As pointed out above, Spetner, in his book Not By Chance, only mentions Information Theory or Claude Shannon in the preface, after which they exit, stage left, never to be heard from again. You're the one who referred Bob O'H to Spetner's books (@8) as if he would find answers there. He may indeed find answers in them, but they will be such as to contradict your claims.
Mung
August 6, 2017 at 06:07 PM PDT

PaV, Bob O’H is just saying that Spetner never clearly defines the word ‘information’. My point is that Spetner is using Shannon’s methodology applied to genetic information, ie the information that builds organisms, which includes, but is not limited to, Crick’s definition:
Information means here the precise determination of sequence, either of bases in the nucleic acid or of amino acid residues in the protein.
Each nucleotide would have 2 bits of information-carrying capacity.
ET
August 6, 2017 at 02:51 PM PDT

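The 2-bits-per-nucleotide figure follows directly from the log measure: each position in a DNA sequence selects one of four bases, so its carrying capacity is log2(4) = 2 bits. A quick Python check (the 1000-base sequence length is a made-up example, not from the thread):

```python
import math

BASES = "ACGT"  # the four nucleotide symbols

# Capacity of one position: one selection among the four available symbols.
bits_per_nucleotide = math.log2(len(BASES))

# A hypothetical 1000-base sequence then has 1000 * 2 = 2000 bits of
# carrying capacity, i.e. it can distinguish 4**1000 possible sequences.
sequence_length = 1000
capacity = sequence_length * bits_per_nucleotide

print(bits_per_nucleotide)  # 2.0
print(capacity)             # 2000.0
```

As with Crick's definition quoted above, this measures only the capacity of the sequence, not what any particular sequence specifies.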
ET: You still have yet to say what I am wrong about ... Zzzzz...
Mung
August 6, 2017 at 01:07 PM PDT
