Uncommon Descent Serving The Intelligent Design Community

Darwinism and academic culture: Mathematician Jeffrey Shallit weighs in


You can tell that Darwinism is failing when it attracts completely ridiculous attacks like this one, on Signature in the Cell (Harper One, 2009). The gist is that Nagel thought Meyer’s book a prize.* But Shallit says,

Meyer claims, over and over again, that information can only come from a mind — and that claim is an absolutely essential part of his argument. Nagel, the brilliant philosopher, should see why that is false. Consider making a weather forecast. Meteorologists gather information about the environment to do so: wind speed, direction, temperature, cloud cover, etc. It is only on the basis of this information that they can make predictions. What mind does this information come from?

What mind indeed? If we experience either snow or dull, freezing rain here tomorrow, why should we be surprised? This is the season officially known as winter.

So, maybe Nagel, the brilliant philosopher, knows more than Shallit, the University of Waterloo prof.

I thought Thomas Nagel's discussion of the animal mind in "What Is It Like to Be a Bat?" was the best of its type in elucidating the difficulties of a materialist explanation of mind. I would commend it to all.

For example, the information that explains how the butterfly emerges from the mess of the pupa, after the caterpillar has done its bit by constantly eating leaves, is vastly more complex than the information that explains why rain falls or snow blankets the ground. We seek an explanation for metamorphosis, not for why rain or snow falls.

Here is an example:

So how is the trick done inside the “magic box” of the pupa? As one biologist told me, “The entire caterpillar dissolves, and is reconstructed as a butterfly.” The stored energy from the caterpillar’s voracious eating habits creates that? … ridiculous. Let’s hear more explanations, and subject them to tests, based on the life of the universe.

*Signature in the Cell (Harper One, 2009) was literally a prize at Uncommon Descent recently. I hope for more copies soon, for more contests and more prizes.

Comments
Mrs O'Leary: "The stored energy from the caterpillar's voracious eating habits creates that? … ridiculous." Are you truly incredulous about the energy required? What other source of energy is there? Or are you amazed that one set of instructions can build two different body plans when starting from different inputs?
Nakashima
January 2, 2010, 01:52 AM PDT
Mark Frank: The article could hardly be clearer. The only use of “information” that all the biology departments in the universe mean is Shannon information. All the many other uses are contentious.
Yes, that's what the article suggests.
Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified.
So they *argue* the point.
This very narrow understanding of the informational properties of genes
At that, it's a narrow understanding.
It does not vindicate the idea that genes code for whole-organism phenotypes, let alone provide a basis for the wholesale use of informational or semantic language in biology.
Even then, it has limited applicability. Comparing to your statement,
jerry: We mean {by information} the same thing that all the biology departments in the universe mean
Certainly that resource supports Mark Frank's view and contradicts jerry's. Many biologists probably do use the term information informally, but most of the time they are referring to data.
Zachriel
January 1, 2010, 05:37 AM PDT
#82 Jerry: You were arguing that this sense is in common use by biology departments. Godfrey-Smith writes: "Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified." Nowhere does he imply that this sense is in common use, and this sentence would seem to imply the reverse.
Mark Frank
December 31, 2009, 11:25 PM PDT
Let me repeat what I wrote on the other thread. This is in response to Mark Frank's absurd claim, which was: "The article could hardly be clearer. The only use of 'information' that all the biology departments in the universe mean is Shannon information. All the many other uses are contentious."

I then said: "I do not believe what I just read. I suggest all interested go to the Stanford article and go to the section on the genetic code. It says:

'Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified. This is the idea that genes "code for" the amino acid sequence of protein molecules, in virtue of the peculiar features of the "transcription and translation" mechanisms found within cells. Genes specify amino acid sequence via a templating process that involves a regular mapping rule between two quite different kinds of molecules (nucleic acid bases and amino acids). This mapping rule is combinatorial, and apparently arbitrary (in a sense that is hard to make precise). This very narrow understanding of the informational properties of genes is basically in accordance with the influential early proposal of Francis Crick (1958). The argument is that these low-level mechanistic features make gene expression into a causal process that has significant analogies to paradigmatic symbolic phenomena.

Some have argued that this analogy becomes questionable once we move from the genetics of simple prokaryotic organisms (bacteria) to those in eukaryotic cells. This has been a theme of Sarkar's work (1996). Mainstream biology tends to regard the complications that arise in the case of eukaryotes as mere details that do not compromise the basic picture we have of how gene expression works. An example is the editing and "splicing" of mRNA transcripts. The initial stage in gene expression is the use of DNA in a template process to construct an intermediate molecule, mRNA or "messenger RNA," that is then used as a template in the manufacture of a protein. The protein is made by stringing a number of amino acid molecules together. In organisms other than bacteria, the mRNA is often extensively modified ("edited") prior to its use. This process makes eukaryotic DNA a much less straightforward predictor of the protein's amino acid sequence than it is in bacteria, but it can be argued that this does not much affect the crucial features of gene expression mechanisms that motivate the introduction of a symbolic or semantic mode of description.

So the argument in Godfrey-Smith (2000a) and Griffiths (2001) is that there is one kind of informational or semantic property that genes and only genes have: coding for the amino acid sequences of protein molecules. But this relation "reaches" only as far as the amino acid sequence. It does not vindicate the idea that genes code for whole-organism phenotypes, let alone provide a basis for the wholesale use of informational or semantic language in biology. Genes can have a reliable causal role in the production of a whole-organism phenotype, of course. But if this causal relation is to be described in informational terms, then it is a matter of ordinary Shannon information, which applies to environmental factors as well.'

Then ask yourself how Mark Frank could write his comment with a straight face. But it is what we expect around here from anti-ID people. Did he really think we would not read on?"

This whole discussion is getting weird: denying the obvious, claiming things that are not true under obvious examination. It is nothing new with the anti-ID crowd, but it does waste time trying to deal with their crap. The link for the Stanford article is in #67.
jerry
December 31, 2009, 12:30 PM PDT
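The "regular mapping rule" between nucleotide triplets and amino acids that the Stanford article describes can be made concrete with a short sketch. This is a minimal illustration using a small, hypothetical fragment of the standard codon table; it is not taken from the article itself:

```python
# A fragment of the standard genetic code: an arbitrary-looking but
# regular mapping rule from nucleotide triplets (codons) to amino acids.
CODON_TABLE = {
    "ATG": "Met", "TTT": "Phe", "TTC": "Phe",
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Translate a DNA coding sequence into amino acids, stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "???")
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

print(translate("ATGTTTGGTTAA"))  # ['Met', 'Phe', 'Gly']
```

Note that the mapping "reaches" only as far as the amino acid sequence, which is exactly the limit the article places on the coding metaphor.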
#78 Jerry: Re my slithering away: I was put into moderation and none of my comments are being published; possibly this one will get through!
Mark Frank
December 31, 2009, 11:43 AM PDT
#79 - Are you disputing that the sequences contained within DNA contain encoded instructions for the development and functioning of living things? If not, then can you see the difference between that and a sequence made from the same elements, but which is random and describes nothing? I have no reason to doubt that you can. If DNA contains such descriptive and functional information, then who really cares in what other sense biologists may or may not use the word "information"? It's downright Orwellian to try to reshape reality by managing the words we use to describe it. Perhaps the problem can be solved by eliminating the offending word, information. A caterpillar's DNA contains the encoded instructions for all of its components at the cellular and anatomical level, as well as the complex behaviors that enable it to live. What word do you prefer to describe what is contained in its DNA? If information is synonymous with meaningless bits of random data, then I suppose information is a poor word. What would be a better one?
ScottAndrews
December 31, 2009, 11:29 AM PDT
jerry at 78, “There is no “common one” other than Shannon information that is used in biology. Mark Frank pointed that out to you just a few days ago here and here.” And Mark Frank was wrong as he slithered away after it was pointed out to him. So I suggest you too read the article on biology and information. Another reason why no one should take you seriously as you failed to point out the continuation of the comments. By all means let's review the rest of the comments. I note that a simple way for you to have refuted my statement would have been to provide references of your own, but 'tis the season to be generous so I'll look for you. Mark Frank wrote here that: Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified. This is clearly not something that is accepted by all the biology departments in the Universe. It is a contentious definition which they are arguing for. If this is the sense of information that you mean – then fine – just confirm that you mean information in the sense defined in Godfrey-Smith (2000a) and Griffiths (2001). But don’t pretend it is obvious or universally accepted within biology. Here you left a mildly amusing Casablanca reference, but certainly didn't address Mr. Frank's argument. Mark Frank requested clarification again here. Here you failed again to respond, merely repeating "The one the biology departments across the universe are using." after Mr. Frank used your own sources to show that no such single definition is being used. Mark Frank realizes the futility of his quest here and stops asking. At no point further in that thread do you attempt to address his argument. So, not only is your claim to have proven Mark Frank wrong not supported by the available evidence, your characterization of him having "slithered" away is also not in accordance with the reality of the situation. 
I would personally be very interested to read your statement of what you believe to be the consensus definition of "information" among biologists. My primary goal in delurking here is to understand CSI well enough to be able to implement software that measures it. Understanding what ID proponents mean by "information" would help me achieve that goal.
Mustela Nivalis
December 31, 2009, 10:56 AM PDT
"There is no 'common one' other than Shannon information that is used in biology. Mark Frank pointed that out to you just a few days ago here and here." And Mark Frank was wrong, as he slithered away after it was pointed out to him. So I suggest you too read the article on biology and information. Another reason why no one should take you seriously is that you failed to point out the continuation of the comments. How many times does it have to be pointed out to all the know-nothings here that the term "information" has many uses? And the one most used in biology is not Shannon information, but apparently Shannon information techniques can be applied to it to estimate its complexity.
jerry
December 31, 2009, 10:05 AM PDT
Joseph, can you give me a link to an essay of Meyer's where he gives the same definition as he uses in his book?
Heinrich
December 31, 2009, 08:04 AM PDT
Heinrich:
Joseph – I’m specifically asking about how ID (and in particular Meyer) quantifies information.
Then I suggest that you get busy reading his books and essays.
Until this is clear, I think it’s difficult to be sure whether Shallit was talking sense or not.
To you.
Joseph
December 31, 2009, 06:59 AM PDT
Joseph - I'm specifically asking about how ID (and in particular Meyer) quantifies information. Until this is clear, I think it's difficult to be sure whether Shallit was talking sense or not.
Heinrich
December 31, 2009, 01:53 AM PDT
Mustela Nivalis (the weasel): If biology uses Shannon information, then it is no wonder biologists can't tell us very much beyond basic operation. As I and many others have said, Shannon didn't care about function, nor meaning. You cannot put meaning into mathematical form, so Shannon did not try to. What he did was formulate a way to calculate information-carrying capacity.
Joseph
December 30, 2009, 03:18 PM PDT
Heinrich, how does one quantify anything in the theory of evolution? The point is that you really need to focus on your position, because it is the utter failure of your position to find any supporting data that has allowed ID to remain "in play".
Joseph
December 30, 2009, 01:50 PM PDT
Something else for Zachriel to ignore: All Shannon did was demonstrate information-carrying capacity. IOW, Shannon's theory cannot distinguish functional or message-bearing sequences from random and useless noise.
Joseph
December 30, 2009, 01:48 PM PDT
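Joseph's point that Shannon's measure ignores meaning can be illustrated with a toy calculation (a sketch written for this thread, not from any commenter): a meaningful sentence and a random scramble of the same characters have the same symbol frequencies, and therefore the same empirical Shannon entropy.

```python
import math
import random
from collections import Counter

def entropy_per_symbol(s):
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "the quick brown fox jumps over the lazy dog"
random.seed(0)
scrambled = "".join(random.sample(meaningful, len(meaningful)))

# Same symbol frequencies, so the entropies agree to within rounding --
# yet only one of the two strings means anything to a reader.
print(abs(entropy_per_symbol(meaningful) - entropy_per_symbol(scrambled)) < 1e-9)  # True
```

The measure sees only symbol statistics; whatever separates the sentence from the scramble, it is not captured by this number.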
For Zachriel to ignore-
"The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular information must not be confused with meaning."--Warren Weaver (worked with Shannon)
Joseph
December 30, 2009, 01:45 PM PDT
Jerry at 66: "Information has all sorts of definitions and a common one is the one that is used in biology. You are a big fan of Godfrey-Smith and even talks about this in explicit detail." There is no "common one" other than Shannon information that is used in biology. Mark Frank pointed that out to you just a few days ago here and here.
Mustela Nivalis
December 30, 2009, 10:36 AM PDT
Thanks for the reference, Jerry; not sure I will have time to read it all, but if you look at my first post, I'm not totally on board with classical information theory applied to biology anyway, because the proponents are trying to prove intelligence behind a message. Too slippery an approach in my view, because in the Shannon sense, a human at one end and a human at the other is the definition of information. One end may be a measuring/sensing machine designed by a human. You can read a book, coming from an author by definition, or transmit it over a communication channel with no errors so someone else can read it, provided enough attention has been paid to the statistical requirements for error-free (actually the probability of error-free) transmission in the inevitable presence of noise, thanks to Shannon. But this has nothing to do with what a sentence means, or how that meaning may depend on context, analogs that may be absolutely relevant to biology. When it comes down to it, I'm much more enthusiastic about arguments from complexity, possibly with relevant probabilistic calculations, a la Behe in Edge.
groovamos
December 30, 2009, 10:10 AM PDT
jerry: Would you say that the information in a computer program or an English paragraph is information in the Shannon sense? From what little I understand, it isn’t.
He says, as he transmits data (as Shannon Information) across the Internet. Shannon Information is a mathematical model. The correct question is whether an English paragraph can be analyzed as Shannon Information. The answer is clearly yes.
Zachriel
December 30, 2009, 09:30 AM PDT
"Of course they are in the Shannon sense, because information is anything meaningful that can be represented as sequences of symbols." I suggest you read the article in the Stanford Encyclopedia of Philosophy on biological information: http://plato.stanford.edu/entries/information-biological/ It is and isn't Shannon information. Read my comment at #13. In one sense it is Shannon, but in its crucial use, it isn't. I am not an expert, but will take Godfrey-Smith's assessment, who is supposedly an expert on this. The article is long but covers the various uses people have made of information within biology.
jerry
December 30, 2009, 09:14 AM PDT
R0b, I suggest you read my comment at #13. Information has all sorts of definitions, and a common one is the one that is used in biology. You are a big fan of Godfrey-Smith, and he even talks about this in explicit detail.
jerry
December 30, 2009, 09:09 AM PDT
Jerry says: "Would you say that the information in a computer program or an English paragraph is information in the Shannon sense? From what little I understand, it isn't." Of course they are in the Shannon sense, because information is anything meaningful that can be represented as sequences of symbols. What Shannon did was to completely describe mathematically the situation whereby a message from one source is corrupted by an uncorrelated source of interference. The two sources being uncorrelated means each looks like random noise to the other. A straightforward illustration is when deep-space communications are subject to interference from noise sources such as atmospheric discharges, solar-induced magnetic disturbances, etc., causing errors in the information transmission. But suppose we want to do some weird science and analyze the induced errors in this scenario. We could do an experiment whereby the error rate and the error occurrence in time were studied. We would see such a study as meaningful because we are now studying the noise sources; and the equipment, designed by minds, provides the information. We could command the spacecraft to send a repetitive symbol indefinitely, thus reducing the information content from the craft to zero, but the received information content would not be zero, as we would, as a result of the noise sources' interference, be receiving random symbol sequences. We could surmise useful information about the noise sources from this. Thus the key is usefulness in our mind. Weird science, as there are usually better ways to study noise.
groovamos
December 30, 2009, 08:18 AM PDT
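groovamos's spacecraft scenario can be sketched numerically. The model below is a toy binary symmetric channel with assumed parameters (100,000 bits, a 10% flip probability): the source sends a constant symbol, so its entropy is zero, yet the received stream has nonzero entropy contributed entirely by the noise.

```python
import math
import random

def bsc(bits, p_flip, rng):
    """Binary symmetric channel: flip each bit independently with probability p_flip."""
    return [b ^ (rng.random() < p_flip) for b in bits]

def entropy(bits):
    """Empirical entropy of a binary sequence, in bits per symbol."""
    p1 = sum(bits) / len(bits)
    if p1 in (0.0, 1.0):
        return 0.0
    return -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))

rng = random.Random(42)
sent = [0] * 100_000            # a constant symbol: zero entropy at the source
received = bsc(sent, 0.1, rng)  # the channel noise injects entropy

print(entropy(sent))                   # 0.0
print(0.4 < entropy(received) < 0.55)  # True: near H(0.1), about 0.47 bits
```

Studying `received` tells us about the noise, not the spacecraft, which is exactly the "weird science" the comment describes.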
"Weather data are gathered as a time sequences of values, and the various alternative sequences produce specific effects. Why then do weather data not count as information according to this definition?" Various alternative sequences of weather do produce specific effects. Humidity increases the chance of rain, etc., etc. Those are physical properties and events, not information. (Otherwise we would have to redefine "information" as "stuff that exists.") The weather never, ever produces an abstract, encoded description of itself. When it rains, the ground gets wet. The rain does not enter a record of itself into a meteorological log. One is cause and effect; the cause can often be determined from the effect. The other is an abstract definition of the cause and effect, which is never directly produced by the weather. I think some of the comments overcomplicate this. Maybe I'm wrong. But to boil it down, natural, unintelligent processes are not known for using abstract languages to create descriptions of reality that are separate from that reality. Why would they? And even if such a thing could result from millions of years of unintelligent evolution, how does a language more complicated than that used by any living thing appear at the very beginning, before anything had evolved? Surely you can understand why some might easily dismiss the idea.
ScottAndrews
December 30, 2009, 08:09 AM PDT
jerry, as much as I enjoy your effusive flattery, please don't lay it on too thick in public. My point was about the Webster definition quoted by sagebrush gardener, which says nothing about machinery, "pointing" (how is that different from cause and effect?), intermediary mechanisms, number of subparts, or particular processes. If certain time sequences produce predictable effects, why do they not count as information according to the Webster definition?
R0b
December 30, 2009, 08:03 AM PDT
"as a time sequences" --> "as time sequences"
R0b
December 30, 2009, 07:15 AM PDT
"Weather data are gathered as a time sequences of values, and the various alternative sequences produce specific effects. Why then do weather data not count as information according to this definition?" For someone who admits that he does not know what this is all about, you keep showing up with comments that do not make sense. The data in DNA point to a specific independent entity, and they do so through a machinery set up to provide this relationship, just as computer programming and language do. The data in weather do not point to anything specific. That is not to say that patterns of weather elements cannot cause other phenomena, but they are not such that they cause the same specific thing each time. There is no intermediary mechanism of parts that leads from weather pattern A to weather pattern B as there is in the DNA transcription/translation process, through a mechanism consisting of about a thousand sub-parts. You should go back to Amazon and put up another review which admits you do not understand the process, or that you willfully distort it. You already have admitted you are ignorant of this process, and yet you pontificate on Amazon and here and who knows where else. You say it is not malicious, but one has to doubt that based on this persistent obstructive behavior. The sad thing is that your ignorance gathers helpful comments from readers. So it is easy to see how a self-proclaimed know-nothing can distort a system.
jerry
December 30, 2009, 07:14 AM PDT
sagebrush gardener @ 7:
The second meaning — the one intended by Meyer — is “the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects”.
Weather data are gathered as a time sequences of values, and the various alternative sequences produce specific effects. Why then do weather data not count as information according to this definition?
R0b
December 30, 2009, 06:31 AM PDT
(sorry, forgot about this)
All that said, on page 86 of "Signature…" Meyer points to Webster's dictionary: "the attribute inherent in and communicated by alternative sequences or arrangements of something that produce specific effects".
How does one quantify this?
Heinrich
December 30, 2009, 05:24 AM PDT
I read some of Shallit's comments... are there any Darwinists who are not arrogant a....... Anyway, this paper is rather interesting...
The fundamental contention inherent in our three subsets of sequence complexity proposed in this paper is this: without volitional agency assigning meaning to each configurable-switch-position symbol, algorithmic function and language will not occur. The same would be true in assigning meaning to each combinatorial syntax segment (programming module or word). Source and destination on either end of the channel must agree to these assigned meanings in a shared operational context. Chance and necessity cannot establish such a cybernetic coding/decoding scheme. Self-ordering phenomena are observed daily in accord with chaos theory. But under no known circumstances can self-ordering phenomena like hurricanes, sand piles, crystallization, or fractals produce algorithmic organization. Algorithmic "self-organization" has never been observed [70] despite numerous publications that have misused the term [21,151-162]. Bona fide organization always arises from choice contingency, not chance contingency or necessity.
link
tsmith
December 29, 2009, 09:44 AM PDT
Collin: I take issue though with how you use the word specificity. Does specificity mean improved function?
To calculate specificity, we define our function at a certain level, and determine the number of sequences that could provide this level of function. The fewer such sequences, the more specific the sequence. We can visualize this as a fitness landscape, often a sharp local peak and a broad foothill region. As we can show that protein evolution is adept at hill-climbing, we can see that the protein becomes better matched to the function. Specificity increases, often by many orders of magnitude.
Collin: Does specificity mean functional specified complex information?
Assuming a constant sequence length, if the specificity increases, so does the specified complexity: X = –log2[ BIGNUM · S(T) · P(T|H) ]. (S(T) is Dembski's specificity. Dembski defines it in terms of a semiotic agent, or more recently by K-complexity; the shorter the description, the more specific. H is the chance hypothesis, which is rather vaguely defined, but is usually taken as a uniform random distribution.)
Zachriel
December 29, 2009, 07:20 AM PDT
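Zachriel's formula can be turned into arithmetic with toy numbers. Everything below is hypothetical: BIGNUM is taken as the 10^120 replicational-resources figure Dembski uses, S(T) is simply assumed to be 100, and P(T|H) is a uniform chance hypothesis over a single 250-symbol sequence on a 4-letter alphabet. None of this computes a real S(T); it only shows how the terms combine.

```python
import math

def specified_complexity(resources, s_t, p_t_given_h):
    """X = -log2(resources * S(T) * P(T|H)).
    A positive X means the target stays improbable even after
    multiplying in the descriptive and replicational resources."""
    return -math.log2(resources * s_t * p_t_given_h)

R = 10 ** 120    # replicational resources (BIGNUM, assumed to be Dembski's 10^120)
S_T = 100        # specificity / description count -- a made-up toy value
P = 4.0 ** -250  # uniform chance of one 250-symbol sequence, 4-letter alphabet

X = specified_complexity(R, S_T, P)
print(round(X, 1))  # 94.7
```

Shorten the sequence to 150 symbols and X goes negative: the 10^120 factor swamps the improbability, which is why the sequence length matters so much in these calculations.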
Zachriel, good answer. I'll have to review those studies. I take issue, though, with how you use the word specificity. Does specificity mean improved function? Is that how Meyer and Dembski are using the term? Does specificity mean functional specified complex information?
Collin
December 28, 2009, 08:18 PM PDT
