Uncommon Descent Serving The Intelligent Design Community

Darwinism and academic culture: Mathematician Jeffrey Shallit weighs in


You can tell that Darwinism is failing when it attracts completely ridiculous attacks like this one, on Signature in the Cell (Harper One, 2009). The gist is that Nagel thought Meyer’s book a prize.* But Shallit says,

Meyer claims, over and over again, that information can only come from a mind — and that claim is an absolutely essential part of his argument. Nagel, the brilliant philosopher, should see why that is false. Consider making a weather forecast. Meteorologists gather information about the environment to do so: wind speed, direction, temperature, cloud cover, etc. It is only on the basis of this information that they can make predictions. What mind does this information come from?

What mind indeed? If we experience either snow or dull, freezing rain here tomorrow, why should I be surprised? This is the season officially known as winter.

So, maybe Nagel, the brilliant philosopher, knows more than Shallit, the University of Waterloo prof.

I thought Thomas Nagel’s discussion of animal mind, in “What is it like to be a bat?” was the best of its type, in elucidating the difficulties of a materialist explanation of mind. I would commend it to all.

For example, the information that explains how the butterfly emerges from the mess of the pupa, after the caterpillar has done its bit by constantly eating leaves, is vastly more complex than the information that explains why rain falls or why snow blankets the ground. We seek an explanation for metamorphosis, not for why rain or snow falls.

Here is an example:

So how is the trick done inside the “magic box” of the pupa? As one biologist told me, “The entire caterpillar dissolves, and is reconstructed as a butterfly.” The stored energy from the caterpillar’s voracious eating habits creates that? … ridiculous. Let’s hear more explanations, and subject them to tests, based on the life of the universe.

*Signature in the Cell (Harper One, 2009) was literally a prize at Uncommon Descent recently. I hope for more copies soon, for more contests and more prizes.

Comments
Zachriel: So Shannon did "deal with information."
Joseph: Only if one chooses to conflate mere complexity with information.
Shannon's paper makes repeated references to information, so it is clear that he "dealt with information" —unless you are claiming that Shannon didn't understand information.
Joseph: IOW Meyer uses “information” as information technology uses the term.
The global information infrastructure is based on Shannon's Theory of Information.
Joseph: IOW with Shannon meaning and content are not even considered.
Yes, that is correct. But that doesn't salvage your unsupportable statement that Shannon didn't deal with information.

Zachriel
December 28, 2009, 05:58 PM PDT
"The classical theory of information can be compared to the statement that one kilogram of gold has the same value as one kilogram of sand." - Karl Steinbuch (information scientist)

The point is that if you do not deal with content, value, meaning, and substance, then you are not dealing with information. And yes, information can be transmitted as bits, but that doesn't mean that any configuration of transmitted bits is information, i.e., has content and meaning.

Joseph
December 28, 2009, 05:26 PM PDT
"Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent…as regards information" - Warren Weaver (info scientist)

Zachriel: That directly contradicts your position that "As for Shannon he never did deal with information, just mere complexity."

It supports what I said. IOW, with Shannon, meaning and content are not even considered. However, information is all about meaning and content.

Zachriel: So Shannon did "deal with information."

Only if one chooses to conflate mere complexity with information. Again, Meyer goes over this in chapter 4 of "Signature in the Cell". Werner Gitt provides an excellent review of Shannon's work and its relevance in his book "In the Beginning was Information". Shannon was concerned with one aspect: statistics.

Zachriel: It's just the theoretical basis of the digital revolution.

Not the part of his work which equates/conflates random characters with information. However, I get the impression that you don't understand what Shannon was trying to do.

Joseph
December 28, 2009, 05:22 PM PDT
Joseph: What part of the following don’t you understand: “Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent…as regards information” Warren Weaver (info scientist)
That directly contradicts your position that "As for Shannon he never did deal with information, just mere complexity."
In particular, information must not be confused with meaning. In fact, two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information.
So Shannon did "deal with information."
Joseph: That is why many scientists didn’t find his “theory” very useful.
It's just the theoretical basis of the digital revolution.

Zachriel
December 28, 2009, 04:14 PM PDT
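For what it's worth, the point in Weaver's quote can be checked directly. In this minimal sketch (the example strings are mine, chosen so the scrambled version uses exactly the same characters as the sentence), character-level Shannon entropy comes out the same for a meaningful message and for nonsense, because it depends only on symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "the cat sat on the mat"
nonsense = "tac eht no tas tam eht"  # same characters, scrambled into nonsense

# Equal (up to float rounding): the measure is blind to meaning.
print(shannon_entropy(meaningful))
print(shannon_entropy(nonsense))
```

This is exactly the sense in which Shannon's measure sets meaning aside; whether that counts as "dealing with information" is the point being argued in these comments.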
Zachriel, you are confused, and you think your confusion is meaningful discourse. What part of the following don't you understand:

"Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent…as regards information" - Warren Weaver (info scientist)

"The reason for the 'uselessness' of Shannon's theory in the different sciences is frankly that no science can limit itself to its syntactic level" - Ernst von Weizsacker

The point is that Shannon never cared about content. This is well known and not disputed. That is why many scientists didn't find his "theory" very useful.
Zachriel: The statement provided sufficient context to indicate we're talking about replicators subject to natural variation and selection.
No one is talking about replicators subject to natural variation (whatever that means) and selection (whatever that means). Living organisms are far more than replicators, design is natural, and selection can be artificial. So again, what have random variations and natural selection been shown to do? That's right, I nailed it above: they provide a wobbling stability and nothing more.

Joseph
December 28, 2009, 03:43 PM PDT
Collin: What is this evidence you speak of?
Whether looking at phylogenetics or the optimization of proteins or tracing the spread of disease, in vitro or in vivo or in silico, natural or directed, there is a great deal of literature on how incremental changes lead to improved function:

Rowe et al., Analysis of a complete DNA–protein affinity landscape, Journal of the Royal Society 2009.
Lenski et al., Genome evolution and adaptation in a long-term experiment with Escherichia coli, Nature 2009.
Bloom & Arnold, In the light of directed evolution: Pathways of adaptive protein evolution, PNAS 2009.
Hayashi et al., Experimental Rugged Fitness Landscape in Protein Sequence Space, PLoS 2006.
Weinreich et al., Darwinian Evolution Can Follow Only Very Few Mutational Paths to Fitter Proteins, Science 2006.
Collin: For a good discussion of what “specification” means to ID-ers see this:
I am quite familiar with the paper.

Zachriel
December 28, 2009, 03:40 PM PDT
Zachriel, what is this evidence you speak of? Evolutionary algorithms? What is that? Is that evidence? Or is it supposition? For a good discussion of what "specification" means to ID-ers, see this: http://www.designinference.com/documents/2005.06.Specification.pdf

Collin
December 28, 2009, 01:16 PM PDT
Zachriel: As evolutionary processes can optimize functional biochemical systems, increasing their "specificity," that means CSI can increase due to the simple mechanisms of natural variation and selection.

Collin: This is question begging. The assertion is that evolutionary processes cannot increase their specificity.

It's not question begging. It's pointing out that the assertion is contradicted by evidence that evolutionary processes can optimize functions, which means they are increasing the specification of the solution. Typically, this is represented by a peak on a fitness landscape, which evolutionary algorithms are quite adept at climbing.

Zachriel
December 28, 2009, 10:37 AM PDT
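As a toy illustration of the hill-climbing being described (this is not any published model; the bit-string fitness function, population size, and mutation rate are all arbitrary choices of mine), a population under random mutation plus truncation selection reliably climbs toward the peak of a simple one-hill fitness landscape:

```python
import random

GENOME_LEN = 20  # the peak is the all-ones string, with fitness 20

def fitness(genome):
    """Count of 1 bits: height on the single hill of this landscape."""
    return sum(genome)

def evolve(generations=200, pop_size=50, mu=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half survives.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        # Reproduction: copies of random survivors with point mutations.
        offspring = [[bit ^ (rng.random() < mu) for bit in rng.choice(survivors)]
                     for _ in range(pop_size - len(survivors))]
        pop = survivors + offspring
    return max(fitness(g) for g in pop)

print(evolve())  # climbs toward the peak fitness of GENOME_LEN
```

Whether such hill-climbing on a predefined landscape bears on CSI is, of course, the very point in dispute in this thread.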
Zachriel said: "As evolutionary processes can optimize functional biochemical systems, increasing their 'specificity,' that means CSI can increase due to the simple mechanisms of natural variation and selection."

This is question begging. The assertion is that evolutionary processes cannot increase their specificity.

Collin
December 28, 2009, 09:39 AM PDT
Joseph: As for Shannon he never did deal with information, just mere complexity.
Zachriel: That's funny.
Joseph: It happens to be true.

Maybe you're right. Did the Father of Information Theory even mention "information" in his seminal paper, A Mathematical Theory of Communication?

Zachriel: As evolutionary processes can optimize functional biochemical systems, increasing their "specificity," that means CSI can increase due to the simple mechanisms of natural variation and selection.
Joseph: Ya see I keep telling you that evolutionary processes are meaningless as the debate is about BLIND and UNDIRECTED processes.

The statement provided sufficient context to indicate we're talking about replicators subject to natural variation and selection. It has been shown that such processes are quite adept at hill-climbing, i.e., optimization through increased specificity.

Zachriel
December 28, 2009, 09:26 AM PDT
Joseph: As for Shannon he never did deal with information, just mere complexity.
Zachriel: That's funny.

It happens to be true. Meyer discusses this in "Signature in the Cell":

"Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent...as regards information" - Warren Weaver (info scientist)

"The reason for the 'uselessness' of Shannon's theory in the different sciences is frankly that no science can limit itself to its syntactic level" - Ernst von Weizsacker

Joseph: Behe cashes it out in terms of minimal function of biochemical systems.
Zachriel: As evolutionary processes can optimize functional biochemical systems, increasing their "specificity," that means CSI can increase due to the simple mechanisms of natural variation and selection.

That is funny. Ya see, I keep telling you that evolutionary processes are meaningless, as the debate is about BLIND and UNDIRECTED processes. Also, there isn't any scientific data which demonstrates CSI can originate via blind and undirected processes.

Joseph
December 28, 2009, 08:57 AM PDT
Mark Frank, Wm Dembski has had plenty of opportunity to "correct" Meyer. I know the two have worked together on ID. That Dembski hasn't "corrected" Meyer tells me that Meyer is not contradicting Dembski. Also, I have provided a direct quote from Meyer saying he is using information as Dembski is.

Joseph
December 28, 2009, 08:48 AM PDT
jerry @ 29:
“Would someone be kind enough to quote Meyer where he explains what he means by “information”?” DNA and the transcription/translation process.
You seem to be interpreting the question as "What information is Meyer talking about?" rather than "How does Meyer define the term 'information'?". This would explain our miscommunication in a previous thread.

R0b
December 28, 2009, 08:28 AM PDT
#39 R0b - thanks.

#40 Joseph: "just because R0b sez that it contradicts Dembski does not mean it actually does"

No, but assuming R0b's quotes are accurate, then these quotes contradict Dembski.

Mark Frank
December 28, 2009, 08:20 AM PDT
R0b:
Yes and no. Meyer is inconsistent, and he usually gets Dembski’s definition wrong.
Really? Then it is strange that Dembski hasn't corrected him, seeing that they have lectured at the same conferences. Also, just because R0b sez that it contradicts Dembski does not mean it actually does.

Joseph
December 28, 2009, 07:12 AM PDT
Mark Frank:
Is Meyer's definition of information different from Dembski's?
Yes and no. Meyer is inconsistent, and he usually gets Dembski's definition wrong. As you know, the "complexity" part of Dembski's "specified complexity" refers to improbability, but Meyer usually associates it with Kolmogorov complexity. For instance, in chapter 4 he says:

"Complex sequences exhibit an irregular, nonrepeating arrangement that defies expression by a general law or computer algorithm (an algorithm is a set of instructions for accomplishing a specific task or mathematical operation). The opposite of a complex sequence is a highly ordered sequence like ABCABCABCABC, in which the characters or constituents repeat over and over due to some underlying rule, algorithm, or general law."

He sometimes correctly defines "complexity" as improbability, and often he defines it as both improbability and irregularity. In fact, he even equates the two concepts, saying:

"Information scientists typically equate 'complexity' with 'improbability,' whereas they regard repetitive or redundant sequences as highly probable."

You'll note that this contradicts Dembski, who defines "specified complexity" as being descriptively simple and improbable. Dembski's examples are often repetitive sequences or simple patterns, such as a series of coin flips that consists of all heads, or Nicholas Caputo drawing "Democrat" for every election, or the monolith in 2001: A Space Odyssey, or a simple narrowband signal from space.

R0b
December 28, 2009, 07:05 AM PDT
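The distinction drawn here between Kolmogorov-style irregularity and improbability can be illustrated with a rough sketch, using zlib's compressed size as a crude, computable stand-in for Kolmogorov complexity (which is uncomputable in general):

```python
import random
import zlib

# A repetitive, rule-governed sequence versus an irregular one drawn
# from the same three-letter alphabet, both 300 bytes long.
ordered = b"ABC" * 100
rng = random.Random(0)
irregular = bytes(rng.choice(b"ABC") for _ in range(300))

# The ordered string admits a short description; the irregular one does not.
print(len(zlib.compress(ordered)))    # small
print(len(zlib.compress(irregular)))  # substantially larger
```

Note that, as specific outcomes of uniform sampling, the two strings are equally improbable; compressibility and improbability are different measures, which is why conflating them matters.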
Joseph: As for Shannon he never did deal with information, just mere complexity.
That's funny.
Joseph: Behe cashes it out in terms of minimal function of biochemical systems.
As evolutionary processes can optimize functional biochemical systems, increasing their "specificity," that means CSI can increase due to the simple mechanisms of natural variation and selection.

Zachriel
December 28, 2009, 06:22 AM PDT
"Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. In virtue of their function, these systems embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the same sense required by the complexity-specification criterion (see sections 1.3 and 2.5). The specification of organisms can be cashed out in any number of ways. Arno Wouters cashes it out globally in terms of the viability of whole organisms. Michael Behe cashes it out in terms of minimal function of biochemical systems." - Wm. Dembski, page 148 of No Free Lunch

In the preceding and succeeding paragraphs William Dembski makes it clear that biological specification is CSI: complex specified information. In the paper "The origin of biological information and the higher taxonomic categories," Stephen C. Meyer wrote:

"Dembski (2002) has used the term 'complex specified information' (CSI) as a synonym for 'specified complexity' to help distinguish functional biological information from mere Shannon information--that is, specified complexity from mere complexity. This review will use this term as well."

All that said, on page 86 of "Signature in the Cell" Meyer points to Webster's dictionary: "the attribute inherent in and communicated by alternative sequences or arrangements of something that produce specific effects". As for Shannon, he never did deal with information, just mere complexity. IOW, Meyer uses "information" as information technology uses the term.

Joseph
December 28, 2009, 06:13 AM PDT
Re #35: Is Meyer's definition of information different from Dembski's? I think I understand Dembski's fairly well, having read most of his papers.

Mark Frank
December 28, 2009, 02:01 AM PDT
Bruce, the mathematical definition should be fine for me: I'm pretty numerate. I don't mind doing work, but is it possible to get these definitions without buying someone's books?

Heinrich
December 28, 2009, 01:48 AM PDT
To Heinrich, et al.: Meyer spends an entire chapter (Chapter 4) defining what he means by the type of information found in the genomes of living organisms. The definition is specific, rigorous, and mathematical. It follows Dembski, who spent several chapters of several books (The Design Inference, No Free Lunch, and others) explaining and defining complex specified information (CSI). You can't really give an adequate one- or two-sentence definition. If you are really serious about understanding ID, you need to do a little work!

Bruce David
December 27, 2009, 08:18 PM PDT
Indeed, Collin. I'm hoping I don't have to shell out a pile of money just to find and read one or two sentences.

Heinrich
December 27, 2009, 12:42 PM PDT
Heinrich, I'm sure you will be able to trash Meyer's definition once you've read it.

Collin
December 27, 2009, 09:30 AM PDT
StephenB: Would you be kind enough to read post number 7 and take note of the relevant quote already provided.

The quote isn't from Meyer, so I can't be sure that it is the definition he meant. The definition also isn't enough, because there is nothing in it that shows how to quantify it.

Anaxagoras_Rules: Meyers [sic] draws a clear distinction between what he calls 'complex specified information' and 'Shannon information', which is merely complex.

OK, so Meyer doesn't use Shannon information. That cuts it down to about 493 uses. :-)

Heinrich
December 27, 2009, 01:29 AM PDT
I lent out my copy of "Signature in the Cell", so I can't double-check my memory, but Meyers is downright voluble about what he considers information. Have the critics here read his book? Meyers draws a clear distinction between what he calls 'complex specified information' and 'Shannon information', which is merely complex. ('Shannon' is the term I'm unable to double-check.) Anyway, he clearly draws the distinction between the two types of information. I'm assuming the critics here are intelligent; therefore I have to conclude that they haven't read the book, which is not too smart when debating something asserted in it.

Anaxagoras_Rules
December 26, 2009, 05:19 PM PDT
Heinrich: "Would someone be kind enough to quote Meyer where he explains what he means by 'information'?"

Would you be kind enough to read post number 7 and take note of the relevant quote already provided. That way you can move on to your next objection, which will consist of a series of comments indicating your dissatisfaction with the definition, followed by the earth-shaking observation that the term has more than one meaning and that not everyone applies it the same way, or in the same context, or from the same vantage point.

StephenB
December 26, 2009, 01:39 PM PDT
"Would someone be kind enough to quote Meyer where he explains what he means by “information”?" DNA and the transcription/translation process.jerry
December 26, 2009
December
12
Dec
26
26
2009
01:34 PM
1
01
34
PM
PDT
"Until you can show how the type of information Meyer refers to arises in nature, then maybe you should refrain from criticizing Meyer or ID or defending Shallit's specious, non-relevant comment."

As I wrote, I don't have Meyer's book. Until someone explains what exactly Meyer means by "information", it's difficult to progress further. I've tried to explain how Shallit's criticism makes sense, i.e., that it is talking about physical processes that affect the information that is measured, but I don't know how that aligns with how Meyer uses "information" in his book. Would someone be kind enough to quote Meyer where he explains what he means by "information"?

Heinrich
December 26, 2009, 12:59 PM PDT
From the below post: http://sonofneocles.blogspot.com/2009/12/tom-nagel-apostate.html

The weather forecasting analogy is not very good. The forecaster is an intelligent agent. He gathers data, from which he generates a prediction. The data is generated by the interaction of natural phenomena with various intelligently designed instruments that convert the interactions into information. This then is the data used (be it magnetic patches on a hard drive, or squiggly lines on a barometer). This first-level data, together with the methodology used to generate the prediction (statistical modeling, etc.), generates the prediction, a second level of data or information.

But, to be clear here, the data fed into the model is the result of the interaction of intelligent agents with the natural phenomena, via tools they have created for such purposes (and their brains, of course). Only after all that has gone on does the same person, or different people, feed this data or information into a machine or brain that has 'installed' within it a higher-level predictive method and/or tool, which then generates some further wider-scope information that can in turn be used by other intelligent agents, because they can read and understand it. The initial phenomena, while it did not come from a mind, nevertheless can be called 'information' only in an extended metaphorical sense. It is not really information at all (but some physical events like air movement) until experienced and interpreted as such by brains and intelligently designed measuring devices which are related to each other in just the right way.

But, aside from that fallacy, notice the invective: "It's sad to see such an eminent philosopher (Nagel) make a fool of himself with this recommendation." And in this comment a bit down the page on said blog: "Nagel has become a disgrace. He was a philosopher who made some significant contributions, but in areas far afield of this one. A small irony: the other book he chooses to recommend is by a colleague and friend with whom he co-teaches. High standards of integrity here!"

Wow, ad hominem with a vengeance. He's recommending books by friends. The horror! Academic quasi-nepotism in action.

shaun
December 26, 2009, 12:28 PM PDT
"Science was never intended to make logic it’s master, and it never could." Strike that. Science was never intended to surpass logic, and it never could. also "It can never be determined outright, if every instance of gravity isn’t being controlled by god, or if no god was never involved." I meant "ever involved." Anyways, one point I ran across from a Dembski article one time is that randomness can't be defined. it was a while ago and I think that's what he was saying. Either way this is what I got out of it. A lottery is predetermined to only give apparent randomness but the chain leads back to order from a persons mind. The paint on the balls and the timing of the lady pulling the lever isn't a confirmation of randomness. You might as well disprove destiny. A random number generated computer program relies first on the inventor's arbitrary concept of order installed into it. But darwinist attack ID saying it must define information, which is represented here as order. Darwinist must at the same time define chaos, but don't. Both are on equal ground, until you get down to debating rocks to rockstars which is cut and dry, there's no path. On the highest level though, of defining order and chaos themselves, from which comes information, it's equal ground. The basis of neodarwinism, of which origins is included except during embarrassing setbacks, is that pure chaos can exist, when it looks like it can't. Unless you want to say laws can start up from the motion and bouncing off of each other. The bouncing itself is a law.lamarck
December 26, 2009
December
12
Dec
26
26
2009
12:16 PM
12
12
16
PM
PDT
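The point about computer "randomness" being installed by the programmer can be checked directly: a seeded pseudo-random generator is fully deterministic, so its output is fixed in advance by the algorithm and seed its designer chose. A minimal sketch:

```python
import random

# Two generators given the same seed produce the same "random" sequence.
a = random.Random(42)
b = random.Random(42)

seq_a = [a.randint(0, 9) for _ in range(10)]
seq_b = [b.randint(0, 9) for _ in range(10)]
print(seq_a == seq_b)  # True: the sequence was predetermined by the seed
```

Such generators give only apparent randomness, which is the distinction lamarck is drawing between a pseudo-random process and chaos as such.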