Uncommon Descent Serving The Intelligent Design Community

Darwinism and academic culture: Mathematician Jeffrey Shallit weighs in


You can tell that Darwinism is failing when it attracts completely ridiculous attacks like this one, on Signature in the Cell (Harper One, 2009). The gist is that Nagel thought Meyer’s book a prize.* But Shallit says,

Meyer claims, over and over again, that information can only come from a mind — and that claim is an absolutely essential part of his argument. Nagel, the brilliant philosopher, should see why that is false. Consider making a weather forecast. Meteorologists gather information about the environment to do so: wind speed, direction, temperature, cloud cover, etc. It is only on the basis of this information that they can make predictions. What mind does this information come from?

What mind indeed? If we experience either snow or dull, freezing rain here tomorrow, why should I be surprised? This is the season officially known as winter.

So, maybe Nagel, the brilliant philosopher, knows more than Shallit, the University of Waterloo prof.

I thought Thomas Nagel’s discussion of animal mind in “What Is It Like to Be a Bat?” was the best of its type in elucidating the difficulties of a materialist explanation of mind. I would commend it to all.

For example, the information that explains how the butterfly emerges from the mess of the pupa, after the caterpillar has done its bit by constantly eating leaves, is vastly more complex than the information that explains why rain falls or snow blankets the ground. We seek an explanation for metamorphosis, not for why rain or snow falls.

Here is an example:

So how is the trick done inside the “magic box” of the pupa? As one biologist told me, “The entire caterpillar dissolves, and is reconstructed as a butterfly.” The stored energy from the caterpillar’s voracious eating habits creates that? … ridiculous. Let’s hear more explanations, and subject them to tests, based on the life of the universe.

*Signature in the Cell (Harper One, 2009) was literally a prize at Uncommon Descent recently. I hope for more copies soon, for more contests and more prizes.

Comments
Mrs O'Leary, The stored energy from the caterpillar’s voracious eating habits creates that? … ridiculous. Are you truly incredulous about the energy required? What other source of energy is there? Or are you amazed that one set of instructions can build two different body plans when starting from different inputs? Nakashima
Mark Frank: The article could hardly be clearer. The only use of “information” that all the biology departments in the universe mean is Shannon information. All the many other uses are contentious.
Yes, that's what the article suggests.
Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified.
So they *argue* the point.
This very narrow understanding of the informational properties of genes
At that, it's a narrow understanding.
It does not vindicate the idea that genes code for whole-organism phenotypes, let alone provide a basis for the wholesale use of informational or semantic language in biology.
Even then, it has limited applicability. Comparing to your statement,
jerry: We mean {by information} the same thing that all the biology departments in the universe mean
Certainly that resource supports Mark Frank's view and contradicts jerry's. Many biologists probably do use the term information informally, but most of the time they are referring to data. Zachriel
#82 Jerry You were arguing that this sense is in common use by biology departments. Godfrey-Smith writes: "Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified." Nowhere does he imply that this sense is in common use, and this sentence would seem to imply the reverse. Mark Frank
Let me repeat what I wrote on the other thread. This is in response to Mark Frank's absurd claim which was: "The article could hardly be clearer. The only use of “information” that all the biology departments in the universe mean is Shannon information. All the many other uses are contentious."

I then said: "I do not believe what I just read. I suggest all interested go to the Stanford article and go to the section on the genetic code. It says 'Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified. This is the idea that genes “code for” the amino acid sequence of protein molecules, in virtue of the peculiar features of the “transcription and translation” mechanisms found within cells. Genes specify amino acid sequence via a templating process that involves a regular mapping rule between two quite different kinds of molecules (nucleic acid bases and amino acids). This mapping rule is combinatorial, and apparently arbitrary (in a sense that is hard to make precise). This very narrow understanding of the informational properties of genes is basically in accordance with the influential early proposal of Francis Crick (1958). The argument is that these low-level mechanistic features make gene expression into a causal process that has significant analogies to paradigmatic symbolic phenomena. Some have argued that this analogy becomes questionable once we move from the genetics of simple prokaryotic organisms (bacteria), to those in eukaryotic cells. This has been a theme of Sarkar’s work (1996). Mainstream biology tends to regard the complications that arise in the case of eukaryotes as mere details that do not compromise the basic picture we have of how gene expression works. An example is the editing and “splicing” of mRNA transcripts. 
The initial stage in gene expression is the use of DNA in a template process to construct an intermediate molecule, mRNA or “messenger RNA,” that is then used as a template in the manufacture of a protein. The protein is made by stringing a number of amino acid molecules together. In organisms other than bacteria, the mRNA is often extensively modified (“edited”) prior to its use. This process makes eukaryotic DNA a much less straightforward predictor of the protein’s amino acid sequence than it is in bacteria, but it can be argued that this does not much affect the crucial features of gene expression mechanisms that motivate the introduction of a symbolic or semantic mode of description. So the argument in Godfrey-Smith (2000a) and Griffiths (2001) is that there is one kind of informational or semantic property that genes and only genes have: coding for the amino acid sequences of protein molecules. But this relation “reaches” only as far as the amino acid sequence. It does not vindicate the idea that genes code for whole-organism phenotypes, let alone provide a basis for the wholesale use of informational or semantic language in biology. Genes can have a reliable causal role in the production of a whole-organism phenotype, of course. But if this causal relation is to be described in informational terms, then it is a matter of ordinary Shannon information, which applies to environmental factors as well.' Then ask yourself how Mark Frank could write his comment with a straight face. But it is what we expect around here from anti ID people. Did he really think we would not read on?" This whole discussion is getting weird. Denying the obvious, claiming things that are not true under obvious examination. It is nothing new with the anti ID crowd but it does waste time trying to deal with their crap. The link for the Stanford article is in #67. jerry
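The "regular mapping rule between two quite different kinds of molecules" that the quoted passage describes can be sketched as a lookup table. A minimal illustration (only a hand-picked handful of the 64 codons is included, not the complete standard table):

```python
# Illustrative sketch: the codon -> amino acid mapping rule modeled as a table.
# Only a few of the 64 codons are listed; assignments follow the standard code.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "UUC": "Phe",
    "GGC": "Gly", "UAA": "STOP",
}

def translate(mrna):
    """Read an mRNA string three bases at a time, stopping at a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3])
        if residue is None or residue == "STOP":
            break
        protein.append(residue)
    return protein

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

The arbitrariness the article mentions shows up here as the fact that nothing in the code's logic forces any particular pairing; the table could be permuted and `translate` would work the same way.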
#78 Jerry Re me slithering away. I was put into moderation and none of my comments are being published - possibly this one will get through! Mark Frank
#79 - Are you disputing that the sequences contained within DNA contain encoded instructions for the development and functioning of living things? If not, then can you see the difference between that and a sequence made from the same elements, but which is random and describes nothing? I have no reason to doubt that you can. If DNA contains such descriptive and functional information, then who really cares in what other sense biologists may or may not use the word "information?" It's downright Orwellian to try to reshape reality by managing the words we use to describe it. Perhaps the problem can be solved by eliminating the offending word, information. A caterpillar's DNA contains the encoded instructions for all of its components at the cellular and anatomical level, as well as the complex behaviors that enable it to live. What word do you prefer to describe what is contained in its DNA? If information is synonymous with meaningless bits of random data, then I suppose information is a poor word. What would be a better one? ScottAndrews
jerry at 78, “There is no “common one” other than Shannon information that is used in biology. Mark Frank pointed that out to you just a few days ago here and here.” And Mark Frank was wrong as he slithered away after it was pointed out to him. So I suggest you too read the article on biology and information. Another reason why no one should take you seriously as you failed to point out the continuation of the comments. By all means let's review the rest of the comments. I note that a simple way for you to have refuted my statement would have been to provide references of your own, but 'tis the season to be generous so I'll look for you. Mark Frank wrote here that: Both Godfrey-Smith (2000a) and Griffiths (2001) have argued that there is one highly restricted use of a fairly rich semantic language within genetics that is justified. This is clearly not something that is accepted by all the biology departments in the Universe. It is a contentious definition which they are arguing for. If this is the sense of information that you mean – then fine – just confirm that you mean information in the sense defined in Godfrey-Smith (2000a) and Griffiths (2001). But don’t pretend it is obvious or universally accepted within biology. Here you left a mildly amusing Casablanca reference, but certainly didn't address Mr. Frank's argument. Mark Frank requested clarification again here. Here you failed again to respond, merely repeating "The one the biology departments across the universe are using." after Mr. Frank used your own sources to show that no such single definition is being used. Mark Frank realizes the futility of his quest here and stops asking. At no point further in that thread do you attempt to address his argument. So, not only is your claim to have proven Mark Frank wrong not supported by the available evidence, your characterization of him having "slithered" away is also not in accordance with the reality of the situation. 
I would personally be very interested to read your statement of what you believe to be the consensus definition of "information" among biologists. My primary goal in delurking here is to understand CSI well enough to be able to implement software that measures it. Understanding what ID proponents mean by "information" would help me achieve that goal. Mustela Nivalis
"There is no “common one” other than Shannon information that is used in biology. Mark Frank pointed that out to you just a few days ago here and here." And Mark Frank was wrong as he slithered away after it was pointed out to him. So I suggest you too read the article on biology and information. Another reason why no one should take you seriously as you failed to point out the continuation of the comments. How many times does it have to be pointed out to all the know nothings here that the term "information" has many uses? And the one most used in biology is not Shannon information, but apparently Shannon information techniques can be applied to it to estimate its complexity. jerry
Joseph, can you give me a link to an essay of Meyer's where he gives the same definition as he uses in his book? Heinrich
Heinrich:
Joseph – I’m specifically asking about how ID (and in particular Meyer) quantifies information.
Then I suggest that you get busy reading his books and essays.
Until this is clear, I think it’s difficult to be sure whether Shallit was talking sense or not.
To you. Joseph
Joseph - I'm specifically asking about how ID (and in particular Meyer) quantifies information. Until this is clear, I think it's difficult to be sure whether Shallit was talking sense or not. Heinrich
Mustela Nivalis (the weasel), If biology uses Shannon information then it is no wonder biologists can't tell us very much beyond basic operation. As I and many others have said- Shannon didn't care about function or meaning. You cannot put meaning into mathematical form, so Shannon did not try to. What he did was to formulate a way to calculate the information carrying capacity. Joseph
Heinrich, How does one quantify anything in the theory of evolution? The point is that you really need to focus on your position, because it is the utter failure of your position to find any supporting data that has allowed ID to remain "in play". Joseph
Something else for Zachriel to ignore: All Shannon did was demonstrate the information carrying capacity- IOW Shannon's theory cannot differentiate between functional or message bearing sequences from random and useless noise. Joseph
For Zachriel to ignore-
"The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular information must not be confused with meaning."--Warren Weaver (worked with Shannon)
Joseph
Jerry at 66, Information has all sorts of definitions and a common one is the one that is used in biology. You are a big fan of Godfrey-Smith and he even talks about this in explicit detail. There is no "common one" other than Shannon information that is used in biology. Mark Frank pointed that out to you just a few days ago here and here. Mustela Nivalis
Thanks for the reference Jerry, not sure I will have time to read it all but if you look at my first post, I'm not totally on board with classical information theory applied to biology anyway, because the proponents are trying to prove intelligence behind a message. Too slippery of an approach in my view, because in the Shannon sense, a human at one end and a human at the other is the definition of information. One end may be a measuring/sensing machine designed by a human. You can read a book, coming from an author by definition, or transmit it over a communication channel with no errors so someone else can read it provided enough attention has been paid to the statistical requirements for error-free (actually the probability of error-free) transmission in the inevitable presence of noise, thanks to Shannon. But this has nothing to do with what a sentence means, or how that meaning may depend on context, analogs that may be absolutely relevant to biology. When it gets down to it, I'm much more enthusiastic about arguments from complexity, possibly with relevant probabilistic calculations, a la Behe in Edge. groovamos
jerry: Would you say that the information in a computer program or an English paragraph is information in the Shannon sense? From what little I understand, it isn’t.
He says as he transmits data (as Shannon Information) across the Internet. Shannon Information is a mathematical model. The correct question is whether an English paragraph can be analyzed as Shannon Information. The answer is clearly yes. Zachriel
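The point both sides keep circling can be shown in a few lines: first-order Shannon entropy depends only on symbol statistics, so a meaningful English sentence and a scramble of the very same letters measure identically. A minimal sketch (the pangram is an arbitrary example):

```python
import math
import random
from collections import Counter

def entropy_bits_per_symbol(s):
    """First-order Shannon entropy: -sum p*log2(p) over symbol frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "the quick brown fox jumps over the lazy dog"
scrambled = "".join(random.sample(meaningful, len(meaningful)))  # same letters, no meaning

h1 = entropy_bits_per_symbol(meaningful)
h2 = entropy_bits_per_symbol(scrambled)
print(abs(h1 - h2) < 1e-9)  # True: identical symbol statistics, identical Shannon measure
```

This is exactly Weaver's point quoted elsewhere in the thread: the measure applies to both messages equally, which is why it says nothing about which one carries meaning.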
"Of course they are in the Shannon sense, because information is anything meaningful that can be represented as sequences of symbols." I suggest you read the article in the Stanford Encyclopedia of Philosophy on Biological information. http://plato.stanford.edu/entries/information-biological/ It is and isn't Shannon information. Read my comment at #13. In one sense it is Shannon but in its crucial use, it isn't. I am not an expert but will take the assessment of Godfrey-Smith, who is supposedly an expert on this. The article is long but covers the various uses people have made of information within biology. jerry
R0b, I suggest you read my comment at #13. Information has all sorts of definitions and a common one is the one that is used in biology. You are a big fan of Godfrey-Smith and he even talks about this in explicit detail. jerry
Jerry says: "Would you say that the information in a computer program or an English paragraph is information in the Shannon sense? From what little I understand, it isn’t." Of course they are in the Shannon sense, because information is anything meaningful that can be represented as sequences of symbols. What Shannon did was to completely describe mathematically the situation whereby a message from one source is corrupted by an uncorrelated source of interference. This means that the two sources are uncorrelated so that each looks like random noise to the other. A straightforward illustration is when deep space communications are subject to interference from noise sources such as atmospheric discharges, solar induced magnetic disturbances, etc, causing errors in the information transmission. But suppose we want to do some weird science and analyze the induced errors in this scenario. We could do an experiment whereby the error rate and the error occurrence in time were studied. We would see such a study as meaningful because we are now studying the noise sources; and the equipment, designed by minds, provides the information. We could command the spacecraft to send a repetitive symbol indefinitely thus reducing the information content from the craft to zero, but the received information content would not be zero, as we would as a result of the noise sources' interference be receiving random symbol sequences. We could surmise useful information about the noise sources from this. Thus the key is usefulness in our mind. Weird science as there are usually better ways to study noise. groovamos
"Weather data are gathered as a time sequences of values, and the various alternative sequences produce specific effects. Why then do weather data not count as information according to this definition?" Various alternative sequences of weather do produce specific effects. Humidity increases the chance of rain, etc, etc. Those are physical properties and events, not information. (Otherwise we would have to redefine "information" as "Stuff that exists.") The weather never, ever produces an abstract, encoded description of itself. When it rains, the ground gets wet. The rain does not enter a record of itself into a meteorological log. One is cause and effect. The cause can often be determined from the effect. The other is an abstract definition of the cause and effect, which is never directly produced by the weather. I think some of the comments overcomplicate this. Maybe I'm wrong. But to boil it down, natural, unintelligent processes are not known for using abstract languages to create descriptions of reality that are separate from that reality. Why would they? And even if such a thing could result from millions of years of unintelligent evolution, how does a language more complicated than that used by any living thing appear at the very beginning, before anything had evolved? Surely you can understand why some might easily dismiss the idea. ScottAndrews
jerry, as much as I enjoy your effusive flattery, please don't lay it on too thick in public. My point was about the Webster definition quoted by sagebrush gardener, which says nothing about machinery, "pointing" (how is that different from cause and effect?), intermediary mechanisms, number of subparts, or particular processes. If certain time sequences produce predictable effects, why do they not count as information according to the Webster definition? R0b
"as a time sequences" --> "as time sequences" R0b
"Weather data are gathered as a time sequences of values, and the various alternative sequences produce specific effects. Why then do weather data not count as information according to this definition?" For someone who admits that he does not know what this is all about, you keep on showing up with comments that do not make sense. The data in DNA points to a specific independent entity. And it does so through a machinery set up to provide this relationship just as computer programming and language do. The data in weather does not point to anything specific. That is not to say that patterns of weather elements do not cause other phenomena, but they are not such that they cause the same specific thing each time. There is no intermediary mechanism of parts that leads from weather pattern A to weather pattern B as there is in the DNA transcription/translation process, which works through a mechanism consisting of about a thousand sub parts. You should go back to Amazon and put up another review which admits you do not understand the process or that you willfully distort it. You already have admitted you are ignorant of this process and yet you pontificate on Amazon and here and who knows where else. You say it is not malicious but one has to doubt that based on this persistent obstructive behavior. The sad thing is that your ignorance gathers helpful comments from readers. So it is so easy to see how a self proclaimed know nothing can distort a system. jerry
sagebrush gardener @ 7:
The second meaning — the one intended by Meyer — is “the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects”.
Weather data are gathered as a time sequences of values, and the various alternative sequences produce specific effects. Why then do weather data not count as information according to this definition? R0b
(sorry, forgot about this)
All that said, on page 86 of “Signature…” Meyer points to Webster’s dictionary- “the attribute inherent in and communicated by alternative sequences or arrangements of something that produce specific effects”.
How does one quantify this? Heinrich
I read some of shallit's comments...are there any darwinists who are not arrogant a....... anyway this paper is rather interesting...
The fundamental contention inherent in our three subsets of sequence complexity proposed in this paper is this: without volitional agency assigning meaning to each configurable-switch-position symbol, algorithmic function and language will not occur. The same would be true in assigning meaning to each combinatorial syntax segment (programming module or word). Source and destination on either end of the channel must agree to these assigned meanings in a shared operational context. Chance and necessity cannot establish such a cybernetic coding/decoding scheme. Self-ordering phenomena are observed daily in accord with chaos theory. But under no known circumstances can self-ordering phenomena like hurricanes, sand piles, crystallization, or fractals produce algorithmic organization. Algorithmic "self-organization" has never been observed [70] despite numerous publications that have misused the term [21,151-162]. Bona fide organization always arises from choice contingency, not chance contingency or necessity.
link tsmith
Collin: I take issue though with how you use the word specificity. Does specificity mean improved function?
To calculate specificity, we define our function at a certain level, and determine the number of sequences that could provide this level of function. The fewer such sequences, the more specific the sequence. We can visualize this as a fitness landscape, often a sharp local peak and a broad foothill region. As we can show that protein evolution is adept at hillclimbing, we can see that the protein becomes better matched to the function. Specificity increases, often by many orders of magnitude.
Collin: Does specificity mean functional specified complex information?
Assuming a constant sequence length, if the specificity increases, so does the specified complexity. X = –log2 [ BIGNUM · S(T)·P(T|H)] (S(T) is Dembski's specificity. Dembski defines it in terms of a Semiotic Agent or more recently by K-complexity. The shorter the description, the more specific. H is the chance hypothesis, which is rather vaguely defined, but is usually taken as being a uniform random distribution.) Zachriel
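The "fewer such sequences, the more specific" idea above can be made concrete with a toy model. This is not Dembski's actual measure (there is no BIGNUM factor or semiotic-agent term here); sequences are 10-bit strings and the "function" is simply the count of 1s, both arbitrary choices for illustration:

```python
import math
from itertools import product

L = 10  # short binary sequences so the whole space of 2^L can be enumerated

def fitness(seq):
    """Toy 'function': the number of 1s in the sequence."""
    return sum(seq)

def specificity_bits(level):
    """-log2 of the fraction of all 2^L sequences reaching this fitness level."""
    hits = sum(1 for seq in product([0, 1], repeat=L) if fitness(seq) >= level)
    return -math.log2(hits / 2 ** L)

# Higher fitness levels are reached by fewer sequences, so they carry more
# bits of specificity; a hill-climber moving uphill is moving to rarer levels.
print(specificity_bits(5) < specificity_bits(8) < specificity_bits(10))  # True
```

In this toy landscape the peak (all 1s) is reached by exactly one sequence out of 1024, giving 10 bits of specificity, while the broad foothill at level 5 gives less than one bit.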
Zachriel, Good answer. I'll have to review those studies. I take issue though with how you use the word specificity. Does specificity mean improved function? Is that how Meyer and Dembski are using the term? Does specificity mean functional specified complex information? Collin
Zachriel: So Shannon did “deal with information.” Joseph: Only if one chooses to conflate mere complexity with information.
Shannon's paper makes repeated references to information, so it is clear that he "dealt with information" —unless you are claiming that Shannon didn't understand information.
Joseph: IOW Meyer uses “information” as information technology uses the term.
The global information infrastructure is based on Shannon's Theory of Information.
Joseph: IOW with Shannon meaning and content are not even considered.
Yes, that is correct. But that doesn't salvage your unsupportable statement that Shannon didn't deal with information. Zachriel
The classical theory of information can be compared to the statement that one kilogram of gold has the same value as one kilogram of sand- Karl Steinbuch (info scientist)
The point is that if you do not deal with content, value, meaning, or substance, then you are not dealing with information. And yes, information can be transmitted as bits, but that doesn't mean that any configuration of transmitted bits is information, i.e., having content and meaning. Joseph
“Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent…as regards information” Warren Weaver (info scientist) Zachriel:
That directly contradicts your position that “As for Shannon he never did deal with information, just mere complexity.”
It supports what I said. IOW with Shannon meaning and content are not even considered. However information is all about meaning and content.
So Shannon did “deal with information.”
Only if one chooses to conflate mere complexity with information. Again Meyer goes over this in chapter 4 of "Signature in the Cell". Werner Gitt provides an excellent review of Shannon's work and relevance in his book "In the Beginning was Information". Shannon was concerned with one aspect- statistics.
It’s just the theoretical basis of the digital revolution.
Not the part of his work which equates/conflates random characters with information. However I get the impression that you don't understand at all what Shannon was trying to do. Joseph
Joseph: What part of the following don’t you understand: “Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent…as regards information” Warren Weaver (info scientist)
That directly contradicts your position that "As for Shannon he never did deal with information, just mere complexity."
In particular, information must not be confused with meaning. In fact, two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information.
So Shannon did "deal with information."
Joseph: That is why many scientists didn’t find his “theory” very useful.
It's just the theoretical basis of the digital revolution. Zachriel
Zachriel, You are confused and you think your confusion is meaningful discourse. What part of the following don't you understand: “Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent…as regards information” Warren Weaver (info scientist) “The reason for the ‘uselessness’ of Shannon’s theory in the different sciences is frankly that no science can limit itself to its syntactic level”- Ernst von Weizsacker The point is that Shannon never cared about content. This is well known and not disputed. That is why many scientists didn't find his "theory" very useful.
The statement provided sufficient context to indicate we’re talking about replicators subject to natural variation and selection.
No one is talking about replicators subject to natural variation- whatever that means- and selection- whatever that means. Living organisms are far more than replicators, design is natural and selection can be artificial. So again what have random variations and natural selection been shown to do? That's right I nailed it above- they provide a wobbling stability and nothing more. Joseph
Collin: What is this evidence you speak of?
Whether looking at phylogenetics or the optimization of proteins or tracing the spread of disease, in vitro or in vivo or in silico, natural or directed, there is a great deal of literature on how incremental changes lead to improved function. Rowe et al., Analysis of a complete DNA–protein affinity landscape, Journal of the Royal Society 2009. Lenski et al., Genome evolution and adaptation in a long-term experiment with Escherichia coli, Nature 2009. Bloom & Arnold, In the light of directed evolution: Pathways of adaptive protein evolution, PNAS 2009. Hayashi et al., Experimental Rugged Fitness Landscape in Protein Sequence Space, PLoS 2006. Weinreich et al., Darwinian Evolution Can Follow Only Very Few Mutational Paths to Fitter Proteins, Science 2006.
Collin: For a good discussion of what “specification” means to ID-ers see this:
I am quite familiar with the paper. Zachriel
Zachriel, What is this evidence you speak of? Evolutionary algorithms? What is that? Is that evidence? Or is it supposition? For a good discussion of what "specification" means to ID-ers see this: http://www.designinference.com/documents/2005.06.Specification.pdf Collin
Zachriel: As evolutionary processes can optimize functional biochemical systems, increasing their “specificity,” that means CSI can increase due to the simple mechanisms of natural variation and selection. Collin: This is question begging. The assertion is that evolutionary processes cannot increase their specificity.
It's not question begging. It's pointing out that the assertion is contradicted by evidence that evolutionary processes can optimize functions, which means they are increasing the specification of the solution. Typically, this is represented by a peak on a fitness landscape, which evolutionary algorithms are quite adept at climbing. Zachriel
Zachriel said "As evolutionary processes can optimize functional biochemical systems, increasing their “specificity,” that means CSI can increase due to the simple mechanisms of natural variation and selection" This is question begging. The assertion is that evolutionary processes cannot increase their specificity. Collin
Joseph: As for Shannon he never did deal with information, just mere complexity. Zachriel: That’s funny. Joseph: It happens to be true.
Maybe you're right. Did the Father of Information Theory even mention "information" in his seminal paper, A Mathematical Theory of Communication?
Zachriel: As evolutionary processes can optimize functional biochemical systems, increasing their “specificity,” that means CSI can increase due to the simple mechanisms of natural variation and selection. Joseph: Ya see I keep telling you that evolutionary processes are meaningless as the debate is about BLIND and UNDIRECTED processes.
The statement provided sufficient context to indicate we're talking about replicators subject to natural variation and selection. It has been shown that such processes are quite adept at hill-climbing, i.e. optimization through increased specificity. Zachriel
Joseph: As for Shannon he never did deal with information, just mere complexity. Zachriel:
That’s funny.
It happens to be true. Meyer discusses this in "Signature..": "Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent...as regards information" - Warren Weaver (info scientist). "The reason for the 'uselessness' of Shannon's theory in the different sciences is frankly that no science can limit itself to its syntactic level" - Ernst von Weizsäcker. Joseph: Behe cashes it out in terms of minimal function of biochemical systems. Zachriel:
As evolutionary processes can optimize functional biochemical systems, increasing their “specificity,” that means CSI can increase due to the simple mechanisms of natural variation and selection.
That is funny. Ya see I keep telling you that evolutionary processes are meaningless as the debate is about BLIND and UNDIRECTED processes. Also there isn't any scientific data which demonstrates CSI can originate via blind and undirected processes. Joseph
Mark Frank, Wm Dembski has had plenty of opportunity to "correct" Meyer. I know the two have worked together on ID. That Dembski hasn't "corrected" Meyer tells me that Meyer is not contradicting Dembski. Also I have provided a direct quote from Meyer saying he is using information as Dembski is. Joseph
jerry @ 29:
“Would someone be kind enough to quote Meyer where he explains what he means by “information”?” DNA and the transcription/translation process.
You seem to be interpreting the question as "What information is Meyer talking about?" rather than "How does Meyer define the term 'information'?". This would explain our miscommunication in a previous thread. R0b
#39 R0b - thanks.
#40 Joseph: "just because R0b sez that it contradicts Dembski does not mean it actually does" No, but assuming R0b's quotes are accurate, then these quotes contradict Dembski. Mark Frank
R0b:
Yes and no. Meyer is inconsistent, and he usually gets Dembski’s definition wrong.
Really? Then it is strange that Dembski hasn't corrected him seeing that they have lectured at the same conferences. Also just because R0b sez that it contradicts Dembski does not mean it actually does. Joseph
Mark Frank:
Is Meyer’s definition of information different from Dembski’s?
Yes and no. Meyer is inconsistent, and he usually gets Dembski's definition wrong. As you know, the "complexity" part of Dembski's "specified complexity" refers to improbability, but Meyer usually associates it with Kolmogorov complexity. For instance, in chapter 4 he says: Complex sequences exhibit an irregular, nonrepeating arrangement that defies expression by a general law or computer algorithm (an algorithm is a set of instructions for accomplishing a specific task or mathematical operation). The opposite of a complex sequence is a highly ordered sequence like ABCABCABCABC, in which the characters or constituents repeat over and over due to some underlying rule, algorithm, or general law. He sometimes correctly defines "complexity" as improbability, and often he defines it as both improbability and irregularity. In fact, he even equates the two concepts, saying: Information scientists typically equate "complexity" with "improbability," whereas they regard repetitive or redundant sequences as highly probable. You'll note that this contradicts Dembski, who defines "specified complexity" as being descriptively simple and improbable. Dembski's examples are often repetitive sequences or simple patterns, such as a series of coin flips that consists of all heads, or Nicholas Caputo drawing "Democrat" for every election, or the monolith in 2001: A Space Odyssey, or a simple narrowband signal from space. R0b
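The Kolmogorov-style notion R0b attributes to Meyer, complexity as resistance to description by a general law or algorithm, can be illustrated with an ordinary compressor as a rough stand-in (zlib here; true Kolmogorov complexity is uncomputable, so this is only a sketch): a repetitive sequence like ABCABCABC compresses to almost nothing, while a random sequence of the same length barely compresses at all.

```python
import random
import zlib

repetitive = b"ABC" * 4000                                  # 12,000 highly ordered bytes
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(12000))  # 12,000 random bytes

rep_size = len(zlib.compress(repetitive, 9))
noise_size = len(zlib.compress(noisy, 9))

# The ordered sequence has a short description; the random one does not.
assert rep_size < 200
assert noise_size > 11000
```

This is exactly the tension R0b points at: on this measure the repetitive sequence is the simple one, whereas on Dembski's improbability reading the two sequences of equal length are equally improbable.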
Joseph: As for Shannon he never did deal with information, just mere complexity.
That's funny.
Joseph: Behe cashes it out in terms of minimal function of biochemical systems.
As evolutionary processes can optimize functional biochemical systems, increasing their "specificity," that means CSI can increase due to the simple mechanisms of natural variation and selection. Zachriel
Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. In virtue of their function, these systems embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the same sense required by the complexity-specification criterion (see sections 1.3 and 2.5). The specification of organisms can be cashed out in any number of ways. Arno Wouters cashes it out globally in terms of the viability of whole organisms. Michael Behe cashes it out in terms of minimal function of biochemical systems. - Wm. Dembski, page 148 of NFL
In the preceding and following paragraphs William Dembski makes it clear that biological specification is CSI, complex specified information. In the paper "The origin of biological information and the higher taxonomic categories", Stephen C. Meyer wrote:
Dembski (2002) has used the term “complex specified information” (CSI) as a synonym for “specified complexity” to help distinguish functional biological information from mere Shannon information--that is, specified complexity from mere complexity. This review will use this term as well.
All that said, on page 86 of "Signature...." Meyer points to Webster's dictionary: "the attribute inherent in and communicated by alternative sequences or arrangements of something that produce specific effects". As for Shannon, he never did deal with information, just mere complexity. IOW Meyer uses "information" as information technology uses the term. Joseph
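Joseph's point about Shannon, that the measure tracks symbol statistics rather than meaning, can be checked directly. A minimal sketch (the sentence is invented for illustration): a meaningful string and a random shuffle of its characters have identical Shannon entropy, because the measure depends only on character frequencies.

```python
import math
import random
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy in bits per symbol, from character frequencies."""
    n = len(s)
    counts = Counter(s)
    # Sort so the floating-point sum is order-independent.
    return -sum((c / n) * math.log2(c / n) for c in sorted(counts.values()))

meaningful = "the quick brown fox jumps over the lazy dog"
chars = list(meaningful)
random.shuffle(chars)
nonsense = "".join(chars)  # same characters, no meaning

# Same symbol frequencies, therefore the same entropy, meaningful or not.
assert shannon_entropy(meaningful) == shannon_entropy(nonsense)
```

This is Weaver's "exactly equivalent...as regards information" in miniature; whatever distinguishes sense from nonsense, it is not this quantity.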
Re #35: Is Meyer's definition of information different from Dembski's? I think I understand Dembski's fairly well, having read most of his papers. Mark Frank
Bruce - the mathematical definition should be fine for me: I'm pretty numerate. I don't mind doing work, but is it possible to get these definitions without buying someone's books? Heinrich
To Heinrich, et al: Meyer spends an entire chapter (Chapter 4) defining what he means by the type of information found in the genomes of living organisms. The definition is specific, rigorous, and mathematical. It follows Dembski, who spent several chapters of several books (The Design Inference, No Free Lunch, and others) explaining and defining complex specified information (CSI). You can't really give an adequate one or two sentence definition. If you are really serious about understanding ID, you need to do a little work! Bruce David
Indeed, Collin. I'm hoping I don't have to shell out a pile of money just to find and read one or two sentences. Heinrich
Heinrich, I'm sure you will be able to trash Meyer's definition once you've read it. Collin
Would you be kind enough to read post number 7 and take note of the relevant quote already provided.
The quote isn't from Meyer, so I can't be sure that it is the definition he meant. The definition also isn't enough, because there is nothing in the definition that shows how to quantify it.
Meyers [sic] draws a clear distinction between what he calls ‘complex specified information’ and ‘Shannon information’, which is merely complex.
OK, so Meyer doesn't use Shannon information. That cuts it down to about 493 uses. :-) Heinrich
I lent my copy of "Signature in the Cell", so I can't doublecheck my memory, but Meyers is downright voluble about what he considers information. Have the critics here read his book??? Meyers draws a clear distinction between what he calls 'complex specified information' and 'Shannon information', which is merely complex. 'Shannon' is the term I'm unable to doublecheck. Anyway, he clearly draws the distinction between the two types of information. I'm assuming the critics here are intelligent, therefore I have to conclude that they haven't read the book, which is not too smart when debating something asserted in it. Anaxagoras_Rules
---Heinrich: "Would someone be kind enough to quote Meyer where he explains what he means by “information”?" Would you be kind enough to read post number 7 and take note of the relevant quote already provided. That way you can move on to your next objection, which will consist of a series of comments indicating your dissatisfaction with the definition, followed by the earth-shaking observation that the term has more than one meaning and that not everyone applies it the same way or in the same context or from the same vantage point. StephenB
"Would someone be kind enough to quote Meyer where he explains what he means by “information”?" DNA and the transcription/translation process. jerry
Until you can show how the type of information Meyer refers to arises in nature, then maybe you should refrain from criticizing Meyer or ID or defending Shallit’s specious, irrelevant comment.
As I wrote, I don't have Meyer's book. Until someone explains what exactly Meyer means by "information", it's difficult to progress further. I've tried to explain how Shallit's criticism makes sense, i.e. that it is talking about physical processes that affect the information that is measured, but I don't know how that aligns with how Meyer uses "information" in his book. Would someone be kind enough to quote Meyer where he explains what he means by "information"? Heinrich
From the below post: http://sonofneocles.blogspot.com/2009/12/tom-nagel-apostate.html The weather forecasting analogy is not very good. The forecaster is an intelligent agent. He gathers data, from which he generates a prediction. The data is generated by the interaction of natural phenomena with various intelligently designed instruments that convert the interactions into information. This then is the data used (be it magnetic patches on a hard drive, or squiggly lines on a barometer). This first-level data, together with the methodology used to generate the prediction (statistical modeling, etc.), generates the prediction, a second level of data or information. But, to be clear here: the data fed into the model is the result of the interaction of intelligent agents with the natural phenomena, via tools they have created for such purposes (and their brains, of course). Only after all that has gone on does the same person, or different people, feed this data or information into a machine or brain that has 'installed' within it a higher-level predictive method and/or tool, which then generates some further wider-scope information that can in turn be used by other intelligent agents, because they can read and understand it. The initial phenomena, while they did not come from a mind, nevertheless can be called 'information' only in an extended metaphorical sense. They are not really information at all (just physical events like air movement) until experienced and interpreted as such by brains and intelligently designed measuring devices which are related to each other in just the right way. But, aside from that fallacy, notice the invective: It's sad to see such an eminent philosopher (Nagel) make a fool of himself with this recommendation. and in this comment a bit down the page on said blog.. Nagel has become a disgrace. He was a philosopher who made some significant contributions, but in areas far afield of this one. 
A small irony: the other book he chooses to recommend is by a colleague and friend with whom he co-teaches. High standards of integrity here! Wow, ad hominem with a vengeance. He's recommending books by friends. The horror! Academic quasi-nepotism in action. shaun
"Science was never intended to make logic its master, and it never could." Strike that. Science was never intended to surpass logic, and it never could. also "It can never be determined outright, if every instance of gravity isn't being controlled by god, or if no god was never involved." I meant "ever involved." Anyways, one point I ran across from a Dembski article one time is that randomness can't be defined. It was a while ago and I think that's what he was saying. Either way, this is what I got out of it. A lottery is predetermined to give only apparent randomness, but the chain leads back to order from a person's mind. The paint on the balls and the timing of the lady pulling the lever aren't a confirmation of randomness. You might as well disprove destiny. A random-number-generating computer program relies first on the inventor's arbitrary concept of order installed into it. But Darwinists attack ID saying it must define information, which is represented here as order. Darwinists must at the same time define chaos, but don't. Both are on equal ground, until you get down to debating rocks to rockstars, which is cut and dry; there's no path. On the highest level though, of defining order and chaos themselves, from which comes information, it's equal ground. The basis of neo-Darwinism, of which origins is included except during embarrassing setbacks, is that pure chaos can exist, when it looks like it can't. Unless you want to say laws can start up from the motion and bouncing off of each other. The bouncing itself is a law. lamarck
There's so much fake concern over the definition of information. It's the via (noun) by which anything arises that can't be explained by what we believe to be "natural" processes. The word is the totality of one side of the debate, ID's side. No use reading the rest of this post because that's it. The laws themselves which make lightning occur, and gravity etc., are all likely from a sentient source, but their mechanistic via is seen to be easily reproduced by nature on its own. This is an opinion. The grey area of information vs. nature will be as large as not enough is known, but it will likely always be an opinion. But to pretend one or the other doesn't exist is for some religions. It can never be determined outright, if every instance of gravity isn't being controlled by god, or if no god was never involved. But it can be seen that the probability of god creating each gravity moment newly isn't worth talking about - so that gets put on the "natural" shelf. And someday hopefully god's "handiwork left behind on automatic" can be studied by ID in a meaningful way. When you deny information exists, you have to then deny everything exists, except for maybe a few subjective ideas, because it's on par with observing a wall. We don't have a handle on the mathematical absolute knowledge of anything, yet something exists. Give me a break. Science was never intended to make logic its master, and it never could. It's always subservient to logic, "what makes sense," because tools were never built into science to overcome logic. It's always going to be about opinions of data. Some scientists want you to believe your opinion of their data doesn't count. Why not? You're on equal ground with anyone else when looking at raw results. lamarck
"Phew – reality has asserted itself, and we disagree." It's your obligation to show why you disagree with my point. I am not aware of any instances where ID is at odds with reality. Individual people make assertions, but that does not mean that their assertions are essential to ID. Like anything in science, it is modified by additional data. As far as the information that Meyer is concerned with: when one data source specifies the content in another data source, that is an unusual type of information. I continually give the same answer to everyone here. Ask the biology programs around the globe what they mean by biological information. They use it all the time. And then listen to a Berkeley course on the history of information to try to understand the various ways the term information can be used. Until you can show how the type of information Meyer refers to arises in nature, then maybe you should refrain from criticizing Meyer or ID or defending Shallit's specious, irrelevant comment. Shallit is referring to a different type of information and trying to conflate the two. That you cannot see this is interesting. Meyer doesn't have to use the term FSCI. Anyone who criticizes the term should deal with the content of the term rather than the term itself. jerry
Why should you be worried? ID is based on truth, an accurate reading of the physical universe. It makes logical conclusions based on that reading. What I said was obvious. What Shallit said was specious and is also obvious.
Phew - reality has asserted itself, and we disagree. :-) To me, ID isn't an accurate reading of the physical universe, and I'd also disagree that what Shallit said was specious. And I'm not sure it's obvious, either. This was why I was trying to unpack what was meant by information, and where it was being created or just transmitted.
What Meyer is saying is that certain types of information only seem to be able to be created by a mind. We have had the same discussion here a hundred times or more. We call this information, FSCI or functionally specified complex information.
Does he use the term FCSI? And how does he define "information" in FCSI? I assume you're able to just quote the passage where he explains what he means by "information". Heinrich
"Hm. I’m worried. Seriously worried. We agree." Why should you be worried? ID is based on truth, an accurate reading of the physical universe. It makes logical conclusions based on that reading. What I said was obvious. What Shallit said was specious and is also obvious. "In particular, is Meyer saying that mind-independent information cannot be created? i.e. a physical process cannot lead to an increase in the information that would be measured?" I doubt that he believes this because there are lots of types of information. In your sentence you are using the term in a general fashion and could mean any of the many types of information. But as we have said, only a mind designates it as information, whether the information exists or not in nature. That is neither here nor there and has nothing to do with the argument made in his book because he is not talking about information in general. It may be an interesting question to debate in a philosophy class, but it has nothing to do with the origin of life or evolution. What Meyer is saying is that certain types of information only seem to be able to be created by a mind. We have had the same discussion here a hundred times or more. We call this information FSCI, or functionally specified complex information. (Some will make the fatuous argument that no one else except us uses this term and therefore it is invalid, but that has nothing to do with the relevance of the argument.) The information in DNA, when looked at in a certain way, is this type of information and is essentially similar to the information contained in language and computer programming. This type of information is found as a result of intelligence all the time but is not found in nature at all except in certain aspects of DNA. So the question is: can nature, using the forces of nature only, produce this type of information? The answer so far is it cannot. To say it has produced it in life is begging the question because that is the entity under discussion. 
So all the processes of nature from the beginning of time and all the scientists in laboratories all over the globe have not been able to produce this type of information in even the littlest bit. I expect that at some time in the future a great fanfare will be made and some group of scientists will show how a small amount of this type of information was created by using some unusual combination of forces. But even such an event will probably be only child's play compared to what will be necessary to produce true FSCI. The "C" stands for complex and represents thousands of bits of information, and even that is only a token of what is present in the simplest cell. But we have a legion of anti-ID people here who deny the obvious. Which is why, when it comes to science, it is ID that is scientifically based and the anti-ID which is religiously or sophistry based. jerry
jerry @13 -
The term information has unknown numbers of meanings. The argument that it is only produced by the mind is in a sense true and not true. If there were no humans or entities in the universe of similar or higher levels of intelligence, many combinations of natural entities, not life, would not be considered information because no mind could make such a designation. But the combinations of natural entities would exist nonetheless.
Hm. I'm worried. Seriously worried. We agree. As I still haven't won a copy of SotC, can someone explain what concept Meyer uses? In particular, is Meyer saying that mind-independent information cannot be created? i.e. a physical process cannot lead to an increase in the information that would be measured? Heinrich
BTW, wind speed, etc., does not exist in a natural state, but are concepts of minds and as such can only be ascertained by mental activity. Shallit's pasted blurb is just some very grasping and not well thought out gobbledygook. Anaxagoras_Rules
Quote from wikipedia: Messenger ribonucleic acid (mRNA) is a molecule of RNA encoding a chemical "blueprint" for a protein product. mRNA is transcribed from a DNA template, and carries coding information to the sites of protein synthesis: the ribosomes ... In mRNA as in DNA, genetic information is encoded in the sequence of nucleotides arranged into codons consisting of three bases each. Each codon encodes for a specific amino acid, except the stop codons that terminate protein synthesis. This process requires two other types of RNA: transfer RNA (tRNA) mediates recognition of the codon and provides the corresponding amino acid, while ribosomal RNA (rRNA) is the central component of the ribosome's protein manufacturing machinery. Collin
UNLESS the cell can be viewed as a hardware decoder designed by mind. Why not? tribune7
groovamos, I don't see how it is a circular conundrum. The tRNA and mRNA can't be designed to be a "decoder" of sorts? Collin
"I’m losing patience with the use of information theory “concepts” or non-concepts as the case may be, to push ID in the marketplace." The term "information" has many meanings, and from what I understand the use of it in the Shannon sense does not have anything to do with the most common use of the term in biology. You are an MSEE and apparently know something about Shannon information. Would you say that the information in a computer program or an English paragraph is information in the Shannon sense? From what little I understand, it isn't. jerry
Jeffrey says in his example that weather information does not originate in mind(s). I disagree in part; but he is on to something regarding the nature of, or inherent existence or nonexistence of, information in the weather. Does weather have informational content? No. Do units of measure? Yes. Do the designs of transducers and sensors? Yes again. The last two arise in minds and in the Shannon sense are part of the coding problem, and are indispensable for collecting weather information. And so weather information does originate in minds, because the coding originates in minds. This in my view is a conundrum for Jeffrey's argument, and is in fact laughingly related to the insistence of scientists that there is such a thing as a "genetic code", a term used to wow each other and the public in general. By relying on the "wow" factor to keep the funding coming, life scientists in this case are unintentionally implying a mind behind the "genetic code", because codes originate in minds. Including computer coding, because minds produce the hardware, assemblers and compilers that translate codes into function. Alternatively, if there is such a thing as a genetic code, then by the above reasoning there would have to be a mind in the cell for there to be such a thing as meaningful information, since information requires a mind to code and a mind to decode, UNLESS the cell can be viewed as a hardware decoder designed by mind. This may be a circular conundrum for ID, and although I support ID, I'm losing patience with the use of information theory "concepts" or non-concepts as the case may be, to push ID in the marketplace. I think there is confusion sown by the use of little understood concepts. Engineer here, with an MSEE, which is also how Shannon was credentialed before he obtained his PhD in math. I never see in the literature that Shannon was an engineer first and a mathematician second. groovamos
Consider making a weather forecast. Meteorologists gather information about the environment to do so: wind speed, direction, temperature, cloud cover, etc. It is only on the basis of this information that they can make predictions. What mind does this information come from? The information is derived from standards and definitions that were created by the minds of men, well, like Anders Celsius. Without these standards -- intelligently designed, if you will -- the information would not exist. tribune7
The term information has unknown numbers of meanings. The argument that it is only produced by the mind is in a sense true and not true. If there were no humans or entities in the universe of similar or higher levels of intelligence, many combinations of natural entities, not life, would not be considered information because no mind could make such a designation. But the combinations of natural entities would exist nonetheless. The moon is lifeless, so we presume, but it contains all sorts of combinations of natural entities that, once a mind considers it, is considered information using many of the common uses of information. These combinations of entities, craters, rocks, dust, crevices etc, are definitely information to the geologist and other scientists. Similarly, temperatures, wind speed, etc are pieces of information to the mind. Were they information prior to the mind observing them? Some might argue yes, some no. But the issue is really pointless in the debate over evolution or the origin of life or the origin of the universe. The issue becomes are there certain entities that can only come into existence through an intelligence and do these entities possess a certain type of information that is different than other types of information. If I take a walk in the woods, and pick up a rock, this rock contains all sorts of information. For example, the very place it was lying and its orientation and its comparison to its surroundings might tell me how long the rock has been there and its potential origin. If someone did a chemical analysis of the rock, and suppose there was a technique to describe each molecule in the rock and its position relative to every other molecule, then I would have a very complete description of the rock. All this is information. Suppose someone did further analysis of the rock and realized that it pointed to an opening in a cave 50 feet away. 
Now that could be a coincidence or it could mean that some intelligence placed the rock there. If it just happened to be a coincidence, is that information? If some intelligence placed the rock there, then most would call that information. Again remembering that there are many, many uses of the term "information." Also, if one did a chemical analysis of the rock and then realized that many of the combinations of the various molecules led to the production of an exterior gas which led to the formation of other types of rock or other entities that made it easier for rocks to form, we would be amazed. We would call these sub-combinations information too, but in a different sense than the previous types of information used about this rock. If these processes were actually quite simple but still efficacious in producing new rocks, we would not call the process intelligence-based but an interesting phenomenon of nature. It would be in all the chemical and geology books and be studied. However, if the entities produced by these sub-combinations produced highly intricate systems that interworked with each other to produce factories for producing rocks, we would step back and say, how is this possible? And if these little rock-system-producing factories found a way to perpetuate themselves, we would call it life and be amazed. We would look for the chemical properties of the molecules that would force these rock systems to start, and continue to research them until we concluded that there do not seem to be any properties of chemicals that inevitably lead to such systems and the only source for such systems in our experience is intelligence. Of course the properties described do not exist, but it shows the different types of information that can be associated with a rock. 
It also shows that to conflate the fact that some of this information could exist with and without an intelligence to assess it with the type of information that only exists as a result of intelligence is a nonsense argument, and one easily separated. There are many types of information in DNA. Just the description of the individual nucleotides is information. Their groupings relative to each other are information. If the nucleotide combinations were found to reliably spell out someone's name, that would be another piece of information. So if we were able to reliably show that Craig Venter is spelled out in some set of nucleotides, that is information of a different nature using the same data set. If we found that the combinations of nucleotides coded for something extraneous to the DNA through a chemical process, that would be entirely different information using the same data set. If we found out that the nucleotides had different patterns in them, that would be information. If we found that the nucleotides cut off at certain points, that would be information. The different types of information that might arise from nucleotides may almost be endless. But if certain subtypes of the various information generated by an examination of the nucleotides were identified as only arising from intelligent intervention, that does not make the other types of information that could arise through natural means somehow contradict the intelligent origin of the other information. Shallit's argument is specious, and the sad thing is he knows it yet proffers it. But we see that same phenomenon here played out every day, and the know-nothings raise their fatuous objections. jerry
Bruce, I am pretty sure that Matteo was using sarcasm to ridicule the idea that such a complex process came into being by pure chance. But this does raise a point that I hope may be helpful to mention here. If I may humbly offer a suggestion, I have always found communication to be most effective when clarity is chosen over cleverness. Cleverness may bring some small amusement to those who are already in agreement, but more often than not (as I have observed many times in this forum) it confuses one's supporters while having little effect on the opposition except prompting them to respond in the same way. Like salt on food, an occasional light sprinkling of wit may season our arguments, but it does not make a good main dish. sagebrush gardener
Semiotics, anyone? I haven't read Signature in the Cell (I'm waiting to win one of Denyse's competitions. :-)), but I think one has to be careful separating out the signified (e.g. the actual temperature) and the signifier (e.g. the number that is written down). As I understand it, Shallit's point is that the actual temperature (whether recorded or not) is the information, in which case it exists independently of any intelligence. I can't comment on exactly what Meyer means by information, but it's clear that Dr. Dembski's information is something that exists independently of a human mind: it's a property of the cell. So the actual sequence of DNA is equivalent to the actual weather: it's something which is there. Heinrich
To Matteo: You said, "Those insects which proceeded to grow into something useful after dissolving into a slurry went on to reproduce, while those that didn't, didn't. Those insects which dissolved after building a cocoon did even better! Don't you see?!? It was Natural Selection which is the very antithesis of randomness!!!" You've missed a very crucial point--until you can show a possible evolutionary path by which an insect which does not metamorphose reaches a state where it dissolves into a slurry and then "gr[ows] into something useful" by a series of incremental steps each of which enhances its fitness, you have nothing at all. And even then, you have to deal with the problem of how a slurry that is not contained (say within a cocoon) could possibly grow into anything at all. Bruce David
When I first read this post, I thought, "Is Shallit serious? Of course the information comes from the human beings who decided what to measure, who set up the instrumentation to measure those quantities, and who then make the predictions based on those results." However, as I think a little more deeply, I believe that there is something else going on here. If I put myself in Shallit's shoes, I think that he equates all the complexity in the atmosphere with the complexity in the DNA. Since, being a materialist (I assume), he sees both DNA and the weather as having been produced by natural causes, he sees no difference between the two in terms of information. In making this equivalence, however, he misses the distinction that differentiates CSI from mere complexity, and that is functional specificity. The complexity in the atmosphere that is the weather is neither specified nor functional. Therefore, although it is complex, it does not qualify as information by Meyer's definition (following Dembski, of course). Complexity can be and is produced by natural causes all the time throughout the Universe. It is complex, functionally specified information that requires a designing intelligence to create. Bruce David
Well, pointing out Shallit's equivocation is quite right, but we can also, I think, simply take his argument and ask him what he really knows at all. Okay, we have this "information" in the wind, clouds, etc. But what does this argue at all? He has still said absolutely nothing, because he is simply making a non-empirical assertion: that we "know" this "information" didn't come from a mind because we "know" it didn't come from a mind. In actuality he has not changed the argument at all. He is trying to counter Meyer's argument by using the exact same argument and saying, "but we know that weather is from purely natural and materialistic processes", yet that is only an assumption. My question: How often does a process have to recur on its own before we pronounce it "natural" and "without the cause of a mind"? How many revolutions does a spinning top have to make before we say, "it 'obviously' started itself because it just keeps going"? How scientific is this? Just because we don't see an intervening finger keeping the top spinning doesn't mean that it didn't have one "In the beginning . . ." Brent
Shallit is confusing two distinct meanings of the word information. The first meaning, which Shallit is using, is "knowledge obtained from investigation", i.e. a description of an object, event, etc. The second meaning -- the one intended by Meyer -- is "the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects". To illustrate, that is like the difference between a picture of what I see on my computer screen vs. the millions of lines of computer programming that produced it. Or a description of the orchid sitting on my desk vs. the complex genetic code that determined its features. [ Source of definitions: http://www.merriam-webster.com/dictionary/information ] sagebrush gardener
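The distinction sagebrush gardener draws can be illustrated loosely in code (an informal analogy of my own, not Meyer's or the dictionary's formal definition): a description *about* something merely informs a reader, while a functional sequence *produces a specific effect* when interpreted, and that effect depends on its exact arrangement.

```python
# Informal illustration (my analogy, not a definition from the thread):
# the same alphabet can carry a *description* of a result, or a *functional
# sequence* whose specific arrangement produces that result when interpreted.

description = "the sum of two and three"  # informs a reader; computes nothing
expression = "2 + 3"                      # a sequence that produces an effect

# Only the functional sequence yields the result when handed to an interpreter.
result = eval(expression)
print(result)  # -> 5

# Scrambling the description leaves garbled English *about* the same topic;
# scrambling the expression (e.g. "+ 23 ") destroys its function outright.
```

On this analogy, weather measurements resemble the description, while the genetic "programming" sagebrush gardener mentions resembles the expression.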
Shallit:
Meteorologists gather information about the environment to do so: wind speed, direction, temperature, cloud cover, etc. It is only on the basis of this information that they can make predictions. What mind does this information come from?
The meteorologist's? SCheesman
Who could ever think that wind speed, temperatures, humidity, rainfall, and the times when they occurred are information? They are not. They are physical properties and events. They are neither information nor data. A drop of rain falling is not information. When one records, using some language, that a drop of rain fell, or other meteorological recordings, that is information. It requires language. Has the weather, without any intelligence, ever recorded or communicated an abstract, symbolic description of itself? Shallit's comment is willful cluelessness. ScottAndrews
Well, with a name like Shallit that rhymes with...(no need here for elaboration)...it is all quite understandable. JPCollado
Well isn't it obvious how caterpillars become butterflies? It's very simple: Those insects which proceeded to grow into something useful after dissolving into a slurry went on to reproduce, while those that didn't, didn't. Those insects which dissolved after building a cocoon did even better! Don't you see?!? It was Natural Selection which is the very antithesis of randomness!!! Yet another cdesign proponentist "puzzle" instantaneously refuted! You intelligent design creationists have zero understanding of Science! </sarc> Matteo
Well, Shallit managed to trash a workshop he has never taken (and wouldn't need to, if he has tenure at your expense). One taught by an expert in non-tenured survival. If that is what you want, vote for it. O'Leary
Thanks, Denyse. I think the problem with Shallit's weather analogy is not the degree of complexity in the information; it's a full-blown category error. He's conflating data with information. Wind speed, temperature, etc. are data, measurements. Wind has to have a speed. Cloud cover has to have an area. This is necessity. Data and information are commonly used as synonyms, but Meyer clearly does not intend this meaning with his technical use of the word information. Shallit either knows this and uses the common meaning of information as synonymous with data as a deceptive equivocation, or he's just one dumb mathematician. landru
