
Since you asked

I’m generally happy to answer questions from anyone, if I think they’re interesting enough. Recently the following seven questions were brought to my attention. I thought they merited a response, so here goes. The answers given below are my own; readers are free to disagree if they wish.

1. Does a spider web, a bee hive, a mole burrow, a bird nest, a termite mound, or a beaver dam have “biological function”, and do they have “information”?

All of the above structures combine the characteristics of high probabilistic complexity (i.e. it is difficult for natural processes lacking foresight to generate them) and low descriptive complexity (i.e. they are easy to describe in a few words). Hence they all contain complex specified information (CSI). Insofar as they are useful to the creatures that make them, they could also be said to have a function. However, I wouldn’t say that these structures have a “biological function.” Biological function, properly speaking, belongs to organs or systems inside an organism’s body, which enable the organism to perform some useful task.
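For readers who want to see how those two ingredients might be put into numbers, here is a rough Python sketch. It is my own illustration, not Dembski's formal procedure: the chance probability and the word-count proxy for descriptive complexity are assumptions made purely for the example.

import math

def probabilistic_complexity(p_chance):
    # Bits of improbability under the chance hypothesis: -log2 P(T|H)
    return -math.log2(p_chance)

def descriptive_complexity(description):
    # Crude proxy: how many words it takes to describe the pattern
    return len(description.split())

# Illustrative (assumed) numbers for a spider web: very hard for blind
# processes to hit, yet easy to describe in a few words.
p = 1e-60
desc = "spiral orb web of radial and capture threads"
print(probabilistic_complexity(p))   # about 199.3 bits: high probabilistic complexity
print(descriptive_complexity(desc))  # 8 words: low descriptive complexity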

2. Does a tool that is made and used by a bird, a chimpanzee, another non-human primate, any other non-human organism, or a human have “information”, and does it have “biological function”?

Complex specified information, yes. Biological function, no.

3. Does the organism understand and/or generate information when building a nest, web, hive, dam, etc.?

The organism certainly generates complex specified information when building these structures. Does it understand this information? No. It cannot explain and justify its actions. It cannot say why it built these structures this way and not that way, so I’d say it lacks understanding.

4. Does the organism understand and/or generate information when making and using a tool?

Same as for question 3.

5. Apply the same questions to an organism, such as a bird, a non-human primate, or a human, but substitute tools that are not made by the organism. For instance, natural objects that the organism doesn’t modify, but does select and use as a tool.

Owing to their specificity and suitability for a particular job, these natural objects contain a certain amount of complex specified information (in most cases, a small amount). However, no new information is generated here.

6. If there’s information in any of the things I mentioned above (web, hive, dam, nest, tool, etc.) is it “functional complex specified information”?

No. None of the structures in questions 1 to 5 exhibit functional complex specified information, because they are not patterns embodied in structures that enable those structures to perform some function or useful task. Functional complex specified information can, on the other hand, be ascribed to systems in an organism’s body that are biologically useful.

And one more question:

7. When a cephalopod changes its shape, texture, or colors, does it understand and/or generate information (is it functional complex specified information?), and does that change of shape, texture, or colors have biological function?

I’d say this is a genuine case of functional complex specified information. The patterns are inside the organism, and they enable it to perform a biologically useful task.

Recommended reading:
here, here and here.

Comments
Regarding some of the disagreements upthread... I think it’s really important to keep in mind that the representations and protocols that make the transmission of information possible are themselves discrete realities, apart from the information being transferred.Upright BiPed
July 6, 2011 at 12:09 PM PDT
Interesting comments above… particularly the disagreements. Here’s my $.02:
1) A state of matter does not contain information (it is nothing more than a state of matter).
2) Information can exist about that state of matter, but such information requires a mechanism in order to bring it into existence (because it doesn’t exist in the state of matter itself).
3) If such a mechanism brings that information into existence, that information will exist only by a representative arrangement of matter or energy.
4) Matter or energy (which has been so arranged) can be used to transfer information.
5) That transfer requires protocols to establish the relationship between the physical representation and the state of matter it represents.
In other words, a rational distinction is made between a) matter, b) information, and c) matter that has been arranged to be a carrier of information. There is also the implicit acknowledgement that information requires a mechanism in order to exist, that information physically exists only by means of representations, that a receiver of information must access protocols in order to be in-form-ed by receiving information, and that the representations and protocols which make the transfer of information possible are not themselves the product of physical law. I believe this view remains consistent with regard to the three broad realms of information: the informational interaction among all (lower to higher) living things, the transient biological information being generated within living organisms, and the transmission of genetic information.Upright BiPed
July 6, 2011 at 12:08 PM PDT
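A minimal sketch (my own, purely illustrative) of the distinction drawn in the comment above: an arrangement of matter serves as a representation, and a separate protocol is needed before a receiver is in-form-ed by it. Here a toy fragment of the genetic code plays the role of the protocol; the codon string and the table entries are assumptions made only for the example.

representation = "ATGGCT"  # an arrangement of symbols; by itself, just a state of matter

protocol = {   # the relationship that maps the representation to what it represents
    "ATG": "Met",
    "GCT": "Ala",
}

# The "effect" (a decoded message) exists only when the protocol is applied.
codons = [representation[i:i + 3] for i in range(0, len(representation), 3)]
print([protocol[c] for c in codons])  # ['Met', 'Ala']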
--Mung: "From this I’d have to say it’s not likely that he [Meyer] believes Shannon information is a kind of information." P. 108: "So, what kind of information does DNA contain, Shannon information or specified information? Mere complexity or specified complexity? The answer is---both."StephenB
July 6, 2011 at 12:02 PM PDT
I think I see Ilion & Mung’s point (correct me if I’m wrong).
Ilion's obviously further along the road of thought on this than I am. I've only recently begun to really think it through. I liked your post though. I'm not ready (yet) to say that "aboutness" only exists in minds. But maybe I'll get there. Is it possible for God to become informed about something? My initial thought is no. I went through Dembski's 2005 paper again and I didn't find the phrase "complex specified information."Mung
July 6, 2011 at 11:50 AM PDT
Stephen Meyer:
Clearly, we all know that intelligent agents can create specified information and that information comes from minds... its source invariably comes to a mind...
Stephen Meyer:
In our illustration, both Smith and Jones have an equally improbable sequence of ten characters. The chance of getting either sequence at random is the same... Both sequences, therefore, have information-carrying capacity, or Shannon information...Thus, Smith's number contains specified information or functional information, whereas Jones's does not; Smith's number has information content, whereas Jones's number has only information-carrying capacity (or Shannon information).
It's pretty clear that by Shannon information Meyer means information-carrying capacity and not information content. From this I'd have to say it's not likely that he believes Shannon information is a kind of information.Mung
July 6, 2011 at 11:32 AM PDT
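As a back-of-the-envelope illustration of the carrying-capacity point in the comment above, assume (purely for the example) that the Smith/Jones sequences are ten decimal digits, like phone numbers. A short Python sketch:

import math

ALPHABET = 10  # digits 0-9 (assumed for illustration)
LENGTH = 10    # ten-character sequence

capacity_bits = LENGTH * math.log2(ALPHABET)
print(f"{capacity_bits:.1f} bits")  # about 33.2 bits

# Smith's working number and Jones's non-working number both have this same
# information-carrying capacity (the Shannon measure); only Smith's is also specified.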
I think I see Ilion & Mung's point (correct me if I'm wrong). Information references or describes the thing; it isn't an aspect of the thing itself. Since the material world only produces "things", and it doesn't produce references to or descriptions of things, only non-material minds can produce information, because information is always "about" the thing. It might be a materialistic habit to think of "information" in terms of being "transferred" or being "contained" in an object waiting to be uncovered by a mind. It might even be erroneous to consider information something that a mind contains or transfers to another mind (more habit from the material world). But it also seems to me that "information" as used by IDists (such as FSCI) could as easily be renamed FSCM (functionally specified complex mechanism). I'm not saying I agree with it, but I think I see the point being made, and I do find it very interesting.William J. Murray
July 6, 2011 at 11:26 AM PDT
StephenB:
Indeed, you haven’t even provided your own definition of the word “information.” You [and Mung] have only made claims about what it is not.
Sigh. My post @1 in this thread made no claim as to what information is or is not, but only that those items did not contain or generate information. My post @3 directly addresses what information is. Not what it is not. And then again my post @6. See also my links @8. You seem confused. A claim that something does not contain information is not a claim about what information is not. So where have I made claims about what information is not? And even if I had, who cares? Of all the claims I made about what information is not, which ones do you disagree with? And since when is elimination of what a thing or concept is not an occasion for opprobrium?Mung
July 6, 2011 at 10:10 AM PDT
---Mung: "Says the person who accused me of claiming that Stephen Meyer is silly." I will happily acknowledge that you only said that Stephen Meyer embraces a silly idea and did not say that he is a silly person. [“The idea that there are two kinds of information is silly."]StephenB
July 6, 2011 at 10:08 AM PDT
---Mung: "But yes, Meyer in those texts is confused (and confusing). I’d love to sit with him some day and have a chat about it." Who knows, it could happen some day. In any case, we will just have to agree to disagree, I guess. I think Meyer got it right, you think he got it wrong. By the way, how do you define "information"?StephenB
July 6, 2011 at 09:57 AM PDT
StephenB:
You have a funny way of being imprecise at the very time that precision is called for.
Says the person who accused me of claiming that Stephen Meyer is silly. Say it isn't so.Mung
July 6, 2011 at 09:55 AM PDT
Information and entropy are the two sides of the same physical reality (max order=max information, chaos=zero information).
Others will say that a completely random (maximally disordered) sequence contains the most Shannon information. Here's a pattern from Dembski's paper: 0100011011000001010011100101110111 0000000100100011010001010110011110 00100110101011110011011110111100 How much information does it contain? IMO, that's the wrong question to ask. Alternative questions: How much information is required to specify the pattern? How much information would be needed by a search to find that pattern (or one similar to it)?Mung
July 6, 2011 at 09:46 AM PDT
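The disagreement above (does a maximally disordered string carry maximal information, or none?) can be made concrete with a small Python sketch, using only the standard library. The per-symbol frequency entropy is the Shannon-style measure; the zlib-compressed length is a rough stand-in for "how much is needed to specify the pattern". The particular strings are assumptions chosen only for illustration.

import math, random, zlib
from collections import Counter

def entropy_per_symbol(s):
    # First-order Shannon entropy of the symbol frequencies, in bits per symbol
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

ordered = "0" * 256                                            # maximally ordered
disordered = "".join(random.choice("01") for _ in range(256))  # (typically) maximally disordered

for label, s in [("ordered", ordered), ("disordered", disordered)]:
    compressed = len(zlib.compress(s.encode()))
    print(f"{label}: {entropy_per_symbol(s):.2f} bits/symbol, {compressed} bytes to specify")

# The ordered string scores ~0 bits/symbol and compresses to a handful of bytes;
# the disordered one scores ~1 bit/symbol and takes several times more to specify.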
---"Ilion … also, why did that silly person need to put quote-marks around the word ‘contain’ when he (apparently, instinctively) asserted that DNA “contains” information? Because, to have used the word without the quotemarks would have been to say a blatantly false thing, an absurd thing." You have a funny way of being imprecise at the very time that precision is called for. [a] The reason I put the word "contain" in quotes was to dramatize the fact that Mung had insisted, with your approval, that the DNA does NOT contain information. That should be a simple enough concept to grasp. [b] Stephen Meyer does not put the word in quotes, as is obvious from these two paragraphs I cited: “…The bases in DNA and RNA and the sequence of amino acids in proteins do not contain mere Shannon information. Rather these molecules store information that is also functionally specified. As such, they manifest one of the kinds of patterns–a functionally specific pattern–that routinely leads statisticians to reject chance hypotheses, at least in the case of improbable events.” “Thus, in addition to a quanifiable amount of Shannon information (or complexity), DNA also contains information in the sense of Webster’s second definition: it contains alternative sequences or arrangements of something that ‘performs a specific effect.’ Although DNA does not convey information that is received, understood, or used by a conscious mind, it does have information that is received and used by the cell’s machinery to build the structures critical to the maintenance of life. DNA displays a property–functional specificity–that transcends the merely mathematical formalism of Shannon’s theory.” ----"DNA sequences may represent information, but they are not, themselves, information." So you say, but Stephen Meyer obviously disagrees with you. You may choose to declare yourself the more credible authority, but I doubt very much if that declaration will carry much weight. Indeed, you haven't even provided your own definition of the word "information." You [and Mung] have only made claims about what it is not.StephenB
July 6, 2011 at 09:45 AM PDT
StephenB:
I am sure that Stephen Meyer will be happy to submit to your superior judgment and concede that he is, indeed, “silly.”
Well since I didn't say Meyer is silly there's no reason for him to concede such a thing. But yes, Meyer in those texts is confused (and confusing). I'd love to sit with him some day and have a chat about it. I don't just parrot SitC, I read it and engage the argument and develop my own ideas.
"…The bases in DNA and RNA and the sequence of amino acids in proteins do not contain mere Shannon information."
They don't contain Shannon information, period.
"Rather these molecules store information that is also functionally specified."
He has it backwards. It is not the information that is functionally specified. It is information which brings about functional specificity.
"As such, they manifest one of the kinds of patterns–a functionally specific pattern–"
Better. It's the pattern which is specified, not information. According to Dembski, specification is the pattern that signifies intelligence. Why is that?
"Thus, in addition to a quantifiable amount of Shannon information (or complexity), DNA also contains information in the sense of Webster's second definition: it contains alternative sequences or arrangements of something that 'performs a specific effect.'"
No. DNA does not contain information. DNA contains sequences of bases which are specified. It is the sequence of bases which are the effect.
"DNA displays a property–functional specificity–"
The property that it displays is not information content. It's functional specificity. How does functional specificity come about?Mung
July 6, 2011 at 09:21 AM PDT
Eugene S @ 54, I don't have any more time right now to go into it myself ... but, please re-examine what you have said. Can you spot the absurdity of it? If you cannot, I will try to make time later to show it to you.Ilion
July 6, 2011 at 04:58 AM PDT
... also, why did that silly person need to put quote-marks around the word 'contain' when he (apparently, instinctively) asserted that DNA "contains" information? Because, to have used the word without the quotemarks would have been to say a blatantly false thing, an absurd thing. DNA sequences may represent information, but they are not, themselves, information.Ilion
July 6, 2011 at 04:53 AM PDT
"... your instincts are correct. A DNA molecule and a computer program both “contain” information." Instincts, huh? I prefer to get my beliefs from reason.Ilion
July 6, 2011 at 04:50 AM PDT
Ilion:
Information exists ‘within’ — and only ‘within’ — minds. There is no information, whatsoever, “out there” in the physical/material world.
I disagree. Information and entropy are the two sides of the same physical reality (max order=max information, chaos=zero information). These two are concepts that represent objective reality outside of us and consequently do not depend on our knowledge about them.Eugene S
July 6, 2011 at 04:23 AM PDT
Well I don't see a good place to put this link so I'll ask you a question vj; do you think this is why ID isn't taught in high schools? http://www.henrymakow.com/education.html
lamarck
July 6, 2011 at 12:02 AM PDT
PS: SB, Hence:
1 –> 10^120 ~ 2^398
2 –> Following Hartley, we can define Information on a probability metric: I = – log(p) . . . eqn n2
3 –> So, we can re-present the Chi-metric: [where, from Dembski, Specification 2005, Chi = – log2[10^120 · Phi_S(T) · P(T|H)] . . . eqn n1]
Chi = – log2(2^398 * D2 * p) . . . eqn n3
Chi = Ip – (398 + K2) . . . eqn n4
4 –> That is, the Dembski CSI Chi-metric is a measure of Information for samples from a target zone T on the presumption of a chance-dominated process, beyond a threshold of at least 398 bits, covering 10^120 possibilities.
5 –> Where also, K2 is a further increment to the threshold that naturally peaks at about 100 further bits . . . .
6 –> So, the idea of the Dembski metric in the end — debates about peculiarities in derivation notwithstanding — is that if the Hartley-Shannon-derived information measure for items from a hot or target zone in a field of possibilities is beyond 398 – 500 or so bits, it is so deeply isolated that a chance dominated process is maximally unlikely to find it, but of course intelligent agents routinely produce information beyond such a threshold . . . .
As in (using Chi_500 for VJT’s CSI_lite [UPDATE, July 3: and S for a dummy variable that is 1/0 accordingly as the information in I is empirically or otherwise shown to be specific, i.e. from a narrow target zone T, strongly UNREPRESENTATIVE of the bulk of the distribution of possible configurations, W]):
Chi_500 = Ip*S – 500, bits beyond the [solar system resources] threshold . . . eqn n5
Chi_1000 = Ip*S – 1000, bits beyond the observable cosmos, 125 byte/143 ASCII character threshold . . . eqn n6
Chi_1024 = Ip*S – 1024, bits beyond a 2^10, 128 byte/147 ASCII character version of the threshold in n6, with a config space of 1.80*10^308 possibilities, not 1.07*10^301 . . . eqn n6a . . . .
Using Durston’s Fits from his Table 1, in the Dembski style metric of bits beyond the threshold, and simply setting the threshold at 500 bits:
RecA: 242 AA, 832 fits, Chi: 332 bits beyond
SecY: 342 AA, 688 fits, Chi: 188 bits beyond
Corona S2: 445 AA, 1285 fits, Chi: 785 bits beyond . . . results n7
The two metrics are clearly consistent . . . (Think about the cumulative fits metric for the proteins for a cell . . . )
In short one may use the Durston metric as a good measure of the target zone’s actual encoded information content, which Table 1 also conveniently reduces to bits per symbol so we can see how the redundancy affects the information used across the domains of life to achieve a given protein’s function; not just the raw capacity in storage unit bits [= no. of AA's * 4.32 bits/AA on 20 possibilities, as the chain is not particularly constrained.]
--> Onlookers, observe how, since April, MG and ilk -- clearly the source of the above agenda of intended-to-be-loaded questions -- have consistently ducked and misrepresented the above.kairosfocus
July 6, 2011 at 12:02 AM PDT
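For anyone who wants to check the "bits beyond the threshold" arithmetic in the comment above, here is a minimal Python sketch. It assumes the Durston fits values quoted there as Ip and takes S = 1 (the specificity flag set), per eqn n5.

# Chi_500 = Ip*S - 500 (eqn n5), with Ip taken as the Durston fits values quoted above.
durston_fits = {"RecA": 832, "SecY": 688, "Corona S2": 1285}

S = 1            # dummy variable: 1 if independently shown to be specific
THRESHOLD = 500  # solar-system resources threshold, in bits

for protein, ip in durston_fits.items():
    print(f"{protein}: {ip * S - THRESHOLD} bits beyond the threshold")

# RecA: 332, SecY: 188, Corona S2: 785, matching the results n7 figures above.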
F/N: yerondai, tekili mishi, inguni yae kekiri, mezte. Is this a meaningful, functionally specific, complex string? Why or why not? [ANS: Glyphs that are used in meaningful messages are strung together, but these "words" themselves come from no coherent accessible vocabulary, nor is there a structure of rules to govern their use in meaningful, difference making ways. However, they could be used to drive a code in a program, yea, even they could serve as the password for a system, or they could be used as a string that serves as an index on a defined config space that has an associated equally intelligently defined objective function with peaks and valleys and slopes. But the string standing by itself has only the function of holding certain values of glyphs; it gains functionality from being integrated into a complex communication and information processing system and environment that knows the difference between message and noise, and knows instruction from nonsense. Such systems are invariably complex, irreducibly complex, and in our observation of their origin, are designed. So, pardon my following that idiot of intelligent design, Newton in Opticks Query 31, and inferring on induction that the empirically warranted best explanation for such is design, and insisting that I will allow no censorship by controlling materialist metaphysical Lewontinian a prioris. If you want to overturn that inductive conclusion, hostage-takers and ilk, YOU ARE GOING TO HAVE TO PROVIDE CREDIBLE OBSERVED CASES OF BLIND CHANCE AND NECESSITY DESIGNING AND DEVELOPING SUCH SYSTEMS.]kairosfocus
July 5, 2011 at 11:53 PM PDT
Here is another one from Meyer to chew on: "Thus, in addition to a quantifiable amount of Shannon information (or complexity), DNA also contains information in the sense of Webster's second definition: it contains alternative sequences or arrangements of something that 'performs a specific effect.' Although DNA does not convey information that is received, understood, or used by a conscious mind, it does have information that is received and used by the cell's machinery to build the structures critical to the maintenance of life. DNA displays a property--functional specificity--that transcends the merely mathematical formalism of Shannon's theory."StephenB
July 5, 2011 at 11:44 PM PDT
---Mung: "The idea that there are two kinds of information is silly." I am sure that Stephen Meyer will be happy to submit to your superior judgment and concede that he is, indeed, "silly." Since you have given the matter so much more thought than he has, you will certainly have no difficulty explaining why he is misguided in his belief that the information in a DNA molecule actually performs a function. Also, I am sure that you will be happy to explain this sentence: "...The bases in DNA and RNA and the sequence of amino acids in proteins do not contain mere Shannon information. Rather these molecules store information that is also functionally specified. As such, they manifest one of the kinds of patterns--a functionally specific pattern--that routinely leads statisticians to reject chance hypotheses, at least in the case of improbable events."StephenB
July 5, 2011 at 11:12 PM PDT
Grr, yourself, Mung. Did you read those three posts? "... I think this is the way Dembski and others have been moving as well." And, when you get there/here, you'll find me already waiting. I've been here for many years.Ilion
July 5, 2011 at 09:49 PM PDT
It seems to me that one of the central claims of ID theory is that certain structures have information content. That is to say, they "contain" information. I believe I am moving away from that view towards one which speaks of what is required to specify such a structure. I think this is the way Dembski and others have been moving as well.Mung
July 5, 2011 at 09:30 PM PDT
How about saying that information is any "message" that can be encoded by means of a language? For human and animal languages that seems to me to be self-evident. Biological information, which is encoded in DNA, is expressed as life of one kind or another. Life is the "message" in the genetic language. It's easy to get too philosophical, methinks, when information is the subject. We might also consider what is required for language to create information. The laws of identity and non-contradiction make all language possible. Free will, in some sense, is required, since no arrangement of symbols can be explained by physical laws. And purpose/causality is also necessary. Modus tollens. If I did not intend to be communicating, I would not be communicating. But I am communicating. Therefore, I do intend to communicate. None of these things (Reason, "local" language rules and symbols, free will, and purpose) can be explained given the explanatory resources of naturalism (the laws of physics). This is fascinating to me. As it turns out, The Word, The Logos, defeats the nonsense of naturalism decisively and completely. And every time a naturalist/materialist utters a single word they put the lie to their world view. This is not that difficult.tgpeeler
July 5, 2011 at 09:16 PM PDT
The idea that there are two kinds of information is silly.Mung
July 5, 2011 at 09:14 PM PDT
That should read, "Our minds can apprehend it, of course, but they are not [always] responsible for it."StephenB
July 5, 2011 at 09:13 PM PDT
mike 1962, your instincts are correct. A DNA molecule and a computer program both "contain" information. Further, the latter contains two kinds of information, Shannon Information and Specified information. According to Stephen Meyer, Webster defines "information" in at least two important ways: [a] "The communication or reception of knowledge or intelligence." [b] "The attribute inherent in and communicated by alternative sequences and arrangements of something that produces specific effects." Type [b] information exists independently of our ability to know anything about it. Our minds can apprehend it, of course, but our minds are not responsible for it. As Meyer puts it, "The DNA contains "alternative sequences of nucleotide bases and can produce a specific effect."StephenB
July 5, 2011 at 09:08 PM PDT
grr... Ilion.Mung
July 5, 2011 at 08:39 PM PDT
The etymology of the word "information" comes from the idea of "shaping" a mind.
Mung
July 5, 2011 at 08:37 PM PDT