Uncommon Descent Serving The Intelligent Design Community

On “Specified Complexity,” Orgel and Dembski


Bill Dembski often uses the term “specified complexity” to denote a characteristic of patterns that are best explained by the act of an intelligent designer. He defines the term as follows:

What is specified complexity? An object, event, or structure exhibits specified complexity if it is both complex (i.e., one of many live possibilities) and specified (i.e., displays an independently given pattern). A long sequence of randomly strewn Scrabble pieces is complex without being specified. A short sequence spelling the word “the” is specified without being complex. A sequence corresponding to a Shakespearean sonnet is both complex and specified.

William A. Dembski, No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence (Lanham: Rowman & Littlefield, 2002), xiii.

 

Dembski does not claim to have originated the concept of specified complexity:

The term specified complexity is about thirty years old. To my knowledge origin-of-life researcher Leslie Orgel was the first to use it. In his 1973 book The Origins of Life he wrote: “Living organisms are distinguished by their specified complexity. Crystals such as granite fail to qualify as living because they lack complexity; mixtures of random polymers fail to qualify because they lack specificity” (189). More recently, Paul Davies (1999, 112) identified specified complexity as the key to resolving the problem of life’s origin: “Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity.”

The Logical Underpinnings of Intelligent Design

Is there a relationship between Leslie Orgel’s use of the term and Dembski’s? Yes. Dembski explains the relationship as follows:

Neither Orgel nor Davies, however, provided a precise analytic account of specified complexity. I provide such an account in The Design Inference (1998b) and its sequel No Free Lunch (2002). In this section I want briefly to outline my work on specified complexity. Orgel and Davies used specified complexity loosely. I’ve formalized it as a statistical criterion for identifying the effects of intelligence.

Id.

In summary, Orgel and Davies used the concept of specified complexity loosely. Dembski takes the concept they used loosely and formalizes it. One must be willfully obtuse, however, to fail to see the connection between the way Dembski uses the term and the way Orgel uses the term.

Dembski:

A long sequence of randomly strewn Scrabble pieces is complex without being specified.
A short sequence spelling the word “the” is specified without being complex.
A sequence corresponding to a Shakespearean sonnet is both complex and specified.

Orgel:

Mixtures of random polymers are complex without being specified.
Crystals such as granite are specified without being complex.
Living organisms are both complex and specified.

Yes, Orgel used the term more loosely than Dembski, but they are talking about the same concept. That is why Dembski repeatedly connects the term with Orgel and Davies in No Free Lunch.

When intelligent agents act, they leave behind a characteristic trademark or signature - what I define as specified complexity. [FN13] The complexity-specification criterion detects design by identifying this trademark of designed objects.
No Free Lunch, 6
[FN13]: The term “specified complexity” goes back at least to 1973, when Leslie Orgel used it in connection with origins-of-life research: “Living organisms are distinguished by their specified complexity. Crystals such as granite fail to qualify as living because they lack complexity; mixtures of random polymers fail to qualify because they lack specificity.” See Orgel, The Origins of Life (New York: Wiley, 1973), 189. The challenge of specified complexity to nonteleological accounts of life’s origin continues to loom large. Thus according to Paul Davies, “Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity.” See Paul Davies, The Fifth Miracle (New York: Simon & Schuster, 1999), 112.

And

The central problem of biology is therefore not simply the origin of information but the origin of complex specified information. Paul Davies emphasized this point in his recent book The Fifth Miracle where he summarizes the current state of origin-of-life research: “Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity.” The problem of specified complexity has dogged origin-of-life research now for decades. Leslie Orgel recognized the problem in the early 1970s: “Living organisms are distinguished by their specified complexity. Crystals such as granite fail to qualify as living because they lack complexity; mixtures of random polymers fail to qualify because they lack specificity.” [FN33]
No Free Lunch, 149
[FN33]: Leslie Orgel, The Origins of Life (New York: Wiley, 1973), 189.

And

In The Fifth Miracle Davies goes so far as to suggest that any laws capable of explaining the origin of life must be radically different from any scientific laws known to date. [FN3] The problem, as he sees it, with currently known scientific laws, like the laws of chemistry and physics, is that they cannot explain the key feature of life that needs to be explained. That feature is specified complexity. As Davies puts it: “Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity.” [FN5]
No Free Lunch, 180
[FN5] Davies, Fifth Miracle, 112. Consider also the following claim by Leslie Orgel: “Living organisms are distinguished by their specified complexity. Crystals such as granite fail to qualify as living because they lack complexity; mixtures of random polymers fail to qualify because they lack specificity.” In Leslie Orgel, The Origins of Life (New York: John Wiley, 1973), 189.

And

The term “specified complexity” has been in use for about thirty years. The first reference to it with which I am familiar is from Leslie Orgel’s 1973 book The Origins of Life, where specified complexity is treated as a feature of biological systems distinct from inorganic systems. [FN35]
No Free Lunch, 328-29.
[FN35] Leslie Orgel, The Origins of Life (New York: Wiley, 1973), 189.

UPDATE (HT to Mung):

Orgel on Specified Complexity

Crystals are usually taken as the prototypes of simple well specified structures… Lumps of granite or random mixtures of polymers are examples of structures which are complex but not specified.

p. 189

Wait for it …

These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. One can see intuitively that many instructions are needed to specify a complex structure. On the other hand a simple repeating structure can be specified in rather few instructions. Complex but random structures, by definition, need hardly be specified at all.

– p. 190
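Orgel’s “minimum number of instructions” gloss can be illustrated informally with a compressor as a crude stand-in for an instruction count: a simple repeating structure compresses to almost nothing, while a random structure barely compresses at all. This is only a sketch of the intuition, assuming zlib’s compressed size as a rough proxy; it is not Orgel’s own measure.

```python
import random
import string
import zlib

def approx_description_length(s: str) -> int:
    # Crude proxy for Orgel's "minimum number of instructions needed
    # to specify the structure": the zlib-compressed size in bytes.
    return len(zlib.compress(s.encode()))

n = 10_000
repeating = "ABAB" * (n // 4)  # simple repeating structure (crystal-like)

random.seed(0)
rand = "".join(random.choice(string.ascii_uppercase) for _ in range(n))

print(approx_description_length(repeating))  # tiny: few "instructions" suffice
print(approx_description_length(rand))       # near n: barely compressible
```

A repeating structure needs only a short recipe (“print AB five thousand times”), while a random one is, in effect, its own shortest description, which is Orgel’s point on p. 190.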

A final nail:

Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes.

– p. 196

Comments
kairosfocus said: "Did you notice that to date AFAIK none of the more stringent objectors of recent weeks has actually admitted that FSCO/I exists as a real characteristic of anything?"

Why should anyone admit that? FSCO/I is a label you made up for some imaginary thing that you can't verify, especially in life forms. You're the only one who uses the term "FSCO/I". Not even your fellow IDers use it.

"That, when confronted with something as simple and direct as an Abu 6500 C3 fishing reel to date they have studiously avoided it apart from one objector who had the bright idea to say well it has gears in it and we know of only one case of gears in the world of life. Actually, just one case of seriously properly meshed gears should be in itself a wake up call."

Wake-up call for what? Fishing reels are already known to be designed, and just because something in a life form can be described by some humans as 'gears' doesn't mean that they are gears that were deliberately designed, created, machined, and assembled along with other parts by a supernatural god. A spine from a fish hook cactus looks similar in shape to a man-made metal fish hook, but that doesn't mean that fish hook cactuses are designed by man-made metal fish hook designers or by a supernatural god.

"But the wider manifestations of FSCO/I are all around us — think, wiring diagram style node arc linkages and organisation that depends on specific configuration to achieve function — literally (think PC screen and the wider PC not to mention the data strings, programs, keyboards, track pads etc etc) staring us in the face."

PC screens, PCs, PC data strings, PC programs, keyboards, track pads, etc., are already known to be designed, and your reference to "wiring diagram style node arc linkages and organisation that depends on specific configuration to achieve function" is apparently in regard to computers and/or computer networks, which are also already known to be designed.

Everything is a machine to you, isn't it? And 'the designer' is a machinist, mechanic, foundry worker, electrician, computer assembler/programmer, plumber, and construction worker, right? How does 'the designer' find the time to save souls? Pachyaena
Joe blurted: "Tamara, We love your bald assertions, false accusations and innuendos. Do you really think they help you? And don't you get sick of doing that and then getting proven wrong? Or are you so pathological that it doesn't bother you?" Dang, it's going to cost a lot of money to replace all of the shattered irony meters throughout the multiverse. Pachyaena
Tamara, We love your bald assertions, false accusations and innuendos. Do you really think they help you? And don't you get sick of doing that and then getting proven wrong? Or are you so pathological that it doesn't bother you? In "Signature in the Cell" and all other ID writings, it is clear that we use information in the standard and accepted way: "b : the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects". Notice how that also captures Crick's definition of biological information, which we use when discussing biology. Joe
Silver Asiatic: What's your definition? We're more than happy to discuss any clearly defined term, such as Shannon information. As for specified complexity, the problem is the changeable nature of its calculation. Here is Dembski's formula: chi = -log2[ 10^120 * phi_S(T) * P(T|H) ] If we take P(T|H) to refer to a standard probability distribution, then it seemingly leads to false positives. If not, then P(T|H) may be intractable, and if it also lacks independence from phi_S, it can result in circularity. Zachriel
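For concreteness, the calculation under discussion can be sketched in a few lines. This is a minimal illustration of Dembski's chi, using the 10^120 probabilistic-resources factor from his 2005 paper "Specification: The Pattern That Signifies Intelligence"; the chance hypothesis (uniform over 27 symbols) and the specification count phi_S(T) = 10^5 are illustrative assumptions, not measured values.

```python
import math

def chi(phi_s: float, p_t_given_h: float) -> float:
    # Specified complexity in Dembski's 2005 form:
    #   chi = -log2(10^120 * phi_S(T) * P(T|H))
    # where 10^120 is Dembski's bound on the bit operations available
    # in the observable universe. chi > 1 is his design threshold.
    return -math.log2(10**120 * phi_s * p_t_given_h)

# Hypothetical illustration: a 100-character sequence over a 27-symbol
# alphabet (26 letters plus space), uniform chance hypothesis, and an
# assumed specification count phi_S(T) = 10**5.
p_long = 27.0 ** -100
print(chi(1e5, p_long))   # roughly 60: clears the chi > 1 threshold

p_short = 27.0 ** -10     # a 10-character sequence
print(chi(1e5, p_short))  # negative: far too probable to infer design
```

Everything turns on how P(T|H) and phi_S(T) are chosen, which is exactly the objection raised in the comment above.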
Z 110
Which is why you have to use a consistent definition of information.
What's your definition? Silver Asiatic
We do have a consistent definition of information.
Don't do yourself down Joe, tell it like it is. You have many consistent definitions of information, lots of subtle and not so subtle variations to make "information" a word with which to win any argument. Tamara Knight
We do have a consistent definition of information. Joe
Silver Asiatic: Again, there doesn’t appear to be any evidence that clouds generate sequential data which is received, translated and operationalized by rain, but I’m open to that possibility. Which is why you have to use a consistent definition of information. In one place you define it as requiring a sender and receiver; in another has in relation to a function. On top of that, it's used differently by different IDers, even on the same thread. Meanwhile, we still haven't been able to get a clear formula for specified complexity. Zachriel
I don't know, Silver Asiatic - just plan an outdoor event - washing and polishing the car is a good one - and the clouds and rain seem to coordinate to disrupt your plans. ;) Joe
Joe: Basically Shannon only told us about information carrying capacity. Tamara Knight: But I’ve never claimed anything else And yet we have you saying: Tamara Knight: Shannon is all about communication not meaning. Can you please explain how it is that you think your second claim isn't any different from what Joe said?
I’m glad we’ve cleared that up now and agree on at least one key issue. Let’s hope with your help Mung can see the light too.
Just what is it, precisely, that you think I need to "see the light" on? Tamara Knight:
And how would that being true affect the conclusion of the previous discussions? Shannon’s long gone, and his definition of the term can’t be undone. Would renaming it “Mung’s nemesis” make it more or less like CSI?
You're the one who thrives on making up terms. You want to call it "Mung's Nemesis" no one here can stop you. You want to call it "Dembski Information " no one here can stop you. Mung
R0bb. I guess you missed my response @78 to your question.
These statements have no references attached.
So? Mung
Mung @ 24 shares an interesting quote from the Wikipedia entry on "Complexity":
In physical systems, complexity is a measure of the probability of the state vector of the system.
There is also a similar sentence in the entry on "Physical System":
The complexity of a physical system is equal to the probability of it being in a particular state vector.
These statements have no references attached. The former was written by an anonymous contributor in 2004, and the latter was merged in from a now-nonexistent article, so it can't be tracked. I've never heard of a field called "physical systems". Has anyone seen the term "complexity" used this way in thermodynamics or statistical mechanics? Any examples would be appreciated. Edit: Added spaces in "Mung @ 24" so it wouldn't be treated as an email address. R0bb
It’s not a sender and receiver of information as you stated above. If it is a functional relationship, then there is a functional relationship between clouds and rain.
Again, there doesn't appear to be any evidence that clouds generate sequential data which is received, translated and operationalized by rain, but I'm open to that possibility. Silver Asiatic
Silver Asiatic: There is a relationship between the generation of information and the function. It's not a sender and receiver of information as you stated above. If it is a functional relationship, then there is a functional relationship between clouds and rain. Silver Asiatic: Did the quoted text in post 95 help? We responded in 98. Zachriel
Joe,
What if no one knew what “intuition” meant? Would we still use it [intuition]? :) How would we know? How could we convey that to others if there wasn’t any meaning in our communications?
LOL. I wonder why we don't have numerous posts questioning what is meant by intuition? [No, I don't really wonder - I know why. :-)] Silver Asiatic
Zachriel
nothing to do with transmitting information between a sender and receiver. Notably, the measure is in reference to a particular function
There is a relationship between the generation of information and the function.
We don’t need a definition of information. There was never a need for “A Mathematical Theory of Communication”. We can just use intuition.
Did the quoted text in post 95 help? Silver Asiatic
Joe
Clouds provide the data that we use to turn into information.
Exactly. We can extract data from any observable thing and then create information out of it. That is different from something that generates and communicates information to a receiver as in biological functions. Silver Asiatic
What if no one knew what "intuition" meant? Would we still use it [intuition]? :) How would we know? How could we convey that to others if there wasn't any meaning in our communications? Joe
Clouds provide the data that we use to turn into information. Joe
Zachriel: If we see clouds and infer rain, does that mean the clouds contain information about rain?

Silver Asiatic: There is sender, code, medium, receiver, translation and operation - among other things in an informational relationship.

So you're saying clouds don't give us information about rain?

Silver Asiatic: Durston did some nice peer-reviewed work on the nature of information. You might be interested in this: http://www.tbiomed.com/content/4/1/47

They *define* a measure of information which has nothing to do with transmitting information between a sender and receiver. Notably, the measure is in reference to a particular function, and more important, to a degree of function. Change the degree of function, and the measure changes. Nor can they reasonably calculate how many sequences have the function, so they have a problem on that side of the equation too.

Silver Asiatic: I don't know why it's a problem. Definition disorder?

We don't need a definition of information. There was never a need for "A Mathematical Theory of Communication". We can just use intuition. Zachriel
Joe
And most people, including children, know what the word means. Why are you having such difficulty? Perhaps you should try a dictionary…
I don't know why it's a problem. Definition disorder? There's a Wiki site about information which includes the paragraph I quoted from one SCIENTIST who argues that information requires a relationship. Still looking for the evolutionary origins of information networks. Random symbols that have no meaning for the sender are sent for no reason to nobody. They are received randomly by receivers who don't know or care about what the signal means - since it means nothing. But accidentally, the signal and receiver randomly arrive at the same meaning of the randomly generated signal. Add billions of years - and we have the internet. Silver Asiatic
Joe @ 94
Only to people with knowledge of such things. When I see tracks in the snow those tracks contain information. And today that information led me to where my dog had wandered to.
Right. You extracted information from raw data. But you could also infer that there was some intelligence and that the tracks may have been communicating something. There are specified patterns (the image matching dog prints, the direction of the steps, the even spacing). But there's very limited complexity and no apparent coding or communication network or function observable. The paw-prints communicate enough information on their own that some kind of probability study could be done to determine if they were caused randomly or by natural forces. Silver Asiatic
If we see clouds and infer rain, does that mean the clouds contain information about rain?
Information is communicative. There is sender, code, medium, receiver, translation and operation - among other things in an informational relationship. There's no evidence that clouds communicate something to rain and that rain receives, processes and acts on that information, but I'm open to the possibility that it does work that way. As mentioned above, there's a large area of scientific research and analysis on information. This area of study is not limited to ID research. Durston did some nice peer-reviewed work on the nature of information. You might be interested in this: http://www.tbiomed.com/content/4/1/47 From the text:
We have extended Shannon uncertainty by incorporating the data variable with a functionality variable.
This goes beyond Shannon's ideas.
Abel and Trevors have delineated three qualitative aspects of linear digital sequence complexity [2,3], Random Sequence Complexity (RSC), Ordered Sequence Complexity (OSC) and Functional Sequence Complexity (FSC). RSC corresponds to stochastic ensembles with minimal physicochemical bias and little or no tendency toward functional free-energy binding. OSC is usually patterned either by the natural regularities described by physical laws or by statistically weighted means. For example, a physico-chemical self-ordering tendency creates redundant patterns such as highly-patterned polysaccharides and the polyadenosines adsorbed onto montmorillonite [4]. Repeating motifs, with or without biofunction, result in observed OSC in nucleic acid sequences. The redundancy in OSC can, in principle, be compressed by an algorithm shorter than the sequence itself. As Abel and Trevors have pointed out, neither RSC nor OSC, or any combination of the two, is sufficient to describe the functional complexity observed in living organisms, for neither includes the additional dimension of functionality, which is essential for life [5]. FSC includes the dimension of functionality [2,3]. Szostak [6] argued that neither Shannon's original measure of uncertainty [7] nor the measure of algorithmic complexity [8] are sufficient. Shannon's classical information theory does not consider the meaning, or function, of a message. Algorithmic complexity fails to account for the observation that 'different molecular structures may be functionally equivalent'. For this reason, Szostak suggested that a new measure of information–functional information–is required [6]. Chiu, Wong, and Cheung also discussed the insufficiency of Shannon uncertainty [9,10] when applied to measuring outcomes of variables. The differences between RSC, OSC and FSC in living organisms are necessary and useful in describing biosequences of living organisms.
Note: Abel and Trevors delineate three kinds of information sequencing - Random, Ordered and Functional. Note: Shannon uncertainty is insufficient in describing the functional information in biology. Szostak proposed different measures. The bolded text just affirms what Joe has been arguing elsewhere in this thread. Discussion of Shannon uncertainty (it's not information) as if that's the only or most significant measure we can use is irrelevant and incorrect. Silver Asiatic
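The "functional information" that Szostak proposes in the excerpt above has a simple quantitative form: I = -log2 of the fraction of sequences that achieve the function at a given level. A toy sketch of that definition, with assumed (not measured) numbers:

```python
import math

def functional_information(n_functional: int, n_total: int) -> float:
    # Szostak-style functional information: I = -log2(F), where F is
    # the fraction of all sequences achieving the function at or above
    # the chosen activity threshold.
    if n_functional <= 0:
        raise ValueError("no functional sequences: I is undefined")
    return -math.log2(n_functional / n_total)

# Toy illustration with assumed numbers: suppose 1 in 2**20 random
# RNA 30-mers binds a target above some threshold.
total = 4 ** 30              # all RNA 30-mers
functional = total // 2**20  # assumed count of functional sequences
print(functional_information(functional, total))  # -> 20.0 bits
```

Note that the measure depends on the chosen activity threshold, which is the "degree of function" issue raised elsewhere in this thread: change the threshold and the count of functional sequences, and the number of bits changes with it.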
You keep using that word, information.
And most people, including children, know what the word means. Why are you having such difficulty? Perhaps you should try a dictionary...
If we see clouds and infer rain, does that mean the clouds contain information about rain?
Only to people with knowledge of such things. When I see tracks in the snow those tracks contain information. And today that information led me to where my dog had wandered to. Joe
Tamara, I use the words in the standard and well accepted ways. I have no idea what definitions you are using in order to spew the accusations that you spew.
You are clearly far more interested in playing word games than in establishing the veracity of any underlying scientific hypotheses.
Nice projection. Look you ate it on natural selection and fitness and you want to try to blame me. The posts are there for all to read so go ahead and cry.
You constantly claim science is being held back by the refusal of the mainstream to embrace ID.
Mainstream hasn't given us any insight as to what makes an organism what it is. Mainstream would benefit from "The Privileged Planet" when it comes to trying to find other intelligent beings.
Sure, KNOWING what a designer did and WHEN and HOW it did it would be an immense boost to science, but you claim ID has not the slightest interest in determining that.
You are confused, as usual. ID isn't about that but ID doesn't prevent anyone from trying to find answers to those. However that is like asking that newly found Amazon tribe to tell us how computers are made. But anyway we are still waiting for the demonstration of communicating without meaning. Any time you are ready... Joe
ME: Together we can probably manage to correct Mung’s misunderstanding about the equivalence of CSI and Shannon information. Joe: Shannon information is a misnomer.
And how would that being true affect the conclusion of the previous discussions? Shannon's long gone, and his definition of the term can't be undone. Would renaming it "Mung's nemesis" make it more or less like CSI?
But I still want to hear about this communication without meaning. If you can demonstrate that I will be very impressed.
Whoa! Clever boy! If I had seen that coming I would have posted something suggesting that "meaning" is relevant only in the sense that the receiver receives exactly what the sender sent, nothing more. Why didn't I do that? Oh, wait a minute....

So yes Joe, it is perfectly possible to define meaning and communication in such a way that you can trivially but truthfully post "You cannot have communication without meaning." as whatever the blog equivalent of a sound bite is. It is equally possible to define them so as to be able to say "You can have communication without meaning." Which is why scientific progress depends on agreed definitions.

You are clearly far more interested in playing word games than in establishing the veracity of any underlying scientific hypotheses. You constantly claim science is being held back by the refusal of the mainstream to embrace ID. But what I regard as a coffee-break challenge, you must regard as your daily grind. You must have spent thousands of hours here playing word games, but can you link me to a single post where you discuss how ID acceptance would affect the course of scientific discovery? Say, for example, genetic engineering to improve crops.

Sure, KNOWING what a designer did and WHEN and HOW it did it would be an immense boost to science, but you claim ID has not the slightest interest in determining that. What would be your career advice if I were a newly graduated eager young biologist and ID advocate looking for discoveries to feed the world? "Luv, you've got to just pray to the designer, because trust me, it's all too complex for a pretty little girl like you to ever understand" perhaps? Tamara Knight
Silver Asiatic: We know that informational relationships can be built by design. You keep using that word, information. If we see clouds and infer rain, does that mean the clouds contain information about rain? Zachriel
You cannot have communication without meaning. Joe
Tamara Knight:
Together we can probably manage to correct Mung’s misunderstanding about the equivalence of CSI and Shannon information.
Shannon information is a misnomer. But I still want to hear about this communication without meaning. If you can demonstrate that I will be very impressed. Joe
Mung, remember this? https://uncommondesc.wpengine.com/intelligent-design/new-user-feature-at-ud/#comment-532110

When you posted "Why is it only for new users?" in response to a post titled "New User Feature at UD", I thought it was a rather feeble attempt at a joke. But extracting the meaning from communication seems to be an area where you have problems. I now see that you must have genuinely parsed that title as "new-user feature" rather than the obvious "new user-feature".

Please explain how you think Joe's (undisputed) statement "Shannon only told us about information carrying capacity." is even at odds with mine claiming "Shannon is all about communication not meaning", let alone proving a lie. As Joe correctly explained, Shannon's work defined the information carrying capacity of a communication channel. If the concept of "meaning" occurs anywhere in his work, it is only in the sense that the receiver gets exactly what the sender intended, nothing more.

Regardless of whether or not anything else you are saying has merit, you can't load your extended definition of information onto Shannon's concepts and expect his metric to still be relevant. Joe gets it. Perhaps he can explain it to you using concepts you are more familiar with. Tamara Knight
Joe: Basically Shannon only told us about information carrying capacity. Tamara Knight: But I’ve never claimed anything else That's a lie. You said: "Shannon is all about communication not meaning" Mung
Z
That’s all well and good, but the claim is that quantified specified complexity is a signature of design.
We know that informational relationships can be built by design. I don't think we know of any other source for the origin of such. Silver Asiatic
Zachriel @ 82 That was an interesting bio - thanks. Silver Asiatic
Silver Asiatic: It’s an observation of a relationship. That's all well and good, but the claim is that quantified specified complexity is a signature of design. Zachriel
Zachriel
We provided Dembski’s definition.
This OP offered definitions by Dembski. Wikipedia offers an extensive explanation of what information is:
As representation and complexity

The cognitive scientist and applied mathematician Ronaldo Vigo argues that information is a concept that involves at least two related entities in order to make quantitative sense. These are, any dimensionally defined category of objects S, and any of its subsets R. R, in essence, is a representation of S, or, in other words, conveys representational (and hence, conceptual) information about S. Vigo then defines the amount of information that R conveys about S as the rate of change in the complexity of S whenever the objects in R are removed from S. Under "Vigo information", pattern, invariance, complexity, representation, and information—five fundamental constructs of universal science—are unified under a novel mathematical framework.[4][5] Among other things, the framework aims to overcome the limitations of Shannon-Weaver information when attempting to characterize and measure subjective information.
The above is interesting: "two related entities". It's an observation of a relationship. If you'd like to explain the evolutionary origin of informational relationships, I would find that interesting. Science recognizes that information exists and has certain characteristics. This subject matter is not reserved to ID researchers alone. Silver Asiatic
Silver Asiatic: I don’t think there needs to be a simple answer to the problem of information to recognize it and to draw reasonable inferences from what we observe. Which is why Claude Shannon is considered the father of information theory. Shannon laid one of the fundamental building blocks of the Information Age. http://www.corp.att.com/attlabs/reputation/timeline/16shannon.html Zachriel
Zachriel
Nor is there likely a simple answer to the problem of information.
I don't think there needs to be a simple answer to the problem of information to recognize it and to draw reasonable inferences from what we observe. Silver Asiatic
Mung
I’m thinking Orgel would call it a simple well-specified structure, like a crystal.
Would we say that, unlike a crystal, coins are not known to line up with all heads facing like that on their own, whereas crystals do display such patterns? In the same way, we know that trees grow vertically. But if we find a log standing vertically, there's something different in that case. Silver Asiatic
Joe: Basically Shannon only told us about information carrying capacity.
But I've never claimed anything else, unless you want to back out of that position by claiming CSI can be carried without using up Shannon bandwidth. However it is good to have you on board. Together we can probably manage to correct Mung's misunderstanding about the equivalence of CSI and Shannon information. Tamara Knight
R0bb:
If we have 500 coins lined up in a repeating pattern of all heads, is that an instance of specified complexity, according to Orgel’s usage of the term?
I'm thinking Orgel would call it a simple well-specified structure, like a crystal. Are they two-headed coins? Mung
Mung- It is all the rage, ie communication without meaning. And I am pretty sure that is how Washington DC politics are carried out. :cool: And if you read Zachriel, keith and Tamara then you know that they live by the concept. Joe
CSI wrt biology is all about biological specification which pertains to function (Dembski, "No Free Lunch"). It is the same as biological information as defined by Crick. Joe
Tamara- try communicating by using a meaningless combination of grunts and random typing or words that you made up. Shannon was all about making sure what was transmitted is what was received. The machines used don't care about what the messages meant. That was Shannon's point. To a machine sending/receiving gibberish is the same as sending/receiving detailed instructions. Basically Shannon only told us about information carrying capacity. Joe
Silver Asiatic quoting Mung quoting Orgel: Roughly speaking, Orgel seems to be referring to something like Kolmogorov complexity, but not quite, because the shortest description of a random sequence is the sequence itself. It's a qualitative definition for the purpose of discussion. That's what Dembski's formula attempts to make precise. Unless you can calculate specified complexity, you probably can't reach any firm conclusions based on specified complexity. Nor is there likely a simple answer to the problem of information. Zachriel
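[Ed. note: Orgel's "minimum number of instructions" is close in spirit to Kolmogorov complexity, which is uncomputable in general, but compressed size gives a rough, computable proxy for discussion purposes. A minimal sketch, assuming zlib compression as the stand-in; the sequence lengths and alphabets below are arbitrary choices, not anything from Orgel or Dembski.]

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Bytes of zlib output: a crude stand-in for Orgel's
    'minimum number of instructions needed to specify the structure'."""
    return len(zlib.compress(s.encode("utf-8"), 9))

random.seed(0)
n = 10_000
repetitive = "AB" * (n // 2)                                   # simple repeating structure
random_seq = "".join(random.choice("ACGT") for _ in range(n))  # 'random polymer'

# The repeating structure needs few 'instructions'; the random one
# is essentially incompressible by this measure.
print(compressed_size(repetitive), compressed_size(random_seq))
```

On this proxy the crystal-like sequence scores very low and the random one very high, which is exactly why, as Zachriel notes, Orgel's notion is not quite Kolmogorov complexity: for Orgel, complex but random structures "need hardly be specified at all."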
Shannon is all about communication not meaning.
Ah, the secret of the ID critique is revealed. It's communication without meaning. That explains so much. Thank you. Mung
Zachriel: William, as an information theorist, claims he can determine design from the pattern alone. Is he wrong? Silver Asiatic: It sounds like a guessing game … it’s best to be as specific as possible when asking questions like that Not sure what is ambiguous about the question. Silver Asiatic: But anyway, note my response to Robb in #51. Do you agree or not? There's no way to know because you keep using the term CSI without providing a definition. We provided Dembski's definition. Silver Asiatic: If you’re hoping to trick me somehow, then you win. We just want to know what you mean by "CSI" and "information". If you can't provide a precise definition, then it's not clear your position is supportable. Zachriel
Joe @ 65 - thanks, whenever it's convenient. I think it would be an important thing to post not only for me. Silver Asiatic
Zachriel - you could look at this ... Mung #40 quoting Orgel on Information:
These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. One can see intuitively that many instructions are needed to specify a complex structure. On the other hand a simple repeating structure can be specified in rather few instructions. Complex but random structures, by definition, need hardly be specified at all. – p. 190
Does that help? Silver Asiatic
No,
I'm glad we agree with what I said: "Information science is not limited to Shannon information as sole means of identifying or defining information."
but you’re the one who said it was the same definition as used in communications.
If you're hoping to trick me somehow, then you win. Congratulations. Silver Asiatic
Z
William, as an information theorist, claims he can determine design from the pattern alone. Is he wrong?
It sounds like a guessing game ... it's best to be as specific as possible when asking questions like that, but anyway, note my response to Robb in #51. Do you agree or not? Silver Asiatic
Joe @ 64
To make it unequivocally clear what we mean by information. Shannon didn't care about meaning whereas CSI is all about meaning or function. And communication would be impossible without meaning.
Right. Information has certain characteristics. When we observe a high degree of specificity and of complexity oriented towards meaning and function, and also observe sender, code, translation and receiver - as with DNA coding - we call that CSI (for lack of a better term) to distinguish it from Shannon information. Silver Asiatic
Joe: To make it unequivocally clear what we mean by information. Shannon didn't care about meaning whereas CSI is all about meaning or function.
Having just spotted your comment at the top of the recent posts list, it seems there has been a similar discussion here to the one over on the Dembski thread. I'm glad we've cleared that up now and agree on at least one key issue. Let's hope with your help Mung can see the light too. However, you do rather muddy the waters again with your qualifier:
And communication would be impossible without meaning.
since Shannon is all about communication, not meaning. And then when told
Shannon information is highest for a random sequence,
you reply
Not necessarily.
Which is trivially true, because the set of random sequences of any given length necessarily includes any strings which you claim contain CSI, whether or not that CSI is at a level you consider too unlikely to occur by natural means. Makes a good smokescreen though, because I'm sure a clever chap like you must know that the sums of the Shannon information in N m-bit random strings will be statistically higher than the sums of the Shannon information in N m-bit randomly chosen CSI-laden strings. EDIT: Excellent enhancement. I don't know why that link did not work, but I can't edit it to point to: https://uncommondesc.wpengine.com/informatics/how-is-bill-dembskis-being-as-communion-doing/#comment-528353 Tamara Knight
Silver Asiatic- I will look at NFL tonight when the painting is done for the day. I have the snow blower ready so that won't be a worry for later. Joe
Why call it CSI if it is just the same definition of information used in communications, which is Shannon Information?
To make it unequivocally clear what we mean by information. Shannon didn't care about meaning whereas CSI is all about meaning or function. And communication would be impossible without meaning.
Shannon information is highest for a random sequence,
Not necessarily. Joe
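[Ed. note: the disputed claim here is checkable numerically. The Shannon measure depends only on symbol statistics, not meaning, and for a fixed alphabet it is maximized by a uniformly random source. A minimal sketch, with arbitrary sample texts of my own choosing:]

```python
import math
import random
from collections import Counter

def entropy_per_symbol(s: str) -> float:
    """Empirical Shannon entropy H = -sum(p * log2(p)), in bits per
    symbol. It measures uncertainty of the symbol stream, not meaning."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

random.seed(1)
alphabet = "abcdefghijklmnopqrstuvwxyz "
random_text = "".join(random.choice(alphabet) for _ in range(5000))
repeating = "the " * 1250
english = ("to be or not to be that is the question "
           "whether tis nobler in the mind to suffer ") * 60

# random > meaningful English > repeating, on the Shannon measure
print(round(entropy_per_symbol(random_text), 2),
      round(entropy_per_symbol(english), 2),
      round(entropy_per_symbol(repeating), 2))
```

The random text approaches log2(27) ≈ 4.75 bits per symbol, meaningful English scores lower (its letter frequencies are skewed), and the repeating "the " lowest of all: this is the precise sense in which a random sequence carries the most Shannon information, whatever one thinks that implies about CSI.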
Silver Asiatic: Why call it “natural selection” if nothing is actually being selected? In this case, it's a scientific term chosen because the effects are hypothesized to be similar to artificial selection. That's a different question than asking why you call information CSI if it is the same definition as used in communications, which already has a name. Silver Asiatic: We’re talking about scientific analysis, of course. Not a man who is afraid of ghosts. William is also blind. Can he reliably see the source of sounds? William, as an information theorist, claims he can determine design from the pattern alone. Is he wrong? Zachriel
Silver Asiatic: All information is complex and specified. Not really. Shannon information has to do with uncertainty in a message stream. The underlying message may very well be a random sequence, which has the highest information density. You didn't provide any specific formula or algorithm for your notion of information. You might want to do that. Silver Asiatic: Information science is not limited to Shannon information as sole means of identifying or defining information. No, but you're the one who said it was the same definition as used in communications. Zachriel
William is afraid of the dark. “Can objects, even if nothing is known about how they arose, exhibit features that reliably signal the action of an intelligent cause?”
We're talking about scientific analysis, of course. Not a man who is afraid of ghosts. William is also blind. Can he reliably see the source of sounds? I think your analogy is too simple. Can science reliably determine that information is present (sender, coding, medium, translation, receiver, operation, organization)? I think so. After that, we're talking about inference to the best explanation of its origin. Silver Asiatic
Why call it CSI
It's a fair question. All information is complex and specified. But I think the term helps highlight that point. More importantly, there are degrees of complexity and specification - and we generally are talking about Highly Complex, and Highly Specified information (functional multi-level coding). So, as a means of stressing the nature of information, Complex and Specified are modifiers (not entirely necessary in my view). But science does this, doesn't it? Why call it "natural selection" if nothing is actually being selected? Information science is not limited to Shannon information as sole means of identifying or defining information. Silver Asiatic
Silver Asiatic: Yes, I discussed that alternative definition in #48. Z: The problem is the changeable nature of the calculation of specified complexity. Silver Asiatic: We observe CSI – which is information. It’s the same that we see in communication – coding, sender, translation, receiver, organizing. Why call it CSI if it is just the same definition of information used in communications, which is Shannon Information? Shannon information is highest for a random sequence, so per that equivalence, white noise is high in CSI. Silver Asiatic: You’d use ordinary forensic techniques to determine if there is information present. William is afraid of the dark. "Can objects, even if nothing is known about how they arose, exhibit features that reliably signal the action of an intelligent cause?" Zachriel
Joe
We believe that equation is to see if the specification warrants a design inference
That's the way I hope the equation is used. It's not what defines CSI; it only attempts to find the origin of it. Silver Asiatic
Joe 54
In “No Free Lunch” Dembski has a proof that necessity and chance cannot produce CSI
Can you summarize his proof? If not, that's ok (understandable if it's too complex). Silver Asiatic
Zachriel @ 53 Yes, I discussed that alternative definition in #48. If probability is included in the definition of CSI, then that's a problem, as I see it. If CSI is a scientific observation of information, then your objection is not relevant - and your story-line on the haunted house doesn't follow. You'd use ordinary forensic techniques to determine if there is information present. You don't need probability calculations to observe information. Silver Asiatic
Zachriel- We believe that equation is to see if the specification warrants a design inference Joe
Silver Asiatic- In "No Free Lunch" Dembski has a proof that necessity and chance cannot produce CSI Joe
Silver Asiatic: Right – we observe information first. The problem is the changeable nature of the calculation of specified complexity. Using Dembski's definition, chi = –log2[10^120 · phi_S(T) · P(T|H)], and taking P(T|H) to refer to a standard probability distribution:
Today, William got an incredible deal on an old Victorian house. Highly satisfied with his business acumen, William settled in for a blissful night of sleep in his new home. SLAM! William woke with a start. He listened intently. But he didn’t hear anything, so he settled back to sleep. Cree..eak William listened even more closely this time until, after a bit, the creaking noise died away. For some reason, he recalled the seller’s maniacal laughter just after William signed the papers to buy the house. SLAM! William was trembling and his teeth were rattling. He thought about getting out of bed to investigate. Instead, he pulled the covers over his head. Cree..eak Hmm, William thought. Being a design theoretician, I can use the patented (not really) Dembski Inference to determine if the pattern is being caused by a ghost, er some unspecified intelligent cause. SLAM! Cree..eak SLAM! Cree..eak SLAM! Cree..eak SLAM! Cree..eak …
edit: chi = specified complexity Zachriel
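[Ed. note: the formula from Dembski's 2005 paper "Specification: The Pattern That Signifies Intelligence" is chi = –log2[10^120 · phi_S(T) · P(T|H)], where 10^120 is Seth Lloyd's bound on bit operations in the observable universe. A minimal sketch applying it to the 500-coins-all-heads case discussed in this thread; taking phi_S(T) = 1 is a placeholder assumption for illustration, not a value Dembski computes.]

```python
import math

def chi(phi_s: float, p: float, resources: float = 1e120) -> float:
    """Dembski's specified-complexity measure:
    chi = -log2(resources * phi_S(T) * P(T|H))."""
    return -math.log2(resources * phi_s * p)

# 500 fair coins, all heads, under a pure-chance hypothesis H
p_all_heads = 0.5 ** 500

# 500 - 120*log2(10) is roughly 101 bits; chi > 1 is Dembski's cutoff
# for a specification, so under these assumptions the pattern qualifies.
print(chi(phi_s=1.0, p=p_all_heads))
```

Note how the result moves with phi_S(T): a larger count of patterns at least as simple as T lowers chi, which is one concrete form of the "changeable nature of the calculation" complained about in this thread.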
To avoid some known tricks in the above: "If we have 500 two-sided coins lined up in a repeating pattern of all heads ..." That's what I was referring to. Silver Asiatic
Robb #49
If we have 500 coins lined up in a repeating pattern of all heads, is that an instance of specified complexity, according to Orgel’s usage of the term?
Yes, that is specified complexity. It's not as highly complex as information that serves an observable function, but it does show specificity since it aligns geometrically ("coins lined up") and matches a known informational pattern (a uniform, repeated signal) and organization. So there is some level of information communicated in that pattern. It's CSI. Now that we've observed that, we can search for sources of its origin and calculate probabilities on that. If someone can show that 500 heads can be lined up via a natural process, then ID is falsified. If not, then we detected design and ID is validated. Silver Asiatic
Zachriel, How much research do you do? And do you think that Michael Behe keeps his job by publishing ID books? Or by publishing in mainstream journals? I know that those papers are not usually ID papers. But he and Dembski have put their careers and reputations on the line by espousing a controversial theory. They have more courage than me and you, I'd wager. Collin
Mung @ 40, yes, well-quoted. Thank you. So I'll ask the following again: If we have 500 coins lined up in a repeating pattern of all heads, is that an instance of specified complexity, according to Orgel's usage of the term? Anyone is welcome to answer. R0bb
Joe # 33
The term “complex specified information” is used so that people understand that the information being discussed is the same as the information used for communication and education. Science journals and textbooks are full of CSI.
That's why ID is not a circular argument. We observe CSI - which is information. It's the same that we see in communication - coding, sender, translation, receiver, organizing. Only after that do we try to calculate probabilities. We notice that, thus far the only known source for CSI is intelligence. But this can be falsified if someone can find another source.
I like to head them off by telling them that CSI is just complex Shannon information that has meaning or function.
Exactly. We can use the tools of information science to analyze and study it.
That exclusion is saved for the proof of the concept that CSI only arises if an intelligent agency makes it so. Our opponents conflate the proof with the definition
The only thing I'm not sure about is if Dembski included a probability component in the definition of CSI. But science does not need to calculate any probabilities to observe information. Probability is needed to try to find the origin of that information. Silver Asiatic
Mung #37
Quoting: “Once we understand more about the evolution of biological organization, we should be able to say something quantitative about the probability that life exists elsewhere in the universe."
Right - we observe information first. The key term in CSI is "Information". We observe that (the principle of biological organization) based on characteristics of information. It's only after that observation that we "say something quantitative about the probability" of the origin of that information. But I'm guessing that Dembski's alternative definition mistakenly includes a probability measure within the definition of CSI. Silver Asiatic
Moose Dr #29 Ok, thanks. That's the kind of definition I'd use. But I mentioned this to VJ Torley and he objected that it wasn't a quantifiable (computable) definition of CSI. For that, he proposed, you need the formula which (apparently) includes a non-probability factor. So there are different definitions, as you said. Silver Asiatic
Mung @40: BOOM! goes the dynamite. William J Murray
And what is the blind watchmaker research, Zachriel? Who is conducting blind watchmaker research? Joe
Mung: What in earth was Orgel thinking, linking biological organization to probability! Orgel wasn't claiming to know the process of abiogenesis, so he didn't claim to know the probability. Collin: Dembski’s critics aren’t interested in finding out if life is designed. Actually, virtually no research in biology is done by IDers. Zachriel
Mung, I have updated and annotated the Orgel clip here: https://uncommondesc.wpengine.com/atheism/fyi-ftr-what-about-onhs-vs-invisible-rain-fairies-salt-leprechauns-and-planet-pushing-angels-etc/ I have added links to onward discussions. KF kairosfocus
Mung, well quoted. KF PS: And I know of the original Learned Hand. kairosfocus
Orgel on Specified Complexity
Crystals are usually taken as the prototypes of simple well specified structures...Lumps of granite or random mixtures of polymers are examples of structures which are complex but not specified. p. 189
Wait for it ...
These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. One can see intuitively that many instructions are needed to specify a complex structure. On the other hand a simple repeating structure can be specified in rather few instructions. Complex but random structures, by definition, need hardly be specified at all. - p. 190
Oh yeah, that sounds SO UNLIKE DEMBSKI! The critics really ought to shut up now. Really. A final nail:
Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. - p. 196
You're welcome. Mung
Collin, Right you are. ID is full of errors and shortcoming. Glad to see someone here at UD admit that. ;) Great quote, btw. By the way, am I the only one who thinks it a shame that the name of a great man is being tarnished here at UD daily? http://en.wikipedia.org/wiki/Learned_Hand Mung
Mung, Dembski's critics aren't interested in finding out if life is designed. Some, like Keiths, brush it aside as trillions of times less likely than unguided evo. Teddy Roosevelt said: “It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat.” Collin
"Once we understand more about the evolution of biological organization, we should be able to say something quantitative about the probability that life exists elsewhere in the universe." - Orgel. 1973. p. 232 What on earth was Orgel thinking, linking biological organization to probability! Mung
Moose Dr: Seems to me that nature had only about 1/2 billion years to create all animal life. Many fundamental organic processes preexist animal life. Moose Dr: I understand that bacteria experiments have found it about impossible to produce 2 mutational event advancements unless the first event already offers a clear advancement. That is incorrect. See Blount, Borland & Lenski, Historical contingency and the evolution of a key innovation in an experimental population of Escherichia coli, PNAS 2008. Moose Dr: So are you willing to concede that if it can be demonstrated that a CSI protein came to existence without “a broad plane of low functionality” or a nearby island of functionality from a similar protein, that this would falsify NDE? Neodarwinism is several human generations old. It's been "falsified" many times, though many basic findings still hold. If you mean there has to be a plausible evolutionary pathway, then sure. Moose Dr: So you are saying that all protein evolved from islands of other functional protein? ‘Seemed like you just said otherwise in the previous paragraph. Proteins evolve by many different mechanisms, including mutation, recombination, selection, and the reuse of motifs in novel combinations. Moose Dr: Does this really explain all orphan genes? Not at all. It shows a worst case scenario, starting from nothing and still stumbling across functional proteins. The experiment was originally made (with RNA libraries) because it was an entailment of theories of abiogenesis. It's easier if you start with bits and pieces of existing proteins. Easier still if you work with motifs. Zachriel
Zachriel:
Selection and replication can optimize function, so it shows how specified complexity (as generally construed) can evolve.
That is intelligent design evolution. With unguided evolution "selection" is elimination and optimization is a pipe dream. So given intelligent design evolution we would expect to see specified complexity evolving. And before Zachriel gets ahead of itself, unguided evolution can't even explain the processes and systems required for making proteins. So that would be a problem, especially given the vast oceans and regardless of billions of years. Joe
Zachriel, "They’re usually simple functions, but then again, experimenters don’t have the resources of billions of years and oceans of organisms." Billions of years is a bit of an exaggeration, don't you think? 'Seems to me that nature had only about 1/2 billion years to create all animal life. I understand that a 5 gallon pail can quite handily hold more bacteria than there have ever been animals on the face of the earth. I understand that about 10 years of bacteria reproduction is equivalent to the number of generations from first animal to human. I understand that bacteria experiments have found it about impossible to produce 2 mutational event advancements unless the first event already offers a clear advancement. Maybe it's a bit easier to experiment in an ocean of organisms, and a bit harder to get results, than all that. "Before optimization, obviously not. After optimization, they have a great deal of specified complexity (as normally construed)." So are you willing to concede that if it can be demonstrated that a CSI protein came to existence without "a broad plane of low functionality" or a nearby island of functionality from a similar protein, that this would falsify NDE? "No. No one thinks proteins evolve from random sequences in nature. This just answers the question about islands of function." So you are saying that all protein evolved from islands of other functional protein? 'Seemed like you just said otherwise in the previous paragraph. "Evolution may often find itself on a local fitness peak, but recombination allows for moving between areas of function." You seem to be putting a lot of weight on recombination. Most of what I know of recombination (which isn't extensive, I will admit) is that it is a strategy on the part of an organism, not normally a random event. "Common enough that random sequences can form weak enzymes, that can then be optimized." How much real genetic development is so explained? How much remains unexplained?
Does this really explain all orphan genes? I seem to be learning that there is a lot of orphan genetic activity. Moose Dr
The term "complex specified information" is used so that people understand that the information being discussed is the same as the information used for communication and education. Science journals and textbooks are full of CSI. It is used so that our opponents don't try to equivocate with various misunderstandings of the word "information". I like to head them off by telling them that CSI is just complex Shannon information that has meaning or function. As you can see there isn't anything in the definition that excludes unguided processes from producing CSI. That exclusion is saved for the proof of the concept that CSI only arises if an intelligent agency makes it so. Our opponents conflate the proof with the definition. Joe
Moose Dr: You say that there are functional proteins in random space, but functional is not what is called for. Rather, particular function is called for. That's what such experiments do. They look for pre-specified functions. They're usually simple functions, but then again, experimenters don't have the resources of billions of years and oceans of organisms. Moose Dr: (‘bet by my calculation it has vastly less than 500 bits of data) Before optimization, obviously not. After optimization, they have a great deal of specified complexity (as normally construed). Moose Dr: Is it really reasonable that the variety of real world functionality is achievable by random search? No. No one thinks proteins evolve from random sequences in nature. This just answers the question about islands of function. Moose Dr: It shows that some islands are connected. That's right. Moose Dr: Why does the NDE crowd so frequently prove that the easy case is possible, then extrapolate that all cases are possible? It was a hard case. "The two ribozyme folds share no evolutionary history and are completely different, with no base pairs (and probably no hydrogen bonds) in common." Moose Dr: First, once a protein has evolved into a high peak, it can no longer explore the broad plane for other functionality. Correct? Evolution may often find itself on a local fitness peak, but recombination allows for moving between areas of function. Moose Dr: Second, while many genes may exhibit a broad plane, is this phenomenon truly universal, or even very common? Common enough that random sequences can form weak enzymes, that can then be optimized. Moose Dr: The discovery of a new protein fold that produces new function would involve a single mutational event* from a protein with previous function. Yes? That's what Schultes & Bartel showed, that selectable pathways exist. 
Moose Dr: * I define mutational event broader than the point mutation, but I include any one transaction within the DNA such as an insertion, deletion, etc. Recombination is a very powerful mechanism for creating novelty. Zachriel
Learned Hand:
I don’t think I’ve ever criticized Dembski’s definition of complexity.
Sure you did. You whined repeatedly that Dembski defines it differently than everyone else. Which is hogwash, as I have just shown. What makes you think Orgel was using it differently than everyone else? Mung
Re: Zachriel (27) "If the specificity is only for weak functionality, then there are functional proteins in random sequences." I concede that once weak functionality is achieved, RM+NS is theoretically capable of producing strong functionality. However, I have this one question. You say that there are functional proteins in random space, but functional is not what is called for. Rather, particular function is called for. If, for instance, a protein is needed that functions to create fingernails (just as a point of reference), then a protein that functions to produce a retracting muscle is unlikely to be helpful. While it may be that some functionality is so generalized that random space produces it ('bet by my calculation it has vastly less than 500 bits of data), what about real-world function? Is it really reasonable that the variety of real-world functionality is achievable by random search? "Schultes & Bartel ... showed a pathway from one functional fold to another functional fold even while maintaining the original function. This shows that the so-called islands are connected." Hold the phone. No it doesn't! It shows that some islands are connected. It would appear much more effective to find a protein that has the appearance of being truly unique, with unique function, and establish how it got there. Why does the NDE crowd so frequently prove that the easy case is possible, then extrapolate that all cases are possible? When the hard cases are addressed, then your theory will become more compelling. "Think of function as a high peak with a broad plain." Two points with regard to this. First, once a protein has evolved into a high peak, it can no longer explore the broad plane for other functionality. Correct? Second, while many genes may exhibit a broad plane, is this phenomenon truly universal, or even very common? "No, because new protein folds are not that unusual, even in random sequences. 
Given modification of working proteins, the odds are even better." The discovery of a new protein fold that produces new function would involve a single mutational event* from a protein with previous function. Yes? So, I concede that island hopping -- where there is no land bridge is possible. However, this form of island hopping, as far as I can see, remains limited to the power of a single mutational event. * I define mutational event broader than the point mutation, but I include any one transaction within the DNA such as an insertion, deletion, etc. Moose Dr
Silver_Asiatic (26) "Where do you mean “defined as above” – or better yet, what is the definition you’re referring to?" Please read the first 10 lines of this post -- the first block quote is Dembski's definition that I am referring to. Moose Dr
Joe # 20 I was just asking about the definition of CSI. I didn't see the answer following in your response (although I appreciate the reply). Silver Asiatic
Moose Dr: There has been a lot of squibbling lately about Dembski’s definition of specified complexity, or CSI. The primary question is whether the conclusion is part of the definition. Sure, which is why we had to add the caveat "as normally construed". Not sure even that is strong enough. Moose Dr: I would concede that if all proteins resided on overlapping islands, RM+NS could, in theory produce them all. If the specificity is only for weak functionality, then there are functional proteins in random sequences. And these sequences can be optimized through rounds of selection and replication, which implies that there is a reasonable pathway from low functionality to higher functionality. As for moving from one protein directly to another, see Schultes & Bartel, One Sequence, Two Ribozymes: Implications for the Emergence of New Ribozyme Folds, Science 2000, who showed a pathway from one functional fold to another functional fold even while maintaining the original function. This shows that the so-called islands are connected. Moose Dr: However, I do not believe that such a clean overlapping exists. I believe that Dembski is saying that if ever DNA must swim between islands, it is swimming in the infinite sea. Think of function as a high peak with a broad plain. If you look for only highly optimized functions, then they will be isolated; however, if you allow for weak function, there are long broad plains connecting many areas of function. Moose Dr: Do you concede that overlapping islands are necessary for RM+NS in all but the rarest of cases? No, because new protein folds are not that unusual, even in random sequences. Given modification of working proteins, the odds are even better. Zachriel
Moose Dr 22
I believe that Dembski’s No Free Lunch theorem goes on to postulate that CSI cannot be produced by RM+NS. This latter argument seems rather compelling. However it can only be made if CSI is defined as above, not if it is defined as quoted elsewhere.
Where do you mean "defined as above" - or better yet, what is the definition you're referring to? Silver Asiatic
I don't think I've ever criticized Dembski's definition of complexity. It was probably different back in 1973 though. Seems likely. Learned Hand
Learned Hand with yet another biting criticism:
The whole conversation shows how Dembski thinks of “complexity” as a function of probability, which is nowhere in any snip, excerpt, or paraphrase of Orgel’s work I’ve ever seen.
Complexity is generally used to characterize something with many parts where those parts interact with each other in multiple ways. The study of these complex linkages is the main goal of complex systems theory.
In physical systems, complexity is a measure of the probability of the state vector of the system.
It was probably different back in 1973 though. Mung
Zachriel (21), I generally agree with you that the specificity of a specification is important. It needs to be measured and factored in. In this forum I believe I have seen a term like "island of function". The idea is that a particular protein, while being a very specific implementation, may dwell on an island. Any variation that maintains its status as being on that island would leave it capable of the function it currently does. I.e., there is flexibility. Sometimes the islands are huge -- there is a lot of flexibility. Sometimes the islands are very small -- not much flexibility at all. The larger the island, the less value should be provided for "complexity". In comments on previous threads, I have shown an algorithm for precisely quantifying complexity in bits, and quantifying the effect of this lack of precision. Now, Zachriel, I would concede that if all proteins resided on overlapping islands, RM+NS could, in theory, produce them all. So if there is a mathematical union between protein island with function A and protein island with function B, then one protein (within the union) could perform both functions. At that point gene duplication could quite readily produce separate proteins for function A and B. However, I do not believe that such a clean overlapping exists. I believe that Dembski is saying that if ever DNA must swim between islands, it is swimming in the infinite sea. If it is swimming in the infinite sea, it has no better chance than a random search in finding another island to land on. Do you concede that overlapping islands are necessary for RM+NS in all but the rarest of cases? (One might pretend, though Dembski argues otherwise, that the random search might just find another island.) Moose Dr
There has been a lot of quibbling lately about Dembski's definition of specified complexity, or CSI. The primary question is whether the conclusion is part of the definition. Other quotes from Dembski seem to imply that he defines CSI as "not of unguided source", a component rightly recognised by some as producing a circular argument. The Dembskian definition provided here is really rather acceptable. I believe that Dembski's No Free Lunch argument goes on to contend that CSI cannot be produced by RM+NS. That latter argument seems rather compelling; however, it can only be made if CSI is defined as above, not if it is defined as quoted elsewhere. Moose Dr
The specification can determine the specificity. For instance, proteins often vary considerably in functionality, and we normally set a degree of function as part of the specification. If we accept low functionality, then the sequence may be only weakly specified. If we insist on a highly optimized function, then the sequence will also be highly specified. Selection and replication can optimize function, which shows how specified complexity (as generally construed) can evolve. Zachriel
Silver Asiatic:
Is it true that Dembski included in the definition of CSI that it had a low probability of having been caused by natural forces?
No. Dembski said that to infer intelligent design we must eliminate necessity and chance. Joe
LoL! CSI exists regardless of what caused it. It's just that the only known cause is an intelligent agency. Joe
Yeah, you would have to know how likely it was to come about through a source other than design to calculate CSI. Learned Hand
Is it true that Dembski included in the definition of CSI that it had a low probability of having been caused by natural forces? I haven't studied his work and always assumed that CSI was a quantitative measure of information identified before any probability analysis took place:

1. CSI has these characteristics (specification, complexity, communicative, functional, coding, organizational, etc.) to a certain degree of measurement.
2. We observe these characteristics.
3. Therefore, we observe CSI.
4. We now determine the origin of CSI via the probability that it originated by natural sources.

The fact that it is always low probability does not mean we define CSI by that feature. We defined CSI in step 1, then observed it in steps 2 and 3, then analyzed the probabilities of its origin later. Apparently, my understanding is wrong according to Dembski's view: analysis of the origin of the CSI, and the low probability of natural origin, is part of the definition of what CSI is -- true? Silver Asiatic
R0bb:
How about your example from a few weeks ago: 500 coins, all heads, and therefore a highly ordered pattern. What would Orgel say — complex or not?
It depends on whether Orgel understood probability theory, as probability is a complexity measurement. Joe
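On the reading that complexity is simply improbability, R0bb's 500-coins example is easy to compute. This is a minimal sketch of the standard surprisal measure I = -log2(P), offered as illustration of the arithmetic only, not as a claim about what Orgel would say:

```python
import math

def complexity_bits(probability: float) -> float:
    """Complexity as improbability: the surprisal I = -log2(P) in bits."""
    return -math.log2(probability)

# 500 fair coins all landing heads: P = (1/2)**500, i.e. 500 bits.
p_all_heads = 0.5 ** 500
print(complexity_bits(p_all_heads))  # 500.0
```

On this measure the all-heads outcome is exactly as improbable as any other particular 500-flip sequence; what sets it apart is the independently given pattern, which is the "specified" half of the debate.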
Maybe I'm just thick. LH quotes: “Yes, Orgel used the term more loosely than Dembski, but they are talking about the same concept.” And this he takes to mean they are not talking about the same concept. Where else but here at UD... Mung
An object, event, or structure exhibits specified complexity if it is both complex (i.e., one of many live possibilities)... You mean we can't just go by whatever we can imagine? Trillions and trillions of made-up possibilities. Mung
Barry:
Dembski: A long sequence of randomly strewn Scrabble pieces is complex without being specified. A short sequence spelling the word “the” is specified without being complex. A sequence corresponding to a Shakespearean sonnet is both complex and specified. Orgel: Mixtures of random polymers are complex without being specified. Crystals such as granite are specified without being complex. Living organisms are both complex and specified.
How about your example from a few weeks ago: 500 coins, all heads, and therefore a highly ordered pattern. What would Orgel say -- complex or not? R0bb
Barry, LH, thanks for the explanation. Collin
I'm a Texan, I prefer firearms metaphors. Learned Hand
LH @ 9. That is correct. The specification does not have to come first temporally, as in the arrow/target metaphor. Dembski has used the arrow/target metaphor to elucidate the necessity that the specification be independent. Barry Arrington
In No Free Lunch, Dembski talks about specification needing to be separated from the observation of the function--it doesn't actually have to come first, just not be dependent on the thing being tested. His example on page 289 is the bacterial flagellum. He points out that humans invented outboard motors, which is an independent target for the flagellum to hit even though the flagellum came first. Learned Hand
Collin @ 6. There is a vast literature on the subject. In very very brief summary, the specification in living things is the functional arrangement of many parts; or digitally encoded semiotic information. Barry Arrington
LH @ 4. You seem to be implying that I have backed off of my October 23 and 24 statements. I have not. Orgel and Dembski are talking about exactly the same fundamental concept – living things are characterized by the combination of complexity and specification, which means that neither complexity nor a specification, standing alone, is sufficient to describe living things. Though it is beyond the scope of this post, Orgel and Dembski also agree that the natural forces we observe in action on this planet are almost certainly insufficient to explain the complex specificity seen in living things. And in a sense they both resort to the same alternative explanation: Dembski: intelligent design generally; Orgel: intelligent design by aliens specifically (which is why he was an advocate of directed panspermia). I am glad you have admitted that you have learned something from our exchange. Barry Arrington
Thanks Barry. How do we do that with life? Collin
Collin, Dembski means that when it comes to a specification, we can't draw the target around the arrow. We have to specify the target before we shoot the arrow. Barry Arrington
Barry Arrington on October 23 and 24: "Your next gambit will be: Orgel was talking about something else. Fail. He was talking about exactly the same thing." "He uses the terms complex and specified in exactly the sense Dembski uses the terms." Above, a month later: "Yes, Orgel used the term more loosely than Dembski, but they are talking about the same concept." It's taken a long time to move you just a little bit, but at least you're now in the realm of a reasonable disagreement rather than an outright and obvious mistake. I learned something, you learned something, and it's been a relatively civil discussion (by UD standards). Thanks and you're welcome. Learned Hand
I apologize for asking this. I haven't read Dembski's work. But what does he say is the "independently given pattern"? Are we talking about how DNA is independent of proteins but controls their creation and function? Collin
BA, it seems we must belabour the obvious because, it more and more seems, the obvious patently does not point where some wish to go. KF PS: Did you notice that to date, AFAIK, none of the more stringent objectors of recent weeks has actually admitted that FSCO/I exists as a real characteristic of anything? When confronted with something as simple and direct as an Abu 6500 C3 fishing reel, they have to date studiously avoided it, apart from one objector who had the bright idea to say, well, it has gears in it and we know of only one case of gears in the world of life. Actually, just one case of seriously, properly meshed gears should in itself be a wake-up call. But the wider manifestations of FSCO/I are all around us -- think of wiring-diagram-style node-arc linkages and organisation that depends on specific configuration to achieve function -- literally staring us in the face (think of the PC screen and the wider PC, not to mention the data strings, programs, keyboards, track pads, etc.). kairosfocus
Barry, Learned Hand made it clear: they are not the same because it's just opinion, even if it is reasonable. I really struggle with the inability some have with the meaning of things..... Andre

Leave a Reply