Uncommon Descent Serving The Intelligent Design Community

Is Mathgirl Smarter than Orgel and Wicken Combined? Doubtful.

Categories: Intelligent Design

Mathgirl wrote in a comment to my last post:  “My conclusion is that, without a rigorous mathematical definition and examples of how to calculate [CSI], the metric is literally meaningless.  Without such a definition and examples, it isn’t possible even in principle to associate the term with a real world referent.”

Let’s examine that.  GEM brings to our attention two materialists who embraced the concept, Orgel [1973] and Wicken [1979].

Orgel:

. . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity.

The Origins of Life (John Wiley, 1973), p. 189.

Wicken:

‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [i.e. “simple” force laws acting on objects starting from arbitrary and commonplace initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’

“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology 77 (April 1979): 349-65, at p. 353.

I assume Mathgirl believes Orgel and Wicken were talking meaningless nonsense.  Or maybe she doesn’t and that’s why she has dodged GEM’s challenge at every turn.

Be that as it may, both dyed-in-the-wool materialists and ID advocates understand that living things are characterized by CSI.  Indeed, the law recognizes that DNA is characterized by CSI.  Recently a federal judge wrote:

Myriad’s focus on the chemical nature of DNA, however, fails to acknowledge the unique characteristics of DNA that differentiate it from other chemical compounds. As Myriad’s expert Dr. Joseph Straus observed: “Genes are of double nature: On the one hand, they are chemical substances or molecules. On the other hand, they are physical carriers of information, i.e., where the actual biological function of this information is coding for proteins. Thus, inherently genes are multifunctional.” Straus Decl. ¶ 20; see also The Cell at 98, 104 (“Today the idea that DNA carries genetic information in its long chain of nucleotides is so fundamental to biological thought that it is sometimes difficult to realize the enormous intellectual gap that it filled. . . . DNA is relatively inert chemically.”); Kevin Davies & Michael White, Breakthrough: The Race to Find the Breast Cancer Gene 166 (1996) (noting that Myriad Genetics’ April 1994 press release described itself as a “genetic information business”). This informational quality is unique among the chemical compounds found in our bodies, and it would be erroneous to view DNA as “no different[]” than other chemicals previously the subject of patents.

Myriad’s argument that all chemical compounds, such as the adrenaline at issue in Parke-Davis, necessarily conveys some information ignores the biological realities of DNA in comparison to other chemical compounds in the body. The information encoded in DNA is not information about its own molecular structure incidental to its biological function, as is the case with adrenaline or other chemicals found in the body. Rather, the information encoded by DNA reflects its primary biological function: directing the synthesis of other molecules in the body – namely, proteins, “biological molecules of enormous importance” which “catalyze biochemical reactions” and constitute the “major structural materials of the animal body.” O’Farrell, 854 F.2d at 895-96. DNA, and in particular the ordering of its nucleotides, therefore serves as the physical embodiment of laws of nature – those that define the construction of the human body. Any “information” that may be embodied by adrenaline and similar molecules serves no comparable function, and none of the declarations submitted by Myriad support such a conclusion. Consequently, the use of simple analogies comparing DNA with chemical compounds previously the subject of patents cannot replace consideration of the distinctive characteristics of DNA.

In light of DNA’s unique qualities as a physical embodiment of information, none of the structural and functional differences cited by Myriad between native BRCA1/2 DNA and the isolated BRCA1/2 DNA claimed in the patents-in-suit render the claimed DNA “markedly different.” This conclusion is driven by the overriding importance of DNA’s nucleotide sequence to both its natural biological function as well as the utility associated with DNA in its isolated form. The preservation of this defining characteristic of DNA in its native and isolated forms mandates the conclusion that the challenged composition claims are directed to unpatentable products of nature.

Association for Molecular Pathology v. U.S. Patent and Trademark Office, 702 F.Supp.2d 181 (S.D.N.Y. 2010).

Maybe Mathgirl knows something that this federal court or Orgel or Wicken didn’t when she says CSI is a meaningless concept.  But I doubt it.

Comments
idcurious:
Archeology doesn't tell us which individuals built stonehenge. It tells us which cultures did - most likely, given all the evidence
Strange, that is what I said.
You ignore the vast majority of life & earth scientists who say there is no merit in ID.
I ignore bald declarations. [Ribosomes are] About as "naturally occurring" as my cars.
That sums up the level of your argument quite perfectly.
And your bald assertions and lies sum up the level of your argument quite perfectly. In what way are ribosomes "naturally occurring"? There isn't any evidence for nature, operating freely, producing them.
We find them everywhere in naturally occurring DNA. We have no idea at present how to create DNA.
Craig Venter created DNA and there isn't any evidence that nature, operating freely, can do the same.
Joseph
April 13, 2011 at 2:44 PM PDT
Joseph @ 105
The person who built my house doesn't live in it.
Archeology doesn't tell us which individuals built stonehenge. It tells us which cultures did - most likely, given all the evidence.
It happened to long-time atheist Anthony Flew too - and Dean Kenyon, Michael Behe, Michael Denton, etc., etc., etc.
You list four names. You ignore the vast majority of life & earth scientists who say there is no merit in ID. Cherry picking, much?
[Ribosomes are] About as "naturally occurring" as my cars.
That sums up the level of your argument quite perfectly.
idcurious
April 13, 2011 at 2:28 PM PDT
Upright BiPed @ 88
Well, I suppose after finding meaning instantiated into matter an investigator would probably begin by asking themselves “where else” do we find such a phenomena.
We find them everywhere in naturally occurring DNA. We have no idea at present how to create DNA. We have two basic explanations for this. One, DNA came about from simpler structures given the right circumstances (as yet unknown) and enormous amounts of time... The other is that something "intelligently designed" DNA that did not itself require intelligent design. Humour me. Can you understand why the second explanation might seem problematic? Do you accept that recognisably human beings existed for thousands of years without using symbolic representations?
idcurious
April 13, 2011 at 2:22 PM PDT
So are you saying that Turing machine was designed by that equation you linked to? No. That is what we were discussing.
The Turing Machine describes something that works in nature if the correct parts are in the correct order.
Most designed things fit that category.
Ribosomes are naturally occurring Turing Machines.
About as "naturally occurring" as my cars.
Science tells us nature is way smarter than we are.
No it doesn't.
You want to pretend that nature doesn't work.
Stop lying about me. You are pathetic.
Joseph
April 13, 2011 at 2:11 PM PDT
idcurious: No-one is throwing deep time at anything. Except evolutionists.
You are throwing out the research, evidence and opinions of thousands of scientists in hundreds of different fields because they disagree with the pre-conceptions of Joseph P. Nobody.
Nope, you are lying, of course.
The evidence is that the bacterial flagellum evolved from simpler forms.
That is the untestable speculation based on the untestable assumption. On Stonehenge:
You think it’s more likely that space aliens or incorporeal beings came down to earth and build stonehenge at the same time as these people lived in the area, then vanished with no trace?
No. The person who built my house doesn't live in it.
No, Joseph, you don’t have anything to say, and hence you flail away at science – and proclaim that you know better than thousands of scientists in hundreds of different fields because they disagree with your pre-conceptions.
Except I don't have any preconceptions and was a devoted evo until I started looking more closely at the scientific evidence. It happened to long-time atheist Anthony Flew too - and Dean Kenyon, Michael Behe, Michael Denton, etc., etc., etc.
Joseph
April 13, 2011 at 2:06 PM PDT
Joseph @ 84
So are you saying that Turing machine was designed by that equation you linked to?
No. The Turing Machine describes something that works in nature if the correct parts are in the correct order. Ribosomes are naturally occurring Turing Machines. Similarly there are naturally occurring nuclear reactors - discovered after humans built their first nuclear reactors. Science tells us nature is way smarter than we are. You want to pretend that nature doesn't work.
idcurious
April 13, 2011 at 2:02 PM PDT
Joseph @ 83
Throwing deep time at issues is not scientific and your avoidance of that proves my point- that you are an intellectual coward.
No-one is throwing deep time at anything. Pretending that billions of years of time is not an issue in discussing billions of years of history is simply brain-dead.
Said the ignorant evo.
You are throwing out the research, evidence and opinions of thousands of scientists in hundreds of different fields because they disagree with the pre-conceptions of Joseph P. Nobody. They have not demonstrated that a bacterial flagellum can evolve via accumulating random mutations from a population that never had one. The evidence is that the bacterial flagellum evolved from simpler forms. ID proposes that the mechanism for this was "design" from an unknown "designer".
How do they know that those people are the people who designed Stonehenge?... As far as they know those are just the people who lived in the area.
You think it's more likely that space aliens or incorporeal beings came down to earth and built stonehenge at the same time as these people lived in the area, then vanished with no trace? /facepalm.
Your position doesn't have anything and that is why you are forced to flail away at ID.
No, Joseph, you don't have anything to say, and hence you flail away at science - and proclaim that you know better than thousands of scientists in hundreds of different fields because they disagree with your pre-conceptions.
idcurious
April 13, 2011 at 1:56 PM PDT
MathGrrl:
The concept of “specified complexity” presented by Orgel is not the same as the concept of “specified complexity” discussed by Dembski.
Dembski expanded on what Orgel said. IOW he used Orgel's concept and modernized it. Are you sure you read "No Free Lunch"? Chapter 4 and chapter 6 - he discusses earlier treatments of specified complexity.
Joseph
April 13, 2011 at 1:17 PM PDT
If you really wanted to know about CSI you would read “No Free Lunch”. MathGrrl:
I did.
Methinks you are lying. I say that because in NFL Dembski makes it clear that CSI pertains to origins. And what he means by the quote you mined is that we don't have to know how it arose. If CSI is present then it is safe to infer it arose via some designing agency. That is the whole point. CSI cannot be expressed by an algorithm.
You seem to be in agreement with vjtorley but in disagreement with those ID proponents who claim to be able to measure CSI in bits.
It can be measured in bits. I did it for your example #3. 22 bytes = 176 bits. Then you factor in the variation tolerance to get the specification. Then start counting.
Joseph
April 13, 2011 at 1:07 PM PDT
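Joseph's byte-to-bit arithmetic above (22 bytes = 176 bits) is easy to check. A minimal sketch in Python; the function name is mine, and this counts only raw storage capacity, not the "variation tolerance" step he mentions:

```python
def capacity_bits(text: str) -> int:
    """Raw storage capacity of an ASCII string: 8 bits per byte.

    This counts only carrying capacity; it says nothing about
    specification or function, which are treated as separate steps.
    """
    return len(text.encode("ascii")) * 8

# A 22-character ASCII string occupies 22 bytes, i.e. 176 bits.
example = "abcdefghijklmnopqrstuv"  # 22 characters
print(capacity_bits(example))  # 176
```

The same count is what Shannon-style capacity gives for an 8-bit-per-symbol encoding; any claim beyond capacity (meaning, function, specification) needs further argument.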
MathGrrl
You have previously agreed that Dembski’s description of CSI does not use Shannon information. Why do you keep bringing up this red herring?
Why do you keep quoting me out-of-context? CSI is Shannon information of a certain complexity and with meaning/function. What the heck is your problem?
Joseph
April 13, 2011 at 1:01 PM PDT
Muramasa, If you noticed, MathGrrl didn't comprehend what I posted. Apparently neither do you. As I said, CSI is Shannon information with meaning/function and of a certain complexity. (Shannon information doesn't care about meaning/content/function.) Stephen Meyer goes over that in "Signature in the Cell". So again Shannon provided the math for the information part. Dembski did the math for the complexity and also for the specification.
Joseph
April 13, 2011 at 12:59 PM PDT
KF @ 96: You may be misinterpreting MathGrrl's statement. As I read it, 1) Joseph put forward Shannon's work as providing "the math for information". 2) MathGrrl pointed out that Joseph had previously accepted that Dembski did not use Shannon information in his description of CSI. If I read your post correctly, you are also asserting that Dembski did not use Shannon information. So with whom are you in agreement? Joseph or MathGrrl?
Muramasa
April 13, 2011 at 12:36 PM PDT
PaV, you write,
QID: this comment exhibits quite a bit of ignorance on your part; ignorance in the sense that you are very likely not familiar with Dembski’s book, “No Free Lunch”, and neither with his paper on “Specification”—a paper, I may note, that is strictly an on-line publication.
In fact, I have read both. I find NFL to be the strangest, least helpful ID book I have ever read. The paper is a bit better but it's not particularly good at guiding future research. It's hard to know how to build a research program on such slippery foundations.
QuiteID
April 13, 2011 at 12:17 PM PDT
Onlookers: Re MG & 94: Dembski's description of CSI does not use Shannon information. I predict the following will be ignored or strawmannised.

Dembski's description of CSI uses Hartley's information as negative log probability. He goes beyond the concern of Shannon [info-carrying capacity of Gaussian-noise band-limited channels in bits per symbol, then symbols per second, in light of signal-to-noise power ratios . . . theoretical capacity not practical capacity, as our codes and modulation systems have some ways to go] to deal with the meaningfulness and function of info, thus answering questions on credible cause.

Shannon's H is a metric of average info per symbol, based on the same fundamental negative log metric, but weighted by the probability of any one symbol to give that weighted average:

H = - [SUM on i] p_i log p_i

I_i = - log p_i (on Hartley's suggestion, which gives an additive metric)

We can already see that the Dembski metric boils down to info in bits beyond the threshold of 398+ bits, thus identifying it as being in a large enough config space that we can rule out lucky noise as a credible source. This was already discussed above; no response.

The two are related derivations from the fundamental concepts, but distinct for different purposes. The fundamental metric is NOT Shannon info, H, but the Hartley metric, I = - log(p).

Now, let us see if MG is responsive, or continues what looks more and more like ignore, dismiss, cloud and side-track games. GEM of TKI
kairosfocus
April 13, 2011 at 11:44 AM PDT
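The two standard quantities kairosfocus distinguishes above, Hartley's per-event surprisal I = -log2(p) and Shannon's probability-weighted average H, can be sketched numerically. This is a generic illustration of those two textbook formulas only, not of Dembski's full metric; the function names are mine:

```python
import math

def hartley_info(p: float) -> float:
    """Hartley (surprisal) information of an event of probability p,
    in bits: I = -log2(p). Additive across independent events."""
    return -math.log2(p)

def shannon_entropy(probs) -> float:
    """Shannon's H = -sum_i p_i log2(p_i): the probability-weighted
    average surprisal per symbol, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: each outcome carries 1 bit of surprisal; the average H is also 1 bit.
print(hartley_info(0.5))            # 1.0
print(shannon_entropy([0.5, 0.5]))  # 1.0

# Biased source: the rare symbol carries more surprisal (~3.32 bits),
# but the weighted average H drops (~0.47 bits per symbol).
print(hartley_info(0.1))
print(shannon_entropy([0.9, 0.1]))
```

The biased-source case makes the distinction in the comment concrete: I is a property of one outcome, H is an average over the whole alphabet, and the two coincide only for equiprobable symbols.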
QuiteID:
Maybe. Certainly your definition of CSI was the first one that made sense. Until then people couldn’t agree on whether specification was measurable, whether CSI was measured in bits or was “just a number” (vjtorley), etc., etc., etc. It was clear that all these people were committed to a concept of CSI, but nobody could agree on what that concept was. Which suggests to me that CSI is less valuable than other ID concepts that I’ve mentioned.
QID: this comment exhibits quite a bit of ignorance on your part; ignorance in the sense that you are very likely not familiar with Dembski's book, "No Free Lunch", and neither with his paper on "Specification" - a paper, I may note, that is strictly an on-line publication. Why do I point out that his Specification paper is only on-line? Because in it he addresses criticism leveled at him in his NFL book. So, it is a rebuttal to his critics - critics, I may add, that seem to themselves be ignorant of what Dembski wrote. I had a go-round with one of these critics, and came to the conclusion that he very likely either didn't read the book, or only looked at it superficially. MathGrrl falls into this latter category.

Now you want to make a big thing about the various views that have been promulgated. The fact that there are different views seems to bother you - as it does MathGrrl. The nub of this unease is the fact that we have two versions, if you will, of "specified complexity": CSI in NFL, and "specified complexity" in the Specification paper. Jeffrey Shallit, for example, said that Dembski's definition of CSI had changed from one book to another. Is this really a problem? Do you know how many different kinds of solutions there are to Einstein's equations? Quite a number. Does the fact that you can come up with different numbers using GR mean that GR is not "mathematically rigorously defined"? I don't think so.

In answering MathGrrl's ridiculous demands - and that was, and is, what they are; no more, no less - I chose to invoke CSI (from NFL) for two reasons: (1) I'm more familiar with it than with the Specification paper; and (2) it is simpler to apply. And, per NFL, CSI is defined in "bits". Let's remember, poor old MathGrrl was complaining that she didn't know how to do any of these calculations. Why add refinements when the basics aren't even understood? Hence, my approach. vjtorley, OTOH, chose to invoke the Specification paper.

The approach, though seemingly mathematically different, is essentially the same. Dembski only throws in refinements to rebut the criticisms leveled at him. The paper stands on its own. And to assuage the dilettantes of Information Theory, Dembski chose to couch the mathematics so that a "number", rather than "bits", is the parameter measuring "specified complexity". Newtonian gravity and Einsteinian gravity give you the same numbers basically for anything happening here on earth. Does the fact that they're different, and arrived at in a completely different way, mean that "gravity" is not "rigorously defined" mathematically?

MathGrrl's tactics are just that: tactics. She should know better. That she doesn't know any better is an indictment of her; not ID. So there you have the reasons for the differences. And, conceptually and mathematically, they are the same; one is simply a refined version of the former (a refinement brought about by silly criticism directed at CSI). So let's let that dead dog just lie.

P.S. I went through an artificial example for MathGrrl's comprehension. The essentials are all there. What is she going to do now? Is she going to apply them to her "four scenarios"? She should be able to. So why doesn't she do just one of them and "wow" us? We await. But, of course, her lament is: it can't be done! So she does nothing!
PaV
April 13, 2011 at 10:21 AM PDT
Joseph,
DrBot:
Or you could just show us how to do the calculation?
Claude Shannon provided the math for information.
You have previously agreed that Dembski's description of CSI does not use Shannon information. Why do you keep bringing up this red herring?
MathGrrl
April 13, 2011 at 10:21 AM PDT
PaV,
I have a challenge for you. Scientists assert the “Law of Conservation of Angular Momentum”. I say that it has not been rigorously demonstrated. For scientists—and you in particular—to convince me of this supposed “law”, please apply this “law” to the destruction of the World Trade Centers. Unless you can demonstrate clearly that it applies to that event, then the “Law of Conservation of Angular Momentum” is just hyperbole. I await your proof. And when you “prove” that, then I’ll show you how to calculate CSI for any one of your four scenarios.
The immediate thought in my head upon reading this was Charles Babbage's memorable statement: "I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question." I'm not making any claims based on conservation of angular momentum. ID proponents are making claims based on CSI. The burden of proof is on those making the claim. I'd like to see that proof.
MathGrrl
April 13, 2011 at 10:20 AM PDT
PaV,
Patently, none of those four scenarios rises to the level of actual CSI.
Without a rigorous mathematical definition of CSI and an example calculation, that assertion is baseless.
Even if Bill Dembski himself did the calculation for any of those scenarios, he would conclude that CSI is not present. So, how does that make ANY of those FOUR SCENARIOS worth five minutes worth of anyone’s attention?
It would show that CSI does have a rigorous mathematical definition and that it can be objectively calculated. I find it passing strange that no ID proponent (other than vjtorley, of course) seems to consider that valuable.
ev demonstrates nothing; and it’s the best that evo-biologists can do.
That assertion could use some support as well.
MathGrrl
April 13, 2011 at 10:19 AM PDT
Collin,
In the previous threads you were given several mathematical definitions of CSI but I don’t recall you ever saying why they were not good enough. You just kept insisting, over and over, that no one has provided a rigorous mathematical definition.
That is not an accurate summary of my guest thread. If you believe it is, please reference particular comments where I dismissed definitions so cavalierly. I repeatedly requested a rigorous mathematical definition of CSI as described by Dembski. This is the metric claimed by ID proponents to indicate the involvement of intelligent agency. None of the definitions offered in that thread were consistent with Dembski's writings.
MathGrrl
April 13, 2011 at 10:19 AM PDT
Barry Arrington,
mathgirl writes: “Reading the source material from Orgel will show that he uses the term “specified complexity” in a subjective, descriptive, qualitative sense.” I take it then that you agree that the concept of CSI as Orgel used it is not meaningless. Good we are making progress.
Please don't put words in my mouth. I meant exactly what I wrote. The concept of "specified complexity" presented by Orgel is not the same as the concept of "specified complexity" discussed by Dembski. I have said nothing about whether or not Orgel's concept is coherent or meaningful.
MathGrrl
April 13, 2011 at 10:18 AM PDT
Joseph,
What a crock. If you really wanted to know about CSI you would read “No Free Lunch”.
I did. I also read some other of Dembski's papers. None provide sufficient detail to calculate CSI objectively.
CSI cannot be expressed by an algorithm.
You seem to be in agreement with vjtorley but in disagreement with those ID proponents who claim to be able to measure CSI in bits. Since you agree that CSI does not have a mathematically rigorous definition, I believe our conversation is complete. Thank you for your time.
MathGrrl
April 13, 2011 at 10:18 AM PDT
IDcurious,
That is a very interesting question. How would we know the meaning was “instantiated” (word of the day) by an intelligence rather than not-yet-understood natural processes?
Well, I suppose after finding meaning instantiated into matter an investigator would probably begin by asking themselves “where else” do we find such a phenomena. Questions then follow: a) Is that a reasonable response? If not, then why not? b) If the resulting scientific explanation is that it came about by a yet-not-understood process, how would that conclusion be tested for its veracity? c) On what specific grounds would that explanation be favored over one that is based upon what we already know about such phenomena?
The best evidence we have is that life arose over billions of years.
Is this a positioning statement, or is there a point you’d like to make? Quite frankly, I think you’ll find that the best estimates are that life arose almost immediately after the earth cooled to a reasonable degree, and that it was highly organized and complex at that time.
I’m afraid I really don’t see that inferring a “designer” which ID can’t tell us anything about is helpful.
“Where we came from” has been a fairly viable topic for the greater part of human history. I would think that most people find it an integral part of understanding reality as it is. Are you suggesting that the validity of the issue is directly related to whether or not you personally find it interesting?
That said, “hi mom” encoded in DNA would be quite a find
So you’ve returned to my question with a requirement that the Design argument produce “Hi Mom” from DNA as a means to validating it - is that correct?

Here are a few questions you might like to ask yourself, or for extra credit you can answer them here: Is meaningful information recorded in DNA by the arrangement of material symbols? How is information created? Are there any examples of recorded information that were not first the product of perception? Is there any information in existence anywhere that is not the product of perception? Are there any examples of recorded information that exist without the use of symbolic representation? How are symbols created? What makes a symbolic representation a symbolic representation? How is the symbol-to-object relationship established in a symbol? Are there any examples of naturally occurring symbols? Is there any distinction between an analog symbol (the howl of a wolf) and a digital symbol (Morse code)? Are the symbols used to record information freely chosen? Are there any examples of symbols used to record information which were not freely chosen?
Upright BiPed
April 13, 2011 at 9:44 AM PDT
F/N: For those who don't want to scroll up and click on the links, here is UD WAC 28: ___________________ >> 28] What about FSCI [Functionally Specific, Complex Information] ? Isn’t it just a “pet idea” of some dubious commenters at UD? Not at all. FSCI — Functionally Specific, Complex Information or Function-Specifying Complex Information (occasionally FCSI: Functionally Complex, Specified Information) – is a descriptive summary of the particular subset of CSI identified by several prominent origins of life [OOL] researchers in the 1970?s – 80?s. For at that time, the leading researchers on OOL sought to understand the differences between (a) the highly informational, highly contingent functional macromolecules of life and (b) crystals formed through forces of mechanical necessity, or (c) random polymer strings. In short, FSCI is a descriptive summary of a categorization that emerged as pre-ID movement OOL researchers struggled to understand the difference between crystals, random polymers and informational macromolecules. Indeed, by 1984, Thaxton, Bradley and Olson, writing in the technical level book that launched modern design theory, The Mystery of Life’s Origin [Download here], in Chapter 8, could summarize from two key origin of life [OOL] researchers as follows:
Yockey [7] and Wickens [5] develop the same distinction [as Orgel], explaining that “order” is a statistical concept referring to regularity such as might characterize a series of digits in a number, or the ions of an inorganic crystal. On the other hand, “organization” refers to physical systems and the specific set of spatio-temporal and functional relationships among their parts. Yockey and Wickens note that informational macromolecules have a low degree of order but a high degree of specified complexity. In short, the redundant order of crystals cannot give rise to specified complexity of the kind or magnitude found in biological organization; attempts to relate the two have little future. [TMLO, (Dallas, TX: Lewis and Stanley reprint), 1992, erratum insert, p. 130. Emphases added.] [NB: that reference to erratum insert tells you this comes from the PRINT copy reprint]
The source of the abbreviation FSCI should thus be obvious – and it is one thing to airily dismiss blog commenters; it is another thing entirely to have to squarely face the result of the work of men like Orgel, Yockey and Wickens as they pursued serious studies on the origin of life.

But also, while the cluster of concepts came up in origin of life studies, these same ideas are very familiar in engineering: engineering designs are all about stipulating functionally specific, complex information. Indeed, FSCI is a hallmark of engineered or designed systems. [I add: ever wonder why so many engineers, applied scientists and computer programmers are ID supporters? They know what they are looking at, and they know how hard it is to get it developed.]

So, FSCI is actually a functionally specified subset of CSI, i.e. the relevant specification is connected to the presence of a contingent function due to interacting parts that work together in a specified context per requirements of a system, interface, object or process.

For practical purposes, once an aspect of a system, process or object of interest has at least 500 – 1,000 bits or the equivalent of information storing capacity, and uses that capacity to specify a function that can be disrupted by moderate perturbations, then it manifests FSCI, thus CSI. This also leads to a simple metric for FSCI, the functionally specified bit; as with those that are used to display this text on your PC screen. (For instance, where such a screen has 800 x 600 pixels of 24 bits, that requires 11.52 million functionally specified bits. This is well above the 500 – 1,000 bit threshold.)

On massive evidence, such cases are reliably the product of intelligent design, once we independently know the causal story. So, we are entitled to (provisionally of course; as per usual with scientific work) induce that FSCI is a reliable, empirically observable sign of design.
>> ____________________ I doubt the issues could have been more explicitly laid out in the sort of short essay compass that was required, and the conceptual definition is supported by a simple metric with a concrete example that shows how it works on the very object you are using to read this post. As to MG's attempt to drive a wedge between CSI and FSCI, it stands exposed for -- pardon my directness, but given what was there all along and repeatedly pointed out but ignored or brushed aside improperly -- a shabby, shallow rhetorical gambit. Scrolling up to no 27 in the WACs, the Durston metric, the table of results for 35 protein families and the Dembski metric were also identified, discussed and linked (though the details on the Durston metric, which are more technical, were not given). Whatever the merits and demerits of objections on how Dembski got there, the Dembski metric -- as I showed [and as is evident from a simple enough analysis of what something of form C = - log2(D*p) means: C is information beyond a threshold: C = Ip - K, K a threshold] -- boils down to so many bits of info beyond 398 - 500 bits depending on the circumstances. Since the search resources of our solar system are of order 10^102 events, and those of the cosmos 10^150 events, it is reasonable that things significantly beyond those thresholds are designed, on search space isolation grounds. With the 1,000 bits used in my X-metric, that rises to moral certainty. The Durston metric -- apart from giving 35 values of FSCI published in the peer reviewed literature [and recall, FSCI is a subset of CSI] -- gives a way to estimate the scope of islands of function and to compare the islands to the space of possibilities, as is shown in the excerpts and notes here. The basic information was there all along. Attention was repeatedly drawn to it when MG raised her challenges and talking points, but was dismissed or deliberately ignored. Repeatedly. Over several threads and weeks. 
That has to be willful.

Now, I believe the onward analysis of threshold metrics as the context for Dembski and for VJT's CSI-lite draws these back together into the same setting as the X-metric, which was simpler to understand to begin with. And the Durston metric shows how we may empirically identify islands of function and compare them to config spaces.

I trust that this is enough to help those who want to be helped.

G'day, GEM of TKI
kairosfocus
April 13, 2011 at 08:11 AM PDT
Did it set things in motion to use blind, undirected processes?
There isn't any evidence that blind, undirected processes can construct functional multi-part systems.
Did it have a master plan?
Most likely.
Was there one designer or hundreds?
Yes.
Where did the designer come from?
ID isn't about the designer(s). Man, you are pathetic. True, ID doesn't have all the answers, but that is what science is for. Your position doesn't have anything, and that is why you are forced to flail away at ID.
Joseph
April 13, 2011 at 07:32 AM PDT
QI: Re 73:

Certainly your [JOSEPH'S] definition of CSI was the first one that made sense. Until then people couldn't agree on whether specification was measurable, whether CSI was measured in bits or was "just a number" (vjtorley), etc., etc., etc. It was clear that all these people were committed to a concept of CSI, but nobody could agree on what that concept was.

1 --> First, this is not "Joseph's definition," as can be seen from the UD WACs, no 28, which cashes out a metric, the X-metric. That the WAC uses the range 500 - 1,000 bits makes little difference. (Cf here for a more detailed elaboration, and here for an explanation of the links between the various metrics. This last builds on discussions in the LGM thread, given the talking points that have been flying around recently.)

2 --> In short, in coming here as in effect a critic, the evidence suggests that you have not done basic corrective reading in a context where it is explicitly warned:
WAC Intro: . . . Many who interact with us on this blog recycle . . . misinformation. Predictably, they tend to raise notoriously weak objections that have been answered thousands of times. What follows is a list of those objections and our best attempt to answer them in abbreviated form. If you have been sent here, you are now being asked to familiarize yourself with basic ID knowledge so that you can acquire the minimal amount of information necessary to conduct meaningful dialogue . . .
3 --> Now, CSI can be considered as a number, and in discussing with VJT, I have pointed out that the number can be understood as, in effect, information in bits beyond a reasonable threshold, as the - log2(D*p) format implies.

4 --> You will notice that all along, since MG raised objections and talking points, I have repeatedly stressed that if one does not understand what the simple X-metric is doing, s/he cannot understand the more sophisticated metrics. Identify that something is contingent and complex beyond 1,000 bits, identify that it is objectively specific, and then count up the information it uses, in bits. The result is a measure of its functionally specific complex information in bits.

5 --> That 1,000-bit threshold is big enough to flatten all objections on probability metrics etc., and to show how the search resources of our observed cosmos could not scratch the surface of the number of possibilities in the config space.

6 --> At the same time, it is small enough that with 125 bytes or 143 ASCII characters in hand, you plainly cannot write much of a functional control program or specify a lot of description. Or, write much more than one longish sentence of about 20 English words. Can't say a lot in 20 words.

7 --> Now, all of this is debating the mathematical metrication of complex specified information, or functionally specific, complex information. These were quite meaningfully identified and specified up to material family resemblance -- the same basis on which we identify what a living thing is for biology [there is no one general necessary-and-sufficient-conditions definition of life, much less a quantitative metric for living/non-living] -- in the 1970's by Orgel and Wicken, as is cited again in the OP by BA above. That is what MG keeps on ducking and is still ducking.

8 --> So, please don't let talking points designed to throw up confusion on measurement of quantities mislead you into thinking that what is to be quantified is meaningless.
9 --> Most important things that are very meaningful cannot be reduced to a simple number by defining a unit and estimating a ratio relative to that unit, then reporting N number of Q quantity.

10 --> In that context, observe how consistently MG has ducked dealing with the X-metric and the Durston et al FSC metric; no prizes for guessing why. (Cf here for a newly put up excerpt -- it is hard slogging to wade through the whole paper to figure it out -- that sums up what this is about and what it does, as well as how it ties in with the other metrics.)

11 --> There has been one significant debate, which shows that there is a flaw in the Dembski derivation of his Chi-metric, but it does not make a dime's difference to what that metric ends up doing: specifying a threshold of 398 to about 500 bits depending on phi_S(T), beyond which you can be confident that the complex, specified object that has been reduced to an information metric in bits is designed. (VJT's CSI-lite simply rounds the threshold up to the top of that range.)

13 --> The X-metric, to eliminate the onward debates on how you can assign probabilities, simply makes the threshold so big that once you have a reasonable way to get a bit count, the config space specified by the threshold will swamp any prospect that the observed universe will be practically able to search out so big a space by random walks and trial and error. If you see a 747 or a paragraph in English, it is not a likely product of chance and mechanical necessity going on to hit-and-miss trial and error, which is what the infinite monkeys analysis is about.

14 --> That monkeys-at-keyboards random typing, followed by filtering the resulting text for function, is what evolutionary materialistic mechanisms boil down to: finding islands of function by random walks, then moving around in such islands and hill-climbing by rewarding accidentally hit-on improvements.
15 --> The problem is that the threshold at which the complexity overwhelms trial-and-error search based on random walks is quite low: 400 or 500 - 1,000 bits, i.e. 10^120 - 10^150 up to 10^301 possible configs. [The total number of quantum states of the atoms in our solar system since its origin -- mostly in the sun -- is about 10^102, scoping out available probability resources; lower than the bottom end of the range.]

16 --> The real issue is that FSCI as just described has never been empirically observed save by intelligence, and for reasons tied to the swamping-out infinite monkeys analysis, that is understandable.

17 --> There is no credible observational evidence -- notice yesterday the blunder-filled distraction on the Martian canali and the like -- and no reason to believe that chance and necessity can credibly give us FSCI on the gamut of our observed cosmos.

18 --> That is what all those red herring and burning strawman talking points making a smokescreen are designed to distract you from.

GEM of TKI

PS: IDCurious above is trying to cite a known, unfortunately massively dishonest materialist talking-point site, Talk Origins, as though it is an authority. Sorry, if that is your source, that is not a good sign. To see what is going on, see if you will find there a straightforward acknowledgement of the imposition of a priori materialism on origins science and its damaging implications. And, you will also see that one has to be very selective indeed in citing Wiki. Take it as an evo-mat-driven, too often ideologically biased 101 site [for all the good stuff that is there], and cross check.
kairosfocus
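A back-of-envelope check of the numbers in points 6 and 13-15 can be sketched as follows. This is a hedged illustration only: `ascii_bits` is a name invented here, and the 10^102 solar-system figure is taken from the text as stated, not computed:

```python
import math

# Point 6: roughly how many 7-bit ASCII characters fit in 1,000 bits?
def ascii_bits(text: str, bits_per_char: int = 7) -> int:
    """Count a string's capacity at 7 specified bits per ASCII character."""
    return len(text) * bits_per_char

print(round(1000 / 7))  # 143 characters, about 125 bytes

# Points 13-15: a 1,000-bit config space has 2^1000 states...
config_space = 2 ** 1000
print(math.floor(math.log10(config_space)))  # 301, i.e. about 10^301

# ...versus the ~10^102 solar-system states cited in point 15.
solar_states = 10 ** 102
print(config_space > solar_states ** 2)  # True: the gap is itself astronomical
```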
April 13, 2011 at 07:29 AM PDT
So are you saying that the Turing machine was designed by that equation you linked to?
Joseph
April 13, 2011 at 07:26 AM PDT
Design is a mechanism.
idcurious:
Really?
Yes, you seem to be ignorant about many things. Throwing deep time at issues is not scientific and your avoidance of that proves my point- that you are an intellectual coward.
You appear to be staggeringly ignorant about a great many things.
Said the ignorant evo.
Scientists have shown the bacterial flagellum can still have functionality with parts removed. They have provided plausible mechanisms for how it might have evolved.
They have not demonstrated that a bacterial flagellum can evolve via accumulating random mutations from a population that never had one. There is no way to even test the premise. Vague speculation based on their investigations of the evidence left behind, just as I said… Still no specifics. And still no investigating the designers.
No specifics?
Not on the designers.
Archeologists know what animals the people kept, what tools they used, what they ate, where they came from, what symbols they commonly used, when they lived, and plenty more.
How do they know that those people are the people who designed Stonehenge? As far as they know, those are just the people who lived in the area.
You are telling me that ID cannot say anything about the designer – just the design.
ID is about the design not the designer. There isn't anything in ID that prevents us from trying to figure out the designer. The theory of evolution is not about the origins of life- and ID is not about the designer.
And yet forensics and archaeology rely on the *evidence* that the “designers” left behind.
That is what I said. And that is what ID is all about- the detection and study of designs.
And yet you provide no way of distinguishing between “design from an unknown designer for unknown purposes” and “natural processes not yet understood”.
Strange how I provided exactly that. And evolution is so clever that there isn't any evidence of blind, undirected processes producing a functional multi-part system.
Joseph
April 13, 2011 at 07:22 AM PDT
Joseph @ 80
Design is a mechanism.
Really? Without anything at all about the designer? Did it set things in motion to use blind, undirected processes? Did it have a master plan? Was there one designer or hundreds? Where did the designer come from? ID can't tell us anything at all.
There isn’t any scientific literature that supports the claim of refuting Dr Behe.
You saying this doesn't make it so. Scientists have shown the bacterial flagellum can still have functionality with parts removed. They have provided plausible mechanisms for how it might have evolved. Can they show exactly how it happened? No - but as it happened over millions if not billions of years, that's not surprising.
Throwing deep time at issues is not scientific and your avoidance of that proves my point- that you are an intellectual coward.
You appear to be staggeringly ignorant about a great many things.
Show me a turing machine.
Case in point. Have you never heard of Google? DIY Turing Machine in Action.
It is a safe bet that I know more than you do.
It's becoming ever more apparent that there is just no point in talking to you
Vague speculation based on their investigations of the evidence left behind- just as I said... Still no specifics. And still no investigating the designers.
No specifics? Archeologists know what animals the people kept, what tools they used, what they ate, where they came from, what symbols they commonly used, when they lived, and plenty more. You are in a complete state of denial. If you are representative of the ID movement then lord help us all.
Except that is not what I said. You have serious mental issues and should seek help.
You seriously need to learn how to talk to people. You @ 38: "I said ID is about the design, not the designer. And I have also said that the only way to make any scientific determination about the designer, in the absence of direct observation or designer input, is by studying the design in question. And that is how it works in forensics and archaeology." You @ 41: "Archaeologists can't investigate their designers." You are telling me that ID cannot say anything about the designer - just the design. And yet forensics and archaeology rely on the *evidence* that the "designers" left behind. Absent that evidence, they can tell us nothing about why something happened, who did it, when it happened, or what their motives were.
And that is what ID is all about- the detection and study of designs.
And yet you provide no way of distinguishing between "design from an unknown designer for unknown purposes" and "natural processes not yet understood". This is the crux of Mathgirl's point, as far as I can see. One final point. The original post in this thread quotes Leslie Orgel. Here's another quote from him: "Evolution is cleverer than you are." His point? "Trial and error" strategies are often better than centralized intelligent human planning.
idcurious
April 13, 2011 at 06:50 AM PDT
My claim about ID and the designer is, and always has been, that in the absence of direct observation or designer input, the only possible way to make any scientific determination about the designer(s) or the specific process(es) used is by studying the design in question. And that is what ID is all about- the detection and study of designs.
Joseph
April 13, 2011 at 06:28 AM PDT
idcurious:
The mechanism. What does ID tell us about the mechanism?
Design is a mechanism.
Talkorigins is packed with references to the scientific literature.
There isn't any scientific literature that supports the claim of refuting Dr Behe.
Deep time certainly scientific – your avoidance of it isn’t my problem.
Throwing deep time at issues is not scientific and your avoidance of that proves my point- that you are an intellectual coward.
I note you never returned to the Turing Machine being a mathematically rigorous form of computer.
Show me a Turing machine.
You don’t seem to know much about computer science or archeology
It is a safe bet that I know more than you do. Archaeologists still don't know who designed and built Stonehenge, so how can they investigate the designer(s)?
From aboutstonehenge.info:
Vague speculation based on their investigations of the evidence left behind, just as I said. Still no specifics. And still no investigating the designers. It is like the difference between a geologist looking for geological processes for the formation of Stonehenge vs archaeologists investigating it from a design perspective.
But you’ve told me repeatedly that ID can’t tell us anything about the designer.
Except that is not what I said. You have serious mental issues and should seek help.
Joseph
April 13, 2011 at 06:17 AM PDT