Uncommon Descent Serving The Intelligent Design Community

Oldies but baddies — AF repeats NCSE’s eight challenges to ID (from ten years ago)


In a recent thread by Dr Sewell, AF raised again the Shallit-Elsberry list of eight challenges to design theory from a decade ago:

14 Alan Fox, April 15, 2013 at 12:56 am: Unlike Professor Hunt, Barry and Eric think design detection is well established. How about having a go at this list then? It’s been published for quite a while now.

I responded a few hours later:

______________

>>* 16 kairosfocus April 15, 2013 at 2:13 am

AF:

I note on points re your list of eight challenges.

This gets tiresomely repetitive, in a pattern of refusal to be answerable to adequate evidence, on the part of too many objectors to design theory:

>>1 Publish a mathematically rigorous definition of CSI>>

It has long since been shown, objections and censorship games notwithstanding, that reasonable quantitative metrics for FSCO/I and so for CSI, can be built and have been built. Indeed Durston et al have used such to provide a published list of values for 15 protein families.

>> 2 Provide real evidence for CSI claims >>

Blatant, all around you. But, a man convinced against his will is of the same opinion still.

Just to pick an example {–> from the list}, a phone number is obviously functionally specific (ever had a wrong number call?) and — within a reasonable context [though not beyond the 500 bit threshold] complex.

>> 3 Apply CSI to identify human agency where it is currently not known >>

FSCO/I is routinely intuitively used to identify artifacts of unknown cause, as IIRC, WmAD has pointed out regarding a room in the Smithsonian full of artifacts of unknown purpose but identified to be credibly human.

>> 4 Distinguish between chance and design in archaeoastronomy >>

The pattern of Nazca lines or the like, fit within the nodes-arcs pattern and collectively exhibit FSCO/I similar to other complex drawings. The 500 bit threshold is easily passed. If you want to contrast odds of a marker wandering randomly in a random walk, the difference will be trivial.
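The 500-bit threshold invoked here can be made concrete with a little arithmetic. A minimal sketch, assuming the commonly cited rough bounding figures of about 10^57 atoms in the solar system, roughly 10^45 state changes per second, and about 10^17 seconds of cosmic history (these figures are assumptions for illustration, not taken from the post itself):

```python
# Sketch: comparing 2^500 configurations against a generous upper bound
# on the number of search trials available in the solar system.
# Assumed figures: ~10^57 atoms, ~10^45 state changes/s, ~10^17 s.
configs = 2 ** 500
trials = 10**57 * 10**45 * 10**17  # 10^119 trials, a deliberate overestimate

print(f"2^500 ≈ {float(configs):.2e}")
print(f"trials ≈ {float(trials):.2e}")
print(f"searchable fraction ≈ {trials / configs:.2e}")
```

Even this overestimate of available trials samples only about one part in 10^31 of a 500-bit configuration space, which is the sense in which the threshold is treated as "beyond reach" in the post.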

In short this is a refusal to use simple common sense and good will.

>> 5 Apply CSI to archaeology >>

Just shown, this is a case of repeating much the same objection in much the same context as though drumbeat repetition is capable of establishing a claim by erasing the underlying fallacies. Being wrong over and over and over again, even in the usual anti-design echo chambers, does not convert long since corrected fallacy into cogent reasoning.

>> 6 Provide a more detailed account of CSI in biology
Produce a workbook of examples using the explanatory filter, applied to a progressive series of biological phenomena, including allelic substitution of a point mutation. >>

There are book-length cogent treatments of CSI as applied to biology [try Meyer’s SITC for starts {{ –> . . . I know, I know, this was published 2009, six years after the “challenge,” but AF is raising it in 2013, TEN years after the challenge}}], and that is not enough for the objectors, there will never be enough details.

Similarly, the objection starts within an island of existing function and demands a CSI based explanation of a phenomenon known to be well within the threshold of complexity. This is a strawman tactic.

>> 7 Use CSI to classify the complexity of animal communication As mentioned in Elsberry and Shallit (2003: 9), many birds exhibit complex songs. >>

What?

Is there any doubt that bird or whale songs or bee dances for that matter are long enough and complex enough to be FSCI? That they function in communication? That we did not directly observe the origin of the capacities for such but have reason to see that they are grounded in CSI in the genome and related regulatory information expressed in embryological development that wires the relevant nerve pathways?

So, are you demanding a direct observation of the origin of such, which we do not have access to and cannot reasonably expect, when we do have access to the fact that we have indications of FSCO/I and so raise the question as to what FSCO/I is a known reliable, strongly tested sign of as best causal explanation?

>> 8 Animal cognition
Apply CSI to resolve issues in animal cognition and language use by non-human animals. >>

Capacity for language, of course, is biologically rooted, genetically stamped and embryologically expressed. So it fits into the same set of issues addressed under 7 just now.

Repetitive use of fallacies does not suddenly convert them into sound arguments.

Nor, can one reasonably demand solutions to any number of known unresolved scientific problems as a condition of accepting something that is already well enough warranted on reasonable application of inductive principles. That is, it is well established on billions of test cases without significant exception, that FSCO/I is a reliable sign of design as cause.
____________

To suddenly demand that design thinkers must solve any number of unsolved scientific questions, on pain of having the evidence already in hand rejected, is a sign of selective hyperskepticism and a red herring tactic led away to a strawman misrepresentation, not a case of serious and cogent reasoning. >>

=========

(*And yes, AF, I am modifying French-style quote marks to account for the effect of the Less Than sign in an HTML-sensitive context. No need to go down that little convenient side-track again twice within a few days. Especially, as someone by your own testimony apparently living in a Francophone area.)

NB: BA77’s comment at 17 is worth a look also. Let’s clip in modified French style, that he may clip and run that readeth:

>> Mr. Fox, it seems the gist of your eight ‘questions’ from ten years ago is that you doubt whether or not information, as a distinct entity, is even in the cell? In fact I remember many arguments with neo-Darwinists on UD, not so many years back, who denied information, as a distinct entity, was even in the cell. Is this still your position? If so, may I enlighten you to this recent development???,,,

Harvard cracks DNA storage, crams 700 terabytes of data into a single gram – Sebastian Anthony – August 17, 2012
Excerpt: A bioengineer and geneticist at Harvard’s Wyss Institute have successfully stored 5.5 petabits of data — around 700 terabytes — in a single gram of DNA, smashing the previous DNA data density record by a thousand times.,,, Just think about it for a moment: One gram of DNA can store 700 terabytes of data. That’s 14,000 50-gigabyte Blu-ray discs… in a droplet of DNA that would fit on the tip of your pinky. To store the same kind of data on hard drives — the densest storage medium in use today — you’d need 233 3TB drives, weighing a total of 151 kilos. In Church and Kosuri’s case, they have successfully stored around 700 kilobytes of data in DNA — Church’s latest book, in fact — and proceeded to make 70 billion copies (which they claim, jokingly, makes it the best-selling book of all time!) totaling 44 petabytes of data stored.
http://www.extremetech.com/ext…..ingle-gram
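The figures quoted in that excerpt are easy to sanity-check. A quick sketch using decimal (SI) units, which is evidently what the article assumes:

```python
# Sanity-checking the DNA-storage figures quoted above (decimal SI units).
bits = 5.5e15                # 5.5 petabits
tb = bits / 8 / 1e12         # convert to terabytes
print(tb)                    # 687.5, which the article rounds to "700 TB"

print(700e12 / 50e9)         # 700 TB in 50 GB Blu-ray discs -> 14000.0
print(700 / 3)               # 700 TB in 3 TB hard drives -> roughly 233
```

The numbers hang together: 5.5 petabits is about 687 TB, rounded in the article to 700 TB, from which the Blu-ray and hard-drive comparisons follow directly.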

That DNA stores information is pretty much the mainstream position now Mr. Fox,,,

Venter: Life Is Robotic Software – July 15, 2012
Excerpt: “All living cells that we know of on this planet are ‘DNA software’-driven biological machines comprised of hundreds of thousands of protein robots, coded for by the DNA, that carry out precise functions,” said (Craig) Venter.
http://crev.info/2012/07/life-is-robotic-software/

That information is a distinct entity in the cell is pretty uncontroversial Mr. Fox, so why the list of eight questions? The only question that really matters is can purely material processes generate these extreme levels of functional information? Perhaps you would like to be the first Darwinist on UD to produce evidence that material processes can produce enough functional information for say the self assembly of a novel molecular machine?>>

The much underestimated and too often derided BA77  continues at 18:

>> Mr. Fox, as to the fact that a cell contains functional information, I would like to, since Dr. Sewell approaches this from the thermodynamic perspective, point out something that gets missed in the definition of functional information in the specific sequences of DNA, RNAs, and proteins. There is a deep connection between entropy and information,,

“Is there a real connection between entropy in physics and the entropy of information? ….The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental…”
Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin]

“Bertalanffy (1968) called the relation between irreversible thermodynamics and information theory one of the most fundamental unsolved problems in biology.”
Charles J. Smith – Biosystems, Vol.1, p259.

Demonic device converts information to energy – 2010
Excerpt: “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski.
http://www.scientificamerican……rts-inform

And what is particularly interesting about this deep connection between information and entropy is that,,,

“Gain in entropy always means loss of information, and nothing more.”
Gilbert Newton Lewis – preeminent Chemist of the first half of last century

And yet despite the fact that entropic processes tend to degrade information, it is found that the thermodynamic disequilibrium of a ‘simple’ bacteria and the environment is,,,

“a one-celled bacterium, e. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information in science jargon, this would be the same as 10^12 bits of information. In comparison, the total writings from classical Greek Civilization is only 10^9 bits, and the largest libraries in the world – The British Museum, Oxford Bodleian Library, New York Public Library, Harvard Widener Library, and the Moscow Lenin Library – have about 10 million volumes or 10^12 bits.” – R. C. Wysong
http://books.google.com/books?…..;lpg=PA112

Molecular Biophysics – Information theory. Relation between information and entropy: – Setlow-Pollard, Ed. Addison Wesley
Excerpt: Linschitz gave the figure 9.3 x 10^12 cal/deg or 9.3 x 10^12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz’ deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures.
http://www.astroscu.unam.mx/~a…..ecular.htm
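The conversion in the Setlow-Pollard excerpt rests on the relation H = S/(k ln 2): each k ln 2 of thermodynamic entropy corresponds to one bit. A minimal sketch of just that relation (Boltzmann's constant in SI units; the excerpt's own inputs and unit conventions are not reproduced here):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(s_joules_per_kelvin):
    """Re-express a thermodynamic entropy as information: H = S / (k ln 2)."""
    return s_joules_per_kelvin / (k * math.log(2))

# One k*ln(2) of entropy corresponds to exactly one bit
# (the Landauer correspondence).
print(entropy_to_bits(k * math.log(2)))  # -> 1.0
```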

Moreover we now have good empirics to believe that information itself is what is constraining the cell to be so far out of thermodynamic equilibrium:

Information and entropy – top-down or bottom-up development in living systems? A.C. McINTOSH
Excerpt: It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate.
http://journals.witpress.com/paperinfo.asp?pid=420

Does DNA Have Telepathic Properties?-A Galaxy Insight – 2009
Excerpt: DNA has been found to have a bizarre ability to put itself together, even at a distance, when according to known science it shouldn’t be able to.,,, The recognition of similar sequences in DNA’s chemical subunits, occurs in a way unrecognized by science. There is no known reason why the DNA is able to combine the way it does, and from a current theoretical standpoint this feat should be chemically impossible.
http://www.dailygalaxy.com/my_…..ave-t.html

In fact, Encoded ‘classical’ information such as what Dembski and Marks demonstrated the conservation of, and such as what we find encoded in computer programs, and yes, as we find encoded in DNA, is found to be a subset of ‘transcendent’ (beyond space and time) quantum information/entanglement by the following method:,,,

Quantum knowledge cools computers: New understanding of entropy – June 2011
Excerpt: No heat, even a cooling effect;
In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.”
http://www.sciencedaily.com/re…..134300.htm

And yet, despite all this, we have ZERO evidence that material processes can generate even trivial amounts of classical information, much less massive amounts of transcendent ‘non-local’ quantum information/entanglement,,,

Stephen Meyer – The Scientific Basis Of Intelligent Design
https://vimeo.com/32148403

Stephen Meyer – “The central argument of my book is that intelligent design—the activity of a conscious and rational deliberative agent—best explains the origin of the information necessary to produce the first living cell. I argue this because of two things that we know from our uniform and repeated experience, which following Charles Darwin I take to be the basis of all scientific reasoning about the past. First, intelligent agents have demonstrated the capacity to produce large amounts of functionally specified information (especially in a digital form). Second, no undirected chemical process has demonstrated this power. Hence, intelligent design provides the best—most causally adequate—explanation for the origin of the information necessary to produce the first life from simpler non-living chemicals. In other words, intelligent design is the only explanation that cites a cause known to have the capacity to produce the key effect in question.”

Verse and Music:

John 1:1-4
In the beginning was the Word, and the Word was with God, and the Word was God. He was with God in the beginning. Through him all things were made; without him nothing was made that has been made. In him was life, and that life was the light of all mankind.

The Afters – Every Good Thing – Lyric Video
http://www.youtube.com/watch?v=FY2ycrpbOlw >>

Joe puts in a good knock at 25:

>>Earth to Alan Fox,

Neither you, Shallit, Elsberry nor the NCSE need concern yourselves with CSI. That is because all of you can render CSI moot just by stepping up and demonstrating that blind and undirected processes can account for what we call CSI.

It is that simple- demonstrate blind and undirected processes can produce CSI and our argument wrt CSI, falls.

However seeing that you all are nothing but cowards, you won’t do that because that means actually having to make a positive case. And everyone in the world knows that you cannot do such a thing.

The point is that your misguided attacks on ID are NOT going to provide positive evidence for your position. And only positive evidence for blind and undirected processes producing CSI is going to refute our arguments. >>

I picked back up from BA77 at 26:

>> BA77: The connexion between entropy and information is indeed important. I like the expression of it that runs like: the entropy of a body is the average missing info to specify the exact microstate of its constituent particles, that exists if what one knows about the system is the thermodynamic macrostate defined by its macro-level thermodynamic properties. This of course implies the degree of freedom or lack of constraint on the particles, and links to the situation where a rise in entropy is often linked to a rise in disorder, a degradation of availability of energy.  >>

_______________
And, dear Reader, what do you think AF’s answer is, several days later on this the 19th of April in this, The Year of Our Risen Lord, “dos mil trece” [= 2013]?

Dead silence, and heading off to other threads where he thought he could score debate points.

(In short, he raised dismissive talking points and stayed not for an answer. Sad.)

Let us hope that headlining the above will at least allow others who need and want such, to find a reasonable summary answer to the NCSE talking points. END

PS: Dembski and Luskin have responded at one time or another to the S-E team, try here and here (part II here; complete with AF popping up here at no 3).

Comments
Alan Fox:
Or are you talking about Dembski’s formula that Dr Liddle eviscerates at TSZ?
LoL! Lizzie hasn't "eviscerated" anything. Like you, she doesn't even appear to understand science.
Joe
April 22, 2013, 05:32 AM PDT
χ = –log2[10^120 · φ_S(T) · P(T|H)] Or are you talking about Dembski's formula that Dr Liddle eviscerates at TSZ?
Alan Fox
April 22, 2013, 05:25 AM PDT
AF: If I had thought you a naive newbie, I would be patient. But I know better. You full well know that the Chi_500 metric incorporates a parameter, S, that reflects the objective evaluation of functional specificity, which can be identified per the approach of Dembski in NFL, 141 - 8, namely a separate, independent specifying description that puts you in a zone T in W where T is such that on relevant accessible resources, blind chance and mechanical necessity are maximally unlikely to access T, for the reasons of the gold bead in the sack of beans or the needle in the haystack or the large number of monkeys typing at random. All of this, you know or should know. All of this you routinely use in a world of just such functionally specific digital information. KFkairosfocus
April 22, 2013, 05:24 AM PDT
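The Chi_500 metric described in this comment (and written out later in the thread as Chi_500 = Ip*S - 500) is simple enough to sketch. The following is an illustration only, assuming 7 bits per ASCII character and treating the specificity flag S as a judgment supplied by the user, as the thread itself does:

```python
def chi_500(text, functionally_specific):
    """Simplified Chi_500 metric as described in the thread:
    Chi_500 = Ip * S - 500, with Ip in bits (7 bits/ASCII char assumed)
    and S = 1 if the string is judged functionally specific, else 0."""
    ip = 7 * len(text)
    s = 1 if functionally_specific else 0
    return ip * s - 500

# The thread's 202-character dictionary-definition example:
sample = "x" * 202            # stand-in string of the same length
print(chi_500(sample, True))  # 7*202 - 500 = 914 bits past the threshold
```

Note that all the contested work is in the S flag: the code computes a string length, while the judgment of functional specificity is an input, not an output.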
Alan Fox:
The result, if it exceeds a threshold of 500 bits, is “designed”.
Not quite. If the 500-bit threshold is met, then CSI is present.
How does your calculation show the difference between meaningful text and jumbled letters?
It doesn't. It isn't supposed to. Meaning/function is an OBSERVATION. And yes, the comment sections of my blog demonstrate the vacuity of the evos' position. Nice call, Alan.
Joe
April 22, 2013, 05:22 AM PDT
You full well know there are good and public, published answers to your pretended points of objection.
I know no such thing. If I am mistaken, then it should be no problem to direct me to the source of a genuine calculation of the CSI of something... anything. Or do you endorse Joe's risible response?
Alan Fox
April 22, 2013, 05:17 AM PDT
A simple character count reveals 202 characters, which translates into 1010 bits of information/specified complexity. 1010 bits > 500, therefore CSI is present.
Let me see if I have this right. For your example of text, you work out the probability of a particular letter (1 in 26) and then multiply up for the number of letters in a piece of text. The result, if it exceeds a threshold of 500 bits, is "designed". How does your calculation show the difference between meaningful text and jumbled letters? BTW the comment section in those blogposts of yours should be recommended to everyone here who'd like to get a better understanding of human communication. ;)
Alan Fox
April 22, 2013, 05:14 AM PDT
AF: You have been around UD for at least eight years. You full well know there are good and public, published answers to your pretended points of objection. It is public record that both Dembski and Durston et al have published metrics, the latter building on work by Szostak which in turn is in the context of Shannon's H-metric of average info per symbol in a comms system. Abel has published a warrant for a set of plausibility bounds, from earth to solar system to observed cosmos, consistent with what has been in use for years. That is, 500 bits is a very good solar system threshold for what is well beyond the credible reach of blind chance and mechanical necessity.

All this, if you really don't know, it is because you have willfully shut your eyes to what you full well should have long since noted and acknowledged, and instead insist on continued misrepresentation. And yes, at this stage I fully intend what that implies; this is now going to character, not merely an issue of disagreement, as the facts are patent, easily seen and have been repeatedly presented to you, but willfully ignored the better to sustain a misrepresentation in hopes of its being perceived as truth.

In any case, the actual simplified Dembski 2005 metric is easy to understand and to see its warrant, save to those who refuse to do so, on the pretence that any objection they can repeat drumbeat style is fatal. The best answer to this is to ask of you: show us a case where, on actual observation, something that Chi_500 = Ip*S - 500 passes as credibly designed is known to be the product of blind chance and mechanical necessity. To this date, after years on this and other ways to measure much the same that come down to being equivalent in effect, all that comes up is a string of hopeful cases that on closer inspection support the point that FSCO/I is reliably observed to be the product of design.

This is multiplied by literally billions of cases in point, growing with every post in this thread including yours, on how intelligence is the known source of such FSCO/I. In short, the truth is that you are playing a game of pretending that a burden of warrant that has long since been met to any reasonable standard is not met. Obviously, because of the implications that are so patently unwelcome to you and your ilk.

Furthermore, what is increasingly evident -- has been clear for months now -- is that you are simply here to spout dismissive talking points and ignore evidence that does not suit your agendas, not to seriously or fairly engage matters on the merits. Duly noted, to your utter discredit. For shame! KF

PS: Onlookers who may not know the actual facts, for just a start, cf. here on.
kairosfocus
April 22, 2013, 05:09 AM PDT
Joe posted this:
It is and we have told you how to measure it. OTOH you cannot provide any support for the claims of your position- not even any methodology.
Joe is claiming that CSI is measurable. Could he please provide a published example of the CSI value of an actual object? Note: I am asking for a published example of the CSI of an actual object, not an explanation of how to do it. Just the facts, Ma'am.
timothya
April 22, 2013, 05:00 AM PDT
Longer version:
The causal tie between an artifact and its intended character — or, strictly speaking, between an artifact and its author’s productive intention — is constituted by an author’s actions, that is, by his work on the object.- Artifact
When discussing information, some people want to know how much information something contains. If it is something straightforward such as a definition, we can count the number of bits in that definition to find out how much information it contains. For example:
aardvark: a large burrowing nocturnal mammal (Orycteropus afer) of sub-Saharan Africa that has a long snout, extensible tongue, powerful claws, large ears, and heavy tail and feeds especially on termites and ants
A simple character count reveals 202 characters, which translates into 1010 bits of information/specified complexity.

Now what do we do when all we have is an object? One way of figuring out how much information it contains is to figure out how (the simplest way) to make it. Then you write down the procedure without wasting words/characters and count those bits. The point is that you have to capture the actions required and translate that into bits. That is, if you want to use CSI. However, by doing all of that you have already determined the thing was designed. Now you are just trying to determine how much work was involved. But anyway, that will give you an idea of the minimal information it contains: data collection and compression (six sigma DMAIC: define, measure, analyze, improve, control).

CSI is a threshold, meaning you don't need an exact number. And it is a threshold that nature, operating freely, has never been observed to come close to. Once CSI = yes, you know it was designed.

On Shannon Information and measuring biological information:
The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.- Warren Weaver, one of Shannon's collaborators
Is what Weaver said so difficult to understand? Kolmogorov complexity deals with, well, complexity. From wikipedia:
Algorithmic information theory principally studies complexity measures on strings (or other data structures).
Nothing about meaning, content, functionality, prescription. IOW nothing that Information Technology cares deeply about, namely functional, meaningful, and useful information. Not only Information Technology but the whole world depends on Information Technology type of information, ie the type of information Intelligent Design is concerned with. And both Creationists and IDists make it clear, painfully clear, that when we are discussing "information" we are discussing that type of information. And without even blinking an eye, the anti-IDists always, and without fail, bring up the meaningless when trying to refute the meaningful. “Look there is nature producing Shannon Information, you lose!”- ho-hum. Moving on-
Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. In virtue of their function, these systems embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the same sense required by the complexity-specification criterion (see sections 1.3 and 2.5). The specification of organisms can be cashed out in any number of ways. Arno Wouters cashes it out globally in terms of the viability of whole organisms. Michael Behe cashes it out in terms of minimal function of biochemical systems.- Wm. Dembski page 148 of NFL
In the preceding and proceeding paragraphs William Dembski makes it clear that biological specification is CSI- complex specified information. In the paper "The origin of biological information and the higher taxonomic categories", Stephen C. Meyer wrote:
Dembski (2002) has used the term “complex specified information” (CSI) as a synonym for “specified complexity” to help distinguish functional biological information from mere Shannon information--that is, specified complexity from mere complexity. This review will use this term as well.
In order to be a candidate for natural selection a system must have minimal function: the ability to accomplish a task in physically realistic circumstances.- M. Behe page 45 of “Darwin’s Black Box”
With that said, to measure biological information, ie biological specification, all you have to do is count the coding nucleotides of the genes involved for that functioning system, then multiply by 2 (four possible nucleotides = 2^2) and then factor in the variation tolerance: from Kirk K. Durston, David K. Y. Chiu, David L. Abel, Jack T. Trevors, “Measuring the functional sequence complexity of proteins,” Theoretical Biology and Medical Modelling, Vol. 4:47 (2007):
[N]either RSC [Random Sequence Complexity] nor OSC [Ordered Sequence Complexity], or any combination of the two, is sufficient to describe the functional complexity observed in living organisms, for neither includes the additional dimension of functionality, which is essential for life. FSC [Functional Sequence Complexity] includes the dimension of functionality. Szostak argued that neither Shannon’s original measure of uncertainty nor the measure of algorithmic complexity are sufficient. Shannon's classical information theory does not consider the meaning, or function, of a message. Algorithmic complexity fails to account for the observation that “different molecular structures may be functionally equivalent.” For this reason, Szostak suggested that a new measure of information—functional information—is required.
Here is a formal way of measuring functional information: Robert M. Hazen, Patrick L. Griffin, James M. Carothers, and Jack W. Szostak, "Functional information and the emergence of biocomplexity," Proceedings of the National Academy of Sciences, USA, Vol. 104:8574–8581 (May 15, 2007). See also: Jack W. Szostak, “Molecular messages,” Nature, Vol. 423:689 (June 12, 2003). original posts can be found here, here and hereJoe
April 22, 2013, 04:52 AM PDT
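The Hazen-Szostak functional-information measure cited at the end of the comment above has the form I(Ex) = -log2 F(Ex), where F(Ex) is the fraction of all sequences whose degree of function meets or exceeds the threshold Ex. A toy sketch of the measure (the sequence scores below are invented for illustration):

```python
import math

def functional_information(scores, threshold):
    """Hazen/Szostak: I(Ex) = -log2(F(Ex)), where F(Ex) is the fraction
    of sequences whose function score is at least the threshold Ex."""
    f = sum(s >= threshold for s in scores) / len(scores)
    return float("inf") if f == 0 else -math.log2(f)

# Toy ensemble: 16 candidate sequences, 2 of which meet the threshold.
scores = [0.1] * 14 + [0.9, 0.95]
print(functional_information(scores, 0.5))  # -log2(2/16) = 3.0
```

The practical difficulty, which the surrounding debate turns on, is estimating F(Ex) for real sequence spaces, where exhaustive scoring is impossible.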
Alan Fox:
Forgive me for being dubious, Joe, but, if this were the case, this would establish “Intelligent Design” as having a possible theory which would, I would have thought, been shouted from the rooftops.
Seeing that your head is [snip, language], I doubt that you would hear it anyway.
If someone can indeed calculate the CSI of something …anything… why don’t they want to demonstrate the procedure?
We have. OTOH your position has yet to demonstrate anything beyond bald assertions.
If there is some calculation that I have overlooked, by all means draw it to my attention.
I have. Again, your willful ignorance means nothing here. But anyway- When discussing information, some people want to know how much information something contains. If it is something straightforward such as a definition, we can count the number of bits in that definition to find out how much information it contains. For example:
aardvark: a large burrowing nocturnal mammal (Orycteropus afer) of sub-Saharan Africa that has a long snout, extensible tongue, powerful claws, large ears, and heavy tail and feeds especially on termites and ants
A simple character count reveals 202 characters, which translates into 1010 bits of information/specified complexity. 1010 bits > 500, therefore CSI is present. Now wrt biology each nucleotide has 2 bits. _______ CALC: 7 bits per ASCII character (not 5 per Baudot character) * 202 characters = 1414 bits. Functionally specific, so S = 1. Chi_500 = 1414 * 1 - 500 = 914 bits beyond the solar system threshold. Designed. KF
Joe
April 22, 2013, 04:49 AM PDT
It is and we have told you how to measure it.
Forgive me for being dubious, Joe, but, if this were the case, this would establish "Intelligent Design" as having a possible theory which would, I would have thought, been shouted from the rooftops. If someone can indeed calculate the CSI of something ...anything... why don't they want to demonstrate the procedure? If there is some calculation that I have overlooked, by all means draw it to my attention. "It's been done" as a bald statement without reference to what, when and where something has been done is insufficient (to put it mildly).
Alan Fox
April 22, 2013 at 04:43 AM PDT
Alan Fox:
The point at issue is whether CSI is a measurable quantity.
It is, and we have told you how to measure it. OTOH you cannot provide any support for the claims of your position, not even any methodology.
Joe
April 22, 2013 at 04:29 AM PDT
Axel quoting Joe
‘Specified information is just Shannon information with meaning or function. IOW it is information in the normal use of the word. I don’t understand your issue with it. You use it every day.’
The point at issue is whether CSI is a measurable quantity. If it were, I would have thought someone could explain how to measure the CSI of something, anything. Of course, then, I would be inclined to ask what you can then show from the result of such a calculation. But, first things first.
Alan Fox
April 22, 2013 at 04:23 AM PDT
@ Eric Anderson: In case Timothy A doesn't find time to respond, Dr Liddle has posted an answer at TSZ
Eric asks timothya:
timothya: Two questions: 1. Are you suggesting that we have to know the exact, precise, unequivocal probability of event X occurring by purely natural processes before we can draw an inference that event X did not occur by purely natural processes?
No.
2. On what basis do forensics experts and archaeologists draw an inference to design? Must they first lay out a precise formulation of all possible probabilities of the item in question having been produced by purely natural processes?
No (assuming that by “natural” you mean “unintended”, or something similar). Look: Dembski proposed a formula – a metric – for inferring design, based on Fisherian hypothesis testing, that involves determining that the candidate pattern is in the rejection region of a probability distribution under the null of non-design. So, clearly, to calculate that metric we need the probability distribution under the null. Dembski provides no way of calculating that distribution that does not involve first knowing what non-design processes can do. And if we knew that, he wouldn’t need his calculation. So his entire argument is circular. That doesn’t mean that inferring design from a pattern isn’t possible; what it does mean is that CSI, and its relatives, are useless for doing so.
There are several other comments that might be of interest. Just to mention the "needle-in-a-haystack" analogy beloved by Kairosfocus: unless you know there is only one needle (and all evidence points to the contrary) and you can work out the size of the haystack, the analogy is inapt.
Alan Fox
April 22, 2013 at 04:17 AM PDT
Axel posted this:
By crystallising the animus against Philip of his ‘bottom-feeder’ critics, KF, you are inadvertently dignifying their inchoate howling, giving aid and comfort to ‘our friends’ across the epistemological chasm – the enemies of reason – in however nugatory a measure and elliptical manner. Pardon my presumption, as a 22 carat ‘dogsbody’ nescientist, in bringing this to your notice.
Pardon me, but what is a "nescientist"?
timothya
April 22, 2013 at 03:41 AM PDT
Kairosfocus posted this, with my answers interpolated in his approved manner:
F/N: TA, do you understand that far tails of bell or similar distributions are but one rather peculiar example of the broader case of separately and “simply” describable narrow — i.e. unrepresentative of the typical members of W — zones of interest T in large config spaces W? [Where, as the W standing in for omega hints, the antecedents of interest lie in statistical thermodynamics and the warrant for results, including the statistical underpinnings of the second law of thermodynamics and why entropy of a system tends to increase. Cf. here on.]
No, I don't understand anything in that paragraph. Particularly obscure is any connection between standard statistical sampling from biological populations and the statistical analysis of physical entropy.
1: Do you understand the generality of sampling theory results on capturing gross and typical patterns of a population without needing to know details of its generating factors, forces and resulting mathematical distributions?
What is "the generality of sampling theory"? I do know about sampling theory and I know it works well if you already know the form of the population distribution being sampled.
2: As in, do you understand that with a large sack of beans that you have in hand, a couple of hands full from deep in the sack can be utterly telling?
Or not utterly telling, depending on the frequency distribution of the beans with respect to the attribute you are investigating. Can you specify what hypothesis you are investigating about the nature of the beans?
3: But, if there is a single gold bead or a lost wedding ring in the sack, you have a challenge to find it that way?
In that case it isn't a bag of beans. Your null hypothesis has been invalidated. Try again. For example: what is the likelihood of finding a wedding ring in a sack of beans sampled at random from a bean factory?
4: What then happens when the sack is so large that you cannot pour it out to search more than a tiny, tiny fraction, regardless of the number of gold beads, once they are sufficiently rare that feasible samples are utterly unlikely to pick them up?
In that case you have invalidated your sample design again (you can't "adequately" sample an infinitely large bag of beans). Please, please, please get your statistical definitions straight, and then stick to them.
5: Do you now see the significance of the base definition being offered by WmAD in NFL, and of the developments over the past decade or so? Namely, why it is that random search, in a world of search for search [S4S], becomes the typical search [which BTW is a good example of why the Bernoulli-Laplace principle of indifference is so often a useful default . . . ], and why if a search is outperforming it, that is a sign that it was tuned to the distribution in some way, indicative of intelligence and injection of active information, which can be measured by the degree of over-performance relative to the expected result of the typical search?
Invalid argument if you are addressing biological processes (biological processes do not involve cosmological-scale searches; they "search" the specific genomic options offered by mutation).
6: That people may object to something does not mean that the thing objected to is thereby falsified — the error of selective hyperskepticism. And,
Agreed. Your arguments are falsified by evidence, not because I happen to disagree with them.
7: even where there are problems, we can learn to keep the baby and discard the bath water or even the dirty diaper.
Agreed. The bath water of cosmic-scale "blind search" can safely be discarded in relation to the behaviour of biological systems.
8: As has been shown, the basic concept of CSI is sound, the distinctions between OSC, RSC and FSC are objective, and the matter can be reduced to metric models with a track record of performance: FSCO/I is consistently seen to come from just one source, intelligence.
You are simply restating the same circular argument. To calculate the CSI/(insert any other preferred version) of an object, you have to know in advance what designed objects look like. How do we know such a thing? Why, because we know how humans design stuff! To borrow your own terminology, you are smuggling in the answer via your premises.
9: Where chance and necessity, on reasonable models, will fall below the threshold of reasonable observability, something that is being abundantly confirmed by the experience of billions of test cases all around us.
Wrong. We do not have "billions of test cases". There is only one known cause of intentional design: humans. With a sample of one, you can freely construct a regression curve of any shape that floats your boat. Well, actually, we might reasonably extend a capability of design to a range of species related to humans: primates, beavers, birds of paradise, bees, lichens. Though whether our cousins in the rest of nature are exercising intentional design is a good question.
timothya
April 22, 2013 at 03:34 AM PDT
'Specified information is just Shannon information with meaning or function. IOW it is information in the normal use of the word. I don't understand your issue with it. You use it every day.'
Neither does Alan, Joe. He'd have to go back to primary school, and start learning from scratch again. He's got lost. What should be reason is a maze to him.
Axel
April 22, 2013 at 03:26 AM PDT
'The much underestimated and too often derided BA77...'
By crystallising the animus against Philip of his 'bottom-feeder' critics, KF, you are inadvertently dignifying their inchoate howling, giving aid and comfort to 'our friends' across the epistemological chasm - the enemies of reason - in however nugatory a measure and elliptical manner. Pardon my presumption, as a 22 carat 'dogsbody' nescientist, in bringing this to your notice.
Axel
April 22, 2013 at 03:19 AM PDT
F/N: TA, do you understand that far tails of bell or similar distributions are but one rather peculiar example of the broader case of separately and "simply" describable narrow -- i.e. unrepresentative of the typical members of W -- zones of interest T in large config spaces W? [Where, as the W standing in for omega hints, the antecedents of interest lie in statistical thermodynamics and the warrant for results, including the statistical underpinnings of the second law of thermodynamics and why entropy of a system tends to increase. Cf. here on.]
1: Do you understand the generality of sampling theory results on capturing gross and typical patterns of a population without needing to know details of its generating factors, forces and resulting mathematical distributions?
2: As in, do you understand that with a large sack of beans that you have in hand, a couple of hands full from deep in the sack can be utterly telling?
3: But, if there is a single gold bead or a lost wedding ring in the sack, you have a challenge to find it that way?
4: What then happens when the sack is so large that you cannot pour it out to search more than a tiny, tiny fraction, regardless of the number of gold beads, once they are sufficiently rare that feasible samples are utterly unlikely to pick them up?
5: Do you now see the significance of the base definition being offered by WmAD in NFL, and of the developments over the past decade or so? Namely, why it is that random search, in a world of search for search [S4S], becomes the typical search [which BTW is a good example of why the Bernoulli-Laplace principle of indifference is so often a useful default . . . ], and why, if a search is outperforming it, that is a sign that it was tuned to the distribution in some way, indicative of intelligence and injection of active information, which can be measured by the degree of over-performance relative to the expected result of the typical search?
6: That people may object to something does not mean that the thing objected to is thereby falsified -- the error of selective hyperskepticism. And,
7: even where there are problems, we can learn to keep the baby and discard the bath water or even the dirty diaper.
8: As has been shown, the basic concept of CSI is sound, the distinctions between OSC, RSC and FSC are objective, and the matter can be reduced to metric models with a track record of performance: FSCO/I is consistently seen to come from just one source, intelligence.
9: Where chance and necessity, on reasonable models, will fall below the threshold of reasonable observability, something that is being abundantly confirmed by the experience of billions of test cases all around us. KF
kairosfocus
April 22, 2013 at 01:27 AM PDT
Eric Anderson posted two questions:
1. Are you suggesting that we have to know the exact, precise, unequivocal probability of event X occurring by purely natural processes before we can draw an inference that event X did not occur by purely natural processes?
2. On what basis do forensics experts and archaeologists draw an inference to design? Must they first lay out a precise formulation of all possible probabilities of the item in question having been produced by purely natural processes?
1. No. Though you do need to have a clear idea of how natural processes work before rejecting the no-design null hypothesis. That is to say, you should have a well-specified (and well-populated) frequency distribution for your attribute before tossing the null overboard.
2. On the basis of frequency distributions of designed events observed to occur in the past. In other words, the real work involved is constructing a frequency distribution for the attribute of interest against which any new event can be compared. It is dangerous, as well as unscientific, to simply assume what that frequency distribution looks like until the legwork has been undertaken. Constructing frequency distributions for attributes at the genomic, biochemical, physiological and population levels is what biologists, archaeologists, forensic scientists and insurance actuaries generally do via observations and experiment.
timothya
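The procedure in point 1 (build a frequency distribution for the attribute under the null, then ask whether a new observation falls in its tail) can be sketched generically. The coin-flip "attribute" and all numbers below are invented purely for illustration:

```python
import random

random.seed(1)

# Null model: attribute values produced by an undirected process.
# Here the "attribute" is just the number of heads in 20 coin flips,
# a stand-in for whatever measurable property is under investigation.
null_sample = [sum(random.randint(0, 1) for _ in range(20))
               for _ in range(100_000)]

def tail_probability(observed: float, sample: list) -> float:
    """Fraction of null outcomes at least as extreme as the observation."""
    return sum(1 for x in sample if x >= observed) / len(sample)

# An observation of 19+ heads out of 20 is rare under this null
# (exact binomial tail: 21 / 2**20, about 2e-05).
p = tail_probability(19, null_sample)
print(p)
```

The point the comment makes is that the empirical distribution (`null_sample` here) has to be constructed from observation or experiment before the tail test means anything; assuming its shape in advance is what the commenter objects to.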
April 22, 2013 at 12:29 AM PDT
EA @22:
We were dropping our son off at a big all-day hacking event for teens...
My bank account just went to zero dollars. Could you talk to your son for me? Thanks
Mung
April 21, 2013 at 07:17 PM PDT
CSI wrt biology = biological specification. Biological specification always refers to function. That is more proof that Lizzie did NOT create CSI with her algorithm: no function was created or observed (and it starts with the very function that needs to be explained -> reproduction).
Joe
April 21, 2013 at 04:11 PM PDT
EA: There is no need for exact probability calcs or estimates. All we need are circumstances that sampling theory will let us see are of the needle-in-haystack variety, where it is simple to see that, by its nature, the bulk of possible configs in a relevant situation will be gibberish. KF
kairosfocus
April 21, 2013 at 03:25 PM PDT
PS: Flint and other TSZ denizens, I did not invent the short, sharp little word or the meaning: speaking with disregard to the truth, hoping to profit by the misrepresentation or outright untruth being perceived as true. Nor did I create the fact that you, Flint, spoke in just that way by so willfully misrepresenting design theory as Young Earth Creationism, with the onward intent to infer or imply -- the subtext is quite obvious -- an attempted imposition of an imagined right-wing theocratic tyranny on science and culture; which is a slander, even of young earth creationists as a whole.
The very concept that there is a 500 bit FSCI threshold is in the context of the usual models of solar system formation, and much of the context of cosmological fine tuning is that of a cosmos of some 13.7 BY in light of a big bang cosmological model. For years, there have been the WACs to correct those willing to pay attention, right here at UD, so there is no excuse at this stage for someone who resorts to such slander tactics. And those who harbour or try to defend such are enabling and upholding wrong. Where also, if you want to try a moral equivalency game, observe that when Mr Arrington learned that he had been in error he corrected it and apologised.
(There seems to be some sort of technical problem with the UD mobile site, which I did not even know existed previously. [I thought the purpose of those nice fat tablets was to be able to access web sites normally? Or, is this for those who want cell phones to play at being miniature computers? My advice is go get a tablet.])
After years of smears, outright vicious lying and web stalking, outing tactics, cruel mockery and threats against my family, I do not find any widespread willingness to acknowledge wrong and apologise, much less retract and seek to make amends for injury and insult done. Instead, I find a tide of Alinskyite amoral nihilism that takes malicious glee in creating poison and polarisation, and read from that, that such are a menace to our civilisation and a sign of what lies ahead if the madness is not stopped short of the cliff. KF
kairosfocus
April 21, 2013 at 03:21 PM PDT
TA: Do tell us, specifically and on evidence, the relevant flaw. Let's start from a simple contrast, a la Abel & Trevors and harking back through Thaxton et al to Orgel and Wicken:
1: OSC/Order: ftftftftftftftft . . .
2: RSC/Randomness: ygd46eyvyfduhudcuftdc . . . .
3: FSC/Organisation: this is a sequence of characters in English . . .
Do you wish to imply that there is no material, describable distinction across the three? Kindly, explicitly justify such a notion. (The sound you hear is laughter on the part of onlookers. BTW, as through nodes and arcs net lists, we can reduce any describable structure to strings, so the above is WLOG.) Next, let us examine Dembski's basic definition, from NFL:
p. 148: “The great myth of contemporary evolutionary biology is that the information needed to explain complex biological structures can be purchased without intelligence. My aim throughout this book is to dispel that myth . . . . Eigen and his colleagues must have something else in mind besides information simpliciter when they describe the origin of information as the central problem of biology. I submit that what they have in mind is specified complexity [[cf. here below], or what equivalently we have been calling in this Chapter Complex Specified information or CSI . . . . Biological specification always refers to function . . . In virtue of their function [[a living organism's subsystems] embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the sense required by the complexity-specificity criterion . . . the specification can be cashed out in any number of ways [[through observing the requisites of functional organisation within the cell, or in organs and tissues or at the level of the organism as a whole] . . .”
p. 144: [[Specified complexity can be defined:] “. . . since a universal probability bound of 1 [[chance] in 10^150 corresponds to a universal complexity bound of 500 bits of information, [[the cluster] (T, E) constitutes CSI because T [[ effectively the target hot zone in the field of possibilities] subsumes E [[ effectively the observed event from that field], T is detachable from E, and T measures at least 500 bits of information . . . ”
Now, I actually do think there is a minor flaw here: the need to clarify the limits. That is why I use 500 bits as a solar system limit; for 10^57 atoms, 10^17 s and chem rxn times ~ 10^-14 s. For the observed cosmos, I square this, and am assured that the needle in haystack challenge is even deeper. That is, the 10^80 atoms of the observed cosmos, working for 10^25 s and at PLANCK-time rates rounded down to 10^-45 s, cannot sample as much as 1 in 10^150 of the space. This is a needle in a haystack blind search challenge on skyrockets.
Next, we can look at the Dembski quantification of 2005, and see how reducing logs and making reasonable upper limits gives us, for the solar system: Chi_500 = Ip*S - 500, where once Chi goes positive on a solar system scope (our practical cosmos for chemical interactions, absent invention of a warp drive) we can be assured blind search is all but utterly certain to fail. A practical impossibility.
So, I think your bluff that all definitions of complex specified info are flawed, fails. And BTW, in the world of information systems, the normal stuff we use, buy and create is FSCI, in bits. It is high time to put that talking point out to pasture. KF
kairosfocus
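The search-resource figures quoted above (10^57 atoms working for 10^17 s at one state per 10^-14 s against 2^500 configs, and the squared cosmos-scale version against 2^1000) are order-of-magnitude arithmetic that can be checked directly. A sketch using only the numbers quoted in the comment:

```python
# Order-of-magnitude check of the search-resource figures quoted above.

# Solar system: 10^57 atoms, 10^17 s, one chemical-reaction-time
# "state inspection" every 10^-14 s.
solar_ops = 1e57 * 1e17 / 1e-14    # = 1e88 inspections
solar_space = 2 ** 500             # ~3.27e150 configs at 500 bits

# Observed cosmos: 10^80 atoms, 10^25 s, Planck-time rate ~10^-45 s.
cosmos_ops = 1e80 * 1e25 / 1e-45   # = 1e150 inspections
cosmos_space = 2 ** 1000           # ~1.07e301 configs at 1000 bits

print(solar_ops / solar_space)     # ~3e-63: fraction of the space sampled
print(cosmos_ops / cosmos_space)   # ~9e-152, i.e. under 1 in 10^150
```

The arithmetic checks out as stated; whether these exhaustive-sampling fractions are the right model for biological search is the substance of the dispute in the surrounding comments, not something the sketch settles.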
April 21, 2013 at 03:02 PM PDT
timothya: Two questions: 1. Are you suggesting that we have to know the exact, precise, unequivocal probability of event X occurring by purely natural processes before we can draw an inference that event X did not occur by purely natural processes? 2. On what basis do forensics experts and archaeologists draw an inference to design? Must they first lay out a precise formulation of all possible probabilities of the item in question having been produced by purely natural processes?
Eric Anderson
April 21, 2013 at 12:31 PM PDT
timothya:
The sleight of hand involves continuing to offer various reformulations of CSI, all of which have a fatal flaw built in.
Your entire position is a fatal flaw, as it doesn't even deserve probability considerations: you can't demonstrate a feasibility. Does it really bother you guys so much that, unlike you, ID actually has a methodology for making determinations?
Joe
April 21, 2013 at 09:32 AM PDT
So Flint lies like a rug and the TSZ ilk attack kairosfocus for calling him out. Lizzie's lozers are just total wastes of skin...
Joe
April 21, 2013 at 07:33 AM PDT
But it is nice to see keiths bring up his oft-refuted diatribe and say that we have no response to it. Earth to keiths: you were corrected in the thread you linked to, loser.
Joe
April 21, 2013 at 06:52 AM PDT
Well, the TSZ ilk are still all in a dither wrt CSI. Unfortunately for them, they STILL don't have any evidence to support the claims of their position. They are satisfied attacking a caricature of CSI and ID. That is because they are either too dishonest, ignorant or stupid to try to support their position. My apologies for dragging their tripe back to UD.
Joe
April 21, 2013 at 06:46 AM PDT