Uncommon Descent Serving The Intelligent Design Community

The Original WEASEL(s)


Last month, on August 26th, Denyse O'Leary posted a contest here at UD asking for the original WEASEL program(s) that Richard Dawkins was using back in the late 1980s to show how Darwinian evolution works. Although Denyse's post has generated 377 comments (thus far), none of the entries could reasonably be thought to be Dawkins's originals.

It seems that Dawkins used two programs, one in his book THE BLIND WATCHMAKER, and one for a video that he did for the BBC (here’s the video-run of the program; fast forward to 6:15). After much beating the bushes, we finally heard from someone named “Oxfordensis,” who provided the two PASCAL programs below, which we refer to as WEASEL1 (corresponding to Dawkins’s book) and WEASEL2 (corresponding to Dawkins’s BBC video). These are by far the best candidates we have received to date.

Unless Richard Dawkins and his associates can show conclusively that these are not the originals (either by providing originals in their possession that differ, or by demonstrating that these programs in some way fail to perform as required), we shall regard the contest as closed, offer Oxfordensis his/her prize, and henceforward treat the programs below as the originals.

WEASEL1 and WEASEL2 follow:

WEASEL1:

Program Weasel;

Type
  Text=String[28];

(* Define Parameters *)
Const
  Alphabet:Text='ABCDEFGHIJKLMNOPQRSTUVWXYZ ';
  Target:Text='METHINKS IT IS LIKE A WEASEL';
  Copies:Integer=100;

Function RandChar:Char;
(* Pick a character at random from the alphabet string *)
Begin
  RandChar:=Alphabet[Random(27)+1];
End;

Function SameLetters(New:Text; Current:Text):Integer;
(* Count the number of letters that are the same *)
Var
  I:Integer;
  L:Integer;
Begin
  L:=0;
  I:=0;
  While I<=Length(New) do
  Begin
    If New[I]=Current[I] Then L:=L+1;
    I:=I+1;
  End;
  SameLetters:=L;
End;

Var
  Parent:Text;
  Child:Text;
  Best_Child:Text;
  I:Integer;
  Best:Integer;
  Generation:Integer;

Begin
  Randomize; (* Initialize the Random Number Generator *)

  (* Create a Random Text String *)
  Parent:='';
  For I:=1 to Length(Target) do
  Begin
    Parent:=Concat(Parent, RandChar)
  End;
  Writeln(Parent);

  (* Do the Generations *)
  Generation:=1;
  While SameLetters(Target, Parent) <> Length(Target)+1 do
  Begin
    (* Make Copies *)
    Best:=0;
    For I:=1 to Copies do
    Begin
      (* Each Copy Gets a Mutation *)
      Child:=Parent;
      Child[Random(Length(Child))+1]:=RandChar;
      (* Is This the Best We've Found So Far? *)
      If SameLetters(Child, Target) > Best Then
      Begin
        Best_Child:=Child;
        Best:=SameLetters(Child, Target);
      End;
    End;
    Parent:=Best_Child;
    (* Inform the User of any Progress *)
    Writeln(Generation, ' ', Parent);
    Generation:=Generation+1;
  End;
End.
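For readers without a Turbo Pascal environment handy, the logic of WEASEL1 can be sketched in Python. This is a transcription for illustration, not the original program; the Pascal index-0 counting quirk is replaced by a direct string comparison, and the function names are mine:

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
TARGET = "METHINKS IT IS LIKE A WEASEL"
COPIES = 100

def same_letters(a, b):
    # Count the positions at which the two strings agree.
    return sum(x == y for x, y in zip(a, b))

def weasel1(seed=None):
    rng = random.Random(seed)
    # Start from a random string of the target's length.
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    generation = 0
    while parent != TARGET:
        generation += 1
        # Each of the 100 copies receives exactly one random mutation;
        # the best-scoring copy becomes the next generation's parent.
        best_child, best = parent, -1
        for _ in range(COPIES):
            pos = rng.randrange(len(parent))
            child = parent[:pos] + rng.choice(ALPHABET) + parent[pos + 1:]
            score = same_letters(child, TARGET)
            if score > best:
                best_child, best = child, score
        parent = best_child
    return generation

print(weasel1(seed=42), "generations to reach the target")
```

Because each mutation redraws one position from the full 27-character alphabet, roughly 1 child in 27 is an unchanged copy of its parent, which is the mechanism behind the "implicit latching" debated in the comments.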

WEASEL2:

PROGRAM WEASEL;
USES
  CRT;

(* RETURN A RANDOM LETTER *)
FUNCTION RANDOMLETTER : CHAR;
VAR
  NUMBER : INTEGER;
BEGIN
  NUMBER := RANDOM(27);
  IF NUMBER = 0 THEN
    RANDOMLETTER := ' '
  ELSE
    RANDOMLETTER := CHR( ORD('A') + NUMBER - 1 );
END;

(* MEASURE HOW SIMILAR TWO STRINGS ARE *)
FUNCTION SIMILARITY(A : STRING; B : STRING) : INTEGER;
VAR
  IDX : INTEGER;
  SIMCOUNT : INTEGER;
BEGIN
  SIMCOUNT := 0;
  FOR IDX := 0 TO LENGTH(A) DO
  BEGIN
    IF A[IDX] = B[IDX] THEN
      SIMCOUNT := SIMCOUNT + 1;
  END;
  SIMILARITY := SIMCOUNT;
END;

FUNCTION RANDOMSTRING(LEN : INTEGER) : STRING;
VAR
  I : INTEGER;
  RT : STRING;
BEGIN
  RT := '';
  FOR I := 1 TO LEN DO
  BEGIN
    RT := RT + RANDOMLETTER;
  END;
  RANDOMSTRING := RT;
END;

VAR
  X : INTEGER;
  TARGET : STRING;
  CURRENT : STRING;
  OFFSPRING : STRING;
  TRIES : LONGINT;
  FOUND_AT : INTEGER;
BEGIN
  RANDOMIZE;
  CLRSCR;

  WRITELN('Type target phrase in capital letters');
  READLN(TARGET);

  (* PUT SOME STRING ON THE SCREEN *)
  TEXTCOLOR(GREEN);
  GOTOXY(1, 6);
  WRITELN('Target');
  GOTOXY(10, 6);
  WRITELN(TARGET);

  TEXTCOLOR(BLUE);
  GOTOXY(1, 13);
  WRITELN('Darwin');

  TEXTCOLOR(BLUE);
  GOTOXY(1, 19);
  WRITELN('Random');

  TEXTCOLOR(WHITE);
  GOTOXY(1, 25);
  WRITE('Try number');

  (* PICK A RANDOM STRING TO START DARWIN SEARCH *)
  CURRENT := RANDOMSTRING(LENGTH(TARGET));

  (* RUN THROUGH MANY TRIES *)
  FOUND_AT := 0;
  FOR TRIES := 1 TO 100000 DO
  BEGIN
    (* Darwin *)
    OFFSPRING := CURRENT;
    OFFSPRING[ 1 + RANDOM(LENGTH(OFFSPRING)) ] := RANDOMLETTER;

    GOTOXY(10, 13);
    WRITELN(OFFSPRING, ' ');

    IF( SIMILARITY(OFFSPRING, TARGET) >= SIMILARITY(CURRENT, TARGET) ) THEN
      CURRENT := OFFSPRING;

    IF( (SIMILARITY(CURRENT, TARGET) = LENGTH(TARGET)) AND (FOUND_AT = 0) ) THEN
    BEGIN
      (* TELL THE USER WHAT WE FOUND *)
      FOUND_AT := TRIES;
      GOTOXY(1, 15);
      TEXTCOLOR(BLUE);
      WRITELN('Darwin');
      TEXTCOLOR(WHITE);
      GOTOXY(9, 15);
      WRITELN('reached target after');
      GOTOXY(37, 15);
      TEXTCOLOR(BLUE);
      WRITELN(FOUND_AT);
      WRITE('tries');
      TEXTCOLOR(WHITE);

      GOTOXY(1, 21);
      TEXTCOLOR(BLUE);
      WRITE('Random');
      TEXTCOLOR(WHITE);
      WRITELN(' would need more than ');
      TEXTCOLOR(BLUE);
      WRITELN('1000000000000000000000000000000000000000');
      TEXTCOLOR(WHITE);
      WRITE('tries');
    END;

    (* Random *)
    GOTOXY(10, 19);
    WRITELN(RANDOMSTRING(LENGTH(TARGET)), ' ');

    GOTOXY(27, 25);
    WRITE(TRIES, ' ');
  END;

  GOTOXY(1, 20);
END.
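The structural difference from WEASEL1 is worth spelling out: WEASEL2 has no generational population at all. It produces one mutant per try and keeps it whenever it scores at least as well as the current string. Stripped of the CRT screen handling, the core loop looks like this in Python (a sketch, not the original):

```python
import random

def random_letter(rng):
    # Mirrors RANDOMLETTER: RANDOM(27) = 0 gives a space, 1..26 give A..Z.
    n = rng.randrange(27)
    return " " if n == 0 else chr(ord("A") + n - 1)

def similarity(a, b):
    return sum(x == y for x, y in zip(a, b))

def weasel2(target, max_tries=100000, seed=None):
    rng = random.Random(seed)
    current = "".join(random_letter(rng) for _ in target)
    for tries in range(1, max_tries + 1):
        # One mutant per try, at a single random position.
        pos = rng.randrange(len(current))
        offspring = current[:pos] + random_letter(rng) + current[pos + 1:]
        # Accept ties as well as improvements (the >= in the Pascal source).
        if similarity(offspring, target) >= similarity(current, target):
            current = offspring
        if current == target:
            return tries
    return None  # not found within the try budget

print(weasel2("METHINKS IT IS LIKE A WEASEL", seed=7), "tries")
```

Note that a mutation turning a correct letter wrong always scores lower and is rejected, so this loop also never loses a correct letter; it just climbs one offspring at a time instead of taking the best of 100.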

Comments
Dieb: Is the only possible case where a "divide and conquer," "ratchet[ed]" search that shows "cumulative" progress to target -- thus, "partition[ing]" -- one in which the rate of mutation per incorrect letter to date is 100%? The answer is obvious: no.

Partitioning -- AmHD: "The act or process of dividing something into parts . . . To divide into parts, pieces, or sections" -- in effect finds a way to put the letters of generational champions -- what was to be explained, per the showcased examples of 1986! -- into two bins: the ones already on target [that will not slip back] and the ones not yet on target [which will vary until they can be put into the on-target bin]. Catch and keep, not catch and release. Such can be accomplished by explicit mechanisms, and by implicit mechanisms. All of which has long since been shown.

Also, the W1 above, which is on balance of evidence likely to be one of the original versions of Weasel in question, manifests the implicit latching pattern. So, I think it is fair comment to say that you have missed the forest for the trees.

Yes, the example and calculation M & D provide on p. 1055 of the IEEE paper are relevant to a case where non-correct letters undergo 10% mutation rates. At the same time, their lab, EIL, hosts several versions of Weasel that show explicit and implicit latching of letters in generational champions and consequent ratcheting, cumulative progress to target; which last is what CRD enthused over in his remarks on the performance of the original Weasel algorithm.

And since CRD did not provide the actual algorithm or program, we must consider how this can be done. It turns out that many ways fit with the description and examples c. 1986 in BW. Of these, the implicitly latched versions are on balance of evidence the correct general class. And, on the balance of evidence discussed in this thread, W1 in the original post is a credible candidate to be the actual original Weasel responsible for the showcased runs.
W1, as discussed, shows implicit latching of generational champions. In short, regardless of pros and cons on the M & D discussion on p. 1055 of the IEEE paper, what needed to be explained in the first place, as a secondary matter, has been adequately explained.

And, on the primary matter concerning Weasel, it would be far more relevant to the central issues at stake if Darwinists were to attend, with 1/100 the effort expended on trying to dismiss the concept of implicit latching and ratcheting, to the issue that Weasel, from the outset, has been fundamentally dis-analogous to the claimed process of chance variation and natural selection and what that can achieve. For Weasel plainly and admittedly rewards non-functional nonsense phrases on mere increment in proximity to a target, betraying its fundamental question-begging: what was to be explained is not [a] that functionality based on complex information can vary through random events affecting the information, and that in a competitive population, that may lead to shifts in relative abundance of varieties, but instead [b] the origin of complex, information-based function, for both first life and body-plan level bio-diversity.

When such a central issue should be on the table, but instead every tangential side-issue imaginable is being discussed and debated, that tells me a lot about the true state of the case on the merits. And not to the benefit of Darwinism, nor to the credit of Darwinists so committed to distractive side issues.

GEM of TKI

kairosfocus
September 25, 2009, 01:15 AM PDT
Dembski and Marks say:
Partitioned search is a “divide and conquer” procedure best introduced by example.
Apparently the introduction via example isn't good enough, as we seem to disagree on what such a search entails. "Divide and conquer" implies that at each step we can exclude a set of strings from being possible solutions: in the example of Dembski and Marks, all strings which don't fit .......................E.S.. after the first generation, and all strings which don't fit ..T..........S....E._..E.S.L after the second generation. That is, we can decide for a string, before the evaluation of the fitness function, whether it is a possible solution or not. Therefore, we are reducing our search space in each step, as only possible solutions will be presented to the oracle.

Does Dawkins's algorithm work in this way? Does weasel1 work in this way? Whether they are quasi-latching or whatever, they are not dividing and conquering. Re-examine Dawkins's example in his book: after 20 generations we get the phrase

20 MELDINLS IT ISWPRKE Z WECSEL

So, will only strings of the form ME..IN.S_IT_IS........WE.SEL be proposed to the oracle? If you have a look at weasel1, the answer for this program is emphatically no: especially if the generations are big, many strings not fitting this pattern will be examined; the search space isn't partitioned in each step. If you understand "divide and conquer" in this context in another way, please elaborate. Until then, I can only repeat: Dawkins's algorithm isn't a "divide and conquer" algorithm as introduced by Dembski's and Marks's example.

DiEb
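To make the contrast concrete, here is a minimal sketch (my construction, not code from the IEEE paper) of a partitioned search in DiEb's sense: positions that have matched once are frozen, and only the remaining positions are redrawn, so every later candidate already fits the accumulated mask:

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def partitioned_search(target, seed=None):
    rng = random.Random(seed)
    current = [rng.choice(ALPHABET) for _ in target]
    locked = [c == t for c, t in zip(current, target)]
    queries = 1  # count the initial string as the first oracle query
    while not all(locked):
        # Redraw only positions that have never matched; latched letters
        # are never touched, so the candidate space shrinks at each step.
        for i, ok in enumerate(locked):
            if not ok:
                current[i] = rng.choice(ALPHABET)
        queries += 1
        locked = [ok or c == t for ok, c, t in zip(locked, current, target)]
    return queries

print(partitioned_search("METHINKS IT IS LIKE A WEASEL", seed=0), "queries")
```

WEASEL1 has no such mask: each of its 100 copies may mutate any position, correct or not, so strings violating the mask are still generated and evaluated. Its latching, when it appears, is a statistical effect of selection, not a structural guarantee.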
September 24, 2009, 08:00 AM PDT
PS: Joseph, I forgot: latching, ratcheting, cumulative progress to target [which is what CRD enthused over in 1986] and partitioning of the targeted search are all causally connected. Thanks for the reminder.

kairosfocus
September 24, 2009, 05:29 AM PDT
Dieb: A modification of the approach will show one way that implicit latching can be achieved. Recall, the essential feature of implicit latching and ratcheting is that if we have a per-letter mutation rate and population that are matched to an appropriate selection filter, we will see that implicit latching becomes observable in at least some runs if:

1 --> To high probability, no-change cases appear in the pop of children of a given seed. (Not very hard to achieve, as 1 of 27 times a mutation comes back to being the original letter. And, in other cases, with a low enough probability of mutation per letter in a child phrase, some members of a generation will have no letters so chosen and will be equal to the original.)

2 --> This creates the ratcheting action's required anti-reverse. And, anti-reverse playing out in a pop run is what is at the heart of latching, ratcheting and partitioning: once a letter goes correct, the search for it is effectively over, i.e. the search is effectively letterwise. That is, it is locked in or latched by the anti-reverse effect. In other words, latching, ratcheting and partitioning are causally interconnected. [And, onlookers, this has been pointed out over and over and over again; there is a persistent strawmanising of what partitioning etc. means.]

3 --> Once single-step advances then dominate the behaviour on the filter, we will have a pattern of implicit latching in at least some runs. (Quasi-latching with occasional slips will pop up otherwise.)

4 --> This has of course been demonstrated here at UD ever since April 9th.

5 --> As also discussed above in this thread at 39 ff, the W1 algorithm -- W2 is decidedly different, not showing generational clustering -- has a backstop and actually enforces single-letter advances only. With a pop of 100 per gen, the odds of showing implicit latching of generation champions are, according to your calculation, something like 199 out of 200.
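The claim in points 1-5 can be probed numerically. In a WEASEL1-style generation of 100 single-mutation copies, a copy keeps the parent's score whenever it mutates an already-wrong position, or redraws a correct position to the same letter; a slip in the champion requires all 100 copies to do worse. A small Monte Carlo sketch (mine, following the thread's description, not code from either side):

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
TARGET = "METHINKS IT IS LIKE A WEASEL"

def score(s):
    return sum(a == b for a, b in zip(s, TARGET))

def run_slips(copies, rng):
    # One full WEASEL1-style run; count generations whose champion
    # scores below its parent (a "slip" breaking implicit latching).
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    slips = 0
    while parent != TARGET:
        children = [
            parent[:p] + rng.choice(ALPHABET) + parent[p + 1:]
            for p in (rng.randrange(len(parent)) for _ in range(copies))
        ]
        best = max(children, key=score)
        if score(best) < score(parent):
            slips += 1
        parent = best
    return slips

rng = random.Random(5)
runs = 200
slipped = sum(run_slips(100, rng) > 0 for _ in range(runs))
print(f"{slipped} of {runs} runs showed any slip")  # slips are rare at 100 copies/gen
```

With 100 copies per generation the probability that every copy regresses is tiny, so most runs show a perfectly ratcheted champion line, consistent with the "199 out of 200" figure quoted above.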
6 --> Now, on p. 1055 of the IEEE paper, M & D presented a mathematical analysis for a case where 100% of non-correct letters mutate. (We now know part of why, as this gives them the opportunity to present an interwoven code.)

7 --> By the alchemy of strawmannising rhetoric, this has been transmuted into "the" M & D algorithm, to be triumphalistically contrasted with what Dawkins did. Problems: (i) EIL actually presents -- ever since April, long before the IEEE paper was published -- a cluster of algorithms covering the range of reasonable interpretations of the Weasel description [and the comparative all-at-once search], and (ii) Dawkins's description and showcased runs c. 1986 are compatible with two whole families of algorithms: explicitly and implicitly latched targeted searches.

8 --> It is in this context that I have objected to trying to turn partitioning into a synonym for the sort of algorithm that may be interpreted from the didactic example and associated calculation on p. 1055 of the IEEE paper.

9 --> Similarly, Weasel, as a targeted search that rewards decidedly non-functional "nonsense phrases" on mere proximity to target, is fundamentally dis-analogous to the proposed Darwinian mechanism of chance variation and natural selection across competing reproducing populations.

10 --> For, surely, being a viable, reproducing life form matched to a particular environment is a necessary condition for CV + NS to even be a factor. And, that requires origin of complex function.

11 --> Which brings us back to the core challenge to the Darwinian synthesis ever since the Wistar consultation of 1966: origin of complex function. Until you have a means to credibly and with sufficient probability create a 747 in a junkyard by a tornado, you have no basis for originating a von Neumann replicator with metabolic action [first life], and you have no mechanism to onward create complex body-plan level novelty.
12 --> And after the various rhetorical dodges, objections and turnabout tactics are discounted, the bottom line remains: evolutionary materialism, the reigning orthodoxy, has no viable mechanism for the origin of required complex function in the context of metabolic von Neumann self-replicators.
[Recall, we need metabolic machines to create parts, we need blueprints, we need coding schemes, we need code readers, we need organised clusters of effector machinery. And at just 1,000 bits as a threshold for FSCI, the atomic resources of the observed cosmos across its credible lifespan are simply not adequate to give us a credible scan across the number of possible configs. Where 1,000 bits -- less than 150 bytes -- is hopelessly inadequate storage to set up a VNR.]
-------------

GEM of TKI

kairosfocus
September 24, 2009, 05:25 AM PDT
Except that, as explained and illustrated by Dawkins, cumulative selection is a partitioned search. Nothing you say will change that.

Joseph
September 24, 2009, 04:55 AM PDT
On the main issue, the acknowledged targeted search on increments in non-functional proximity to target suffices to show that Weasels are fundamentally dis-analogous to the claimed context of chance variation and natural selection. And, as Dawkins himself admitted, this is a bit of a "cheat" and fundamentally "misleading." Weasel should never have been used, and in fact inadvertently demonstrates the power of intelligent design to use targeting to overcome the search-space challenge of getting to complex function. So, the issue in the main seems more or less settled.
1. Dawkins's weasel highlights some aspects of chance variation and natural selection, but of course not all of them: this would be too much to ask of any short algorithm.

2. Weasel demonstrates cumulative selection, and it is used to this effect. Like any man-made algorithm it is designed, yet it uses evolutionary techniques.

3. For me the issue in the main was: do the algorithms exemplified in the paper of Dembski and Marks represent the algorithm described by Dawkins in "The Blind Watchmaker", and therefore, is the math of the paper applicable to it? This issue surely is settled: they do not, and it is not.

DiEb
September 24, 2009, 01:30 AM PDT
--kf,
Eqn 22, p. 1055 of the IEEE paper, was discussed previously. As it stands, it relates to one scenario; which exists in the context of a fairly large cluster of various Weasel algorithms.
Eq. 22 may be applicable to a fairly large cluster of various Weasel algorithms, but it doesn't seem to apply to the larger cluster of what you call implicitly latching weasels. Or can you formulate an equivalent equation for weasel1? That the equation isn't applicable to weasel1 shows again that the algorithm described by Dawkins isn't the Partitioned Search as exemplified by Marks and Dembski.

DiEb
September 24, 2009, 12:53 AM PDT
Onlookers: A few notes.

1] Weasels: Dieb has raised interesting points on M & D's further approaches to the general Weasel question. Eqn 22, p. 1055 of the IEEE paper, was discussed previously. As it stands, it relates to one scenario; which exists in the context of a fairly large cluster of various Weasel algorithms. Since the reality of implicit latching has been demonstrated, and since the reasonably likely original weasels fall under this ambit, the secondary issue on latching-ratcheting in the showcased runs of 1986 has been cogently answered. On the main issue, the acknowledged targeted search on increments in non-functional proximity to target suffices to show that Weasels are fundamentally dis-analogous to the claimed context of chance variation and natural selection. And, as Dawkins himself admitted, this is a bit of a "cheat" and fundamentally "misleading." Weasel should never have been used, and in fact inadvertently demonstrates the power of intelligent design to use targeting to overcome the search-space challenge of getting to complex function. So, the issue in the main seems more or less settled.

2] Moseph, 70: "What KF wants is a lab experiment that's set up to run all on its own that will generate life spontaneously. Putting aside the fact for now that even if that did happen he'd claim 'investigator interference', it should be obvious to all that the only experiment that could possibly perform as he requires is one that has already happened and it was a one-off. The Earth, billions of years ago, was the experiment."

Much else is rather crudely distractive, distorting or ad hominemistic, so we will ignore it for the moment. The above excerpt, however, captures the essential problem: an experiment set up to replicate credible early earth or similar conditions is, by M's acknowledgement, not likely to go anywhere.
[Apart from the specific designed and intelligent intervention of investigators, which would inadvertently show the capacity of intelligent design. Rather like Weasel.]

But, functionally specific, complex, algorithm-implementing information is routinely seen to produce functional entities that are beyond the credible reach of chance circumstances and blind mechanical forces on the gamut of our observed cosmos. In particular, a metabolism-implementing, self-replicating life form will have had to implement a von Neumann replicator. This is an extension of known technologies or approaches: coded blueprint storage, code, code reader, organised effectors, support units to ingest and transform environmental inputs to feed the process with required input materials.

So, M has inadvertently supported the thesis that such FSCI -- and a descriptive term and acronym take legitimacy from that, not from whoever uses it [cf. here from the WACs on its roots in 1970's-1980's OOL researcher discussions by e.g. Orgel, Yockey and Wickens] -- is only empirically credible on intelligence. But, that cuts across the a priori intent to assume or assert that life originated spontaneously through chance + necessity, so M proceeds to assert that anyway.

3] . . . it was spontaneous? "Onlookers, 'spontaneous' here indicates 'it just happened' when in fact it's rather unlikely it did. As noted, networks of auto-catalyzing chemicals no doubt had a part to play. There was little spontaneous about it, except that KF would like to make you think it happened by magic. In fact, KF has the version of events where things happened in an instant, from nothing. His intelligent designer swooped in and made the first cell whole."

Ad hominem-laced strawman, ignited to cloud and confuse the issue, while polarising the atmosphere. "Spontaneous" in the relevant -- and fairly obvious -- sense means: arising from a natural inclination or impulse and not from external incitement or constraint [AmHD].
That is, I have described in a nutshell the abiogenetic models that trace to chance + necessity creating life by themselves, without intelligent constraint or direction. Autocatalysis, without accounting for the origin of genetic coding, metabolism and the like to get to a VN replicator, is assertion rather than evidence. Similarly, I have made no reference to magic, just to the fact that there are two serious alternatives on the table for OOL:
(i) spontaneous formation under whatever favoured pre-life models are applied; (ii) intelligent -- not magical or supernatural as such [OOL on earth on intelligent design would only imply the existence of someone at the relevant time with the technology to do a VN replicator on carbon-polymer molecular technology] -- cause acting to form FSCI and associated designed organisation.
As to required time vs. in an "instant," the threshold for FSCI and the unlikelihood of its spontaneous origination is something like using up the 10^80 atoms of our observed cosmos from a reasonable big bang event and running forward for 10^25 s, the thermodynamically credible lifetime, only to be sampling a maximum of 1 in 10^150 of the available states of 1,000 bits. 10^25 s is over 20 million times the run to date on the usual cosmological timeline of 13.7 BY, and 1,000 bits, or about 125 bytes, is by far and away inadequate storage for the blueprint, algorithms and data structures required. Intelligent designers are known to routinely produce FSCI well beyond that level, and within rather brief timespans compared to 10^25 s.

So, the real issue is being -- predictably -- dodged, distorted, derided and dismissed: inference to best causal explanation across chance, necessity and intelligence, on empirical evidence.

4] "And therefore it must be the case the intelligent designer made it happen. So, tell us about that KF? Rather than launch into another monologue about shores of function etc etc just write down what you know for a fact about how life was created by the intelligent designer."

M here tries to convert inference to best explanation on empirical evidence into an a priori assertion. Strawman distortion, again. And, instead of addressing the search-space challenge squarely and fairly, he wishes to dismiss it. That should tell the astute onlooker the balance of the matter on the merits pretty well.

5] KF: "setting up a string data structure -- this is the de facto fundamental data structure -- that has two-layer significance, one reading forwards [the five-letter increment in Weasel] plus a backwards-reading expression in English is -- quite literally -- interwoven multi-layer coding." M: "Have you proven that such is impossible to evolve? No. Are you a biologist who's proven that such is impossible to evolve? No. So what is your point?"
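The orders of magnitude in this passage are easy to check; the sketch below just verifies the arithmetic with standard approximate constants, and takes no position on the inference drawn from the numbers:

```python
SECONDS_PER_YEAR = 3.156e7

# Ratio of the quoted 10^25 s lifetime to the 13.7 BY cosmological timeline.
age_s = 13.7e9 * SECONDS_PER_YEAR   # about 4.3e17 seconds
ratio = 1e25 / age_s                # about 2.3e7, i.e. tens of millions

# 1,000 bits of storage and the size of its configuration space.
bytes_1000_bits = 1000 / 8          # 125 bytes
configs = 2 ** 1000                 # about 1.07e301 configurations

print(f"ratio ~ {ratio:.1e}, {bytes_1000_bits:.0f} bytes, 2^1000 ~ {float(configs):.2e}")
```

Note that 2^1000 is roughly 10^301, so a sample of 10^150 states (10^80 atoms over 10^25 s at fast state-change rates) covers at most about 1 part in 10^151 of that space, which is the kind of ratio the comment appeals to.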
Of course, I referred to the example on p. 1055 of the M & D IEEE paper, where on reversing the strings (on a hint from Rob to Dieb) we may see, cf. 39 above:
20: mas-evolutionarey-informatics ORIG: SCITAMROFN ? IYRANOITULOVE ? SAM. [has two initially correct Weasel sentence members] 21: Listen-are-these-designed-too ORIG: OOT ? DENGISEDESEHT ? ERA?NETSIL. [adds 5 newly correct Weasel sentence members]
Folks, we have here the first winner of our annual "Welcome to Wales" lucky noise onion award! Random chance, of course, can always in logical possibility account for any apparently meaningful sequence of glyphs. How many reasonable people, confronted by such evidence -- in the context of the known provenance of the EIL (it is not even a biological context, and the context shows exactly what was described: multilayer codes with interweaving . . . which of course just happens to also be present in DNA) -- will accept that the strings in question happened by chance rather than intent? On what grounds?

In short: the problem here is that M plainly has not seriously reckoned with inference to best empirically based explanation in the context of islands of complex function in large configuration spaces. And that, sadly, tells us all we really need to know.

____________

GEM of TKI

kairosfocus
September 24, 2009, 12:10 AM PDT
Onlookers: Observe the beginnings of a quiet little drift away from the focal issue for the thread. Guess why.

GEM of TKI

kairosfocus
September 23, 2009, 06:46 AM PDT
BTW, I added some thoughts on the algorithm Random Mutation (p. 1056) here.

DiEb
September 23, 2009, 06:10 AM PDT
Moseph:
Yet none of what you wrote pertains to anything that we know about the first replicator.
We don't know anything about the first replicator. And living organisms are much more than replicators. IOW, Moseph, you don't have anything. What we do know demonstrates it takes agency involvement just to get two nucleotides. It also takes agency involvement to create a nucleotide sequence to catalyze ONE bond. IOW, the blind watchmaker scenario for the OoL is in very bad shape.

Joseph
September 23, 2009, 06:08 AM PDT
Kairosfocus
Pardon, but your side-issue is becoming ever more evidently just that: distractive.
You are under no obligation to respond.
And, if you insist, kindly tell us about what we KNOW per empirical observations on simplest or first self-replicators
According to you, you know all about them. According to you it's known how complex they are. According to you even the number of bits that make it up is known, more or less. According to you the fitness landscape at the time is known. So what more can I add to what you've already stated as fact?
or current spontaneously -- no undue intelligent direction, please, including e.g. selecting homochiral solutions, esp without potentially cross-interfering reactants, and without trapping-out -- formed cases under credible early earth/early solar system warm pond or deep sea vent or cometary head conditions etc
To untangle this for onlookers, what KF means is "Show me an experiment set up to generate a self-replicator where the experiment has not been set up at all". All "no undue intelligent direction" means is that any such experiment conducted can always be refuted by the simple tactic of "but it was set up by intelligent agents and therefore is invalid". It seems that KF is unaware that nature itself can provide environments with homochiral solutions and without potentially cross-interfering reactants. If those conditions are replicated in the lab then, as far as KF is concerned, you've proven his point for him. Onlookers, is this who you want to take your lead from on this? Someone who has already stated from the outset that any experimental methodology conducted outside of his parameters is invalid, when KF has not seen the inside of a lab since he was at university.
showing metabolic activity and genetic self-replication.
Do you think that if I had achieved self-replication in an organic lifeform from scratch that self-assembled, I'd be wasting my time talking to the likes of you? No, my Nobel would await. Again, onlookers, it's another rhetorical trick from KF: ask for something that you know does not exist and claim the non-delivery of such as a victory.
Autocatalysing molecules etc that do not spontaneously form and express code for metabolic machines and operations will not do.
If you were handing out grants then perhaps your restrictions would make some sense. As it stands, you can reject the results of such experiments if you like, but as already mentioned, if you want to ignore entire hierarchies of complexity that may have led to the first replication then be my guest. Onlookers, the reason KF wishes to ignore such things as autocatalysing molecules and networks is because he can then talk about the first cell as if it appeared all at once, and the probability of that is of course very low. What is more probable is that a network sits atop a network which sits atop a network, leading to the first replicator. KF wants to consider the first replicator on its own and claim how unlikely it is to exist without intelligent guidance. When you don't consider anything but the first cell then of course it's unlikely. Another rhetorical device.
Set up all the autocatalysing RNA strings you want. You will then have to get to such strings that CODE — including both algorithms and data structures in the context of spontaneous origin of such languages — for metabolisms
But I thought you knew all about the first replicator? Perhaps then you can tell us how your model gets to CODE, algorithms and data structures? How the language of metabolism arose under your model?
[including the known complex intermediaries known as enzymes] and associate themselves with readers and effectors, then account for a shift to DNA world, all without undue investigator interference.
What KF wants is a lab experiment that's set up to run all on its own that will generate life spontaneously. Putting aside the fact for now that even if that did happen he'd claim "investigator interference", it should be obvious to all that the only experiment that could possibly perform as he requires is one that has already happened, and it was a one-off. The Earth, billions of years ago, was the experiment. It's not reproducible without a) a pre-biotic earth, b) a lab the size of the pre-biotic earth. So, onlookers, although KF's claims seem reasonable on the face of it, in fact he is asking for the impossible. And on past form, he'd reject it even if it came to pass.
Of course, all of these patterns exhibit targetting and associated purposeful construction of complex multi-part entities that are irreducible on function as the entities.)
Of course they do.
As for the evolutionary materialistic magic of claimed spontaneous co-optation of parts that happen to be lying around,
The only person claiming the use of magic is you.
and resulting spontaneous emergence of complex functional structures through cumulative progress [each step being functional in itself! . . . talk about a Lewontinian just-so story!],
Are you a biologist by trade then?
have you ever tried to fit a claimed substitute/souped-up electronically active part to a car? (Mechanical, electrical and electronic/informational compatibility all have to be present. This is not at all a given.)
Cars do not sexually reproduce. You can make a part for a car that fits no other car in the world. Somewhat of a different story with biological entities, would you not agree?
Have you ever seen a house built up from parts in a hardware store hit by a hurricane?
No real biologist thinks that cells were made in such a way. Again, another rhetorical device to fool the unsophisticated onlooker into thinking that there is some science and probability behind KF's words. As cells do not arise in such a way, and nobody has ever claimed that they do (even for the first replicator), what relevance does your comment have, KF?
We do know that FSCI -bearing entities are routinely set up by intelligence, and for sixty years we have known how a self-replicating machine would have to be organised.
Nobody uses FSCI except you. Machines are not organic. Machines did not evolve from simpler machines via sexual reproduction, mutation and selection.
It is just a matter of technology to actually build one by intelligence. (E.g. We have self-diagnostic cars already, I would love a self-maintaining one, or close to self-maintaining one.)
You have provided no proof, other than your incredulity, that cells were designed.
And, setting up a string data structure — this is the de facto fundamental data structure — that has two layer significance, one reading forwards [the five-letter increment in Weasel] plus a backwards reading expression in English is — quite literally — interwoven multi-layer coding.
Have you proven that such is impossible to evolve? No. Are you a biologist who has proven that such is impossible to evolve? No. So what is your point?
Onlookers, this issue shows just how pernicious is the misleading impression created by Weasel through rewarding non-functional partially correct phrases on mere proximity to target.
Richard Dawkins' own words have been reproduced to you often enough on this topic that even you should have listened by now. Dawkins never claimed the example did not have serious flaws, namely that it is not a great example as it has a fixed target. Onlookers, don't be fooled by these misrepresentations. Simply obtain a copy of "The Blind Watchmaker" and see for yourself. The fact that you continue to harp on this as if you'd discovered a secret is telling. If you had a substantial criticism you'd have made it already. Despite the fact you've been corrected on this multiple times you continue to repeat it. For shame!
It is the claimed spontaneous arrival of functionality based on complex, specific information that needs to be explained cogently, not the capability of cumulative intelligent design based on targets and warmer-colder hints on guesses.
And that's being explained. By people who know what they are doing. What you need to explain is how the intelligent designer made it happen. Why don't you leave the actual research (oh, you are) to the professionals? And again (another question for you to ignore) could you please tell me who is arguing that
claimed spontaneous arrival of functionality
it was spontaneous? Onlookers, "spontaneous" here suggests "it just happened", when in fact it's rather unlikely it did. As noted, networks of autocatalyzing chemicals no doubt had a part to play. There was little spontaneous about it, except that KF would like to make you think it happened by magic. In fact, KF has the version of events where things happened in an instant, from nothing: his intelligent designer swooped in and made the first cell whole. It's amazing how much you want to talk about the way you insist it did not happen, is it not, KF? I mean, there are many more ways it did not happen than did. You claim to know how it did happen. Why not talk about that for a change?
PPS: A von Neumann replicator is plainly irreducibly complex: no blueprint, no coded info and instructions. No code, no capability to communicate instructions or descriptions and specifications. No reader, no way to make sense of same. No effector, no ability to use the same to do the task in hand. No functionally correct spatio-temporal organisation corresponding to the instructions and data, and there will be no way for components to work together to achieve the process. No metabolism, no resources to do all of that. (Sub-components may in turn exhibit a similar irreducibility.)
And therefore it must be the case that the intelligent designer made it happen. So, tell us about that, KF? Rather than launch into another monologue about shores of function etc., just write down what you know for a fact about how life was created by the intelligent designer. Then we can compare that against the "just so stories" coming out of the labs of actual scientists researching the origin of life and see who has the most credibility. I'm waiting.

Moseph
September 23, 2009, 03:42 AM PDT
-kf, could you tell us how to apply eq. 22 to W1? Thanks!

DiEb
September 23, 2009, 12:01 AM PDT
Moseph: Pardon, but your side-issue is becoming ever more evidently just that: distractive.

And, if you insist, kindly tell us what we KNOW per empirical observations on simplest or first self-replicators [or current spontaneously formed cases -- no undue intelligent direction, please, including e.g. selecting homochiral solutions, especially without potentially cross-interfering reactants, and without trapping-out -- under credible early earth/early solar system warm pond or deep sea vent or cometary head conditions etc.] showing metabolic activity and genetic self-replication.

Autocatalysing molecules etc. that do not spontaneously form and express code for metabolic machines and operations will not do. (That's a strawmannish bait and switch. Set up all the autocatalysing RNA strings you want. You will then have to get to such strings that CODE -- including both algorithms and data structures in the context of spontaneous origin of such languages -- for metabolisms [including the known complex intermediaries known as enzymes] and associate themselves with readers and effectors, then account for a shift to DNA world, all without undue investigator interference. Of course, all of these patterns exhibit targeting and associated purposeful construction of complex multi-part entities that are irreducible on function as the entities.)

As for the evolutionary materialistic magic of claimed spontaneous co-optation of parts that happen to be lying around, and resulting spontaneous emergence of complex functional structures through cumulative progress [each step being functional in itself! . . . talk about a Lewontinian just-so story!], have you ever tried to fit a claimed substitute/souped-up electronically active part for a car? (Mechanical, electrical and electronic/informational compatibility all have to be present. This is not at all a given.) Have you ever seen a house built up from parts in a hardware store hit by a hurricane?
We do know that FSCI-bearing entities are routinely set up by intelligence, and for sixty years we have known how a self-replicating machine would have to be organised. It is just a matter of technology to actually build one by intelligence. (E.g. we have self-diagnostic cars already; I would love a self-maintaining one, or close to self-maintaining one.) And, setting up a string data structure -- this is the de facto fundamental data structure -- that has two-layer significance, one reading forwards [the five-letter increment in Weasel] plus a backwards-reading expression in English, is -- quite literally -- interwoven multi-layer coding.

GEM of TKI

PS: Onlookers, this issue shows just how pernicious is the misleading impression created by Weasel through rewarding non-functional partially correct phrases on mere proximity to target. It is the claimed spontaneous arrival of functionality based on complex, specific information that needs to be explained cogently, not the capability of cumulative intelligent design based on targets and warmer-colder hints on guesses.

PPS: A von Neumann replicator is plainly irreducibly complex: no blueprint, no coded info and instructions. No code, no capability to communicate instructions or descriptions and specifications. No reader, no way to make sense of same. No effector, no ability to use the same to do the task in hand. No functionally correct spatio-temporal organisation corresponding to the instructions and data, and there will be no way for components to work together to achieve the process. No metabolism, no resources to do all of that. (Sub-components may in turn exhibit a similar irreducibility.)

kairosfocus
September 22, 2009, 11:44 PM PDT
Atom: Okay, so it is Marks and Dembski. (That collaboration is getting seriously devious in their mathematically tinged devices.) No guzum ;)

GEM of TKI

PS: DiEb, you spotted more than I did. It is now quite clear that the example was a set-up illustration, in the hidden context of a design inference on two-layer interwoven coding, one read backwards, the other front-ways. So, it served more than one didactic purpose! (And odds become irrelevant in such a clearly decisional context. I would not like to calculate the odds of generating that particular doubly functionally specified string by chance! Odds of 1 in 70 or so are also not particularly easily observed, though it is possible: a "couple [= two] of runs" would most likely not be enough to see such a jump in the very first generation from the parent seed. As well, they have given a definition of ratcheting and illustrated it with what is plainly a didactic example, then gone on to a calculation for a simple case; they have not presented an algorithm as such.)

Similarly, the calculation they gave can be extended to the sort of generational- seed- as- child- backstopped [or is "dogged" better suited here -- this is the pawl in the ratchet], single- step- advance- child- ratcheting case that is relevant for implicit latching, by a "fairly simple" adjustment. This simple dynamical model then extends to the case where some advances are missed due to various masking effects.

kairosfocus
September 22, 2009, 11:12 PM PDT
-kf, R0b & W. Dembski:

1: I had spotted the EVOLUTION in the first phrase, but hadn't thought much of it, as you can start the algorithm with any phrase. I didn't spot the DESIGN, and frankly, I didn't expect the second string to be designed...

2: ... though the probability of gaining five or more correct letters starting from this first string is less than 1.5%, it could have been observed after a couple of runs

3: but perhaps R. Marks and W. Dembski didn't actually implement the algorithm. That could explain the lack of an accompanying picture, which the other examples have.

4: If Mr. Dembski is reading this: Why did you choose µ=0.00005 for fig. 2? Of course you wanted to be able to apply the equation of your appendix, but the expected number of generations for this parameter is ~55,500, while the best choice needs only ~10,600. And while the error of your elegant estimation is only .25% in the first case, at 2.4% it's not too bad in the second case either (at least according to my calculations).

DiEb
September 22, 2009, 04:56 PM PDT
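DiEb's point 4 concerns the per-letter mutation rate µ and the expected number of generations. The exact model behind Marks and Dembski's fig. 2 is not reproduced in this thread, so the sketch below is only an assumed reading: a generic cumulative-selection weasel with per-character mutation probability `mu`. The values `mu=0.04` and `copies=100` are illustrative choices so a run finishes quickly, not the paper's µ=0.00005 (which would need tens of thousands of generations).

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
TARGET = "METHINKS IT IS LIKE A WEASEL"

def weasel(mu, copies, rng, max_gens=100000):
    """One run of a generic cumulative-selection weasel: each
    generation, every one of `copies` children mutates each character
    independently with probability mu, and the child closest to the
    target becomes the next parent. Returns the generation count at
    which the target is reached, or None if max_gens is exceeded."""
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    for gen in range(1, max_gens + 1):
        children = [
            "".join(rng.choice(ALPHABET) if rng.random() < mu else c for c in parent)
            for _ in range(copies)
        ]
        # select the child with the most characters matching the target
        parent = max(children, key=lambda s: sum(a == b for a, b in zip(s, TARGET)))
        if parent == TARGET:
            return gen
    return None

rng = random.Random(1)
gens = weasel(mu=0.04, copies=100, rng=rng)
print(gens)
```

Varying `mu` in such a sketch is one way to see DiEb's trade-off empirically: very small µ wastes generations on unchanged children, while very large µ destroys already-correct letters.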
kf, Hey, haven't been keeping up with UD posts. Wish I had thought of the codes, but can't claim credit for it. So no guzum, please sir. :)

Atom
September 22, 2009, 03:20 PM PDT
kairosfocus, their math states that the probability of hitting the target in Q iterations is (1-(1-(1/N))^Q)^L. That math is correct only if an iteration consists of holding the correct letters fixed and randomly changing all incorrect letters. I see no way for those conditions to obtain in an "implicit partitioning" scenario.

R0b
September 22, 2009, 02:51 PM PDT
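The closed form R0b quotes can be checked numerically. Below is a minimal sketch (not Marks and Dembski's code; L, N, Q are simply the weasel example's string length, alphabet size, and iteration budget) comparing (1-(1-1/N)^Q)^L against a Monte Carlo simulation of the partitioned search it describes:

```python
import random

def partitioned_search_hit(L, N, Q, rng):
    """One run of the partitioned search R0b describes: each iteration
    redraws every incorrect position uniformly from the N-letter
    alphabet, while correct positions are held fixed forever.
    Returns True if all L positions are correct within Q iterations."""
    wrong = set(range(L))  # indices not yet matching the target
    for _ in range(Q):
        # a redrawn letter matches its target with probability 1/N
        wrong = {i for i in wrong if rng.random() >= 1.0 / N}
        if not wrong:
            return True
    return False

def hit_probability(L, N, Q):
    """Closed form quoted in the thread: (1 - (1 - 1/N)^Q)^L."""
    return (1.0 - (1.0 - 1.0 / N) ** Q) ** L

# Weasel-sized example: 28-character target over a 27-letter alphabet
L_, N_, Q_ = 28, 27, 100
rng = random.Random(0)
runs = 10000
empirical = sum(partitioned_search_hit(L_, N_, Q_, rng) for _ in range(runs)) / runs
print(round(hit_probability(L_, N_, Q_), 3), round(empirical, 3))
```

The two numbers agree, which illustrates R0b's point from the other direction: the formula is exact for this hold-and-redraw procedure, and only for it.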
But in every case we are orders of magnitude beyond the 1,000 bit threshold where the resources of the cosmos are hopelessly inadequate to provide enough configs to have a credible chance of getting to a bit of FSCI.
Yet none of what you wrote pertains to anything that we know about the first replicator. Do you really think that looking at currently existing unicellular life tells us a single thing about the first replicator? To give you an example: if you looked at the first motorcar, would you conclude that it was representative of the first example of locomotion? Would you knock out parts of that first car and conclude, when it stopped working, "ah, this is the minimum number of parts needed to travel"? Or rather, would it be the case that the car was the pinnacle of a different hierarchy of technology, and only by looking at that hierarchy could you come to a full understanding of how the first car came to be? And so it is with the first replicator. By looking only at today's "car" you miss the hidden hierarchy that allowed the car to come into being. It's then ironic that you say
Note, we here deal with empirical reality, not just-so RNA world or imaginary subsea vent sulphur worlds etc: provide observed life forms carrying out independent metabolism much below that band and I will accept them.
Yet the empirical reality is that you know nothing at all about the first replicator. And what do you mean by "imaginary subsea vent sulphur worlds"? Such worlds, cut off from the sun, are not imagination, as much as you might wish it. They are empirical reality. And what does it matter what you accept as a possibility? You are not actively researching in this space; you are not involved except as a bystander. So, be convinced or not; it matters not at all. To recap: you know nothing about the first replicator and can only provide "just so" stories about it, where you insist it must have a given level of complexity that is impossible to occur naturally, despite only knowing about currently existing organisms and not knowing (nor caring, it seems) about such possibilities as undersea vent environments, rich in energy and chemical mixing.
AND, 150 bytes is way too narrow to code blueprints, code for replicating algorithms AND for metabolism -- even with the sort of code layering trick that M & D pulled on us all.
Only you could call some words spelled out backwards a "code layering trick". And again, you talk about bytes, "replicating algorithms AND for metabolism", as if you know something about the conditions surrounding the first replicator. You do not.

Moseph
September 22, 2009, 02:36 PM PDT
Moseph: I do not have much time just now.

1 --> We look at unicellular life and look for when knockout studies lead to disintegration of life function. 300 - 500 k bases drops out; just double to get bits.

2 --> This is an observational hard point, and the parasitic forms down to about 200 k bits show that below the threshold we run into problems of not doing all the biochem work required.

3 --> Note, we here deal with empirical reality, not just-so RNA world or imaginary subsea vent sulphur worlds etc: provide observed life forms carrying out independent metabolism much below that band and I will accept them.

4 --> But in every case we are orders of magnitude beyond the 1,000 bit threshold where the resources of the cosmos are hopelessly inadequate to provide enough configs to have a credible chance of getting to a bit of FSCI.

5 --> AND, 150 bytes is way too narrow to code blueprints, code for replicating algorithms AND for metabolism -- even with the sort of code layering trick that M & D pulled on us all.

6 --> So, kindly keep the eye on the ball.
___________
ATOM: You haven't confessed yet, and I might just pull up that old bun dem food guzum! ;)

GEM of TKI

kairosfocus
September 22, 2009, 02:18 PM PDT
Rob: You give an interesting historical note. Basic problem: partitioning can happen implicitly or explicitly, i.e. it is equivalent to ratcheting and associated latching. (Partitioning looks at the issue from the facet where, once a letter goes correct in a given generation champ, it is effectively in the correct bin and does not fall back out. Latching looks at the lock on the bin, and ratcheting at the way new letters are dropped in.)

GEM of TKI

kairosfocus
September 22, 2009, 02:10 PM PDT
Correction @ 58 -- I think the paper actually goes back to 2007, but May 2008 is the earliest dated copy that I have.

R0b
September 22, 2009, 02:08 PM PDT
Kairosfocus
First life took ~ 600 – 1,000 k bits of genetic info
How on earth can you possibly know that? Why are your error bars so wide? 600 to 1,000? What can you do to reduce those error bars? What is your methodology? If you are basing this on currently existing life and using some sort of process to work backwards billions of years in time, then how exactly do you know when you've got it right? As nobody whatsoever knows anything at all about "first life", I find it amazing that you can proclaim such with complete confidence. Tell me, Kairosfocus, do you have a sample of this "first life"? You seem to know so much about it... If you are in fact basing your claims on what is currently known about simple lifeforms and the amount of information in them, then are your claims not just "just so" stories with no basis in fact? Exactly like the "just so" stories about evolution routinely decried here. If you know so much about the origin of life, why don't you write up a proposal and get the Biologic Institute to research it? Prove that it is impossible without intelligent design. You seem to have already proven the case in your own mind, so why not try to convince some other people outside the little circle here? I.e., actual scientists. Of course, if you don't believe your ideas would stand up to some serious scrutiny, then perhaps it's best you don't do that.

Moseph
September 22, 2009, 02:01 PM PDT
kairosfocus, recall that the original WeaselWare consisted only of the partitioned search. Atom coded it according to Marks and Dembski's understanding of WEASEL. It was in response to comments on this forum that Atom created WeaselWare 2.0 this past spring. Note also that the paper in question preceded WeaselWare 2.0 by about a year. The May 2008 version of M&D's paper has a section on partitioned search that is essentially identical to that of the current version, with the same reference to TBW. And note, finally, that the EIL's Weasel math page (not written by Atom) still says that Dawkins' algorithm is a partitioned search, and presents the same math as in the paper. In summary, it's clear that M&D think (or thought) that Dawkins' algorithm in TBW was, in fact, the partitioned search that they describe in their paper.

R0b
September 22, 2009, 01:53 PM PDT
kairosfocus
EIL presented a cluster of alternatives covering the bases, including explicitly and implicitly latched cases and cases that show the single-step approach.
Which one represents a non-partitioned search with no explicit letter latching? Which is the Weasel you get if you implement it as described, line by line, from TBW, adding nothing that is not described? Which one is it? Is it there?

Moseph
September 22, 2009, 01:47 PM PDT
Moseph:

1] EIL presented a cluster of alternatives covering the bases, including explicitly and implicitly latched cases and cases that show the single-step approach. Guess which ones converge in a reasonable time, and why.

2] I and others are long since on record, in, say, WAC no. 28 above, that a useful rule-of-thumb heuristic threshold is 500 - 1,000 bits as the border of functionally specific information not credibly achievable by chance-based processes on the gamut of the observed cosmos. First life took ~600 - 1,000 k bits of genetic info, if it is at all comparable to observed simple life forms not dependent on other life forms for vital nutrients. Novel body plans take, on evidence, 10 - 100+ million bits. (You would do well to read the WACs.)

3] To understand this: 1,000 bits corresponds to ~10^301 configs, or ten times the square of the number of states the atoms of the observed universe would go through across their thermodynamically credible lifespan, or ~50 million times longer than the timeline back to the usual date for the big bang. (And only a very small portion of the cosmos will form zones fitted for emergence of life.)

4] In short, I am not making a "probability" challenge but a "search tantamount to no search" challenge. The cosmos simply will not exhaust sufficient of these states to be significantly different from a search of zero scope. So, 500 - 1,000 bits is a reasonable upper threshold for CV -- to get to the variation in the first instance.

5] For 1st life that is about 150 bytes of information, barely enough to sneeze, and certainly not enough to write out a blueprint for a self-replicating von Neumann machine's parts, much less the code to execute the replication.

GEM of TKI

kairosfocus
September 22, 2009, 01:36 PM PDT
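The arithmetic behind the "~10^301 configs" figure in point 3 above is straightforward to verify: 1,000 bits of binary configuration space means 2^1000 distinct states, and 2^1000 ≈ 1.07 × 10^301. A minimal check:

```python
# 1,000 bits of binary configuration space means 2**1000 distinct states.
configs = 2 ** 1000
order = len(str(configs)) - 1  # order of magnitude in base 10
print(order)
```

This confirms only the count of configurations, not any conclusion drawn from it; whether such a count models anything biological is exactly what the two commenters dispute.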
PS: Rob, my point on M & D of EIL is that, given the EIL GUI's multiple algors, trying to pin them down to just one algor claimed to be fundamentally incompatible with the description of Weasel c. 1986 is just a bit strawmannish. You will note I started from how that claimed "run" looked fishy [save as a didactic illustration], and you have given me reason to see that it was really and truly fishy indeed.

kairosfocus
September 22, 2009, 01:19 PM PDT
Kairosfocus
anything reasonably claim-able for chance variation and natural selection of living, reproducing forms of life.
What can reasonably be claimed for chance variation and natural selection? For you, what is the limit?

Moseph
September 22, 2009, 01:17 PM PDT
Kairosfocus
I pointed out that EIL hosts several different algors,
Thank you for the tip. Could you tell me which one represents "Weasel" as described in TBW?

Moseph
September 22, 2009, 01:11 PM PDT
Rob: I owe you one. Appreciated. Weird: manipulating spaces, reading in reverse from p. 1055 of the IEEE paper:
21: Listen-are-these-designed-too
20: mas-evolutionarey-informatics
Never saw that before! (Talk about functionally specific and complex information that is contextually responsive . . . And, of course, I am not exactly a word-search fan. How much time did M & D spend on putting in that little interwoven meaning time-bomb? ATOM: Fess up!!!! [Or I will work a "guzum" to make the Luminous One burn your dinner for a week! ;) Just joking! Give her our love.])

There is enough present to see that they do math on a 100% mutation rate on non-correct letters, and they of course are speaking of a context in which correct letters latch. As we have seen and demonstrated practically, once non-change cases have high enough odds in the generation and single-change cases dominate otherwise, a significant fraction of the runs will latch implicitly and ratchet to the target, due to the proximity-reward filtering process that selects the next generation champion.

The side issues are to all intents and purposes over. And on the main one, a targeted search rewarding increments on mere non-functional proximity is fundamentally dis-analogous to anything reasonably claimable for chance variation and natural selection of living, reproducing forms of life.

GEM of TKI

kairosfocus
September 22, 2009, 01:05 PM PDT