Uncommon Descent Serving The Intelligent Design Community

An Eye Into The Materialist Assault On Life’s Origins


Synopsis Of The Second Chapter Of Signature In The Cell by Stephen Meyer

ISBN: 9780061894206; ISBN10: 0061894206; HarperOne

When the 19th century chemist Friedrich Wohler synthesized urea in the lab using simple chemistry, he set in motion the ball that would ultimately knock down the then-pervasive 'Vitalistic' view of biology. Life's chemistry, rather than being bound by immaterial 'vital forces', could indeed be artificially made. While Charles Darwin offered little insight on how life originated, several key scientists would later jump on Wohler's 'Eureka'-style discovery through public proclamations of their own 'origin of life' theories. The ensuing materialist view was espoused by the likes of Ernst Haeckel and Rudolf Virchow, who built their own theoretical suppositions on Wohler's triumph. Meyer sums up the logic of the day:

“If organic matter could be formed in the laboratory by combining two inorganic chemical compounds then perhaps organic matter could have formed the same way in nature in the distant past” (p.40)

Darwin's theory generated the much-needed fodder to 'extend evolution backward' to the origin of life. It was believed that "chemicals could 'morph' into cells, just as one species could 'morph' into another" (p.43). Appealing to the apparent simplicity of the cell, late 19th century biologists assured the scientific establishment that they had a firm grasp of the 'facts': cells were, in their eyes, nothing more than balls of protoplasmic soup. Haeckel and the British scientist Thomas Huxley were the ones who set the protoplasmic theory in full swing. While the details expounded by each man differed somewhat, the underlying tone was the same: the essence of life was simple and thereby easily attainable through a basic set of chemical reactions.

Things changed in the 1890s. With the discovery of cellular enzymes, the complexity of the cell's inner workings became all too apparent, and a new theory had to be devised, one that no longer relied on an overly simplistic protoplasm-style foundation, albeit one still bounded by materialism. Several decades later, finding himself in the throes of a Marxist socio-political upheaval within his own country, Russian biologist Aleksandr Oparin became the man for the task.

Oparin developed a neat scheme of inter-related processes involving the extrusion of heavy metals from the earth's core and the accumulation of reactive atmospheric gases, all of which, he claimed, could eventually lead to the making of life's building blocks: the amino acids. He extended his scenario further, appealing to Darwinian natural selection as a way through which functional proteins could progressively come into existence. But the 'tour de force' in Oparin's outline came in the shape of coacervates: small, fat-containing spheroids which, Oparin proposed, might model the formation of the first 'protocell'.

Oparin's neat scheme would in the 1940s and 1950s provide the impetus for a host of prebiotic synthesis experiments, the most famous of which was that of Harold Urey and Stanley Miller, who used a spark-discharge apparatus to make three amino acids: glycine, alpha-alanine and beta-alanine. With little more than a few gases (ammonia, methane and hydrogen), water, a closed container and an electrical spark, Urey and Miller had seemingly provided the missing link for an evolutionary chain of events that now extended as far back as the dawn of life. And yet, as Meyer concludes, the information revolution that followed the elucidation of the structure of DNA would eventually shake this materialistic bedrock.

Meyer’s historical overview of the key events that shaped origin-of-life biology is extremely readable and well illustrated.  Both the style and the content of his discourse keep the reader focused on the ID thread of reasoning that he gradually develops throughout his book.

Comments
nakashima wrote:
I find it strange that some sites never credit Huxley for admitting and retracting his error.
Even stranger is the way people gloss over a contemporary account of the Bathybius scam, such as Joseph Cooke's, in favour of that be-all, end-all fount of truth, Wikipedia.
Vladimir Krondan
July 20, 2009, 01:32 AM PDT
Some further footnotes: I see that a considerable exchange over FSCI (the functionally defined subset of CSI) happened yesterday. Jerry has just captured the essential point, given what we know about how cell-based, DNA- and RNA-using life operates:
“Nothing in Biology Makes Sense Except in the Light of functional complex specified information.”
In that context, it would be useful to remind ourselves of Orgel's original remark from 1973, as already noted at 82:
In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures which are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity. [L.E. Orgel, 1973. The Origins of Life. New York: John Wiley, p. 189.]
In short, Orgel makes a two-part distinction, points out examples and counter-examples, and draws a conclusion. In so doing, he sets the term "specified complexity" in a bio-functional (and of course -- given the significance of DNA and the genetic code -- informational) context. Thus, to deduce from it and explore the significance of BOTH (i) complex specified information in general, and (ii) functionally specified, complex information is plainly legitimate.

Dembski has provided a general metric; Abel et al have provided a narrower metric on the average information per symbol [aka informational entropy] in light of observed sequences; and Hazen is using a very similar concept with a threshold of function. The point there is that function is a macro-observable, which is compatible only with a cluster of components that are so fitted together as to be at an operating point -- borrowing a term from amplifier design. This means in turn that we have a recognisable macrostate, which can be distinguished from a non-functional macrostate; thence we can in principle do comparative microstate counts, thence get to entropy metrics a la Boltzmann's s = k ln w, so through Brillouin's negentropy formulation, to information. Or, following Jaynes et al, we may bridge to information on the informational interpretation of entropy, as I excerpt and discuss in App 1 of my always linked. Onward, as Bradley et al, following Yockey et al, show [observe my point 9 in that appendix], this ties into the CSI concept in a functional context.

This state-space, macro-micro distinction and compatibility, leading to comparative counts and information and entropy metrics, is the underlying thinking that is always at work in my discussion of FSCI. It is also directly relevant to the issue of the origin of life: bio-function is in effect a recognisable "macro" phenomenon, which is compatible with certain configurations of biologically relevant macromolecules, which are observed as created based on digital code and energy-using informational processing and associated organisation into functional structures at operating points. Such structures are fairly easily fatally perturbed, e.g. by radiation (which mostly creates free radicals from water molecules, triggering at random thermodynamically favourable reactions, in turn breaking up that fine-tuned, complex, co-ordinated functionality). This demonstrates the reality of islands of function in a sea of non-function. That islands of function often exist in a sea of non-function is an extension of remarks by Denton in his 1985 work on what would happen if letters were concatenated at random, and here at UD it is GPuccio [GP -- are you out there watching?] who seems to have first hit on the happy phrasing and imagery.

Now, as a result of that, we can ask about thresholds of complexity that make it utterly unlikely that such function originated by chance. The answer is that when specificity and complexity pass a threshold, it is maximally unlikely that available resources can move from plausible initial conditions [prebiotic "soups" especially] to function by undirected chance plus blind mechanical forces. But by contrast, it is well known that FSCI is produced by intelligent agents, linguistic and algorithmic text strings being an obvious case in point. So, the threshold is one of inference to best explanation. For that, we know that the observed cosmos has about 10^80 atoms, and that 10^25 s and one state per 10^-43 s [the Planck time] are a reasonable cluster.

So, for a unique functional state, 500 bits of information [~ 3*10^150 configs] is a good marker. Squaring that, we have a situation where the 10^150 states reachable by the observed cosmos across its thermodynamically plausible working life would only be able to search less than 1 in 10^150 of the possible configurations. So, it is reasonable that 1,000 bits of information storage capacity [= Shannon information] in an observed functional system that is vulnerable to modest perturbation will not be discoverable by undirected processes: sampling 1 in 10^150 of the config space is equivalent to marking one atom for one instant in the observed cosmos, then using a space probe capable of time travel and of traversing the entire observed cosmos at random to pick up just one atom, just once -- and voila, it is the marked atom at the marked moment. But such systems are routinely produced using imagination, foresight and design by intelligent actors. And this is a matter of easily observed empirical fact, starting with posts in this thread and the functional information displayed on your PC screen when you access it.

In short, it is maximally unlikely that, on the gamut of the observed cosmos, the observed digital-information-based cellular life originated by chance + necessity alone. [Recall, independent life forms start out at about 300-500 k base pairs.] Similarly, when we observe that major body plan innovations credibly call for leaps of 10-100+ million bases in DNA, and associated cellular machinery to give them effect, we can infer that major body plans are not credibly explained by chance + necessity alone. These observations are very compatible with design.

The real problem is that one possible candidate for such a designer -- especially given the evident fine-tuning of the cosmos that facilitates such life -- sounds a lot like the God of theism. (And worse, in Rom 1:19-22 or so, a major text in the specifically Judaeo-Christian manifestation of theism, it is baldly stated that attributes of the Godhead are discernible from the features of creation, rendering men "without excuse" for rejecting God. [This is actually a point of empirical testability for that tradition: had it been obviously failed, we would hear no end of gloating on the point; but when that test seems instead to have been passed, it becomes a lot more cutting in its force.]) And that seems to explain a lot of the intensity we have seen: much is at stake, much more than merely science based on a very reasonable inference to best explanation per FSCI and its known sources, which in any other context would hardly be worth noticing.
GEM of TKI
kairosfocus
July 20, 2009, 12:28 AM PDT
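For readers who want to check the arithmetic kairosfocus sketches above, here is a minimal Python illustration. The inputs are his stated figures (10^80 atoms, a 10^25 s working life, one state per 10^-43 s), not independently established values; note that their product comes to ~10^148, in the neighbourhood of the ~10^150 he quotes:

```python
# Probabilistic-resources arithmetic, using kairosfocus's stated inputs.
ATOMS          = 10**80   # his figure for atoms in the observed cosmos
SECONDS        = 10**25   # his assumed thermodynamic working life, in seconds
STATES_PER_SEC = 10**43   # one state per Planck time (~10^-43 s)

max_states   = ATOMS * SECONDS * STATES_PER_SEC   # upper bound on states visited
configs_500  = 2**500                             # configurations for 500 bits
configs_1000 = 2**1000                            # configurations for 1,000 bits

print(f"states visitable:       ~10^{len(str(max_states)) - 1}")
print(f"500-bit config space:   ~10^{len(str(configs_500)) - 1}")
print(f"searchable fraction of the 1000-bit space: ~10^-{len(str(configs_1000 // max_states)) - 1}")
```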
Let me provide a brief history of FSCI from my point of view. The term CSI was developed by Bill Dembski (anyone with a different understanding of its origin, please chip in). Dr. Dembski tries to formulate through mathematics a rigorous definition of design using this concept of CSI. As a newcomer here in 2005, I watched as people used the term CSI but witnessed what I describe as a floundering to describe it precisely in a non-mathematical fashion. I still don't understand the concept, because I believe it only makes sense in terms of the mathematics. Early in 2007 there was another long discussion of just what CSI was, with no apparent agreement by anyone on just what it meant.

Then two things happened. Bfast said it was his understanding that CSI was just information that specified something else. That made sense to me. That explained DNA, sentences and computer programs, but did not explain bridge hands, certain coin flips, or Mt. Rushmore. Well, in a way it describes Mt. Rushmore, but not perfect bridge hands, patterns in supposedly shuffled decks of cards, choices by political party members, coin flips with specific patterns, etc. At the same time kairosfocus came out of lurking, started contributing here, and described functional complex specified information. It was then obvious: Dembski was trying to be too general, developing a system that would determine for all entities whether they were designed or not, while in terms of evolution the interest was much narrower. There was no need for the more generalized concept that seemed to befuddle everyone. The information in DNA met this very narrow case of specified information, so why bother with CSI, since it was problematic? Hence the focus on FSCI or FCSI. It is simple to understand, and some calculations can be done on the sequences without too much trouble. No one realized at the time that OOL researchers such as Hazen were focusing on this same concept in trying to understand how life arose. They were relating sequence complexity to functional information.

But we have been inundated with mock complaints by the anti-ID people ever since, trying to steer the discussion to the more general CSI concept, which is less well defined. And a typical complaint is that our definition is not used in real science, thus it is bogus. It is an interesting game the anti-ID people play, and I often wonder what drives them to do such things. They must obviously know how simple and appropriate the concept is, yet they go on and on about its lack of scientific background or its imprecision when the concept lies at the foundation of biology. If Theodosius Dobzhansky were to make an honest statement about biology it would be "Nothing in Biology Makes Sense Except in the Light of functional complex specified information." That would be a more accurate statement than the one he made.
jerry
July 19, 2009, 09:37 PM PDT
Thank you Nakashima. Then I will venture one more contribution. If you find the 2007 PNAS Hazen paper, "Functional information and the emergence of biocomplexity" (Robert M. Hazen, Patrick L. Griffin, James M. Carothers, and Jack W. Szostak), you will see they do in fact make reference to "Islands of Functionality":
Islands of Function. What is the source of the reproducible discontinuities in Figs. 1 and 2? We suggest that the population of random Avida sequences contains multiple distinct classes of solutions, perhaps with conserved sequences of machine instructions similar to those of words in letter sequences or active RNA motifs (52).
(Page 5.) So I think KF is in fact using Hazen's simple formulation.
Atom
July 19, 2009, 09:14 PM PDT
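Since the Hazen paper is the reference point here, a hedged sketch of its functional-information measure may help: Hazen et al define I(Ex) = -log2[F(Ex)], where F(Ex) is the fraction of all configurations whose degree of function meets or exceeds Ex. The degree-of-function below (counting 'A' bases in a random 20-mer) is an invented toy for illustration, not anything from the paper:

```python
import math
import random

def functional_information(population, degree_of_function, threshold):
    """Hazen et al. (PNAS 2007): I(Ex) = -log2(F(Ex)), with F(Ex) the
    fraction of sequences whose degree of function is at least Ex."""
    hits = sum(1 for seq in population if degree_of_function(seq) >= threshold)
    if hits == 0:
        return math.inf  # no sampled sequence reaches the threshold
    return -math.log2(hits / len(population))

# Toy degree-of-function: count of 'A' bases in a random 20-base string.
random.seed(0)
population = [''.join(random.choice('ACGT') for _ in range(20))
              for _ in range(100_000)]
print(functional_information(population, lambda s: s.count('A'), threshold=12))
```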
Nakashima, Sorry, but FSCI is a very simple concept. In the genome, think transcription and translation. A string of DNA is FSCI if it enables the formation of some unrelated but useful entity through some set of intermediary processes. How difficult is that to understand? Some DNA is junk and leads to nothing; it is just a repeat of probably useless information, as opposed to useful information or FSCI. DNA which is FSCI leads to a protein or an RNA polymer which has function within the cell. Someone who dabbles in GAs cannot be so dense that this is not obvious. This is now literally Biology 101.

Because we recognize the similarity of this process to language and computer programming and give it a name does not mean that it is not meaningful if others do not use the same name. Others have made the same assessment: it is information, it is complex, it specifies a function of some other entity. If it makes you happy, call it the Nakashima transform system. As indicated, other processes that follow this pattern are language and computer programming. Each requires an intermediary system to specify another entity that has function. To measure the complexity of the DNA string, just as one measures the complexity of a word, sentence, paragraph, line of code, module or program, one calculates the likelihood of the sequence of symbols -- in the genome, the DNA sequence.

You obviously know this, but you demand exact precision, feigning that this is not a scientific concept and could go the way of phlogiston, when in fact it is used everywhere, every day in biology. There is no worry it will disappear because you and others do not like our definition. As I said, use your own definition. I find this amusing and wonder why you and others persist in this charade. What could be the root of this faux concern for the inappropriateness of this ID concept? We could start a whole thread on this topic.
jerry
July 19, 2009, 09:02 PM PDT
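As a rough illustration of the calculation jerry gestures at (measuring a string's complexity from the likelihood of its symbol sequence), here is a minimal sketch under the simplest assumption of a uniform, independent alphabet; published metrics such as Durston's functional sequence complexity are considerably more involved:

```python
import math

def sequence_capacity_bits(seq, alphabet_size):
    # Naive storage capacity: log2 of the number of equally likely strings
    # of this length. This measures improbability of the raw sequence,
    # not whether it does anything functional.
    return len(seq) * math.log2(alphabet_size)

# A 39-base DNA string over a 4-letter alphabet: 39 * 2 = 78 bits.
print(sequence_capacity_bits("ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG", 4))
```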
Mr Atom, A contribution is never an interruption!
Nakashima
July 19, 2009, 08:45 PM PDT
Mr. Nakashima, Yes, there is no mention of that. I had the impression that KF's usage was consistent (and equivalent) with Hazen's, since I've seen him in the past relate FSCI to the work of Durston, Abel and Trevors, which uses either Hazen's formulation or one very similar. I'll let him answer if he had a different usage in mind. (If he did, I apologize for interrupting the flow of conversation with an inaccurate reference!)
Atom
July 19, 2009, 08:30 PM PDT
Mr Atom, Thanks for the reference! But I am not sure if Mr Kairosfocus will agree that the two definitions are equivalent, because there is no discussion of "islands of function" in Hazen's functional information. Good luck with your work at EIL!
Nakashima
July 19, 2009, 07:24 PM PDT
Dear Nakashima, Sorry to jump into your thread, but Functional Information (which is equivalent to FSCI) is defined by Hazen: here. Also, earlier this week we had a breakthrough in which we would be able to directly relate functional information (FSCI) to Active Information (the metric developed by the EIL, which seems most amenable to use with GAs and searches in general). I'm guessing it will be the topic of an upcoming paper.
Atom
July 19, 2009, 06:15 PM PDT
Mr jerry, Very happy to respond to you, sir. First, as Mister Kairosfocus has reminded us, specified complexity has its roots in the OOL literature. I for one have no objection to deriving CSI from specified complexity. FSCI seems to be KF's private term, often used here, picked up by a few others. That is fine, as far as it goes, but to be taken seriously outside this blog it needs a clear and concise definition. Otherwise it will fall into the category of terms such as phlogiston, protoplasm and "I know it when I see it" vagueness. It seems that FSCI is measurable in bits, according to KF's frequent usage. So it would seem amenable to precision. I believe this is exactly the kind of precision that ID studies needs to earn its place in the scientific community. My encouragement is genuine.

Just as FSCI is an abstract concept, and can be applied to many non-biological models, so genetic algorithms are abstractions that can be applied in a variety of contexts. They do not model or rely on a close analogy to the cell. The broader term Evolutionary Computation is more apt, since it focuses on the use of evolutionary operators: a population, history, variation and selection. So the first point is that if FSCI is truly an important concept, it should be applicable to an abstract GA just as much as to a beaker of chemicals. I'm not implying that abiogenesis research is being done by anyone today via GA, though there are some relevant efforts. More to the point are KF's frequent claims that abiogenesis is akin to solving a 1000-bit problem. Well, some 1000-bit problems are not very hard, and others are. (BTW, GAs can be used to solve problems up to a gigabit in size, larger than the human genome. Larger than the potato genome!!) How do you tell the difference? It seems to me that KF has identified FSCI-equivalent problem hardness with "islands of function". Again, the term needs a precise definition to be useful. Some problems have obvious islands of function and some don't. I'd be perfectly happy if KF said he is working towards some FSCI metric such as (# of bits in the solution) * (hardness), with a strict definition of hardness.

To bring this back to abiogenesis, it remains to be proven that actual chemical evolution happens under appropriate circumstances, and whether it is a 'hard' problem. Exactly what 'islands of function' means in prebiotic conditions is unclear. Having confirmed Oparin's guesses, chemists must try to find support for hypercycles and NK landscapes. The success or failure of that kind of investigation will confirm whether prebiotic chemistry has the 'islands of function' which KF asserts that it does, by analogy with today's biochemistry.

In summary, FSCI and GAs are both abstractions from real biology, and claims are made for them on abstract problems that people hope are relevant to chemical systems. But if you want to advance our understanding of the world, you have to be a bit more precise than we have seen heretofore. My questions have been meant to help that happen.
Nakashima
July 19, 2009, 05:31 PM PDT
"Can you give me an example of something with 499 (not designed)" No, I doubt anything exists of this complexity that was also functionally specified could ever arise naturally. You would be hard pressed to find something of a few bits that wasn't designed. The number was picked because it is so large that no possible combination of atomic states since the Big Bang could lead to it. "500 (designed" probably any of your posts here "million bits of FSCI?" a typical short thread here "an almost impossible hurdle for naturalistic processes to deal with I note you say “almost”. You don’t rule it out 100% then? Why not?" Because it is theoretically possible for all the atoms to end up in one corner of the room at the same time. There is a greater than zero probability so it is not impossible. If we were in an universe that was eternal, then all is possible. We are dealing with rhetoric here and while I never say absolutely, the actual probability is quite low, requiring thousands of zeros past the decimal place before you get to a non zero integer.jerry
July 19, 2009, 05:04 PM PDT
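To make jerry's examples a little more concrete, here is a hedged sketch using the rough convention, seen in these threads, of about 7 bits per ASCII character; the bits-per-character figure is an assumption for illustration, not a settled metric:

```python
import math

# On the ~7 bits/character assumption, 500 bits is crossed at 72 characters,
# which is why "any of your posts here" plausibly exceeds the threshold.
BITS_PER_CHAR = 7
post = ("This comment easily exceeds the five-hundred-bit threshold "
        "once it passes seventy-two characters.")
print(len(post) * BITS_PER_CHAR, "bits")        # 679 bits: past the threshold
print(math.ceil(500 / BITS_PER_CHAR), "chars")  # 72 characters to reach 500 bits
```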
Jerry, Can you give me an example of something with 499 (not designed), 500 (designed) and a million bits of FSCI?
an almost impossible hurdle for naturalistic processes to deal with
I note you say "almost". You don't rule it out 100% then? Why not?
Mr Charrington
July 19, 2009, 04:38 PM PDT
Nakashima, Most of us here believe that FSCI is an impediment that the anti-ID people cannot get over/around/through. It is easy to understand, and the proposition that it does not occur in nature, excluding life and intelligent activity, seems an almost impossible hurdle for naturalistic processes to deal with. Because of this difficulty we get lots of double talk here from the anti-ID contingent and precious little that deals with this very simple idea. Stereotypical troll behavior.

It does not seem that GAs are relevant to abiogenesis, because genetic algorithms (about which I know next to nothing) deal with stuff that already exists in a cell, and I assume a GA is looking for some sort of improvement. A GA could get you non-functional FSCI, or in essence not FSCI since it does not have function, and thus the organism would die. Or it could get you modified FSCI, which is not very interesting in the evolution debate, happens all the time, and is a big ho hum. Sometimes the modified FSCI has a different function; this is interesting, and only a small ho hum. What it cannot seem to get is a completely unrelated FSCI, because, as kairosfocus says, they are rare and there is nothing functional in between, so no organism could continue to exist when the GA took the organism off the reservation and into no man's land where nothing can survive. Or else the thing being modified is not functional (duplicated genomic material) and the GA modifies this non-functional element till it eventually stumbles onto a faraway island of functionality. This latter part is what I understand the Gouldian worshipers to believe, and it has upset some Darwinian worshipers here.

That is my layman's understanding of kairosfocus's thoughts and GAs, and it seems to make sense to me. There really isn't any such thing as a GA or a search in nature, but only a continual production of modifications which natural selection sweeps aside, except maybe a rare exception as hoped for by the anti-ID enthusiasts. Of the latter, the examples are few and very far between and nearly always trivial. Occasionally, and I mean occasionally, we get a big drum roll, breathless expectations, and then presentation of what in the long run is minutiae. So tell me where I am wrong so I do not write an invalid synopsis next time.
jerry
July 19, 2009, 04:28 PM PDT
I understand that a value of 500 bits for FSCI indicates certain design. Is there an example of something with 499 bits of FSCI? I'm interested to see the value of FSCI in actual examples, as it's spoken about so much. Can anyone give me an example of something with:
1 bit of FSCI
350 bits of FSCI
499 bits of FSCI
501 bits of FSCI
1,000,000+ bits of FSCI
and explain how you came to that figure? That would be great.
Mr Charrington
July 19, 2009, 04:14 PM PDT
"Chemistry is the medium; information is the message." It is a true shame then that this amazing "vital force" can be tinkered with by replication enzymes making mistakes. "A little bit of knowledge can be a dangerous thing, if unjustified extrapolations are made from it." How unintentionally prophetic.derwood
July 19, 2009, 03:55 PM PDT
From the OP: "...Leon Urey and Stanley Miller who used a spark discharge apparatus to make the three amino acids- glycine, alpha-alanine and beta-alanine."
I do hope that Meyer gave a more accurate accounting than that. In an interview, when questioned about the yield of his experiments, Miller replied: "Just turning on the spark in a basic pre-biotic experiment will yield 11 out of 20 amino acids. If you count asparagine and glutamine you get thirteen basic amino acids. We don't know how many amino acids there were to start with. Asparagine and glutamine, for example, do not look prebiotic because they hydrolyze. The purines and pyrimidines can also be made, as can all of the sugars, although they are unstable."
11 is more than 3. (From http://www.accessexcellence.org/WN/NM/miller.php)
derwood
July 19, 2009, 03:52 PM PDT
Mr jerry, Sadly, no. It has been mostly about nothing. Mr Kairosfocus advanced a very interesting idea about FSCI but seems unwilling to follow it up and discuss its implications. A discussion of GAs where fitness is decided by competition, rather than nearness to a prespecified target, and with selection based clearly on relative fitness, would be very valuable to many readers of UD (including myself, of course). It would move us past the insufferable WEASEL rag doll immensely.
Nakashima
July 19, 2009, 03:34 PM PDT
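For readers wondering what Nakashima is asking for, here is a minimal sketch of a GA whose selection is purely relative -- tournament selection between competitors, with no prespecified target string. The fitness function is an invented placeholder; real competitive or co-evolutionary fitness schemes are richer than this:

```python
import random

random.seed(1)
GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 32, 60, 200, 0.02

def fitness(g):
    # Placeholder criterion: reward runs of agreeing adjacent bits.
    return sum(1 for a, b in zip(g, g[1:]) if a == b)

def tournament(pop):
    # Selection by head-to-head competition: relative, not absolute, fitness.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def mutate(g):
    return [bit ^ (random.random() < MUT_RATE) for bit in g]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(tournament(pop)) for _ in range(POP_SIZE)]
print(max(fitness(g) for g in pop))  # best relative fitness found (max 31)
```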
Question of the day. Has Nakashima actually said anything in all his recent posts, or are they much ado about nothing? Kairosfocus, keep up the good work. You seem to have upset Nakashima with your relevant logic and facts. Maybe Nakashima and most of the other anti-ID people should retire to an appropriate cul-de-sac to discuss their ideas.
jerry
July 19, 2009, 02:42 PM PDT
Sensei
Is anyone going to argue that Brownian motion is intelligently designed, or that each perturbation is the finger of the deity?
I wouldn't be at all surprised! According to some here we must boil our computers if we want to accurately model Brownian motion.
BillB
July 19, 2009, 02:17 PM PDT
Mr kairosfocus,
"The insistence on starting from the shores of islands of function simply underscores that there is no cogent answer on getting to such a shoreline without intelligent direction."
This completely misses the point of how fitness landscapes based on competition, and selection based on relative fitness, obviate arguments based on "islands of function". It doesn't matter if one molecule's reactivity is low. If it is twice another molecule's, then the first molecule will capture twice the resources over time for its reaction products.
Nakashima
July 19, 2009, 01:32 PM PDT
Mr Kairosfocus,
"(Indeed, we may simply observe that organisms die on modest perturbation of functional organisation.)"
And this is relevant to abiogenesis how? You don't seem to be grasping the essential point that there is no absolute sense of function in discussing chemical evolution. There are only relative rates of reaction.
Nakashima
July 19, 2009, 12:51 PM PDT
Mr Kairosfocus,
"Also, we do not routinely observe such random number generators routinely issuing King Henry V's speech or the like. We do see intelligent agents routinely issuing linguistic and algorithmic organised sequences that exhibit FSCI."
Again, apropos of nothing. A GA is not just an RNG. But perhaps you would like to return to a discussion of Polonius' speech and its generation? That went so well for you.
Nakashima
July 19, 2009, 12:46 PM PDT
Mr Kairosfocus,
"c] does the fitness landscape have to have islands of function before the functional context generates FCSI? Again, ever since Orgel in 1973, it has been well understood that complex functional organisation is distinct from mechanically generated order, and randomness. (Cf here the Abel et al cluster: orderly, random and functional sequence complexity.) In that context, the concepts of complex specified information and, as a relevant subset, functionally specified complex information, are relevant."
Relevant, agreed, but how defined? You are not making progress towards a detailed definition. You have objected to examples which you believe do not exhibit "islands of function". By what metric can anyone make that distinction?
"Further to this, since complex function resting on complex co-adapted and configured elements is inherently highly vulnerable to perturbation, such functionality naturally shows itself as sitting on islands in a sea of non-function. That is, the description of islands in a sea of non-function is not arbitrary or suspect, but empirically well-warranted. (We do not write posts here by spewing out letters at random . . . )"
Warranted, but how measured? Alliteration is not explanation.
Nakashima
July 19, 2009, 12:39 PM PDT
Mr Kairosfocus,
"This is the same basic reasoning that underlies the statistical form of the 2nd law of thermodynamics."
Which is apropos of exactly nothing. What is your procedure for differentiating active information sourced in the RNG from active information sourced in the code provided by me?
Nakashima
July 19, 2009, 12:33 PM PDT
Mr Kairosfocus,
"a] N, 123: In repeating that the FSCI must have come from the programmer . . . As the above shows, I am not making an a priori commitment (which is what the highlighted indicates) but an inference to best, empirically anchored explanation. That is, I have made a scientific rather than a philosophical inference -- it is evolutionary materialism that has introduced a priori censoring commitments on this subject, cutting off the evidence from speaking."
You will need to provide more detail in step 2. Your 'inference' is one that Dembski and Marks cannot achieve; you are on the brink of making ID scientific in a way even Lewontin would approve of.
Nakashima
July 19, 2009, 12:29 PM PDT
Mr Kairosfocus,
"A few footnotes: It is fairly clear from the telling rhetorically strategic silence on the point above that advocates of abiogenesis and/or body plan level macro-evolution have no clear empirical evidence of the following originating by undirected chance and mechanical necessity tracing to blind natural forces:"
There is no point "above". My question to you, which this word smog is trying to avoid, is about your definition of FSCI.
Nakashima
July 19, 2009, 12:25 PM PDT
Yes, always Lewontin, because he is the poster child for all the trolls on this site.
jerry
July 19, 2009, 10:43 AM PDT
Mr BillB, Your point could be equally well addressed to Mr Joseph. Random number generators are abstractions, part of a model of a reality that may in fact depend on Brownian motion or some other mixing process. Is anyone going to argue that Brownian motion is intelligently designed, or that each perturbation is the finger of the deity?
Nakashima
July 19, 2009, 10:24 AM PDT
OK, shorter KF: "NO, I am not going to answer your questions. PPS - Lewontin!"
Nakashima
July 19, 2009, 10:14 AM PDT
BillB, Joseph seems to be saying that his process directly outputs "random numbers", whereas the radioactive decay process only outputs "yes, decay has happened" or "no decay" -- i.e. 1 or 0, with a variable, unpredictable time between each 1 or 0 (of course, the average decay time is predictable, leading to the "half life" concept). Just not "random numbers" as such, e.g. 344543, 7938284596, 238957438564, etc. Is that about the size of it, Joseph? Out of interest, Joseph, what is the range of random numbers that your diode can generate?
Mr Charrington
July 19, 2009, 09:37 AM PDT
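As a footnote on how a bare decay/no-decay stream can yield numbers over an arbitrary range, here is a hedged sketch of the interval-comparison technique used by decay-based generators such as HotBits: compare successive inter-event times and emit one bit per pair. The exponential generator below merely simulates detector hardware, and a noise-diode source would need its own debiasing step:

```python
import random

def decay_intervals():
    while True:
        yield random.expovariate(1.0)  # simulated time between decay events

def bits_from_intervals(intervals):
    while True:
        t1, t2 = next(intervals), next(intervals)
        if t1 != t2:                   # skip ties to keep the bits unbiased
            yield 1 if t1 > t2 else 0

def random_number(bitsource, n_bits=16):
    # Assemble n_bits independent bits into one integer in [0, 2**n_bits).
    return sum(next(bitsource) << i for i in range(n_bits))

src = bits_from_intervals(decay_intervals())
print([random_number(src) for _ in range(3)])   # e.g. three 16-bit numbers
```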
