Uncommon Descent | Serving The Intelligent Design Community

An Eye Into The Materialist Assault On Life’s Origins


Synopsis Of The Second Chapter Of Signature In The Cell by Stephen Meyer

ISBN: 9780061894206; ISBN10: 0061894206; HarperOne

When the 19th-century chemist Friedrich Wöhler synthesized urea in the lab using simple chemistry, he set rolling the ball that would ultimately knock down the then-pervasive 'vitalistic' view of biology. Life's chemistry, rather than being bound by immaterial 'vital forces', could indeed be artificially made. While Charles Darwin offered little insight on how life originated, several key scientists would later jump on Wöhler's 'Eureka'-style discovery with public proclamations of their own 'origin of life' theories. The ensuing materialist view was espoused by the likes of Ernst Haeckel and Rudolf Virchow, who built their own theoretical suppositions on Wöhler's triumph. Meyer summed up the logic of the day:

“If organic matter could be formed in the laboratory by combining two inorganic chemical compounds then perhaps organic matter could have formed the same way in nature in the distant past” (p.40)

Darwin's theory generated the much-needed fodder to extend evolution backward to the origin of life. It was believed that "chemicals could 'morph' into cells, just as one species could 'morph' into another" (p.43). Appealing to the apparent simplicity of the cell, late 19th-century biologists assured the scientific establishment that they had a firm grasp of the 'facts': cells were, in their eyes, nothing more than balls of protoplasmic soup. Haeckel and the British scientist Thomas Huxley were the ones who set the protoplasmic theory in full swing. While the details expounded by each man differed somewhat, the underlying tone was the same: the essence of life was simple and thereby easily attainable through a basic set of chemical reactions.

Things changed in the 1890s. With the discovery of cellular enzymes, the complexity of the cell's inner workings became all too apparent, and a new theory had to be devised, one that no longer relied on an overly simplistic protoplasm-style foundation, albeit one still bounded by materialism. Several decades later, finding himself in the throes of a Marxist socio-political upheaval in his own country, the Russian biologist Aleksandr Oparin became the man for the task.

Oparin developed a neat scheme of inter-related processes involving the extrusion of heavy metals from the earth's core and the accumulation of reactive atmospheric gases, all of which, he claimed, could eventually lead to the making of life's building blocks: the amino acids. He extended his scenario further, appealing to Darwinian natural selection as a way through which functional proteins could progressively come into existence. But the 'tour de force' of Oparin's outline came in the shape of coacervates: small, fat-containing spheroids which, Oparin proposed, might model the formation of the first 'protocell'.

Oparin's neat scheme would in the 1940s and 1950s provide the impetus for a host of prebiotic synthesis experiments, the most famous of which was that of Harold Urey and Stanley Miller, who used a spark-discharge apparatus to make three amino acids: glycine, alpha-alanine and beta-alanine. With little more than a few gases (ammonia, methane and hydrogen), water, a closed container and an electrical spark, Urey and Miller had seemingly provided the missing link for an evolutionary chain of events that now extended as far back as the dawn of life. And yet, as Meyer concludes, the information revolution that followed the elucidation of the structure of DNA would eventually shake the underlying materialistic bedrock.

Meyer’s historical overview of the key events that shaped origin-of-life biology is extremely readable and well illustrated.  Both the style and the content of his discourse keep the reader focused on the ID thread of reasoning that he gradually develops throughout his book.

Comments
Mr Jerry,

"There has never been any known FCSI produced by nature including life. The origin of life is under debate but by all current understanding FSCI is beyond the power of nature to produce. The only logical conclusion then is to conclude that life was probably not produced by nature because nature most likely cannot produce FCSI."

That reasoning is perfectly circular.
Nakashima, July 20, 2009 at 02:28 PM PDT
Joseph-san, As you can see on the thread above, KF-san was showing how to perform a calculation, compare the value to a standard, and from that comparison infer design. It sounds as if you think that inference proceeds with no evidence whatsoever.
Nakashima, July 20, 2009 at 02:17 PM PDT
jerry:

"Around here we are talking about the subset that does specify something else."

Who is we? From what I can tell, you and bFast are the only ones here who think that CSI and FSCI refer to information that specifies something else, as opposed to information that is specified. Correct me if I'm mistaken in that observation.

"DNA meets the definition of FCSI."

Really? Is FCSI's complexity measured relative to a chance hypothesis, as CSI's is? If so, what is the chance hypothesis and how did you estimate the probabilities?

"There has never been any known FCSI produced by nature including life."

That's quite a sweeping claim. And it seems a little premature, seeing that no work on FCSI has ever been published. (Unless you think that FCSI is synonymous with Abel's and Durston's functional sequence complexity. If so, why introduce another term?)
R0b, July 20, 2009 at 02:07 PM PDT
"The information is specified by a specifying agent. It does not necessarily specify something else."

But it could, and some very interesting subsets of CSI specify other entities. Around here we are talking about the subset that does specify something else. In all cases except DNA we can identify a specifier or a likely specifier, so it is not an issue that it is specified according to your understanding of CSI and has its origin in intelligence.

So let's say we abandon the concept of CSI for the moment. Then FCSI exists on its own merits and is easy to understand, and we will assume it is not related to CSI. DNA meets the definition of FCSI. There has never been any known FCSI produced by nature including life. The origin of life is under debate, but by all current understanding FSCI is beyond the power of nature to produce. The only logical conclusion then is to conclude that life was probably not produced by nature, because nature most likely cannot produce FCSI. Thus by default, because life is based heavily on FCSI, it was probably originally specified by some intelligence. So now we are back to DNA being CSI according to your definition and understanding.
jerry, July 20, 2009 at 01:20 PM PDT
Nakashima-san:

"You can't come to a design inference without a rigorous, repeatable process."

That is false. First there is a design inference, and only then can one hope to determine a specific process. You see, reality dictates that in the absence of direct observation or designer input, the only possible way to make any scientific determination about the designer(s) and/or the specific process(es) used is by studying the design in question. And BTW, it is very repeatable that designing agencies can design and create irreducibly complex systems, information storage systems, and information communications systems.
Joseph, July 20, 2009 at 12:53 PM PDT
Mr Joseph, You can't come to a design inference without a rigorous, repeatable process. Mr Kairosfocus has said he can make a scientific inference.
Nakashima, July 20, 2009 at 12:45 PM PDT
R0b, CSI is more rigorous than anything the non-telic position has to offer, so please stop your whining. Also, this isn't about revolutionizing science (science got to this point on the shoulders of IDists); it is about again letting scientists come to a design inference, if that is what the data points to.
Joseph, July 20, 2009 at 11:38 AM PDT
"I am an engineer (software), not a scientist nor mathematician. As such, I think in much more concrete terms than the others do."

I think there is a good-sized population of us engineers on this board, and precious few scientists and mathematicians. A pity, IMO.

"ROb, if you are suggesting that Dembski says that CSI only exists when a product is the result of the CSI, rather than the information being gathered from the product, then I think you present a valid splitting of hairs."

Actually, the specification, according to Dembski's usage of the term, need not precede the product. One of Dembski's objectives in his work is to flesh out the idea of "post-specification".

"That said, it is my understanding that Dembski was not the originator of the term CSI."

As far as I know, he is the originator of the term, unless you count Crime Scene Investigation. He has a handful of technical definitions of the term, but he usually uses it quite loosely. Other people have differing understandings of the concept, which is certainly fine. But if the concept is to be employed in revolutionizing science, it needs to be rigorized.
R0b, July 20, 2009 at 11:33 AM PDT
Hey, a good opportunity to interject. I am an engineer (software), not a scientist nor mathematician. As such, I think in much more concrete terms than the others do. I see CSI as a blueprint that fully describes a product. Now, if one finds a product and generates a blueprint to describe the product, is this somehow fundamentally different than if one finds the blueprint which was used to manufacture the product?

ROb, if you are suggesting that Dembski says that CSI only exists when a product is the result of the CSI, rather than the information being gathered from the product, then I think you present a valid splitting of hairs. That said, it is my understanding that Dembski was not the originator of the term CSI. Further, I have had discussions with others who pull quotes out of Dembski's work that suggest that he has created a definition which obligates his conclusion. I think that CSI is a concept that must belong to the world, not to Dembski alone.
bFast, July 20, 2009 at 10:50 AM PDT
Mr Nakashima, You are correct. You asked if the Hazen link made reference to "Islands of Functionality", which you took as a sign it was different from KF's usage. You wrote:

"because there is no discussion of 'islands of function' in Hazen's functional information."

I wanted to point out that they did discuss the Islands of Function concept, using their functional info in the same way KF did. You are completely correct that they're talking about a specific landscape. In the same way, KF is talking about specific biological landscapes.
Atom, July 20, 2009 at 10:43 AM PDT
jerry:

"Bfast said it was his understanding that CSI just was information that specified something else. That made sense to me."

If bfast said this, then he and you are not talking about Dembski's CSI. Dembski defines CSI as complex specified information, not complex specifying information. The information is specified by a specifying agent. It does not necessarily specify something else.
R0b, July 20, 2009 at 10:06 AM PDT
Nakashima:

"So what is the source of this increase? Mr KF has asserted something ("[put there by its designer]"), but I have trouble following his logic."

Indeed, both Nakashima and I have been trying, from different angles, to get kairosfocus to provide details on FSCI accounting practices, which appear rather ad hoc. kairosfocus has spent an awful lot of functionally specified pixels responding to arguments I haven't made, while leaving this issue floating in the ether.

When FSCI comes from a computer, even if there are random elements involved, he invariably credits the FSCI to the designer/programmer of the computer. When FSCI comes from a human, he invariably does not credit the FSCI to the designer/environment/inherited traits/randomness of the human. The basis for this seems to be the view that computers are mechanical, preset, programmed, without thought or understanding, capable of only artificial languages, non-learning, etc., while humans are volitional, creative, spontaneous, original, decision-making, common-sense, rational but not predictable, etc.

If those are the key concepts in crediting FSCI to the proper source, then the concepts need to be operationalized and incorporated into the definition of FSCI. As it is, we have no way of determining the actual source of the FSCI.
R0b, July 20, 2009 at 09:47 AM PDT
jerry:

"And a typical complaint is that our definition is not used in real science thus it is bogus."

But are leading ID researchers like Dr. Dembski using it? I've asked this before, but unfortunately, Dr. Dembski is seemingly not following KF's FCSI comments.
sparc, July 20, 2009 at 09:46 AM PDT
Mr Kairosfocus, Thank you for humoring me and being patient with me. I appreciate your linking to specific materials out of the large store of your always linked. I appreciate your working a calculation of an example. I am a bit (a weak pun, very sorry) unsure why C and S are binary values. Are these also "eye of the beholder" variables? Perhaps I don't understand what you mean by 1/0.

Let me try to work through an example: a screen displaying an image of the Mona Lisa.
C = 1, because the image is contingent. The Mona Lisa is a single point in the screen's config space.
S = 1, because the image is specific. I'll use the fact that it has a non-zero size when compressed to motivate this choice.
B = 11.52*10^6 bits.
C*S*B = 11.52*10^6 > 1,000, therefore design!

Let's try again to make sure I understand: an image of static.
C = 1, same as before.
S = 1, same as before.
B = 11.52*10^6 bits.
C*S*B = 11.52*10^6 > 1,000, therefore design!

It seems to me from your presentation that there is an inference to design for all B.
Nakashima, July 20, 2009 at 09:18 AM PDT
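Nakashima's two calculations above are mechanical enough to script. A minimal Python sketch of the rule-of-thumb X = C*S*B metric as described in this thread (a reading of the description, not anyone's actual code; the binary C and S values are judged by eye and supplied by hand, which is precisely the point under dispute):

    # X = C * S * B, per the rule-of-thumb metric discussed in this thread.
    # C (contingency) and S (specificity) are observer-supplied 0/1 judgments;
    # B is the storage size in bits. The 500-1,000 bit thresholds are quoted above.
    def fsci_bits(C, S, B):
        return C * S * B

    B_screen = 800 * 600 * 24               # 11,520,000 bits for a 24-bit 800x600 screen
    mona_lisa = fsci_bits(1, 1, B_screen)
    tv_static = fsci_bits(1, 1, B_screen)   # same judgments, same B
    print(mona_lisa > 1000, tv_static > 1000)   # True True: both cross the threshold

As the identical outputs show, once S is scored 1 for both images the metric cannot distinguish the Mona Lisa from static; everything turns on how S is assigned.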
Mr BillB, Yes, it seems that Mr KF is willing to allow that a GA's population members contain FCSI. I think he might agree that the best of generation 1 contains less FSCI than the best of generation N. Indeed, the whole population has probably increased in FCSI. So what is the source of this increase? Mr KF has asserted something ("[put there by its designer]"), but I have trouble following his logic. He claims to be able to make a scientific inference that it is the designer, not the RNG, the clock, the growing history, that is the source of this incremental FCSI. This is more than Dembski and Marks claim to be able to do.
Nakashima, July 20, 2009 at 08:37 AM PDT
KF: Just to pick you up on one point:

"A genetic algorithm is expressing active information in it [put there by its designer], towards a target zone."

If you mean a target area in its configuration space, then this is incorrect. The algorithm will try to maximise an agent's score as given by a fitness function, but how the agent achieves this score is not normally defined by the fitness function: the phenotype is not a target. If you take a look at some of Karl Sims's early work in evolving virtual creatures, you see lots of different evolved solutions that arise from the same fitness function. There can be many possible configurations that qualify as 'fit'. Also, not all GAs use static fitness functions; incremental approaches will use fitness functions that gradually change, and embodied fitness functions are sometimes used that are a product of the agent's environment rather than an imposition by an external auditor. If FCSI enters the design via the fitness function, then is it not conceivable that all the FCSI we observe in nature is a product of natural selection, which is itself a fitness function designed by a deity?
BillB, July 20, 2009 at 06:51 AM PDT
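BillB's point, that a fitness function rewards a score without specifying the configuration that earns it, shows up even in a toy run. A hypothetical Python sketch (not any of the GAs discussed in this thread): the fitness function below is maximized by any 20-bit string with exactly ten 1s, of which there are C(20,10) = 184,756, so independent runs routinely land on different maximally 'fit' genotypes.

    import random

    L, POP, GENS = 20, 30, 100

    def fitness(genome):
        # Rewards any genome with exactly half its bits set; it says nothing
        # about *which* bits, so there are 184,756 distinct maximal solutions.
        return -abs(sum(genome) - L // 2)

    def evolve(seed):
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(L)] for _ in range(POP)]
        for _ in range(GENS):
            parents = sorted(pop, key=fitness, reverse=True)[:POP // 2]
            # Offspring are mutated copies of randomly chosen parents.
            pop = [[bit ^ (rng.random() < 0.05) for bit in rng.choice(parents)]
                   for _ in range(POP)]
        return tuple(max(pop, key=fitness))

    winners = {evolve(seed) for seed in range(5)}
    print(len(winners))   # typically 5: each run finds a different maximally fit genome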
PPS: Oops, point 7 had a less-than sign in it and got clipped, sorry: 7 --> For instance, for the 800 * 600 pixel PC screen, C = 1, S = 1, B = 11.52 * 10^6, so C*S*B = 11.52 * 10^6 FS bits. This is well beyond the threshold. [Notice that if the bits were not contingent or were not specific, then X = 0 automatically. Similarly, if B < 500, the metric would indicate the bits as functionally or compressibly etc. specified, but without enough bits to be comfortably beyond the UPB threshold. Of course, the DNA strands of observed life forms start at about 200,000 FS bits, and that for forms that depend on others for crucial nutrients. 600,000 - 10^6 FS bits is a reported reasonable estimate for a minimally complex independent life form.]
kairosfocus, July 20, 2009 at 06:06 AM PDT
PS: A genetic algorithm is expressing active information in it [put there by its designer], towards a target zone. It is not an original source of FSCI, though the fact that such a program will normally itself have FSCI in it is testimony to the reality that designers create FSCI.

PPS: Brillouin et al do create an information metric, as shown in the linked: negentropy. Jaynes et al show a link from physical to informational entropy, and Durston et al give FSC metrics that are more sophisticated versions of what the simple FSCI metric here presents. Excerpting the just linked, two clicks away from all my recent posts at UD:

>> FSCI is also an observable, measurable quantity, contrary to what is imagined, implied or asserted by many objectors. This may be most easily seen by using a quantity we are familiar with: functionally specific bits [FS bits], such as those that define the information on the screen you are most likely using to read this note:

1 --> These bits are functional, i.e. presenting a screenful of (more or less) readable and coherent text.

2 --> They are specific, i.e. the screen conforms to a page of coherent text in English in a web browser window, defining a relatively small target/island of function by comparison with the number of arbitrarily possible bit configurations of the screen.

3 --> They are contingent, i.e. your screen can show diverse patterns, some of which are functional, some of which -- e.g. a screen broken up into "snow" -- would not (usually) be.

4 --> They are quantitative: a screen of such text at 800 * 600 pixels resolution, each pixel of bit depth 24 [8 each for R, G, B], has in its image 480,000 pixels, with 11,520,000 hard-working, functionally specific bits.

5 --> This is of course well beyond a "glorified common-sense" 500 - 1,000 bit rule of thumb complexity threshold at which contextually and functionally specific information is sufficiently complex that the explanatory filter would confidently rule such a screenful of text "designed," given that -- since there are at most that many quantum states of the atoms in it -- no search on the gamut of our observed cosmos can exceed 10^150 steps . . . .

6 --> So we can construct a rule of thumb functionally specific bit metric for FSCI:

a] Let contingency [C] be defined as 1/0 by comparison with a suitable exemplar, e.g. a tossed die.
b] Let specificity [S] be identified as 1/0 through functionality [FS] or by compressibility of description of the information [KS] or similar means.
c] Let degree of complexity [B] be defined by the quantity of bits to store the relevant information, with 500 - 1,000 bits serving as the threshold for "probably" to "morally certainly" sufficiently complex to meet the FSCI/CSI threshold.
d] Define the vector {C, S, B} based on the above [as we would take distance travelled and time required, D and t], and take the element product C*S*B [as we would take the ratio D/t to get speed].
e] Now we identify: C*S*B = X, the required FSCI/CSI-metric in [functionally] specified bits.

7 --> For instance, for the 800 * 600 pixel PC screen, C = 1, S = 1, B = 11.52 * 10^6, so C*S*B = 11.52 * 10^6 FS bits. This is well beyond the threshold. [Notice that if the bits were not contingent or were not specific, then X = 0 automatically. Similarly, if B < 500, the metric would indicate the bits as functionally or compressibly etc. specified, but without enough bits to be comfortably beyond the UPB threshold.]

8 --> A more sophisticated metric has of course been given by Dembski, in a 2005 paper . . . .

9 --> When 1 >/= χ, the probability of the observed event in the target zone or a similar event is at least 1/2, so the available search resources of the observed cosmos across its estimated lifespan are in principle adequate for an observed event [E] in the target zone to credibly occur by chance. But if χ significantly exceeds 1, that becomes increasingly implausible. The only credibly known and reliably observed cause for events of this last class is intelligently directed contingency, i.e. design.

10 --> Thus, we have a rule of thumb informational metric and a more sophisticated informational measure for CSI/FSCI, both providing reasonable grounds for confidently inferring to design. (Durston, Chiu, Abel and Trevors provide a third metric, functional bits or fits, a functional bit extension of Shannon's H-metric of average information per symbol, here.) >>
kairosfocus, July 20, 2009 at 06:01 AM PDT
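One criterion in step 6b above, compressibility of description [KS], can at least be probed mechanically. A sketch assuming a general-purpose compressor as the stand-in for "compressibility of description" (the excerpt does not prescribe any particular compressor):

    import os, zlib

    # Structured, functional text compresses well; random "snow" (point 3 above)
    # barely compresses at all. The compression ratio is one crude, computable
    # proxy for the S = 1/0 judgment via the [KS] criterion.
    text = b"the quick brown fox jumps over the lazy dog " * 200
    snow = os.urandom(len(text))

    for label, data in (("text", text), ("snow", snow)):
        ratio = len(zlib.compress(data)) / len(data)
        print(label, round(ratio, 3))   # text ~ 0.01; snow ~ 1.0

Note that on this reading static scores S = 0 (it is incompressible), whereas Nakashima's "non-zero size when compressed" reading above scores it S = 1; the two readings of [KS] give opposite verdicts, which is the ambiguity the thread keeps circling.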
Nakashima-San: First, the point of evolutionary materialistic OOL scenarios is that somehow molecular noise transformed itself into DNA-driven algorithmic digital processing, and that further noise created major body plans. We are not talking about one or two mutations at points in already functioning DNA, but 600 k bits or so of information for first life and tens to hundreds-plus of megabits for novel body plans. And, if you will follow up on the links and the identified cases, I believe the key points that need to be clarified will become much clearer to you. (It now seems to me, from your remarks on the log2 pi relationship, that you have not first seen what I am saying -- and link from EVERY post ever made by me at UD -- before criticising. That's not cricket.) GEM of TKI
kairosfocus, July 20, 2009 at 05:31 AM PDT
Mr Kairosfocus, From your always linked:

"As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule."

Are these your FSCI? Can you state them explicitly? From a subsequent post:

"Once 500 - 1,000 bits info storage capacity is passed, that becomes a most unlikely explanation relative to the known source of such FSCI: design."

How pleasant to return to where we began. So a GA that evolves 1,000-bit-long IPD competitors is generating FSCI?
Nakashima, July 20, 2009 at 05:29 AM PDT
Mr kairosfocus,

"I am not claiming that all cases of functionality follow the pattern of islands of function in a sea of non-function, only that this is relevant to certain key cases studied using the concept FSCI."

I agree these are the cases of ultimate interest, but along the way you have made a number of claims which I am hoping you can clarify.

"In particular, cases where modest perturbation at random will as a rule derange function. Most digital symbolic codes and code based systems are like that: change the sufficiently long string at random, and it will usually either become a non code word or inappropriate to its context."

Actually, I think that the longer the string, the more likely you are to preserve function after a single letter change, but that depends very much on the problem. In terms of chemistry, the cube-square law and the categorization of amino acids as hydrophobic or -philic mean that long strings are resistant to losses in function due to small random perturbations. This is where the rubber meets the road, and no amount of generalization on either side of the argument will resolve the issue; only experiment will.

"(Genetic algorithms are very particular about what they allow to change at random -- the 'genes', not the rest of the program or the operating system that supports it!)"

Sir, are you entering this as a serious comment? This is the error that led to Gil Dodgen receiving so much ridicule.
Nakashima, July 20, 2009 at 05:16 AM PDT
PPPS: A bridge or poker etc. hand can be so assigned a target zone that it is complex and specified to some threshold or other (though such hands in general will not pass observable-universe-level CSI thresholds . . . cf. the calculation on Dembski's metric in the weak argument corrective no. 27). But it has no directly observable function in a system, unlike machine code in a PC or DNA code in a cell. Right from Orgel, the latter has been a specific relevant context for FSCI as an issue. OBSERVE real-world function, then look at degree of complexity and what happens on perturbing it. Is it reasonable that so much complex functionality could arise by chance + necessity only? Once 500 - 1,000 bits of info storage capacity is passed, that becomes a most unlikely explanation relative to the known source of such FSCI: design.
kairosfocus, July 20, 2009 at 05:02 AM PDT
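The arithmetic behind the parenthetical claim that card hands "in general will not pass observable universe level CSI thresholds" is short: a specific 13-card bridge hand is one of C(52,13) equally probable deals, i.e. about 39 bits, nowhere near the 500 - 1,000 bit threshold used in this thread.

    from math import comb, log2

    hands = comb(52, 13)          # 635,013,559,600 possible bridge hands
    print(round(log2(hands), 2))  # ~39.21 bits for any one specific hand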
PPS: And, I have used FSCI in particular in contexts of complex digital information (and things reducible to that). DNA is a digitally coded, information-bearing molecule that uses a four-state basic code element, so it is entirely appropriate to speak of it in that light and draw comparisons to other cases of codes and algorithmic instructions.
kairosfocus, July 20, 2009 at 04:54 AM PDT
PS: Nakashima-San, please, please! (Cf. Section A of my always linked, where I discuss information [not to mention Appendix 3, where I speak to CSI, FSCI and related concepts in the context of their roots and relevance]; you have spoken ignorantly and dismissively.) Furthermore, we have been discussing a very clear set of cases of observed function, which are quite publicly available: posts in this blog, the wider Internet, long enough [143 ASCII character] text strings that make sense in English, program code, PC screens, assembly of a house from its parts, assembly of a flyable jet plane from its parts, DNA-RNA and the cell's executing mechanisms. This is not at all "private" or "imprecise."
kairosfocus, July 20, 2009 at 04:50 AM PDT
Mr Jerry,

"Dembski was trying to be too general and develop a system that would determine for all entities whether they were designed or not, while in terms of evolution the interest was much more narrow. There was no need for this more generalized concept that seemed to befuddle everyone."

That is an interesting perspective. It goes to the heart of whether FSCI is a general, abstract concept that can be calculated for bridge hands, sequences of coin flips, etc., or whether it is specific to biological contexts. This is why I spoke earlier of the FSCI of the output of a computer program vs. that of a beaker of chemicals. But it seems that Mr Kairosfocus has used FSCI in very non-biological contexts, so again, I await clarification from him.
Nakashima, July 20, 2009 at 04:45 AM PDT
Nakashima-San: I am not claiming that all cases of functionality follow the pattern of islands of function in a sea of non-function, only that this is relevant to certain key cases studied using the concept FSCI. In particular, cases where modest perturbation at random will as a rule derange function. Most digital symbolic codes and code-based systems are like that: change the sufficiently long string at random, and it will usually either become a non-code word or inappropriate to its context. (Genetic algorithms are very particular about what they allow to change at random -- the "genes", not the rest of the program or the operating system that supports it!) Observe in this regard that there is a fair degree of error detection and correction routinely carried out on DNA in the living cell. GEM of TKI
kairosfocus, July 20, 2009 at 04:41 AM PDT
Mr jerry,

"To measure the complexity of the DNA string, just as one measures the complexity of a word, sentence, paragraph, line of code, module or program, one calculates the likelihood of the sequence of symbols, or in the genome, the DNA sequence, to assess its likelihood."

Yes, -log2(p(x)). Wouldn't it be nice if Mr Kairosfocus were in agreement that this simple definition was appropriate, since so many other people use it! That is all I'm asking for: agreement on a precise definition. Or not. Mr KF could also simply declare that FSCI has no precise definition, that 'functional' is an adjective like 'pretty', its meaning completely private. I'm just not willing to assume I know what Mr KF means. Look at how much wrangling there was on the Weasel threads over terminology, with invented terms like quasi-latching springing up like mushrooms after the rain. I agree with Mr Atom: it is better to let the man speak for himself.
Nakashima, July 20, 2009 at 04:34 AM PDT
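The measure Nakashima endorses here is the standard surprisal. For a DNA string of length n under a uniform four-symbol chance hypothesis, -log2((1/4)^n) = 2n bits, the familiar two-bits-per-base figure; a one-line check:

    from math import log2

    n = 100           # bases in the DNA string
    p = 0.25 ** n     # probability under a uniform chance hypothesis
    print(-log2(p))   # 200.0 bits, i.e. 2 bits per base

Any other chance hypothesis (unequal base frequencies, dependence between sites) changes p and hence the bit count, which is why R0b presses above for the chance hypothesis to be stated.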
Mr Atom, That material from Hazen 2007 is saying that islands of function exist in a particular fitness landscape, not that they are intrinsic or necessary to the definition of functional information. As an example, if all the functional points on a landscape were arranged like Mt Fuji, gradually sloping up to a single peak, they would have the same functional information measure as a landscape where each functional point was a pole sticking up out of the sea, separated from any other pole by miles of flat surface. (That is how they were drawn by someone, Douglas Axe?)
Nakashima, July 20, 2009 at 04:17 AM PDT
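Hazen et al. (2007) define functional information for an activity level x as E(x) = -log2(F(x)), where F(x) is the fraction of all sequences whose function meets or exceeds x. A toy sketch (the "activity" function below is invented purely for illustration) shows that the measure depends only on how many sequences qualify, not on how they are arranged in the landscape, which is Nakashima's point:

    from itertools import product
    from math import log2

    def functional_information(x, activity, seqs):
        # -log2 of the fraction of sequences meeting the activity threshold x.
        frac = sum(activity(s) >= x for s in seqs) / len(seqs)
        return -log2(frac) if frac else float("inf")

    seqs = ["".join(p) for p in product("01", repeat=10)]  # all 1,024 length-10 strings
    activity = lambda s: s.count("1")                      # toy activity: number of 1s

    # 56 of 1,024 strings have activity >= 8, wherever they sit in the landscape.
    print(round(functional_information(8, activity, seqs), 2))  # ~4.19 bits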
Mr Krondan, You are free to edit Wikipedia to bring more attention to that account, if you like.
Nakashima, July 20, 2009 at 03:58 AM PDT
Note to ID opponents: Keep throwing rocks at KF. By all means, feel free to split some more hairs.
Upright BiPed, July 20, 2009 at 03:13 AM PDT