Uncommon Descent Serving The Intelligent Design Community

The Key Thing to Remember

Last week the Wall Street Journal published a brief list of the scientific problems with evolution, supplied by John West of the Discovery Institute. Scientists are well aware of these problems but it is probably worthwhile to spell them out occasionally in a major newspaper. Even more worthwhile were the responses supplied by evolutionist Dr. Eric Meikle. [1]

Meikle is the Outreach Coordinator at the National Center for Science Education and has several decades of experience in evolution research, teaching and advocacy. Not surprisingly Meikle’s responses to West’s four problems are typical. They can be found throughout the evolutionary literature, from popular treatments to textbooks, and they speak volumes.

The evolutionist's response to fundamental problems with his theory is reminiscent of a salesman's. "Don't worry, just trust us" is the message, which is otherwise void of any scientific depth. Evolution is a fact, even if we don't have a clue how it happened.

Is it not a problem that most mutations (the supposed fuel for evolution) are harmful and the rare beneficial ones produce only minor changes? Not a problem, reassures the evolutionist. Perhaps harmful mutations can turn beneficial if the environment shifts. And in any case, as Meikle explains, biologists are continuing "to research mechanisms that produce evolutionary advances." So we're supposed to ignore scientific problems on the conviction that they will be resolved by future research?

West also points out that natural selection does not explain the development of fundamentally new biological features and organisms. Again, don't worry, replies the evolutionist, for the Darwinists are busy looking at other factors such as genetic drift, in which genes can spread rapidly through small populations even if they don't confer a specific advantage. But of course this helps very little, for West's point would apply with equal force in the case of genetic drift. Science is not telling us that natural selection, or any other known mechanism, creates fundamentally new biological features and organisms.

What about those species appearing abruptly in the fossil record? It is as though, as Richard Dawkins once put it, they were planted there. Again, it can all be explained, provided one is willing to speculate. The vagaries of the fossil record allow for several million years, explains Meikle, over which the species could have evolved in a rapid process that would have left few fossils. That's convenient. The fossil species appear abruptly, so this must mean that evolution occurs rapidly, leaving scant trace of its prodigious activities.

Surely the origin of first life must be admitted to be a problem, for everything from the basic macromolecules to the cell seems to defy a naturalistic explanation. Yet evolutionists even here maintain their optimistic speculation. "Research on the origin of life," assures Meikle, "is very active." And some of life's chemicals have been synthesized under simulated conditions while others occur naturally in outer space. But Meikle's confidence is without support, for this research has revealed a host of profound problems, as any origin of life researcher knows. As if aware of the overstatement, Meikle concludes that, in any case, we ought not "assume that simply because humans have not done something, it cannot have happened through natural processes." In other words, never doubt natural processes.

Evolution may defy science, but since it is a fact we know it must have occurred, one way or another. In this house of mirrors the high claims are matched only by the mysteries in explanation. But not to worry. “The key thing to remember,” reassures Meikle, “is that the evidence of evolution is overwhelming and independent — it stands no matter what debate might arise about the precise mechanisms involved.” But it is precisely this “evidence of evolution” that Meikle just failed to provide. In the face of fundamental scientific problems, the evolutionist can only respond with automated replies about future research and the heroics of naturalism.

1. Stephanie Simon, “Critiques, and Defenses, of Evolution,” The Wall Street Journal, May 2, 2008.

Comments
I thought "God doesn't create junk." But God hath chosen the foolish things of the world to confound the wise; and God hath chosen the weak things of the world to confound the things which are mighty; And base things of the world, and things which are despised, hath God chosen, yea, and things which are not, to bring to nought things that are: That no flesh should glory in his presence. 1 Corinthians 1:27-29
Vladimir Krondan
May 16, 2008, 9:27 PM PDT
Not to mention a certain percentage of dead code in any application - or code that is so rarely accessed as to be in a coma. What about the junk DNA lying around a computer lab, or reams of film footage on the editing room floor after doing 20 takes of every scene? What about previous versions of software maintained for years and years on the slim chance that someone might need to refer to them again? Every design endeavor we know about involves great quantities of junk DNA.
JunkyardTornado
May 16, 2008, 9:03 AM PDT
I am not buying their junk DNA.
I've never understood ID proponents who think the genome will be 100% functional. There IS going to be a certain level of data corruption. After all, the hard drive on the computer you are using suffers a known error rate and there will be some corrupted files eventually. The design of the biological system even allows for non-functional duplications, which computers do not.
Patrick
May 16, 2008, 8:35 AM PDT
I am not buying their junk DNA. They can wrap it in all the fancy words they want, but it is just plain wrong. The information coding is too great. The data compression of the DNA is proven to be up to 12 codes thick (maybe more!). The capacity of a DNA molecule to store information is so efficient that all the information needed to specify an organism as complex as man weighs less than a few thousand-millionths of a gram. The information needed to specify the design of all species of organisms that have ever existed (a number estimated to be one billion) could easily fit into a teaspoon with plenty of room left over for every book ever written on the face of the earth. For comparison's sake, if mere man were to write out the proper locations of all those protein molecules in just one human body, in the limited mathematical language he now uses, it would take a billion-trillion computer hard drives, and that's just the proper locations for the protein molecules in one human body; those billion-trillion computer hard drives would not contain a single word of instruction telling those protein molecules how to self-assemble. The atoms in a human being are equivalent to an information mass of about a thousand billion billion billion bits. Even with today's top technology, this means it would take about 30 billion years to transfer this mass of data from one point to another. That's twice the age of the universe. It is beyond ludicrous to continue playing games with the Darwinists; this level of complexity, to quote Newton, "could only proceed from the dominion of an intelligent and powerful Being." To believe chance can build something that we cannot even understand, even with our most powerful super-computers helping us, is ignorant. In fact it should be the very definition of ignorance we find in the dictionary! As well, if "information" being the foundation of reality (Zeilinger) can be formalized to a sufficient degree, all this petty talk of materialism generating "information" will be moot, for the very legs of the argument will be taken out from under the alchemistic-evolutionists.
bornagain77
May 14, 2008, 9:38 PM PDT
It's kind of an ironic scenario - ID would say most of the universe is junk in relation to the creation of humans, in that they would say that there is nothing there capable of making humans on its own. Also there is a significant group in ID that would say the same about the brain: that any role it has in thought is minor in comparison to something else not yet discovered. Evolutionists, OTOH, would say that a lot of the genome is junk, whereas ID is radically opposed to this idea. And evolutionists would say, in contrast to ID, that the physical universe is not junk in that it is capable of producing the biological world on its own.
JunkyardTornado
May 14, 2008, 8:15 PM PDT
BornAgain77 wrote: As well, since the genome is soon promising to be verified to 100% poly-functionality this leaves no room in the Genome (Junk DNA is not JUNK) for evolutionists to play their fantasy games in with their hypothetical math using their hypothetical “beneficial mutations” which have actually never been shown to occur in reaklity in the first place. And yet you can lose 90% of your brain and still think perfectly well - go figure. The articles from the NIH news and Boston.Com diverged somewhat. NIH: The new data indicate the genome contains very little unused sequences and, in fact, is a complex, interwoven network. In this network, genes are just one of many types of DNA sequences that have a functional impact. Boston.com: As for the remaining 95 percent of the genome? "There's this weird lunar landscape of stuff we don't understand," Lander said. "No one has a handle on what matters and what doesn't."... "To our shock and consternation, we're learning how little we know about the parts of the genome that may matter most," said Dr. David M. Altshuler, associate professor of genetics and medicine at Harvard Medical School and also a top researcher at the Broad Institute. "Maybe some of it really is junk. Maybe most of it is junk," he said. "But one shouldn't bet against nature. Maybe it all serves some sort of a purpose. We really don't know."... No one knows what all that extra RNA is doing. It might be regulating genes in absolutely essential ways. Or it may be doing nothing of much importance: genetic busywork serving no real purpose. Many researchers believe the truth falls somewhere in between. "Half of it may be doing something very useful," said Lander, who is also a professor of biology at MIT. "The other part may turn out to be, well, just junk - doing neither great good nor great harm. -------- But even in further down in the NIH article from the previously mentioned definitive-sounding statement there is the following: According to ENCODE researchers, this lack of evolutionary constraint may indicate that many species' genomes contain a pool of functional elements, including RNA transcripts, that provide no specific benefits in terms of survival or reproduction. As this pool turns over during evolutionary time, researchers speculate it may serve as a "warehouse for natural selection" by acting as a source of functional elements. Interesting nonetheless.JunkyardTornado
May 14, 2008, 8:03 PM PDT
In current studies we can also gather a few more tidbits to bolster the case for degradation of CSI. African cichlid fish: a model system in adaptive radiation research http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1635482 Of special note: Interestingly, ecological opportunity (the availability of an unoccupied adaptive zone), though explaining rates of diversification in radiating lineages, is alone not sufficient to predict whether a radiation occurs. The available data suggest that the propensity to undergo adaptive radiation in lakes evolved sequentially along one branch in the phylogenetic tree of African cichlids, but is completely absent in other lineages. Instead of attributing the propensity for intralacustrine speciation to morphological or behavioural innovations, it is tempting to speculate that the propensity is explained by genomic properties that reflect a history of repeated episodes of lacustrine radiation: the propensity to radiate was significantly higher in lineages whose precursors emerged from more ancient adaptive radiations than in other lineages. Thus as you can see, the evolutionists are mystified that the radiations are not happening for "sub-species" of cichlids but are always radiating from the "more ancient" parent lineage. Yet this evidence fits in very well with, and even refines, the ID model. Specifically, it links CSI to a single parent species (biblical kind), and it shows confirmation of Lee Spetner's environmentally driven rapid adaptations (adaptations that always occur at a loss of CSI) - rapid adaptations, I point out, that have no credible explanation from any Neo-Darwinist models or math. As well, current breakthroughs in physics with quantum teleportation (Zeilinger) are showing that "information" is foundational to reality, i.e. information is now shown to have dominion and transcendence over energy/material. In fact Zeilinger goes further: In conclusion it may very well be said that information is the irreducible kernel from which everything else flows. Then the question why nature appears quantized is simply a consequence of the fact that information itself is quantized by necessity. It might even be fair to observe that the concept that information is fundamental is very old knowledge of humanity, witness for example the beginning of the gospel according to John: "In the beginning was the Word". Anton Zeilinger, Professor of Physics. If information is the irreducible kernel of reality, then evolutionists have a huge problem of the first order, for now not only is information shown not to be able to be generated by Darwinian methods - i.e. it is shown that material (living or dead) cannot generate meaningful CSI information - it is now shown that transcendent information itself is what ultimately gives rise to the energy/material realm. Somehow, as a Theistic IDist, I think the poetry of this turn of events is a most fitting justice to such a preposterous theory as Darwinism.
bornagain77
May 14, 2008, 5:56 PM PDT
Naturalists/Evolutionists use all sorts of tricks to try to find the amount of "beneficial mutations" they need in order for them to make the math work for evolution. The most common is to claim that a "beneficial" mutation, which actually destroys something so as to increase fitness (trench warfare), is not really destructive to the molecular machinery but is beneficial in the long term (unrealistic/unscientific fantasy thinking). Yet when looked at more realistically, as Behe did in EoE of 2007, we find that truly beneficial mutations that are actually constructive on the molecular level, to explain the complexity we see in life, are virtually non-existent (10^40 double CCC estimate). This fits in well with Dr. Dembski's conservation of information (CSI) as well as with the empirical evidence that is now available through genetic studies and the fossil record itself. For example Dr. Sanford in Genetic Entropy points out the fact that the genome is poly-functional and as such is now poly-constrained to any random mutations. This fact that he pointed out in 2005 has been verified by ENCODE; In June 2007, an international team of scientists, named ENCODE, published a study that indicates the genome contains very little unused sequence and, in fact, is a complex, interwoven network. This "complex interwoven network" throughout the entire DNA code makes the human genome severely poly-constrained to random mutations (Sanford; Genetic Entropy, 2005; page 141). This means the DNA code is now much more severely limited in its chance of ever having a hypothetical beneficial mutation since almost the entire DNA code is now proven to be intimately connected to many other parts of the DNA code. Thus even though a random mutation to DNA may be able to change one part of an organism for the better, it is now proven much more likely to harm many other parts of the organism that depend on that one particular part being as it originally was. Since evolution was forced, by the established proof of Mendelian genetics, to no longer view the whole organism as what natural selection works upon, but to view the whole organism as a multiple independent collection of genes that can be selected or discarded as natural selection sees fit, this "complex interwoven network" finding is extremely bad news, if not absolutely crushing, for the population genetics scenario of evolution (modern neo-Darwinian synthesis) developed by Haldane, Fisher and Wright (pages 52 and 53: Genetic Entropy: Sanford 2005)! http://www.genome.gov/25521554 BETHESDA, Md., Wed., June 13, 2007 - "An international research consortium today published a set of papers that promise to reshape our understanding of how the human genome functions. The findings challenge the traditional view of our genetic blueprint as a tidy collection of independent genes, pointing instead to a complex network in which genes, along with regulatory elements and other types of DNA sequences that do not code for proteins, interact in overlapping ways not yet fully understood." http://www.boston.com/news/globe/health_science/articles/2007/09/24/dna_unraveled/?page=1 "The science of life is undergoing changes so jolting that even its top researchers are feeling something akin to shell-shock. Just four years after scientists finished mapping the human genome - the full sequence of 3 billion DNA "letters" folded within every cell - they find themselves confronted by a biological jungle deeper, denser, and more difficult to penetrate than anyone imagined."
As well, since the genome is soon promising to be verified as 100% poly-functional, this leaves no room in the genome (Junk DNA is not JUNK) for evolutionists to play their fantasy games in with their hypothetical math using their hypothetical "beneficial mutations", which have actually never been shown to occur in reality in the first place. To top matters off, the fossil record, when scrutinized for fidelity, actually shows what the IDists would expect from degradation of information in the genome. (Webster, Trilobites, 2007) http://www.terradaily.com/reports/The_Cambrian_Many_Forms_999.html As well, genomes of canines show loss of genetic diversity with sub-speciation from wolves. Molecular evolution of the dog family, ROBERT K. WAYNE. This pattern holds for humans: "We found an enormous amount of diversity within and between the African populations, and we found much less diversity in non-African populations," Tishkoff told attendees today (Jan. 22) at the annual meeting of the American Association for the Advancement of Science in Anaheim. "Only a small subset of the diversity in Africa is found in Europe and the Middle East, and an even narrower set is found in American Indians." Tishkoff; Andrew Clark, Penn State; Kenneth Kidd, Yale University; Giovanni Destro-Bisol, University "La Sapienza," Rome, and Himla Soodyall and Trefor Jenkins, WITS University, South Africa, looked at three locations on DNA samples from 13 to 18 populations in Africa and 30 to 45 populations in the remainder of the world. Even Allen MacNeill agrees that this is the persistent pattern of the fossil record (although he will say loss of diversity is due to anything other than the degradation of CSI). "As Niles Eldredge and Stephen Jay Gould pointed out almost three decades ago, the general pattern for the evolution of diversity (as shown by the fossil record) follows precisely this pattern: a burst of rapid diversity following a major ecological change, and then a gradual decline in diversity over relatively long periods of time." Allen MacNeill PhD; 2007 (Evolutionist) Teaches introductory biology and evolution at Cornell University in Ithaca, NY. To be cont.
bornagain77
May 14, 2008, 5:15 PM PDT
edit: UBB=UPB
JunkyardTornado
May 14, 2008, 12:49 PM PDT
Paul Giem wrote: Your question about the elephant, if put into the realm of the genome, would suggest a very complex genome initially, which would imply some sort of front-loading. It would also make it even more unlikely that unguided evolution could account for present-day life, as unguided evolution would have had to create an even more complex genome than we have now, and then let it deteriorate to the present state. The deterioration would be quite understandable, but the creation of a huge functional genome, when a much smaller one would do fine, would be even more unlikely than a direct route. Who says the original would have to be functional? It would just have to be complex. Just some sort of proto-organism(s) with extremely limited manifest functionality but crawling around with a vast repository of junk DNA.
JunkyardTornado
May 14, 2008, 12:36 PM PDT
Patrick: How is that not a constructive beneficial mutation? My point was that the constructive mutation wouldn't be anywhere near 320 bits of CSI - more like 20. Depends on where and how the change occurred, doesn't it? Patrick had already said that it might not exceed the UPB when written in C/C++. Shifting the goalposts does not make your argument strong. You haven't even substantiated your claim that Darwinian processes could pull off the easier change you described. -- UD Admin
JunkyardTornado
May 14, 2008, 12:24 PM PDT
Even though you intended your elephant example as a joke, I have seen scientists posit that a series of deleterious beneficial mutations will "eventually" lead to a constructive beneficial mutation.
If the UBB says that there aren’t enough particle interactions that could occur in the universe to create a cotton-pickin’ flagellum by chance, then why on earth does the universe exist?
I'm not following your logic.
This seems to indicate something about the nature of reality itself for this to be able to occur.
If the design of nature itself allows for the emergence of self-organization, I'd have no problem. That's pretty much cosmological ID extended further. But we should be able to observe this if that's the case. Especially in regard to chemical evolution, the design of nature seems to deny the self-organization of biological systems.
Patrick
May 14, 2008, 12:07 PM PDT
Thanks to Paul and Patrick for your thoughtful replies, but as I was essentially just making a joke, you may have been casting your pearls before swine (or elephants). But if I can digress for just a moment, within the general context of "front-loading" and CSI and UBB, and ask your opinion on something. If the UBB says that there aren't enough particle interactions that could occur in the universe to create a cotton-pickin' flagellum by chance, then why on earth does the universe exist? It doesn't seem to be good for anything. This question seems especially relevant in a Christian context, where one would assume that man is the endpoint of creation (as God was manifested as a man). So man is the endpoint of creation, and yet an infinitesimally minute speck in a vast universe of absolutely worthless junk of unimaginable scale. I thought "God doesn't create junk", to quote an inspirational poster (or at least not junk DNA). If we can look up at the sun and recognize it as the energy source for all meaningful activity on this planet, be it thinking, planning, designing, working, building, and so on, would it be unreasonable to assume that all those stars out there were integral to our creation as well, in that they represent a vast energy pool that created the probabilistic resources necessary for our world to emerge by chance? Front-loading most definitely, I would say, but in that the natural world is such that marvelous things can emerge purely by chance. This seems to indicate something about the nature of reality itself for this to be able to occur. If the Darwinists are blind to that, then that's their problem, but isn't the official position of evolution merely agnosticism as opposed to atheism?
JunkyardTornado
May 14, 2008, 12:04 PM PDT
tribune7, (31) Thanks for your comments. I agree that we all, ID and non-ID alike, can be too quick to settle into an explanation and assume it is right and the only explanation.
Paul Giem
May 14, 2008, 11:33 AM PDT
JunkyardTornado, (32) Your question about the elephant, if put into the realm of the genome, would suggest a very complex genome initially, which would imply some sort of front-loading. It would also make it even more unlikely that unguided evolution could account for present-day life, as unguided evolution would have had to create an even more complex genome than we have now, and then let it deteriorate to the present state. The deterioration would be quite understandable, but the creation of a huge functional genome, when a much smaller one would do fine, would be even more unlikely than a direct route. One can account for Microsoft Word as a deterioration of Microsoft Office, but then one has to account for the production of Microsoft Office from DOS by simple random changes, duplications, etc. This seems to make the probability of creating a large genome even more vanishingly small than it already is. On the other hand, it would fit front-loading just fine. Perhaps the ancestors of all dogs had a more complex genome than all modern dogs, and the genome has just been pared to the point that we have wolves, coyotes, foxes, dingos, etc. It would seem that the natural conclusion of the discussion you suggested would be that front-loading would be more likely than unguided evolution.Paul Giem
May 14, 2008, 11:26 AM PDT
Bad analogy. It's like saying that in order to prevent criminals from breaking into your house you'll remove all the doorknobs and that will eventually lead to a security system. At what point will these degradative mutations lead to an IC core that provides a pro-active immune system mechanism to combat malaria? Such a mechanism may not even contain components related to pyruvate kinase. BTW, I was referring to the code being modified in C. I didn't consider implementing a timer in the sorting function at the time. (Although for these types of apps the industry practice is now to leave the dataset unsorted, so it's not just me who came up with this solution.) And in order to maintain the example that's assuming the self-replication is done at a machine code level. I'd have to compare the machine code to figure out exactly what the informational bits would be in order to use the Explanatory Filter. But I don't even have access to that code anymore, so...
what if a timer got thrown into the sorting code by accident and different variants of the code had different time-out values. Once the timer expired, the sorting was aborted. Since only partially sorted data certainly wouldn't hurt anything, presumably the variant with the optimal timeout value would prevail.
How is that not a constructive beneficial mutation? It's just a different method to get the same result I achieved with my method: to pro-actively stop the sorting when the dataset got too large. The timer value might not be balanced properly but, again, optimality is another issue. You've also failed to establish that such a change would be within the reach of Darwinian processes. I already gave an example of a deleterious beneficial mutation, but here's another one: critical lines of code within the sorting function are deleted. The loop may still execute but the performance bottleneck was from the memory usage.
A high-level programming language is an artifice that obscures what is actually going on
In my opinion, a better comparison to genomes would be bytecode and not a high-level programming language. Obviously biological code, despite its intense level of error correction, which rivals that of modern hard drives, has a greater level of plasticity (i.e., it doesn't fail to "compile" on a single minor error). But the vast majority of single changes that are not caught by error correction are still fatal to the resulting organism.
But with the actual binary executable you can change bits around and it will still run.
That could be compared to epigenetics.
Patrick
May 14, 2008, 11:25 AM PDT
Patrick wrote: Again, we're looking for CONSTRUCTIVE mutations, not just beneficial mutations in general. [Behe:] The relevance to my analysis in The Edge of Evolution is that, like other mutations that help with malaria, these mutations, too, are ones which degrade the function of a normally very useful protein ... even beneficial mutations are very often degradative mutations. Second, it's a lot faster to get a beneficial effect (if one is available to be had) by degrading a gene than by making specific changes in genes. Question: How do you make a statue of an elephant? Answer: Get a block of stone and carve away everything that doesn't look like an elephant. Discuss.
JunkyardTornado
May 14, 2008, 10:49 AM PDT
Paul, Very good points. I think the proper attitude is just to simply love a mystery. I think many of those on the anti-ID side want an answer now, insist they have one, and insist it is the one they wanted.
tribune7
May 14, 2008, 8:07 AM PDT
The more I think about Patrick's example, the more it does seem to be an instructive microcosm. A couple of other points: 1) My understanding of computer architecture is dated, but historically, you would have a small set of high-speed registers, and often there was a requirement that at least one operand in any instruction had to be in one of these registers. So in the example, I said let's assume that the loop iterator is in one of the registers, where there were only a hundred registers. Someone could object: are there not potentially millions of memory locations where a data value could be stored? And yes, there are, but the immediate relevant context tends to be in those high-speed registers (and furthermore there's not anywhere near a hundred of those, I'm sure, on even a modern processor). So the obvious point is, since there's only a handful of possible instructions, then if there's also an immediate context where data values will be found (those registers), it REALLY increases the odds of what you can accomplish through random chance. Is there an immediate relevant context with organisms as well? A high-level programming language is an artifice that obscures what is actually going on, so that if you reference a variable, it's not made known to you that behind the scenes it's being shifted temporarily into a register. Also, with a high-level language, have one character out of place anywhere and the thing won't even compile. That cannot be comparable to an organism. Compilers could be written so that with anything ambiguous they just take some arbitrary course of action, but instead they throw an error. But with the actual binary executable you can change bits around and it will still run. 2) In the case of the sorting subroutine, it was pointed out you could throw that instruction into it anywhere and have it function correctly. The interesting point is that the larger and more complex that subroutine is, the more likely you will be able to optimize it by random chance in this way (because if the subroutine is larger, the greater the chance you'll hit it by tossing the instruction into the program at random). This seems to be a corollary to the alternative idea that the more complex a function is, the easier it is to break it (which is definitely not the case here).
JunkyardTornado
May 13, 2008, 11:44 PM PDT
Sorry, but there's no editing feature here (seems like even ID is not capable of coming up with that). Just have the register being referenced be the one holding the data set size, not the loop iterator.
JunkyardTornado
May 13, 2008, 10:10 PM PDT
tribune7, (21) An experiment that is similar in some ways to the one you quoted has been done. At the WIPP project in New Mexico, large salt crystals were selected that reportedly showed no evidence of recrystallization. The crystals were carefully sterilized with concentrated hydrochloric acid and (separately) sodium hydroxide. Spore-forming halophilic bacteria were cultured from fluid inclusions in two of the crystals (three inclusions--one crystal had two inclusions). The bacilli were documented to be reduced in surviving number by 10^-11 by the acid and alkali treatment. Conclusive proof that halophilic bacterial spores came from when the salt beds originally formed, right? Not so fast. You see, these salt beds had a conventional geologic age of 250 million years. There's no way these spores could have lasted this long, what with radiation and all. And so, reviewers made choices A and/or B: A. The salt crystals had recrystallized, regardless of evidence to the contrary. B. The acid/alkali treatment had not completely sterilized the crystals, and the spores came from outside of the crystals. The investigators continued to choose: C. 250 million year old bacterial spores can still be viable. Of course, one can imagine the pandemonium that would be set off within the scientific community with: D. The formations are not that old. The halophilic bacteria story might make some ID adherents a little more sympathetic to the reactions that you postulate for the proposed prayer experiment. Ideas die hard, and to be fair, perhaps some of them should.Paul Giem
May 13, 2008, 10:01 PM PDT
The point about the timer merely 'aborting' means it could just do a subroutine 'return', not have to specify an actual goto address. Assume the sorting code is in its own subroutine. Then within that subroutine we know there will be one major loop that just gets executed over and over again until the sorting is completed. As a result you could throw the following instruction into the subroutine at any random point: "if (n>x) then return". So if the instruction set is 5 in length (e.g.) and say there are 100 registers, you have 1/5 * 1/100 * 1/5 = 1/2500 odds. Here we're assuming the loop iterator n is stored in one of the registers, and two instructions are referenced, 'if' and 'return' [i.e. 'abort']. Whatever we plug in there for x has a great shot at improving program performance.
JunkyardTornado
May 13, 2008, 9:36 PM PDT
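
[A minimal C sketch of the scenario described in the comment above: a sorting subroutine whose main loop picks up a randomly inserted "if (n > x) then return" guard. The function names, the insertion-sort implementation, and the SIZE_CUTOFF value are illustrative assumptions, not anything taken from the original program.]

#include <stddef.h>

#define SIZE_CUTOFF 10000  /* hypothetical value standing in for "x" */

/* Baseline subroutine: a simple insertion sort over the dataset. */
static void sort_dataset(int *data, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = data[i];
        size_t j = i;
        while (j > 0 && data[j - 1] > key) {
            data[j] = data[j - 1];
            j--;
        }
        data[j] = key;
    }
}

/* "Mutated" variant: the guard has landed inside the main loop. Wherever it
 * lands in the subroutine, its effect is to abort sorting of large datasets. */
static void sort_dataset_mutant(int *data, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        if (n > SIZE_CUTOFF)   /* the randomly inserted early return */
            return;
        int key = data[i];
        size_t j = i;
        while (j > 0 && data[j - 1] > key) {
            data[j] = data[j - 1];
            j--;
        }
        data[j] = key;
    }
}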
Patrick wrote: So, as an example of a constructive beneficial mutation, I used a conditional statement to check the size of the dataset. If it was below a certain amount I ran the sorting function. This way the program ran optimally no matter the size of the dataset. This change did not require much code. I believe (it’s been years) it was between 40-70 characters. In CSI terms that’s 320-560 informational bits. So depending on the complexity, it might be good enough for a design inference Defintely hair-splitting here, (why can't someone come up with some fresh new cliches, BTW) but at the machine level it would be one byte for an op code and a couple of operands (not 40-70 characters). Even that's too conservative, because we know only three instructions are required to compute anything - move zero into register, increment register, and conditional jump. So only 2 bits would be required for the op code, not eight bits. But it got me thinking, what if a timer got thrown into the sorting code by accident and different variants of the code had different time-out values. Once the timer expired, the sorting was aborted. Since only partially sorted code certainly wouldn't hurt anything, presumably the variant with the optimal timeout value would prevail.JunkyardTornado
May 13, 2008, 8:09 PM PDT
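
[A hedged C sketch of the timer idea floated in the comment above: each "variant" carries its own time budget, and the sort gives up once the budget expires, leaving the dataset only partially sorted. The timeout value and the function name are illustrative assumptions.]

#include <stddef.h>
#include <time.h>

#define TIMEOUT_SECONDS 2.0   /* hypothetical per-variant time budget */

/* Insertion sort that aborts when the timer expires; the remainder of the
 * dataset is left unsorted, which (per the comment) does no harm. */
static void sort_with_timeout(int *data, size_t n)
{
    clock_t start = clock();
    for (size_t i = 1; i < n; i++) {
        if ((double)(clock() - start) / CLOCKS_PER_SEC > TIMEOUT_SECONDS)
            return;  /* timer expired: accept a partially sorted dataset */
        int key = data[i];
        size_t j = i;
        while (j > 0 && data[j - 1] > key) {
            data[j] = data[j - 1];
            j--;
        }
        data[j] = key;
    }
}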
Here are the definitions of beneficial and deleterious I was using: a beneficial mutation increases the organism's number of expected descendants (even if it is destructive of some pre-existing function in the organism's parents); a deleterious one decreases that number. How do you define destructive and constructive mutations? Destructive are those that destroy or degrade some existing function? Constructive are those which contribute to some new function? But a single mutation could theoretically do both of those things.
congregate
May 13, 2008, 1:20 PM PDT
Every mutation is beneficial or deleterious in a limited sense; whether a mutation is beneficial or deleterious is always dependent in part on environmental conditions.
Nonsense! Again, we're looking for CONSTRUCTIVE mutations, not just beneficial mutations in general. But I would expect you to say that since the only way to make Darwinism seem reasonable is to constantly conflate "beneficial mutations". Once people start making distinctions then that's when it becomes obvious how frail Darwinism truly is. Behe noted this recently:
An interesting paper appeared recently in the New England Journal of Medicine. (1) The workers there discovered some new mutations which confer some resistance to malaria on human blood cells in the lab. (Their usefulness in nature has not yet been nailed down.) The relevance to my analysis in The Edge of Evolution is that, like other mutations that help with malaria, these mutations, too, are ones which degrade the function of a normally very useful protein, called pyruvate kinase. As the workers note: "[H]eterozygosity for partial or complete loss-of-function alleles . . . may have little negative effect on overall fitness (including transmission of mutant alleles), while providing a modest but significant protective effect against malaria. Although speculative, this situation would be similar to that proposed for hemoglobinopathies (sickle cell and both α-thalassemia and β-thalassemia) and G6PD deficiency. . ." This conclusion supports several strong themes of The Edge of Evolution which reviewers have shied away from. First, that even beneficial mutations are very often degradative mutations. Second, it's a lot faster to get a beneficial effect (if one is available to be had) by degrading a gene than by making specific changes in genes. The reason is that there are generally hundreds or thousands of ways to break a gene, but just a few to alter it beneficially without degrading it. And third, that random mutation plus natural selection is incoherent. That is, separate mutations are often scattered; they do not add up in a systematic way to give new, interacting molecular machinery. Even in the professional literature, sickle cell disease is still called, along with other mutations related to malaria, "one of the best examples of natural selection acting on the human genome." (2) So these are our best examples! Yet breaking pyruvate kinase or G6PD or globin genes in thalassemia does not add up to any new system. Then where do the elegant nanosystems found in the cell come from?
In Behe's biology example a constructive beneficial mutation would be one that created new functionality that defends against malaria. You'd think a defensive mechanism like that would be within the reach of Darwinian processes considering all the other great feats they're supposed to have performed. Never mind all the other advances that humans have made during the time malaria has been attacking us. Now I will give a software-based example of the difference between destructive beneficial mutations and constructive beneficial mutations. I'm hoping it'll be easier to follow. I had written a software program where the dataset was organized via a sorting algorithm before the primary functions were run. While the CPU did have to spend extra time the overall performance was improved. Unfortunately, this function did not scale. Eventually, once the dataset reached a certain size the sorting function became a bottleneck, decreasing overall performance. I optimized the function, but that was only a stopgap measure. Running a series of tests, I found that after a certain dataset size was reached disabling the sorting function was best. In fact, in some tests the improvement was 700%. So, as an example of a constructive beneficial mutation, I used a conditional statement to check the size of the dataset. If it was below a certain amount I ran the sorting function. This way the program ran optimally no matter the size of the dataset. This change did not require much code. I believe (it's been years) it was between 40-70 characters. In CSI terms that's 320-560 informational bits. So depending on the complexity, it might be good enough for a design inference. But this beneficial result could be reached via a destructive beneficial mutation. This change would be within the reach of Darwinian processes since it only takes 2 single point changes (16 informational bits): a "//" to comment out the line where the sorting function is called to organize the dataset. But this destructive beneficial mutation is only beneficial in environments where the dataset has exceeded a certain amount. I hope this example makes the difference clear. But I'll try to be charitable in reading what you wrote. I'll assume what you might be trying to say is that not all changes will be optimal in some environments, not that some will be deleterious. Deleterious is like blowing up a bridge to stop an enemy's advance. Constructive is building a fortress on the bridge. Optimality is a different subject, which is addressed in several pages: http://www.designinference.com/documents/2000.02.ayala_response.htm https://uncommondescent.com/intelligent-design/the-problem-of-improvable-design/ https://uncommondescent.com/intelligent-design/airplane-magnetos-contingency-designs-and-reasons-id-will-prevail/Patrick
May 13, 2008, 12:59 PM PDT
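
[To make the contrast in the comment above concrete, here is a minimal C sketch of both variants Patrick describes: the "constructive" change that guards the sorting call with a dataset-size check, and the "destructive" change that simply comments the call out. The names, the threshold, and the stand-in function bodies are illustrative assumptions; the original program is not available.]

#include <stddef.h>

#define SORT_THRESHOLD 50000   /* hypothetical cutoff found by testing */

/* Stand-in for the original sorting function. */
static void sort_dataset(int *data, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = data[i];
        size_t j = i;
        while (j > 0 && data[j - 1] > key) { data[j] = data[j - 1]; j--; }
        data[j] = key;
    }
}

/* Stand-in for the program's primary work on the (possibly sorted) dataset. */
static long run_primary_functions(const int *data, size_t n)
{
    long sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += data[i];
    return sum;
}

/* "Constructive" change: sort only while it still pays off. */
static long process_constructive(int *data, size_t n)
{
    if (n < SORT_THRESHOLD)       /* the added conditional statement */
        sort_dataset(data, n);
    return run_primary_functions(data, n);
}

/* "Destructive" change: a two-character "//" edit disables sorting in every
 * environment, beneficial only once the dataset has grown past the cutoff. */
static long process_destructive(int *data, size_t n)
{
    // sort_dataset(data, n);
    return run_primary_functions(data, n);
}

[Either change yields the same speedup on large datasets; the point of the comparison is that the conditional version preserves the sorting behavior for small datasets, while the commented-out version discards it everywhere.]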
Folks, freedom of religious expression is not only meant for Baptists and Catholics, but also for Darwisciples and all other religions, as would be expected by the constitutions of all free and democratic countries. I think an individual's right to hold on to his or her convictions, as his or her conscience dictates, is a principle worth honoring.
JPCollado
May 13, 2008, 10:41 AM PDT
"Now faith is being sure of what we hope for and certain of what we do not see." Hebrews 11:1. I bet Dr. Eric Meikle would agree.
JPCollado
May 13, 2008, 10:11 AM PDT
Take 2 sterile petri dishes, labeled A and B, and have a person put them in an isolated environment. In another room, have a group of people pray for life to arise in one or the other petri dish. Determine which one they pray for by a coin flip. Make sure that the group of people and the person handling the petri dishes don't communicate which one they're praying for, to eliminate any possible unconscious or conscious tampering. After a(n) hour/day/week/year, have the first person check the petri dishes for any bacterial growth. Repeat a number of times. OK, an evo, though now banned, has proposed this experiment. Care to guess what the evo would say if life should arise in the targeted dish? Anyone? Anyone? Would it be: A. Despite our best efforts, the target dish became contaminated with bacteria. B. PASTEUR HAS BEEN DISPROVED!!!! Life can form spontaneously!!!!! Just goes to show all you religious types didn't know what you were talking about with regard to abiogenesis. HA HA HA. I think it's pretty safe to leave out C. It was a miracle.
tribune7
May 13, 2008, 9:55 AM PDT
Patrick - Every mutation is beneficial or deleterious in a limited sense; whether a mutation is beneficial or deleterious is always dependent in part on environmental conditions.
congregate
May 13, 2008, 8:38 AM PDT
Sort of off topic: Most people in the intelligent design camp are aware of the lack of truly beneficial mutations to account for evolution (Behe; EoE). So the problem, first and foremost, for the Theistic IDer, is to prove that "mind" can have a notable effect on "chance" that would be greater than the normal effect on chance from the "normal" environment. The following studies offer the first tentative "baby steps" in that direction of proof for Theistic ID. Page 187, "Your Eternal Self", Hogan: In the studies, random number generators (RNGs) around the world were examined after events that affected great numbers of people, to see whether the numbers began to show some order during the events. During widely televised events that have captured the attention of many people, such as Princess Diana's death and the 9/11 tragedies, the combined output of the 60 RNGs around the world showed changes at the exact moments of the announcements of the events that could not be due to chance. To add control to their study, researchers identified an event they knew was about to happen that would have an impact on large numbers of people and set up a study to measure the effects on RNGs in different parts of the world....... The Oct 3, 1995 OJ Simpson verdict was chosen: around the time that the TV preshows began, at 9:00 AM Pacific Time, an unexpected degree of order appeared in all RNGs. This soon declined back to random behavior until about 10:00 AM, which is when the verdict was supposed to be announced. A few minutes later, the order in all 5 RNGs suddenly peaked to its highest point in the two hours of data, precisely when the court clerk read the verdict. --- For me this is verifiable, repeatable evidence that overcomes the insurmountable problems that "random chance" has posed to Darwinism and offers proof of principle for the position held by Theistic IDers.
bornagain77
May 13, 2008, 7:15 AM PDT