
Ken Miller may face more embarrassing facts, Behe’s DBB vindicated


[slight correction and update follows]

[update: I had previously used a secondhand quotation of Ken Miller taken from Paul Nesselroade here (2003 Wedge Update). Below I give a quote by Miller from a better source document here: Life's Grand Design.

However, the more accurate quotation still demonstrates that Miller made an embarrassing assessment of the architecture of DNA. I apologize to him for the inaccuracy; however, it was not a material inaccuracy, as can be seen in the quotation from his website.

The greater burden is on Miller to retract his mistaken ideas from the public sphere. Further, he owes an apology to many in the ID movement for the misrepresentations and errors of fact he has promoted over the years, and even under oath.]

the designer made serious errors, wasting millions of bases of DNA on a blueprint full of junk and scribbles.

Ken Miller, 1994

Empirical evidence 13 years later is putting some egg on Miller’s face [remember Ken Miller is the guy who made misrepresentations under oath in the Dover trial].

Encyclopedia Of DNA: New Findings Challenge Established Views On Human Genome

The new data indicate the genome contains very little unused sequences and, in fact, is a complex, interwoven network. In this network, genes are just one of many types of DNA sequences that have a functional impact. “Our perspective of transcription and genes may have to evolve,” the researchers state in their Nature paper, noting the network model of the genome “poses some interesting mechanistic questions” that have yet to be answered.

Other surprises in the ENCODE data have major implications for our understanding of the evolution of genomes, particularly mammalian genomes. Until recently, researchers had thought that most of the DNA sequences important for biological function would be in areas of the genome most subject to evolutionary constraint — that is, most likely to be conserved as species evolve. However, the ENCODE effort found about half of functional elements in the human genome do not appear to have been obviously constrained during evolution, at least when examined by current methods used by computational biologists.

According to ENCODE researchers, this lack of evolutionary constraint may indicate that many species’ genomes contain a pool of functional elements, including RNA transcripts, that provide no specific benefits in terms of survival or reproduction.

Ten years ago, in Darwin's Black Box (DBB), Behe suggested that junk DNA may not be junk after all. Behe has been vindicated by the facts; Miller has been refuted.

Finally, there is at least one other interesting fact in this article: "the ENCODE effort found about half of functional elements in the human genome do not appear to have been obviously constrained during evolution". This means these designs are NOT attributable to natural selection. Features in the genome have been shown to be unlikely products of "slight successive modifications". How I love science!
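To make "constrained during evolution" concrete: computational biologists typically ask whether an element is more conserved between species than a neutral baseline would predict. A minimal Python sketch of the idea (the sequences and the 0.70 neutral baseline are illustrative assumptions, not ENCODE's actual pipeline):

def percent_identity(a, b):
    """Fraction of matching bases between two aligned, equal-length sequences."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Toy aligned human/mouse elements (hypothetical 20-bp sequences).
human = "ACGTTGCAACGGTACGATCC"
mouse = "ACGTTGCAATGGTACGATCA"

NEUTRAL_BASELINE = 0.70  # assumed neutral divergence level, for illustration only

identity = percent_identity(human, mouse)
print(f"identity = {identity:.2f}; "
      f"constrained beyond neutral rate: {identity > NEUTRAL_BASELINE}")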

Comments
[...] Ken Miller [...]
Has the progress of science vindicated Mike Behe or Ken Miller? « Wintery Knight
April 10, 2010, 07:02 AM PST
[...] Ken Miller, 1994 [...]
Nature "writes back" to Behe Eight Years Later | Uncommon Descent
April 8, 2010, 08:02 PM PST
Does anyone know where in Darwin's Black Box Behe suggests "junk DNA" functionality (what pages)? Thanks!
bioinformaticist
July 18, 2007, 07:35 AM PST
"...harder to explain with MET than with ID", I meant to write. (I wish this forum had a way to edit a post.)mike1962
July 12, 2007, 06:22 PM PST
DaveScot, "It’s hard to reconcile a load of junk in a brand spanking new design." Maybe, but not impossible. Maybe the length of a particular sequence matters, but not the particular nucleotides. Like packaging "peanuts." Or maybe the values of a particular segment are something like informational dross produced as a byproduct of cellular division because of some otherwise logistical problem in sloughing it off during the duplication. Until we fully understand the entire process, we probably won't understand what the junk is. One thing seems obvious though- conserved sequences of junk between species with hundreds of millions of years of evolution from their alledged divergence is harder to explain with MET and ID.mike1962
July 12, 2007, 06:20 PM PST
Can we measure the information capacity of that portion of the genome which is actually used? We can at least estimate it. In the case of the human genome it is about 2 per cent… -- Richard Dawkins

OUCH!
scordova
July 12, 2007, 05:38 PM PST
Consider the term "junk DNA." ... Design encourages scientists to look for function where evolution discourages it. -- William Dembski, “Intelligent Science and Design,” First Things, Vol. 86:21-27 (October 1998) Genomes are littered with nonfunctional pseudogenes, faulty duplicates of functional genes that do nothing, while their functional cousins (the word doesn't even need scare quotes) get on with their business in a different part of the same genome. And there's lots more DNA that doesn't even deserve the name pseudogene. It, too, is derived by duplication, but not duplication of functional genes. It consists of multiple copies of junk, 'tandem repeats', and other nonsense which may be useful for forensic detectives but which doesn't seem to be used in the body itself... Can we measure the information capacity of that portion of the genome which is actually used? We can at least estimate it. In the case of the human genome it is about 2 per cent... -- Richard Dawkins, "The Information Challenge," The Skeptic, vol. 18, no. 4 (December 1998)j
July 12, 2007, 05:35 PM PST
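As a side note, Dawkins' 2 per cent figure is straightforward back-of-envelope arithmetic: each base pair can store at most 2 bits (four possible bases). A rough sketch of the estimate (genome size rounded to 3 billion bp; the numbers are illustrative, not Dawkins' exact calculation):

GENOME_BP = 3.0e9        # approximate haploid human genome size, in base pairs
BITS_PER_BASE = 2        # log2(4): four possible bases per position
CODING_FRACTION = 0.02   # Dawkins' ~2 per cent estimate

total_bits = GENOME_BP * BITS_PER_BASE
used_bits = total_bits * CODING_FRACTION
print(f"total capacity ~{total_bits / 8 / 1e6:.0f} MB; "
      f"'used' portion ~{used_bits / 8 / 1e6:.0f} MB")
# prints: total capacity ~750 MB; 'used' portion ~15 MB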
I have said in the past that the existence of junk does not imply the artifact was undesigned. I maintain that position. I have owned cars that had broken, non-functioning parts. I worked on spaceships that were expected to have (over the course of time, due to extreme conditions in space) various systems failures in the course of their missions. The functionality of the "junk DNA" was suspected for at least two reasons:
1. The creationists needed an explanation for the appearance of common descent (formally speaking, this was not an ID reason at all).
2. ID proponents who accepted universal common ancestry sensed the patterns were well designed and non-random (beginning with Michael Denton's "Biochemical Echo of Typology" and then various other studies thereafter).
What proceeded from ID theory was the intuition to presume design, as stated at one point by Bill Dembski:
But design is not a science stopper. Indeed, design can foster inquiry where traditional evolutionary approaches obstruct it. Consider the term "junk DNA." Implicit in this term is the view that because the genome of an organism has been cobbled together through a long, undirected evolutionary process, the genome is a patchwork of which only limited portions are essential to the organism. Thus on an evolutionary view we expect a lot of useless DNA. If, on the other hand, organisms are designed, we expect DNA, as much as possible, to exhibit function. And indeed, the most recent findings suggest that designating DNA as "junk" merely cloaks our current lack of knowledge about function. For instance, in a recent issue of the Journal of Theoretical Biology, John Bodnar describes how "non-coding DNA in eukaryotic genomes encodes a language which programs organismal growth and development." Design encourages scientists to look for function where evolution discourages it. (William Dembski, "Intelligent Science and Design," First Things, Vol. 86:21-27 (October 1998))
I personally was interested in the "broken part" interpretation of design, because if parts are broken, this suggests they can somehow be "fixed," and this would have medical implications.... The ID proponents in general had less to lose with the junk-being-junk interpretation, but the creationists had quite a bit at stake in the controversy. On the other hand, if large amounts of "junk DNA" were not junk, it would make Haldane's dilemma very acute, and it would also destroy large amounts of Neutral Theory and the Molecular Clock Hypothesis. Further, it reveals that biology is orders of magnitude more complex than what we thought only 10 years ago. These polyconstrained architectures are complex beyond imagination.
scordova
July 12, 2007, 08:21 AM PST
bfast: "He held that ID would fall if junk DNA proved to be junk. This is surely the PRIMARY prediction of ID."

I completely disagree. Life could have been designed billions of years ago with no intervention since then, and a lot of junk could have accumulated since that time even as the planned phylogenesis was unfolding. If you drive your car long enough without cleaning the windshield you get a lot of bugs on it, but that doesn't mean the car wasn't designed or is no longer able to do what it was designed to do. Where junk DNA doesn't play well is in special creation in the recent past. It's hard to reconcile a load of junk in a brand spanking new design.
DaveScot
July 12, 2007, 01:00 AM PST
Regarding the 12 CODES, I had an interesting exchange with PZ Myers here. PZ complains about poly-constrained codes:
Where ever did you get these ideas? this elaborate fiction you've erected is otherwise nonsensical.
Yet this supposed "fiction" is turning out to be fact. PZ couldn't believe Darwinism would make such structures; therefore, they must not exist. But let me quote a greater scientist than PZ, renowned Cornell geneticist John Sanford, from the landmark ID book Genetic Entropy:
There is abundant evidence that most DNA sequences are poly-functional, and therefore are poly-constrained. This fact has been extensively demonstrated by Trifonov (1989). For example, most human coding sequences encode for two different RNAs, read in opposite directions (i.e., both DNA strands are transcribed; Yelin et al., 2003). Some sequences encode for different proteins depending on where translation is initiated and where the reading frame begins (i.e., read-through proteins). Some sequences encode for different proteins based upon alternate mRNA splicing. Some sequences serve simultaneously for protein-encoding and also serve as internal transcriptional promoters. Some sequences encode for both a protein-coding and a protein-binding region. Alu elements and origins-of-replication can be found within functional promoters and within exons. Basically all DNA sequences are constrained by isochore requirements (regional GC content), "word" content (species-specific profiles of di-, tri-, and tetra-nucleotide frequencies), and nucleosome binding sites (i.e., all DNA must condense). Selective condensation is clearly implicated in gene regulation, and selective nucleosome binding is controlled by specific DNA sequence patterns - which must permeate the entire genome. Lastly, probably all sequences do what they do, even as they also affect general spacing and DNA-folding/architecture - which is clearly sequence dependent. To explain the incredible amount of information which must somehow be packed into the genome (given the extreme complexity of life), we really have to assume that there are even higher levels of organization and information encrypted within the genome. For example, there is another whole level of organization at the epigenetic level (Gibbs 2003). There also appears to be extensive sequence-dependent three-dimensional organization within chromosomes and the whole nucleus (Manuelidis, 1990; Gardiner, 1995; Flam, 1994). Trifonov (1989) has shown that probably all DNA sequences in the genome encrypt multiple "codes" (up to 12 codes). -- John Sanford
scordova
July 11, 2007, 07:34 PM PST
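Sanford's points about sequences "read in opposite directions" and "depending on where the reading frame begins" can be made concrete with a six-frame translation, a standard bioinformatics exercise. A minimal Python sketch (the 21-bp sequence is made up for illustration; the codon table is the standard genetic code):

BASES = "TCAG"
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODONS = [a + b + c for a in BASES for b in BASES for c in BASES]
CODON_TABLE = dict(zip(CODONS, AMINO_ACIDS))  # standard genetic code

def reverse_complement(seq):
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def translate(seq):
    return "".join(CODON_TABLE[seq[i:i + 3]] for i in range(0, len(seq) - 2, 3))

seq = "ATGGCCATTGTAATGGGCCGC"  # hypothetical 21-bp sequence
for strand in (seq, reverse_complement(seq)):
    for frame in range(3):
        print(translate(strand[frame:]))
# The same bases yield six different protein readings; a mutation that is
# harmless in one frame can be disruptive in another. That is the
# "poly-constrained" property Sanford describes.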
This is the death blow for Darwinists. The DNA will continue to astound researchers as more of the complexity is revealed. Dr. Sanford quotes a preliminary study that indicates the encryption of the DNA is up to 12 CODES thick. To quote the Psalmist, "I will praise You, for I am fearfully and wonderfully made!"
bornagain77
July 11, 2007, 07:25 PM PST
This is the blow for Darwinists. The DNA will continue to display complexity that astounds researchers as more is revealed. To quote the Psalmist: "I will praise You, for I am fearfully and wonderfully made!"
bornagain77
July 11, 2007, 07:20 PM PST
The original quote had some errors, but now it's repaired. Apologies to Ken Miller for the earlier misquotation. Unfortunately for him, the corrected quotation even more clearly demonstrates the gravity of his error.
scordova
July 10, 2007, 04:59 PM PST
Ed Darrell, the full essay was here: Life's Grand Design. There were two versions, one for PBS and one for Technology Review. You probably got the PBS version. Nesselroade's citation, it turns out, was correct. Salvador
scordova
July 2, 2007, 03:15 PM PST
Thanks for pointing out my error. It was secondhand from Paul Nesselroade. Cornelius Hunter gives the correct reference through Behe's DBB, pages 225-226.
scordova
June 30, 2007, 08:18 AM PST
Mr. Cordova, in the article I found, under the title you cite, by Kenneth Miller, he says something quite different from how you quote him. May I ask where you got the quote?
edarrell
June 30, 2007, 12:44 AM PST
[...] 1. Ken Miller is the guy who has taken various bruisings from scientific evidence and continues his misrepresentations and story telling as he did under oath in the Dover trial. [See: Ken Miller may face more embarrassing facts, Behe’s DBB vindicated and Ken Miller caught making factually incorrect statements under oath] [...]
Ken Miller, the honest Darwinist | Uncommon Descent
June 28, 2007, 12:01 AM PST
Trying to check out Miller's statement, as you cite it in the opening of the thread. What is "Life's Grand Design?" A book? An article? I'm not finding it at Amazon, or anywhere else. Help!
edarrell
June 25, 2007, 02:52 PM PST
[...] [NOTES] 1. These recent developments are in addition to the egg on Ken Miller’s face and laurels for Michael Behe. [See: Ken Miller may face more embarrassing facts, Behe’s DBB vindicated] [...]
Zuck is out of luck, marsupial findings vindicate Behe, Denton, Hoyle | Uncommon Descent
June 21, 2007, 04:08 PM PST
We must bear in mind that the most modern design paradigms in engineering today do not deal with deeply redundant, self-healing, self-assembling architectures that operate with high levels of internal quantum noise. At the quantum level there is substantially more noise to deal with than in, say, macroscopically engineered artifacts. Today's engineers tend to think linearly: cause, effect. In contrast, modern nano-molecular engineering is having to confront non-linear thinking, where at every step of the way something is presumed to go wrong because of quantum effects. This gives a totally different mode of how to build computational architectures. I got a taste of this building satellite software, where you actually presumed, at the end of a module execution, that your software might not have executed properly due to cosmic radiation. To compensate for this, one had what looked to the uninitiated like an absolutely horrendous amount of convolution and apparently meaningless "waste". Not so. I think it premature to call certain processes garbage. Biology may be telling us that this sort of convoluted architecture is what works best at the nano-molecular level. Art2 is expressing his prejudice that biology is an accident. Modem signals sound like accidents to the uninitiated ear too....
scordova
June 16, 2007, 04:30 PM PST
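The flight-software pattern scordova describes, where every module's output is distrusted until verified, can be sketched as triple modular redundancy: run the computation three times and take the majority vote. A toy Python illustration (hypothetical, not actual satellite code):

from collections import Counter

def tmr(compute, *args):
    """Triple modular redundancy: run a computation three times and return
    the majority result, guarding against a transient fault in any one run."""
    results = [compute(*args) for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no two runs agree; retry or fail safe")
    return value

# Usage: wrap any deterministic module. The two "extra" executions look like
# pure waste to the uninitiated; under radiation they are what keeps the
# answer trustworthy.
print(tmr(lambda x, y: x + y, 2, 3))  # 5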
Art, re genomic garbage collection: if it's true that cells devote enormous resources to unproductive activities, it's just another nail in natural selection's coffin. The cell lines you mention, like all living cell lines, have been undergoing selection for (ostensibly) billions of years. NS works to make cell lines more fit. If the huge amount of garbage collection is true, that's awfully less fit. The garbage is most efficiently dealt with by removal from the genome rather than from the transcriptome, after considerable resources have already been devoted to transcription.

If true, then it would be adverse to the veracity of both natural selection and interventionist theories of intelligent design, pretty much leaving only a genome front-loaded billions of years ago as the best explanation, with genetic entropy over billions of years explaining the gross inefficiency found today. That's fine with me, as I already believe front-loading is the best explanation, but still I'd be rather surprised if the garbage collection in the transcriptome is really garbage collection rather than some misunderstood critical process. As I said, the acid test would be to artificially eliminate the suspected genomic garbage from the genome, so that it never reaches the transcriptome, and see if there's any adverse effect from the knockout on the GM organism. Be sure to let me know when that's been done; otherwise I think talking about transcriptome garbage collection is little more than woolgathering.
DaveScot
June 16, 2007, 11:41 AM PST
Art: "There really is nothing remarkable about this region."

Bona fide researchers in this area of expertise, such as Arend Sidow, disagree with you. What can I say? Should I put more stock in an anonymous blog commenter or in people like Arend Sidow? No disrespect meant, but I'm going with Sidow's opinion over yours. See my recent article (I wrote it with you in mind) quoting Sidow's reaction to the experiment which you don't think means anything.
DaveScot
June 16, 2007, 11:19 AM PST
Hi DaveScot, two points.

First, about that mouse genic "desert": the original report was made before other genomes were available for comparison, which is probably why you see the more recent and less, um, provocative stance from the author(s). If you take the time to get lost in the genome browser (for some of us, it's more addictive than computer games), you will find that this region is, for all practical purposes, drifting randomly. This is apparent if you look at the nature and distribution of the mutational changes, and if you look at the conservation in other, more distantly-related genomes. There really is nothing remarkable about this region.

As far as garbage disposals, it may be easier for the crew here to think in terms of an analogy. Grant, for the moment (I can hear the snickers..), that Microsoft Word is a lean, mean package of, say, 100 MB of executable code. The concept of "junk DNA" as commonly held would translate to an interspersing of this code with some 10 GB of stuff - lines that just sit there and are ignored. What the trf4 paper (and others like it) suggests is something far more bizarre. Not only would the Word code be interspersed with 100-fold more "junk", it would actually be interrupted by or wrapped around 100 times as much executable code - code that is running all the time, occupying 100 times more processor resources than the actual program, but producing no useful output (in fact, no output at all).
Art2
June 16, 2007, 06:39 AM PST
#26
I don’t think the researchers are suggesting anything that implies (at least in their minds) anything like front loading. What they almost certainly had in mind was a system that itself built up over evolutionary time, where it was advantageous to organisms to carry along a “warehouse” or evolutionary tool kit in order to better deal with new situations.
Agree. I didn't say that this was their intention. It's the existence of the toolkit that isn't a speculation; the speculation is their idea that the toolkit evolved by RM+NS. What I mean is just that, since the toolkit could not reasonably have evolved by RM+NS, the front-loading hypothesis is now more plausible, IN SPITE of their NDE speculations.
kairos
June 16, 2007, 06:05 AM PST
DaveScot,
We don’t care about where evolution will take living things 100 million years from now. All we really care about is where evolution will take them next week, next year, or within the next 100 years. That limits any practical application of evolutionary theory to no more than the generation of varieties or sub-species.
This is a great point, and it blows gaping holes in the 'Darwinism is vital to medicine' argument. In one breath it is argued that we can't test Darwinism because it takes too long (Behe's 10^20 malarial cells just aren't enough) and is necessarily outside of our field of observation. In the next, it is argued that if you don't agree with the historical conjecture about that which happens too slowly, too long ago, and always somewhere else, then you can't practice science or medicine.
Charlie
June 15, 2007, 11:50 PM PST
Art, by the way: garbage disposal in software engineering is called garbage collection. The fact that automated garbage collection algorithms were discovered in cellular machinery comes as no surprise to design pundits familiar with human-designed information processing and control systems. Most everything useful in human-designed machinery has parallels in biological machinery. Presuming that biological systems were designed by an intelligent agency similar to human design engineers is a fruitful heuristic for discovery, whether true or not.

Even more fruitful is the converse. The nanomolecular technology employed by living cells far surpasses anything we've invented on our own recognizance. For example, a bacterium was recently discovered living in the gut of surgeonfish that is so large it's visible to the naked eye. Why is it so large, one might ask? It's because it takes that much room to harbor a trillion (yes, Virginia, that's 1,000,000,000,000) DNA base pairs. Human engineers at present can only dream about having such information storage density as that dumb little bacterium.
DaveScot
June 15, 2007, 10:35 PM PST
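For readers unfamiliar with the software side of DaveScot's analogy, garbage collection in its simplest "mark and sweep" form keeps everything reachable from a root and reclaims the rest. A minimal Python sketch (a toy object graph, not any real collector):

class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []      # objects this one points to
        self.marked = False

def mark(obj):
    """Mark phase: everything reachable from a root is live."""
    if not obj.marked:
        obj.marked = True
        for ref in obj.refs:
            mark(ref)

def sweep(heap):
    """Sweep phase: whatever was never marked is garbage."""
    live = [o for o in heap if o.marked]
    garbage = [o for o in heap if not o.marked]
    return live, garbage

a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)            # a -> b is reachable; c is not
mark(a)                     # a is the root
live, garbage = sweep([a, b, c])
print([o.name for o in garbage])  # ['c']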
Art, I don't make up the definitions of what conserved sequences are and are not. The non-coding sequences in question are classed as highly conserved (not ultra-conserved), and the conservation is on a par with biologically active protein-coding sequences that mice and men share. If I recall correctly, the 1000+ sequences in question were at least 100 base pairs long, with exact matches ranging from a minimum of 70% to a maximum of 95%. Although there is no standard writ in granite, the authors of the research consider any sequence matches over 95% to be ultra-conserved. There were no ultra-conserved 100bp regions in the 1.5mbp block of deleted DNA (actually, two blocks that added up to 1.5mbp were deleted). I understand that knockouts have been done on ultra-conserved non-coding DNA and the results revealed biological activity for those regions. There are precious few ultra-conserved non-coding regions between mice and men, so there's not a lot to be discovered there.

The researchers expressed amazement that no biologic activity was detected in the experiment, and there's no mystery why: highly conserved regions are axiomatic with biologic activity (hence selection value, hence conservation). It no longer appears to be axiomatic that conservation means activity and activity means conservation. That fundamental rule of NeoDarwinian evolution is now broken. In an honest world this would have been earth-shaking news, but we're talking about evolutionary dogma, and in that world inconvenient facts are quietly buried.

Last night I went to the trouble of finding more recent published research from the same author to see what investigative avenue he took upon the findings in question. To my great surprise, he simply wrote off the results under the rubric that different regions of DNA drift at different rates, and those highly conserved non-coding regions with no evident biologic activity simply drifted much slower than expected. Par for the course. NeoDarwinian evolutionary dogmatists long ago stopped trying to test the theory and just assume it to be true.

The lead researcher, instead of trying to verify his speculative explanation of different rates of neutral drift, is trying to find a reliable method of discriminating between functional and non-functional non-coding DNA. He's doing this by running non-coding genomic comparisons between humans and vertebrates presumably more distant than mice, such as fish. As an engineer I understand this tack. He's looking for practical discoveries. Finding that highly conserved non-coding DNA between mice and men is no indicator of biologic activity does nothing to cure disease. He's interested in discovering functional non-coding DNA so that knockout experiments will reveal the function and hopefully lead to identification and treatment of genetic disorders caused by random mutations to those functional regions.

That said, as far as discovery for discovery's sake, it's a cop-out. It seems no one is interested in why a Darwinian prediction failed, but only in how to work around the failed prediction so that medically relevant research can be pursued. As we all know, but as some of us won't admit, the theory that evolution proceeds by chance & necessity instead of design has no bearing at all on anything with practical application. Genomes are what they are no matter how they got that way, and evolution proceeds so slowly that only microevolutionary change is relevant to anything practical.

I couldn't invent a more adverse finding for the veracity of natural selection than that sequence conservation over 100 million years doesn't imply biologic activity in the conserved sequence. But the truth of the matter is that no one cares why a prediction made by natural selection failed so miserably. Practical timeframes for evolution are measured in no more than a few human generations. We don't care about where evolution will take living things 100 million years from now. All we really care about is where evolution will take them next week, next year, or within the next 100 years. That limits any practical application of evolutionary theory to no more than the generation of varieties or sub-species. No one worries about mice maybe evolving into a predatory species with nuclear weapons and a score to settle against mankind millions of years in the future. No one worries about bacteria evolving into something substantially different. All we worry about are microevolutionary changes. So there's no reason to figure out why a major prediction of natural selection failed, but rather just a concern about how to work around it. Isn't that just precious!
DaveScot
June 15, 2007, 09:25 PM PST
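The thresholds DaveScot recalls (100 bp windows, 70% to 95% identity for "highly conserved", above 95% for "ultra-conserved") translate directly into a sliding-window scan. A hedged Python sketch, with made-up sequences standing in for real mouse/human alignments:

import random

def classify_windows(a, b, window=100, hi=0.70, ultra=0.95):
    """Label each aligned window by percent identity, using the thresholds
    described in the comment above (values assumed, not from the paper)."""
    labels = []
    for start in range(0, len(a) - window + 1, window):
        w1, w2 = a[start:start + window], b[start:start + window]
        ident = sum(x == y for x, y in zip(w1, w2)) / window
        if ident > ultra:
            label = "ultra-conserved"
        elif ident >= hi:
            label = "highly conserved"
        else:
            label = "unconserved"
        labels.append((start, round(ident, 2), label))
    return labels

# Toy 200-bp "alignment": identical except 20 mismatches in the first window.
random.seed(1)
human = "".join(random.choice("ACGT") for _ in range(200))
mouse = list(human)
for i in random.sample(range(100), 20):
    mouse[i] = "A" if mouse[i] != "A" else "C"
print(classify_windows(human, "".join(mouse)))
# e.g. [(0, 0.8, 'highly conserved'), (100, 1.0, 'ultra-conserved')]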
magnan, that's about the gist of things. What's really cool about the trf4 poly(A) polymerase is that it resembles, in its action, similar enzymes found in bacteria. This turns one particular "fact" that most students learn on its head - RNA polyadenylation is seen in all domains of life (not just the eukaryotic nucleus and cytoplasm), and its primordial function is not that which we learn of in school (translation and mRNA stabilization). Rather, it is all about breakdown, turnover, degradation. (Sorry for evangelizing about garbage disposals - they're really quite fascinating.)
Art2
June 15, 2007, 08:44 PM PST
#15 (Art2): From the paper you cited: "Our data strongly support the existence of a posttranscriptional quality control mechanism limiting inappropriate expression of genetic information." You seem to be suggesting that the research cited in the "Encyclopedia of DNA" article is being wrongly interpreted by careless science journalists and overly speculative researchers: that the transcribed RNA from sequences formerly considered "junk" isn't functional in any way, whether or not it is overlapping, and is still just garbage that is later disposed of by the polymerases mentioned in the paper you cited. Is this your thesis?
magnan
June 15, 2007, 02:00 PM PST
#21 (kairos): "I don’t think that this is a speculation; they have simply stated a scientific hypothesis to try to explain data within the current paradigm. But the result could be even more favourable for ID. After actual evidence of a high % of useful DNA (first score), the researchers themselves are actually suggesting that DNA seems more and more FRONT LOADED (second score). Am I wrong?"

The article writer refers to this thinking as speculation, though I later dignify it by calling it a hypothesis. I don't think the researchers are suggesting anything that implies (at least in their minds) anything like front loading. What they almost certainly had in mind was a system that itself built up over evolutionary time, where it was advantageous to organisms to carry along a "warehouse" or evolutionary tool kit in order to better deal with new situations.
magnan
June 15, 2007, 01:27 PM PST
