Jonathan Wells on Darwinism, Science, and Junk DNA
November 12, 2011 | Posted by News under Junk DNA, Intelligent Design
On November 5, I posted a response to people who falsely claim that I set out to oppose Darwinism on orders from Reverend Sun Myung Moon. Since then, many comments have been posted—some of them critical of my book, The Myth of Junk DNA. Unfortunately, other commitments prevent me from responding to every detail (so many critics, so little time!). So I have selected some representative comments posted by two people using the pseudonyms “Gregory” and “paulmc.”
By “Darwinism,” I mean the claim that all living things are descended from one or a few common ancestors, modified solely by unguided natural processes such as variation and selection. For the sake of brevity, I use the term here also to include Neo-Darwinism, which attributes new variations to genetic mutations.
By “Darwinists,” then, I mean people who subscribe to that view. Having worked in close proximity with biologists for over two decades, I can confidently say that most of them—at least in the U.S.—are Darwinists in this sense.
“Gregory” also wrote that “without ‘doing science,’ Jonathan Wells personally concluded ‘evident design’ in ‘the mountains of Mendocino county.’ Thus, the argument that ‘intelligent design is a purely scientific pursuit’ is obviously untrue.” I’m not sure what “Gregory” means here by a “purely scientific pursuit.” Intelligent design (ID) holds that we can infer from evidence in nature that some features of the world, including some features of living things, are better explained by an intelligent cause than by unguided natural processes such as mutation and selection. Unlike creationism, ID does not start with the Bible or religious doctrines.
So if “science” means making inferences from evidence in nature—as opposed to inventing naturalistic explanations for everything we see (as materialistic philosophy would have us do)—then ID is science.
Second, “paulmc” wrote that “there are a number of strong lines of evidence that suggest junk DNA comprises a majority of the human genome.” The lines of evidence cited by “paulmc” included (1) mutational (genetic) load, (2) lack of sequence conservation, and (3) a report that “putative junk” has been removed from mice “with no observable effects.” In addition, (4) “paulmc” wrote that “there is an active other side to the debate” about pervasive transcription. I’ll address these four points in order.
Before I start, however, I’d like to say that I’m not particularly interested in debates over what percentage of our genome is currently known to be functional. Whatever the current percentage might be, it is increasing every week as new discoveries are reported—and such discoveries will probably continue into the indefinite future. So people who claim that most of our DNA is junk, and that this is evidence for unguided evolution and evidence against ID, are making a “Darwin of the gaps” argument that faces the inevitable prospect of having to retreat in the face of new discoveries.
Now, to the points raised by “paulmc”:
(1) Mutational Load. In 1972, biologist Susumu Ohno (one of the first to use the term “junk DNA”) estimated that humans and mice have a 1 in 100,000 chance per generation of suffering a harmful mutation. Biologists had already discovered that only about 2% of our DNA codes for proteins; Ohno suggested that if the percentage were any higher we would accumulate an “unbearably heavy genetic load” from harmful mutations in our protein-coding DNA. His reasoning provided a theoretical justification for the claim that the vast majority of our genome is functionless junk—what Ohno called “the remains of nature’s experiments which failed”—and that this junk bears most of our mutational load.
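Ohno’s load argument is at bottom a simple multiplication: the more of the genome that is functional, the more new mutations per generation land where they can do harm. The following sketch illustrates the shape of that reasoning with round, illustrative numbers (a per-base-pair mutation rate of 10⁻⁸ and a 3-billion-base-pair genome are modern textbook-style assumptions, not Ohno’s 1972 figures):

```python
# Back-of-envelope sketch of Ohno-style mutational-load reasoning.
# The mutation rate and genome size below are illustrative assumptions,
# not Ohno's own 1972 estimates.

mutation_rate_per_bp = 1e-8   # assumed mutations per base pair per generation
genome_size_bp = 3e9          # approximate human genome size in base pairs

def new_mutations_in_functional_dna(functional_fraction):
    """Expected new mutations per offspring that land in functional DNA."""
    return mutation_rate_per_bp * genome_size_bp * functional_fraction

# The larger the functional fraction, the heavier the load of new
# mutations striking sequences where they could be harmful.
for frac in (0.02, 0.10, 0.50, 1.00):
    hits = new_mutations_in_functional_dna(frac)
    print(f"functional fraction {frac:4.0%}: ~{hits:.1f} mutations hit functional DNA")
```

On these assumed numbers, a genome that is only 2% functional receives well under one functional-region mutation per offspring, while a fully functional genome receives dozens; whether that load is actually “unbearable” depends on the further assumptions about mutation rates and mutational effects discussed below.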
According to “paulmc”, this is the first of “a number of strong lines of evidence that suggest junk DNA comprises a majority of the human genome.” But Ohno’s claim was a theoretical one, based on various assumptions about how often spontaneous mutations occur and how they affect the genome.
As of last year, however, the accurate determination of mutation rates was still controversial. According to a 2010 paper:
The rate of spontaneous mutation in natural populations is a fundamental parameter for many evolutionary phenomena. Because the rate of mutation is generally low, most of what is currently known about mutation has been obtained through indirect, complex and imprecise methodological approaches.
Furthermore, genomes are more complex and integrated than Ohno realized, so the effects of mutations are not as straightforward as he thought. As another 2010 paper put it,
Recent studies in D. melanogaster have revealed unexpectedly complex genetic architectures of many quantitative traits, with large numbers of pleiotropic genes and alleles with sex-, environment- and genetic background-specific effects.
In other words, the first line of evidence cited by “paulmc” is not evidence at all, but a 40-year-old theoretical prediction based on questionable assumptions. The proper way to reason scientifically is not “Ohno predicted theoretically that the vast majority of our DNA is junk, therefore it is,” but “If much of our non-protein-coding DNA turns out to be functional, then Ohno’s theoretical prediction was wrong.”
(2) Sequence Conservation. According to evolutionary theory, if two lineages diverge from a common ancestor that possesses regions of non-protein-coding DNA, and those regions are non-functional, then they will accumulate random mutations that are not weeded out by natural selection. Many generations later, the corresponding non-protein coding regions in the two descendant lineages will be very different. On the other hand, if the original non-protein-coding DNA was functional, then natural selection will tend to weed out mutations affecting that function. Evolution of the functional regions will be “constrained,” and many generations later the sequences in the two descendant lineages will still be similar, or “conserved.”
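The logic of that conservation test can be made concrete with a toy simulation. In the sketch below (all parameters are arbitrary illustrative choices), two lineages are copied from a common ancestral sequence and mutated independently; at “constrained” sites every mutation is rejected, mimicking purifying selection, while unconstrained sites drift freely:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

BASES = "ACGT"

def diverge(seq, generations, constrained_fraction, mut_prob=0.01):
    """Toy model: copy `seq` and apply random point mutations each
    generation, rejecting any change at 'constrained' (functional) sites
    to mimic purifying selection.  Parameters are illustrative only."""
    n_constrained = int(len(seq) * constrained_fraction)
    seq = list(seq)
    for _ in range(generations):
        for i in range(len(seq)):
            if i < n_constrained:        # selection weeds out changes here
                continue
            if random.random() < mut_prob:
                seq[i] = random.choice(BASES)
    return "".join(seq)

def identity(a, b):
    """Fraction of aligned positions that match between two sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

ancestor = "".join(random.choice(BASES) for _ in range(1000))

neutral_a = diverge(ancestor, 200, constrained_fraction=0.0)
neutral_b = diverge(ancestor, 200, constrained_fraction=0.0)
constrained_a = diverge(ancestor, 200, constrained_fraction=1.0)
constrained_b = diverge(ancestor, 200, constrained_fraction=1.0)

print("neutral lineages identity:    ", round(identity(neutral_a, neutral_b), 2))
print("constrained lineages identity:", round(identity(constrained_a, constrained_b), 2))
```

In this toy run the fully constrained regions remain identical while the neutral regions decay toward random similarity, which is the pattern the conservation argument looks for. Note what the model does and does not show: it shows that constraint produces conservation, not that every functional sequence must be constrained at the level of its nucleotide sequence.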
As “paulmc” pointed out, however, many regions of non-protein-coding DNA appear to “evolve without evidence of this constraint;” their sequences are not conserved. According to “paulmc,” this “implies that changes to these sequences do not affect fitness… we expect that for them to be functional they need some degree of evolutionary constraint,” and the absence of such constraint points to their “being putatively junk.”
Not so. Although sequence conservation in divergent organisms suggests function, the absence of sequence conservation does not indicate lack of function. Indeed, according to modern Darwinian theory, species diverge because of mutational changes in their functional DNA. Obviously, if such DNA were constrained, then evolution could not occur.
In 2006 and 2007, two teams of scientists found that certain non-protein-coding regions that are highly conserved in vertebrates (suggesting function) are dramatically unconserved between humans and chimps (suggesting… rapid evolution!). More specifically, one of the teams showed that one unconserved region contains an RNA-coding segment involved in human brain development.
Furthermore, the analysis by “paulmc” assumes that the only thing that matters in non-protein-coding DNA is its nucleotide sequence. This assumption is unwarranted. As I pointed out in Chapter Seven of my book, non-protein-coding DNA can function in ways that are largely independent of its precise nucleotide sequence. So absence of sequence conservation does not constitute evidence against functionality.
(3) Mice without “junk” DNA. In 2004, Edward Rubin and a team of scientists at Lawrence Berkeley Laboratory in California reported that they had engineered mice missing over a million base pairs of non-protein-coding (“junk”) DNA—about 1% of the mouse genome—and that they could “see no effect in them.”
But molecular biologist Barbara Knowles (who reported the same month that other regions of non-protein-coding mouse DNA were functional) cautioned that the Lawrence Berkeley study didn’t prove that non-protein-coding DNA has no function. “Those mice were alive, that’s what we know about them,” she said. “We don’t know if they have abnormalities that we don’t test for.” And University of California biomolecular engineer David Haussler said that the deleted non-protein-coding DNA could have effects that the study missed. “Survival in the laboratory for a generation or two is not the same as successful competition in the wild for millions of years,” he argued.
In 2010, Rubin was part of another team of scientists that engineered mice missing a 58,000-base stretch of so-called “junk” DNA. The team found that the DNA-deficient mice appeared normal until they (along with a control group of normal mice) were fed a high-fat, high-cholesterol diet for 20 weeks. By the end of the study, a substantially higher proportion of the DNA-deficient mice had died from heart disease. Clearly, removing so-called “junk” DNA can have effects that appear only later or under other conditions.
(4) Pervasive transcription. After 2000, the results of genome-sequencing projects suggested that much of the mammalian genome—including much of the 98% that does not code for proteins—is transcribed into RNA. Scientists working on one project reported in 2007 that preliminary data provided “convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts.”
Since an organism struggling to survive would presumably not waste its resources producing large amounts of useless RNA, this widespread transcription suggested to many biologists that much non-protein-coding DNA is probably functional.

In 2010, four University of Toronto researchers published an article concluding that “the genome is not as pervasively transcribed as previously reported.” Yet the Toronto researchers had biased their sample by eliminating repetitive sequences with a software program called RepeatMasker, the official description of which states: “On average, almost 50% of a human genomic DNA sequence currently will be masked by the program.” In the fraction that remained, the Toronto researchers based their results “primarily on analysis of PolyA+ enriched RNA”—sequences that have a long tail containing many adenines. Yet molecular biologists had already reported in 2005 that RNA transcripts lacking the long tail are twice as abundant in humans as PolyA+ transcripts.
In other words, the Toronto researchers not only excluded half of the human genome with RepeatMasker, but they also ignored two thirds of the RNA in the remaining half. It is no wonder that they found fewer transcripts than had been found by the hundreds of other scientists studying the human genome. The Toronto group’s results were disputed in 2010 by an international team of eleven scientists, and the group’s flawed methodology was sharply criticized in 2011 by another international team of seventeen scientists.
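The arithmetic behind that criticism is simple enough to check. The sketch below uses only the two figures quoted above (roughly 50% of the genome masked by RepeatMasker, and polyA− transcripts roughly twice as abundant as polyA+), and assumes for illustration that the two exclusions combine independently:

```python
# Rough, illustrative arithmetic for the sampling criticism of the
# Toronto study.  The two input figures come from the text above; the
# assumption that they simply multiply is a simplification.

genome_examined = 1 - 0.50     # half the genome remains after RepeatMasker
polya_plus_share = 1 / 3       # polyA+ is 1 part in 3 if polyA- is twice as abundant

effective_coverage = genome_examined * polya_plus_share
print(f"fraction of transcription effectively surveyed: ~{effective_coverage:.0%}")
```

On those assumptions the study effectively surveyed only about one sixth of the potential transcription, which is why finding fewer transcripts there is unsurprising.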
So “paulmc” was technically but trivially correct in writing that there are two sides to the debate over pervasive transcription. There are also at least two sides to the larger debate over the functionality of non-protein-coding DNA. But I leave it to open-minded readers of The Myth of Junk DNA to decide whether “paulmc” was correct in claiming that “the science at the moment really does fall on one side of this: large amounts of putative junk exist in the human genome.”
Oh, one last thing: “paulmc” referred to an online review of my book by University of Toronto professor Larry Moran—a review that “paulmc” called both extensive and thorough. Well, saturation bombing is extensive and thorough, too. Although “paulmc” admitted to not having read more than the Preface to The Myth of Junk DNA, I have read Mr. Moran’s review, which is so driven by confused thinking and malicious misrepresentations of my work—not to mention personal insults—that addressing it would be like trying to reason with a lynch mob.