
Junk DNA: Only 20%? All but 8.2%?


Estimates of how much human DNA is functional: 8.2% to 80%, according to a recent post at ScienceBlogs:

Science and its interpretation is wonderful. Today I saw a post on Twitter from @LAbizar, referencing an @GEN post that stated 8.2% of Human DNA is Functional, with a link to a GEN article: “Surprise: Only 8.2% of Human DNA Is Functional.” The GEN writeup cited a PLoS Genetics article, “8.2% of the Human Genome Is Constrained: Variation in Rates of Turnover across Functional Element Classes in the Human Lineage,” released today.

In 2012, the ENCODE (Encyclopedia of DNA Elements) project published a landmark summary, “An integrated encyclopedia of DNA elements in the human genome,” from nine years of work measuring the ways in which DNA structure and its interactions with proteins such as transcription factors might contribute to the regulation of genes. In the paper’s abstract the team stated that “These data enabled us to assign biochemical functions for 80% of the genome, in particular outside of the well-studied protein-coding regions.” As a very small fraction of the genome (~1%) encodes for protein sequences, a question in science has been, what does the other 99% do? ENCODE data demonstrated that much of this DNA participates in biochemistry in some way. Many lauded the work for its tour-de-force effort and the resources contributed have been significant.

Not a lot of room between those estimates, is there? Blogger finchtalk says that it all depends on what you count.

Well, if it counts for anything … it’s not junk.

Note: Here’s the referenced 8.2% article. (Public access.)

Follow UD News at Twitter!

Hat tip: Timothy Kershner

Comments
I read through the Abstract and the Author Summary of the paper (the parts from which the blog quotes were extracted). It seems that their proposed percentage (8.2%) of functional genome is based mostly on speculation rooted in the belief in evolution (sequences conserved over hundreds of millions of years, etc.), while the ENCODE percentage resulted from a more empirical, research-and-measurement approach. I did find the readers' comments interesting, though. Two of those comments note that "the authors of this work have failed to apply existing nomenclature". But the most interesting comment is this:
Nonconserved or fast evolving DNAs are also functional, just for a different purpose
Posted by ShiHuang on 10 Aug 2014 at 23:27 GMT

I found the last paragraph of our poster "Testing the infinite sites assumption", recently presented at the "1000 genomes project and beyond" meeting in Cambridge, UK, to be relevant to this interesting paper. Here it is:

Sequence conservation per se may not automatically indicate functionality of variants within such sequences, as is commonly assumed. Less conserved sequences are more important for adaptation to the external environment, while the more conserved ones are important for the internal integrity of a system. To a virus or bacterium facing elimination by human medicines, the fast evolving parts of its genome are far more critical/functional to its survival than its more conserved parts. The popular assumption of neutrality/non-functionality for the less conserved parts of the genome overlooks their fundamental function in quick adaptation.

For a pdf of the poster, please visit this site: http://www.sklmg.edu.cn/a...
I followed the above link to Professor Shi Huang's page, where he summarizes his theory, Maximum Genetic Diversity (MGD), which I found interesting. I am quoting its main paragraph below:
We proposed a novel hypothesis of genetic diversity and evolution, the Maximum Genetic Diversity (MGD) hypothesis. It is based on a pair of self-evident intuitions on construction. The first intuitive idea posits that the maximum tolerable level of random variations/errors/noises in building blocks above atom level is inversely related to system complexity. The equivalent of this concept in biology is that MGD is inversely related to epigenetic complexity. The second intuitive idea posits that any system can allow a certain degree of random noises/errors in its building blocks, and such limited degree of random errors may confer zero, negative, or positive values to the functioning/survival of the system under certain environmental circumstances. In biology, this simply means that an organism has a specific level of MGD and that genetic variations within MGD may confer zero, negative, or positive values to the functioning/survival of the organism under certain environmental circumstances. The first idea describes macroevolution where there is change in complexity over time, while the second idea describes microevolution and population genetics where there is no change in system complexity over time. The second idea thus underlies the proven portions of the modern evolution theory composed of Neo-Darwinism and Kimura’s Neutral theory. The novel points of the MGD hypothesis are 1) macroevolution and microevolution are different, and 2) an increase in organismal or epigenetic complexity is associated with a decrease in MGD. This hypothesis has been supported by numerous observations and has yet to meet a contradiction. It has solved a nearly half century old puzzle of biology, the genetic equidistance phenomenon. By studying fundamental mysteries of common diseases, complex traits, and evolution, our goal is to use the MGD hypothesis to solve major real world biomedical problems and to gain novel insights into the past, present and future of life on Earth.
If I understand this MGD theory correctly, it claims that the more complex an organism is, the more limited its genetic diversity is. Or, to paraphrase, the more complex an organism is, the smaller the number of (random) genetic variations it can tolerate. This makes sense logically (and intuitively, as the professor says), because a perfectly functioning, complex organism requires higher precision in its construction and internal biological mechanisms and tolerates (is able to survive with) only a reduced number of errors (random variations). I would say that this theory sounds like a common-sense engineering perspective on complex machinery.

InVivoVeritas
November 2, 2014 at 11:05 PM PDT
