
Larry Moran’s new book sounds like a scorcher


Readers may recall that Larry Moran, an evolutionary biologist who used to comment here (not very favorably, but hey), is writing a book asserting that 99% of DNA is junk. A recent blog post took after an MIT biologist, Rick Young:

He was interviewed by Jorge Conde and Hanne Winarsky on a recent podcast (Feb. 1, 2021) where the main topic was “From Junk DNA to an RNA Revolution.” They get just about everything wrong when they talk about junk DNA including the Central Dogma, historical estimates of the number of genes, confusing noncoding DNA with junk, alternative splicing, the number of functional RNAs, the amount of regulatory DNA, and assuming that scientists in the 1970s were idiots.

Larry Moran, “MIT Professor Rick Young doesn’t understand junk DNA” at Sandwalk (May 10, 2021)

This from the pod summary of the show featuring Rick Young:

Much of this so-called junk DNA actually encodes RNA—which we now know has all sorts of incredibly important roles in the cell, many of which were previously thought of as only the domain of proteins. This conversation is all about what we know about what that non-coding genome actually does: how RNA works to regulate all kinds of different gene expression, cell types, and functions; how this has dramatically changed our understanding of how disease arises; and most importantly, what this means we can now do—programming cells, tuning functions up or down, or on or off. What we once thought of as “junk” is now giving us a powerful new tool in intervening in and treating disease—bringing in a whole new category of therapies.

Anyway, Moran goes on to say,

This is a very serious question. It’s the most difficult question I discuss in my book. Why has the false narrative about junk DNA, and many other things, dominated the scientific literature and become accepted dogma among leading scientists? Something is seriously wrong with science.

Larry Moran, “MIT Professor Rick Young doesn’t understand junk DNA” at Sandwalk (May 10, 2021)

What’s “wrong,” so far as the rest of us can see, is that researchers keep finding new functions that formerly-junk DNA performs, so they keep looking, for the same reason that fisherfolk return to a well-stocked lake.

See also: Larry Moran to write new book: Claims genome is 99% junk. If he wants to pick a fight with ENCODE, grab a seat.

We are encouraged to celebrate ENCODE III and the demise of junk DNA.

and

Did beliefs about junk DNA hinder the Human Genome Project?

32 Replies to “Larry Moran’s new book sounds like a scorcher”

  1.
    Bob O'H says:

    What’s “wrong,” so far as the rest of us can see, is that researchers keep finding new functions that formerly-junk DNA performs, so they keep looking.

    No, what’s wrong is that some people see that researchers keep finding new functions that formerly-junk DNA performs, and leap to the conclusion that it means that we’re wrong about most non-coding DNA being “junk”. Even though we know what most of that non-coding DNA is, and what it’s doing in our genome. At best this is wilful ignorance.

  2.
    Jonathan11 says:

    @Bob O’H
    I don’t really get what you are trying to say…

  3.
    ycrad says:

    “Does nobody there ever question their own ideas? Do they only read the papers that support their views and ignore all those that challenge those views?” Larry Moran on “the entire culture at MIT and Whitehead”

    It’s really an art to be spontaneously funny.

  4.
    polistra says:

    Defending the old makes sense when the new is clearly unneeded or wrong or unproductive. Defending the old is silly when the new is turning out to be useful and productive.

    In 1493 it was still possible to dismiss the existence of continents outside Eurasia. Columbus didn’t actually land on the Americas, and didn’t show convincingly that he found a new place.

    Moran is dismissing the existence of the Americas in 1700, after England and Spain were sending thousands of colonists and bringing back gold and tobacco and corn and potatoes.

  5.
    ET says:

    Earth to Bob O’H- Only willful ignorance says that 90% of our genome is junk. It is beyond stupid to think that the histone octamers evolved to spool and organize the DNA. And without those octamers there wouldn’t be any metazoans.

  6.
    jerry says:

    I fail to understand the junk DNA argument.

    It has no function – So what.

    ID does not argue against natural processes affecting the genome including adding DNA sequences that don’t do anything.

    It has function – So what.

    ID just recognizes that various parts of the genome have function.

    What’s the big deal?

    Isn’t this an indictment of those making it a big deal?

  7.
    JVL says:

    ET: Only willful ignorance says that 90% of our genome is junk.

    How much of the human genome do you think is not transcribed and does not affect transcribed regions?

  8.
    Bob O'H says:

    Polistra @ 4 –

    In 1493 it was still possible to dismiss the existence of continents outside Eurasia. Columbus didn’t actually land on the Americas, and didn’t show convincingly that he found a new place.

    I hope you’ll look at this again and realise how bad that comment is. So Eurocentric it not only ignores anyone living outside Eurasia, it also ignores a lot of Eurasians, e.g. the Basques and, well, anyone who lived around the Mediterranean, and knew what was to the south of that sea.

    ET @ 5 – huh? I have absolutely no idea what you are trying to get at. What do histones have to do with the function of junk DNA?

    Jerry @ 6 – indeed. We all know that some junk DNA has function, but a lot doesn’t (and we know what most of that lot is).

  9.
    bornagain77 says:

    Bob, along with Larry Moran and Dan Graur, believes most of the genome is junk.

    Bob, like Moran and Graur, is wrong.

  10.
    Fasteddious says:

    For an analogy of alternative roles for most of the DNA in a genome, take a look at:
    https://thopid.blogspot.com/2019/02/a-junk-dna-functionality-analogy.html
    Design documents serve many purposes besides specifying how to construct the building. Analogously, there are many functions in DNA other than how to make proteins.

  11.
    jerry says:

    indeed. We all know that some junk DNA has function, but a lot doesn’t (and we know what most of that lot is)

    My answer again is: so what? It’s of no consequence.

    Eventually, there will be an accurate estimate of what’s functional and what is not.

  12.
    Fasteddious says:

    Jerry: of course it has consequences.
    The fact that research into the supposed “junk” was needlessly delayed by more than a decade is one big consequence. The fact that new functions are being found which open new areas of research is another. If 99% of the genome is found to serve some role or other, rather than 2%, surely that would be consequential? Or perhaps you were just saying the precise percentage of “junk” is of no consequence? But even the difference between, say, 20% and 80% is consequential to someone, somehow.

  13.
    Silver Asiatic says:

    Bob O’H

    No, what’s wrong is that some people see that researchers keep finding new functions that formerly-junk DNA performs, and leap to the conclusion that it means that we’re wrong about most non-coding DNA being “junk”.

    The point is, as you say, something that was formerly claimed to be junk DNA was shown to have function – so they were wrong. This might be motivating Larry Moran, since it is an embarrassment to the very arrogant earlier claims that Junk DNA was strong evidence for Darwinian evolution. I found that to be a problematic claim even if 99% of the genome is junk. Just assigning all of that to evolutionary detritus is not much of an explanation for why it is conserved and inherited.
    But there’s also an assumption that we’ve found all of the function that there is to find and no more will be revealed. For those using Junk DNA as a prop-up for evolution, it seems that’s the hope – that no more function will be found. Certainly, for anyone who made big claims about Junk – pointing to it as a critique of the Creator – there are ulterior motives in all of it.

  14.
    ET says:

    Bob O’H:

    What do histones have to do with the function of junk DNA?

    Without them there wouldn’t be any life above prokaryotes. People like you want us to believe they just accidentally happened and then just happened to be able to spool and organize the DNA to make it useful.

    The fact that histone octamers exist is evidence against junk DNA and evidence for planning and intelligent design.

  15.
    Silver Asiatic says:

    Jerry

    My answer is again. So what. It’s of no consequence.

    From an ID perspective, strictly – yes, it’s irrelevant. ID just looks for evidence of design. There are many aspects of nature that do not show evidence clearly – so that’s not a focus.
    But there are side-issues with Junk DNA. First, it is claimed as evidence supporting Darwin – like vestigial organs. So, as Junk DNA falls, so does that evolutionary argument. Secondly, if Junk DNA is an essential component of evolutionary theory and is gradually proven false – then that undercuts the credibility of evolutionists in their claims. Thirdly, IDists predicted that Junk DNA would have function – although I’m not sure on what basis they made that prediction. It could be they assume that everything in biology has some function beyond an evolutionary outcome, but in any case, new functions in non-coding regions are seen as an ID prediction fulfilled. I haven’t fully understood that point, but I know that Jonathan Wells is often cited as one proposing function for Junk DNA before the function was discovered.
    I think some of this blends over with creationist arguments that reference God’s creation directly, rather than strictly ID-science arguments that would not care at all whether there is Junk or not. Something either shows evidence of design or it doesn’t – and if it doesn’t, then it’s not a focus of ID inquiry (although ID doesn’t claim that things showing observable evidence of design are the only things that were designed, only that those are the things we can recognize scientifically).

  16.
    ET says:

    JVL:

    How much of the human genome do you think is not transcribed and does not affect transcribed regions?

    Non-sequitur. Try again. DNA doesn’t have to be transcribed in order to provide a function.

  17.
    bornagain77 says:

    Junk DNA is a shining example of the unfalsifiable, pseudoscientific, nature of Darwinian evolution.

    The concept of Junk DNA from Darwinists was not born from empirical observation, but was a prediction that was born out of the mathematics of population genetics itself.

    As Robert Carter states, “Junk DNA is not just a label that was tacked on to some DNA that seemed to have no function, but it is something that is required by evolutionary theory. Mathematically, there is too much variation, too much DNA to mutate, and too few generations in which to get it all done.”

    The slow, painful death of junk DNA – Robert W. Carter – 2009
    Background
    Based on the work of J.B.S. Haldane and others, who showed that natural selection cannot possibly select for millions of new mutations over the course of human evolution, Kimura developed the idea of “neutral evolution”. If “Haldane’s Dilemma” were correct, then the majority of DNA must be non-functional. It should be free to mutate over time without needing to be shaped by natural selection. In this way, natural selection could act on the important bits and neutral evolution could act randomly on the rest. Since natural selection will not act on neutral traits, which do not affect survival or reproduction, neutral evolution can proceed through random drift without any inherent “cost of selection”. The term “junk DNA” originated with Ohno, who based his idea squarely on the idea of neutral evolution. To Ohno and other scientists of his time, the vast spaces (introns) between protein-coding regions (exons) were just useless DNA whose only function was to separate genes along a chromosome. Junk DNA is a necessary mathematical extrapolation. It was invented to solve a theoretical evolutionary dilemma. Without it, evolution runs into insurmountable mathematical difficulties.
    Junk DNA necessary for evolution
    Junk DNA is not just a label that was tacked on to some DNA that seemed to have no function, but it is something that is required by evolutionary theory. Mathematically, there is too much variation, too much DNA to mutate, and too few generations in which to get it all done. This was the essence of Haldane’s work. Without junk DNA, evolutionary theory cannot currently explain how everything works mathematically.
    https://creation.com/images/pdfs/tj/j23_3/j23_3_12-13.pdf

    Here is a short history of the Junk DNA argument from Darwinists,

    Haldane’s dilemma has not been solved
    Excerpt: The famous evolutionary geneticist J.B.S. Haldane (1892–1964) was one of the three founders of the field of study known as population genetics. Haldane articulated a serious problem for evolutionary theory in a seminal paper in 1957—the ‘cost of substitution’.1 When a beneficial mutation occurs in a population, it has to increase in the number of copies for the population to progress evolutionarily (if the mutation remained in one individual, then evolution cannot proceed; this is fairly obvious). In other words, it has to substitute for the non-mutated genes in the population. But the rate at which this can happen is limited. A major factor limiting the rate of substitution is the reproduction rate of the species. For a human-like creature with a generation time of about 20 years and low reproduction rate per individual, the rate of growth in numbers of a mutation in a population will be exceedingly slow. This is basically the ‘cost of substitution’.
    http://creation.com/haldanes-d.....een-solved

    Haldane’s Dilemma – Chase Nelson
    Excerpt: Haldane, one of the founders (along with Ronald Fisher and Sewall Wright) of mathematical population genetics, was the first to quantify such a limit on the speed of adaptive evolution.7 He concluded that the cost of selection “defines one of the factors, perhaps the main one, determining the speed of evolution.”8 Cost was the main reason Motoo Kimura proposed the neutral theory of molecular evolution.9 Many others cite its importance.10
    The implications for mammalian evolution were considered so severe that the issue became known as Haldane’s dilemma.11
    Despite Haldane’s work, a massive body of literature has accumulated asserting the primary role of natural selection in evolutionary change, often implying rates of adaptive evolution that exceed plausible limits.
    http://inference-review.com/ar.....es-dilemma

    Haldane’s Dilemma
    Excerpt: Haldane, (in a seminal paper in 1957—the ‘cost of substitution’), was the first to recognize there was a cost to selection which limited what it realistically could be expected to do. He did not fully realize that his thinking would create major problems for evolutionary theory. He calculated that in man it would take 6 million years to fix just 1,000 mutations (assuming 20 years per generation).,,, Man and chimp differ by at least 150 million nucleotides representing at least 40 million hypothetical mutations (Britten, 2002). So if man evolved from a chimp-like creature, then during that process there were at least 20 million mutations fixed within the human lineage (40 million divided by 2), yet natural selection could only have selected for 1,000 of those. All the rest would have had to been fixed by random drift – creating millions of nearly-neutral deleterious mutations. This would not just have made us inferior to our chimp-like ancestors – it surely would have killed us. Since Haldane’s dilemma there have been a number of efforts to sweep the problem under the rug, but the problem is still exactly the same. ReMine (1993, 2005) has extensively reviewed the problem, and has analyzed it using an entirely different mathematical formulation – but has obtained identical results.
    John Sanford PhD. – “Genetic Entropy and The Mystery of the Genome” – pg. 159-160
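The arithmetic Sanford summarizes can be checked directly. A minimal sketch of the numbers quoted above, assuming the standard reading of Haldane's estimate (roughly one beneficial substitution fixed per 300 generations); all other figures come from the quote itself:

```python
# Back-of-the-envelope check of the Haldane's-dilemma figures quoted above.
# Assumption: Haldane's "cost of substitution" works out to roughly one
# beneficial substitution fixed per 300 generations.
years_available = 6_000_000          # timescale assumed in the quote
years_per_generation = 20
generations = years_available // years_per_generation      # 300,000 generations

generations_per_substitution = 300   # Haldane's estimated cost
selectable_fixations = generations // generations_per_substitution

hypothetical_mutations = 40_000_000  # human-chimp differences (Britten, 2002, per the quote)
fixed_in_human_lineage = hypothetical_mutations // 2       # one of two diverging lineages

print(selectable_fixations)          # 1,000 -- what selection could fix in that time
print(fixed_in_human_lineage)        # 20,000,000 -- what would need fixing
```

This is just the quoted arithmetic made explicit: the gap between the two printed numbers is the "dilemma" the quote is describing.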

    Walter ReMine on Haldane’s Dilemma – interview
    http://kgov.com/Walter-ReMine-on-Haldanes-Dilemma

    Kimura’s Quandary
    Excerpt: Kimura realized that Haldane was correct,,, He developed his neutral theory in response to this overwhelming evolutionary problem. Paradoxically, his theory led him to believe that most mutations are unselectable, and therefore,,, most ‘evolution’ must be independent of selection! Because he was totally committed to the primary axiom (neo-Darwinism), Kimura apparently never considered his cost arguments could most rationally be used to argue against the Axiom’s (neo-Darwinism’s) very validity.
    John Sanford PhD. – “Genetic Entropy and The Mystery of the Genome” – pg. 161 – 162

    “Kimura (1968) developed the idea of “Neutral Evolution”. If “Haldane’s Dilemma” is correct, the majority of DNA must be non-functional.”
    – Sanford

    In other words, Neutral theory, and the entire concept of junk DNA, was not developed because of any compelling empirical observation, but was actually developed because it was forced upon Darwinists by the mathematics of population genetics itself.

    More specifically, neutral theory, and the entire concept of junk DNA, is actually the result of a theoretical failure of natural selection within the mathematics of population genetics!

    As Austin L. Hughes put the situation, “Darwinism asserts that natural selection is the driving force of evolutionary change. It is the claim of the neutral theory, on the other hand, that the majority of evolutionary change is due to chance.”

    Austin L. Hughes – The Neutral Theory of Evolution – Chase Nelson – 2016
    Excerpt: ORIGINALLY PROPOSED by Motoo Kimura, Jack King, and Thomas Jukes, the neutral theory of molecular evolution is inherently non-Darwinian.2 Darwinism asserts that natural selection is the driving force of evolutionary change. It is the claim of the neutral theory, on the other hand, that the majority of evolutionary change is due to chance.
    http://inference-review.com/ar.....evolution/

    Thus, with natural selection tossed aside by population genetics as the explanation for the ‘wonderful design’ we see in life, Darwinists did not accept such a devastating finding as an outright falsification of their theory, as they should have done; instead they are now reduced to arguing that the ‘wonderful design’ we see in life is, basically, the result of pure chance, with natural selection now playing a very negligible role, if any role at all.

  18.
    bornagain77 says:

    Even Richard Dawkins himself finds the claim that chance, all by its lonesome, can build such ‘wonderful design’ to be absolutely inconceivable. In the following video Dawkins states that the ‘appearance of design’, “cannot come about by chance. It’s absolutely inconceivable that you could get anything as complicated or well designed as a modern bird or a human or a hedgehog coming about by chance.’

    4:30 minute mark: “It cannot come about by chance. It’s absolutely inconceivable that you could get anything as complicated or well designed as a modern bird or a human or a hedgehog coming about by chance. That’s absolutely out.,,, It’s out of the question.,,,
    So where does it (the appearance of design) come from? The process of gradual evolution by natural selection.”
    Richard Dawkins – From a Frog to a Prince – video
    https://youtu.be/ClleN8ysimg?t=267

    For crying out loud, the entire purpose of ‘Natural Selection’ in the first place was to supposedly “explain away” the overwhelming ‘appearance of design’ that we see in life without any reference to a real Designer, i.e. without any reference to God.

    And as Ernst Mayr himself explained, “The theory of evolution by natural selection explains the adaptedness and diversity of the world solely materialistically. It no longer requires God as creator or designer,,, Every aspect of the “wonderful design” so admired by the natural theologians could be explained by natural selection.”

    “The theory of evolution by natural selection explains the adaptedness and diversity of the world solely materialistically. It no longer requires God as creator or designer (although one is certainly still free to believe in God even if one accepts evolution). Darwin pointed out that creation, as described in the Bible and the origin accounts of other cultures, was contradicted by almost any aspect of the natural world. Every aspect of the “wonderful design” so admired by the natural theologians could be explained by natural selection.”
    – Ernst Mayr – “Darwin’s Influence on Modern Thought” in Scientific American, July, 2000
    https://sciphilos.info/docs_pages/docs_Mayr_Dawin_css.html

    And as Francisco J. Ayala put it, natural selection supposedly accounted for “Design without designer”, i.e. “The adaptive features of organisms could now be explained,, as the result of natural processes, without recourse to an Intelligent Designer.,,,”

    Darwin’s greatest discovery: Design without designer – Francisco J. Ayala – May 15, 2007
    Excerpt: With Darwin’s discovery of natural selection, the origin and adaptations of organisms were brought into the realm of science. The adaptive features of organisms could now be explained, like the phenomena of the inanimate world, as the result of natural processes, without recourse to an Intelligent Designer.,,,
    Darwin’s theory of natural selection accounts for the “design” of organisms, and for their wondrous diversity, as the result of natural processes, the gradual accumulation of spontaneously arisen variations (mutations) sorted out by natural selection.
    https://www.pnas.org/content/104/suppl_1/8567

    And as Richard Dawkins himself stated in “The Blind Watchmaker”, “Yet the living results of natural selection overwhelmingly impress us with the appearance of design as if by a master watchmaker, impress us with the illusion of design and planning.”

    “Yet the living results of natural selection overwhelmingly impress us with the appearance of design as if by a master watchmaker, impress us with the illusion of design and planning.”
    – Richard Dawkins – “The Blind Watchmaker” – 1986 – page 21

    Contrary to what proponents of neutral theory may want to believe, with natural selection out of the way as a supposed ‘designer substitute’ for the appearance of design we see in life, the explanation for that ‘appearance of design’ does not automatically default to pure chance; it instead reverts back to the original assumption that life must be the product of Intelligent Design.

    As Richard Sternberg states, “Darwinism provided an explanation for the appearance of design, and argued that there is no Designer — or, if you will, the designer is natural selection. If that’s out of the way — if that (natural selection) just does not explain the evidence — then the flip side of that is, well, things appear designed because they are designed.”

    “Darwinism provided an explanation for the appearance of design, and argued that there is no Designer — or, if you will, the designer is natural selection. If that’s out of the way — if that (natural selection) just does not explain the evidence — then the flip side of that is, well, things appear designed because they are designed.”
    Richard Sternberg – Living Waters documentary
    Whale Evolution vs. Population Genetics – Richard Sternberg and Paul Nelson – (excerpt from Living Waters video)
    https://www.youtube.com/watch?v=0csd3M4bc0Q

    Here is another recent post on the subject

    April 2021 – Neutral Theory – the failure of Natural Selection, i.e. Darwinian evolution, within population genetics. (“chance”, as it is used by Darwinists, is actually an appeal to their own ignorance of a known cause rather than an appeal to any known cause)
    https://uncommondescent.com/genetics/larry-moran-to-write-new-book-claims-genome-is-99-junk/#comment-728351

  19.
    jerry says:

    First, it is claimed as evidence supporting Darwin – like vestigial organs.

    Maybe we should get rid of the fixation on Darwin and just concentrate on the inability of any natural mechanism to create functional complexity. (Darwin’s ideas are just one of the natural mechanisms proposed)

    There are many possible sources for what is called junk DNA that are recognized natural mechanisms. So why not just admit it? That’s why I said it’s no big deal.

    That some of what was considered junk DNA has functional value doesn’t mean that the rest was not due to a natural process. It seems like a potential for a major setback if it becomes important that all junk DNA has to have function. It’s not necessary.

    I have no idea of the percentage of functional vs just excess DNA. But it’s not important.

    Aside: ID recognizes Darwinian processes as valid. However, while important for life, these processes are very limited. They are essentially modern genetics. To argue otherwise just makes one look foolish.

    Aside 2: vestigial organs (as well as junk DNA) are supposedly evidence for a poor designer, thus not God as the designer, thus support for a natural mechanism. That is a series of non-sequiturs. None of that really flows. It never deals with the fact that natural mechanisms have incredibly long odds to overcome to produce functional complexity.

  20.
    bornagain77 says:

    Keynote Speech at Biology Conference Falsifies Major Claim of Darwinism – July 2018
    Excerpt: In the present day evolutionary biologists have been insanely angry at the NIH, particularly the Billion Dollar ENCODE-pioneered set of projects which argues against the idea of junk DNA. Evolutionary biologist Dan Graur said ENCODE is “bonkers”[v] because “If ENCODE is right, evolution is wrong.”[vi]
    [v] Graur, Dan. Rubbish DNA: The functionless fraction of the human genome. arXiv, 22 Jan 2016. https://arxiv.org/abs/1601.06047
    [vi] Klinghoffer, David. Dan Graur, Darwin’s Reactionary. Evolution News & Science Today, 21 June 2017.
    https://crev.info/2018/07/keynote-speech-falsifies-darwinism/

    Apparently junk DNA, and we are talking upwards of 90% of the genome being considered junk by leading Darwinists (i.e. Graur and Moran), is required by evolutionary theory for the theory to even be semi-feasible in the minds of these Darwinists.

    And they maintain this absurd claim, that upwards of 90% of the genome must be junk, in the face of persistent empirical evidence to the contrary.

    For instance:

    Discovery Of Useful “Junk DNA” “Has Outstripped The Discovery Of Protein-Coding Genes By A Factor Of Five… – March 30, 2021
    https://uncommondescent.com/intelligent-design/discovery-of-useful-junk-dna-has-outstripped-the-discovery-of-protein-coding-genes-by-a-factor-of-five/

  21.
    Silver Asiatic says:

    Jerry @ 19. I agree that it’s not essential. But there’s some value also.
    As Fasteddious mentioned above – the finding of function fits the ID paradigm for science and nicely showed the blindness of evolutionary science. The search for design is a search for function, whereas evolution had a vested interest in making sure that Junk DNA was non-functional. Even now, Larry Moran seems very pleased to claim that it’s almost all Junk (although he’s doing that by just changing definitions, it seems). In other words, “100% of Junk DNA is junk”. That’s a typical Darwinian claim. When we find function in it, the prediction stands because whatever had function is not Junk DNA.

    But I agree that taking credit for a prediction is slippery, since someone could ask “ok, how much function does ID predict that non-coding DNA will have?” That’s much more difficult and runs the risk of being very wrong, and what basis would ID give us to make that prediction anyway? That some aspects appear designed does not mean that there are no effects from natural causes or even random accidents.

  22.
    Seversky says:

    As a reminder:

    Five Things You Should Know if You Want to Participate in the Junk DNA Debate

    Here are five things you should know if you want to engage in a legitimate scientific discussion about the amount of junk DNA in a genome.

    Genetic Load

    Every newborn human baby has about 100 mutations not found in either parent. If most of our genome contained functional sequence information, then this would be an intolerable genetic load. Only a small percentage of our genome can contain important sequence information suggesting strongly that most of our genome is junk.

    C-Value Paradox

    A comparison of genomes from closely related species shows that genome size can vary by a factor of ten or more. The only reasonable explanation is that most of the DNA in the larger genomes is junk.

    Modern Evolutionary Theory

    Nothing in biology makes sense except in the light of population genetics. The modern understanding of evolution is perfectly consistent with the presence of large amounts of junk DNA in a genome.

    Pseudogenes and broken genes are junk

    More than half of our genome consists of pseudogenes, including broken transposons and bits and pieces of transposons. A few may have secondarily acquired a function but, to a first approximation, broken genes are junk.

    Most of the genome is not conserved

    Most of the DNA sequences in large genomes are not conserved. These sequences diverge at a rate consistent with fixation of neutral alleles by random genetic drift. This strongly suggests that they do not have a function, although one can’t rule out some unknown function that doesn’t depend on sequence.

    If you want to argue against junk DNA then you need to refute or rationalize all five of these observations.

    Posted by Laurence A. Moran at Thursday, July 04, 2013
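The “not conserved” point leans on Kimura’s classic identity: for strictly neutral alleles, the long-run substitution rate equals the per-site mutation rate, independent of population size. A minimal sketch (the population size and mutation rate here are arbitrary illustrative values, not figures from Moran’s post):

```python
# Kimura's neutral-substitution identity: new neutral mutations arise at
# rate 2N * mu per generation (2N gene copies in a diploid population of N),
# and each new mutant fixes with probability 1 / (2N) under drift alone.
N = 10_000    # diploid population size (illustrative)
mu = 1e-8     # neutral mutation rate per site per generation (illustrative)

new_neutral_mutations = 2 * N * mu     # mutational input per generation
fixation_probability = 1 / (2 * N)     # chance any single new copy drifts to fixation
substitution_rate = new_neutral_mutations * fixation_probability

# The population-size terms cancel: the substitution rate is just mu.
assert abs(substitution_rate - mu) < 1e-18
```

This cancellation is why neutral divergence is expected to tick along at the mutation rate, which is the yardstick Moran’s fifth point applies to non-conserved sequence.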

  23.
    Silver Asiatic says:

    A few may have secondarily acquired a function … [and] one can’t rule out some unknown function

    Therefore, where we discovered function, it was false to call it Junk, as all evolutionists previously did. It is equally false to refer to any of the rest as Junk where some “unknown function” may yet be found.
    Seems like Larry did a pretty good job arguing against Junk DNA right there.

  24.
    bornagain77 says:

    Since Seversky has apparently fallen in love with Moran’s fallacious list, perhaps it is time to let a little light shine on it.

    First off, as to the Genetic Load argument in particular,

    Genetic Load
    Every newborn human baby has about 100 mutations not found in either parent. If most of our genome contained functional sequence information, then this would be an intolerable genetic load. Only a small percentage of our genome can contain important sequence information suggesting strongly that most of our genome is junk.

    in the following article Moran claims that, because of Genetic Load, upwards of 90% of our genome must be junk.

    Revisiting the genetic load argument with Dan Graur – Larry Moran – July 14, 2017
    Excerpt: I’ve discussed genetic load several times on this blog (e.g. Genetic Load, Neutral Theory, and Junk DNA) but a recent paper by Dan Graur provides a good opportunity to explain it once more. The basic idea of Genetic Load is that a population can only tolerate a finite number of deleterious mutations before going extinct. The theory is sound but many of the variables are not known with precision.,,,
    Let’s look at the first line in this table. The deleterious mutation rate is calculated using the lowest possible mutation rate and the smallest percentage of deleterious mutations (4%). Under these conditions, the human population could survive with a fertility value of 1.8 as long as less than 25% of the genome is functional (i.e. 75% junk) (red circle). That’s the UPPER LIMIT on the functional fraction of the human genome.
    But that limit is quite unreasonable. It’s more reasonable to assume about 100 new mutations per generation with about 10% deleterious. Using these assumptions, only 10% of the genome could be functional with a fertility value of 1.8 (green circle).
    Whatever the exact percentage of junk DNA it’s clear that the available data and population genetics point to a genome that’s mostly junk DNA.
    http://sandwalk.blogspot.com/2.....-with.html
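For readers who want to see the arithmetic behind the genetic load figures quoted above, here is a back-of-the-envelope sketch in Python. It uses the textbook Haldane–Muller approximation rather than Graur’s actual model, and simply plugs in the numbers cited in the excerpt (fertility value 1.8, 100 new mutations per generation, 10% deleterious); it illustrates the structure of the calculation, not Graur’s exact table.

```python
import math

# Haldane–Muller approximation: at mutation–selection balance, mean fitness
# is roughly exp(-U), where U is the deleterious mutation rate per genome
# per generation. A population persists only if its reproductive excess B
# can absorb the load:  B * exp(-U) >= 1,  i.e.  U <= ln(B).
B = 1.8                         # reproductive excess ("fertility value" in the excerpt)
mutations_per_generation = 100  # new mutations per newborn (figure quoted above)
fraction_deleterious = 0.10     # fraction deleterious *if* they hit functional DNA

U_max = math.log(B)  # maximum tolerable deleterious rate per generation
# deleterious rate = mutations * functional_fraction * fraction_deleterious,
# so the functional fraction is bounded by:
functional_fraction_max = U_max / (mutations_per_generation * fraction_deleterious)

print(f"U_max ≈ {U_max:.3f} deleterious mutations per generation")
print(f"functional fraction <= {functional_fraction_max:.1%}")  # ≈ 5.9%
```

This toy version gives an upper bound of roughly 6% functional, the same order of magnitude as the ~10% figure in Graur’s table; the difference comes from modelling details the sketch deliberately ignores.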

    Moran’s theoretical argument, (via Genetic Load and/or Neutral Theory), is simply not nearly as strong as he presupposes it to be.

    First off, Moran’s assumption that only 10% of mutations are deleterious is an unrealistic estimate.

    As Michael Behe stated in the following paper, “we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent”

    “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain – Michael Behe – December 2010
    Excerpt: In its most recent issue The Quarterly Review of Biology has published a review by myself of laboratory evolution experiments of microbes going back four decades.,,, The gist of the paper is that so far the overwhelming number of adaptive (that is, helpful) mutations seen in laboratory evolution experiments are either loss or modification of function. Of course we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent.,,, I dub it “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain.
    – per uncommon descent

    Moreover, as John Sanford explained in his book “Genetic Entropy”, and as he elucidated in the following paper, the unselectable ‘near neutral’ mutations, which Dr. Moran classified as perfectly neutral in his calculation, should in reality all be classified as ‘slightly deleterious mutations’, i.e. mutations that build up over time, rather than as the perfectly neutral mutations Dr. Moran erroneously took them to be.

    Can Purifying Natural Selection Preserve Biological Information? – May 2013 –
    Paul Gibson, John R. Baumgardner, Wesley H. Brewer, John C. Sanford
    In conclusion, numerical simulation shows that realistic levels of biological noise result in a high selection threshold. This results in the ongoing accumulation of low-impact deleterious mutations, with deleterious mutation count per individual increasing linearly over time. Even in very long experiments (more than 100,000 generations), slightly deleterious alleles accumulate steadily, causing eventual extinction. These findings provide independent validation of previous analytical and simulation studies [2–13]. Previous concerns about the problem of accumulation of nearly neutral mutations are strongly supported by our analysis. Indeed, when numerical simulations incorporate realistic levels of biological noise, our analyses indicate that the problem is much more severe than has been acknowledged, and that the large majority of deleterious mutations become invisible to the selection process.,,,
    http://www.worldscientific.com.....08728_0010

    Kimura’s Distribution
    http://dl0.creation.com/articl.....-white.jpg

    Correct Distribution
    http://dl0.creation.com/articl.....-white.jpg
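The ‘selection threshold’ point that Sanford and colleagues are making above can be illustrated with a toy Wright–Fisher simulation. This is emphatically not Mendel’s Accountant, and every parameter below (population size, mutation rate, selection coefficients) is purely illustrative; the sketch only shows the mechanism: mutations with a selection coefficient much smaller than 1/N accumulate almost as if they were neutral, while strongly deleterious ones are held in check.

```python
import random

def mean_load(N=200, generations=500, u=0.1, s=0.0005, seed=1):
    """Average number of deleterious mutations per individual after a toy
    Wright-Fisher run: N haploid individuals, multiplicative fitness
    (1 - s) per mutation, each offspring gaining one new mutation with
    probability u. Illustrative parameters only."""
    random.seed(seed)
    loads = [0] * N  # deleterious mutation count per individual
    for _ in range(generations):
        weights = [(1 - s) ** k for k in loads]                # selection
        parents = random.choices(loads, weights=weights, k=N)  # drift
        loads = [k + (1 if random.random() < u else 0) for k in parents]  # mutation
    return sum(loads) / N

# s = 0.0005 << 1/N = 0.005: selection barely bites; load grows toward u * generations
# s = 0.05   >> 1/N:         selection holds the load near u / s
print(mean_load(s=0.0005), mean_load(s=0.05))
```

With the weak (near-neutral) coefficient the load climbs steadily toward roughly u × generations ≈ 50; with the strong coefficient it stays pinned near u/s = 2. Which side of the threshold real human mutations fall on is, of course, exactly what the two camps dispute.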

    Thus, even though Dr. Moran used unrealistically low estimates for deleterious mutations in his calculation, he was still only able to conclude that, at most, 10% of the genome may be functional.

    Yet the fact of the matter is that if realistic estimates for ‘slightly deleterious mutations’ are used in the calculation, then ALL, i.e. 100%, of the genome would have to be considered functionless junk, instead of just 90%.

    To put it mildly, this is NOT a minor problem for Darwinists.

    Moreover, on top of the fact that Moran used an unrealistic estimate for deleterious mutations in his calculation, in their Genetic Load argument evolutionists also unrealistically hold that, “As the load of deleterious mutations grows over time, the pool of possible beneficial mutations also grows with it. This eventually leads to an equilibrium, preventing fitness decline beyond a certain point.,,,,”

    Yet Darwinists simply have no rational basis for presupposing that.

    As Dr. John Sanford stated in 2020, “This argument is: 1) merely dismissive, 2) categorically wrong, and 3) without a rational or data-driven basis. Obviously, rapidly accumulating deleterious mutations do not lead to more and more beneficial mutations. Rather, the much more abundant deleterious mutations effectively overwhelm and negate the fitness effects of the extremely rare beneficial mutations.”

    And Dr. John Sanford further states, “We have done thousands of numerical simulations showing this. Even given the most generous parameter settings, the near-neutral bad mutations consistently accumulate about 1000 times faster than the beneficial mutations.”

    Responding to supposed refutations of genetic entropy from the ‘experts’
    by Paul Price, Robert Carter and John Sanford – 1 December 2020
    Excerpt: 1 Mutations & Equilibrium
    Claim: As the load of deleterious mutations grows over time, the pool of possible beneficial mutations also grows with it. This eventually leads to an equilibrium, preventing fitness decline beyond a certain point.,,,,
    Comments from Dr Sanford:
    This argument is: 1) merely dismissive, 2) categorically wrong, and 3) without a rational or data-driven basis. Obviously, rapidly accumulating deleterious mutations do not lead to more and more beneficial mutations. Rather, the much more abundant deleterious mutations effectively overwhelm and negate the fitness effects of the extremely rare beneficial mutations. The ratio of bad to good mutations is, minimally, 1000:1. With or without selection, bad mutations will always accumulate much more rapidly than beneficial mutations. We have done thousands of numerical simulations showing this. Even given the most generous parameter settings, the near-neutral bad mutations consistently accumulate about 1000 times faster than the beneficial mutations.
    https://creation.com/genetic-entropy-defense

    Moreover, unlike Dr. Sanford and company, neither Moran nor any other evolutionist has ever constructed a realistic computer simulation of the evolutionary process.

    As Dr. Robert Marks and company have found, (via extensive analysis of evolutionary algorithms), “Those hoping to establish Darwinian evolution as a hard science with a model have either failed or inadvertently cheated,,,,, there exists no model successfully describing undirected Darwinian evolution. According to our current understanding, there never will be.,,,”

    Top Ten Questions and Objections to ‘Introduction to Evolutionary Informatics’ – Robert J. Marks II – June 12, 2017
    Excerpt: “There exists no model successfully describing undirected Darwinian evolution. Hard sciences are built on foundations of mathematics or definitive simulations. Examples include electromagnetics, Newtonian mechanics, geophysics, relativity, thermodynamics, quantum mechanics, optics, and many areas in biology. Those hoping to establish Darwinian evolution as a hard science with a model have either failed or inadvertently cheated. These models contain guidance mechanisms to land the airplane squarely on the target runway despite stochastic wind gusts. Not only can the guiding assistance be specifically identified in each proposed evolution model, its contribution to the success can be measured, in bits, as active information.,,,”,,, “there exists no model successfully describing undirected Darwinian evolution. According to our current understanding, there never will be.,,,”
    https://evolutionnews.org/2017/06/top-ten-questions-and-objections-to-introduction-to-evolutionary-informatics/

    Thus, contrary to whatever Larry Moran may falsely imagine about his genetic load argument, the fact of the matter is that the genetic load argument is more appropriately, and realistically, used as an argument against the validity of evolutionary theory, NOT as an argument for its validity.

    As Dr. John Sanford stated in his book “Genetic Entropy”, “Kimura apparently never considered his cost arguments (Genetic Load argument) could most rationally be used to argue against the Axiom’s (neo-Darwinism’s) very validity.”

    Kimura’s Quandary
    Excerpt: “Kimura realized that Haldane was correct,,, He developed his neutral theory in response to this overwhelming evolutionary problem. Paradoxically, his theory led him to believe that most mutations are unselectable, and therefore,,, most ‘evolution’ must be independent of selection! Because he was totally committed to the primary axiom (neo-Darwinism), Kimura apparently never considered his cost arguments (Genetic Load argument) could most rationally be used to argue against the Axiom’s (neo-Darwinism’s) very validity.”
    – John Sanford PhD. – “Genetic Entropy and The Mystery of the Genome” – pg. 161 – 162 – 2005

  25.
    Bob O'H says:

    Jerry @ 11 – TBH I think we’re in agreement.

    SA @ 13 –

    The point is, as you say, something that was formerly claimed to be junk DNA was shown to have function – so they were wrong.

    Who, precisely, were the “they”, and what exactly did they say? I’ve never come across a biologist claiming that all “junk” DNA is functionless. What you have repeated is, quite simply, a myth.

    But there’s also an assumption that we’ve found all of the function that there is to find now and no more will be revealed.

    Who makes that assumption? I’ve never seen it stated.

    ET @ 14 –

    What do histones have to do with the function of junk DNA?

    Without them there wouldn’t be any life above prokaryotes.

    And again, what does this have to do with junk DNA?

    SA @ 15 –

    Secondly, if Junk DNA is an essential component of evolutionary theory …

    Just to help you: it isn’t.

  26.
    bornagain77 says:

    As to the second item on Moran’s list. C-Value Paradox

    C-Value Paradox
    A comparison of genomes from closely related species shows that genome size can vary by a factor of ten or more. The only reasonable explanation is that most of the DNA in the larger genomes is junk.

    LOL, first off, it is funny that Moran would try to use ‘the only reasonable explanation’ as a criterion for ascertaining whether something may be junk or not.

    It is simply impossible for our capacity to reason to be based upon the reductive materialism that undergirds Darwin’s theory.

    The primary reason that reductive materialism cannot ground reasoning is that it denies the reality of free will.

    For a primary example of how the denial of free will undermines our ability to reason, I cite the following self-refuting quote on free will from Jerry Coyne himself

    THE ILLUSION OF FREE WILL – Sam Harris – 2012
    Excerpt: “Free will is an illusion so convincing that people simply refuse to believe that we don’t have it.”
    – Jerry Coyne
    https://samharris.org/the-illusion-of-free-will/

    That should literally be the number one example of a self-refuting argument that is given in philosophy 101 classes.

    As the preceding statement by Coyne makes abundantly clear, the denial of the reality of free will by Darwinists undermines any ability that we have to make logically coherent arguments in the first place.

    As Martin Cothran explains in the following article, “By their own logic, it isn’t logic that demands their assent to the claim that free will is an illusion, but the prior chemical state of their brains. The only condition under which we could possibly find their argument convincing is if they are not true.”

    Sam Harris’s Free Will: The Medial Pre-Frontal Cortex Did It – Martin Cothran – November 9, 2012
    Excerpt: There is something ironic about the position of thinkers like Harris on issues like this: they claim that their position is the result of the irresistible necessity of logic (in fact, they pride themselves on their logic). Their belief is the consequent, in a ground/consequent relation between their evidence and their conclusion. But their very stated position is that any mental state — including their position on this issue — is the effect of a physical, not logical cause.
    By their own logic, it isn’t logic that demands their assent to the claim that free will is an illusion, but the prior chemical state of their brains. The only condition under which we could possibly find their argument convincing is if they are not true. The claim that free will is an illusion requires the possibility that minds have the freedom to assent to a logical argument, a freedom denied by the claim itself. It is an assent that must, in order to remain logical and not physiological, presume a perspective outside the physical order.
    http://www.evolutionnews.org/2.....66221.html

    In short, if you believe, as Larry Moran apparently does, that we have the capacity to reason in a logically coherent fashion, then you are forced to believe that we have free will in some real and meaningful sense. But the reductive materialism of Dr. Moran’s worldview denies the reality of free will. Therefore Dr. Moran’s worldview, insofar as he himself believes in his capacity to reason, must be false.

    But anyways, aside from the fact that Dr. Moran has no basis within his reductive materialistic worldview to ground his “only reasonable explanation” as to why genomes would vary widely in their respective sizes, Dr. Moran is also apparently unaware of the fact that there are other ‘reasonable explanations’ for why the size of DNA would vary widely between different kinds of species.

    There are many functional considerations as to why it would be advantageous to have widely varying genome sizes.

    The following article goes over some of these reasons. It is a lengthy article, so I will only quote the conclusion, but unbiased readers can dig into the article and see for themselves that there are other reasonable explanations as to why it would be functionally advantageous to have different genome sizes.

    Why the “Onion Test” Fails as an Argument for “Junk DNA” – Jonathan McLatchie – November 2, 2011
    Conclusion
    The so-called onion test, or indeed the “C-value enigma,” is predicated on unsupportable assumptions about the physiological effects of — and/or requirements for — larger genomes, many of which are contradicted by the scientific evidence. As we learn ever more about the nature and functional inter-dependency of the genome, as the extent of genomic “dark matter” continues to shrink, those who continue to advocate the view that the preponderance of our genome is non-functional should find these facts disconcerting.
    https://evolutionnews.org/2011/11/why_the_onion_test_fails_as_an/

    Jonathan McLatchie has a shorter article on the subject here:

    Biologist John Mattick on Junk DNA, ENCODE, and Intelligent Design
    Jonathan McLatchie – August 9, 2013
    Excerpt: There are a number of ways in which the C-value paradox could be resolved, however, and it is likely a combination of a number of factors. We now know that through RNA splicing after transcription, a single gene can produce more than one protein. Humans, for instance, produce around 100,000 proteins from 22,000 structural genes. Thus, an organism’s complexity cannot be viewed as a simple function of the number of genes it possesses.
    Second, there is in fact a relationship between biological complexity and the extent of gene regulation. Roughly 9% of genes in Homo sapiens encode transcription factors. In Drosophila melanogaster, only about 5.5% of genes code for transcription factors; 4.2% of the genes of Caenorhabditis elegans code for transcription factors; only 3.4% of genes code for transcription factors in the budding yeast Saccharomyces cerevisiae (see Table 2 of Messina et al., 2004). When coupled with an increased network of transcriptional enhancers and promoters, such a difference could result in a much larger set of gene expression patterns. This could lead to a non-linear increase in organismal complexity (e.g. see Levine and Tjian, 2003).
    There are other factors to consider as well — for example, organisms with larger cell volumes (such as amoebas) tend to produce repetitive DNA, which serves structural purposes. As Thomas Cavalier-Smith explains, when cell size increases, “there is positive selection for a corresponding increase in nuclear volume; it is generally easier to achieve this by increasing the amount of DNA rather than by altering its folding parameters” (Cavalier-Smith, 2005). Another thing to consider is that time taken to transcribe long stretches of non-coding DNA such as introns can be of functional consequence (e.g. see Swinburne and Silver, 2011). There are thus so many different factors needing to be taken into account that it is difficult to make a watertight argument for junk DNA based on the C-value paradox.
    https://evolutionnews.org/2013/08/john_mattick_on/

    And Jonathan Wells, author of “The Myth Of Junk DNA”, weighs in on the ‘onion test’ here. Specifically Jonathan Wells quotes T. Ryan Gregory himself to show that there are other ‘reasonable explanations’ for why genomes sizes would vary,

    The Onion Test Is a Red Herring – Jonathan Wells – March 11, 2015
    Excerpt: Gregory has written extensively on the C-value enigma,[39-42] and various hypotheses have been proposed to explain it.[43-48] One of those hypotheses attempts to explain the enigma by the accumulation of “junk DNA” or “selfish DNA,” but — as Gregory himself has pointed out — that explanation cannot make sense of the correlations noted above.[49] “Under the traditional junk DNA and selfish DNA theories,” Gregory wrote in 2005, “the relationship between genome size and cell size is considered purely coincidental.” Since this approach is incapable of explaining the correlation between C-value and cell size, “the strictly coincidental interpretation has been rejected.”[50]
    But if Gregory rejects the accumulation of “junk DNA” as an explanation for the C-value enigma, why does he use the “onion test” to defend the notion that most non-protein-coding DNA is nonfunctional? Something peculiar is going on here. Let’s take a closer look at his reasoning.,,,
    https://evolutionnews.org/2015/03/onion_expose_ca/

    Thus, contrary to what Dr. Moran claimed about there being no other ‘reasonable explanation’, there are several other functional considerations as to why it might be advantageous to have widely varying genome sizes between species.

    So Dr. Moran’s argument fails on two levels. First, the reductive materialism that undergirds his worldview, (since it denies the reality of free will), simply cannot ground reasoning in the first place. And secondly, directly contrary to what Dr. Moran claimed, there are other ‘reasonable explanations’ as to why it would be advantageous for an Intelligent Designer to choose to use widely varying genome sizes in different species.

    And, given the inherent, and staggering, complexity of life, I certainly don’t consider Junk DNA to be a ‘reasonable explanation’ for the C-Value Paradox, much less ‘the only reasonable explanation’ for the C-Value Paradox, as Dr. Moran holds. Far from it!

  27. 27
    bornagain77 says:

    As to the third item from Larry Moran’s list,

    Modern Evolutionary Theory
    Nothing in biology makes sense except in the light of population genetics. The modern understanding of evolution is perfectly consistent with the presence of large amounts of junk DNA in a genome.

    First off, it is interesting to note where Dr. Moran is ‘borrowing’ the phrase “Nothing in biology makes sense except in the light of population genetics” from.

    Dr. Moran is getting his phrase from Theodosius Dobzhansky’s 1973 essay “Nothing in biology makes sense except in the light of evolution”.

    Yet the arguments contained within Theodosius Dobzhansky’s essay are theological arguments, arguments that all “hinge upon sectarian claims about God’s nature, actions, purposes, or duties.” Dobzhansky’s arguments are NOT purely scientific arguments, but are instead dependent on faulty theological presuppositions.

    Nothing in biology makes sense except in light of theology? – Dilley S. – 2013
    Abstract
    This essay analyzes Theodosius Dobzhansky’s famous article, “Nothing in Biology Makes Sense Except in the Light of Evolution,” in which he presents some of his best arguments for evolution. I contend that all of Dobzhansky’s arguments hinge upon sectarian claims about God’s nature, actions, purposes, or duties. Moreover, Dobzhansky’s theology manifests several tensions, both in the epistemic justification of his theological claims and in their collective coherence. I note that other prominent biologists–such as Mayr, Dawkins, Eldredge, Ayala, de Beer, Futuyma, and Gould–also use theology-laden arguments. I recommend increased analysis of the justification, complexity, and coherence of this theology.
    http://www.ncbi.nlm.nih.gov/pubmed/23890740

    Moreover, directly contrary to what Darwinists believe, biology itself can get along “quite happily without particular reference to evolutionary ideas.”

    “While the great majority of biologists would probably agree with Theodosius Dobzhansky’s dictum that “Nothing in biology makes sense except in the light of evolution”, most can conduct their work quite happily without particular reference to evolutionary ideas. Evolution would appear to be the indispensable unifying idea and, at the same time, a highly superfluous one.”
    – Adam S. Wilkins, editor of the journal BioEssays, Introduction to “Evolutionary Processes” – (2000).

    The science of biology simply does not need evolutionary ideas, i.e., “Molecular biology, biochemistry, and physiology, have not taken evolution into account at all.”

    “In fact, over the last 100 years, almost all of biology has proceeded independent of evolution, except evolutionary biology itself. Molecular biology, biochemistry, and physiology, have not taken evolution into account at all.”
    – Marc Kirschner, founding chair of the Department of Systems Biology at Harvard Medical School, Boston Globe, Oct. 23, 2005

    Moreover, in so far as Dr. Moran has substituted the phrase ‘population genetics’ for the word ‘evolution’ in Theodosius Dobzhansky’s dictum, he has done nothing to remedy the fallacious nature inherent in that dictum, i.e. “Nothing in biology makes sense except in the light of population genetics”.

    Population genetics, directly contrary to what Dr. Moran falsely imagines, and as touched upon in post 17, has certainly not been kind to Darwinian ideas.

    First off, population genetics has thrown Natural Selection itself, (the supposed ‘designer substitute’, and Charles Darwin’s main claim to scientific fame), completely under the bus.

    The waiting time problem in a model hominin population – 2015 Sep 17
    John Sanford, Wesley Brewer, Franzine Smith, and John Baumgardner
    Excerpt: The program Mendel’s Accountant realistically simulates the mutation/selection process,,,
    Given optimal settings, what is the longest nucleotide string that can arise within a reasonable waiting time within a hominin population of 10,000? Arguably, the waiting time for the fixation of a “string-of-one” is by itself problematic (Table 2). Waiting a minimum of 1.5 million years (realistically, much longer), for a single point mutation is not timely adaptation in the face of any type of pressing evolutionary challenge. This is especially problematic when we consider that it is estimated that it only took six million years for the chimp and human genomes to diverge by over 5 % [1]. This represents at least 75 million nucleotide changes in the human lineage, many of which must encode new information.
    While fixing one point mutation is problematic, our simulations show that the fixation of two co-dependent mutations is extremely problematic – requiring at least 84 million years (Table 2). This is ten-fold longer than the estimated time required for ape-to-man evolution. In this light, we suggest that a string of two specific mutations is a reasonable upper limit, in terms of the longest string length that is likely to evolve within a hominin population (at least in a way that is either timely or meaningful). Certainly the creation and fixation of a string of three (requiring at least 380 million years) would be extremely untimely (and trivial in effect), in terms of the evolution of modern man.
    It is widely thought that a larger population size can eliminate the waiting time problem. If that were true, then the waiting time problem would only be meaningful within small populations. While our simulations show that larger populations do help reduce waiting time, we see that the benefit of larger population size produces rapidly diminishing returns (Table 4 and Fig. 4). When we increase the hominin population from 10,000 to 1 million (our current upper limit for these types of experiments), the waiting time for creating a string of five is only reduced from two billion to 482 million years.
    http://www.ncbi.nlm.nih.gov/pm.....MC4573302/
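The flavour of this ‘waiting time’ arithmetic can be reproduced with a standard diffusion-theory back-of-the-envelope calculation. To be clear, this is not the Mendel’s Accountant simulation the paper used; the mutation rate, selection coefficient, and generation time below are conventional textbook values supplied here only for illustration.

```python
# Expected waiting time for one *specific* beneficial point mutation to arise
# and fix in a population, using standard approximations (fixation probability
# of a new beneficial allele ~ 2s; new copies arriving per generation ~ 2N * mu).
N = 10_000      # diploid population size (as in the quoted paper)
mu = 1e-8       # per-site, per-generation mutation rate (assumed textbook value)
s = 0.01        # selection coefficient of the target mutation (assumed)
gen_years = 20  # years per hominin generation (assumed)

arrivals_per_gen = 2 * N * mu  # new copies of the target allele per generation
p_fix = 2 * s                  # probability each new copy eventually fixes
wait_generations = 1 / (arrivals_per_gen * p_fix)

print(f"~{wait_generations:,.0f} generations ≈ "
      f"{wait_generations * gen_years / 1e6:.1f} million years")
```

Even this simple estimate lands in the millions of years for a single pre-specified point mutation, the same order of magnitude the quoted simulations report; pairs of co-dependent mutations make the numbers far worse.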

    To put it mildly, this finding from population genetics is NOT a minor problem for Darwinists.

    Moreover, as if the falsification by population genetics of Natural Selection as a major player in evolution were not devastating enough for Darwinian presuppositions, population genetics goes one step further and also proves that, if Darwinian evolution were actually true, then ALL of our perceptions of reality would be illusory.

    Donald Hoffman, a cognitive scientist, via extensive analysis of the mathematics of population genetics, has proven that, if Darwinian evolution is assumed to be true, then ALL of our perceptions of reality would be illusory.

    Donald Hoffman: Do we see reality as it is? – Video – 9:59 minute mark
    Quote: “fitness does depend on reality as it is, yes.,,, Fitness is not the same thing as reality as it is, and it is fitness, and not reality as it is, that figures centrally in the equations of evolution. So, in my lab, we have run hundreds of thousands of evolutionary game simulations with lots of different randomly chosen worlds and organisms that compete for resources in those worlds. Some of the organisms see all of the reality. Others see just part of the reality. And some see none of the reality. Only fitness. Who wins? Well I hate to break it to you but perception of reality goes extinct. In almost every simulation, organisms that see none of reality, but are just tuned to fitness, drive to extinction (those organisms) that perceive reality as it is. So the bottom line is, evolution does not favor veridical, or accurate perceptions. Those (accurate) perceptions of reality go extinct. Now this is a bit stunning. How can it be that not seeing the world accurately gives us a survival advantage?”
    https://youtu.be/oYp5XuGYqqY?t=601
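The kind of ‘fitness versus truth’ tournament Hoffman describes can be sketched in a few lines. This is a minimal caricature, not Hoffman’s actual interface-theory model: the payoff for a resource is non-monotonic (too little or too much is bad), a ‘veridical’ strategy ranks options by the true quantity, and a ‘fitness-tuned’ strategy ranks them by payoff alone.

```python
import random

def payoff(x):
    # Non-monotonic payoff: peaked at x = 0.5, poor at the extremes.
    return x * (1 - x)

def fitness_beats_truth(trials=10_000, seed=0):
    """Fraction of trials in which the fitness-tuned strategy strictly
    outperforms the veridical ('more is better') strategy."""
    random.seed(seed)
    wins = 0
    for _ in range(trials):
        a, b = random.random(), random.random()
        veridical_pick = max(a, b)            # sees the true quantity
        fitness_pick = max(a, b, key=payoff)  # sees only the payoff
        if payoff(fitness_pick) > payoff(veridical_pick):
            wins += 1
    return wins / trials

print(fitness_beats_truth())  # the fitness-tuned strategy wins about half the
                              # time and, by construction, never does worse
```

In this caricature the truth-tracker is strictly dominated, which is the qualitative point of the quoted simulations; whether that point carries over to real perceptual systems is, of course, the contested part.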

    The Evolutionary Argument Against Reality – April 2016
    The cognitive scientist Donald Hoffman uses evolutionary game theory to show that our perceptions of an independent reality must be illusions.
    Excerpt: “The classic argument is that those of our ancestors who saw more accurately had a competitive advantage over those who saw less accurately and thus were more likely to pass on their genes that coded for those more accurate perceptions, so after thousands of generations we can be quite confident that we’re the offspring of those who saw accurately, and so we see accurately. That sounds very plausible. But I think it is utterly false. It misunderstands the fundamental fact about evolution, which is that it’s about fitness functions — mathematical functions that describe how well a given strategy achieves the goals of survival and reproduction. The mathematical physicist Chetan Prakash proved a theorem that I devised that says: According to evolution by natural selection, an organism that sees reality as it is will never be more fit than an organism of equal complexity that sees none of reality but is just tuned to fitness. Never.”
    https://www.quantamagazine.org/20160421-the-evolutionary-argument-against-reality/

    Thus, in a humorous twist of irony, Moran’s phrase “Nothing in biology makes sense except in the light of population genetics” is much more appropriately read as “Nothing makes sense in the light of population genetics”. 🙂

    Moreover, reliable observation is an indispensable part of the scientific method itself; in fact, reliable observation is the first step in, and therefore the cornerstone of, the scientific method:

    The scientific method
    At the core of biology and other sciences lies a problem-solving approach called the scientific method. The scientific method has five basic steps, plus one feedback step:
    1. Make an observation.
    2. Ask a question.
    3. Form a hypothesis, or testable explanation.
    4. Make a prediction based on the hypothesis.
    5. Test the prediction.
    6. Iterate: use the results to make new hypotheses or predictions.
    The scientific method is used in all sciences—including chemistry, physics, geology, and psychology. The scientists in these fields ask different questions and perform different tests. However, they use the same core approach to find answers that are logical and supported by evidence.
    https://www.khanacademy.org/science/high-school-biology/hs-biology-foundations/hs-biology-and-the-scientific-method/a/the-science-of-biology

    Since reliable observation is an indispensable part of the scientific method itself, then the Darwinian claim from population genetics that ALL our perceptions of reality would be illusory undermines the scientific method itself.

    Obviously, just as with the casting of Natural Selection by the wayside by population genetics, undermining the scientific method itself is certainly NOT a minor problem for Darwin’s theory. Again, science, especially biology, simply does not need Darwin’s theory.

    Fortunately for us, science itself, (real science, and not the dogmatic ‘scientism’ of atheistic materialists), couldn’t care less if Darwinists are forced to believe that ALL their perceptions of reality are illusory.

    Specifically, advances in quantum mechanics have now experimentally proven that our observations of reality are far more integral to reality, and therefore far more reliable, than Darwinists are forced to claim via the mathematics of population genetics.

    As the following Wheeler Delayed-Choice experiment, conducted with atoms, found: “It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it.”

    New Mind-blowing Experiment Confirms That Reality Doesn’t Exist If You Are Not Looking at It – June 3, 2015
    Excerpt: Some particles, such as photons or electrons, can behave both as particles and as waves. Here comes a question of what exactly makes a photon or an electron act either as a particle or a wave. This is what Wheeler’s experiment asks: at what point does an object ‘decide’?
    The results of the Australian scientists’ experiment, which were published in the journal Nature Physics, show that this choice is determined by the way the object is measured, which is in accordance with what quantum theory predicts.
    “It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it,” said lead researcher Dr. Andrew Truscott in a press release.,,,
    “The atoms did not travel from A to B. It was only when they were measured at the end of the journey that their wave-like or particle-like behavior was brought into existence,” he said.
    Thus, this experiment adds to the validity of the quantum theory and provides new evidence to the idea that reality doesn’t exist without an observer.
    https://awaken.com/2015/06/new-mind-blowing-experiment-confirms-that-reality-doesnt-exist-if-you-are-not-looking-at-it/

    And as the following report on the violation of Leggett’s inequality put it: “Leggett’s inequality is violated – thus stressing the quantum-mechanical assertion that reality does not exist when we’re not observing it.”

    Quantum physics says goodbye to reality – Apr 20, 2007
    Excerpt: Many realizations of the thought experiment have indeed verified the violation of Bell’s inequality. These have ruled out all hidden-variables theories based on joint assumptions of realism, meaning that reality exists when we are not observing it; and locality, meaning that separated events cannot influence one another instantaneously. But a violation of Bell’s inequality does not tell specifically which assumption – realism, locality or both – is discordant with quantum mechanics.
    Markus Aspelmeyer, Anton Zeilinger and colleagues from the University of Vienna, however, have now shown that realism is more of a problem than locality in the quantum world. They devised an experiment that violates a different inequality proposed by physicist Anthony Leggett in 2003 that relies only on realism, and relaxes the reliance on locality. To do this, rather than taking measurements along just one plane of polarization, the Austrian team took measurements in additional, perpendicular planes to check for elliptical polarization.
    They found that, just as in the realizations of Bell’s thought experiment, Leggett’s inequality is violated – thus stressing the quantum-mechanical assertion that reality does not exist when we’re not observing it. “Our study shows that ‘just’ giving up the concept of locality would not be enough to obtain a more complete description of quantum mechanics,” Aspelmeyer told Physics Web. “You would also have to give up certain intuitive features of realism.”
    http://physicsworld.com/cws/article/news/27640

    Thus, fortunately for us, science itself couldn’t care less if Darwinists are forced to believe that ALL their perceptions of reality are illusory.

    As far as experimental science itself is concerned, the Darwinists’ materialistic belief that ALL our perceptions of reality must be illusory is experimentally falsified.

    Thus, (in spite of the predictions from population genetics that the world would be illusory for us if Darwinian evolution were actually true), we can rest assured that we live in a rational and sensible universe that is not illusory, just as the Christian founders of modern science originally presupposed.

    “Science in its modern form arose in the Western civilization alone, among all the cultures of the world”, because only the Christian West possessed the necessary “intellectual presuppositions”.
    – Ian Barbour
    Presupposition 1: The contingency of nature
    “In 1277, Etienne Tempier, the bishop of Paris, writing with the support of Pope John XXI, condemned ‘necessarian theology’ and 219 separate theses, influenced by Greek philosophy, about what God could and couldn’t do.”
    “The order in nature could have been otherwise (therefore) the job of the natural philosopher, (i.e. scientist), was not to ask what God must have done but (to ask) what God actually did.”
    Presupposition 2: The intelligibility of nature
    “Modern science was inspired by the conviction that the universe is the product of a rational mind who designed it to be understood and who (also) designed the human mind to understand it.” (i.e. human exceptionalism),
    “God created us in his own image so that we could share in his own thoughts”
    – Johannes Kepler
    Presupposition 3: Human Fallibility
    “Humans are vulnerable to self-deception, flights of fancy, and jumping to conclusions.” (i.e. original sin). Scientists must therefore employ “systematic experimental methods.”
    – Stephen Meyer on Intelligent Design and The Return of the God Hypothesis – Hoover Institution
    https://www.youtube.com/watch?v=z_8PPO-cAlA

  28. 28
    ET says:

    Earth to Bob O’H- Without the spooling afforded by the histone octamers there wouldn’t be any living organisms above prokaryotes. They prove that the preponderance of DNA is NOT junk, as Larry ignorantly says. You have to be a gullible fool to think that nature produced a way to spool and organize the DNA so that the functional parts are exposed when needed.

    And if junk DNA has a function then obviously it isn’t junk, duh.

  29. 29
    ET says:

    As a reminder, seversky is just another gullible fool.

  30. 30
    bornagain77 says:

    As to the fourth item on Larry Moran’s list,

    Pseudogenes and broken genes are junk
    More than half of our genomes consists of pseudogenes, including broken transposons and bits and pieces of transposons. A few may have secondarily acquired a function but, to a first approximation, broken genes are junk.

    First off as to transposons,

    Transposable element
    A transposable element (TE, transposon, or jumping gene) is a DNA sequence that can change its position within a genome, sometimes creating or reversing mutations and altering the cell’s genetic identity and genome size.[1] Transposition often results in duplication of the same genetic material. Barbara McClintock’s discovery of them earned her a Nobel Prize in 1983.[2]

    It is interesting to note that Darwinists derided Barbara McClintock’s discovery of transposons for decades after she made it, since it was seen to be incompatible with Darwin’s theory.

    Barbara McClintock damaged corn DNA. It re-arranged its genome immediately (to repair the break) – without thousands or millions of failed trials,
    – Perry Marshall
    https://cancer-evolution.s3.amazonaws.com/perry_marshall_cancer_evolution_slides16oct.pdf

    Barbara McClintock, America’s most distinguished cytogeneticist,
    Excerpt: Her first public presentation of transposable elements was at the 1951 Cold Spring Harbor Symposium. McClintock expected recognition and acceptance, but instead was greeted with silence and derision. Almost certainly, much of this response resulted from the mutual admiration of McClintock and Richard Goldschmidt. The cantankerous Goldschmidt was a gadfly of genetics, known for denying the status quo. Since 1938 he had been arguing against the standard theory of the gene, promoting instead a holistic, chromosomal theory in which a gene’s position relative to other genes determined its function.
    http://library.cshl.edu/person.....mcclintock

    Transposons, contrary to what Darwinists presupposed (i.e. elements jumping around the genome without any rhyme or reason), have been shown to be highly specific in where and how they move within genomes. And that is why Darwinists shunned McClintock’s discovery for all those years.

    This ‘read/write’ ability of transposable elements is simply incompatible with Darwin’s theory. As Shapiro states, “Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA… This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences.”

    How life changes itself: the Read-Write (RW) genome. – 2013
    Excerpt: Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA. This cell-active view of genome change applies to all scales of DNA sequence variation, from point mutations to large-scale genome rearrangements and whole genome duplications (WGDs). This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences.
    http://www.ncbi.nlm.nih.gov/pubmed/23876611

    And as Jonathan Wells comments, “It’s the organism controlling the DNA, not the DNA controlling the organism.”

    Ask an Embryologist: Genomic Mosaicism – Jonathan Wells – February 23, 2015
    Excerpt: I now know as an embryologist,,,Tissues and cells, as they differentiate, modify their DNA to suit their needs. It’s the organism controlling the DNA, not the DNA controlling the organism.
    http://www.evolutionnews.org/2.....93851.html

    It is also interesting to note that Richard Sternberg, an Intelligent Design advocate working with James Shapiro, anticipated, several years before ENCODE came along, that mobile genetic elements would be found to have function.

    Bob Dylan, ENCODE and Evolutionary Theory: The Times They Are A-Changin’ – James Shapiro – 09/12/2012
    Excerpt: I had a longstanding, personal interest in the repetitive part of our genomes (up to as much as two-thirds of all our DNA) because it is composed of mobile genetic elements. I first discovered these elements in bacteria in my thesis research in 1968. I remember being scientifically offended by a 1980 article from Francis Crick and Leslie Orgel describing this DNA as “selfish” and functionless.
    My interest in the roles of repetitive and mobile DNA has continued since my thesis more than four decades ago. The initial sequencing of the human genome in 2001 found over 40% to be mobile repeats spread throughout our genomes, thirty times more than protein-coding DNA.
    In 2005, I published two articles on the functional importance of repetitive DNA with Rick von Sternberg. The major article was entitled “Why repetitive DNA is essential to genome function.”
    These articles with Rick are important to me (and to this blog) for two reasons. The first is that shortly after we submitted them, Rick became a momentary celebrity of the Intelligent Design movement. Critics have taken my co-authorship with Rick as an excuse for “guilt-by-association” claims that I have some ID or Creationist agenda, an allegation with no basis in anything I have written.
    The second reason the two articles with Rick are important is because they were, frankly, prescient, anticipating the recent ENCODE results. Our basic idea was that the genome is a highly sophisticated information storage organelle. Just like electronic data storage devices, the genome must be highly formatted by generic (i.e. repeated) signals that make it possible to access the stored information when and where it will be useful.
    The abstract of our paper tells the story:
    “ABSTRACT: There are clear theoretical reasons and many well-documented examples which show that repetitive DNA is essential for genome function. Generic repeated signals in the DNA are necessary to format expression of unique coding sequence files and to organise additional functions essential for genome replication and accurate transmission to progeny cells. Repetitive DNA sequence elements are also fundamental to the cooperative molecular interactions forming nucleoprotein complexes. Here, we review the surprising abundance of repetitive DNA in many genomes, describe its structural diversity, and discuss dozens of cases where the functional importance of repetitive elements has been studied in molecular detail. In particular, the fact that repeat elements serve either as initiators or boundaries for heterochromatin domains and provide a significant fraction of scaffolding/matrix attachment regions (S/MARs) suggests that the repetitive component of the genome plays a major architectonic role in higher order physical structuring. Employing an information science model, the ‘functionalist ‘ perspective on repetitive DNA leads to new ways of thinking about the systemic organisation of cellular genomes and provides several novel possibilities involving repeat elements in evolutionarily significant genome reorganisation. These ideas may facilitate the interpretation of comparisons between sequenced genomes, where the repetitive DNA component is often greater than the coding sequence component.”
    Although we could not predict in detail all the ways repeated DNA would serve genome functions, I think our statements stand up well in light of the recent data. Without knowing the specifics, we were correct in asserting that the genome had to be highly formatted to serve as the marvelous information organelle it is in every living cell and organism.
    https://www.huffpost.com/entry/bob-dylan-encode-and-evol_b_1873935

    Thus Evolutionists, with their decades-long claim that transposons are junk that just jumps around the genome without any rhyme or reason, appear to be stuck in an endless repetitive time warp, having learned absolutely nothing from the science as it has progressed.

    I will spare myself, and readers, the specific details of refuting Moran’s claim about junk pseudogenes, but suffice it to say that, as with transposons, pseudogenes are nowhere near the knockdown proof of junk DNA that Dr. Moran falsely imagines them to be.

  31. 31
    bornagain77 says:

    As to Larry Moran’s fifth and final item on his list

    Most of the genome is not conserved
    Most of the DNA sequences in large genomes is not conserved. These sequences diverge at a rate consistent with fixation of neutral alleles by random genetic drift. This strongly suggests that it does not have a function although one can’t rule out some unknown function that doesn’t depend on sequence.

    Well, contrary to what Larry Moran falsely imagines about sequences diverging ‘at a rate consistent with fixation of neutral alleles by random genetic drift’, many other researchers find the sequence data to be quite incompatible with Darwinian theory.
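    To make the quoted premise concrete before turning to the rebuttals: the standard neutral-theory result (due to Kimura) is that an allele with no fitness effect, drifting in a Wright-Fisher population, fixes with probability equal to its current frequency. That baseline is what “fixation of neutral alleles by random genetic drift” refers to. A minimal, illustrative sketch (the function name and all parameter values are my own, chosen purely for illustration, not taken from Moran’s book):

    ```python
    import random

    def wright_fisher_fixation(pop_size, p0, trials, rng):
        """Fraction of replicate populations in which a neutral allele
        starting at frequency p0 drifts all the way to fixation.
        Neutral theory predicts this fraction should be close to p0."""
        fixed = 0
        for _ in range(trials):
            count = round(p0 * pop_size)  # copies of the allele
            while 0 < count < pop_size:   # run until loss or fixation
                p = count / pop_size
                # each of the next generation's pop_size individuals
                # inherits the allele with probability p (binomial sampling)
                count = sum(1 for _ in range(pop_size) if rng.random() < p)
            if count == pop_size:
                fixed += 1
        return fixed / trials

    rng = random.Random(42)
    frac = wright_fisher_fixation(pop_size=20, p0=0.5, trials=2000, rng=rng)
    print(f"observed fixation fraction: {frac:.3f} (neutral theory predicts 0.500)")
    ```

    Whatever one makes of the dispute, this is the baseline model Moran’s fifth argument assumes; the disagreement below is over whether real genomic sequence data actually diverge at this neutral rate.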

    “The genomic revolution did more than simply allow credible reconstruction of the gene sets of ancestral life forms. Much more dramatically, it effectively overturned the central metaphor of evolutionary biology (and, arguably, of all biology), the Tree of Life (TOL), by showing that evolutionary trajectories of individual genes are irreconcilably different. Whether the TOL can or should be salvaged—and, if so, in what form—remains a matter of intense debate that is one of the important themes of this book.”
    Koonin, Eugene V. (2011-06-23). The Logic of Chance: The Nature and Origin of Biological Evolution

    A New Model for Evolution: A Rhizome – Didier Raoult – May 2010
    Excerpt: Thus we cannot currently identify a single common ancestor for the gene repertoire of any organism.,,, Overall, it is now thought that there are no two genes that have a similar history along the phylogenic tree.,,,Therefore the representation of the evolutionary pathway as a tree leading to a single common ancestor on the basis of the analysis of one or more genes provides an incorrect representation of the stability and hierarchy of evolution. Finally, genome analyses have revealed that a very high proportion of genes are likely to be newly created,,, and that some genes are only found in one organism (named ORFans). These genes do not belong to any phylogenic tree and represent new genetic creations.
    per DG

    Sweeping gene survey reveals new facets of evolution – May 28, 2018
    Excerpt: Darwin perplexed,,,
    And yet—another unexpected finding from the study—species have very clear genetic boundaries, and there’s nothing much in between.
    “If individuals are stars, then species are galaxies,” said Thaler. “They are compact clusters in the vastness of empty sequence space.”
    The absence of “in-between” species is something that also perplexed Darwin, he said.
    – per physorg

    Fudging Evolution to Avoid Falsification – March 12, 2015
    Evolutionary theory follows Finagle’s Rule #4: “Draw your curves, then plot your data.”
    Excerpt: “The fact that the clock is so uncertain is very problematic for us,” David Reich of Harvard said at a recent meeting where no consensus was reached. “It means that the dates we get out of genetics are really quite embarrassingly bad and uncertain.” The solution for some has been to invoke “rate heterogeneity”: mutations rates that speed up or slow down as needed to keep the theory intact. –
    – per crevinfo

    Also see W. Ewert – Dependency Graph – 2018 – BioComplexity

    Thus in conclusion, contrary to whatever Dr. Moran may falsely imagine to be true, the sequence data certainly does not conform to the prior Darwinian ‘tree-like’ prediction of genetic similarity across all species.

    Maybe Dr. Moran has other studies in mind when he claims that “sequences diverge at a rate consistent with fixation of neutral alleles by random genetic drift.” But even if he does, that would merely show that there is no consensus among researchers that the sequence data conform to Darwinian ‘tree-like’ predictions. The sequence data would therefore be nowhere near the level of knockdown proof that Dr. Moran needs in order to substantiate his fairly grandiose claim that a massive amount (90%) of our genome is junk.

    In short, Dr. Moran’s fifth argument, like all four of his previous arguments, fails to make his case that most of our genome must be junk. And personally, I would add that some of his arguments have failed rather spectacularly.

  32. 32
    bornagain77 says:

    I think it is worth highlighting Dr. Ewert’s 2018 paper in BioComplexity.

    In 2018 Dr. Ewert amassed a total of nine massive genetic databases and compared the design model to the common descent model.

    The following article is a bit lengthy, but it makes the point quite clearly: the common descent model failed badly when compared to the design model.

    New Paper by Dr. Ewert Demonstrates Superiority of Design Model – Cornelius Hunter – July 20, 2018
    Excerpt: Ewert’s three types of data are: (i) sample computer software, (ii) simulated species data generated from evolutionary/common descent computer algorithms, and (iii) actual, real species data.
    Ewert’s three models are: (i) a null model which entails no relationships between any species, (ii) an evolutionary/common descent model, and (iii) a dependency graph model.
    Ewert’s results are a Copernican Revolution moment. First, for the sample computer software data, not surprisingly the null model performed poorly. Computer software is highly organized, and there are relationships between different computer programs, and how they draw from foundational software libraries. But comparing the common descent and dependency graph models, the latter performs far better at modeling the software “species.” In other words, the design and development of computer software is far better described and modeled by a dependency graph than by a common descent tree.
    Second, for the simulated species data generated with a common descent algorithm, it is not surprising that the common descent model was far superior to the dependency graph. That would be true by definition, and serves to validate Ewert’s approach. Common descent is the best model for the data generated by a common descent process.
    Third, for the actual, real species data, the dependency graph model is astronomically superior compared to the common descent model.
    Where It Counts
    Let me repeat that in case the point did not sink in. Where it counted, common descent failed compared to the dependency graph model. The other data types served as useful checks, but for the data that mattered — the actual, real, biological species data — the results were unambiguous.
    Ewert amassed a total of nine massive genetic databases. In every single one, without exception, the dependency graph model surpassed common descent.
    Darwin could never have even dreamt of a test on such a massive scale. Darwin also could never have dreamt of the sheer magnitude of the failure of his theory. Because you see, Ewert’s results do not reveal two competitive models with one model edging out the other.
    We are not talking about a few decimal points difference. For one of the data sets (HomoloGene), the dependency graph model was superior to common descent by a factor of 10,064. The comparison of the two models yielded a preference for the dependency graph model of greater than ten thousand.
    Ten thousand is a big number. But it gets worse, much worse.
    Ewert used Bayesian model selection which compares the probability of the data set given the hypothetical models. In other words, given the model (dependency graph or common descent), what is the probability of this particular data set? Bayesian model selection compares the two models by dividing these two conditional probabilities. The so-called Bayes factor is the quotient yielded by this division.
    The problem is that the common descent model is so incredibly inferior to the dependency graph model that the Bayes factor cannot be typed out. In other words, the probability of the data set, given the dependency graph model, is so much greater than the probability of the data set given the common descent model, that we cannot type the quotient of their division.
    Instead, Ewert reports the logarithm of the number. Remember logarithms? Remember how 2 really means 100, 3 means 1,000, and so forth?
    Unbelievably, the 10,064 value is the logarithm (base value of 2) of the quotient! In other words, the probability of the data on the dependency graph model is so much greater than that given the common descent model, we need logarithms even to type it out. If you tried to type out the plain number, you would have to type a 1 followed by more than 3,000 zeros. That’s the ratio of how probable the data are on these two models!
    By using a base value of 2 in the logarithm we express the Bayes factor in bits. So the conditional probability for the dependency graph model has a 10,064 advantage over that of common descent.
    10,064 bits is far, far from the range in which one might actually consider the lesser model. See, for example, the Bayes factor Wikipedia page, which explains that a Bayes factor of 3.3 bits provides “substantial” evidence for a model, 5.0 bits provides “strong” evidence, and 6.6 bits provides “decisive” evidence.
    This is ridiculous. 6.6 bits is considered to provide “decisive” evidence, and when the dependency graph model is compared to the common descent model, we get 10,064 bits.
    But It Gets Worse
    The problem with all of this is that the Bayes factor of 10,064 bits for the HomoloGene data set is the very best case for common descent. For the other eight data sets, the Bayes factors range from 40,967 to 515,450.
    In other words, while 6.6 bits would be considered to provide “decisive” evidence for the dependency graph model, the actual, real, biological data provide Bayes factors of 10,064 on up to 515,450.
    We have known for a long time that common descent has failed hard. In Ewert’s new paper, we now have detailed, quantitative results demonstrating this. And Ewert provides a new model, with a far superior fit to the data.
    https://evolutionnews.org/2018/07/new-paper-by-winston-ewert-demonstrates-superiority-of-design-model/
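    For readers who want to check the arithmetic in the excerpt above, here is a short sketch. The bit values are taken verbatim from the quoted article; the conversion from a base-2 logarithm to a count of decimal digits is standard log arithmetic (a Bayes factor of 2^b is roughly 10^(b·log10 2)):

    ```python
    import math

    # Log-base-2 Bayes factors (in bits) reported in the quoted article,
    # dependency-graph model vs. common descent.
    log2_bayes_factors = {
        "HomoloGene (best case for common descent)": 10_064,
        "other data sets (low end of range)": 40_967,
        "other data sets (high end of range)": 515_450,
    }

    def decimal_digits(bits):
        """A Bayes factor of 2**bits, written out in decimal,
        is roughly a 1 followed by this many zeros."""
        return int(bits * math.log10(2))

    for label, bits in log2_bayes_factors.items():
        print(f"{label}: 2^{bits} is about 10^{decimal_digits(bits)}")

    # The 'decisive' threshold cited from the Bayes factor scale is ~6.6 bits,
    # i.e. an ordinary Bayes factor of about 2**6.6, which is roughly 100.
    print(f"'decisive' threshold: 2^6.6 is about {2**6.6:.0f}")
    ```

    This confirms the article’s claim that writing out the HomoloGene Bayes factor in plain decimal would take a 1 followed by more than 3,000 zeros (10,064 × log10 2 ≈ 3,029).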

    Thus, contrary to whatever Dr. Moran may falsely imagine to be true, the sequence data certainly does not conform to the prior Darwinian ‘tree-like’ prediction of genetic similarity across all species.

    So again to reiterate, Dr. Moran’s fifth argument, like all four of his previous arguments, fails to make his case that most of our genome (90%) must be junk.

Leave a Reply