Uncommon Descent Serving The Intelligent Design Community

Error Correction and Fidelity Mechanisms — How does life evolve these if life presupposes them?


Early nonsense: mRNA decay solves a translational problem
Nadia Amrani, Matthew S. Sachs and Allan Jacobson


[Here are some excerpts:]

The nonsense-mediated mRNA decay (NMD) pathway ensures that incomplete messenger RNAs that harbour premature termination, or nonsense (non-amino-acid-coding), codons are targeted for rapid degradation. New insights into the process of NMD have provided unexpected glimpses of the complexity of translation termination.

Gene expression is highly accurate and rarely generates defective proteins. Several mechanisms ensure this fidelity, including specialized surveillance pathways that rid the cell of mRNAs that are incompletely processed or that lack complete open reading frames.

Given the overall congruity of gene expression mechanisms from yeast to humans, and the obvious need for quality control, it comes as no surprise that the nonsense-mediated mRNA decay (NMD) pathway is active in all eukaryotes that have been examined, and that the three core factors of this pathway are highly conserved.

Existing models for the respective mechanisms of NMD in yeast and mammalian cells are not easily reconciled.

We propose a model for NMD in yeast in which the underlying mechanism that causes an mRNA to become an NMD substrate is compatible with mRNA processing, translation and degradative activities in all eukaryotes.

Current concepts of the mechanism of translation termination are vastly oversimplified.

The comparably complicated process of initiation is already known to involve as many as 100 factors.

Other studies of the modulation of eRF activity indicate considerable regulation of the termination process and the probable involvement of numerous additional factors.

The complexities of termination remain to be unravelled.


It has been realized for quite a while that organisms can pass on beneficial acquired traits. For example, in many birds and mammals, offspring inherit high quality territories from their parents. In plants there is not a strict separation between germline and soma (as in most animals), and as a result beneficial somatic mutations are more easily transmitted to offspring (see Leo W Buss’ excellent book The Evolution of Individuality).

This is also a great read: A new biology for a new century, by Carl R. Woese (2004). Microbiology and Molecular Biology Reviews 68, 173-186.
Among other things, it discusses the early evolution of the translation machinery. The main ideas put forward are that horizontal transfer was of major importance in the early evolution of life and that early sloppy translation combined with horizontal transfer allowed “communities” of primitive cells to share innovations that occurred with high frequency due to the efficiency with which protein sequence space was “searched”.

There's virtually no separation between germ line and soma in fungi. Try telling that to Bob OH. Maybe he'll listen to you. -ds Raevmo
I guess a new topic based on Dave's comment. If there is a mechanism that takes learned behavior and records it for future generations (instincts), shouldn't we expect that to be irreducibly complex? geoffrobinson


I have a question. If you believe that recent discoveries in epigenetics can resolve some of your doubts about the explanatory power of the 'standard model', where does that leave your opinion on the design inference ?


It doesn't affect my opinion on design. In that regard I'm interested in where the first cell came from. If DNA and ribosomes and all the 200 or so digitally coded protein products required for the simplest self-contained, self-replicating organism could have assembled themselves from simple precursor chemicals without intelligent agency, then I'll concede everything that followed could have happened without intelligence too. -ds bdelloid

It's unfathomable to me how anyone could have ever believed the environment does not directly influence heritable traits. It would be a tremendous advantage in differential reproduction for any organism able to pass on beneficial acquired traits. Darwin and Lamarck both intuitively knew it; they just didn't know the mechanism. We're only now uncovering the mechanism after decades of thinking it was all in the nucleic acid sequences and only random mutation could change them. A nature (or a designer) who could produce the nanometer-scale cellular information warehouse and factory complex we have already managed to unravel (and that's just the tip of the iceberg) could certainly come up with a way to pass along hard-won survival tools to the next generation. Big brains, language, writing, and high technology are merely the current state of the art in how life shares new survival tools and methods horizontally with contemporaries and vertically with descendants.


Mark Frank,

I would not want to conclude that our new understanding of epigenetics suggests that we need to throw the baby out with the bathwater. But new discoveries in epigenetics certainly suggest that we need to add some new parameters to our model of evolution. Certainly the "standard model" of the neo-Darwinian synthesis, whatever that is, is not 100% sufficient. And whether it is 99.9% sufficient, or 50% sufficient, to explain evolution is not yet known. In time, evolutionary biologists will incorporate epigenetics more satisfactorily into their models. This is how science works. New discoveries lead to new understanding. New discoveries shouldn't lead to pitching seventy or so years of work off the cliff. If it turns out that the environment can have input into genetic systems, well, that will be a tremendous discovery and will certainly add another layer to the "standard model".

Some thoughts about epigenesis and ID.

1) Epigenetics is fascinating and important, but keep a sense of proportion. Mendelian/DNA-based inheritance is understood in great detail, accounts for an enormous amount of what we observe in the way of inheritance, supports detailed predictions, and even allows us to control inheritance to a limited extent. None of this is true of epigenesis. There is no reason to believe that epigenesis is more fundamental or in some way controls genetic inheritance. It is more likely to be just one of those Heath Robinson additions to the process, running in parallel, that evolution inevitably produces. It works, so it remains.

2) You could imagine that epigenesis might be the route for some Lamarckian inheritance to take place. For example, the individual uses certain functions a lot and this increases the amount of mRNA created for the proteins required to support those functions, and the RNA in some way gets inherited even though the DNA responsible for the mRNA in the first place is not. (It might even eventually affect the DNA of the germ line of the offspring.) There is nothing designed or teleological about this, unless you think of the organism itself as the unconscious designer. (This is probably complete biochemical rubbish; it is just the logic I am trying to illustrate.)

3) Lamarckian inheritance means that some variation is not random but is also not designed. This makes all estimates of the probability of certain proteins arising through random mutation irrelevant. Mark Frank
"The modern synthesis has a lot wrong with it." It's incomplete, but most evolutionary biologists recognise this fact. The theory of evolution has changed quite a bit since the modern synthesis was produced. Epigenetics is a good example of something that isn't part of the modern synthesis but is part of the theory now. Chris Hyland

Hey DS,

I disagree with you on many things, but I agree that epigenetics is hot stuff these days:

Check it out, you will dig it:


Let me know if the URL doesn't work due to a firewall.

The link worked, thanks. The modern synthesis has a lot wrong with it. That's not surprising given it's a melding of antique, simplistic, incomplete theories from Darwin and Mendel. The surprising thing is the tenacity with which the chance worshippers cling to chance & necessity. I used to think "chance worshipper" was hyperbole but now it seems an accurate description. -ds



I agree that there is the appearance of a catch-22 situation with "evolving error correction" in general. In this specific case--if I understand the original post correctly--we are dealing with nonsense-mediated decay (NMD), which has to do with mistakes at the level of protein translation. As someone earlier pointed out, these are *not* the variety of "mistakes" that darwinian evolution is primarily concerned with (not that they are completely irrelevant to it). Chris H. later suggested that NMD systems could impact genetic mutations that messed up intron splicing, etc. This is definitely true, but such mutations also suffer from the fact that if the protein they produce is constitutively malformed, there's a good chance the organism will not be viable (particularly when genetically homozygous) due to a *lack* of functional protein product being made. So I don't think a system such as this is geared towards destroying *genetically* aberrant gene products--that's generally taken care of by negative selection--so much as destroying translational mistakes that weren't coded for.

Nevertheless, as many of you are well aware, there *are* legitimate error-proofing and error-correcting systems built into several components of our genetic replication systems. So even if the original system in question doesn't fit the bill exactly, it would be easy enough to find one, such as DNA polymerases, that does fit. So now the obvious question is how could *these* systems evolve via RM+NS, given that such a process would ostensibly involve mutations that the final product itself functionally quells? I won't pretend to know how--or even *if*--this happened, but I will attempt to make the case that it is not *conceptually* impossible, which I think is the point that is in question here.

I believe the key is that such systems greatly *reduce* mutations, but they do not eradicate them. The darwinian gods may strike me down for invoking competing populations, but I think it's conceptually simpler to think about this in terms of two parallel populations. In one population, you have no error checking whatsoever. In any given timeslice, the cohort of viable, breeding organisms will literally be carrying a slew of mutations, the vast majority of which--by most anyone's standards of thinking--will be detrimental (where not neutral, of course). Effectively, its **real** (aka *effective*) population size is much smaller. It is, in terms of sheer numbers, ever-so-much closer to extinction (N=0). What's more, its reproductive capacity (the ability to generate new individuals) is significantly reduced because even viable breeding individuals will lose many total offspring to high levels of deleterious mutations. In this sense, this population is on even shakier ground. It does have one very important thing going for it, though; it's producing *lots* of genetic variation for NS to act on. Evolutionarily speaking, we're told that's good, right? But just how well is this population doing really? Statistically speaking, a population's ability to *find* or discover a new space in the adaptive landscape is contingent on both its mutation rate (in this case very high) *and* its effective population size (and obviously the landscape/environment itself). In pop 1's case, the mutation rate itself is *reducing* its population size via deleterious mutation. The total mutations occurring, when factoring in the population size--ironically enough--could actually be lower than if it had a lower per-individual mutation rate. It's an optimization problem to determine what the best mutation rate is. There *is* calculus in biology too, after all...

Meanwhile, population two stumbled across a mutation that adds--and I won't insult your intelligence by pretending to know how this would happen--a dab of error-checking to its genetics. Its mutation rate is diminished. (Note that the individual organism that first *found* this nifty mutation would produce more total healthy offspring than the others, and the mutation easily sweeps through the species population once "discovered".) The effective population size of population 2 increases as a consequence of having this dab of error-correction. Even though its mutation rate is lower, there are *more* individual organisms exploring the terrain via mutations. They could be closer to the optimum alluded to earlier, and they are ever-so-much *further* from extinction because their effective population size is higher. If, with their low mutation rate, they stumble across another error-correction adaptation that tweaks the system, they may move even closer to the optimum. So this seeming conundrum of using mutational "mistakes" to generate and optimize a system that ultimately stifles those very mistakes is not, in my opinion, conceptually unworkable at all. The dynamics involved--and I have merely scratched the surface here--must be carefully considered before making a judgement.
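[Editor's note: great_ape's two-population argument can be put in rough numbers. The toy expected-value sketch below is purely illustrative -- the function name, the 1000-individual populations, the 100 mutable sites, and the assumption that each mutation independently halves viability are all invented here, not taken from the paper or the comments.]

```python
def variants_explored(pop_size, mut_rate, survival_per_mutation=0.5,
                      mutable_sites=100):
    """Crude one-generation sketch: expected viable offspring and the
    expected number of new variants those survivors carry.  Assumes
    each of `mutable_sites` sites mutates with probability `mut_rate`
    and each mutation independently multiplies viability by
    `survival_per_mutation`.  All constants are illustrative."""
    expected_mutations = mut_rate * mutable_sites
    p_viable = survival_per_mutation ** expected_mutations
    viable = pop_size * p_viable
    return viable, viable * expected_mutations

# Population 1: no error checking, high per-individual mutation rate.
v1, explored1 = variants_explored(pop_size=1000, mut_rate=0.05)
# Population 2: a dab of error correction, tenfold lower rate.
v2, explored2 = variants_explored(pop_size=1000, mut_rate=0.005)
```

With these made-up numbers, population 2 keeps roughly 707 viable individuals to population 1's 31, and despite its lower per-individual rate it explores *more* total variants (about 354 versus about 156) -- the effective-population effect the comment describes.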

Cells appear to be making some educated guesses as to when and what is a good thing to change and what isn't. How beneficial would it be to differential reproduction to be able to pass acquired characters on to the next generation? Real important would be my guess, and given what I've seen so far of cells doing exactly that, it doesn't seem out of line to suppose they have some "expert software" running in them able to make informed decisions about adaptation. -ds great_ape
great_ape, I think you are missing the argument. We are not saying that something has to be good all the time--things like cholesterol, salt, etc. Point taken. The problem is that error-control systems exist which are, in theory, created by a mechanism that relies on errors. Advantages are produced by error. In such a system, an error-control mechanism (besides possibly being irreducibly complex) is an advantage that runs counter to the error-reliant mechanism which creates advantages. A better analogy, if I can find one, is a system that requires salt producing something that requires no salt. geoffrobinson
No I think you were right I made a mistake, the process also takes care of mutations that cause exon skipping. Although I still don't see how this is a problem affecting evolution. I think most of the people on this thread are referring to general mutation correcting mechanisms though. Chris Hyland
Chris - you are right. I didn't read the paper carefully enough. This is simply a system that improves the efficiency of expressing genes. My post above was mistaken. Mark Frank
Remember that this paper is talking about nonsense mediated decay, which is not to do with genetic mutation. I think it is a result of eukaryotic genes containing introns. A system that means proteins are produced correctly doesn't have much of an effect on the variation available to evolution. It does however result in a decrease in the amount of energy required to produce the right amount of proteins. Chris Hyland
"But to sum up: if error control gives a natural advantage, that must mean the Darwinian mechanism doesn't give organisms a natural advantage. If the Darwinian mechanism is true, the error control system should not exist. But it does" --geoffrobinson Many comments in this thread have followed a similar line of reasoning. We should extend it to other aspects of our life. My doctor, for instance, recommends that I lower my salt intake because too much sodium is detrimental to my health. Based on my biochemical and physiological training, however, I'm certain this is pure rubbish. Sodium is an essential component in many biochemical and physiological processes. If I need sodium, how can sodium also be detrimental? It's simply not possible that something can be both good and bad at once. His sodium-disease theory (SDT) is logically flawed and inconsistent with abundant evidence that sodium is necessary for biological functions. I therefore have opted to dismiss his advice altogether. great_ape


You say:

"So why would systems evolve which stultify the very basis for evolution?"

Most mutations are harmful, thus natural selection will favor a low mutation rate. However, there is presumably an energetic limit to a low mutation rate, so one explanation is that the lowest mutation rate that is not prohibitive in cost evolves. Favorable mutations may still occur on this background.

There is some theory that also suggests that the rate may evolve to be a smidgen above minimal, due to selection on favorable mutations.

At least this is the basic thinking. However, as usual, things are more complicated, and equilibrium mutation rates may not be reached.

Here is a very recent paper that explores the balance between different selective forces: cost of fidelity, cost of deleterious mutation and beneficial mutation.
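[Editor's note: the trade-off described above -- fidelity has a cost, so the lowest *affordable* but non-zero mutation rate wins -- can be sketched numerically. The fitness function below is a toy of the editor's own devising, not the model from the linked paper: mutational load falls off as exp(-d*u) while an illustrative proofreading cost c/u diverges as the rate u is pushed toward zero, so the optimum sits at a small but strictly positive rate.]

```python
import math

def fitness(u, d=1.0, c=0.01):
    """Toy fitness at per-genome mutation rate u: a Haldane-style
    load term exp(-d*u) minus an invented cost-of-fidelity term c/u
    that blows up as u -> 0.  The constants d and c are made up."""
    return math.exp(-d * u) - c / u

# Grid search over rates u in (0, 2] at step 0.001.
best_fit, best_u = max((fitness(i / 1000), i / 1000)
                       for i in range(1, 2001))
```

Setting the derivative to zero gives c/u^2 = d*exp(-d*u), which for these constants lands near u = 0.1: a zero rate is unaffordable, and a high rate is crushed by deleterious load, so an equilibrium rate "a smidgen above minimal" is exactly what this kind of balance predicts.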


Given the complexity of systems already unraveled, why should "nature" be constrained to accepting dumb luck in descent with modification rather than directing some "mutations" to benefit itself in its immediate environment? In fact this is exactly what is being discovered about life - Lamarck was right; it's just that the mechanism isn't Mendelian, which was why he was discounted - it's epigenetic. The key isn't in the coding genes. Those are just a library of subroutines. The key is in the regulatory regions which control the when, where, why, and how of protein production, and this is controlled in significant measure by non-Mendelian inheritance, i.e. transposable elements. -ds bdelloid
I think there is confusion over the word "error" here. As I understand the paper, and from the little genetics I studied many years ago (I am not a biochemist), "error correction" in this context means: 1) Repairing some (but not all) transcription failures. There has to be an optimum rate of mutation for the germ line and there is no benefit at all in mutation in the somatic line. So it would provide a substantial fitness advantage to evolve some level of error correction of this type. 2) Repairing mutations that for one reason or another cannot transcribe into proteins. There is no benefit in a mutation that cannot transcribe into a protein, and such a repair mechanism would be a positive advantage. I don't see any suggestion in the paper that an error correction mechanism is required for life to get started. Mark Frank
"So why would systems evolve which stultify the very basis for evolution?" I think an even better question is, "how could they?", or also, "how could they survive beforehand?" This idea presupposes the possibility of life being able to survive in cellular conditions that would today lead to error catastrophe. johnnyb
I have remarked before that it certainly does seem odd that the very mechanism responsible for all the beauty of the life forms is a deadly and destructive force (random errors) which almost always tears them apart and which therefore has required the development of some highly complex and elegant systems to constantly thwart its occurrence. avocationist
I've maintained the view for some time now that evolutionary theory is incoherent and this is just another example of why that is. Without variation there can be no evolution. So why would systems evolve which stultify the very basis for evolution? Now, take note, evolutionary theory is compatible with both the presence of error detection and correction and the absence of error detection and correction. It's easy to see how this is so by observing that error detection and correction is an aspect of living organisms which itself evolved. So evolutionary theory can't really tell us anything about either other than the vacuous: the survivors survived, and they must have been the ones which evolved error detection and correction systems. And after that, those who evolved even better ones. Why did error detection and correction systems stop evolving? Why not evolve to the point of no errors at all? Why hasn't evolution come to a halt, or has it? Is it mere coincidence that error detection and correction would evolve, but only to the point that evolution itself could still continue to occur? Of course, all that evolutionary theory can tell us is that the survivors survived, and the survivors must have been the ones which stopped evolving better detection and correction systems. The mechanism of evolution requires change. Yet for changes to spread, the mechanism requires that there not be change. Weird. Mung
I think the general ND viewpoint is that primitive life (viruses, archaea, etc.) took advantage of high error rates to offer rapid adaptation. As primitive life became more complex, error correction ramped up and offered the stability required by complex organisms with longer gestation periods. The problem I see with ND's assertion is that (1) error correction algorithms are essentially goal-oriented, (2) error correction would have to improve simultaneously with an organism's survivability, and (3) the ND explanation is oversimplified and relies on circular reasoning (it exists, so it must have been advantageous or it wouldn't exist). As in the flagellum example, ND interpolates for all the parts of the mechanism that cannot be explained by saying it must have just happened in numerous gradual successive steps. I think ND will have an even greater problem using that logic to explain the emergence of a 6-bit parity code using all left-handed weak-bond polypeptides that is goal-directed depending on the organism's longevity and gestation. chunkdz
Geoffrobinson, To be more clear, if there is a cost to error correction, and most errors are harmful, one would expect a minimal, but non-zero error rate. bdelloid
Chris, may I ask how random changes to any known system based on complex specified information (literature, DNA, ordered metabolic reaction chains, etc.) improve that system? Or lead to constructive changes in general? Randomness invariably degrades legibility, function or coherence. Scrambling something leads to what I will call (qualitatively speaking) "informational entropy". The presence of rigorous error-prevention protocols in metabolic systems is, I believe, nature's acknowledgement of this fact, and her verdict that errors are not desirable features to be encouraged. I honestly believe that Geoff is right when he states (more or less) that natural selection has ruled against the fitness of uncoordinated mechanisms of genetic change by throwing its support instead behind the various error-prevention protocols in cells. Granted, change has happened, and descent with modification has taken place. I have no complaint about that pillar of evolutionary theory. But very basic information theory really does strike down the supposed Michelangelo ability of random variation/mutation to create structural utility and functional biochemistry. Rather, advantageous change must be coordinated by sources of complex specified (ordered) information provided by perhaps subtle 3-D energy fields akin to holograms, biochemical mathematics embedded in the DNA, and/or some other undiscovered mechanism. Evolution is a wondrous concept that is largely verified. It is unnecessarily burdened, however, by this grossly unfounded notion that randomness can produce anything along the lines of what we see everyday and everywhere in the biosphere. Best regards, apollo230 apollo230
Geoffrobinson, This error control mechanism is not a mechanism that corrects genetic errors. It corrects mistakes in transcription (i.e., not at the level of germline mutation). Secondly, speaking strictly of germline mutation now, even if most mutations are deleterious, this doesn't preclude some mutations being advantageous. bdelloid
I'm not saying that something advantageous can't be improved upon. But if the underlying mechanism for improvement is error which natural selection then acts upon, why would we expect an improvement to be error control? Error is how you get improvement. If error control exists in a Darwinian framework, it must, by definition, be a natural advantage. Preventing errors must be a natural advantage. But when the mechanism that gives you advantages is based on errors, error prevention shouldn't be an advantage. The mechanism chose (if I may use an anthropomorphism) something that runs counter to the mechanism itself. geoffrobinson
Reminds me of another paper from a long time ago: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=6234436&dopt=Citation "Current knowledge of enzymic editing mechanisms in DNA replication, transcription and translation can be used to predict error rates in the absence of editing. Primitive enzymes which possessed synthetic activity but not yet editing mechanisms would have had extremely high error rates resulting in heterogeneous proteins. Based on present knowledge of molecular biology and biochemistry, it is concluded that the evolution of contemporary information transfer systems from primitive systems lacking such editing mechanisms remains an unsolved problem in theoretical biology." johnnyb
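[Editor's note: the abstract's claim that unedited synthesis "would have had extremely high error rates resulting in heterogeneous proteins" is easy to quantify. Assuming independent per-residue errors, the fraction of error-free chains collapses quickly with length. The specific rates below (1 in 100 without editing, 1 in 10,000 with) are order-of-magnitude illustrations chosen by the editor, not figures from the cited paper.]

```python
def fraction_error_free(per_residue_error, length):
    """Probability that a chain of `length` residues is synthesized
    with no mis-incorporation, assuming each residue is independently
    wrong with probability `per_residue_error`."""
    return (1.0 - per_residue_error) ** length

# Hypothetical unedited machinery at 1-in-100 per residue...
sloppy = fraction_error_free(1e-2, 300)   # only ~5% of 300-mers clean
# ...versus edited machinery at 1-in-10,000 per residue.
edited = fraction_error_free(1e-4, 300)   # ~97% clean
```

Under these assumptions a primitive, editing-free system would turn out mostly defective copies of any protein of realistic length, which is the "heterogeneous proteins" problem the abstract points to.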
I don't follow. Are you saying that if something can be improved upon it could never be or have been advantageous? Chris Hyland
A system based on random errors develops error control. Now, a Darwinist has to assume that anything that exists gives a natural advantage. That would mean error control systems give an advantage over just allowing errors. But that would mean that the neo-Darwinian mechanism of random errors (mutations) isn't advantageous. Someone may say it more eloquently. But to sum up: if error control gives a natural advantage, that must mean the Darwinian mechanism doesn't give organisms a natural advantage. If the Darwinian mechanism is true, the error control system should not exist. But it does. geoffrobinson
