Uncommon Descent Serving The Intelligent Design Community

Dover all over


From Evolution News & Views:

Following Kitzmiller v. Dover, an Excellent Decade for Intelligent Design

Tomorrow marks the tenth anniversary of the opening of arguments in the Kitzmiller v. Dover case, which resulted in the most absurdly hyped court decision in memory. In 2005, did an obscure federal judge in Dover, Pennsylvania, at last settle the ultimate scientific question that has fascinated mankind for millennia?

Of course not. The decision by Judge John Jones established nothing about intelligent design — far from being the “death knell” sometimes claimed by Darwin defenders.

A number of post-Dover achievements are listed, including

– Lots of pro-ID peer-reviewed scientific papers published.

– Experimental peer-reviewed research showing the unevolvability of new proteins.

– Theoretical peer-reviewed papers taking down alleged computer simulations of evolution, showing that intelligent design is needed to produce new information. Much more.

With the ten-year anniversary of Dover upcoming, expect Darwin’s followers to be too busy with hype to notice that the ground is subtly shifting.

Ironically, Dover was a major help in making it all possible.

Darwin’s followers are more apt to believe their own storytelling than reality. The reality was that people who wanted design taught in schools were a major hassle and distraction in the years leading up to Dover.

Much theoretical and research work needed to be done. But theorists and researchers were overshadowed by well-meaning people with ideas about what the school system needed—resulting in some amazing Darwinblog rants and opinionating by concerned bimbettes from Talk TV.

It would be useless to ask if the latter had read any book by an ID theorist. Most likely, Bimbette had not read any book since graduating from the journalism program. A characteristic of the type is that they “believe in evolution,” but know almost nothing about it and see no need.

Dover, thankfully, got the crowd out of people’s laptop cases and lab coat pockets, and that was, in my opinion, one of the reasons the decade was fruitful.

Darwin followers continued to claim that the Discovery Institute wanted ID taught in schools. As someone with a ringside seat, I knew that wasn’t true; its involvement in Dover was more or less forced by events.

The “teach the controversy” approach the institute did advocate was taken to be a plot to advance ID in the schools. It was actually an attempt to teach evidence-based thinking, as opposed to the Darwin lobby’s metaphysical claims.

But fortunately, the pants-in-knot street theatre that Darwin’s faithful created over the issue was an unexpected help. It tended to focus much of the hysteria on something other than the main work of the ID community.

Here’s to another decade of fruitful work for the ID community and creative profanity from the Darwinblogs! Oh yes, and pontificating about what God would or wouldn’t do from the Christian Darwinists. At least we will all have our priorities straight.

Follow UD News at Twitter!

JoeCoder, you made the claim that 'A lot of them probably don’t, even though we don’t know for sure'. That is a claim for which you have no empirical evidence and against which I provided empirical evidence. Your 'Darwinian' hunch does not count as empirical evidence. It is up to you to cite actual empirical evidence to counter the evidence I provided. Let me save you time: there is no actual empirical evidence for your 'Darwinian' hunch that 'random', as opposed to 'directed', mutations are 'completely neutral'! bornagain77
@ba77 Right, I already mentioned alternate reading frames and controlling transcription speed above. I think the alt-frames are problematic for Darwinian evolution in particular. But what evidence is there that ALL synonymous codons require a specific nucleotide for optimal function? A lot of them probably don't, even though we don't know for sure. I don't think that's evidence against design any more than the fact that all the characters of this comment use only 7 out of 8 bits is evidence against its design. JoeCoder
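JoeCoder's 7-of-8-bits analogy can be checked directly: standard ASCII text only uses code points below 128, so the top bit of every byte goes unused. A minimal Python sketch (the sample strings are arbitrary, chosen only for illustration):

```python
# Check that ordinary ASCII text never sets the 8th bit of any byte,
# illustrating the "7 out of 8 bits" analogy from the comment above.
def uses_only_seven_bits(text: str) -> bool:
    data = text.encode("ascii")              # raises UnicodeEncodeError if non-ASCII
    return all(byte < 128 for byte in data)  # top bit clear in every byte

print(uses_only_seven_bits("A lot of them probably don't."))  # True
```

The unused bit is a byproduct of the encoding's design history, not evidence for or against it, which is the point of the analogy.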
The evidence for the detrimental nature of mutations in humans is overwhelming: scientists have already cited over 100,000 mutational disorders.
Inside the Human Genome: A Case for Non-Intelligent Design - Pg. 57 - By John C. Avise
Excerpt: "Another compilation of gene lesions responsible for inherited diseases is the web-based Human Gene Mutation Database (HGMD). Recent versions of HGMD describe more than 75,000 different disease causing mutations identified to date in Homo-sapiens."
I went to the mutation database website cited by John Avise and found:
Mutation total (as of June 27, 2015) - 166,768 http://www.hgmd.cf.ac.uk/ac/
Despite what Dr. Avise may believe, that is certainly not good from the evolutionary standpoint!
Critic ignores reality of Genetic Entropy - Dr John Sanford - 7 March 2013
Excerpt: Where are the beneficial mutations in man? It is very well documented that there are thousands of deleterious Mendelian mutations accumulating in the human gene pool, even though there is strong selection against such mutations. Yet such easily recognized deleterious mutations are just the tip of the iceberg. The vast majority of deleterious mutations will not display any clear phenotype at all. There is a very high rate of visible birth defects, all of which appear deleterious. Again, this is just the tip of the iceberg. Why are no beneficial birth anomalies being seen? This is not just a matter of identifying positive changes. If there are so many beneficial mutations happening in the human population, selection should very effectively amplify them. They should be popping up virtually everywhere. They should be much more common than genetic pathologies. Where are they? European adult lactose tolerance appears to be due to a broken lactase promoter [see Can’t drink milk? You’re ‘normal’! Ed.]. African resistance to malaria is due to a broken hemoglobin protein [see Sickle-cell disease. Also, immunity of an estimated 20% of western Europeans to HIV infection is due to a broken chemokine receptor—see CCR5-delta32: a very beneficial mutation. Ed.] Beneficials happen, but generally they are loss-of-function mutations, and even then they are very rare!
http://creation.com/genetic-entropy

Human Genome in Meltdown - January 11, 2013
Excerpt: According to a study published Jan. 10 in Nature by geneticists from 4 universities including Harvard, “Analysis of 6,515 exomes reveals the recent origin of most human protein-coding variants”: "We estimate that approximately 73% of all protein-coding SNVs [single-nucleotide variants] and approximately 86% of SNVs predicted to be deleterious arose in the past 5,000-10,000 years. The average age of deleterious SNVs varied significantly across molecular pathways, and disease genes contained a significantly higher proportion of recently arisen deleterious SNVs than other genes." As for advantageous mutations, they provided NO examples.
http://crev.info/2013/01/human-genome-in-meltdown/

Multiple Overlapping Genetic Codes Profoundly Reduce the Probability of Beneficial Mutation - George Montañez, Robert J. Marks II, Jorge Fernandez and John C. Sanford - May 2013
Excerpt: It is almost universally acknowledged that beneficial mutations are rare compared to deleterious mutations [1–10]. It appears that beneficial mutations may be too rare to actually allow the accurate measurement of how rare they are [11].
1. Kibota T, Lynch M (1996) Estimate of the genomic mutation rate deleterious to overall fitness in E. coli. Nature 381:694–696.
2. Charlesworth B, Charlesworth D (1998) Some evolutionary consequences of deleterious mutations. Genetica 103:3–19.
3. Elena S, et al (1998) Distribution of fitness effects caused by random insertion mutations in Escherichia coli. Genetica 102/103:349–358.
4. Gerrish P, Lenski RN (1998) The fate of competing beneficial mutations in an asexual population. Genetica 102/103:127–144.
5. Crow J (2000) The origins, patterns, and implications of human spontaneous mutation. Nature Reviews 1:40–47.
6. Bataillon T (2000) Estimation of spontaneous genome-wide mutation rate parameters: whither beneficial mutations? Heredity 84:497–501.
7. Imhof M, Schlotterer C (2001) Fitness effects of advantageous mutations in evolving Escherichia coli populations. Proc Natl Acad Sci USA 98:1113–1117.
8. Orr H (2003) The distribution of fitness effects among beneficial mutations. Genetics 163:1519–1526.
9. Keightley P, Lynch M (2003) Toward a realistic model of mutations affecting fitness. Evolution 57:683–685.
10. Barrett R, et al (2006) The distribution of beneficial mutation effects under strong selection. Genetics 174:2071–2079.
11. Bataillon T (2000) Estimation of spontaneous genome-wide mutation rate parameters: whither beneficial mutations? Heredity 84:497–501.
http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0006
As to my 'deletion of functionless genes' comment, I wrongly extrapolated from studies on bacteria and agree with you that selection cannot see that well in multicellular creatures. But that only adds to Dr. Sanford's argument for Genetic Entropy in humans. bornagain77
as to: "What about four-fold degeneracy sites? Some of them may have other purposes (in an alternate reading frame, or affecting transcription speed) but others should be able to mutate free of consequence." And your empirical evidence for 'should be able to mutate free of consequence' is exactly what other than the hidden Darwinian presupposition in your argument that it must be able to mutate free of consequence?
Synonymous (“Silent”) Mutations in Health, Disease, and Personalized Medicine: Review - 2012
Excerpt: The CBER authors compiled a list of synonymous mutations that are linked to almost fifty diseases, including diabetes, a blood clotting disorder called hemophilia B, cervical cancer, and cystic fibrosis.
http://www.fda.gov/BiologicsBloodVaccines/ScienceResearch/ucm271385.htm

Synonymous Codons: Another Gene Expression Regulation Mechanism - September 2010
Excerpt: There are 64 possible triplet codons in the DNA code, but only 20 amino acids they produce. As one can see, some amino acids can be coded by up to six “synonyms” of triplet codons: e.g., the codes AGA, AGG, CGA, CGC, CGG, and CGU will all yield arginine when translated by the ribosome. If the same amino acid results, what difference could the synonymous codons make? The researchers found that alternate spellings might affect the timing of translation in the ribosome tunnel, and slight delays could influence how the polypeptide begins its folding. This, in turn, might affect what chemical tags get put onto the polypeptide in the post-translational process. In the case of actin, the protein that forms transport highways for muscle and other things, the researchers found that synonymous codons produced very different functional roles for the “isoform” proteins that resulted in non-muscle cells. In their conclusion, they repeated, “Whatever the exact mechanism, the discovery of Zhang et al. that synonymous codon changes can so profoundly change the role of a protein adds a new level of complexity to how we interpret the genetic code.”
http://www.creationsafaris.com/crev201009.htm#20100919a

'Snooze Button' On Biological Clocks Improves Cell Adaptability - Feb. 17, 2013
Excerpt: Like many written languages, the genetic code is filled with synonyms: differently spelled "words" that have the same or very similar meanings. For a long time, biologists thought that these synonyms, called synonymous codons, were in fact interchangeable. Recently, they have realized that this is not the case and that differences in synonymous codon usage have a significant impact on cellular processes.
http://www.sciencedaily.com/releases/2013/02/130217134246.htm

A hidden genetic code: Researchers identify key differences in seemingly synonymous parts of the structure - January 21, 2013
Excerpt: (In the Genetic Code) there are 64 possible ways to combine four bases into groups of three, called codons, but the translation process uses only 20 amino acids. To account for the difference, multiple codons translate to the same amino acid. Leucine, for example, can be encoded in six ways. Scientists, however, have long speculated whether those seemingly synonymous codons truly produced the same amino acids, or whether they represented a second, hidden genetic code. Harvard researchers have deciphered that second code. Under some stressful conditions, the researchers found, certain sequences manufacture proteins efficiently, while others—which are ostensibly identical—produce almost none. "It's really quite remarkable, because it's a very simple mechanism," Subramaniam said. "Many researchers have tried to determine whether using different codons affects protein levels, but no one had thought that maybe you need to look at it under the right conditions to see this." While the system helps cells to make certain proteins efficiently under stressful conditions, it also acts as a biological failsafe, allowing the near-complete shutdown in the production of other proteins as a way to preserve limited resources.
http://phys.org/news/2013-01-hidden-genetic-code-key-differences.html

Design In DNA - Alternative Splicing, Duons, and Dual coding genes - video (5:05 minute mark)
http://www.youtube.com/watch?v=Bm67oXKtH3s#t=305

Codes Within Codes: How Dual-Use Codons Challenge Statistical Methods for Inferring Natural Selection - Casey Luskin - December 20, 2013
Excerpt: In fact, one commentator observed that on the same analysis, codons may have more than two uses: "By this logic one could coin the term 'trion' by pointing out that histone binding is also independently affected by A-C-T-G letter frequencies within protein-coding stretches of DNA." But this isn't the first time that scientists have discovered multiple codes in biology. Earlier this year I discussed research that found an analog code in the DNA that helps regulate gene expression, in addition to the digital code that encodes primary protein sequence. In other cases, multiple proteins are encoded by the same gene! And then of course there's the splicing code, which helps control how RNAs transcribed from genes are spliced together in different ways to construct different proteins (see here and here). It boggles the mind to think about how such "codes within codes" could evolve by random mutation and natural selection. But now it turns out that evidence of different functions for synonymous codons could threaten many standard methods used to infer selection in the first place.
http://www.evolutionnews.org/2013/12/codes_within_co080381.html

Sounds of silence: synonymous nucleotides as a key to biological regulation and complexity - Jan 2013
Excerpt: Synonymous positions of the coding regions have a higher level of hybridization potential relative to non-synonymous positions, and are multifunctional in their regulatory and structural roles.
http://www.ncbi.nlm.nih.gov/pubmed/23293005

Multiple Overlapping Genetic Codes Profoundly Reduce the Probability of Beneficial Mutation - George Montañez, Robert J. Marks II, Jorge Fernandez and John C. Sanford - published online May 2013
Excerpt: In the last decade, we have discovered still another aspect of the multi-dimensional genome. We now know that DNA sequences are typically “poly-functional” [38]. Trifonov previously had described at least 12 genetic codes that any given nucleotide can contribute to [39,40], and showed that a given base-pair can contribute to multiple overlapping codes simultaneously. The first evidence of overlapping protein-coding sequences in viruses caused quite a stir, but since then it has become recognized as typical. According to Kapranov et al., “it is not unusual that a single base-pair can be part of an intricate network of multiple isoforms of overlapping sense and antisense transcripts, the majority of which are unannotated” [41]. The ENCODE project [42] has confirmed that this phenomenon is ubiquitous in higher genomes, wherein a given DNA sequence routinely encodes multiple overlapping messages, meaning that a single nucleotide can contribute to two or more genetic codes. Most recently, Itzkovitz et al. analyzed protein coding regions of 700 species, and showed that virtually all forms of life have extensive overlapping information in their genomes [43].
38. Sanford J (2008) Genetic Entropy and the Mystery of the Genome. FMS Publications, NY. Pages 131–142.
39. Trifonov EN (1989) Multiple codes of nucleotide sequences. Bull of Mathematical Biology 51:417–432.
40. Trifonov EN (1997) Genetic sequences as products of compression by inclusive superposition of many codes. Mol Biol 31:647–654.
41. Kapranov P, et al (2005) Examples of complex architecture of the human transcriptome revealed by RACE and high density tiling arrays. Genome Res 15:987–997.
42. Birney E, et al (2007) Encode Project Consortium: Identification and analysis of functional elements in 1% of the human genome by the ENCODE pilot project. Nature 447:799–816.
43. Itzkovitz S, Hodis E, Segal E (2010) Overlapping codes within protein-coding sequences. Genome Res. 20:1582–1589.
http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0006
@BA77 Sanford writes:
a nucleotide position takes up space, affects spacing between other sites, and affects such things as regional nucleotide composition, DNA folding, and nucleosome building. If a nucleotide carries absolutely no (useful) information, it is, by definition, slightly deleterious, as it slows cell replication and wastes energy.
What about four-fold degeneracy sites? Some of them may have other purposes (in an alternate reading frame, or affecting transcription speed) but others should be able to mutate free of consequence. But deleting them causes a nasty frameshift. I respect John Sanford; he knows all this already, and problems arise only when I pedantically take a hyper-literal reading of his statement. But that is no more than what you are doing with me : ) Above I wrote:
The answer is that most mutations are either neutral or very slightly deleterious; which of the two dominates depends on how much of the genome you think is functional.
I actually think the genome is mostly functional and therefore most mutations are slightly deleterious. Don't think I'm arguing for mostly junk DNA.
in that not yet deleted gene
Why do you think disabled genes are usually deleted? Selection isn't strong enough to care about whether a 3 billion base genome has an extra 1000 bases here or there. And only very rarely would a deletion target the exact start and end of a gene. JoeCoder
Andre- Those with sickle-cell trait, only one copy of the mutated gene, survive just fine and have some immunity to malaria. It is only when the individual has both copies of the mutated gene that it becomes the disease sickle-cell anemia. That's how Darwinian evolution "works": break something, hope it isn't fatal, and hope it helps the organism survive. If it helps the organism survive, it has a chance to be passed on. Darwin's "theory" of de-evolution is born. :cool: Virgil Cain
Andre -- can I suggest you learn something about sickle cell before you rant about it? wd400
How exactly is dying beneficial to the organism, WD400? How? Lastly, ever consider that the reason the mosquitoes stay away from sickle cell sufferers is because they have a mechanism that detects there are issues with the food source? Andre
JoeCoder, "Any mutation in a gene that has already been knocked out should be completely neutral" So you are depending on the previous detrimental effect of the loss of an entire gene to argue that a mutation in that not yet deleted gene may be 'completely' neutral? And me and Andre are suppose to take this argument for completely neutral mutations seriously why exactly? I would suggest that you perhaps soften your stance and say that mutations can be 'nearly neutral' instead of completely neutral.
"Moreover, there is strong theoretical reasons for believing there is no truly neutral nucleotide positions. By its very existence, a nucleotide position takes up space, affects spacing between other sites, and affects such things as regional nucleotide composition, DNA folding, and nucleosome building. If a nucleotide carries absolutely no (useful) information, it is, by definition, slightly deleterious, as it slows cell replication and wastes energy. … Therefore, there is no way to change any given site without some biological effect, no matter how subtle." - John Sanford - Genetic Entropy and The Mystery of The Genome - pg. 21 - Inventor of the 'Gene Gun'
Mutations: Enemies of Evolution with Geneticist Dr John Sanford - Genesis Unleashed (4:10 minute mark)
https://youtu.be/MfCETJ_PI1s?t=250
@BA77 Any mutation in a gene that has already been knocked out should be completely neutral, and we all have many broken genes. I said that MOST mutations are neutral or slightly deleterious, but most mutations are not in developmental gene regulatory networks. And even among those that are, are none at four-fold degeneracy sites? JoeCoder
As for do mutations happen? Yes, once the integrity check systems, repair systems and PCD mechanisms fail, mutations do happen, and it's pretty much lethal to the organism every time, Sickle Cell being a good example of that, cancer also.
We each have many mutations, so this is just not true. The great majority of sickle cell disease is not caused by new mutations, but arises from standing variation in our population. The sickle allele is common because it's beneficial to have one copy of it in malaria-endemic areas. wd400
JoeCoder, actually mutations that happen in developmental gene regulatory networks are 'always catastrophically bad' (Stephen Meyer). Moreover, the idea that mutations can be completely neutral is false (John Sanford). bornagain77
@Andre wrote:
mutations do happen and it's pretty much lethal to the organism every time
Do you agree that humans get something around 60 to 160 mutations per generation? There is no biologist, creationist, ID, or otherwise, who would dispute that number. If every mutation is "pretty much lethal", why aren't we all dead 100 times over? (to reappropriate Kondrashov's famous question) The answer is that most mutations are either neutral or very slightly deleterious; which of the two dominates depends on how much of the genome you think is functional. JoeCoder
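The 60-160 figure is consistent with simple arithmetic: a diploid human genome of roughly 6e9 bases, times a per-site, per-generation mutation rate on the order of 1e-8 to 2.5e-8, gives tens to low hundreds of new mutations. A back-of-envelope sketch in Python (the genome size and rates are rounded textbook values assumed for illustration, not numbers from this thread):

```python
# Rough expected number of new mutations per human generation:
# (diploid genome size) x (per-site mutation rate per generation).
genome_size = 6e9  # diploid base pairs, approximate
for rate in (1.0e-8, 1.8e-8, 2.5e-8):  # plausible per-site rates
    expected = genome_size * rate
    print(f"rate {rate:.1e}: ~{expected:.0f} new mutations")  # ~60, ~108, ~150
```

All three values land inside the 60-160 range quoted in the comment.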
Andre, this article may interest you:
Nobel Prize 2015: What the chemistry winners taught us about the fragility of human life - Julia Belluz - October 7, 2015
Excerpt: Early this morning we learned that the 2015 Nobel Prize in Chemistry went to Tomas Lindahl of the Francis Crick Institute, Paul Modrich of Duke University, and Aziz Sancar of University of North Carolina Chapel Hill. They won for a simple reason: Their scientific discoveries revealed the surprising ways in which our DNA is at once extremely fragile and super resilient.

As late as the 1960s and '70s, these building blocks of life were believed to be exceptionally stable. How else could DNA be passed down from generation to generation? Scientists surmised that human evolution must have selected for sturdy molecules. After all, if our gene molecules were fragile, no complex organism could possibly survive, right? Around that time, however, Lindahl began to question the conventional wisdom, asking: "How stable is DNA, really?" As a postdoc student at Princeton and later at the Karolinska Institutet in Stockholm, he carried out a series of experiments showing that DNA molecules, when isolated outside of the cell, actually degraded pretty quickly. Lindahl's research suggested that DNA can actually sustain quite a bit of damage — but somehow manage to thrive and repair itself. "[DNA] turned out to be photosensitive, temperature sensitive, and all-sorts-of-other-stuff sensitive, and that meant that living cells (1) must have mechanisms to repair DNA damage and (2) must spend a substantial amount of time and energy on them," explained chemist Derek Lowe in a fantastic blog post on the awards. … the Nobel Prize Committee said: "It is constantly subjected to assaults from the environment, yet it remains surprisingly intact."

The big question, then, was how DNA gets repaired. Lindahl arrived at part of the answer here: He identified a bacterial enzyme that removes damaged cells from DNA. Later on, he also discovered a cellular process — called "base excision repair" — that essentially continuously repairs damaged DNA using a similar enzyme. Lindahl's co-winner, Aziz Sancar, later built on this work, mapping the mechanism that cells use to repair the most common type of assault — UV damage — a technique called "nucleotide excision repair." Basically, our cells can cut out sections of DNA that are damaged by UV light and replace them with new DNA. Meanwhile, Paul Modrich discovered yet another repair mechanism: Cells can correct replication errors through a process called "mismatch repair."

The upshot of these discoveries is that cells are constantly working to repair DNA damage. "Every day, [these processes] fix thousands of occurrences of DNA damage caused by the sun, cigarette smoke or other genotoxic substances; they continuously counteract spontaneous alterations to DNA and, for each cell division, mismatch repair corrects some thousand mismatches," the Nobel Committee described. "Our genome would collapse without these repair mechanisms." These discoveries were important in themselves: They completely changed how the scientific community understood the fundamentals of cell biology and DNA.
http://www.vox.com/2015/10/7/9470913/nobel-prize-2015-what-the-chemistry-winners-taught-us-about-the
WD400 You know all that useless junk you Darwinians love selling? Would you believe that these non-coding regions are quite possibly the built-in responses to changes in an environment? As for do mutations happen? Yes, once the integrity check systems, repair systems and PCD mechanisms fail, mutations do happen, and it's pretty much lethal to the organism every time, Sickle Cell being a good example of that, cancer also. Mutations are bad! They kill you! Andre
Mung -- I know how to distinguish drift from selection. I've never diagnosed the source of your confusion on that topic, and don't suppose I will. wd400
wd400, have you learned the difference between genetic drift and neutral evolution yet? Or how to distinguish drift from selection? Mung
I understand that's the way D&S are doing it. 1/u for the first mutation A and 1/sqrt(u) for the second mutation B. Because they reason any time A appears, it might linger in the population for a while, and during that time B may appear among one of those with A. But I think if the 1/sqrt(u) component were correct we could restructure all of our brute force password cracking algorithms as an evolutionary search and make them exponentially faster. Nobody does this and we all still use passwords so I'm inclined to thing the 1/sqrt(u) term is incorrect. Or maybe there's some other difference I'm missing? JoeCoder
You seem(?) to be calculating the probability of finding the double mutation in a single chain. The point D&S make is that you have to calculate the probability that a cell getting the "B" mutation descends from one that carried the "A" mutation. wd400
@wd400 I don't think scaling is the only thing they got wrong. I'm also skeptical of the sqrt(u2) in their theorem 1 which itself is the root of our difference in calculations. Let me explain why I find it problematic: 1. The malarial genome is 23 megabases. 2. It therefore has 23 million squared possible 2-nucleotide permutations. That's 5e14, so far so good. 3. But the mutation rate you used in the D&S equation above is 10^-10, or about 1 mutation every 435 replications. 4. 5e14 * 435 is 2e17 So it should take 2e17 malaria to search every possible 2-nucleotide permutation. Or 1e17 to search half of them. In computer science we know it's impossible to find a value faster than you can look for it. That assumption is critical in all our security systems. Otherwise a password of 8 digits could be found in far less than an average of 10^8 / 2 searches. So I think D&S's sqrt(u2) component must be incorrect. But I have not dug through the Iwasa et al. paper cited by D&S to try to figure out why. Thank you for your help in evaluating this so far. P.S. I think 1e17 is still reconcilable with Behe and Tim White's suggestion it takes 1e20 malaria to f JoeCoder
Andre, in 38 you really sound like you are denying that mutations can happen. Are you serious? wd400
Hi JoeC, I think the bit that D&S got wrong with regard to Dembski was how to specify u -- the nucleotide rate against the particular nucleotide mutation required for the amino acid change. But for a given rate, D&S show how to calculate the waiting time for 2 mutations. So I don't think scaling and re-scaling is going to get you back to the inverse square? wd400
But before we even discuss all the sophisticated systems, the reasonable observer should pause for a minute and consider the following... How can any unguided process, without any help, build its own guided process to prevent any unguided processes from happening in the first place? Luck? Chance? Andre
JoeCoder An integrity checker will have a signature file of what the data should be like. Any change to the data and the integrity check fails. DNA integrity checks are not simple checksums. They are highly sophisticated; worse, there are multiple integrity checks, which means the system even has built-in redundancy. If this has been the case spanning back 550 million years, Darwinian evolution is impotent, powerless and unable to do anything, because anything it attempts, random or non-random, first has to pass these multiple checks. If something does pass, for whatever reason, it means the integrity check system failed. If the integrity checks fail, the system attempts repairs... How does it know what to repair? Well, consider that there are additional information signatures in the system. And when repair fails, what happens next? Yes, you got it: the system goes into self-destruct when all checks and all repairs have failed. Darwinian evolution can't do jack, because not only does the system not tolerate it, but when the system faults and is not repairable, it shuts down indefinitely. Andre
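The check/repair/self-destruct sequence Andre describes maps onto a familiar software pattern: verify data against a stored signature, restore from a redundant copy if verification fails, and abort entirely if repair is impossible. The sketch below is a loose software analogy only (DNA repair is vastly more sophisticated than a hash check), and every name in it is invented for illustration:

```python
import hashlib

# Toy analogy to the layered check / repair / self-destruct pattern described
# above: verify data against a stored digest, restore from a redundant copy
# when the check fails, and give up entirely when repair is impossible
# (loosely analogous to programmed cell death).
def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_or_repair(data: bytes, signature: str, backup: bytes) -> bytes:
    if digest(data) == signature:    # integrity check passes
        return data
    if digest(backup) == signature:  # repair from the redundant copy
        return backup
    raise RuntimeError("unrepairable: self-destruct")

good = b"ATCGATCG"
sig = digest(good)
print(verify_or_repair(b"ATCGATCG", sig, good))  # passes the check unchanged
print(verify_or_repair(b"ATCGATCC", sig, good))  # corrupted, restored from backup
```

In this toy version, any single-byte change to the data fails the check and is silently replaced by the backup copy.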
@Andre Because any lineage that trashed them didn't live to tell about it? Thus they are conserved. JoeCoder
JoeCoder Here is the problem. All the integrity check systems, all the repair mechanisms, and all the PCD systems are evolutionary conserved. This presents a major hurdle for Darwinian evolution.... Want to guess why? Andre
@BA77 I’m not sure why error correction is a paradox for evolution? You mean besides the obvious paradox of presuming that random errors built an extremely sophisticated, multi-overlapping system of random error correction? Yeah, no paradox in that Darwinian presumption at all! :) A bit off topic JoeC, but the following paper may interest you in regards to 'directed' mutations:
Duality in the human genome - Nov. 28, 2014 Excerpt: The results show that most genes can occur in many different forms within a population: On average, about 250 different forms of each gene exist. The researchers found around four million different gene forms just in the 400 or so genomes they analysed. This figure is certain to increase as more human genomes are examined. More than 85 percent of all genes have no predominant form which occurs in more than half of all individuals. This enormous diversity means that over half of all genes in an individual, around 9,000 of 17,500, occur uniquely in that one person - and are therefore individual in the truest sense of the word. The gene, as we imagined it, exists only in exceptional cases. "We need to fundamentally rethink the view of genes that every schoolchild has learned since Gregor Mendel's time.,,, According to the researchers, mutations of genes are not randomly distributed between the parental chromosomes. They found that 60 percent of mutations affect the same chromosome set and 40 percent both sets. Scientists refer to these as cis and trans mutations, respectively. Evidently, an organism must have more cis mutations, where the second gene form remains intact. "It's amazing how precisely the 60:40 ratio is maintained. It occurs in the genome of every individual – almost like a magic formula," says Hoehe. http://medicalxpress.com/news/2014-11-duality-human-genome.html
@wd400 Above I said:
So if we take your 5e14 above and multiply it by 30
But I think you did a calculation for a mutation rate of 1e-10 and not 1e-8, so scratch that part. JoeCoder
@BA77 I'm not sure why error correction is a paradox for evolution? Shouldn't there be a sweet spot between a mutation rate high enough to allow for evolution, but not so high it drives species extinct? However I agree with you and Sanford's genetic entropy thesis that we and likely most other higher animals are way above that safe point. I also agree that the distribution of mutations can be very non-random. I'm assuming randomness because 1) these things would be too hard to calculate otherwise, and 2) I want to show how difficult evolution is even under generous assumptions. JoeCoder
@wd400 You're correct that Durrett+Schmidt differed u1 and u2 by a factor of 30 (10 nucleotide binding site times 3 ways to mutate each letter). I did not notice that before. However, I did some googling and found an interesting debate between Behe and Durrett+Schmidt. In the paper you linked above, Durrett+Schmidt said Behe's estimated waiting time for two mutations was off by "5 million times". Behe wrote this response saying their estimate was 30 times too generous because they assumed any mutation would change the protein instead of only codon-altering mutations. In a response to Behe, Durrett+Schmidt agreed "Behe is right on this point." So if we take your 5e14 above and multiply it by 30, we get 1.5e16. Durrett+Schmidt use a mutation rate of 1e-8 per nucleotide per generation for humans, and the inverse square of that is 1e16. So I think their calculations do indeed show that the odds are roughly the inverse square of the mutation rate. Behe was wrong in Edge not to consider the differences in the mutation rates between humans and malaria, which I think accounts for the remainder of the difference between Durrett+Schmidt's and Behe's calculations of the waiting time for two mutations in humans. JoeCoder
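The arithmetic in the comment above is easy to sanity check (a sketch; the 5e14 replications figure, the 30x codon-altering correction, and the 1e-8 per-nucleotide rate are all taken from the thread):

```python
# Sanity check of the waiting-time arithmetic in the comment above.
# All figures come from the thread: wd400's 5e14 mean replications,
# Behe's 30x codon-altering correction, and Durrett & Schmidt's
# human mutation rate of 1e-8 per nucleotide per generation.

u = 1e-8                       # mutation rate per nucleotide per generation
replications = 5e14            # wd400's mean waiting time in replications
corrected = replications * 30  # apply the 30x correction

inverse_square = 1.0 / u**2    # "inverse square" of the mutation rate

print(f"corrected waiting time: {corrected:.1e} replications")
print(f"1/u^2:                  {inverse_square:.1e}")
# The two agree to within a factor of ~1.5, supporting the claim that
# the waiting time for two specific mutations scales roughly as the
# inverse square of the mutation rate.
```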
Actually Andre has a relevant point:
The Evolutionary Dynamics of Digital and Nucleotide Codes: A Mutation Protection Perspective - February 2011 Excerpt: "Unbounded random change of nucleotide codes through the accumulation of irreparable, advantageous, code-expanding, inheritable mutations at the level of individual nucleotides, as proposed by evolutionary theory, requires the mutation protection at the level of the individual nucleotides and at the higher levels of the code to be switched off or at least to dysfunction. Dysfunctioning mutation protection, however, is the origin of cancer and hereditary diseases, which reduce the capacity to live and to reproduce. Our mutation protection perspective of the evolutionary dynamics of digital and nucleotide codes thus reveals the presence of a paradox in evolutionary theory between the necessity and the disadvantage of dysfunctioning mutation protection. This mutation protection paradox, which is closely related with the paradox between evolvability and mutational robustness, needs further investigation." http://www.benthamscience.com/open/toevolj/articles/V005/1TOEVOLJ.pdf Contradiction in evolutionary theory - video - (The contradiction between extensive DNA repair mechanisms and the necessity of 'random mutations/errors' for Darwinian evolution) http://www.youtube.com/watch?v=dzh6Ct5cg1o The Darwinism contradiction of repair systems Excerpt: The bottom line is that repair mechanisms are incompatible with Darwinism in principle. Since sophisticated repair mechanisms do exist in the cell after all, then the thing to discard in the dilemma to avoid the contradiction necessarily is the Darwinist dogma. https://uncommondesc.wpengine.com/intelligent-design/the-darwinism-contradiction-of-repair-systems/
As well, even though Darwinian evolution is dependent on random mutations to be feasible as a theory, ironically it is found that too many random mutations per generation will lead to genetic deterioration.
"it would in the end be far easier and more sensible to manufacture a complete man de novo, out of appropriately chosen raw materials, than to try to fashion into human form those pitiful relics which remained… it is evident that the natural rate of mutation of man is so high, and his natural rate of reproduction so low, that not a great deal of margin is left for selection… it becomes perfectly evident that the present number of children per couple cannot be great enough to allow selection to keep pace with a mutation rate of 0.1..if, to make matters worse, u should be anything like as high as 0.5…, our present reproductive practices would be utterly out of line with human requirements." Hermann Muller quoted by John Sanford; Appendix 1, Genetic Entropy No Matter What Type Of Selection, Mutations Deteriorate Genetic Information - article and animation Excerpt: The animation asserts that if harmful mutation rates are high enough, then there exists no form or mechanism of selection which can arrest genetic deterioration. Even if the harmful mutations do not reach population fixation, they can still damage the collective genome.,,, Nobel Prize winner HJ Muller (of Muller’s ratchet fame) suggested that the human race can’t even cope with a harmful rate of 0.1 (mutations) per new born. The actual rate has been speculated to be (much higher). The animation uses a conservative harmful rate of 1 and argues (with some attempts at humor) that deterioration would thus be inevitable even with a harmful rate of 1 per new born. https://uncommondesc.wpengine.com/evolution/nachmans-paradox-defeats-darwinism-and-dawkins-weasel/ Human evolution or extinction - discussion on acceptable mutation rate per generation (with clips from Dr. John Sanford) - video http://www.youtube.com/watch?v=aC_NyFZG7pM
As well, even though Darwinian evolution is dependent on random mutations to be feasible as a theory, it is found that the vast majority of mutations are not truly random but are directed:
Revisiting the Central Dogma in the 21st Century - James A. Shapiro - 2009 Excerpt (Page 12): Underlying the central dogma and conventional views of genome evolution was the idea that the genome is a stable structure that changes rarely and accidentally by chemical fluctuations (106) or replication errors. This view has had to change with the realization that maintenance of genome stability is an active cellular function and the discovery of numerous dedicated biochemical systems for restructuring DNA molecules.(107–110) Genetic change is almost always the result of cellular action on the genome. These natural processes are analogous to human genetic engineering,,, (Page 14) Genome change arises as a consequence of natural genetic engineering, not from accidents. Replication errors and DNA damage are subject to cell surveillance and correction. When DNA damage correction does produce novel genetic structures, natural genetic engineering functions, such as mutator polymerases and nonhomologous end-joining complexes, are involved. Realizing that DNA change is a biochemical process means that it is subject to regulation like other cellular activities. Thus, we expect to see genome change occurring in response to different stimuli (Table 1) and operating nonrandomly throughout the genome, guided by various types of intermolecular contacts (Table 1 of Ref. 112). http://shapiro.bsd.uchicago.edu/Shapiro2009.AnnNYAcadSciMS.RevisitingCentral%20Dogma.pdf Also of interest from the preceding paper, on page 22, is a simplified list of the ‘epigenetic’ information flow in the cell that directly contradicts what was expected from the central dogma (Genetic Reductionism/modern synthesis model) of neo-Darwinism. 
"It is difficult (if not impossible) to find a genome change operator that is truly random in its action within the DNA of the cell where it works' James Shapiro - Evolution: A View From The 21st Century - (Page 82) New Research Elucidates Directed Mutation Mechanisms - Cornelius Hunter - January 7, 2013 Excerpt: mutations don’t occur randomly in the genome, but rather in the genes where they can help to address the challenge. But there is more. The gene’s single stranded DNA has certain coils and loops which expose only some of the gene’s nucleotides to mutation. So not only are certain genes targeted for mutation, but certain nucleotides within those genes are targeted in what is referred to as directed mutations.,,, These findings contradict evolution’s prediction that mutations are random with respect to need and sometimes just happen to occur in the right place at the right time.,,, http://darwins-god.blogspot.com/2013/01/news-research-elucidates-directed.html Failed Darwinian Prediction – Mutations are not adaptive – Cornelius Hunter – 2015 Excerpt: In the twentieth century, the theory of evolution predicted that mutations are not adaptive or directed. In other words, mutations were believed to be random with respect to the needs of the individual.,,, But that assumption is now known to be false.,,, (References on site) https://sites.google.com/site/darwinspredictions/mutations-are-not-adaptive
@Andre: All the papers we're discussing use the per generation mutation rate. Because those protective mechanisms act before mutations are passed from parent to child, their effects are already included in that rate and don't need to factor into these calculations separately. JoeCoder
JoeC, Durrett and Schmidt are talking about specific mutations, with rates u_1 and u_2. It's true that the rate of de-activating mutations will be higher than the per-nucleotide mutation rate, but that's not very relevant. You can put any value for the mutation rates into their equations. Andre, What? Are you claiming mutations don't (!) or shouldn't (?) exist? wd400
Wow, suppose so many mutations get past the multiple integrity checks, the multiple repair mechanisms, apoptosis, necrosis and they still mutate.. Just imagine how that supposedly works.. Right there that is Darwinian evolution Andre
But Durrett and Schmidt didn't calculate two specific mutations. Any number of mutations can deactivate a binding site. It's not specific. They write:
this article considers the possibility that in a short amount of time, two changes will occur, the first of which inactivates an existing binding site, and the second of which creates a new one
Or maybe you're talking about something else? JoeCoder
Suppose a mutation rate of 10^-10 per nucleotide per generation. Are you saying that if there is a varying population size, a specific two-mutation combination will arise in less than an average of 2 / 10^20 cumulative reproductions? Not sure if that’s what you’re saying.
Variance in offspring number, but otherwise yes. That's the point of the Durrett and Schmidt paper. In your case, and assuming an effective population size of 1e6, you'd have a mean waiting time of 5e8 generations = 5e14 replications. wd400
Just released video from DI Information Enigma - 21 minute video https://www.youtube.com/watch?v=aA-FcnLsF1g Information drives the development of life. But what is the source of that information? Could it have been produced by an unguided Darwinian process? Or did it require intelligent design? The Information Enigma is a fascinating 21-minute documentary that probes the mystery of biological information, the challenge it poses to orthodox Darwinian theory, and the reason it points to intelligent design. The video features molecular biologist Douglas Axe and Stephen Meyer, author of the books Signature in the Cell and Darwin’s Doubt. bornagain77
Nature was designed to "go forth and multiply". Ok, Commanded - but Designed too. And the "you just don't understand Evolution" needs to be replaced by "nobody understands Evolution". It's true. "Evolution is true" reveals an ignoramus. Sorry but it's true. And Truth is important you know. ppolish
But what if the malaria are actively trying to develop resistance?
Indeed. It was designed that way, of course. Some see it as evidence for evolution but others see it as designed adaptive behavior. The importance of this is that nothing was designed to evolve from single cell organisms to donkeys and whales. Every species obviously has limits to its ability to adapt. Mapou
JoeCoder, where the insurmountable problem comes in for Darwinists is that, despite the fact that the ‘fittest’ mutations will never fix in a population over evolutionary timescales, embryonic development and metabolic pathways, (to name just two examples), are as ‘fit’ as can possibly be,,,
Seeing the Natural World With a Physicist’s Lens – November 2010 Excerpt: Scientists have identified and mathematically anatomized an array of cases where optimization has left its fastidious mark, among them;,, the precision response in a fruit fly embryo to contouring molecules that help distinguish tail from head;,,, In each instance, biophysicists have calculated, the system couldn’t get faster, more sensitive or more efficient without first relocating to an alternate universe with alternate physical constants. http://www.nytimes.com/2010/11/02/science/02angier.html?_r=2&scp=1&sq=seeing%20the%20natural%20world%20with%20a%20physicist%27s%20lens&st=cse Optimal Design of Metabolism – Dr. Fazale Rana – July 2012 Excerpt: A new study further highlights the optimality of the cell’s metabolic systems. Using the multi-dimension optimization theory, researchers evaluated the performance of the metabolic systems of several different bacteria. The data generated by monitoring the flux (movement) of compounds through metabolic pathways (like the movement of cars along the roadways) allowed researchers to assess the behavior of cellular metabolism. They determined that metabolism functions optimally for a system that seeks to accomplish multiple objectives. It looks as if the cell’s metabolism is optimized to operate under a single set of conditions. At the same time, it can perform optimally with relatively small adjustments to the metabolic operations when the cell experiences a change in condition. http://www.reasons.org/articles/the-optimal-design-of-metabolism
Considering the extreme integrated complexity being dealt with in embryonic development and in metabolic pathways, this 'optimal as can possibly be' is certainly NOT a minor discrepancy between what the empirical evidence tells us and what we can reasonably expect from unguided, (i.e. Darwinian), material processes. In fact, I hold the study to be yet another strong empirical falsification of Darwinian claims. And by the way JoeC, thanks again for citing, and defending, the Sanford paper. I have enjoyed your input very much today! bornagain77
@ba77 I hope I'm not being too critical here. In another thread I found the part about the Bubonic plague and gene loss interesting. Saved it to my notes. So thank you for that. JoeCoder
@BA77 I took a look at the "The researchers found that the ‘fittest’ simply did not have time to be found, or to fix in the population" paper. The authors write:
Overall, our simulations show how the more frequent phenotype p1 can fix at the expense of the more fit phenotype p2. Given the many orders of magnitude difference possible between the Tp [waiting time for p2 compared to p1], such an “arrival of the frequent” effect may prevent the arrival of the fittest: If a highly beneficial phenotype is never discovered, a much less adaptive but easily accessible phenotype may go to fixation instead.
In other words, if there are lots of mutational paths that lead to phenotype p1, but few paths to p2, then p1 is more likely to fix than p2, even if p2 is more fit. I don't see how that negates wd400's statement:
I have said many times fitness is central to evolution biology
Granted it's technically both fitness and frequency of an allele, but that seems like a silly thing to call error on. However it does fit well with Behe's "break or blunt" thesis, since disabled genes can sometimes be beneficial, and there are many ways to disable a gene but very few ways to improve it. JoeCoder
But what if the malaria are actively trying to develop resistance? Virgil Cain
wd400, you would not know real science if it bit you on the rear end. You have no empirical evidence nor mathematical basis for your grandiose claims for Darwinian evolution period. (as the paper I cited illustrates). Your stupid 'you just don't understand evolution' crap is very old hat. In fact, I hold that you are completely insane and delusional for believing, with no empirical evidence whatsoever, that unguided material processes can produce functional coding and integrated complexity that far, far, far, outclasses anything ever produced and designed by man. In fact, it is probably an insult to insane and delusional people to compare them to you since at least they are being honest in their delusions whereas I hold, since you continue to refuse to deal honestly with the evidence, that you are being purposely, willfully, deceptive in your insane belief. Other than that, I hope you have a nice evening. :) bornagain77
Thank you for responding again wd400. In the Malaria paper Summers, et al write:
the minimal requirement for (low) CQ transport activity was N75E [Asn->Glu] and K76T [Lys->Thr] in PfCRTDd2 and K76T [Lys->Thr] and N326D [Asn->Asp] in PfCRTEcu1110. Given that all known PfCRT haplotypes contain either N75E/D or N326D (13), these results indicate that PfCRT acquires the ability to transport CQ via one of two main mutational routes, both of which entail the introduction of K76T plus the replacement of an asparagine (N75 or N326) with an acidic residue.
So to get any chloroquine resistance you need at least two mutations. And there are two different ways to get those two mutations. I would think getting the first two requires something around 10^20 reproductions, then the remaining 4 to 10 stepwise-increasing mutations (on any of the possible paths) are very easy and probably take a trillion or so. Sanford's 84 million years is also testing for two mutations that must both be present, so that part is the same. Or 42m years if you want either path. But Sanford's 84 million years is in a population of 10,000, which comes to a cumulative total far less than 10^20 reproductions. I think the difference is Sanford's model accepts a matching sequence anywhere on a chromosome. If I'm reading his paper correctly, they start with a chromosome of all A's and wait for a sequence of two matching letters. While malaria requires mutations at very specific locations--hence a much longer waiting time.
The ‘line em up and cut off the bottom” method is not much easier/faster to run, so I’m not sure why they’d do it.
That approach is far more generous than reality, since you're removing much of the randomness from selection. I'd think a more accurate approach would add to the waiting time. I'm also not sure why Sanford did it that way.
When the number of offspring is variable we can have genetic drift, which means a neutral allele can multiply, presenting more “targets” for mutation.
Suppose a mutation rate of 10^-10 per nucleotide per generation. Are you saying that if there is a varying population size, a specific two-mutation combination will arise in less than an average of 2 / 10^20 cumulative reproductions? Not sure if that's what you're saying. JoeCoder
BA, I suspect you are going to ignore this or think I'm not being genuine, but here goes: The reason I don't reply to you is that I think your obsession with evolutionary biology and the idea it's some "materialist" conspiracy (and your huge library of copy-paste articles) is at the very least a huge waste of time that could be spent on something profitable. For that reason I don't wish to encourage you to spend more time on it. It seems you just spent time googling up my past posts in order to repeat your mistaken interpretation of a paper that really doesn't relate to this thread at all, knowing that it's very unlikely that I'd reply. As I say, you can dismiss me if you like, but I think you can probably spend your time on better things than this, and I hope that you do. (needless to say, I'm very unlikely to reply to any more comments you make) wd400
JoeC, Very briefly. With the malaria protein, some of the intermediate steps are apparently adaptive, so we aren't in Sanford's 84 million year scenario there. (Others require a step down in fitness, so quite how those play out is less clear). With your own calculation -- I think that's conditioned on a constant number of offspring being produced by all individuals (a doubling in your case). When the number of offspring is variable we can have genetic drift, which means a neutral allele can multiply, presenting more "targets" for mutation. Unfortunately to calculate that probability you need more than a little calculus. (The calculation would go under the heading "Diffusion approximation of the Wright-Fisher model" if you want to check it out). The normal way to model selection is to assign genotypes a fitness that is the probability that they produce successful offspring. The "line em up and cut off the bottom" method is not much easier/faster to run, so I'm not sure why they'd do it. None of these differences are going to change the fact that rare mutations are rare, of course. wd400
“I have said many times fitness is central to evolution biology” – wd400 https://uncommondesc.wpengine.com/ddd/darwinian-debating-devices-5-moving-goalposts/#comment-519198 read slowly if it helps wd400 Study demonstrates evolutionary ‘fitness’ not the most important determinant of success – February 7, 2014 – with illustration The researchers found that the ‘fittest’ simply did not have time to be found, or to fix in the population over evolutionary timescales. http://phys.org/news/2014-02-evolutionary-important-success.html bornagain77
Reality is only a crutch for people who can't cope with evolution/the multiworld/retro-engineering from unintelligent design, etc. I like that last little epigrammatic evocation of their lunacy, if I say so myself. Axel
Definitely, Monty Python material, BA! Axel
there are plenty of approximately neutral paths through protein space
I remember all the discussion about that malaria paper but hadn't read it beyond the abstract. I like the way they illustrated figure 3. But if I'm reading it correctly it looks like each step only has 1-3 possible paths? Dividing Sanford's 84 million years by 3 doesn't do much to help. I come from a software engineering background and that makes me very skeptical that lineages can easily just neutral their way from one adaptation to the next. I'm sure you've seen that Nicholas White says it takes 10^20 malaria to evolve chloroquine resistance: "If two drugs are used with different modes of action, and therefore different resistance mechanisms, then the per-parasite probability of developing resistance to both drugs is the product of their individual per-parasite probabilities. This is particularly powerful in malaria, because there are only about 10^17 malaria parasites in the entire world. For example, if the per-parasite probabilities of developing resistance to drug A and drug B are both 1 in 10^12, then a simultaneously resistant mutant will arise spontaneously every 1 in 10^24 parasites. As there is a cumulative total of less than 10^20 malaria parasites in existence in one year, such a simultaneously resistant parasite would arise spontaneously roughly once every 10,000 years" And I don't remember all the details from the Moran-Behe debate, but Larry Moran did agree that: "The probability of any single mutation occurring is equal to the mutation rate, which is about 10^-10. The probability of an additional specific mutation occurring is also 10^-10. The combined probability of any two specific mutations occurring is 10^-20... Let's say that three specific mutations are required to change from a cluster of two needles to a cluster of five needles. One hundred million years ago you could calculate that the probability of three specific mutations is about 10^-30. It's highly improbable, just like the specific bridge hand.
When such a triple mutation arises we recognize that it was only one of millions and millions of possible evolutionary outcomes." About a year ago, you and I also worked out that on average you need a population that's the inverse square of the mutation rate before you get a two-step mutation with a neutral intermediate. 10^20 is roughly equal to the total number of all mammals that have ever lived.
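The quoted figures from White and Moran multiply out as claimed (a sketch; all inputs are the numbers quoted above):

```python
# Check the probability arithmetic quoted above from White (malaria
# drug resistance) and Moran (specific multi-mutation combinations).

# White's example: per-parasite resistance probability of 1e-12 for
# each of two drugs with independent resistance mechanisms.
p_drug_a = 1e-12
p_drug_b = 1e-12
p_both = p_drug_a * p_drug_b        # independent events multiply: 1e-24

parasites_per_year = 1e20           # quoted cumulative total per year
years_per_event = 1.0 / (p_both * parasites_per_year)
print(f"simultaneous resistance arises ~once every {years_per_event:,.0f} years")

# Moran's example: each specific point mutation at ~1e-10.
u = 1e-10
print(f"two specific mutations:   {u**2:.0e}")
print(f"three specific mutations: {u**3:.0e}")
```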
we need only look at the diversity of our own genomes to know this
That takes the premise that all functional variations in our genomes arose through unguided mutation+selection. If I read you correctly, the conclusion of your argument is that all functional variations in our genome arose through unguided mutations. That's obviously circular, but it's quite possible I'm reading you wrong and you meant something else here?
sets up a very strange model of selection without even discussing why they didn’t use a more standard one, or what difference a soft selection regime would make.
Can you explain what you mean here? I don't know the specifics of various selection models. They simulate: (1) four offspring per generation, (2) two selected away to maintain constant population size, and (3) a selective benefit of 10% to any member having both nucleotides. That all seems reasonable, and I would guess most beneficial mutations have a selection coefficient much smaller than 10%.
how often are adaptive traits limited to these ultra-specified paths with no pay off until the end?
I agree that's the key question--and why I said that the sparsity of protein space (few functional proteins) makes me think that paths are rare. But again I don't know a way to quantify this. JoeCoder
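The selection scheme described in the comments above can be sketched as a toy Monte Carlo simulation. Only the quoted parameters (four offspring per individual, culling back to constant size, a 10% benefit for the double mutant) come from the thread; everything else here is an illustrative assumption, and in particular the per-site mutation rate is wildly exaggerated so the run finishes in seconds. A realistic rate (~1e-8) would make the waiting time astronomically longer, which is the thread's point.

```python
import random

# Toy sketch of the selection scheme discussed above: every individual
# leaves four offspring, the population is culled back to a constant
# size by truncation ("line them up and cut off the bottom"), and only
# individuals carrying BOTH target mutations get a 10% fitness benefit
# (the intermediate single mutants are neutral).
#
# N and U are illustrative assumptions, not values from the thread.
N = 100       # constant population size
U = 5e-3      # per-site mutation rate per generation (exaggerated)
BENEFIT = 0.10

def fitness(genotype):
    """genotype is a (site1, site2) pair of 0/1 flags."""
    return 1.0 + BENEFIT if genotype == (1, 1) else 1.0

def generations_until_double_mutant_fixes(seed=0, max_gens=5000):
    rng = random.Random(seed)
    pop = [(0, 0)] * N
    for gen in range(1, max_gens + 1):
        # each individual leaves four offspring; each site can mutate
        offspring = []
        for a, b in pop:
            for _ in range(4):
                na = a ^ (rng.random() < U)
                nb = b ^ (rng.random() < U)
                offspring.append((na, nb))
        # truncation selection: shuffle for random tie-breaking, then
        # keep the N fittest to restore the population size
        rng.shuffle(offspring)
        offspring.sort(key=fitness, reverse=True)
        pop = offspring[:N]
        if all(g == (1, 1) for g in pop):
            return gen
    return None  # did not fix within max_gens

gens = generations_until_double_mutant_fixes()
print("double mutant fixed after", gens, "generations")
```

With the exaggerated rate the double mutant appears and sweeps quickly; dropping U toward realistic values makes the waiting time blow up roughly as the inverse square of the rate, matching the back-of-envelope numbers discussed in the thread.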
wd400, 'that paper demonstrates the ways in which ID is disconnected from the science it criticizes' neo-Darwinism is certainly NOT a 'science'. To be correct, it would have been more proper for you to say 'the pseudo-science it criticizes'. There simply is no solid empirical, nor rigid mathematical, basis available that you can cite, despite your bluffs to the contrary, that can establish neo-Darwinism as a proper science instead of the story telling pseudo-science that it is. References will be provided upon your denial of that cold hard fact! bornagain77
Interesting JoeC, but (on first reading) that paper demonstrates the ways in which ID is disconnected from the science it criticizes. The paper sets up this odd special case to test (the rate at which ultra-specific nucleotide strings arrive by mutation and are fixed), throws in some misunderstanding about human and chimp genomes (no one claims most of the differences between our genomes are the result of selection, so the sheer numbers of differences aren't very relevant; the 5% figure includes indels, which are not modeled here...) then sets up a very strange model of selection without even discussing why they didn’t use a more standard one, or what difference a soft selection regime would make.
Of course the big question they don’t answer is–how many two-mutation paths exist? Based on the sparsity of protein space I’d wager very few. But I don’t think we yet have a quantifiable answer. One step at a time :)
I’m not sure exactly what you mean here, but there are plenty of approximately neutral paths through protein space (we need only look at the diversity of our own genomes to know this). This paper does a really neat job of enumerating adaptive paths through an important drug resistance protein in malaria. Adaptations that need specific multiple mutations will obviously be harder to "find", and more so in organisms with small census population sizes. The more interesting question is whether that matters very much -- how often are adaptive traits limited to these ultra-specified paths with no pay off until the end? wd400
Thanks JoeCoder:
The waiting time problem in a model hominin population - 2015 Sep 17 John Sanford, Wesley Brewer, Franzine Smith, and John Baumgardner Excerpt: The program Mendel’s Accountant realistically simulates the mutation/selection process,,, Given optimal settings, what is the longest nucleotide string that can arise within a reasonable waiting time within a hominin population of 10,000? Arguably, the waiting time for the fixation of a “string-of-one” is by itself problematic (Table 2). Waiting a minimum of 1.5 million years (realistically, much longer), for a single point mutation is not timely adaptation in the face of any type of pressing evolutionary challenge. This is especially problematic when we consider that it is estimated that it only took six million years for the chimp and human genomes to diverge by over 5 % [1]. This represents at least 75 million nucleotide changes in the human lineage, many of which must encode new information. While fixing one point mutation is problematic, our simulations show that the fixation of two co-dependent mutations is extremely problematic – requiring at least 84 million years (Table 2). This is ten-fold longer than the estimated time required for ape-to-man evolution. In this light, we suggest that a string of two specific mutations is a reasonable upper limit, in terms of the longest string length that is likely to evolve within a hominin population (at least in a way that is either timely or meaningful). Certainly the creation and fixation of a string of three (requiring at least 380 million years) would be extremely untimely (and trivial in effect), in terms of the evolution of modern man. It is widely thought that a larger population size can eliminate the waiting time problem. If that were true, then the waiting time problem would only be meaningful within small populations. 
While our simulations show that larger populations do help reduce waiting time, we see that the benefit of larger population size produces rapidly diminishing returns (Table 4 and Fig. 4). When we increase the hominin population from 10,000 to 1 million (our current upper limit for these types of experiments), the waiting time for creating a string of five is only reduced from two billion to 482 million years. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4573302/
Quite the finding. :) As to codependent mutations, Behe addressed that, somewhat, here:
Kenneth Miller Steps on Darwin's Achilles Heel - Michael Behe - January 17, 2015 Excerpt: Enter Achilles and his heel. It turns out that the odds are much better for atovaquone resistance because only one particular malaria mutation is required for resistance. The odds are astronomical for chloroquine because a minimum of two particular malaria mutations are required for resistance. Just one mutation won't do it. For Darwinism, that is the troublesome significance of Summers et al.: "The findings presented here reveal that the minimum requirement for (low) CQ transport activity ... is two mutations." Darwinism is hounded relentlessly by an unshakeable limitation: if it has to skip even a single tiny step -- that is, if an evolutionary pathway includes a deleterious or even neutral mutation -- then the probability of finding the pathway by random mutation decreases exponentially. If even a few more unselected mutations are needed, the likelihood rapidly fades away.,,, So what should we conclude from all this? Miller grants for purposes of discussion that the likelihood of developing a new protein binding site is 1 in 10^20. Now, suppose that, in order to acquire some new, useful property, not just one but two new protein-binding sites had to develop. In that case the odds would be the multiple of the two separate events -- about 1 in 10^40, which is somewhat more than the number of cells that have existed on earth in the history of life. That seems like a reasonable place to set the likely limit to Darwinism, to draw the edge of evolution. http://www.evolutionnews.org/2015/01/kenneth_miller_1092771.html
Quotes of note:
"The immediate, most important implication is that complexes with more than two different binding sites-ones that require three or more proteins-are beyond the edge of evolution, past what is biologically reasonable to expect Darwinian evolution to have accomplished in all of life in all of the billion-year history of the world. The reasoning is straightforward. The odds of getting two independent things right are the multiple of the odds of getting each right by itself. So, other things being equal, the likelihood of developing two binding sites in a protein complex would be the square of the probability for getting one: a double CCC, 10^20 times 10^20, which is 10^40. There have likely been fewer than 10^40 cells in the world in the last 4 billion years, so the odds are against a single event of this variety in the history of life. It is biologically unreasonable." - Michael Behe - The Edge of Evolution - page 146 Swine Flu, Viruses, and the Edge of Evolution - Casey Luskin - 2009 Excerpt: “Indeed, the work on malaria and AIDS demonstrates that after all possible unintelligent processes in the cell–both ones we’ve discovered so far and ones we haven’t–at best extremely limited benefit, since no such process was able to do much of anything. It’s critical to notice that no artificial limitations were placed on the kinds of mutations or processes the microorganisms could undergo in nature. Nothing–neither point mutation, deletion, insertion, gene duplication, transposition, genome duplication, self-organization nor any other process yet undiscovered–was of much use.” Michael Behe, The Edge of Evolution, pg. 162 http://www.evolutionnews.org/2009/05/swine_flu_viruses_and_the_edge020071.html
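Behe's "double CCC" arithmetic in the passage above checks out as simple multiplication (a sketch; the 1-in-10^20 per binding site and ~10^40 total cells figures are the ones quoted):

```python
# Behe's "double CCC" arithmetic: two independent binding sites at
# 1e-20 each, versus the quoted bound of fewer than 1e40 cells in
# the history of life.
p_one_site = 1e-20
p_two_sites = p_one_site ** 2   # independent events multiply: 1e-40
cells_ever = 1e40               # quoted upper bound on total cells

expected_events = p_two_sites * cells_ever
print(f"double binding site probability: {p_two_sites:.0e}")
print(f"expected events in all of life:  ~{expected_events:.1f}")
# With fewer than 1e40 cells, the expected number of such double
# events is below one -- the basis for the "edge of evolution" claim.
```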
@wd400 - Nice to see you again. It's a small number, but I'm starting to see more ID papers in "regular" journals than in BioComplexity. Like this one from John Sanford and crew last month that used a simulation in Mendel's Accountant to add more support for having a very long waiting time to get two specific mutations. Of course the big question they don't answer is--how many two-mutation paths exist? Based on the sparsity of protein space I'd wager very few. But I don't think we yet have a quantifiable answer. One step at a time :) JoeCoder
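The "waiting time for two specific mutations" idea mentioned above can be illustrated with a toy Monte Carlo sketch. To be clear, this is not Mendel's Accountant and the per-generation probabilities are made up for illustration; it only shows the general point that sequential rare events have additive expected waiting times (1/p1 + 1/p2 generations):

```python
import random

def waiting_time(p, rng):
    """Generations until an event with per-generation probability p first occurs."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

def two_mutation_wait(p1, p2, trials, seed=0):
    """Mean generations to accumulate two specific mutations, one after the other."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += waiting_time(p1, rng) + waiting_time(p2, rng)
    return total / trials

# Illustrative probabilities only: expected wait is 1/p1 + 1/p2 = 2000
# generations, and the Monte Carlo estimate lands near that.
mean = two_mutation_wait(1e-3, 1e-3, trials=2000)
print(round(mean))
```

If the second mutation is only useful once the first is present (the scenario in the Sanford paper's waiting-time argument), the waits add; if both must appear together in one step, the probabilities multiply instead, which is the far harsher regime discussed in the Behe quotes above.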
Axel, and now, as Monty Python would say, for something completely different:
Fecal mimicry found in seeds that fool dung beetles - October 6, 2015 Excerpt: The researchers found that over the course of a single day, dung beetles moving through the area had grabbed approximately half of the seeds and rolled them to nearby locations, where they subsequently buried them. Dung beetles, as their name implies, normally grab animal droppings and bury them for eating later and for using as a place to lay their eggs. After making the recordings, the researchers dug up all the seeds that had been buried by the beetles and found no trace of dung beetles around, nor any sign of eggs being laid, suggesting the beetles only discovered the ruse after attempting to eat them or when the time came to lay eggs. Thus, the group surmised that the dung beetles had been fooled into carrying the seeds to a distant locale and planting them and had received no reward whatsoever for their efforts. Upon inspection, the researchers noted that the seeds looked a lot like bontebok (a type of antelope) dung -- a closer look also revealed that the chemical composition of the seeds closely resembled dung as well -- which the team suggests means the seeds smelled enough like dung samples to fool the beetles. http://phys.org/news/2015-10-fecal-mimicry-seeds-dung-beetles.html
:) bornagain77
'Darwin’s followers are more apt to believe their own storytelling than reality.' But if pushed, they'll admit reality could be an interesting concept...in principle.... but, but.. in the unreal world.... Axel
wd400 - How many papers are published that support evolutionism? And just where is unguided evolutionary biology? Evolutionary biologists can't even answer any questions pertaining to how biological systems and subsystems evolved. Natural selection can't even be modeled. Unguided evolution is a useless heuristic. So here we are, with a useless heuristic and no answers. Nicely done. Virgil Cain
It's impossible for there to be a decision using the Constitution to censor truth in education. IMPOSSIBLE. The judges are incompetent -- forget them. History will. Take them back to court again and, in front of America, demand that all censorship is illegal when dealing with intellectual conclusions about something in the universe. If they say ID/YEC is religion and censor it in subjects on truth in origins, then the government is saying those same ID/YEC religious conclusions are wrong. The Constitution does not give them this power or allow the government to say religious conclusions are wrong. Drag them into court until the bad guys in history join the other bad guys in history in infamy who silenced truth. Then come to Canada, which makes historic pretence to freedom. Robert Byers
wd400, how many new proteins have been created by unguided material processes over the last 4 decades? The answer is Zero! (see Behe: The First Rule of Adaptive Evolution) How many proteins can we ever expect to be created by unguided material processes over the entire history of the universe? The answer again is Zero! (see Axe, Gauger) It's a bit hard to credit the claim that Darwinism is a proper science when it has no real-time experimental evidence, nor even a rigorous mathematical basis, to back up its claim. bornagain77
How many papers has Bio-Complexity published this year (it seems like none)? How many has the "Biologic Institute" published? It's a bit hard to credit the claim that ID is an actual research program... EDIT: It's also useful to consider the movement's own predictions. We are fast coming up to the end of the decade in which Dembski claimed evolutionary biology would be "dead". Yet here we are... wd400
