
What are the speed limits of naturalistic evolution?


What are the speed limits of naturalistic evolution? We know from experience that it takes time to evolve a species. Would naturalistic evolution be fast enough, in the geological time available, to turn a cow into a whale, or an ape-like creature into a human?

To give an illustration of just how hard it might be to evolve a population, consider that there are about 6.5 billion people on the planet, geographically dispersed. Suppose a single advantageous mutation (say a single point mutation or indel) occurred in one individual among those 6.5 billion people. How long would it take for that mutation to propagate until every human on the planet carried it? A very long time: sufficient numbers of people would have to have their descendants exchange genes for centuries. And this measly change is but one nucleotide out of 3,500,000,000 base pairs!
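To put a rough number on even the most favorable case, here is a minimal sketch in Python. The population size comes from the paragraph above; the selection coefficient s = 0.01 and the perfectly well-mixed population are hypothetical assumptions, and ignoring geography only makes the estimate optimistic. It uses the standard deterministic result that a beneficial allele's frequency grows logistically at rate s per generation:

    import math

    N = 6.5e9           # world population, from the paragraph above
    s = 0.01            # hypothetical selective advantage of the mutation
    p0 = 1 / (2 * N)    # starting frequency: one copy among 2N alleles
    pf = 1 - p0         # frequency at (near) fixation

    # deterministic logistic spread: logit(p) increases by s each generation
    def logit(p):
        return math.log(p / (1 - p))

    generations = (logit(pf) - logit(p0)) / s
    print(round(generations))   # ~4,700 generations, i.e. roughly 2*ln(2N)/s

Even under these idealized well-stirred assumptions the sweep takes thousands of generations; real geographic structure can only slow it down.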

The Darwinists will argue, “but that wasn’t the way it was in the past, it was easier to evolve things millions of years ago.” Perhaps. Evolving a large, geographically dispersed population is a colossal problem for Darwinian evolution, as you can see. Thus (using DarLogic), since Darwinian evolution is true (cough), we must assume populations in the past were much smaller and “well-stirred” (meaning geographic barriers are dismissed and every individual has the same chance of mating with any other member of the population). Bear in mind the population can’t be too small either, since evolution needs a certain number of individuals to generate a sufficient number of beneficial mutations.

Haldane

So given optimal conditions, how fast could we evolve a population? Haldane (pictured above) suggested that, on average, 1 “trait” per 300 generations could be fixed in a population of mammals. In the modern sense, we can take this “trait” to be even a single nucleotide [traditionally we look for phenotypic traits, but the problem of fixing a single nucleotide in the genome still remains; thus, for the sake of analysis, a single nucleotide can be considered something of a “trait”].

But such change is obviously too slow to account for the 180,000,000 differences in base pairs between humans and chimps [chimps have about 180,000,000 more base pairs of DNA than humans; if anyone has better figures, please post]. This poses something of a dilemma for the evolutionary community, and this dilemma has been dubbed “Haldane’s dilemma”. If Haldane’s dilemma seems overly pessimistic, ponder the example I gave above even for a smaller population (say 20,000 individuals within a 200-mile radius). In that light, 1 nucleotide per 300 generations might not seem like a stretch. If anything, Haldane’s dilemma (even by his own admission) seems a bit optimistic!
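The arithmetic behind the dilemma is short enough to check. A minimal sketch, where the 10-million-year timespan and 20-year generation time are illustrative assumptions (this is the calculation behind the figure of roughly 1,667 substitutions that comes up later in this thread):

    years = 10_000_000      # assumed time back to a chimp/human common ancestor
    gen_time = 20           # assumed years per generation
    generations = years / gen_time        # 500,000 generations available
    substitutions = generations / 300     # Haldane: one fixation per 300 generations
    print(round(substitutions))           # ~1,667 fixed nucleotides

About 1,667 fixed changes, set against tens of millions of base-pair differences, is the gap being pointed at.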

Various solutions have been explored to Haldane’s dilemma, such as multiple simultaneous nucleotide substitutions. But such “solutions” have their own set of fatal problems. One could make a good case that Haldane’s dilemma has never been solved, nor will it ever be….

Motoo Kimura
And if Haldane’s dilemma were not enough of a blow to Darwinian evolution, in the 1960s several population geneticists, Motoo Kimura among them, demonstrated mathematically that the overwhelming majority of molecular evolution was non-Darwinian and invisible to natural selection. Lest he be found guilty of blasphemy, Kimura made an obligatory salute to Darwin by saying his non-Darwinian neutral theory “does not deny the role of natural selection in determining the course of adaptive evolution”. That’s right, according to Kimura, adaptive evolution is visible to natural selection while simultaneously molecular evolution is invisible to natural selection. Is such a position logical? No. Is it politically and intellectually expedient? Absolutely….

The selectionist viewpoint is faced with Haldane’s dilemma. But does the neutralist viewpoint (Kimura) offer an alternative mechanism that gets around Haldane’s selectionist dilemma? It seems not, as neutral theory has its own set of problems.
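For reference, the neutralist bookkeeping works differently from Haldane's. The classic neutral-theory result is that the long-run substitution rate per site equals the mutation rate, independent of population size. A minimal sketch, with hypothetical placeholder values for population size and mutation rate:

    N = 10_000      # hypothetical diploid population size
    mu = 1e-8       # hypothetical neutral mutation rate per site per generation

    new_mutants = 2 * N * mu    # new neutral mutations arising per site per generation
    p_fix = 1 / (2 * N)         # fixation probability of a single neutral copy
    print(new_mutants * p_fix)  # ~1e-08: N cancels out, rate = mu

Such fixations carry no selective cost, which is how the neutralists sidestep Haldane's limit; the trade-off is that they are unguided, which is the very point at issue below.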

What has since resulted has been a never-ending war between the selectionists and the neutralists. The selectionists argue that natural selection shaped the majority of molecular evolution, and the neutralists argue natural selection did not. Each warring camp finds fatal flaws in the ideas of its opponent. The neutralists rightly argue from first principles of population genetics that selection did not have enough resources to evolve billions of nucleotides, and the selectionists rightly point out that large amounts of conserved sequence fly in the face of neutralist theories. The net result is that between them, the two camps demonstrate that both are dead wrong.

To make matters worse, there are even more dilemmas to deal with, such as Nachman’s U-Paradox. Looming on the horizon, and even more problematic, is the fact that DNA might be only a FRACTION of the actual information that is used to create living organisms. This idea of the organism (or at least a single cell), rather than the DNA alone, being the totality of information was suggested by one of the finest evolutionary biologists on the planet, Richard Sternberg. He argues his case in the peer-reviewed article Teleomorphic Recursivity. And by the way, these discussions of selectionist speed limits assume the multitude of Irreducibly Complex structures in biology are somehow visible to natural selection….

What then will we conclude if we find functionality in those large regions of molecules which evolved independently of natural selection? How do we account for designs that cannot possibly be the result of natural selection? Can we attribute them to the random chance mechanisms of neutral theory? Unlikely. Evo-devo might offer some relief, but its proponents do not yet seem to realize that even if they are right, the ancestral life forms might have to be in a highly improbable, specified state, exactly the kind of state that suggests front-loaded intelligent design.

I’m opening this thread to continue a discussion of these and other topics which I also raised at PandasThumb in this thread. I found a commenter named Caligula who gave very substantive criticisms of my ideas in a precise and technical manner, which I found worthy of a fair and civil hearing here at UD. I also invited the authors at PT to air their objections here (with the exception of PvM, who has been banned). If Caligula and I must take the discussion outside of UD, we will be glad to, but I thought the topics would be of interest and educational to readers of both weblogs.

With that, I’ll just let the conversation continue in the comment section as we try to answer the question, “what are the speed limits of naturalistic evolution?”

Salvador
PS Two books relevant to this discussion by ID proponents are Genetic Entropy by respected Cornell geneticist John Sanford, and The Biotic Message by electrical engineer and pioneer of Discontinuity Systematics, Walter ReMine.

Comments
Here's a paper by Grant and Flake from 1974 that addresses "soft selection": http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=434284 Go to the "Full Text" section and click on "complete text". ReMine, in his Appendix, completely dismantles the argument Grant and Flake make. PaV
Also, for the general case that ReMine formulates, the limit is based on reproductive excess rates and the assumption of serial substitution (i.e. no two substitutions overlap during the process). Atom
PaV: In his clarification paper ReMine provides all the formulas necessary to derive Haldane's formula for the specific case and calculate the numbers. You seem pretty adept at understanding this topic; would you take a swing at calculating the numbers? Atom
Sal: "I’m tempted to say “soft selection” is thus superflous (no added insight) to the idea of changes in gene frequency. At issue is whether it leads to faster substitution rates, and whether Haldane’s model includes population behavior that can be characterized as soft selection." Here's a reference to "soft selection": http://www.blackwellpublishing.com/ridley/a-z/Hard_&_soft_selection.asp In a nutshell, as you'll find on Ridley's citation above, if you have N adults in a population, with each female giving 2F eggs on average, then FN-N progeny must be eliminated to keep the population at a constant size. So, then, the question is this: does "hard selection" (the kind that is due completely to lesser fitness) elimnate all the FN-N, or does "soft selection" (the kind that is due to 'density-dependent' factors such as disease and competition for food and space which whittle down 'reproductive excess')? Obviously, it's a combination of both. Prescinding from this proposed division between hard and soft, why don't we just look at the big picture here. Haldane finds out about Kettlewell's experiment. At last, he thinks, evolution that can be seen right before our eyes. He then reasons that since this is so, we can rightly assume this is as 'fast' as evolution can go. He then develops an equation to address Biston betularia and suggests that this represents a fitness of 0.31, meaning that only 31 out of 100 organisms will survive to the juvenile state (which is a quite high loss of reproductive capacity) corresponding to 43 generations. I think the starting point here is that nature itself is setting an upper limit on the speed of evolution, which is 43 generations, since no other examples of this type occur. (As I pointed out to Caligula above, the Galapagos finches happens so fast that it cannot be evolution at work). The problem then becomes this: how does Haldane go from 43 generations (fitness= 31 out of 100), to 300 generations (fitness =90 out of 100). Here's all he says in the paper: "I doubt if such high intensities of selection (speaking of Kettlewell's experiment) have been common in the course of evolution. I think n=300, which would give I=0.1, is a more probable figure." This is not a lot to go on. But that's where Walter Remine's latest paper comes in. In it he re-interprets Haldane using his own concept of "reproductive excess"; but he, I don't believe, has not given us numbers yet. I think he's working on that as we speak. PaV
Reading through ReMine's paper, I notice he sets out to characterize the Cost of Substitution of a new trait by focusing on the new trait's reproductive excess, without having to take into account the reproduction of the old trait. This simplification simply asks "to get a trait from one individual to one million, how much reproductive excess is needed per generation, given N generations?". From there, he shows that this can usefully set a speed limit on evolution (or indeed on any form of replication). Dealing with soft selection he writes: "These optimally low costs require the trait to have a constant growth rate throughout all generations. Since nature does not provide this constancy, real cases will always have higher costs. Also, if the new trait decreases even momentarily, then the total cost increases, because some costs will be incurred more than once. Some theorists believe cost problems can be solved by a non-constant growth rate—such as frequency- or density dependent fitness, as employed in soft selection. But that does not reduce the problem, at least not for single substitutions, as shown above. Rather, constancy is required to minimize the problem, as it allows the lowest possible total cost for a substitution of any given duration." (From http://saintpaulscience.com/CostTheory1.pdf page 5) Atom
The posts seem mixed up right now. I assume the following quote--not seen before--is from Caligula: "It is this density effect that soft selection is based on. When e.g. 6 offspring are produced per couple, and the ecological niche is filled by the population, unavoidably 4 of the 6 offspring will die. Now, Haldane would apparently assume that these 4 offspring somehow always die before selection is applied. But this hardly holds universally. More often selection (e.g. predation) and background mortality (e.g. sheer starvation) apply in parallel. And when they do, selection reduces background mortality, by reducing density." I think you're looking at it from one point of view only, and not from the point of view that Haldane was looking at it. What I mean is that you're looking at how selection affects background mortality and not the other way around; i.e., how background mortality affects selection, which is exactly what Haldane did. In passing, he stated the obvious (given that most predation, diseases, etc., affect individual organisms randomly, and thus NOT selectively): that density-dependent factors "slow down" evolution; i.e., the randomness of death that DDF involve slows down the selection of a particular loci. Haldane, impressed with this example from nature of "evolution in action", wants an answer to the question of 'how fast can evolution act'. His answer: 300 generations per loci. (And, by extension, more generations than this if DDF are involved.) PaV
Soft selection is an important topic. I don't even have it explicitly stated as "theory" in my population genetics book by Hartl and Clark. Anyone have a recommendation? If I may offer a quick speculation as to why: say a population of 30 has the following:
10 individuals with trait A at loci #1
10 individuals with trait B at loci #2
10 individuals with trait C at loci #3
Each set of 10 individuals is disjoint from the other sets. Let's suppose all the individuals with trait B at loci #2 fail to reproduce into the next generation. Whatever happens to individuals with traits A and C might be characterized as soft selection, or as a change in gene frequencies, which is well described by existing PopGen literature. I'm tempted to say "soft selection" is thus superfluous (no added insight) to the idea of changes in gene frequency. At issue is whether it leads to faster substitution rates, and whether Haldane's model includes population behavior that can be characterized as soft selection. Another thing: how can soft selection be modelled? Trait B not reproducing can alternatively be modelled as trait B being deleterious, or causing a reduction in fitness with respect to a genotype! Some of these discussions remind me of how in math and physics we regularly picked the most convenient coordinate system to solve problems. There was the problem of which perspective was the most practical, not that any perspective (as long as it was faithful to the problem at hand) was wrong. One thing I sense is that we are confronted with a similar problem here. Is soft selection another "coordinate system" for something we have already dealt with, or has it been implicitly explored in existing literature? scordova
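A minimal sketch of the equivalence suggested above, as a toy resampling model (the traits and numbers are the hypothetical ones from the comment; "trait B fails to reproduce" is implemented simply as fitness 0 for B, showing the two descriptions coincide):

    import random

    def next_generation(pop, fitness):
        # offspring are sampled in proportion to parental fitness
        weights = [fitness[t] for t in pop]
        return random.choices(pop, weights=weights, k=len(pop))

    pop = ['A'] * 10 + ['B'] * 10 + ['C'] * 10
    fitness = {'A': 1.0, 'B': 0.0, 'C': 1.0}   # B leaves no offspring
    pop = next_generation(pop, fitness)
    print({t: pop.count(t) for t in 'ABC'})    # B is gone; A and C fill the gap

Whether one calls this "soft selection" or simply a change in gene frequencies, the model and the outcome are the same.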
Sal, you're welcome. What those who don't have the .pdf paper don't realize is that it is a digitized copy of his original paper, which is a bit grainy to start with. So a copy of that pdf, well, that must have been some kind of bad. Hope your eyes get better! ;) PaV
Caligula: "A locus (pl. loci) is a location in coding DNA where diploid organisms have two gene alleles, one in each strand." It's clear from Haldane's paper that a 'loci', to him, was an "allel" (sic. his spelling); I just wanted to side-step the whole SNP's versus an "allel" type discussion. "There is no biologically sound reason to always assume that, when population size remains constant, (most of) reproductive excess is randomly removed before selection is applied. Rather, density-dependent deaths like starvation may well take place in parallel with selection." Indeed, Haldane acknowledges that selection and density-dependent factors can act in parallel. He points out that even density-independent factors and selection can act in tandem, and giving an example of this for Biston betularia. However, your saying, "There is no biologically sound reason to always assume that, when population size remains constant, (most of) reproductive excess is randomly removed before selection is applied," is completely puzzling to me since, as I read the paper, the very reason that Haldane uses juveniles is--as I pointed out in my last post--so that his equations will highlight evolution working at its fastest. IOW, if he included the diminishment due to density-dependent factors into his equations, then the "intensity of selection" would have to be less, which in turn means "n" would have to be even higher than 300, which here you've already argued is too high. Hence, my puzzlement. So I think you should be happy that he keeps it out (of course, I'm also sure this made his equations a lot easier to handle!). "However, I must defend Walter’s cause here. I can’t remember if 20 years is the correct time interval. But since it is the average beak size that oscillates, my guess is that the frequencies of competing alleles oscillate at one or more loci, without any complete substitution ever taking place." I think I'm in relative agreement with you on this point. I think there is, in fact, an "oscillation" taking place. However, we're dealing--from what has been reported--with "fixation", since I don't remember the Grants saying that, e.g., 55% of those formerly with small beaks how have larger size beaks, but that a complete transition from smaller to larger had taken place; i.e., 100% of the smaller were now larger. This means that "evolution" is not taking place, but something else entirely, since, for 10 generations, as I calculated it in the prior post, means 95% of one generation of such finches would die off--a fact that would have been splashed all over the headlines of newspapers the world over. Even if we take n=20 (generations=years), fitness=22%. IOW, 78% of the finches would have to die off each year. Again, this is stuff for headlines, wouldn't you agree? I thank you for your questions PaV, but it really is time for me to say goodbye to this forum." Sorry to see you go; I was ready to propose that we leave Haldane behind and head right to Remine's paper since he deals with the kinds of "density-dependent" factors that Haldane leaves off the plate. I, thus, thought we might, using Remine's paper, circumvent some of the uncertainty that Haldane's paper doesn't explicityl resolve. As to PT: well, you can't have a civil discussion over there, so you won't find me going there. PaV
I would very much appreciate a simple example of "soft selection" worked out with numbers, since you feel it solves Haldane's Dilemma. In my mind, I seem to think that differential reproduction and replacement is what the dilemma deals with, so soft selection doesn't help, but maybe I'm understanding it wrong. Perhaps you can clarify what I misunderstand. Say you start with 10 replicators. That is your population. One of them gets a variant trait and now will begin to spread that trait throughout the population, if the trait helps it to do so by upping its replication rate, by allowing it to produce a net of more copies than its 9 other competitors. Let's say the replicator with the good trait has 4 offspring, and they all have the trait, and all reach the reproduction stage. Without the trait you are only likely to have 2 offspring reach that stage. In my mind, this would allow the replicators with the trait to eventually replace those without it. Now, when you bring in soft selection, from my understanding what you're saying is that instead of the replicator in question "breeding true" the trait, perhaps half his copies have the trait and half don't. Those with it survive better and reach replication age, and the process repeats. This, to me, would actually hinder the spread of the trait, not speed it up. Perhaps you can clarify what you mean. If you instead mean that the replicator has 6 copies, 2 of which would be eliminated by environment or other causes, and only those with the trait would remain (they are "soft-selected"), I still end up with a net of 4 copies with the trait, ready to spread in the population. So unless soft selection can raise the relative number of copies per replicator with a trait (i.e. improve fitness) against competitor replicators, I don't see how it can help the problem, let alone solve it. Which is why I request your clarification. Atom
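A minimal sketch of the replicator scenario above, using the comment's own numbers (4 offspring per carrier vs 2 per non-carrier; growth is left unchecked for simplicity):

    # 1 replicator with the trait vs 9 without it
    carriers, others = 1, 9
    for gen in range(1, 6):
        carriers *= 4                 # carriers net 4 copies each
        others *= 2                   # non-carriers net 2 copies each
        share = carriers / (carriers + others)
        print(gen, round(share, 2))   # trait's share: 0.18, 0.31, 0.47, 0.64, 0.78

The trait spreads only because the per-replicator copy number differs; relabeling which deaths are "soft" does not by itself change that net rate, which is the question being posed.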
PaV, Thanks for the link to Haldane's paper!!!! My copy is a 2nd-hand PDF of a scanned xerox copy. I could hardly read it! Sal scordova
Caligula, many lurkers here (like myself) are learning from your posts and enjoying the open, calm discussion of the problem at hand. Your presence here is greatly appreciated; I would love it if you continued. Atom
PaV: "for the fixation of what, in Haldane’s day, they called a loci" A locus (pl. loci) is a location in coding DNA where diploid organisms have two gene alleles, one in each strand. Humans are currently estimated to possess 20,000-25,000 loci. The term locus is still in wide use (at least in Finland, it is introduced in high school biology). A substitution is the fixation of a new beneficial allele at a locus. "We have to remember that Haldane was focusing ONLY on juveniles" And he wouldn't have allowed us to go any further. Haldane uses a common oversimplification where viability is tested entirely during the juvenile phase. After that, survivors get the whole "cake", i.e. they get to reproduce in peace and refill the population size with their offspring. However, Haldane's discussion on density does involve two successive phases: 1. "background mortality" applied randomly, and 2. viability test or selection. This may apply well to moths, but it is hardly considered to apply universally. There is no biologically sound reason to always assume that, when population size remains constant, (most of) reproductive excess is randomly removed before selection is applied. Rather, density-dependent deaths like starvation may well take place in parallel with selection. As for your example on finches. I think such attempts, if successful, would mainly cause headache for e.g. Walter ReMine. If we truly did know from direct observation that substitutions easily take place in mere 10 generations, ReMine's "magic number" would be devastated; it is Walter who insists that the 300 generation limit is universal and unavoidable, not modern population genetics. However, I must defend Walter's cause here. I can't remember if 20 years is the correct time interval. But since it is the average beak size that oscillates, my guess is that the frequencies of competing alleles oscillate at one or more loci, without any complete substitution ever taking place. You should verify it from research, of course. I thank you for your questions PaV, but it really is time for me to say goodbye to this forum. You can reach me at PT if you wish to. caligula
PaV: Kettlewell....finches
I would argue that even these examples are suspect. In these cases there was no overtake; the "favored" trait did not have a monopoly. How long would it have taken, if ever, for overtake to happen? Darwinists have used these examples to try to illustrate rapid change: a change in frequency, yes, but not a fixation event. These are inappropriate counter-examples to Haldane's dilemma. Haldane's dilemma involves fixation; these cases were not fixation events. scordova
Sal, I do have Haldane's paper. It can be downloaded free in .pdf format: http://www.blackwellpublishing.com/ridley/classictexts/haldane2.pdf I do have a degree in biology, but I've never used it. Population genetics is something I've looked at trying to figure out how evolution might solve the macroevolution process, but no more. But I must add, Haldane did presume an "intensity of selection" of 0.1, as Caligula points out; which, in turn, means he selected the figure of 300 generations for the fixation of what, in Haldane's day, they called a loci. Caligula says that the 0.1 intensity is way too low. Yet I think we have to value Haldane's perspective somewhat. We have to remember that Haldane was focusing ONLY on juveniles, so that whatever juveniles survived this first 'fitness test' would then be subject to predation and any other density-dependent conditions that would exist. Also, Haldane seems quite influenced in his re-calculation by the then just-published Kettlewell experiment--which, of course, has questions of its own. However, the point to be taken here is that Haldane took note of this "evolution in action" experiment of Kettlewell: IOW, evolutionists tell us that "evolution acts too slowly for us to take note of it", but now here's evidence of "evolution in action", and so Haldane calculates how "slow" evolution normally works using, I believe, the Biston betularia experiment as an upper limit. So 43 generations--the number from his paper--represents an "upper limit" for how fast evolution works. Obviously, then, anywhere else that evolution might be at work, it must be working more slowly than in the Kettlewell experiment, or else we would "see" it. If we go one step further, we can analyze the Galapagos finches example of "evolution at work". There, if I remember correctly, in a twenty-year period the finch beak size went from large to small, and then back to large (relatively speaking). So that represents 20 generations. But we had--using Haldane's language--TWO loci changed; i.e., first the population moved in one direction, and then it moved back in the other direction. So that means 10 generations per loci fixation. Using Haldane's formula, that means their "fitness" is e^-3, or 5%, meaning that 95% of the population would have to 'die off' each generation for the 'loci' to be fixed. I think the zoologists there on the islands would have noticed that 95% of a species of finch had died. So, evolution, per Haldane, cannot realistically explain this other example of "evolution in action", I'm afraid. In the meantime, we can negotiate, perhaps, with Caligula on some general figure for fitness. Finally, as a note in the history of science, I was reading a paper the other day by a prominent microbiologist who referenced the great excitement occurring in the early 50's when Monod and others began studying bacteria and discovered the "logarithmic" phase of bacteria. For them, too, this was "evolution in action". Then came Kettlewell's experiment, and by the end of the 50's Darwinism/Modern Synthesis was doctrine, since there was so much 'evidence' of it at work. Nevertheless, "Haldane's Dilemma" remained unanswered. But perhaps Caligula can ride to the rescue. We await. PaV
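A minimal sketch checking the numbers above against Haldane's own formulas (D = 30 is Haldane's total cost of a substitution in units of population size; I = 30/n is his intensity of selection, and e^(-30/n) the corresponding mean fitness):

    import math

    D = 30                             # Haldane's total cost of substitution
    for n in (300, 43, 10):            # generations per substitution
        I = D / n                      # intensity of selection
        fitness = math.exp(-D / n)     # mean fitness during the substitution
        print(n, round(I, 2), round(fitness, 2))
    # n=300 -> I=0.1, fitness ~0.90; n=43 -> I~0.70, ~0.50; n=10 -> I=3.0, ~0.05

The n=43 row reproduces Haldane's S=1/2 example discussed elsewhere in this thread, and the n=10 row gives the ~5% fitness figure used for the finches above.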
PaV: Apparently I somehow deleted a large portion in the middle of my last writing before posting. You seem to be confusing D=30 (the number of population sizes lost to selective deaths) with Haldane's eventual "magic number" n=300 (substitution time in generations). So the process takes 300 generations, during which 30 times the population size is lost to selective deaths. Were Haldane's "magic number" really n=30, ReMine's respective "magic number" would be 16,667 and not 1,667. With n=300, I=0.1. Haldane's treatment of density consists of a single example, where larvae in an overcrowded population are harvested by parasites, against which they apparently can't evolve. During this phase, the larvae are assumed not to be subject to any kind of selection. Then, early adult (or late juvenile) moths are subject to predation (selection) but not to effects of density. Haldane omits a qualitatively different density-dependent factor in such a treatment: ecological niches are finite. A growing population will eventually meet a limit where its environment can't provide viable room for more individuals (especially with territorial animals). I.e. excess ("background") mortality will follow. It is this density effect that soft selection is based on. When e.g. 6 offspring are produced per couple, and the ecological niche is filled by the population, unavoidably 4 of the 6 offspring will die. Now, Haldane would apparently assume that these 4 offspring somehow always die before selection is applied. But this hardly holds universally. More often selection (e.g. predation) and background mortality (e.g. sheer starvation) apply in parallel. And when they do, selection reduces background mortality, by reducing density. In summary: Haldane seems to assume that background mortality is somehow fixed, and whatever selection takes place comes at the risk of extinction. Modern population genetics argues that selection reduces density, thus reducing background mortality, and effectively making "room" for itself. caligula
It is rare, but there is a truly decent comment at PT regarding the issues being raised. Passing reference has been made to Nachman's paradox. The following post by David Wilson was so impressive, I thought I would reference it here: Comment 156785 on Nachman's U paradox. I could not post portions of it here since it was math-symbol intensive..... Sal scordova
PaV, You seem quite knowledgeable about this topic. I presume you have Haldane's paper. I would like to keep this discussion going. Do you have a background in Population Genetics (PopGen)? Sal scordova
The intensity of selection calculation I gave above is wrong. Haldane defines the "intensity of selection" in two ways: as ln(s_optimal / S_beginning), and, after his calculations, as 30/n. Thus, if n=30, I=30/30=1.0. So the "intensity of selection" is not 0.43, but 1.0. We would all agree that's quite "intense". In fact, I don't see how it could get any higher, since 1.0 means every carrier of the allele is killed. PaV
Caligula, thank you for your response. "If you look at “Discussion” in Haldane(1957), you see that he estimates intensity of selection with I=30/n. He first calculates that if n=43 => I=0.67, and then suggests conservative values n=300 => I=0.1. Intensity of selection, when the fitness of the optimal genotype is assumed 1.0, as it is here, can be directly subtracted from 1.0 to get an estimate of the average fitness of the population at the beginning of his scenario." Fitness is defined as e^(-30/n). Intensity of selection is I = ln(s_optimal / S_beginning). Only if "s" is small can you substitute s_optimal minus S_beginning for intensity. Here the optimal is assumed, as you say, to be 1.0. You divide that by e^(-30/n) and take the natural log, and you get I = 30/n. In his example, he is using S=1/2, while s_optimal is 1.0. This is then ln(1/0.5) = ln 2 = 0.69. If I = 0.69 = 30/n, then n = 43 generations. BUT the "intensity of selection" is NOT 0.1, but 0.31 and, as Haldane states, of the order seen in Biston betularia. As I calculated above, for 30 generations, I = 1/2.33 = 0.43. This is NOT a small "intensity of selection." ". . . Rather, I’m saying Haldane is implicitly adding together the cost of selection and background mortality, which is likely why his intensity of selection is so low. Modern population genetics argues that background mortality may well decrease when selection increases, because selective deaths decrease density. (Most density factors don’t apply to larvae, after all!)" Indeed, I don't believe Haldane is lumping background mortality with the "cost of selection". In fact, he is explicitly using "juveniles" so as to 'back out' "density-dependent" factors. Why does he do this? He says: "Negative density-dependent factors must, however, slightly lower the overall efficiency of natural selection in a heterogeneous environment. If as the result of larval disease due to overcrowding the density is not appreciably higher in a wood (type of tree) containing mainly carbonaria than in a wood (type of tree) containing the original type, the spread of the gene C by migration is somewhat diminished." So, basically, he's saying that when density-dependent factors are involved--which they would be in the case of adults--this makes NS work more slowly. Likewise, if selection is quite high, it would in turn slow down background mortality. In the example he gives, he has 90% of the Biston betularia larvae being killed by a parasite. Likely "background mortality" would be diminished by at least 50% or so in such a situation. But here Haldane is concerned with how "fast" evolution can work, so he uses juveniles so as to not have the "background mortality" slow it down. PaV
Caligula, it seems that you remain too busy to read or address my arguments,
I have reviewed this thread, which I began Friday. A discussion of this complexity will not go quickly, and I mentioned that numerous times. The other issues presented here are relevant to your points. I have addressed the issue of the 6% figure at least three times now, the last time citing a PNAS paper saying the 1.5% figure is probably in error, not to mention that the GenBank numbers suggest 6% as well. I have invited a correction of this figure... You criticized my interpretation of the Nachman paradox earlier. It appears you have reversed one of your positions on the paradox. Was my original interpretation correct after all? I spent a few hours reviewing the equations, to begin a defense of my interpretation. Thus silence on my part does not mean I was too busy to address your objections. I spent a few hours looking into junkDNA and deeply conserved regions to argue that the non-coding regions may have significant functional regions, in addition to the fact that deep conservation exists. I say this because your remark about me being too busy is not completely fair. These issues will not be resolved easily and require thought and study. And in fact I have begun responding to your points. I appreciate your willingness to acknowledge when there is an open question. I'm not so sure that soft selection cures Haldane's dilemma. At PT, you said ReMine did not address soft selection. Elsewhere here I was about to copy ReMine's response to the soft selection "fix". I am also awaiting Mike Dunford's input on the issue. scordova
PaV: Before I leave, I think your post requires a response. If you look at "Discussion" in Haldane(1957), you see that he estimates intensity of selection with I=30/n. He first calculates that if n=43 => I=0.67, and then suggests conservative values n=300 => I=0.1. Intensity of selection, when the fitness of the optimal genotype is assumed 1.0, as it is here, can be directly subtracted from 1.0 to get an estimate of the average fitness of the population at the beginning of his scenario. (With multiple loci, you can just sum up the coefficients, because multiplicative fitness and additive fitness are approximately the same when I << 1.) I'm not saying anything opposite to what Haldane says. Rather, I'm saying Haldane is implicitly adding together the cost of selection and background mortality, which is likely why his intensity of selection is so low. Modern population genetics argues that background mortality may well decrease when selection increases, because selective deaths decrease density. (Most density factors don't apply to larvae, after all!) caligula
It seems that you remain too busy to read or address my arguments
I can understand how you may perceive it that way, but I had mentioned in advance that I thought the thread would take a while (perhaps weeks) to address the issues. This has only been the first week. The issue of soft selection related to this thread is being discussed, and as that discussion reaches resolution, your arguments about soft selection might get addressed.
I thank you for the time you had to spare for this discussion.
Thank you for your time as well, Caligula. scordova
LOL @ DS Atom
Whoa! This article somehow jumped back up near the top of the page. I'm just guessing here but that might've happened as a result of me changing the timestamp on the article from 1/18/06 to 1/23/06. Weird, huh? ;-) DaveScot
I should have pointed out as well that Haldane was discounting "density-dependent" forces (which slows things down further), and that his model is based on juvenile viability. PaV
Caligula: "Haldane’s limit 0.1 is quite small, isn’t it? Surely, a population can compensate a much higher rate of mortality than 10% with reproductive excess?" I believe you're misrepresenting what Haldane asserts. His 10% was not added mortality, but was a fitness level. His mathematics point out that for diploids, depending not so much on the 'intensity of selection', but more on the initial frequency of the 'loci', you get a number from 10 to 100 added deaths; he took 30 as some reasonable number of deaths, or equivalently, generations. Now, in the discussion section he defines "fitness" (population fitness) as I=e^-30/n, where n=the number of generations. If you reduce "n", then the fitness of the population is lowered,and, hence, its susceptibility to extinction thereby increases. Using n=30, Haldane's number, I=e^-1, which is equal to 1/2.33=.43, or 43% fitness. That means that in one generation 43 out of 100 of the species will die. Haldane used 30 generations--probably a reasonable number (the number varies depending on whether the 'loci' is dominant or recessive). Even using the lowest figure of 10 generations, then fitness = I=e^-10/n, which then implies that even at 10 generations, for the 'loci' to be fixed, each generation has to be reduced by 43%. As to "soft selection", Haldane says that "density-dependent" selective forces reduce the speed at which evolution by NS can take place. I think you're implying the opposite. PaV
Sal, It's time to say farewell for the time being. It seems that you remain too busy to read or address my arguments, and I feel I have said what I have to say about yours. The thread seems to have been recreated on the front page, and I was told that I contribute nothing there and that I'd better be banned. I complained that GilDodgen tries to present a mathematical argument without any math in it, which was declared uncivil behavior. I thank you for the time you had to spare for this discussion. caligula
At PT, I have asked you to give me a pointer to whatever other sources you may have for this belief,
Caligula, I know there has been a flurry of posts, so you probably missed that I gave my answer several times. I just want to put to rest before the readers that I have responded to your query at least 2 times. I recognize there has probably been a misunderstanding somewhere; on the other hand, I wish to reassure the readers that I have not evaded the issue. I gave a link to GenBank: chimps have 180,000,000 more base pairs than humans. Homo sapiens genome size = 3,400,000,000. Pan troglodytes = 3,577,500,000. See: Genbank. The difference between Pan and Homo is about 180,000,000 just based on base-pair count, and that's about 6%. This comparison involves nucleotides that are non-coding. I argue the non-coding 6% needs to be accounted for. Also, see this article from PNAS: Divergence between samples of chimpanzee and human DNA is 5% counting indels
Five chimpanzee bacterial artificial chromosome (BAC) sequences (described in GenBank) have been compared with the best matching regions of the human genome sequence to assay the amount and kind of DNA divergence. The conclusion is the old saw that we share 98.5% of our DNA sequence with chimpanzee is probably in error. For this sample, a better estimate would be that 95% of the base pairs are exactly shared between chimpanzee and human DNA. In this sample of 779 kb, the divergence due to base substitution is 1.4%, and there is an additional 3.4% difference due to the presence of indels. The gaps in alignment are present in about equal amounts in the chimp and human sequences. They occur equally in repeated and nonrepeated sequences, as detected by REPEATMASKER (http://ftp.genome.washington.edu/RM/RepeatMasker.html).
If I have misinterpreted something here and 6% is wildly off, I welcome a correction. For the sake of argument I can accept a 1.5% difference, but let's not throw out all the other data points just yet that point to a 6% difference. Sal PS I thank you for your meticulous examination of everything and your willingness to keep this backpage thread alive on this important topic. scordova
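A minimal sketch of the arithmetic behind the figures quoted above, using only the GenBank totals and PNAS percentages given in this comment:

    human = 3_400_000_000    # Homo sapiens genome size quoted above
    chimp = 3_577_500_000    # Pan troglodytes genome size quoted above
    diff = chimp - human
    print(diff, round(diff / human * 100, 1))   # 177,500,000 bp, ~5.2%

    # PNAS sample: 1.4% base substitutions + 3.4% indels over 779 kb
    print(1.4 + 3.4)                            # 4.8% total divergence

Both routes land in the 5-6% neighborhood, which is the point being argued over the 1.5% figure.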
Sal: "But for starters if we have 6% difference between humans an chimps" At PT, I have asked you to give me a pointer to whatever other sources you may have for this belief, as I can't find any. As for the SciAm article, even Patrick points out, using a quote, that your reasoning fails. See #8 and #51. "200,000 generations? .... supposedly we ought to have lots of neutral-substitution awaiting to be fixed??? I’ve not even seen this question raised." This is because you insist on viewing our entire civilization as one wild population. Few scientists probably do. See #43 again. caligula
Sal, "I do not think soft selection gives a figure for the number of nucleotide substitutions per generation. I do not think it addresses fixation rates at all [can anyone clarify]." Soft selection specifically addresses fixation rates, although it is dependent on how much density-dependent mortality there is. An adaptation is typically a point mutation, but it can also involve longer DNA sequences, such as gene duplicates subject to selection. "I would like to point out Darwin argued for slight successive modifications. His comment was not very quantitative at all." Darwin knew not about Mendelian genetics nor DNA. "The nucleotide is an objective measure of “slight successive”,--" It is hardly objective, as an adaptation can well constitute of larger chunks of DNA without violating gradualism. "--and when put under quantitative scrutiny, the idea seems to fail." I fail to see why. "Consider John mutates a beneficial trait and James mutates a benefical trait. They cannot simultaneously FIX the trait into the next generation even tried to do so by mating with the same female. Haldane pointed out therefore that the dispersed traits throughout a population must be fixed one at a time." Haldane neither implicitly assumed nor "pointed out" any such thing. Just ask ReMine, who is quite frustrated with the claim that Haldane failed to address concurrency. Haldane did account for it, but he demonstrated that concurrency per se doesn't bring much into the equation, unless intensity of selection can grow. caligula
ReMine-3. Concerning "adding up". Yes, cost of mutations and cost of selection are separate things, and these costs add up when calculating the cost of evolution. "Add up" was a poor choice of words from me. What I meant and mean is that Haldane's dilemma and Nachman's paradox are not separate problems in the sense that they require different means to solve. (E.g. density-dependent selection applies to both.) Also, if we assume that our mutations were at an equilibrium and that U=3, Nachman's paper would suggest that Haldane/Kimura's models have serious defects, because they contradict our direct observations. However, since it seems Nachman(2000)'s value for U should be adjusted to U=1, this latter claim of mine does not hold water anymore. Here's a short example. Assume the average hominid couple produced 10 offspring. Thus, the reproduction rate is 5.0. The costs for continuity and mutations total 3.0, as formulas from Nachman(2000) suggest with updated data, when harmful mutations are at an equilibrium. Then a reproductive excess of 2.0 would result, unless other costs can be applied. With 2.0 reproductive excess, or tripled population size each generation, the population suffers severe excess mortality due to density. "Background mortality" would be 67% (6 offspring per couple, but viable room for only 2). This would create a buffer for selection pressures with a combined fitness effect of 0.67, without further increase in mortality. As opposed to Haldane's estimate of 300, there would be one substitution per about 45 generations. In 300,000 generations, this would produce more than 6600 adaptations. Additionally, intraspecific competition could produce many more adaptations on top of that number, depending on how influential we considered that scenario to be. A "dilemma" would follow if we found the final number unable to account for the morphological differences between modern humans and their ancestor some 6 million years ago. Also, a "paradox" could follow if it turned out that Nachman's estimate of the fraction of the genome under constraint was much too small. caligula
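A minimal sketch of the budget in the example above, using only the comment's own assumptions:

    import math

    offspring_per_couple = 6    # offspring left after other costs: 6 per couple
    room = 2                    # viable room: 2 offspring replace the couple
    background_mortality = 1 - room / offspring_per_couple   # 4 of 6 die, ~0.67

    # treat that density buffer as intensity of selection available for substitutions
    D = 30                                  # Haldane's cost per substitution
    n = D / background_mortality            # generations per substitution
    print(round(n), round(300_000 / n))     # ~45 generations; ~6,667 substitutions

This reproduces the "one substitution per about 45 generations" and "more than 6600 adaptations" figures above.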
ReMine-2. I must start by saying a few things. I first read Nachman's paper a few days ago, as Sal remembers, and I have originally misunderstood one thing and omitted another thing. Nachman's multiplicative fitness formula e^-U is for calculating the impact of deleterious mutations after the mutation rate and purifying selection have reached an equilibrium concerning the frequencies of harmful mutations. In other words, the fitness impact is not immediate, as opposed to what I have claimed both here and at PT. I was already corrected on this by David Wilson at PT, yet I failed to appreciate the whole impact of his very informative correction, especially in my response to JGuy concerning how fast Nachman's paradox realizes in all severity. However, I hope my mistake has not caused great harm. Although I indeed have previously assumed that Nachman's paradox always applies in all severity on population size, this assumption only maximizes the paradox to be solved by population genetics. That is, if hominids were at equilibrium, the cost of mutations would be at its maximum. The thing that I have omitted is that Nachman's paper, being released in 2000, does not yet incorporate the results of the Human Genome Project. Instead of e.g. 23,000 genes, Nachman(2000) explicitly assumes we have 70,000 genes (or 140,000 copies of a gene, as we are diploid). As can be verified from Nachman(2000), the crucial estimate for the value U, or the average rate of deleterious mutations per offspring, depends linearly on the number of genes. So, with updated data, U should be divided by 3, roughly: U~3 should be replaced with U~1. As a consequence, the reproductive excess requirement 19.0 would then change to 2.0. (Mean fitness e^-1 ~ 0.37 requires a reproduction rate of 3.0, roughly, to maintain constant population size, i.e. the excess is 2.0.) In short: Walter is quite correct here. The frequencies of deleterious mutations must first find their maximum values, or equilibrium with purifying selection, before their full impact on mean fitness applies. But note that the paradox I have considered is what happens after that, when the cost of mutations is the highest. caligula
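A minimal sketch of the e^-U load calculation referenced above (the two U values are the ones discussed in this thread):

    import math

    for U in (3.0, 1.0):          # deleterious mutations per offspring
        fitness = math.exp(-U)    # mean fitness at mutation-selection equilibrium
        rate = 1 / fitness        # reproduction rate needed to hold population size
        print(U, round(fitness, 2), round(rate, 1))
    # U=3 -> fitness 0.05, rate ~20.1; U=1 -> fitness 0.37, rate ~2.7 (excess ~2)

The U=3 row is the source of the "reproduction rate 20.0" figure quoted from Nachman(2000); revising U to 1 shrinks the required excess to about 2.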
Thank you for an interesting letter (from ReMine)! I respond to each of three comments by Walter in a separate post, labeled ReMine-#. ReMine-1. Nearly neutral mutations are just that: effectively neutral mutations with a coefficient much less than 0.01. Yes, selection against negative coefficients of very low magnitude is so weak that mutation rate and the sampling effect may often beat it, especially in a small population. But this is not what the discussion here has revolved around. The claim I responded to was that harmful mutations (in general) become fixed and thus accumulate in the gene pool with high (but still somewhat reasonable) mutation rates -- which is simply a different thing. For slightly deleterious mutations with a coefficient of, say, s = -0.0001 to make any noticeable impact on mean fitness, they would have to be simultaneously substituting at hundreds of loci. For the negative fitness impact to be cumulative, no subsequent beneficial mutation is allowed to fix at those loci. Also, one should think what kind of visible morphological effects such slightly harmful mutations might bear, if any, in case the "genetic degeneration" claim includes the idea of an organism's phenotype visibly degenerating. So, in terms of "cost of mutations", slightly harmful mutations might make a modest difference, but they hardly warrant the cumulatively degenerating genome scenario that was advanced here originally. Personally I think the weakness of "nearly neutral theory" is as follows. Using a uniform selection coefficient for the entire population is an unavoidable oversimplification of population genetics, of course. We just assume the simplification roughly holds. But with extremely small coefficients, the errors of such simplifications are amplified, I believe. caligula
There is a confusion factor that needs to be cleared up. There are two senses of the word "fitness", and both can be modelled with the same equations. 1. (from Hartl and Clark) the traditional population-genetic definition of fitness:
The simplest way to interpret the fitnesses is in terms of survivorship, usually termed viability, which is the probability that a genotype survives from fertilization to reproductive age.
2. Functional fitness. Non-traditional, but more accurate, and it can still use the traditional formulas. A harmful mutation's effect is not measurable in terms of immediate survivorship, but in terms of a functional context; i.e., the immediate probability of survivorship is not changed at all, and not for many generations, but at some point, if genetic degeneration continues, the mutation will become significant. For example, the space shuttle can tolerate a breakdown of one of its quintuply redundant navigation systems. It is still a breakdown and should be counted as "deleterious". Unfortunately, the anti-Design, anti-Engineering perspective of Darwinism does not account for these sorts of "deleterious" mutations. Seen in this light, Nachman's paradox can be properly interpreted to say that our genomes have been degrading, and natural selection has been unable to purge the mutations since there were inadequate population resources. I think it is important that the reader recognize which concept of "fitness" is being used. They are not equal, though related. Nachman's paradox can be used, with some slight reformulation, to deal with "functional fitness" versus "Darwinian fitness". scordova
A letter from Walter ReMine to JGuy, which JGuy forwarded to me, regarding Caligula's remarks. I formatted it to help the flow.
Caligula wrote: Selection and segregation prevent harmful mutations from fixing, even in a situation where every individual in the population is born with a harmful mutation or a few. If someone claims to have a simulation demonstrating otherwise, I'm very interested in seeing the source code.
Walter responds: He is thoroughly mistaken there. Harmful mutations, when they are slightly harmful (or near neutral), will often reach fixation in the population. They reach fixation at very nearly the same rate as neutral mutations. It can lead to continual fitness degradation, generation after generation. This is true theoretically, and is backed-up by simulations. For example, have him try the simulation by evolutionist Jody Hey, available on the Rutgers website. http://lifesci.rutgers.edu/~heylab/ProgramsandData/Programs/FPG/FPG_Documentation.htm (Remember to use the -Q switch.)
Caligula wrote: Nachman(2000) estimates that the average human offspring suffers from (at least) 3 deleterious mutations. This results, using Haldane/Kimura's old models, in a remarkable mortality where a reproduction rate of 20.0 is required to prevent extinction. Let us assume that we can do 10.0, and ignore all other costs except the cost due to mutations. This would mean that our population size should be halved in each generation, instead of growing.
Walter responds: His argument (above) conflates and confuses two issues: fitness versus population size. Need I say those are different things? His argument implicitly deals with fitness. Like this: At 3 deleterious mutations per progeny, a reproduction rate of 20.0 is required to prevent continual degradation in fitness -- eventually leading to extinction. But he does not calculate the rate of degradation of fitness, which could be very slow (depending on the fitness effect of the mutations, and population size, etc.). So far he has not made any argument that any creationist should feel uncomfortable with. He then switches horses mid-stream, and talks about population size -- which is different from fitness. He implicitly claims a reproduction rate of 20.0 is required to prevent continual reduction in population size. (He thereafter claims a reproduction rate of 10.0 would halve the population each generation.) But that does not follow from his previous argument. Fitness and population size are different things, and he has not argued for a connection between the two -- so his argument fails.
Caligula wrote: So, "Haldane's dilemma" and "Nachman's paradox" don't "add up". Rather, Nachman's paradox provides another strong argument for criticizing Haldane's overly simplified model resulting in his "dilemma".
Walter responds: He has it backwards. Haldane's Dilemma is concerned with beneficial mutations. While "Nachman's paradox" is concerned with harmful mutations. Both of those are serious unsolved evolutionary problems. And they add up. That is, their solutions both require extra reproduction rate, and those requirements both draw from the limited reproduction rate of the species. The more reproduction rate required for solving one problem, the less reproduction rate is leftover for solving the other problem -- thereby aggravating the other problem. Yet both problems must be solved simultaneously. The two problems add-up, and aggravate each other. The current model of evolutionary genetics/selection -- the model advertised widely in evolutionary genetics textbooks -- cannot solve these anti-evolutionary problems. -- Walter ReMine
scordova
the empirical facts will override ANY world view, and a proper world view is not a prerequisite to understanding certain truths, because of the design of nature. That is the biotic and cosmic message.
If I may add, that is the central theme of the biotic message. Whatever naturalistic, mindless scenario is proposed, when one adopts the scenario as a working hypothesis it will lead to theoretical and empirical contradictions. The design will not permit a naturalistic interpretation. For example, assume your "worldview" is Darwinian evolution for the major features of life. Haldane's dilemma demolishes that. Or, for example, assume neutral evolution (as in random, unplanned diffusion) is responsible for the majority of molecular evolution. The recent findings in junkDNA and deeply conserved regions, and the problem of harmful mutations (recall the Cobalt Bomb experiments), refute the neutral view. scordova
Caligula wrote: If you don’t mind, I’d like to scope YEC concepts such as Fall, flood, and 4004BC outside my interests as far as this thread is concerned. I think Sal, too, has tried to limit the scope somewhat.
I agree with Caligula here. And I will point out even to the YECs on the thread why.
JGuy wrote: I thought it interesting to point out that there does not exist a Nachman’s paradox within the world view of YEC (ie. literal Genesis account).
I appreciate your zealousness to defend the faith, but such a line of argumentation does not even honor the YEC cause. There is no Nachman paradox in the world of the Flying Spaghetti Monster either. Adopting a world view in which there is no paradox hardly implies the world view is correct. The notion of "adopting the right world view" before one can discern certain truths about origins is a Ken "AiG" Hamism which I find distasteful and which dishonors the work of God. Romans 1:20 implies the empirical facts will override ANY world view, and a proper world view is not a prerequisite to understanding certain truths, because of the design of nature. That is the biotic and cosmic message. The effects on a person falling off a skyscraper are independent of his world view. Facts have a way of eventually smacking people irrespective of what they believe. There ought not be a need to preach Genesis here or anywhere. Argue the facts, not world views. If one believes Romans 1:20, the facts will take care of themselves... You may mention Genesis in passing, but until we get some slam-dunk evidence, statements like the following do not even help the YEC cause:
there does not exist a Nachman’s paradox within the world view of YEC (ie. literal Genesis account).
I say that as someone who is very sympathetic to your views. Argue the facts. End of Sermon. scordova
JGuy: "It was simply noting the lack of a paradox across all views." If you want to get rid of Nachman's paradox by adjusting your cosmology, you would have to switch to a worldview where the human population is constantly decreasing and never increasing. (Read #54 one more time.) It doesn't help to say: before the Fall (or flood), the paradox did not apply. You would have to say: before our generation, the paradox did not apply (because the human population has constantly increased during historic time). And just to be safe, you should perhaps say: before an unknown time in the future, the paradox won't apply. I wager you don't find it a very acceptable solution. caligula
caligula: "If you don’t mind, I’d like to scope YEC concepts such as Fall, flood, and 4004BC outside my interests as far as this thread is concerned. I think Sal, too, has tried to limit the scope somewhat." Thanks for the response. I wouldn't have wanted this to tangent into that area of discussion either. It was simply noting the lack of a paradox across all views. I'm very interested, as is, in both sides of the discussion. It will be useful to see how the paradox and dilemma etc.. all fair criticism, with the disadvantage of evolutionist assumptions. JGuy
due to the uncertainties you indicate. Please ignore the deleterious random mutation in my previous post. tribune7
Caligula: "First, let's not fixate on the number 10 billion, as it is an artefact of my argument to DS rather than my assumption of the exact number. Just to make it clear -- it doesn't really change the essence of your question." It may not change the essence but it does reflect a new facet. Ten billion is a tad bit higher than "several hundred million," and that makes the point that if you can't trust the fossil record to provide a reasonably consistent account of the species that have lived, you can't trust it to provide a record of evolution -- and the fossil record is invariably cited as the prime piece of evidence for evolution. "No, really. Until fairly recently, not much fossil hunting had been done in e.g. Asia -- and indeed, everyone has heard of the remarkable Chinese findings during the past decade." Extrapolate 300,000 to 300 million. Has just a 10th of a percent of possible fossil sites been investigated? I guess we can debate it, but if it can be debated it certainly indicates we aren't talking hard science. Now extrapolate it to 10 billion. "Also, I'm not sure how well paleobiologists are able to tell the difference between two closely related fossil species in many cases." Well, they often can't. New fossil types are being recorded but old ones are merged (Brontosaurus into Apatosaurus) due to the uncertainties you indicate. The point is that the fossil record is not proof of evolution. Actually, many of us think it is evidence against it. "So a 'morphological species' may not accurately reflect the biological species concept, as applied to living organisms." Exactly!! tribune7
For the readers' benefit, let me add one further observation. Much has been made of "Darwin's" finches in the Galápagos Islands. The evolution we see there is not fixation, only a reversible change in population frequencies. In dry years the population mean shifts one way; when the weather changes, it shifts back. This is a different statement from saying a trait is "permanently" or "irreversibly" fixed into a population. One cannot cite the "fast" changes in finch frequencies, or any other population frequencies (like industrial melanism in Kettlewell's moths), as circumventing Haldane's dilemma. They don't even frame the problem correctly. scordova
From The Biotic Message, page 241:
The process [of neutral substitution] is extremely inefficient, slow... Let N_e be the effective population size. According to Kimura (1983, p. 35), on average, a neutral substitution requires 4N_e generations to reach fixation, if we exclude the cases in which it is lost. A population of 50,000 would require 200,000 generations to complete a given substitution
200,000 generations? That is 4 million years for human-like generation times. This model also assumes the population is well-stirred. If not, this leads to some interesting questions about intra-species sequence divergence, or the enigma of the deeply "conserved" regions that were discovered between mice and men. This issue is worth exploring. But for starters: if we have a 6% difference between humans and chimps, what should we expect the differences between human and human to be, since supposedly we ought to have lots of neutral substitutions awaiting fixation? I've not even seen this question raised. Neutral theory is an admirable theory, and I'm sorry to be so negative on it because it almost succeeds, and I have a natural liking for "quants" (researchers who like to quantify things mathematically) like Kimura. But deeply conserved regions between mice and men are inconsistent with neutral theory, not to mention the problems if we find functionality and very deep multi-layers of organization within DNA. As pointed out in this article in Nature regarding deeply conserved regions: Deleting non-coding regions from the genome has no apparent effect
Why are there so many similarities between this DNA across various unrelated species? There doesn't appear to be any significant life-affecting value within it. Mutations should have built up in this DNA over time.
These deeply conserved regions fly in the face of neutral theory. As I said, the selectionists use these regions to argue against the neutralists. The neutralists will respond by pounding Haldane's dilemma. I don't think they realize they have mutually destroyed each other's ideas, leaving the door open for another explanation that appeals to neither selection nor neutrality. scordova
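[To make the quoted arithmetic easy to check, here is a minimal sketch of the 4N_e fixation-time estimate -- my own illustrative code, with the 20-year generation time an assumption for a human-like species:]

```python
# Kimura's classic result: a neutral mutation that eventually fixes
# takes about 4*Ne generations on average to do so.

def neutral_fixation_time(effective_pop_size, years_per_generation):
    """Return (generations, years) for an average successful neutral substitution."""
    generations = 4 * effective_pop_size
    return generations, generations * years_per_generation

gens, years = neutral_fixation_time(effective_pop_size=50_000, years_per_generation=20)
print(f"{gens:,} generations, about {years:,} years")  # 200,000 generations, 4,000,000 years
```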
I do not think soft selection gives a figure for the number of nucleotide substitutions per generation. I do not think it addresses fixation rates at all [can anyone clarify?]. I would like to point out that Darwin argued for slight successive modifications. His comment was not very quantitative at all. The nucleotide is an objective measure of "slight successive," and when put under quantitative scrutiny, the idea seems to fail. The issue of multi-nucleotide substitutions is there, but Haldane's 1957 paper pointed out why it is problematic if the multiple nucleotides are dispersed throughout the population. Suppose John mutates a beneficial trait and James mutates a beneficial trait. They cannot simultaneously FIX both traits into the next generation, even if they tried to do so by mating with the same female. Haldane pointed out, therefore, that traits dispersed throughout a population must be fixed one at a time. The alternative for multi-nucleotide fixation is that all the traits suddenly appear in one individual (like a punctuated Evo-Devo event, or something) and he's such a stud he forces a rather substantial takeover of the population. GeoMor alluded to this. But I think this is problematic, and worth exploring further in subsequent posts to settle it. I pondered it while I was away this weekend and it is worth discussing... With this in mind: at the time of Kimura and Ohta, the estimated chimp/human divergence rate was 20 nucleotides per generation, not the 720 (or, to be a bit more fair, 720/2 = 360) nucleotides we have today. Yet here is their comment on Haldane's work, even when the figure was a mere 20 nucleotides per generation:
This gives a rate of nucleotide substitution per generation of at least 20 [note: 720 is the latest figure, not 20!], making the contrast still greater with Haldane's (1957) estimate of 1/300 per generation as the standard rate of gene substitution in evolution. Considering the amount of selective elimination that accompanies the process of gene substitution (substitutional load, see Chapter 5), the most natural interpretation is, we believe, that the majority of molecular mutations that participate in evolution are almost neutral in natural selection. Kimura and Ohta, Theoretical Aspects of Population Genetics
scordova
DaveScot: "Just as I thought. There is no experimental evidence of soft selection. It’s just more crap made up out of thin air." I appreciate your opinion, and I leave it to the readers of this thread to decide whether it is a well-substantiated one, and whether you have demonstrated that you understand what soft selection is. caligula
"Caligula -Since only about 200-300 thousand morphological species have been identified, we are missing 10 billion fossil species. You really don't see that as a problem?" First, let's not fixate on the number 10 billion, as it is an artefact of my argument to DS rather than my assumption of the exact number. Just to make it clear -- it doesn't really change the essence of your question, as I do agree that the number is probably at least several hundred million, i.e. more than a thousand times what has been identified so far. I've read an optimistic estimate by paleontologists that as much as 10% of species may have left fossils (although the portion varies between groups due to e.g. habitat, behavior, size and structure). Not nearly all fossils have survived to our day, but it still remains the case that the currect fossil catalogs should then contain only a fairly small fraction of available fossil species. What are the reasons for why the majority of available fossils may still remain unidentified? As much as a "Darwinistic cliché" it may sound, we may just not have dug enough. No, really. Until fairly recently, not much fossil hunting had been done in e.g. Asia -- and indeed, everyone has heard of the remarkable Chinese findings during the past decade. But worse, each digging site is just a snapshot. Is it valid to expect that the ecological niches of most species are geographically wide enough to be covered by local snapshots? And is it valid to expect that one digging site can capture even the local fauna? I think everyone can understand the difference between somehow systematically sweeping the 540 sq.kms and digging a hole here and there. (This is not to say that all spots on Earth are likely to provide fossils, though.) So, I would find it a problem if new fossil species ceased to show up. But whether such has been the case lately, I don't know. Do you? Also, I'm not sure how well paleobiologists are able to tell the difference between two closely related fossil species in many cases. So a "morphological species" may not accurately reflect the biological species concept, as applied to living organisms. caligula
This is a fun game, making things up to solve Haldane's Dilemma. I'm going to put forward that beneficial genes are spread through the population by kissing. You heard it here first. Evolution is accelerated by kissing. Haldane's Dilemma is solved. DaveScot
caligula Just as I thought. There is no experimental evidence of soft selection. It's just more crap made up out of thin air. Next! :razz: DaveScot
I would like to advise the readers that this thread is about to slip off the front page. I expect the discussion will continue, and I would encourage continued participation even after it drops off the front page. The topics here are deep enough that I envision the dialogue continuing for quite some time. I would like to thank all the participants here, as I sense this discussion is of keen interest to many readers. scordova
Ludwig wrote: If you focus on the DNA sequences, then at that level the vast majority of differences are insignificant.
It is understandable how the scientific community may have adopted that view, given that it appears a biological system's "fitness" is immune to various changes. But I would caution against that position. Part of this view has been driven by an under-appreciation of our ability to recognize deeply redundant designs in biological systems. Redundant, robust functions are functional systems which will elude definitions of fitness and selection, and thus are easy for an evolutionary biologist to dismiss, since damage to these systems does not immediately change fitness. However, an engineer will recognize this is a horrible way to characterize a system. It's like saying a spare tire has no function since the car runs fine without it! See: Airplane Magnetos, Contingency Designs and Reasons ID will Prevail for both theoretical and empirical confirmation of this. Contingent architectures strongly resist evolution via natural selection because they are generally invisible (or at least substantially less visible) to selective pressures. If junkDNA is replete with these architectures, this would overturn the idea that the junkDNA is insignificant. Furthermore, I would not be too quick to dismiss specific nucleotide arrangements just because of the synonymous-codon issue or the fact that proteins on the surface can apparently tolerate various amino acid substitutions. In a deeply redundant, robust system, I would expect tolerance to some degree of damage; but what if we find functional information in these regions which only certain circumstances will trigger? We have in engineering the concept of safety margins, and there is some reason to think the nucleotide sequences and the genetic codes were engineered with measurable safety margins which have been steadily eroding through genetic entropy. The following discussion here has bearing on the importance of every nucleotide: Poetry in the Genetic Code. These developments are offered in criticism of neutral theory. Neutral theory is a good theory, but in light of the large amount of functionality being discovered in DNA, the major claims of neutral theory in regard to 98.5% of the DNA in humans are seriously in doubt. Further information on these regions is offered here: Post Genetic Diseases and DNA researcher, Andras Pellionisz gives favorable review to a shredding of Dawkins and TalkOrigins scordova
JGuy: I suggest you read Crow (1997) directly, as I just did. The part you're interested in is the section called "The Current Human Population" at the end of the text. Crow clearly states that the (potential) degeneration of the human genome is due to natural selection not working in human civilization as it used to do in the wild. On the other hand, he states that this same factor (civilization and its ability to overcome and, indeed, remove selection pressures imposed by the environment) more than compensates for the loss of fitness as we would experience it in the wild. Crow does give a powerful warning ("how long can we keep this up?") but he also assumes "perhaps for a long time" (as long as our social structure permits compensating for our biological degeneration with technology). I don't think one can draw strong conclusions from the final section of Crow (1997), nor base any reliable calculations on it. Crow himself does not encourage such practice in the paper, as he is merely giving a speculative ("if we are like Drosophila") but sincere warning to us. If you don't mind, I'd like to scope YEC concepts such as Fall, flood, and 4004BC outside my interests as far as this thread is concerned. I think Sal, too, has tried to limit the scope somewhat. caligula
"Before I waste more time on soft selection could you give experimental evidence that it even exists?" The general existence of soft selection is, I believe, easy to demonstrate. Every animal which fails to conquer a territory can be called a victim of soft selection. There was no free territory for it in the ecological niche, so it was a density-dependent death. I believe we agree that examples of such fates are abundant in nature? I would also argue that the territorial behavior in animals is likely a direct result of the constant existence of conditions resulting in soft selection. It makes no sense for everyone to starve because of juveniles exceeding the carrying capacity of the environment. It is usually better to risk your life in occasionally fighting over a sufficient amount of vital resources (territory) than face the constant risk of e.g. starving when your food is halved -- or reduced to 1/3, 1/4... But even though I think the general existence of soft selection is hard to refute, the relative importance of soft selection for the rate of substitutions via natural selections is hardly known. (The same applies to intraspecific competition.) caligula
caligula wrote in #54 -- Nachman's paradox: ... "So, 'Haldane's dilemma' and 'Nachman's paradox' don't 'add up'. Rather, Nachman's paradox provides another strong argument for criticizing Haldane's overly simplified model resulting in his 'dilemma'." ----------------------- Hi caligula. :) Nice to meet you. It's good to see a civil dialogue -- I think so far. Your posting on the paradox is interesting -- I love puzzles. Sanford (2005) referenced Crow (1997) and indicated he (Crow) believed fitness is degenerating at about 1-2% per generation. I think this would put human extinction at close to 300 generations. The past is not scientifically testable. However, theories about the past may be scientifically evaluated. There seems to be a large difference here -- I'm sure Karl Popper might agree, since he said as much about the theory of evolution in his autobiography Unended Quest (1976). So, Nachman's paradox, as you presented it, contrasts two things: 1) presently observed and repeatably measured genetic mutation rates plus basic math, versus 2) the unobservable and unrepeatable human past (based on presuppositions). It's similar to the problem YECs have been criticised for when arguing against radiometric dating: an observed rate measured today versus presuppositions about the past. For now -- as a young earth creationist -- I thought it interesting to point out that there does not exist a Nachman's paradox within the world view of YEC (i.e. the literal Genesis account). There would be only about 230 generations since the creation of man -- if my math is correct. There were only 10 generations prior to the flood 4400 years ago. Additionally, prior to the flood there is direct evidence that there was very little fitness decay. This fits well within a 300-generation extinction window. JGuy
Caligula --Since only about 200-300 thousand morphological species have been identified, we are missing 10 billion fossil species. You really don't see that as a problem? tribune7
Before I waste more time on soft selection could you give experimental evidence that it even exists? DaveScot
DaveScot: "When times are good the population can grow and soft selection can speed up evolution" 1. Could you give an example of a wild population having a "good time"? 2. Soft selection is not based on the premise of a growing population. Rather, it is based on the premise of a population that can not grow in spite of its reproductive capability to do so, i.e. on notable "background mortality". See #53 again. "because times are good, mutations that would be deleterious in not so good times can also be soft selected" Could you give a concrete example of this? caligula
caligula So what you're saying is basically that all of "you" are in denial of what's revealed in the fossil record; species' abrupt emergence, stasis, and extinction. "We" already knew that. DaveScot
Selectionists vs. neutralists (etc.)

Disagreements about the mechanisms of evolution are well known. The best known "opposing" schools of thought are "selectionists" vs. "neutralists", "gradualists" vs. "punctualists", and "gene selectionists" vs. "group selectionists". I do not dispute that there have been severe disagreements between such schools, and there probably still are, even if they are less severe. What I do dispute is that they are or have been such polar opposites as one might believe from Sal's treatment of selectionists vs. neutralists. No selectionist probably disagrees that neutral evolution is the dominant mechanism of molecular evolution. No neutralist probably disagrees that adaptation is the dominant mechanism of morphological change. The open question between these schools is how much of morphological change can be attributed to neutral polymorphisms. I don't see it as a very severe disagreement -- such differences are what we should expect from researchers trying to interpret insufficient data. Most importantly, traditional Darwinism and neutralism are hardly incompatible. Basically, everyone is a gradualist. Even punctualists such as Gould and Eldredge keep their distance from saltations, i.e. macromutations. Also, no gradualist, not even Dawkins, believes that morphological change has a constant rate. The dispute, after it was established that punctualism is not saltationism, is merely how influential punctuated equilibria have been in the history of life. As for the unit of selection, I think gene selectionists clearly "won" the so-called Darwin Wars. This is not to say that gene selection was shown to be omnipotent in explaining everything. But there seems to be a near-consensus that gene selection best explains most observations. There is little dispute between gene selection and selection of individuals, because their predictions largely agree except for cases like altruism and some cases of sexual selection. And group selection is not dead either. It is just not considered nearly as influential by the majority of biologists as many paleobiologists once suggested. In short, the various mechanisms of evolution are generally not incompatible with each other. There are legitimate differences of opinion over their relative contributions to the historic evolution of species, however. caligula
geomor The wages of selection is death. Caligula points out that soft selection allows evolution to proceed faster. It does. But it doesn't eliminate or lower the cost of selection. It defers it. When times are good the population can grow and soft selection can speed up evolution. However, because times are good, mutations that would be deleterious in not-so-good times can also be soft selected. The deleterious mutations accumulate in the genome, and then as soon as the environment becomes not so good, the species quickly becomes extinct. Extinction without descendant species is the rule in the fossil record. Any theory of evolution must explain the facts surrounding extinctions. DaveScot
GeoMor: DaveScot, I didn't really follow the significance of this factoid, that 0.1% of species that ever lived are alive today. Could you clarify what this implies in this discussion? I inferred that he was talking about the fact that populations show up, stay for a while, and leave virtually unchanged. You know, the basis for punctuated equilibrium -- stasis, i.e. natural selection keeping the norm, just as predicted and observed. But that is just me. Joseph
DaveScot, I didn't really follow the significance of this factoid, that 0.1% of species that ever lived are alive today. Could you clarify what this implies in this discussion? GeoMor
DaveScot: As I said, I fail to see how your discussion of fossils over geological timescales relates to my presentation of Nachman (2000). However, since you feel that I'm ignoring your argument, I will respond to it separately. You say that we know from fossils that 99.9% of all species have gone extinct and that (almost?) no speciations have occurred. If my memory serves me, the usual estimate is closer to 99%, but let us assume you have the correct number. I assume that the evidence for lack of speciation events is based on gaps in the fossil record below the family level. An obvious question follows. Assume 10 million species are living now; then 10 billion species would have existed in the history of Earth. Since only about 200-300 thousand morphological species have been identified, we are missing about 10 billion fossil species. How can you know that none of these forms fits the gaps in the current fossil record, to better demonstrate speciation events? caligula
DaveScot: As I said, I have not made an argument on a geological timescale. Nachman's paradox is that we should observe the impact of his results here and now and on a historic timescale, yet we do not and have not. Thank you for clarifying your point about soft selection. So you acknowledge that "background mortality" does serve as a buffer for natural selection. But you present a counter-argument that harmful mutations start fixing under soft selection. Can you present us a short mathematical example of such fixation? caligula
Joseph Soft selection is a double-edged sword. Yes, it allows beneficial mutations to fix faster in a growing population. It also allows neutral and deleterious mutations to fix faster in a growing population. The end result of the accumulation of neutral and deleterious mutations in a growing population is disaster when the environment turns ugly (and it WILL eventually turn ugly). This is why 999 of 1000 species with obligatory sexual reproduction go extinct without spawning any new species. Sermonti is absolutely correct. DaveScot
Caligula: Many have suggested that harmful mutations “prevent evolution” or cause genetic degradation. Geneticist Giuseppe Sermonti tells us that sexual reproduction prevents evolution:
Sexuality has brought joy to the world, to the world of the wild beasts, and to the world of flowers, but it has brought an end to evolution. In the lineages of living beings, whenever absent-minded Venus has taken the upper hand, forms have forgotten to make progress. It is only the husbandman that has improved strains, and he has done so by bullying, enslaving, and segregating. All these methods, of course, have made for sad, alienated animals, but they have not resulted in new species. Left to themselves, domesticated breeds would either die out or revert to the wild state—scarcely a commendable model for nature’s progress.
Joseph
caligula I see you completely ignored the inconvenient fact that the indisputable testimony of the fossil record shows that 999 of 1000 species go extinct without spawning any new species with an average lifespan of 10 million years. Like I said, you don't let facts get in the way of your arguments... Soft selection solves nothing. Its effects are cumulative and the result IS indeed a reduction to a population size of zero (extinction). Your refusal to accept the facts surrounding extinctions makes it difficult to carry on a discussion. DaveScot
DaveScot: 1. Let's all tone down the rhetoric for a moment, please. 2. If you look at the context around your quote, you see how badly your argument fits here. According to Nachman (2000), our population should be dropping at an extremely nasty pace instead of increasing; which also means that we should have gone extinct during historic time, even if we had started with an absurdly huge population. So, clearly, I wasn't speaking of geological timescales here. 3. Do not worry. I have read a lot of ReMine's argumentation, including your link. But the fact that ReMine writes about a topic does not mean he has refuted it. When writing about Haldane's dilemma, I specifically keep ReMine's arguments in mind. And, for example, when commenting on soft selection, ReMine is so busy discussing what it is not that he forgets to tell his reader what it is. Thus, he also fails to address the real issue and merely addresses strawmen of his choice. But if you have a link where ReMine actually does address the real impact of soft selection, please provide it, as I am interested in it. I have asked him directly for such a treatment, however. 4. They are not my assertions. They are additions to Haldane (1957) made by current authorities in mathematical population genetics. I'm helpless against the fact that ReMine does not include them among his trusted authorities, although he does largely base his arguments on selected quotes from authorities. caligula
caligula: "and we are not extinct" Not yet, at any rate. 999 of 1000 species that ever lived are extinct. The average lifetime of a species is 10 million years. We are the only surviving line of hominids. I see you, like most NeoDarwinists, are not letting facts get in the way of your arguments. All your assertions have been refuted by ReMine. See the link on the sidebar. DaveScot
Nachman's paradox

Many have suggested that harmful mutations "prevent evolution" or cause genetic degradation. Such a claim is a serious misunderstanding. Selection and segregation prevent harmful mutations from fixing, even in a situation where every individual in the population is born with a harmful mutation or a few. If someone claims to have a simulation demonstrating otherwise, I'm very interested in seeing the source code. What a high rate of harmful mutations could cause is a lot of excess mortality, which could lead to extinction. That is what "Nachman's paradox" is about -- it is not about purifying selection somehow failing to work. (Just read Nachman (2000)!) And yes, the former could be a serious problem. But it is easy to see that the problem does not exist in reality. Consistent changes in population sizes are exponential. Apply, for example, a 10% decrement per generation and see how quickly the population would go extinct. Nachman (2000) estimates that the average human offspring suffers from (at least) 3 deleterious mutations. This results, using Haldane/Kimura's old models, in a remarkable mortality in which a reproduction rate of 20.0 is required to prevent extinction. Let us assume that we can do 10.0, and ignore all other costs except the cost due to mutations. This would mean that our population size should be halved in each generation, instead of growing. It also means that even if our population had numbered 300 billion some 35 generations (700 years) ago, we should be an extinct species now. Clearly, our population size was not 300 billion in the 1300s, and we are not extinct. I.e. a paradox. Now, the solution to this paradox is not that natural selection does not work, as many seem to suggest. If NS failed to work, and harmful mutations accumulated in our genome, the paradox would only become deeper. Nachman himself pretty much suggests that Haldane/Kimura's models require revision. He specifically suggests taking into account density-dependent factors. But I'm additionally interested in whether Nachman's estimate for the average number of deleterious mutations can be considered accurate. It seems to me that nothing can compensate for his extreme fitness reduction -- except for the possibility that most deleterious mutations result in an early miscarriage and thus do not incur the kind of cost a "juvenile death" would. So, "Haldane's dilemma" and "Nachman's paradox" don't "add up". Rather, Nachman's paradox provides another strong argument for criticizing Haldane's overly simplified model resulting in his "dilemma". caligula
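[To make the load arithmetic in caligula's comment concrete, here is a minimal sketch assuming the standard mutation-load relation (mean fitness = e^-U) that Nachman and Crowell use; the figures -- U = 3, an achievable reproduction rate of 10, a starting population of 300 billion -- are the comment's own illustrative numbers:]

```python
import math

U = 3.0                      # new deleterious mutations per offspring (Nachman 2000)
survival = math.exp(-U)      # fraction of offspring escaping selective elimination
required = 1 / survival      # offspring per individual needed just to break even
print(f"required reproduction rate: {required:.1f}")  # ~20.1 -- the comment's 20.0

achievable = 10.0            # the comment's generous assumption
pop = 300e9                  # the deliberately absurd starting population
for _ in range(35):          # 35 generations, roughly 700 years
    pop *= achievable / required
print(f"population after 35 generations: {pop:.0f}")  # single digits -- effectively extinct
```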
Haldane's model

Haldane's model deals solely with a scenario where a population is adapting to external environmental pressures. Haldane reasons, quite correctly, that before a population can adapt, it must initially be less-than-optimally adapted. And the quicker it adapts, the less well-adapted it is initially. Being badly adapted means a lower average fitness and, thus, a higher mortality. This, in turn, brings to the table the risk of going extinct. Haldane estimated that in order to avoid extinction, the population can't tolerate combined selection coefficients worth more than 0.1, on average.

Soft selection

Haldane's limit of 0.1 is quite small, isn't it? Surely a population can compensate for a much higher rate of mortality than 10% with reproductive excess? It is likely that Haldane (implicitly) assumed that many other causes of mortality besides the cost of selection are operating most of the time, and that the various costs simply add up. This is where soft selection comes in. It is assumed that the highest individual cause of mortality is population density. When a population has (a) filled the carrying capacity of its ecological niche and (b) has reproductive excess (i.e. produces more than 2 offspring per mating couple per generation), this reproductive excess simply dies: there just isn't viable room for it. When even the slowest breeders, such as elephants, are known to produce much more than 2 offspring per couple, there likely is a lot of mortality due to density. But note that if there are other causes of mortality, such as selection, the costs imposed by these sources reduce the cost due to density. If, say, predation (selection) harvests an individual before it has time to, say, starve to death (density), then costs due to density are reduced. It is said that unavoidable, density-dependent mortality acts as a "buffer" for natural selection: selection doesn't necessarily increase the total mortality, it merely makes survival non-random.

Intraspecific competition

Traditionally, adaptation is an "us against them" scenario. But not all beneficial alleles necessarily help a population to overcome external challenges in its environment. There is also competition between individuals within the population. Those who manage to hog more vital resources in their environment, such as territory, food and shelter, produce more offspring. For such alleles, we don't need to assume a corresponding external selection pressure causing excess mortality. It is simply a redistribution of vital resources within the population. In such a case, there is no cost in terms of excess mortality and, thus, no need for reproductive excess in terms of individuals. Yet allele frequencies do change, due to non-random survival and segregation. caligula
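[For readers who want to see where the famous 1/300 figure comes from, here is a minimal sketch using Haldane's own round numbers: a total substitution cost of about 30 (in units of population size, for typical starting frequencies, from Haldane 1957) and the 0.1 tolerable selection intensity mentioned in the comment above. Both constants are Haldane's illustrative values, not derived here:]

```python
# Haldane (1957): the cumulative cost of one substitution is roughly 30
# (in units of population size), and a population can spend only about
# 10% of its reproductive excess per generation on selection without
# risking extinction.
total_cost_per_substitution = 30
tolerable_intensity = 0.1

generations_per_substitution = total_cost_per_substitution / tolerable_intensity
print(generations_per_substitution)  # 300.0 -- one substitution per 300 generations
```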
Neutral mutations take even longer to spread through the population than beneficial mutations. The carriers of a neutral mutation have no gain or loss in differential reproduction as a result of that mutation. A beneficial mutation, OTOH, by definition increases the number of offspring of its carriers and thus spreads faster. DaveScot
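[A sketch of the standard deterministic approximation behind DaveScot's point: under genic selection with coefficient s, the time for an allele to sweep from one copy to near-fixation grows only logarithmically with population size, while a successful neutral allele needs on the order of 4N generations. The values of s and N below are illustrative assumptions:]

```python
import math

def sweep_time(N, s):
    """Deterministic approximation: generations for a beneficial allele to go
    from frequency 1/(2N) to 1 - 1/(2N), using dp/dt = s*p*(1-p)."""
    return (2 / s) * math.log(2 * N)

N = 50_000
print(f"beneficial sweep (s=0.01): {sweep_time(N, 0.01):,.0f} generations")  # ~2,300
print(f"neutral fixation (4N):     {4 * N:,} generations")                   # 200,000
```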
LudwigK:
Deleterious mutations do not spread, they die with the individual they originate with.
Eh? Unless you're defining "deleterious mutations" in a manner I'm unfamiliar with, the first thing that came to mind was blind cave fish. Obviously, lethal deleterious mutations are another matter, but fixation of deleterious mutations can occur, and they can accumulate. All: http://www.nature.com/nature/journal/v445/n7123/abs/nature05388.html
The overall rate of occurrence of deleterious mutations in the genome each generation (U) appears in theories of nucleotide divergence and polymorphism, the evolution of sex and recombination, and the evolutionary consequences of inbreeding. However, estimates of U based on changes in allozymes or DNA sequences and fitness traits are discordant. Here we directly estimate u in Drosophila melanogaster by scanning 20 million bases of DNA from three sets of mutation accumulation lines by using denaturing high-performance liquid chromatography. From 37 mutation events that we detected, we obtained a mean estimate for u of 8.4 × 10^-9 per generation. Moreover, we detected significant heterogeneity in u among the three mutation-accumulation-line genotypes. By multiplying u by an estimate of the fraction of mutations that are deleterious in natural populations of Drosophila, we estimate that U is 1.2 per diploid genome.
http://www.pnas.org/cgi/content/abstract/0404125101v1
The tendency for genetic architectures to exhibit epistasis among mutations plays a central role in the modern synthesis of evolutionary biology and in theoretical descriptions of many evolutionary processes. Nevertheless, few studies unquestionably show whether, and how, mutations typically interact. Beneficial mutations are especially difficult to identify because of their scarcity. Consequently, epistasis among pairs of this important class of mutations has, to our knowledge, never before been explored. Interactions among genome components should be of special relevance in compacted genomes such as those of RNA viruses. To tackle these issues, we first generated 47 genotypes of vesicular stomatitis virus carrying pairs of nucleotide substitution mutations whose separated and combined deleterious effects on fitness were determined. Several pairs exhibited significant interactions for fitness, including antagonistic and synergistic epistasis. Synthetic lethals represented 50% of the latter. In a second set of experiments, 15 genotypes carrying pairs of beneficial mutations were also created. In this case, all significant interactions were antagonistic. Our results show that the architecture of fitness depends on complex interactions among genome components.
On the chimp/human differences: http://www.eurekalert.org/pub_releases/2006-12/uob-wim122006.php
The researchers paid special attention to gene number changes between humans and chimps. Using a new statistical method developed by Tijl De Bie, University of Bristol, and Cristianini, the international team inferred humans have gained 689 genes (through the duplication of existing genes) and lost 86 genes since diverging from their most recent common ancestor with chimps. Including the 729 genes chimps appear to have lost since their divergence, the total gene differences between humans and chimps was estimated to be about 6 percent. The team included computational biologists from the University of Indiana and University of California, Berkeley. The results do not negate the commonly reported 1.5 percent nucleotide-by-nucleotide difference between humans and chimps. But they do illustrate there isn't a single, standard estimate of variation that incorporates all the ways humans, chimps and other animals can be genetically different from each other. Any measure of genetic difference between humans and chimps must therefore incorporate both variation at the nucleotide level among coding genes and large-scale differences in the structure of human and chimp genomes. Cristianini commented, "So the question biologists now face is not which measure is correct but rather which sets of differences have been more important in human evolution."
Patrick
LudwigK: "And this is what is the greatest destroyer of peace today - abortion which brings people to such blindness." Technically speaking, neutral mutations don't "fixate", they just happen. "If we require that all the differences between chimps and humans arose by fixation through selection pressures, then they do seem too much. However, the vast majority of those were neutral mutations." This sort of tips your hand. It seems you're anxious to discuss the 100 to 300 neutral mutations as a way of getting from chimps to humans. I have no problems with that, as long as the discussion doesn't get bogged down with too many complications. But this does take the fixation of beneficial mutations off the table as a possible mechanism, which only seems to affirm Sal's distinction between "selectionist" and "neutralist" approach. I don't care what the rules are, as long as everyone agrees. PaV
PaV, Those 100-300 mutations per generation that fixate are neutral mutations. Deleterious mutations do not spread; they die with the individual they originate with. Beneficial mutations fixate at much less than one per generation, probably. Note that in the early stages of embryonic development lots of genes are activated. If something is to go wrong, that's a likely place for it to happen. And it's at that stage that one half to two thirds of all humans die, often even before implantation. That's one powerful sieve to filter out deleterious mutations, but one which does nothing against neutral mutations (by definition). The maturation process of sperm cells and ova can also lead to filtering out harmful mutations while letting neutral mutations pass through. If we require that all the differences between chimps and humans arose by fixation through selection pressures, then they do seem too much. However, the vast majority of those were neutral mutations. Also, Salvador's presentation of selection as being opposed to neutral evolution is not quite correct. Selection is what affects mutations that impact reproductive success; neutral theory describes what happens to those mutations that have no impact on reproductive success. Just as ballistics and fluid dynamics describe different things in physics but are not contradictory theories. LudwigK
#44
Natural selection explains the formation of features like wings, eyes, legs, and such, and how species diverge in significant traits.
A little correction: NDE supporters claim, without any significant proof, that natural selection could explain the formation of features like wings, eyes, legs, and such, and how species diverge in significant traits. kairos
I think there are tremendous cross-purposes going on. 1.) Is it 100 SNPs per generation, or 300? And what about "error correction mechanisms"? As bFast points out, high deleterious mutation rates likely can't be tolerated. 2.) Are these 100 (or 300) SNPs spread out, consolidated, or connected to highly conserved loci ("hitchhiking")? 3.) What proportion of the 100 (or 300) mutations are "neutral", and what proportion "selected" for? 4.) Do we, or do we not, consider "recombination", lateral gene transfer, etc.? Before this all starts, perhaps it would be good to whittle some of this stuff down to a precise scenario which could then be dealt with in some depth. PaV
One of Sal's main questions is the maximum substitution rate allowed by Darwinistic mechanisms. I will do my best to at least shed some light on the question. In another post, I will also shortly describe Haldane's basic model, some of the later additions to it, as well as the relationship between so-called "Haldane's dilemma" and "Nachman's paradox". 1. Haldane's model allows for one substitution per 300 generations on average, even when concurrency is involved. 2. Soft selection allows for much more. However, as I understand it, even soft selection can't increase the rate of substitution by an order of magnitude compared to Haldane's limit. 3. Intraspecific competition allows for a rate of substitution that is, at least in principle, orders of magnitude higher than Haldane's. The risk of extinction is not involved, so the key question becomes: what is the rate of beneficial mutations? 4. Further factors. For example, non-uniform selection pressures in the environment inhabited by the population can create interesting allele flows between subpopulations and, in principle, lower the cost of selection. Of these, 3 is clearly the most interesting. How influential it has been in historic evolution, I am not qualified to estimate. AFAIK no one currently knows the answer. In summary, I don't think anyone can give a definite limit for the speed of substitutions through natural selection. It should be measured, which is of course quite difficult. If a very high rate were measured, however, I think population genetics readily has mathematical models to explain such a rate. caligula
bFast However, in my simulation, when the mutation rate reaches an average of 1 deleterious mutation per generation, evolution comes to a grinding halt. If we are hit by 3 deleterious mutations per generation, we should be degrading. Interesting! Will you post this at some point? One question to indicate my interest: is a "deleterious mutation" one that changes a selected value into non-selected one? And then is a "beneficial mutation" one that changes a non-selected value to a selected one? And this is done at a random spot and with random result, correct? It sounds like you might be setting separate rates on deleterious and beneficial mutations, which does not seem properly random on the face of it. Tom Moore
Salvador (scordova), I agree that modern evolution theory is not Darwinism. That's an understatement, like saying that modern physics is not Newtonism. Both Darwin and Newton did remarkable jobs, but their models are now just a small part of much bigger theories. But I think your distinction between selectionism and neutralism is not very useful nowadays. Scientists like to argue a lot, and it's by splitting hairs that science progresses, but both models are part of modern evolutionary theory, addressing different levels of detail. Natural selection explains the formation of features like wings, eyes, legs, and such, and how species diverge in significant traits. So if you consider traits like brain size, body hair, teeth, jaws, height, build, and such, you'll probably get some 1000 or so phenotypically significant differences between humans and chimps, which is what Haldane was talking about. If you focus on the DNA sequences, then at that level the vast majority of differences are insignificant. Different ways of coding the same protein, different proteins that do the same thing, slight changes in enzyme activity for some metabolite that affects eyelash colour, and such. That's where Kimura's model is most useful, because at that level most changes are either eliminated outright or have no effect. And it is at that level that you get millions of differences between humans and chimps. Another issue is that eliminating rare harmful mutations is very different from increasing the frequency of a rare beneficial mutation until it removes all alternatives from the gene pool. Haldane's estimate of 300 generations was for the latter, not the former. Rare harmful mutations will remain rare even with moderate selection pressures. LudwigK
Apologies for the delay. (Real life and a GMT+2 timezone.) "Humans have 99.5% uniformity in their genomes ... Recall that Darwinism relies on the introduction of noise into the genome; the fact that we have the antithesis of noise in the genome in these uniform regions is problematic for Darwinism in a big way." 1. How big a percentage of our genome should contain signs of ongoing fixation due to drift, in your opinion? 2. Is the human population a good test bench for this? On one hand, it has likely gone through a recent bottleneck (near-extinction) -- on the other hand, the population has quickly become enormous and geographically complex (including political and cultural barriers), which is hardly compatible with random mating. 3. I believe only SNPs with higher than 1% frequency are included in that near-100% uniformity estimate. Most national phenomena do not fit in. If we considered, e.g., my home country of Finland as a "population", perhaps we would find better examples of drift? And on the level of "noise" (e.g. individual variation), I think the uniformity estimate tells us hardly anything. caligula
I will be gone for the weekend and back Monday or Tuesday. I'd like to thank Caligula, GeoMor, LudwigK and everyone else for their comments so far. This thread will probably fall off the front page, but I invite participants to keep posting their ideas and insights to this thread. I expect the discussion to be open for quite some time, as the issues are technical and will take time to resolve. Let me take the opportunity to introduce Caligula (from his post at Panda's Thumb):
Oh, about myself. My real name is Esko Heimonen, and I use it in Finnish evo/crea debates openly. Although I use various callsigns at international servers, I have at least succeeded in creating the same nick at PT and UD, so that e.g. the mods of UD can associate my comments at PT to the person at UD. At ARN, I have written some stuff using the callsign Emuu. My professional background is modest concerning these topics: IT. As it happens, there are precious few professional scientists in evo/crea web debates! So “research” does not apply to e.g. my population genetics simulations. I have been in modest email exchange with e.g. Dr. Warren Ewens, and I intend to also contact Leonard Nunney concerning his simulation. But I certainly am not planning to publish anything, not even on a self-administered web site. I have followed the evo/crea debate for over a decade, so I know the various arguments quite well, and I have read quite a bit concerning the topic. But I’m perfectly aware that I’m no authority nor qualified to present strong arguments about scientific data. I do dare to present arguments concerning math, however. (GAs, neural networks, mathematical population genetics, “explanatory filter”, etc.)
Whatever the ID proponents may feel about Caligula, I hope we can keep the discussion focused on the central technical issue of evolutionary speed limits. If Caligula has some other issues (aside from speed limits) which he feels I neglected at Panda's Thumb, he is invited to raise them in this thread; that is part of a courtesy I wished to extend to him in gratitude for his participation. So I'm giving him an opportunity to put issues on the table not directly related to the question of evolutionary speed limits. That said, everyone have a nice weekend, and thank you for an interesting discussion. I'll see you all next week. Salvador Cordova scordova
I developed a software simulation a while back -- kinda along the lines of "me thinks..." I was planning to polish it up and do something with it, but haven't got around to it. However, in my simulation, when the mutation rate reaches an average of 1 deleterious mutation per generation, evolution comes to a grinding halt. If we are hit by 3 deleterious mutations per generation, we should be degrading. bFast
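[Since Tom Moore asked above how such a simulation might be set up, here is a toy sketch of one way to do it -- purely illustrative and emphatically not bFast's actual program. The population size, selection coefficient, and generation count are arbitrary assumptions:]

```python
import math
import random

rng = random.Random(1)

def poisson(lam):
    """Poisson draw via Knuth's method (keeps us in the standard library)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def mean_load(pop_size=200, U=1.0, s=0.1, generations=500):
    """Toy asexual Wright-Fisher model: an individual is just its count of
    deleterious mutations; fitness is (1-s)**count; each offspring gains
    Poisson(U) new mutations."""
    pop = [0] * pop_size
    for _ in range(generations):
        weights = [(1 - s) ** k for k in pop]
        parents = rng.choices(pop, weights=weights, k=pop_size)
        pop = [k + poisson(U) for k in parents]
    return sum(pop) / pop_size

print(mean_load(U=0.1))  # stays low: selection keeps up with mutation
print(mean_load(U=1.0))  # far higher and still climbing as the ratchet clicks
```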
Salvador: I'm not tracking this discussion closely yet. However, are you using Walter's latest paper? I'd think you were. If not, I'm sure it may add to this discussion.
Thank you!!! It is interesting to note in the acknowledgments:
This research was supported in part by a grant from Discovery Institute.
ReMine gives an account of how his paper was treated: Haldane's Dilemma and Peer-Review scordova
On the surface, Kimura offers what appears to be a convincing fix to the speed limit problem. [Kimura was brilliant, and his work is obviously admired by many ID proponents. If I weren't an ID proponent, I'd probably be a neutralist.] Kimura's fix seemed to do the trick. Simply have a fast enough mutation rate, and the problem is solved. If one needs to fix 300 nucleotides per generation, all one needs is a mutation rate of 300 nucleotides per individual per generation. With a mammal having about 3 giga base pairs, that seems easy enough. But we have the rather troubling example of what happens when we increase mutation rates. Let me recount something by respected geneticist Maciej Giertych on attempts to up mutation rates. This would have been a triumph for the claims of neutral evolution, but it wasn't:
Mutations figure prominently in the Evolution story. When in the early ’60s I was starting breeding work on forest trees, everyone was very excited about the potential of artificial mutations. In many places around the world, special “cobalt bomb” centers were established to stimulate rates of mutations. What wonderful things were expected from increased variability by induced mutations. All of this work has long since been abandoned. It led nowhere. All that was obtained were deformed freaks, absolutely useless
One attempt to circumvent this problem was to suggest that 97% of the human genome was junk; thus neutral evolution could do its thing and hopefully natural selection could take care of the remaining 3%. An uneasy truce was made between the selectionists and the neutralists. But what if our UD friend Dr. Pellionisz is right and the 97% of our genome isn't junk? This truce will be history. The neutralists were hoping the 97% would not be important to function; otherwise, given the example of the cobalt bomb labs, the results would be disastrous. But even if 97% is junk, there is still the problem of the remaining 3%. This leads to the 300 mutation rate LudwigK cited from page 34 of Sanford's book.
page 34, Genetic Entropy: One of the most astounding recent findings in the world of genetics is that the human mutation rate (just within our reproductive cells) is at least 100 nucleotide substitutions (misspellings) per person per generation (Kondrashov, 2002). Other geneticists would place this number at 175 (Nachman and Crowell, 2000). These high numbers are now widely accepted within the genetics community. Furthermore, Dr. Kondrashov, the author of the most definitive publication, has indicated to me that 100 was only his lower estimate -- he believes the actual rate of point mutations (misspellings) per person may be as high as 300 (personal communication). Even the lower estimate, 100, is an amazing number, with profound implications. When an earlier study revealed that the human mutation rate might be as high as 30, the highly distinguished author of that study concluded that such a number would have profound implications for evolutionary theory (Neel et al. 1986). But the actual number is now known to be 100-300! Even if we were to accept the lowest estimate (100 mutations), and further assumed that 97% of the genome is perfectly neutral junk, this would still mean that at least 3 additional deleterious mutations are occurring per person per generation. So every one of us is a mutant, many times over! What type of selection scheme could possibly stop this type of loss of information? As we will see -- given these numbers, there is no realistic method to halt genomic degeneration. Since the portion of the genome that is recognized as being truly functional is rapidly increasing, the number of mutations recognized as being actually deleterious is also rapidly increasing. If all the genome proves functional, then every one of these 100 mutations per person is actually deleterious. Yet even this number is too small, firstly because it is only the lowest estimate, and secondly because it only considers point mutations (misspellings). Not included within this number are the many other types of common mutations -- such as deletions, insertions, duplications, translocations, inversions, and all mitochondrial mutations.
scordova
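[The load arithmetic in the quoted passage is easy to tabulate. A minimal sketch using the quote's own mutation counts (100-300 per person per generation) and varying the assumed functional fraction of the genome -- the fractions other than 3% are my illustrative assumptions:]

```python
# Deleterious mutations per person per generation = total new point mutations
# x the fraction of the genome assumed functional (the quote's reasoning).
for total in (100, 175, 300):
    for functional in (0.03, 0.50, 1.00):
        print(f"{total} mutations x {functional:.0%} functional = "
              f"{total * functional:.0f} deleterious per person per generation")
```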
LudwigK wrote: As for my background, if you find it relevant here goes. I’m Portuguese, I graduated in chemistry, did my master’s degree in computer science, and my PhD in structural biochemistry. I currently do research in bioinformatics and I’m an assistant professor in the computer science department at the New University of Lisbon. Here’s my homepage: http://centria.di.fct.unl.pt/~ludi/
Thank you, I appreciate your participation. I sensed you were educated in a relevant field. For the readers' benefit, let me point out what I think you meant by your earlier comment:
Population size is not an issue. A random mutation shows up with a given probability per organism, so the same fraction of the population will bear the mutation regardless of how large the population is.
I believe you were referring to Neutral Theory (Kimura), not Selectionist theory (Haldane). From my population genetics textbook by Hartl and Clark, page 316:
The steady-state rate at which neutral mutations are fixed in a population equals mu, where mu is the neutral mutation rate. It is noteworthy that the equilibrium rate of fixation does not involve the population size N. The reason is that N cancels out: the overall rate is determined by the product of the probability of fixation of new neutral mutations (1/(2N)) and the average number of new neutral mutations arising in each generation (2N·mu), hence (1/(2N)) × (2N·mu) = mu
For the readers this seems extraordinarily counterintuitive. A relevant mathematical theorem, which Michael Lynch's colleague Allan Force directed me to, is the theory of gambler's ruin. See: Gambler's ruin. It is from that formula that the probability of fixation (1/(2N)) is computed. For the reader's benefit: this method of transforming a population via neutral evolution is definitely NON-Darwinian. James Crow and his brilliant students (like Kimura) were so effective at pounding Haldane's dilemma and arguing equations like the above that non-Darwinian neutral theory was able to overcome the obvious resistance to its implications. But this "fix" by the neutralists to circumvent certain speed limits imposed by Haldane's dilemma comes at a price, and that is the subject of the next post. scordova
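[To see the Hartl and Clark cancellation concretely, here is a minimal Wright-Fisher drift sketch -- my own illustrative code, with N and the replicate count chosen arbitrarily -- estimating the fixation probability of a single new neutral mutation. Gambler's ruin predicts 1/(2N):]

```python
import random

def neutral_fixation_fraction(N=50, replicates=10_000, seed=1):
    """Follow one new neutral allele (1 copy among 2N) under binomial drift
    until it is lost (0 copies) or fixed (2N copies)."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(replicates):
        copies = 1
        while 0 < copies < 2 * N:
            p = copies / (2 * N)                      # current allele frequency
            copies = sum(rng.random() < p for _ in range(2 * N))
        fixed += copies == 2 * N
    return fixed / replicates

print(neutral_fixation_fraction())  # ~0.01, i.e. about 1/(2*50)
# Substitution rate = (2N*mu new neutral mutations per generation)
#                   x (1/(2N) chance each one fixes) = mu, independent of N.
```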
Salvador: I'm not tracking this discussion closely yet. However, are you using Walter's latest paper? I'd think you were. If not, I'm sure it may add to this discussion. http://www.creationresearch.org/crsq/articles/43/43_2/cost_substitution.htm JGuy
I do appreciate the informative and civil discussion here. For the reader's benefit, let me try to frame the issues as far as the speed limits of evolution and the possible avenues (highways) evolution can take, with the speed limits associated with each. 1. One nucleotide "trait" per 300 generations (Haldane's dilemma for a single nucleotide). 2. One gene "trait", thus multiple nucleotides (say 720), per 300 generations (Haldane's dilemma for a single gene). This looks a bit more promising than #1 at first glance, but it has problems of its own, such as the probability of gene activation and utility. Since this probability is remote (as in making a functional, activated, useful protein), it could effectively be slower than #1. Further discussion is invited on this issue of multi-nucleotide selection. This will not account, however, for the numerous single-nucleotide changes that get fixed in a species' genome, or for nucleotides in apparently selectively neutral, non-coding regions. 3. Neutral evolution (non-Darwinian), where the fixation rate is equal to the neutral mutation rate mu (see Hartl and Clark, Principles of Population Genetics, page 316). This can be for single nucleotides or multiple nucleotides. It is far more promising than #1 or #2, but the problem of deep conservation and various statistical tests may contradict the claim of neutral evolution. Much of the criticism of neutral evolution comes from the selectionists; the neutralists in turn confront the selectionists with Haldane's dilemma. More discussion is invited on the contradictions to neutral theory and the speed limits it faces. I at least am sympathetic to neutral evolution, as it demonstrates theoretically that Darwin was wrong. However, its proposed alternate mechanism (neutrality instead of selectivity) has its own problems. 4. The Evo-Devo rapid evolution scenario. This scenario avoids some serious evolutionary speed limits and accounts for the deep conservation. But the fast car of Evo-Devo may come at a cost. Some ID proponents (like the guys at Telic Thoughts) are sympathetic to Evo-Devo. But I point out that for Evo-Devo to have a chance of working, front-loaded design is strongly indicated. Not any old architecture can evolve into the diversity of life we see today. The burden then is explaining how all that complexity got packed into the ancestral organism. When GeoMor mentioned hitchhiking, what was in the back of my mind was some Evo-Devo scenario where a superior organism overtakes a population in a few generations. This would be a good discussion, but because Evo-Devo is so deep, I would ask that we postpone it till later in the thread, or simply to another thread altogether. 5. Special creation. There would obviously be no speed limit here. Within the ID community are advocates of front-loading vs. special creation. #4 is substantive enough that, for the sake of argument, I would support front-loading as a working hypothesis (despite my personal views about special creation). But in brief, Paul Nelson poses good arguments from morphology that are serious problems for Evo-Devo. Supportive of special creation would be any empirical confirmation of the ideas in Sanford's Genetic Entropy. It is also possible there is some combination of special creation with front-loading. There is much sympathy in the ID community for this synergy between special creation of ancestral forms followed by front-loaded evolution; that is the modern view of the Baraminology Study Group, of which Richard Sternberg was a part. scordova
Just couldn't help noticing how much can be learned when toejam, raging bee, and Rev. Flank are not in the room. Terrific discussion. Sorry for interrupting. chunkdz
Along with Haldane's and Nachman's arguments, which I think are good enough refutations at the micro level (genetic and protein alterations), there also needs to be some talk of macro/systems-level changes and the ability of evolution to do those things in the time available. If you list all the macro-level differences between chimps and humans, it is still rather astonishing. Were eye-color variation and hair distribution selected for individually or at the same time? Most models of evolution focus on the plausibility of one trait evolving within one time span, like the posters of horses changing in just size, jawbone migration, visual acuity, etc. Given the 7-million-year divergence, if there are only 7,000 macroscopic differences, that would allot approximately 1,000 years for the selection of each trait. Of course this brings to mind multiple traits being selected for at one time, say "survival in the ice age," but what are the limits of selectability here? The genome doesn't mutate that much per generation to give, say, 20 advantages, does it? What's the limit? Can you select for pH regulation, temperature regulation, hair distribution, metabolic rate, and intelligence all at the same time? Can you do it without significant drawbacks like sickle cell and malaria? There just seems to be no limit assumed on how many things can be selected for at one time. You need a lot of "gimmes" looking at it this way, i.e., "Wow, my thenar muscles, lung cartilage, language acquisition, and immunoglobulins were selected for at the same time - thank you, evolution!" jpark320
Regarding the conserved regions, I think there's some misunderstanding. Haldane's model is for a mutation that is present in only a small fraction of the population increasing in frequency until it fixates by eliminating all other (previously more common) variants. This is what Haldane estimated to take a few hundred generations if done by natural selection. Conserved regions are what you get when this does not happen, i.e., when new mutations fail to eliminate the common, established variant, and so this more common variant lasts for many generations. Even a modest selection pressure against new (and thus very rare) mutations is enough to keep a common gene from disappearing. LudwigK
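To put a number on how effectively even modest purifying selection polices a common variant, here is a minimal deterministic sketch (a standard haploid textbook recursion, not anything from Haldane's paper itself; the selection coefficient and starting frequency are illustrative assumptions):

```python
# Standard deterministic haploid recursion for selection against a rare
# deleterious variant: p' = p(1 - s) / (1 - s*p), where p is the mutant
# frequency, the resident allele has fitness 1, and the mutant has 1 - s.
# The values of s and the starting p below are illustrative assumptions.

def next_freq(p: float, s: float) -> float:
    return p * (1 - s) / (1 - s * p)

s = 0.01     # modest selection against the new mutant (assumed)
p = 1e-4     # a brand-new mutation starts out very rare (assumed)

for gen in range(1001):
    if gen % 200 == 0:
        print(f"generation {gen:4d}: mutant frequency {p:.2e}")
    p = next_freq(p, s)
```

The mutant's frequency shrinks by roughly a factor of (1 - s) per generation, so after a thousand generations it is essentially gone; persistence of the common variant is exactly what this model predicts.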
Let me suggest to the reader a very important area of research, and one that might be amenable to computer simulation: the probability of deep conservation in the absence of selection. A falsifiable claim of Sanford's Genetic Entropy is that the overall number of SNPs will increase in neutral or nearly neutral regions. scordova
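Such a simulation is easy to sketch. Below is a minimal version of the neutral case (the region length, substitution rate, and timescale are all illustrative assumptions): with no selection at all, each site drifts away from the ancestral state at roughly the neutral substitution rate, so per-site identity decays like exp(-mu*t).

```python
import numpy as np

# Minimal neutral-erosion sketch: a region of L sites with no selection
# at all, where each site diverges at the neutral substitution rate MU
# per generation. L, MU, and the timescale are illustrative assumptions.

rng = np.random.default_rng(1)
L = 10_000            # length of the putatively conserved region (assumed)
MU = 1e-6             # per-site neutral substitution rate (assumed)
GENERATIONS = 2_000_000

intact = L
for gen in range(1, GENERATIONS + 1):
    # number of still-ancestral sites hit by a substitution this generation
    intact -= rng.binomial(intact, MU)
    if gen % 500_000 == 0:
        print(f"gen {gen:,}: {intact:,}/{L:,} sites ancestral "
              f"({100 * intact / L:.1f}% identity)")
```

With these assumed numbers (mu*t = 2), only about 14% of sites remain ancestral, which is the sense in which deep conservation over such timescales, absent selection, would be surprising.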
Even if you assume away such events, nucleotide sites do not get fixed independently of each other. If you have a very strong selective sweep on a particular site, nearby (possibly neutral, possibly even somewhat detrimental) characters are likely to come along for the ride — so-called “genetic hitchhiking”. The stronger the sweep, the better the chance that farther away sites will also become fixed.
That's an outstanding observation; however, a consideration in the idea of "fixation" is also the idea that the fix is maintained, or "conserved". Your suggestion is obviously far more plausible than one nucleotide per generation being fixed by natural selection. However, if we assume hitchhiking, how much of the hitchhiked region can be lost during meiotic recombination and shuffling? Furthermore, without the policing of natural selection, it is hard to justify the existence of deeply conserved sequences. Random mutational noise will begin to erode these regions of conservation (between species) or uniformity (within the same species). I point you to the enigma mentioned here: Life goes on without vital DNA
Haussler's team recently described "ultra-conserved regions" in mammals. The level of conservation was even higher than that for many genes. "What's most mysterious is that we don't know any molecular mechanism that would demand conservation like this," Haussler says.
I think your comment is one of the better criticisms of my argument and something the readers should take note of. However, the hitchhike mechanism you suggest is vulnerable to the fatal flaw which deep conservation poses. scordova
Scordova, It's on page 34, where Sanford mentions various estimates for the number of nucleotide substitutions per person per generation. His estimates range from 100 to 300. I know that these values are disputed, but it's in the ballpark to explain the differences between the human and chimp genomes. I'm not sure what you mean by selectionists (selection is only one mechanism, not the whole theory). I do understand that only a tiny fraction of these mutations can be fixated by selection pressure, but neutral evolution mechanisms easily account for the majority of them. And note that most of our genes are not fixated (there are several different alleles present in the population); otherwise we would all have the same eye colour, skin tone, height, build, hair, etc. As for my background, if you find it relevant, here goes: I'm Portuguese, I graduated in chemistry, did my master's degree in computer science, and my PhD in structural biochemistry. I currently do research in bioinformatics and I'm an assistant professor in the computer science department at the New University of Lisbon. Here's my homepage: http://centria.di.fct.unl.pt/~ludi/ LudwigK
Sal, I appreciate the idea of a "maximum rate of fixation of nucleotide changes", but we'll just never get there in the foreseeable future. As some at PT already pointed out, nucleotides do not get fixed independently of each other; when you have a gene duplication or a rearrangement or something like that, it affects thousands/millions of nucleotides in one shot. Even if you assume away such events, nucleotide sites do not get fixed independently of each other. If you have a very strong selective sweep on a particular site, nearby (possibly neutral, possibly even somewhat detrimental) characters are likely to come along for the ride -- so-called "genetic hitchhiking". The stronger the sweep, the better the chance that farther-away sites will also become fixed. Stepping back, I think the situation is so complicated that even if you built some kind of mathematical model that incorporated these various effects, it would be so complex and assumption-ridden that a negative result from it would make you question the model long before you'd question the power of evolution. Sort of like the Drake equation -- you can come up with half-plausible values for all the free parameters, but can you really trust the result? GeoMor
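For readers who want to see hitchhiking in action, here is a minimal two-locus toy model (a haploid Wright-Fisher sketch; the population size, selection coefficient, and recombination rate are illustrative assumptions, not estimates for any real genome):

```python
import numpy as np

rng = np.random.default_rng(0)
N, s, r = 1000, 0.05, 0.001   # population size, selection, recombination (assumed)

def run_sweep():
    # Haploid two-locus Wright-Fisher toy model. Column 0 is the selected
    # locus; column 1 is a linked neutral locus. The beneficial mutation
    # arises on a single chromosome that happens to carry neutral allele 1.
    pop = np.zeros((N, 2), dtype=int)
    pop[0] = (1, 1)
    while 0 < pop[:, 0].sum() < N:
        fitness = 1.0 + s * pop[:, 0]
        p = fitness / fitness.sum()
        offspring = pop[rng.choice(N, size=N, p=p)].copy()
        # with probability r an offspring takes its neutral allele from a
        # second, independently chosen parent (a recombination event)
        recomb = rng.random(N) < r
        donors = rng.choice(N, size=N, p=p)
        offspring[recomb, 1] = pop[donors[recomb], 1]
        pop = offspring
    return pop

# most new beneficial mutations are lost to drift, so retry until one sweeps
pop = run_sweep()
while pop[0, 0] == 0:
    pop = run_sweep()

print("neutral allele frequency after the sweep:", pop[:, 1].mean())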
Patrick, Thank you. Many of my critics have claimed I run from the issues. This open dialogue will give me a chance to defend myself and have my day in court. I look forward to the opportunity. Sal scordova
The answer is quite simple, for an approximate estimate. Most mutations that fixate (without being eliminated) are neutral. For these the fixation rate turns out to be equal to the mutation rate (and independent of population size, one of the points I made above). Using the numbers Sanford mentions in Genetic Entropy, that’s about 100 nucleotides fixed per generation.
Thank you for the response; can you give the page number in Sanford's book? Also, feel free to say something of your background and research interests for the benefit of our readers. Before I look up that page in Sanford's book, let me say that the selectionists probably have an issue with that 100-nucleotide figure, and it is tied exactly to the problem of conserved sequences and uniform regions. The selectionists, in other words, have found a fatal contradiction in neutralist theory; the neutralists likewise have found fatal contradictions in selectionist theory. But before I post more on this, I'd like to get the page numbers in Sanford's book. scordova
“And you know what? No one knows whether or not any mutation/ selection process can account for all the physiological and anatomical differences observed between chimps and humans.”
Caligula: Indeed. This is one of the points I have made to Sal. We currently do not have the foggiest as to how many adaptations are needed to account for these differences.

My point is we don't know if any number of adaptations can do it. As far as we know, wobbling stability is the norm; that is, allele frequencies oscillate and no real "evolution" occurs.

Caligula: Since you start your post like a news-bearer, could you, perhaps, tell us this number of extreme interest?

I'm not sure I follow, but the following site gives us the basics pertaining to the physiological and anatomical (as well as the genetic) differences between chimps and humans: Chimps and humans, explore the differences Joseph
Sal, I just released several of caligula's comments from the spam filter; check them out above. Patrick
Dave, I'm afraid I do not see how my statement was so hopelessly wrong, as I was merely referring to the mutation rate, a standard parameter in population genetics models. Scordova, The answer is quite simple, for an approximate estimate. Most mutations that fixate (without being eliminated) are neutral. For these the fixation rate turns out to be equal to the mutation rate (and independent of population size, one of the points I made above). Using the numbers Sanford mentions in Genetic Entropy, that’s about 100 nucleotides fixed per generation. Note that Haldane was only referring to those mutations that are fixated exclusively by selection pressures. LudwigK
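The cancellation LudwigK describes can be written in one line: about 2N x mu new neutral mutations enter a diploid population each generation, each fixing with probability 1/(2N), so the fixation rate is 2N x mu x 1/(2N) = mu, independent of N. The 1/(2N) fixation probability itself is easy to check with a minimal Wright-Fisher sketch (the population size and trial count are arbitrary choices for illustration, kept small so the runs finish quickly):

```python
import numpy as np

# Checking that a single new neutral mutation fixes with probability
# 1/(2N) under Wright-Fisher drift. N and TRIALS are illustrative choices.

rng = np.random.default_rng(2)
N = 50                 # diploid population size (assumed)
COPIES = 2 * N         # gene copies in the population
TRIALS = 20_000

fixed = 0
for _ in range(TRIALS):
    count = 1          # one brand-new neutral mutation
    while 0 < count < COPIES:
        # neutral Wright-Fisher resampling of the allele count
        count = rng.binomial(COPIES, count / COPIES)
    fixed += (count == COPIES)

print(f"observed fixation probability: {fixed / TRIALS:.4f}")
print(f"theory, 1/(2N):                {1 / COPIES:.4f}")
```

Multiplying that 1/(2N) by the roughly 2N x mu new neutral mutations arising per generation cancels the N, which is the population-size independence LudwigK refers to.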
One scientifically falsifiable claim that I'm offering here is that the average number of single-nucleotide polymorphisms (SNPs) in the human genome will increase over time. That is a fundamental thesis of Genetic Entropy and is scientifically falsifiable. If one googles SNPs, one will see it is a hot topic in medical research. This topic has medical significance in addition to its bearing on questions about origins. Sal scordova
DaveScot wrote: That's so hopelessly wrong I hardly know where to begin. I think you need to find a different blog.
I don't necessarily want to scare away participants, but I don't want this good discussion spammed either. One way to gauge the worthiness of a comment more objectively is whether it addresses the fundamental issue of evolutionary speed limits: speed limits defined in terms of nucleotides per generation in the highly conserved or intraspecific regions of each species' genome. Humans have 99.5% uniformity in their genomes; that's about 180,000,000 nucleotides that are identical within humans but different from chimps. Whether they can be established as morphologically significant or not at this time, this level of uniformity poses a problem for evolutionary speed limits. Recall that Darwinism relies on the introduction of noise into the genome; the fact that we have the antithesis of noise in these uniform regions is problematic for Darwinism in a big way. Few have appreciated the significance of this enigma. If LudwigK can offer an estimate of the number of nucleotides per generation being fixed, then I would welcome his response. He can post it briefly. Is the answer 1, 2, 3... or .01, .001, .0001? Haldane gave an optimistic number of .0033. Sal scordova
shaner74, It would be nice to have Walter here, that's for sure. Out of respect for his time, I'll call upon him when I think we're in need of a technical clarification. The ARN experience was very negative for him, as all sorts of sock puppets were just spamming the thread. Many of his responses from then are relevant here, and when I'm in need of something new that he hasn't written already, I'll write him and ask him to pay us a visit. Before I do, I'll actually have to study his most recent paper in depth. It would be embarrassing if he took the time to visit and I hadn't even shown enough respect for his work to have studied his paper in depth. So in that regard I'll have to wait at least a little. Sal scordova
Caligula wrote: Most importantly, you are here repeating claims that I believe were given in-depth responses, which you clearly did not even attempt to refute. I hope you will attempt to do so here.
I did not attempt to address them for lack of time, and if my claims are wrong, it does not serve the ID movement to let erroneous ideas persist. If you find something wrong, we should discuss it; it would benefit the pro-ID proponents reading this weblog. This thread may slip off the front page, but I intend to keep the discussion going. The material is important to some of our readers who are in active debate on the campuses and in churches. If you find areas that are substantially incorrect in my argument, you will have done me and our readers a great service. One thing that would be helpful is if we focus on one bite-size issue at a time; otherwise our comments will be hopelessly long. Speak freely, and now that you have joined the weblog, if you wish to state a laundry list of your concerns in brief form, you may. For example:

1. Sal's interpretation of Nachman
2. multi-nucleotide vs. single-nucleotide hypothesis
3. well-stirred population

etc. You don't need to make the items immediately understandable; the point is just to keep an accounting of important topics and keep the issues focused. At some point, when you and I feel a topic has reached an impasse and we've said about all we can say given the data, we can declare that we must simply disagree on that point and move on. I want to emphasize that even though we're on opposing sides, I welcome hearing substantive criticisms, as our readers will want to hear those criticisms addressed in a substantive way. OK, before we move on: a fundamental measure of speed limit is the maximum number of nucleotides fixed per generation. Can we agree that is a fundamental question? If you have another fundamental question, please state it. In sum, you can take two postings to state: 1. what you believe the fundamental questions are, and 2. a laundry list of topics you feel I did not address. You don't need to explain each point, just a few words; we can elaborate on them in subsequent posts. This is mostly for accounting purposes. It is my hope that our discussion will serve as a model for future technical exchanges. I hope the mods will permit a little leeway. I don't want to give the impression that I'm running away from arguments by banning people. If I do find someone worth banning, I'll say so. Salvador scordova
Walter ReMine does not debate me, because I openly admit that I (a) do not have a relevant scientific background, (b) am quite new to the topic, and (c) have not published, nor ever will publish, anything concerning the topic. Also, there are quite a few relevant concepts that ReMine refuses to debate, calling them "confusion factors". I strongly object to that attitude. "Dr" ReMine in fact has a very similar background to mine: he is an electrical engineer, not a Ph.D. in the first place, certainly doesn't have any more relevant education than I do, and has not published anything in a peer-reviewed journal either. As for "confusion factors", terms such as "avoiding extinction", "mortality", and (various types of) "density-dependent selection" are highly relevant in spite of ReMine's protests. Indeed, "avoiding extinction" is the one and only concept on which ReMine can attempt to base his "magic number" of 1667. Haldane (1957) is all about avoiding extinction under intense external selection pressures. Whether Walter wants to discuss "cost" in terms of "reproductive excess" rather than "genetic deaths" is of no consequence to me, however. I can show that under density-dependent selection the total cost of evolution need not increase under even extremely intense selection -- using the cost concept advanced by Walter, the cost of substitution is paid by ReMine's "effective producers" instead of by the reproductive excess of the population or any subpopulation within it. caligula
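For readers wondering where the figure of 1667 comes from: as the number is usually derived, it is simply Haldane's limit of one substitution per 300 generations applied over the roughly 10 million years since the human/chimp split, with an assumed 20-year generation time. A sketch of that arithmetic:

```python
# ReMine's widely quoted "magic number", as usually derived: Haldane's
# ~1 substitution per 300 generations applied over the time since the
# human/chimp split. The 10 My span and 20-year generation time are the
# customary round-number assumptions, not measured values.

YEARS = 10_000_000
GENERATION_YEARS = 20
GENERATIONS_PER_SUBSTITUTION = 300

generations = YEARS / GENERATION_YEARS
substitutions = generations / GENERATIONS_PER_SUBSTITUTION
print(f"{generations:,.0f} generations -> {substitutions:,.0f} substitutions")
# -> 500,000 generations -> 1,667 substitutions
```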
Sal, Did you invite Dr. ReMine to this discussion? I know he posts at ARN from time to time, and I recall a huge thread where he answered dozens of questions regarding Haldane’s dilemma. shaner74
"And you know what? No one knows whether or not any mutation/ selection process can account for all the physiological and anatomical differences observed between chimps and humans." Indeed. This is one of the points I have made to Sal. We currently do not have the foggiest as to how many adaptations are needed to account for these differences. Since you start your post like a news-bearer, could you, perhaps, tell us this number of exterme interest? caligula
Caligula: Differences between human and chimp can be and have been calculated using many different criteria.

And you know what? No one knows whether or not any mutation/selection process can account for all the physiological and anatomical differences observed between chimps and humans. And again, there isn't any way to objectively test the premise that chimps and humans shared a common ancestor. Joseph
EJ Klone: Why do some IDists insist on highlighting the naturalistic aspect of evolution? Isn't ID also compatible with naturalism? A human genetic engineer would be designing characteristics into organisms, yet what they are doing is totally natural (as opposed to supernatural).

Natural/supernatural is irrelevant, because even the anti-ID position requires something beyond nature to account for its origins. The debate concerns intentional design vs. sheer dumb luck.

EJ Klone: Talking about the limits of evolution is negative argumentation.

If an existing paradigm is false, it should be exposed.

EJ Klone: Even if evolutionary scientists somehow demonstrate that unguided evolution was possible, it doesn't mean that that is actually what happened. We need to focus on a positive argument for design.

If what you say does occur, then the only positive argument for ID would be to present the designer.

EJ Klone: Salvador, which do you consider ID to be, an investigation about how possible evolution is, or an investigation into whether or not certain aspects of life were designed? The two are different things.

That depends on what you mean by "evolution", as it has several meanings. Joseph
Is "Wobbling Stability" a speed limit or an obstacle? Chapter IV of prominent geneticist Giuseppe Sermonti's book Why is a Fly Not a Horse? is titled "Wobbling Stability". In that chapter he discusses what I have been talking about in other threads- that populations oscillate. The following is what he has to say which is based on thorough scientific investigation:
Sexuality has brought joy to the world, to the world of the wild beasts, and to the world of flowers, but it has brought an end to evolution. In the lineages of living beings, whenever absent-minded Venus has taken the upper hand, forms have forgotten to make progress. It is only the husbandman that has improved strains, and he has done so by bullying, enslaving, and segregating. All these methods, of course, have made for sad, alienated animals, but they have not resulted in new species. Left to themselves, domesticated breeds would either die out or revert to the wild state—scarcely a commendable model for nature’s progress.
he goes on to say:
Natural Selection, which indeed occurs in nature (as Bishop Wilberforce, too, was perfectly aware), mainly has the effect of maintaining equilibrium and stability. It eliminates all those that dare depart from the type—the eccentrics and the adventurers and the marginal sort. It is ever adjusting populations, but it does so in each case by bringing them back to the norm. We read in the textbooks that, when environmental conditions change, the selection process may produce a shift in a population’s mean values, by a process known as adaptation. If the climate turns very cold, the cold-adapted beings are favored relative to others; if it becomes windy, the wind blows away those that are most exposed; if an illness breaks out, those in questionable health will be lost. But all these artful guiles serve their purpose only until the clouds blow away. The species, in fact, is an organic entity, a typical form, which may deviate only to return to the furrow of its destiny; it may wander from the band only to find its proper place by returning to the gang. Everything that disassembles, upsets proportions or becomes distorted in any way is sooner or later brought back to the type. There has been a tendency to confuse fleeting adjustments with grand destinies, minor shrewdness with signs of the times. It is true that species may lose something on the way—the mole its eyes, say, and the succulent plant its leaves, never to recover them again. But here we are dealing with unhappy, mutilated species, at the margins of their area of distribution—the extreme and the specialized. These are species with no future; they are not pioneers, but prisoners in nature’s penitentiary.
The point being that IF it were left to direct scientific observations, evolutionism fails miserably, and all that is left is wishful thinking supported by speculation. And the fact remains that we don't know whether or not any mutation/selection process can account for the changes required by Common Descent from some single-celled LUCA. That means we don't have any way to objectively test the premise, which puts evolutionism beyond the realm of science. Joseph
ludwig: "A random mutation shows up with a given probability per organism, so the same fraction of the population will bear the mutation regardless of how large the population is."

That's so hopelessly wrong I hardly know where to begin. I think you need to find a different blog. DaveScot
I think there are a few problems with this analysis. Population size is not an issue: a random mutation shows up with a given probability per organism, so the same fraction of the population will bear the mutation regardless of how large the population is. Haldane's scenario was a catastrophic reduction in population size due to some event that killed off most organisms (climate change, disease, etc.) and made this mutation an advantage. Other scenarios can result in faster fixation rates. Fixation occurs when one allele replaces all alleles for that locus in the population. This is not a requirement for evolution (any change in gene frequencies is evolution). Neutral evolution is well supported by observation, and not just a political move. We know that many different DNA sequences code for the same protein sequence, that many different protein sequences produce the same physiological effects, and that many different traits have no impact on reproductive success. The exact number of facial hairs, exact hair or eye colour, exact height or build does not significantly affect the number of children we have. And the reason there is so much diversity in these traits is precisely because fixing them (i.e., wiping out all variants except one for each gene) is a slow process. But, again, it is not necessary for this to occur for evolution to carry on. LudwigK
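Part of the disagreement here may be two different questions hiding in one sentence. The expected fraction of newborns carrying a given recurrent mutation each generation is just the per-organism mutation rate, independent of N, as LudwigK says; but the fate of any single new mutant copy does depend on N, since it starts at frequency 1/(2N). A sketch with an assumed rate:

```python
# Two population-size questions that are easy to conflate. MU is an
# assumed per-organism, per-generation mutation rate for one site.

MU = 1e-8

for N in (1_000, 1_000_000, 1_000_000_000):
    carriers_per_gen = MU * N        # expected newborn carriers per generation
    fraction = carriers_per_gen / N  # fraction of the population: always MU
    start_freq = 1 / (2 * N)         # frequency of a single new diploid mutant
    print(f"N={N:>13,}: carriers/gen = {carriers_per_gen:.0e}, "
          f"fraction = {fraction:.0e}, single-mutant start freq = {start_freq:.1e}")
```

Both columns are right at once: the fraction of newborn carriers is population-size-independent, while the spread of any one mutant copy (Haldane's question) is not.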
Sal, I promised to address your question about the fixation of gene duplications. I have now read an article by Fyodor Kondrashov (2002), who seems to be a current authority on the topic. He has written a more recent one (2006), but I could only find the abstract. I suggest you look them up. The basic ideas themselves are not new -- they are exactly what a layman could come up with: either gene duplications become fixed by drift alone, or at least some of them are selected because they affect the dosage of the coded protein in our cells. Rather than presenting radical new ideas, Kondrashov presents data. This data challenges the pure "neutralist" view advanced by Susumu Ohno and accepted as the "null hypothesis", or default assumption. In Kondrashov (2002), both original research and references are provided, amounting to an impressive collection of data, to support the view that a duplicate of a gene is, in many, many cases, not redundant, and that selection instead applies to both copies of the gene. caligula
The one thing these guys never deal with is the fact that temporary benefits may be the building blocks of a future bomb. Without design or guidance you have failure. vpr
Sal, First, thank you for your invitation to continue here at UD the discussion which started at PT. The above article of yours is not exactly compatible with the discussion that inspired it at PT: http://www.pandasthumb.org/archives/2007/01/dissent_out_of.html which I tried to summarize in (#155924). Some of the 4 subtopics that I identify in the summary have been omitted by you here, and some new ones have been added. Most importantly, you are here repeating claims that I believe were given in-depth responses, which you clearly did not even attempt to refute. I hope you will attempt to do so here. For starters, some comments about your article.

1. As you said yourself, a geographically complex population of 6.5 billion is not relevant when considering the evolution of Homo sapiens. Population sizes of the order 10,000 to 1,000,000 seem more realistic. Also, according to fossil evidence, it seems that this population has been geographically rather limited and well connected. In general, I suggest we acknowledge from the get-go that the current human civilization is not a typical wild population. In fact, I have pointed this out several times during our past exchanges.

2. I assume the figure of 180,000,000 SNPs is solely based on this SciAm article that you have quoted in an earlier writing: http://www.sciam.com/article.cfm?chanID=sa003&articleID=9D0DAC2B-E7F2-99DF-3AA795436FEF8039 Alas, you have misinterpreted the article. It does not refer to differences between human and chimp in terms of nucleotide differences over the entire genome. Note this key sentence: "In humans and chimps, which have about 22,000 genes each, the group found 1,418 duplicates that one or the other does not possess." (1418 / 22000 = 6%, roughly.) Differences between human and chimp can be and have been calculated using many different criteria, and differences calculated with two entirely different criteria can't be compared, of course. Talking in terms of SNPs, especially over the entire genome, is plainly and simply nonsensical here. Using the typical 1000-sites-per-gene estimate, the above article would mean that humans and chimps differ by about 1.5 million SNPs (as opposed to your erroneous 180 million!). Now, does this really mean that we differ from chimps by only 0.05%? Of course not. The difference in the number of coding genes that we do not share tells us neither how many nucleotide differences there are over the entire genome nor even how many SNP differences there are over the coding genes that we do share. You asked for a better figure? Alas, I'm not qualified to give an accurate one, but I believe typical figures would be 30-60 million sites (a 1-2% difference). Of these differences, the vast majority are considered morphologically insignificant.

3. Haldane's dilemma: I have said quite a bit about this to you at PT already, but I will repeat my thoughts here. I will do so in a separate post, in order to prevent individual posts from becoming unwieldy, as you have advised me.

4. Neutralists vs. selectionists: See 3. I will pay special attention to this sentence of yours: "How do we account for designs that cannot possibly be the result of natural selection?" Indeed, you will have to show us what such a design might be.

5. Nachman's paradox: See 3.

6. Teleomorphic Recursivity: This one I have to read more about. Do you have a link to the full text? caligula
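For the record, the arithmetic behind the two estimates in point 2 works out as stated (the inputs below are the assumptions given in the comment, plus the roughly 3-billion-base-pair genome size used earlier in this thread):

```python
# Checking the two ballpark figures in the comment above.
# All inputs are the assumptions stated there, not measured values.

GENES = 22_000
UNSHARED_DUPLICATES = 1_418
SITES_PER_GENE = 1_000            # typical-gene estimate (assumed)
GENOME_BP = 3_000_000_000         # approximate genome size used in this thread

print(f"unshared duplicates: {UNSHARED_DUPLICATES / GENES:.1%} of genes")
print(f"implied differing sites: {UNSHARED_DUPLICATES * SITES_PER_GENE:,}")
print(f"as a fraction of the genome: "
      f"{UNSHARED_DUPLICATES * SITES_PER_GENE / GENOME_BP:.2%}")
print(f"1-2% whole-genome divergence: "
      f"{0.01 * GENOME_BP:,.0f} to {0.02 * GENOME_BP:,.0f} sites")
```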
Sal, your comments about population genetics are very informative and clear. If you could collect them in one or a few PDF files, this could be useful to give others a quick reference for these ideas. kairos
I applaud you, Sal. I just went over there and saw the merciless (and logically weak) attacks they made on you. Keep it up. -John jpark320
Why do some IDists insist on highlighting the naturalistic aspect of evolution? Isn't ID also compatible with naturalism? A human genetic engineer would be designing characteristics into organisms, yet what they are doing is totally natural (as opposed to supernatural). Talking about the limits of evolution is negative argumentation. Even if evolutionary scientists somehow demonstrate that unguided evolution was possible, it doesn't mean that that is actually what happened. We need to focus on a positive argument for design. Salvador, which do you consider ID to be: an investigation into how possible evolution is, or an investigation into whether or not certain aspects of life were designed? The two are different things. EJ Klone
jpark320, The reason part of the discussion moved here was that PandasThumb is incredibly slow in displaying long discussions (as you saw yourself). Whenever I visit, I tend to generate a swarm of nasty comments, and then their system just gets bogged down trying to display all the vitriol. Sal scordova
Nevermind - forgot to scroll down oops... :X jpark320
Hey Sal, Is it possible to put forward some of those criticisms, or perhaps a link to the actual thread where the criticisms are? I clicked the link and only got the PT front page (do I need a PT registration to see the thread)? jpark320
Advocates of Darwinian mechanisms usually invoke "deep time" as a magic wand to explain away huge improbabilities. The only problem is that deep time is not deep enough. Actually, deep time is quite shallow -- only on the order of 10^18 seconds since the origin of time itself. Sorry, deep time won't do the trick. Try again. GilDodgen
