Uncommon Descent Serving The Intelligent Design Community

The Altenberg Sixteen


HT to Larry Moran’s Sandwalk for the link to this fascinating long piece by journalist Suzan Mazur about an upcoming (July 2008) evolution meeting at the Konrad Lorenz Institute in Altenberg, Austria.

“The Altenberg 16” is Mazur’s playful term for the sixteen biologists and theoreticians invited by organizer Massimo Pigliucci. Most are on record as being, to greater or lesser degrees, dissatisfied with the current textbook theory of evolution. Surveying the group, I note that I’ve interacted with several of the people over the years, as have other ID theorists and assorted Bad Guys. This should be an exciting meeting, with the papers to be published in 2009 by MIT Press.

Mazur’s article is worth your attention. Evolutionary theory is in — and has been, for a long time — a period of great upheaval. Much of this upheaval is masked by the noise and smoke of the ID debate, and by the steady public rhetoric of major science organizations, concerned to tamp down ID-connected dissent. You know the lines: “Darwinian evolutionary theory is the foundation of biology,” et cetera.

But the upheaval is there, and increasing in amplitude and frequency.

[Note to Kevin Padian: journalists don’t like it when you do this to them. Mazur writes:

Curiously, when I called Kevin Padian, president of NCSE’s board of directors and a witness at the 2005 Kitzmiller v. Dover trial on Intelligent Design, to ask him about the evolution debate among scientists – he said, “On some things there is not a debate.” He then hung up.

That hanging-up part…not so wise. If you’re going to say there’s no debate, explain why.]

Comments
bFast wrote: "I think that the general theme of the discussion here is that there must be some unknown preservative(s). We are not suggesting that these preservatives necessarily be direct acts of God, nor are they necessarily “unnatural” in any way." Nor did I suggest that they were. I think the short list of potential error correction mechanisms that I provided above is a first approximation to an answer to the apparent paradox that Alexey Kondrashov expressed when he asked "Why are we not dead 100 times over?"Allen_MacNeill
March 7, 2008 at 12:05 PM PDT
Right now I am reading a history of the genome by Henry Gee, and it starts with Aristotle and others trying to explain how new life is formed. I am up to Darwin, and the chapter after next is on Mendel, which is appropriate for my comments below. Gee said the problem with Darwin's ideas was always the source of variation. Natural selection works fine on the current gene pool, given enough time. But how does variation arise? This is relevant here since we have occasionally invoked Dr. MacNeill's 47 engines of variation. So a rightful area of inquiry is just how well these engines generate variation, or whether what they really generate is random, dysfunctional genomes. However, a second issue was raised by Gee: natural selection needs time, and lots of it, to do its work, and essentially what you get are variations of the original that look a little different and have some other modest changes to the phenotype. But here is another real problem, and it is where Dr. MacNeill is leading us. Are there ways to jump-start the genetic changes, or other organic changes, that are necessary to explain the more dramatic changes we have seen in the world and which would happen too slowly according to modern genetic theory? Dr. MacNeill's specialty is evolutionary psychology, which I personally have always looked upon as related to alchemy, astrology, etc. Suppose there were a psychological outlook that favored things such as religion, altruism, or some other desirable trait. If it were based on some genomic DNA combination, it could not be transmitted to future generations except through typical population genetics, and this takes ages. So it sounds like these other three dimensions being discussed by Jablonka and recommended by Dr. MacNeill are meant to pave the way for faster changes in various traits of species and have nothing to do with variation generation, which has been our perception of the real Achilles heel of the modern synthesis. So now the genetic half is under assault, and it seems at first glance that this is because it is necessary to implement faster changes in the genomes of species so that things like evolutionary psychology or other pet theories can be viable.jerry
March 7, 2008 at 12:02 PM PDT
I have a question. Is it only the female gamete cells that are separated off early in development? Aren't sperm cells constantly being produced from germ cells, and as such aren't these cells subject to mutations as much as any other cell?jerry
March 7, 2008 at 10:08 AM PDT
Allen_MacNeill:
This entire discussion has circled around an elephant in the room: the fact that we are here provides prima facie evidence that there is clearly something wrong with the “genetic entropy” hypothesis (unless one agrees with Dr. Sanford that the universe and everything in it is less than 10,000 years old). IOW, as DaveScot has pointed out, there must be at least one mechanism that compensates for the surprisingly rapid decay of the genome over time.
I think you will find that we all are of the mind that Sanford has taken a very exaggerated view of the genetic entropy problem. However, I still see a fundamental problem once we experience 1 mutation in active DNA (DNA that does something, and, for that matter, epigenetic material that does something) per generation (birth to birth of offspring (b to b)). As far as how many mutations a human has, I think there's a lot to be said for identical twin studies. If, as has been suggested, the germ cells are separated off early in the development cycle of mammals, then let the seed of two identicals be analysed for differences. We will then have an empirically determined count of mutation rate per (b to b) generation. I think that the general theme of the discussion here is that there must be some unknown preservative(s). We are not suggesting that these preservatives necessarily be direct acts of God, nor are they necessarily "unnatural" in any way.bFast
March 7, 2008 at 08:39 AM PDT
I made the comment on another thread a couple of days ago that the species of the world seem quite healthy, especially humans, as we live longer and are heartier when we are fed correctly. Species extinctions seem more due to human interference than to the viability of the line deteriorating. So those biologists predicting doom and gloom for the species of the world due to genetic mutations seem to lack any empirical evidence. We have descriptions of humans and other animals going back over 4000 years, or 2/3 of the supposed history of the earth, and all is fine. While we may not be better than our Greek or Persian ancestors, we certainly are not worse. Of course those were the times of heroes such as Achilles, Odysseus, Roland, Gilgamesh and King Arthur, and maybe we are on a downhill slide. No more supermen; we only read about them in stories like the Iliad, Beowulf, etc. Oh for the good old days when humans were giants.jerry
March 7, 2008 at 08:07 AM PDT
DaveScot wrote: "Proofreading can only be done as long as you have an original copy to compare to the new copy." Not necessarily; the copy that is used for the proofreading almost certainly comes from the set that was provided from the other parent following fertilization. During the first division of meiosis (Meiosis I), the two chromosomes that make up each homologous pair (i.e. one from each parent) line up in register, a process called synapsis. They remain in this condition for a surprisingly long period of time (indeed, in female mammals, it lasts from before birth until the eggs are fertilized, which in humans can be longer than 40 years). This combination of two double-stranded chromosomes is called a "tetrad". While the homologous chromosomes are lined up "in register" (meaning the genes on the two copies are lined up next to each other) a large protein complex, called the recombination complex, works its way along the tetrad. Parts of the complex, called recomination enzymes, check for differences between the two copies. When a difference is detected, it can be corrected using the undamaged code in one of the other strands. How can undamaged code be recognized? There are several mechanisms, all having to do with specific sequences (especially in promoters). And, of course, sometimes the "proofreading" recombination complex makes a mistake and uses a "bad" copy as the template for a repair. In many cases, this is caught, as it eventually causes the release of a chemical signal that stops the completion of meiotic division. If it isn't caught, a "bad" gamete gets made, but this will presumably be eliminated by phenotypic selection. IOW, there is indeed a whole set of "proofreading" mechanisms in eukaryotes that tends to reduce the frequency of deleterious mutations actually making their way into a population via sex and reproduction. None of these proofreading mechanisms are accounted for in Dr. Sanford's mathematical models of "genetic entropy", which apply only to point mutations in DNA sequences. This is yet another reason to suspect that the explanation for why such models do not match observed reality (i.e. the fact that eukaryotes, including us, are still around) is that they do not model reality precisely enough to be meaningful. Interesting yes, but irrelevant to the analysis of actual biological reality.Allen_MacNeill
March 7, 2008 at 08:04 AM PDT
DaveScot wrote: "Primordial germ cells, on the other hand, differentiate very early in embryonic development. Presumably that’s to limit the number of downstream replications and thus limit the number of DNA replication errors." Exactly right, as is your analysis of the etiology of cancer as well. This entire discussion has circled around an elephant in the room: that fact that we are here provides prima facie evidence that there is clearly something wrong with the "genetic entropy" hypothesis (unless one agrees with Dr. Sanford that the universe and everything in it is less than 10,000 years old). IOW, as DaveScot has pointed out, there must be at least one mechanism that compensates for the surprisingly rapid decay of the genome over time. My guess is that there are multiple mechanisms, probably added in a stepwise fashion as genomes increased in size as the result of gene duplication, genome fusion, virus and transposon insertions, accumulation of tandem repeats, etc.Allen_MacNeill
March 7, 2008 at 07:35 AM PDT
Allen, huge amounts of junk DNA can't act as a mutation sponge if the mutation rate is constant in junk and non-junk. However, if the non-junk DNA in a human is as small as, say, the malaria parasite's, then that works out, as the total errors in the functional DNA would be small enough that most copies would be perfect. This however doesn't make much sense from an engineering viewpoint, as the total amount of human DNA already makes it highly questionable whether much of it can be junk. There's too much additional complexity in a human compared to a malaria parasite, and that complexity has to be encoded somewhere. Even if every scrap of it is functional, it beggars belief that the total amount of DNA in a human is enough information to build a human. Recombination can't be the savior either, as only natural selection is capable of telling a good allele from a bad one and culling it. Error checking requires a test of some sort to discriminate between errors and non-errors. How is the discrimination test made during recombination? Natural selection is such a test, but obviously that requires growing the organism out long enough for differential reproduction to manifest enough to cull the less successful mutants. It's the practical inability of natural selection to select one allele at a time that's the root of the entropy problem. Selection only selects whole genomes, so it must consider the good, the bad, and the nearly neutral mutations altogether. How would that discrimination be operative in any other way? Where's the test? Proofreading can only be done as long as you have an original copy to compare to the new copy. In recombination neither copy is the proof copy, so there's no way to test for errors (or improvements) except through differential reproduction.DaveScot
March 7, 2008 at 06:42 AM PDT
bFast asked: "In this context, what exactly is a replication — each time a human cell replicates, or each time a human replicates?" Both; this is why the probability of a cell becoming cancerous increases with each replication. This is why we tend to get cancer as we age, and why we tend to get cancer in tissues composed of continuously dividing cells — skin, lining of the digestive system and lungs, lining of the ducts in mammary glands and the prostate, testicles, and bone marrow.Allen_MacNeill
March 7, 2008 at 06:01 AM PDT
DaveScot asked: "Why have a few large genomes managed to survive the ravages of genetic entropy over hundreds of millions of years?" A plausible (and testable) hypothesis is that the huge amount of non-coding DNA in such organisms acts as a "mutation sponge". That is, by providing a huge target for random mutations, almost all of which have no effect on phenotype, the non-coding DNA has the effect of lowering the rate of deleterious mutations that occur in coding regions. Sex also plays a hugely important role in this process. The larger a genome is, the less likely it is that there will be exactly the same mutation in exactly the same location in the two genomes that get combined during fertilization and sexual recombination. Lynn Margulis (in her book The Origin of Sex, coauthored with her son, Dorion Sagan) has proposed that this was the original reason why sex evolved: not as a means of increasing genetic diversity among offspring, but rather as a means of providing a spare copy of genetic material for error checking with every new generation. This hypothesis is strongly supported by multiple lines of evidence, including the fact that such error correction does in fact take place during meiosis I in diploid eukaryotes. This process is further enhanced by crossing over, which has the effect of recombining good copies of genes from what were originally separate genomes in one copy. Obviously, this also creates a "mirror" set that has both of the "bad" copies, but this one gets used in only half of the gametes, which are made in such large quantities in males that the probability of a positive outcome from a recombined "good" set outweighs the probability of a negative outcome from a "bad" set (especially if the "bad" set lowers the viability of the sperm cells prior to fertilization). Once again, Dr. Sanford's assumptions do not reflect biological reality any better than the overly simplified assumptions upon which the "modern evolutionary synthesis" was based. Fortunately, evolutionary biologists have begun to recognize such deficiencies and move on. I hope John eventually does so as well, although the fact that he massaged the numbers to reify a hypothesis he had chosen for reasons not related to science (i.e. his absolute commitment to the "young Earth" hypothesis) does not augur well in this regard.Allen_MacNeill
March 7, 2008 at 05:57 AM PDT
DLH: All good points. As to the "energy processing module", all cells have at least one, consisting of the biochemical pathways that comprise glycolysis and fermentation. These do not need a cell to function, as they consist entirely of enzyme-catalyzed reactions, and hence can be carried out in vitro. However, almost all cells rely on a membrane-bound system for most of their energy. In bacteria, the various proteins (cytochromes, etc.) and coenzymes (quinones, dinucleotides, etc.) are inextricably part of the plasma membrane. In eukaryotes, these same assemblies are embedded in the inner membranes of chloroplasts and mitochondria. The similarities between these two systems are not accidental. There is very strong evidence for the hypothesis that chloroplasts and mitochondria were once free-living bacteria that formed endosymbiotic partnerships with their Archaean host cells about a billion years ago. Since the energy processing modules for most cells involve molecular assemblies that are embedded in membranes, these are once again not reducible to genetic information alone. Rather, they absolutely require the presence of membranes for their function, and so until such membranes are constructed (either spontaneously in the OOL, or artificially in the laboratory), the "creation" of life that relies on such assemblies for its energy is quite literally impossible.Allen_MacNeill
March 7, 2008 at 05:39 AM PDT
bFast That's 3 mutations each time a cell replicates but the rate varies quite a bit by species and loci. One per billion I understand as a rule of thumb for eukaryotes in general. Prokaryotes don't have the DNA proofreading that eukaryotes do and the rule of thumb for them is one per ten million. Presumably if somatic cells of identical twins were compared there would be a cumulative deviation of 3 mutations per replication downstream from the egg cell that split so yes, you would expect many more discrepancies. Primordial germ cells, on the other hand, differentiate very early in embryonic development. Presumably that's to limit the number of downstream replications and thus limit the number of DNA replication errors. That explains why few babies are born with cancer but acquire it later in life as mutations accumulate in somatic cell lines. Cells where the background replication error rate is accelerated due to environmental insults (carcinogenic chemicals and ionizing radiation) are more apt to become cancerous. So I guess one can say that most cancers are caused by genetic entropy.DaveScot
March 6, 2008 at 10:23 PM PDT
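A minimal sketch of the arithmetic behind the figures DaveScot cites above, assuming the textbook rule-of-thumb error rate of roughly one per billion base pairs per replication and a roughly 3-billion-base-pair haploid human genome; the division count used for the twin comparison is a made-up illustrative number, not a measured one:

```python
# Back-of-the-envelope mutation arithmetic (illustrative only).
# Assumptions: ~3e9 bp haploid human genome and ~1 error per billion bp
# per replication (the prokaryotic rule of thumb, ~1 per 10 million, is
# not used here). The division count below is hypothetical.

HUMAN_GENOME_BP = 3.0e9      # rough haploid genome size
ERROR_RATE = 1e-9            # errors per base pair per replication

errors_per_division = HUMAN_GENOME_BP * ERROR_RATE
print(f"Expected copying errors per cell division: {errors_per_division:.0f}")  # ~3

# Two somatic lineages diverging from the same zygote each pick up ~3 new
# errors per division, so after N divisions the expected number of
# differences between them is roughly 2 * 3 * N.
N_DIVISIONS = 40             # hypothetical lineage depth, for illustration
print(f"Rough expected somatic differences between twins after "
      f"{N_DIVISIONS} divisions per lineage: "
      f"{2 * errors_per_division * N_DIVISIONS:.0f}")
```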
jerry: The term "epigenetic", which can certainly have other historical contexts, is used in modern biology and medicine to indicate heritable factors which are not in the genome (usually cytoplasmic factors). Here is the Wiki definition: "Epigenetics is a term in biology used today to refer to features such as chromatin and DNA modifications that are stable over rounds of cell division but do not involve changes in the underlying DNA sequence of the organism. These epigenetic changes play a role in the process of cellular differentiation, allowing cells to stably maintain different characteristics despite containing the same genomic material. Epigenetic features are inherited when cells divide despite a lack of change in the DNA sequence itself and, although most of these features are considered dynamic over the course of development in multicellular organisms, some epigenetic features show transgenerational inheritance and are inherited from one generation to the next". One example of epigenetic factors is the methylation of specific genes, which is the basis for genetic imprinting: DNA does not change, but a gene may express itself or not in a child according to a specific signal (methylation) given by one of the parents.gpuccio
March 6, 2008 at 08:23 PM PDT
bfast, "epigensis - the approximately stepwise process by which genetic information, as modified by environmental influences, is translated into the substance and behavior of an organism." This sounds like natural selection to me. Though it is kind of vague, just as Gee said, it could mean a lot of things. For example, does it mean somatic cells or gametes? Does it mean that the environment will directly change a cell's DNA information as opposed to affecting how cells can develop? Is the environment inside the cell as you suggested or outside and where outside. Neighboring cells, the organism or outside environment. It is sufficiently vague that I could probably come up with 3-4 more interpretations. I ordered Jablonka's book from Amazon so maybe in a couple weeks I will have a better idea.jerry
March 6, 2008 at 06:16 PM PDT
DaveScot, thanks for the clues. Intriguing point on purifying selection and the possibility of "rebooting". Somewhere I saw an inverse log-log plot between mutation rate and genome size. That has a very major impact on mutations, evolution and Haldane's Dilemma in comparing microbes to macrobes. Some followup data from the Kondrashov reference: "Direct estimates of human per nucleotide mutation rates at 20 loci causing Mendelian diseases," Alexey S. Kondrashov, Human Mutation, Vol. 21, Issue 1, pages 12-27
The average direct estimate of the combined rate of all mutations is 1.8×10⁻⁸ per nucleotide per generation, and the coefficient of variation of this rate across the 20 loci is 0.53. Single nucleotide substitutions are 25 times more common than all other mutations, deletions are three times more common than insertions, complex mutations are very rare, and CpG context increases substitution rates by an order of magnitude.
"Context of deletions and insertions in human coding sequences," Alexey S. Kondrashov and Igor B. Rogozin, Hum Mutat 23:177-185, 2004.
Two-thirds of deletions remove a repeat, and over 80% of insertions create a repeat, i.e., they are duplications.
"Most Rare Missense Alleles Are Deleterious in Humans: Implications for Complex Disease and Association Studies," GV Kryukov, LA Pennacchio, SR Sunyaev, Am J Hum Genet, Vol. 80, No. 4, pp. 727-739, 1 April 2007, UChicago Press.
We combined analysis of mutations causing human Mendelian diseases, of human-chimpanzee divergence, and of systematic data on human genetic variation and found that ~20% of new missense mutations in humans result in a loss of function, whereas ~27% are effectively neutral. Thus, the remaining 53% of new missense mutations have mildly deleterious effects. . . . Surprisingly, up to 70% of low-frequency missense alleles are mildly deleterious and are associated with a heterozygous fitness loss in the range 0.001–0.003. . . . Several recent studies have reported a significant excess of rare missense variants in candidate genes or pathways in individuals with extreme values of quantitative phenotypes. These studies would be unlikely to yield results if most rare variants were neutral or if rare variants were not a significant contributor to the genetic component of phenotypic inheritance.
I think this last reference particularly supports the contention that mutations degrade system functionality (or "design") much faster than "beneficial mutations" with NS could provide new "function."DLH
March 6, 2008 at 05:32 PM PDT
So let me throw a wrench into the system. I have been looking for a way to illustrate what I call the signal-to-noise ratio problem. Consider five organisms that each contain 200 genes for which there are two alleles. Let us order the alleles, calling the slightly less fit for the current environment (possibly more fit for a slightly different environment) allele 1, and the more fit, allele 2. We can now determine the relative fitness of the organism by adding up the allele numbers. Now we throw in a new, slightly beneficial mutation, and give it allele #3. One of the organisms below has an allele 3. Do you really think that if we added reproduction, allele mixing (which doesn't even happen in non-sexual organisms) and natural selection, that natural selection is sensitive enough to cause allele 3 to spread throughout the population? I kinda doubt it. Now what if every cycle we also throw in two or three allele 0s at random? Would that not make natural selection's challenge even greater? What if we assumed an average of 2 alleles for each human gene? With this assumption, the one beneficial allele is lost in a jungle of 25,000 alleles (if you limit your count to coding genes.)
Organism 1: 1111111222 2212112211 1221212112 1112222121 1212221112 1112212112 1112122121 1122212212 1221122222 1121121121 2221122222 2221112122 1121121112 2221222212 1222211212 1221212121 1221111221 1122121111 1122212112 1212112222 SUM = 301
Organism 2: 2112121121 1111111222 2211121122 2211112111 2222221122 2212122222 2221111221 2212212122 2211211122 2121112222 1221222222 2112112112 1122222212 1121112121 1122221222 1112121111 2111222111 2221211121 2221121122 2222212111 SUM = 306
Organism 3: 2112111222 1111222221 2212222111 1211222211 1221121112 2211221121 1221212111 2111111112 1221221122 1212122211 1112111222 2111211111 2111221112 1122212212 1122211222 2211221112 2221211121 1211112221 1112122122 1222222212 SUM = 296
Organism 4: 1111211111 1122111121 2211222122 1212222121 2221111212 1222111221 2111121221 1122212112 1212111222 1121211122 2211121212 2312111211 1111212121 1221121222 1112211212 2222221112 1211122122 1211111112 2211212121 2212222121 SUM = 295
Organism 5: 1112221211 1111222221 1112112112 1211112222 2121211112 1112111121 2121112212 2222211121 2111221111 1111211222 1212122121 2221222221 1111222112 1222211211 1211212221 2222111212 1111211111 1222221122 2221221111 1121111212 SUM = 290
Where is Waldo, anyway?bFast
March 6, 2008 at 04:55 PM PDT
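A minimal toy model of the scenario bFast sketches above, assuming fitness is simply the sum of allele values at 200 loci, fitness-proportional selection, asexual reproduction, and a few random deleterious "allele 0" hits per offspring; all parameters here are arbitrary illustrations, not claims about real populations:

```python
import random

# Toy version of the "signal to noise" question: does a single copy of a
# slightly better allele (3) spread when selection sees only whole-genome
# fitness and deleterious alleles (0) keep appearing? Parameters arbitrary.

N_LOCI, POP_SIZE, GENERATIONS = 200, 100, 500

def new_genome():
    return [random.choice([1, 2]) for _ in range(N_LOCI)]

pop = [new_genome() for _ in range(POP_SIZE)]
pop[0][0] = 3  # the single beneficial mutation

for _ in range(GENERATIONS):
    # Fitness-proportional selection of parents (fitness = sum of alleles).
    weights = [sum(g) for g in pop]
    parents = random.choices(pop, weights=weights, k=POP_SIZE)
    # Asexual reproduction with 0-3 random deleterious mutations per child.
    pop = []
    for p in parents:
        child = p[:]
        for _ in range(random.randint(0, 3)):
            child[random.randrange(N_LOCI)] = 0
        pop.append(child)

carriers = sum(1 for g in pop if 3 in g)
print(f"Organisms still carrying allele 3 after {GENERATIONS} generations: "
      f"{carriers}/{POP_SIZE}")
```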
Jerry (130), you asked for a working definition of epigenesis in this context. I would cite dictionary.com's biology definition 2:
b. the approximately stepwise process by which genetic information, as modified by environmental influences, is translated into the substance and behavior of an organism.
I think in this context we are specifically interested in the "environmental influences", most specifically the environment within the cell. My understanding is that the cell contains a variety of structures which are involved in the process of cell replication. At some point this structural stuff is also replicated. In all likelihood, the development of the copy structural stuff uses the original structure as the guide to make the new structure. If so, then the original structure is also replicated; it is part of the "data" that makes up the organism. If a structural component has particular essential properties, such as shape, these properties must be replicated exactly or the new copy will perform worse (usually) or better (once in a blue moon). It becomes another layer of data that defines the cell -- another opportunity for duplication error.bFast
March 6, 2008 at 04:35 PM PDT
DaveScot, "Humans end up with an average of 3 mistakes in every replication." In this context, what exactly is a replication -- each time a human cell replicates, or each time a human replicates? Recently there was an article on PhysOrg.com I believe that discussed the genetic differences between identical twins. It would appear that identical twins are a lot more than six mutations different from each other -- a lot more! Using this measured data, Sanford may not be terribly wrong.bFast
March 6, 2008 at 04:23 PM PDT
DLH (con't) The next question that came to me regarding genetic entropy was why didn't P.falciparum go extinct in the last 50 years from genetic entropy. So I did a little math using the standard number for eukaryotic mutation rate and found that genome size makes a huge difference in genetic entropy. Humans end up with an average of 3 mistakes in every replication. However, P.falciparum's genome is so much smaller that 19 out of 20 replications are PERFECT copies. That number of perfect copies allows natural selection to select one mutation at a time which is impossible with humans. P.falciparum undergoes purifying selection that is impossible for humans. That's why it didn't go extinct due to genetic entropy even though it replicated more times in the last 50 years than all the replications mammals have undergone from the time they were still reptiles. Big genomes aren't purified by selection nearly as well as small ones. That said, if Sanford's sources are right about orders of magnitude higher mutation rates than usually given then P.falciparum should have gone extinct from genetic entropy in the last 50 years but it didn't. Sanford's hypothesis works out really well against real world observations if you use the commonly given mutation rate. If you use those Sanford proposes his hypothesis doesn't agree with anything except the 6000 years of creation in the bible. This raised one further question for me. Why have a few large genomes managed to survive the ravages of genetic entropy over hundreds of millions of years? If Sanford's hypothesis is correct (and I believe it is with the caveat of DNA replication error rate of 1 per billion nucleotides) then reptiles and all their descendants should have gone extinct long ago. I then speculated about a recovery mechanism similar to what human engineers use in computers to ward off the effects of entropy in software programs and data and since "evolution is cleverer than we are" it shouldn't be unreasonable to presume that nature utilizes the same techniques to thwart genetic entropy that we use to thwart software entropy.DaveScot
March 6, 2008 at 03:44 PM PDT
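The fraction-of-perfect-copies contrast DaveScot describes above can be checked with simple Poisson arithmetic, assuming the commonly cited eukaryotic rate of about one error per billion base pairs and rough published genome sizes; the exact fraction for P. falciparum shifts with the genome-size and rate figures used, but the qualitative contrast with humans holds:

```python
import math

# Poisson sketch: expected copying errors per genome replication equals
# per-base error rate times genome size, and the probability of a perfect
# copy is exp(-expected errors). Genome sizes are rough round figures and
# the rate is the textbook ~1e-9; treat the outputs as order-of-magnitude
# illustrations only.

ERROR_RATE = 1e-9  # errors per base pair per replication (assumed)

genomes_bp = {
    "human (haploid)": 3.0e9,
    "P. falciparum": 2.3e7,
}

for name, size in genomes_bp.items():
    expected = ERROR_RATE * size
    p_perfect = math.exp(-expected)
    print(f"{name}: expected errors/copy = {expected:.3f}, "
          f"P(perfect copy) = {p_perfect:.3f}")
```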
DLH, Sanford cites 3 studies suggesting higher rates: Kondrashov 2002 (30 per billion), Nachman and Crowell 2000 (50 per billion), Neel et al 1986 (10 per billion). He then goes on to say that in personal communication with Kondrashov, Kondrashov admitted that 30 per billion was his lower estimate and his higher estimate was 100 per billion. Sanford then builds his hypothesis around the number 100 to 300 substitutions per human per generation. This was such an extraordinarily higher number than is commonly given in molecular biology texts that it raised a big red flag in my mind the moment I read it. It appears on page 34, very early in the book. At that point I immediately knew Sanford was going to insinuate that the human genome couldn't be older than 6000 years, and I became quite disappointed. However, if you use 3 mistakes per human per generation and apply that in Sanford's hypothesis, we get lifespans for species that agree very nicely with the fossil record.DaveScot
March 6, 2008 at 03:29 PM PDT
DaveScot at 129: Thanks for that link. Interesting how that link describes multiple repair mechanisms to achieve that low an error rate. As I recall, Sanford listed many different kinds of error rates with citations.DLH
March 6, 2008 at 03:17 PM PDT
Allen
. . . we need something that could be called the “phenome”. That is, the sum total of all of the structural and functional components by means of which organisms construct and operate themselves.
I highly agree. From a design point of view, every factory needs jigs, assembly equipment, conveyor belts, and power systems to operate. Design information is both in the "blueprints" and equivalently embedded in the design of the assembly system, not just in the structure of the finished component. May I further propose breaking that "phenome" down into an energy processing system and a material processing system, in addition to the "information processing system" which you effectively described above. As I understand it, life cannot operate without an energy processing system. This is often overlooked or assumed to be operating. Yet each process needs external energy converted to controlled biotic energy (e.g., ATP synthase and ATP, etc.). Similarly, cells and nuclei would not function without a material processing system to form membranes and equally to transfer material through the membranes, etc.DLH
March 6, 2008 at 03:10 PM PDT
The term "epigenesis" has been used here and it is not clear what it means. Here is a comment about it by Henry Gee. "Harvey speculated that the egg or primordium is truly formless and that the embryo develops gradually from homogeneous matter by a process called epegenesis. However, this says no more than form arises out of nothing by some unspecified mechanism. As a name epigenesis is a wild west storefront with nothing behind it. At best it is an observation of what happens - that is form emerging from nothing not an explanation of why it does so." Harvey coined the term in the mid 1600's. But Gee is extending its vagueness to today. It is like the term "emergent" which is used also to say something happened/appeared without specifying why or how it happened.jerry
March 6, 2008 at 03:06 PM PDT
DLH One base pair substitution per billion base pairs is the commonly given number for DNA replication errors in humans. It's such a basic number in molecular biology it goes uncited in the textbooks. Here's one of many examples: http://www.sparknotes.com/biology/molecular/dnareplicationandrepair/section3.rhtml DaveScot
March 6, 2008 at 03:05 PM PDT
Allen_MacNeill: Thank you for your comments, which are very much appreciated. I completely agree with you that the sum total of genome, transcriptome and proteome is still unable to really describe life. I had limited the analysis to those levels just to keep it simpler and more realistic, because those are the levels we know more about. But you are perfectly right: even given the right proteins at the right moment in the right quantities, that would not ensure the existence of a functional cell. There are a lot of other factors which need to be managed and controlled, one of which is certainly the spatial and reciprocal distribution and configuration of all the components, which is as essential to function in the individual cell as it is in the body plan of a multicellular organism. I remember a scientific paper I read many years ago (I think in Nature), which described a complex network of ion flows in the cytoplasm of cells, apparently not supported by definite "anatomic" structures, which very much resembled a basic nervous system. That is only one example of how much internal structure, whose origin is at present poorly understood, can be the carrier of infinite levels of complexity. Another stimulating example could be the early localization of homeotic factors in the zygote of Drosophila. I completely agree with your comments about Venter's reductionist approach (which, anyway, is welcome to the extent that it can give us new facts; I have always thought that we can well appreciate the facts given by researchers, without having to necessarily share their ideas about them). One of the fundamental weaknesses of all reductionist OOL scenarios is that they seem to assume that, somehow, given the necessary gross components (which, as we know, already is a very big problem), they very easily joined together to give life; while we know that even now, in intelligent and sophisticated laboratories, nobody has even tried, let alone succeeded, in mechanically building up simple bacteria from their inert components (which are, by the way, easy enough to get from existing living organisms).gpuccio
March 6, 2008 at 02:13 PM PDT
Along these same lines, it is instructive to note that Craig Venter's much touted "creation" of a complete genome from off-the-shelf reagents is about as far from creating life as writing a constitution is from creating and governing a nation. As Venter himself admits, the artificial genome that his team has created doesn't do anything, not even when inserted into a specially prepared host cell. That is, there is something about a genome that a cell constructs for itself (and that in turn guides the construction of the cell) that makes the whole system work. This is a little like John Wheeler's question about the theory of general relativity: the equations aren't the things they describe. The equations sit there on the paper, but the things they describe make the universe. Kind of like the difference between genomes and organisms...Allen_MacNeill
March 6, 2008 at 01:38 PM PDT
gpuccio: I congratulate you on a masterful condensation of the fundamental problems facing developmental biologists during this new century. Let me suggest yet another: that we should seriously consider that, in addition to the genome, transcriptome, and proteome, we need something that could be called the "phenome". That is, the sum total of all of the structural and functional components by means of which organisms construct and operate themselves. As I have suggested above, genomes, transcriptomes, and even proteomes do not constitute organisms, nor can they "do" anything without constant feedback from the environment. Therefore, the whole organism, considered as a "focus of activity" in its environment, must be factored into a comprehensive theory of the origin and evolution of life on Earth.Allen_MacNeill
March 6, 2008 at 01:33 PM PDT
bFast: You say: "I think that all of us software types find the idea that man is described by 25,000 genes, or 4 * 25,000(genes) * 300(nucleotides/gene) bits of information to be a bit inconceivable. I do understand that epigenetics is that which is not in the DNA, and that the above only calculates the “coding” DNA. However, even the entire human DNA, if none is junk, is some darn impressive tight code if it produces a human with all of his inherited characteristics." I agree with you. The information content of the 25,000 or so protein coding genes is really trivial, if considered self-sufficient. Those genes are only the final effectors, a database of useful protein sequences with functional potentialities. Even if we consider the one gene -> many proteins paradigm, which is favoured by darwinists at present, things do not improve. It is certainly true that the proteome is much bigger than the genome (nobody really knows how much), but the problem remains that all those proteins have to be coded starting with only 25,000 genes, and that means that we need a lot of procedural information so that the right choices may be made at the right times. But that's not enough. What I have tried to discuss here, with no great success, is the fundamental problem of transcriptome selection and control. I have already outlined (post#5), starting from Kauffman's observations, how big the search space for transcriptomes is, and how important transcriptomes are. They are the true key to all regulations in the cell. They are the true mystery, because no code is known which allows specific cells to know what transcriptome to operate at any given moment. Please, bear in mind that if a white blood cell is completely different from, say, a hepatocyte, the main reason is that they are implementing completely different transcriptomes from the identical genome they share. Each transcriptome means a different state of regulation of nucleus activity and of protein synthesis, different structural components, different metabolic pathways, different activation or repression of functions, and so on. Moreover, each specific cell has a dynamic transcriptome, which changes at any given moment to address different states and challenges. A white blood cell (WBC), for instance, can be quiescent, or reproduce, or differentiate, or dedifferentiate, or be active in defense procedures, or enter apoptosis. Each of those different states will require very different transcriptomes and proteomes in the cell, which add a new layer of selections and variety to the basic transcriptome choices which make a WBC a WBC. Now, if we were software programmers, after having written the basic code of our 25,000 functional structures (the protein coding genes), we still would have to face the major work, that is to write down the real working code, the procedures to implement all our billions of different activities, the logical operations, the measurements, in other words the "noblest" part of the software. And that would be huge. A lot of bits, even in a very compact form. And, obviously, it would not be written in the same form as the 25,000 functional sequences, because while those are "effectors", the other part would be "procedures". That's the problem, I think, with the genome. We don't know where the procedures are coded. And, even more important, we don't know "how" they are coded. In my opinion, our best guess at present is of two kinds:
1) The procedures are in the 98% non-coding DNA. That is probably at least partly true. There is already much evidence of that. Introns, promoters, non-protein-coding nuclear mRNAs, effectors of alternative splicing, all have been demonstrated. But it's only a drop in the ocean. And, at the present moment, non-coding DNA is still mostly an enigma. The reason that for so many years it has been considered junk is not to be sought "only" in the fact that darwinists are not so bright (you see, I can defend darwinists at times! :-) ), but first of all in the fact that its "appearance" is really enigmatic, and defies any easy interpretation. Obviously, I am referring here especially to those parts, a really big percentage of the whole, which really look meaningless, repetitive, or just destructive, to pseudogenes, to transposons, to ERVs, even to introns. No surprise that darwinists, for such a long time, have easily "co-opted" those data in defense of their weak theory. Take introns, for example. If you were a programmer, for what reason would you take a specific, single sequence of information (a protein coding gene) and cut it in, say, 20 different parts, interspersing among them long segments of apparently random code, which has to be carefully removed after transcription? And how would you utilize such a tool to implement "procedures"? The answer, I suppose, is not easy.
2) The procedures, the real code, or at least a big part of them, could be somewhere else. A few possibilities are:
a) In cytoplasmic structures (epigenetic information)
b) In the nucleus, separate from DNA
c) In any other molecule, including DNA itself, but encoded in a way that we can't at present imagine or understand, for example at a level which is not grossly biochemical, like the traditional genetic code, but rather biophysical (structures, conformations), even, or maybe especially, at the quantum level.
All these hypotheses are interesting and promising, but we have to admit that at present we have almost no real tangible clue to support them.gpuccio
March 6, 2008 at 12:47 PM PDT
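A rough back-of-the-envelope version of the information-content point being debated above, assuming 2 bits per nucleotide, roughly 3.2 billion base pairs in the haploid genome, about 25,000 protein-coding genes, and an average coding length of around 1,500 bases per gene; all of these are round illustrative numbers, not measured values:

```python
# Rough information-content arithmetic for the genome-size discussion above.
# All figures are round, illustrative assumptions (2 bits/base, haploid
# genome ~3.2e9 bp, ~25,000 genes, ~1,500 coding bases per gene on average).

BITS_PER_BASE = 2
GENOME_BP = 3.2e9
N_GENES = 25_000
AVG_CODING_BP_PER_GENE = 1_500

total_bits = GENOME_BP * BITS_PER_BASE
coding_bits = N_GENES * AVG_CODING_BP_PER_GENE * BITS_PER_BASE

print(f"Whole genome: ~{total_bits / 8 / 1e6:.0f} MB of raw sequence")
print(f"Protein-coding portion: ~{coding_bits / 8 / 1e6:.1f} MB "
      f"({coding_bits / total_bits:.1%} of the total)")
```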
Turner Coates @ 66, I noticed that in your response you failed to address my question concerning Prigogine’s role in all of this. You wrote: “Mazur is simply listing common examples of self-organization, with no apparent understanding that they are inappropriate in this context. No biologist would suggest a close analogy of anything going on in epigenesis to snowflake formation.” I think it is a grave mistake to boldly and arrogantly state that biology alone is capable of explaining the phenomenon of life, considering that OOL studies, as a system of inquiry attempting to explain the emergence of life from inanimate matter, very much requires the assistance of the physical sciences. And it is at the juncture where biology and chemistry intersect that such noted figures as Prigogine prominently come into play, as when he tried to use the self-organizing properties intrinsic to the material constituents of living systems to explain how life began. Now it is quite another matter if you were to maintain the belief that OOL has nothing to do with biology, or is this your position? Unfortunately, you did not clarify in your opening statement or follow-up response. My original intimations are still valid. Is OOL relevant to the physical sciences or not? And if so, are the analogies used by physical scientists involved in OOL research (as illustrated by Mazur’s examples) appropriate to these discussions? Like I said, you didn’t address Prigogine or his use of the vortex example as a way of explaining the OOL via self-organizing or self-ordering matter; you instead offered to stick with developmental biology, which then, of course, will have no relevance to the examples Mazur cited. You also wrote: “In fact, Mazur quotes Kauffman emphasizing that snowflakes are not alive.” Ah, but what seems to have escaped your notice, Mr. Coates, is that this conversation between Kauffman and Mazur confirms what I’ve been trying to intimate to you all along. Indeed, at the heart of these discussions is the origin of biological information, the real crucible of any theory dealing with OOL, and in turn, development. In fact, the OOL quandary seemed to have been a major reason behind the formation of the Altenberg meeting, and a focal point for reformulating evolutionary theory. Picking up where you had left off with the “snowflakes are not alive” quote, Mazur informs us that
[Kauffman] reminded me in our phone conversation that Darwin doesn't explain how life begins, "Darwin starts with life. He doesn't get you to life." Thus the scramble at Altenberg for a new theory of evolution.
If self-organization could be defined as “a process where the organization (constraint, redundancy) of a system spontaneously increases, i.e. without this increase being controlled by the environment or an encompassing or otherwise external system,” then Mazur did not veer way off by including some examples of self-organization or self-ordering that have already been used by noted theorists in their futile attempts at explaining the OOL.JPCollado
March 6, 2008 at 12:24 PM PDT
Hello Allen, I wrote: “…life is a network of artificial intelligence which artificially discovers the solution to problems.” You wrote: "How would one empirically distinguish between that and the following: “…life is a network of natural intelligence which naturally discovers the solution to problems.” or, for that matter: “…life is a network of intelligence which discovers the solution to problems.” Let me be clear about this: I am not necessarily advocating any of these. Rather, I’m asking how one would empirically distinguish between them (i.e. not speculate about their metaphysics)?" My response is that there is no difference between artificial and natural when it comes to intelligence, as artificial intelligence is still perfectly natural, as I believe capital "I" Intelligence is also. So, that's not my point. The difference is between the foresight used by capital "I" Intelligence and the lack of foresight of artificial intelligence. However, that's not even the point I was making. I was asking what causes artificial or even capital "I" Intelligence, for that matter, without violating conservation of information and thermodynamics. And yes, AI and EAs operate on the same basic principles -- inputted information of problem or target characteristics so that there can be better than chance performance. IOW, they are not realizable without previous problem-specific information. I will repost the main point of my comment: "So, how does artificial intelligence operate? It only learns what it is programmed to learn. Outputted information can be no greater than inputted information. Thus, information is conserved. Can your AI robot servant do anything other than what he is programmed to do as he searches through a solution space that you’ve provided for him with the problem specific programming that you’ve inputted into him? [it is true that there may be some minor surprising random effects, however that does not describe the operation of the process as a whole or the ability to get any better than chance performance in the first place]. A “learner… that achieves at least mildly better-than-chance performance, on average, … is like a perpetual motion machine - conservation of generalization performance precludes it.” –Cullen Schaffer on the Law of Conservation of Generalization Performance. Cullen Schaffer, “A conservation law for generalization performance,” in Proc. Eleventh International Conference on Machine Learning, H. Willian and W. Cohen. San Francisco: Morgan Kaufmann, 1994, pp.295-265. “The [computing] machine does not create any new information, but it performs a very valuable transformation of known information.” –Leon Brillouin, Science and Information Theory (Academic Press, New York, 1956). Therefore, the actual engine of variation is the programming that goes into setting up an information-rich system of artificial intelligence. The pseudo-random natural selection filter merely searches through a non-random search space, guided by previously inputted problem-specific information to pre-set targets. Of course, from those targets (solutions) there can be some minor effects and cyclical variations in accordance with a truly random search of the space immediately surrounding a pre-programmed potential solution (target).
Whatever you want to call the latest version of evolution, if it is the hypothesis of the creation of information at consistently better than chance performance via the environment selecting from a random palette [of form, function, and law], it is the largest scientific hoax in history, with not a shred of evidence in its favor and all observation, experimentation with EAs, and information theorems against it. In information terms, the above type of evolution is literally an attempt to sell a perpetual motion free energy machine.CJYman
March 6, 2008 at 11:22 AM PDT