Uncommon Descent Serving The Intelligent Design Community

Lactose digestion in E. coli


Remember the big stir about Lenski’s 20-year experiment with E. coli, where the bugs “evolved” the ability to digest citrate and this was touted as overwhelming evidence of evolution? And remember our response that, until the mechanism behind it was discovered, it might not be much “evolution” at all?

As usual, we are vindicated. In a similar case, where it’s lactose instead of citrate, the bug was all set up, in fact one might say front loaded, with the capacity to switch over from glucose to lactose digestion. Essentially the bug constantly samples the level of lactose in its environment, and when the level reaches a tipping point a single “throw of the dice” switches it over from glucose to lactose digestion. This is contrary to Lenski’s hypothesis that a series of dice throws, each making a small change towards the ability to digest citrate, accumulates until citrate digestion is fully switched on. Darwinian gradualism is denied once again, and we see a front loaded genome switch to a new mode of operation through a saltational event.

Throw of a dice dictates a bug’s life
17:53 17 October 2008
NewScientist.com news service
Ewen Callaway

For an E. coli bacterium, 300 is its lucky number. That’s about how many protein molecules it takes to make a life-changing shift in its diet preference, according to new research.

But this shift happens entirely by chance, says Sunney Xie, a biochemist at Harvard University. “You don’t know when it’s going to occur. It’s a random event.”

While this may sound more like chaos theory and quantum mechanics than biology, it is how all living cells operate at the molecular level, from drug-resistant tuberculosis to stem cells, he says.

In Escherichia coli’s case, Xie studied a simple trait: the ability to digest lactose sugar.

E. coli normally prefer to dine on a sugar called glucose. To conserve energy, bacteria shut down the genes that control lactose consumption when glucose is around. This is achieved with the help of a “repressor protein” that sits on the lactose genes.

However, when glucose runs out and lactose is available, evolution has come up with an ingenious solution to bring the lactose-digesting genes out of slumber.
Tipping point

A protein called permease sits in the cell’s membrane and imports stray lactose molecules into the cell. These sugars latch onto the repressor protein, stopping its repressive activity, and allowing the lactose genes to switch back on.

This ensemble – called the lac operon – then produces more permease proteins that let in even more lactose, sending E. coli down a one-way street to lactose digestion.

Outlining this behaviour earned François Jacob and Jacques Monod a share of a Nobel prize in 1965. “The lac operon is like the hydrogen atom of molecular biology, it’s the first system that describes gene regulation,” Xie says.

His team sought to understand what happens at the tipping point between repressing and activating these genes under low levels of lactose in genetically identical bacteria.

“A single cell has to make a decision whether it wants to be induced or not,” he says. “How is this life-changing decision made?”

To answer that question, Xie and colleagues Paul Choi and Long Cai used a technology pioneered in their lab to count permease molecules tethered to a fluorescent marker protein.

They found that when a cell hit a critical threshold of about 300 permease proteins, the lac operon switched on in a burst of activity and the cell gained the ability to break down the sugar. With fewer molecules, a cell remains stuck in neutral.
Double grip

However, this flurry of activity is all controlled by the repressor protein, which grabs onto the lac operon at two different places.

Losing grip of one of these points allows for little bursts of lactose gene expression – enough to get a taste of the outside world – but cell division prevents cells from reaching 300 in this way.

The repressor protein must completely let go for a cell to reach the magic number, Xie says. And this happens by chance.

The random event allows the expression of more permease molecules, which means more lactose gets into the cell, and so the lactose genes are active for longer. Eventually a point is reached where the cell is switched to lactose digestion.
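The switching behaviour described above can be sketched as a toy stochastic simulation. Only the ~300-molecule threshold comes from the article; every rate and burst size below is an illustrative assumption. The point it illustrates is the article’s: frequent small bursts are diluted away by cell growth, so only a rare full-dissociation burst can push permease numbers past the threshold.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

THRESHOLD = 300    # permease copies needed to lock in the switch (from the article)
SMALL_BURST = 5    # partial repressor dissociation: a small leak (assumed)
LARGE_BURST = 400  # rare full dissociation: a large burst (assumed)
P_PARTIAL = 0.20   # per-step chance of a small leak (assumed)
P_FULL = 0.002     # per-step chance of full dissociation (assumed)
DILUTION = 0.05    # fraction of permease lost per step to growth/division (assumed)

def steps_until_switch(max_steps=100_000):
    """Return the step at which the permease count first crosses THRESHOLD."""
    n = 0.0
    for t in range(max_steps):
        r = random.random()
        if r < P_FULL:
            n += LARGE_BURST            # rare full-release event
        elif r < P_FULL + P_PARTIAL:
            n += SMALL_BURST            # common small burst
        n *= (1.0 - DILUTION)           # dilution from growth and division
        if n >= THRESHOLD:
            return t
    return None

switch_times = [steps_until_switch() for _ in range(20)]
```

With these assumed rates, small bursts alone equilibrate near 20 copies, far below the threshold, so every switching event coincides with a full-dissociation burst, mirroring the behaviour the article describes.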

“It’s a beautiful paper,” says Michael Elowitz, a molecular biologist at Caltech in Pasadena. “Trying to understand the behaviour of cells in terms of the behaviour of the individual molecules within them is one of the most fundamental goals of biology.”

The bizarre behaviour of single proteins could explain why one tuberculosis-causing bacterium is antibiotic resistant, while another bacterium with an identical genome falls prey to drugs, Xie says.

Even our own cells depend on life-changing fluke events involving single molecules. While vastly more complicated than an E. coli bacterium, embryonic stem cells capable of turning into any kind of tissue probably make this decision with a small cast of molecules.

Journal reference: Science (DOI: 10.1126/science.1161427)

Off topic: IEEE Spectrum, Oct 2008, p. 18, article “Unsticking MEMS”: it says MEMS parts get stuck because of the Casimir effect. When parts get too close together, the gap reduces the number of photon wavelengths that can fit between them, so the many more photons pressing on the outsides force the parts together, producing a form of friction called stiction. So, are there no photons in E. coli? If there are, why don’t its parts get stuck when they are 1 to 2 orders of magnitude smaller? The engineers plan to fix the problem by coating surfaces with metamaterials, which are “specifically designed to have properties that *do not occur naturally*, such as bending light the wrong way.” Did E. coli also solve the problem by evolving “naturally” non-natural properties? Thoughts? es58
A simple explanation, like a sensory system, coupled with a frontloaded adaptive system, will work for me better than "chance, random mutations" and other hazardous darwinian explanations... Sladjo
Dave Scot said: "In engineered systems various possible contingencies are anticipated and processes are put in place to deal with them if and when any particular contingency actually arises in the future. These forward looking predetermined responses are “front loaded” - put in place before they are actually needed." ------------ I think this is reasonable... but what is also reasonable is that creatures were created with intelligence... they were created with minds... they were created with the ability to perceive environments, learn from them, and make the appropriate biological changes in response. Consciousness is basically just being aware of one's surroundings. So it only makes sense that consciousness (since all living organisms surely have it to one degree or another) plays a role in biological change. The genome, then, isn't the main player in biological change -- the mind is. The genome is just a follower, an effect of a deeper cause. That's my 2 cents anyway. van
Hi Patrick, Yes, I am talking about post #13. Thank you. There are still a few missing, but I think it is because of my bad editing. All the links are in this post (2x:)). Techne
My links disappeared
Are you talking about comment #13, which should now be viewable? First time posters are held in moderation as well as comments with 10+ links. Patrick
These forward looking predetermined responses are “front loaded” - put in place before they are actually needed. Chance & necessity is a reactive process that cannot plan ahead. Intelligent design is a proactive process that can plan ahead.
I suspect a good example of this is Desulforudis audaxviator, a bacterium recently discovered living without oxygen at temperatures around 60°C, able to survive as long as it has a small amount of water flowing through radioactive rocks. The discoverers believe it could survive on Mars! How does an organism develop the ability to utilise radioactivity without dying out first? At 60°C? With no oxygen? As many DNA sequences are publicly available, I think that, when software and computing power are up to it, many front-loaded sequences will come to light from home users, not scientists, because they are looking for them. Seek and you shall find. Stelios
My links disappeared :( Techne
Talking about front-loading, has anybody seen the recent paper about the Trichoplax genome? Loads of Hox genes in an organism with only four cell types (and no nerve, sensory, muscle or bone cells). For example, the Mnx gene. What does it do? It is involved in the development of the pancreas and motor neurons. 1) Zebrafish mnx genes in endocrine and exocrine pancreas formation. 2) The Mnx homeobox gene class defined by HB9, MNR2 and amphioxus AmphiMnx.
The HB9 homeobox gene has been cloned from several vertebrates and is implicated in motor neuron differentiation. In the chick, a related gene, MNR2, acts upstream of HB9 in this process. Here we report an amphioxus homologue of these genes and show that it diverged before the gene duplication yielding HB9 and MNR2. AmphiMnx RNA is detected in two irregular punctate stripes along the developing neural tube, comparable to the distribution of 'dorsal compartment' motor neurons, and also in dorsal endoderm and posterior mesoderm. We propose a new homeobox class, Mnx, to include AmphiMnx, HB9, MNR2 and their Drosophila and echinoderm orthologues; we suggest that vertebrate HB9 is renamed Mnx1 and MNR2 be renamed Mnx2.
And here is some interesting research: Directed Evolution of Motor Neurons from Genetically Engineered Neural Precursors.
Stem cell-based therapies hold therapeutic promise for degenerative motor neuron diseases such as amyotrophic lateral sclerosis and for spinal cord injury. Fetal neural progenitors present less risk of tumor formation than embryonic stem (ES) cells but inefficiently differentiate into motor neurons, in line with their low expression of motor neuron-specific transcription factors and poor response to soluble external factors. To overcome this limitation, we genetically engineered fetal rat spinal cord neurospheres to express the transcription factors HB9, Nkx6.1 and Ngn2. Enforced expression of the three factors rendered neural precursors responsive to sonic hedgehog and retinoic acid and directed their differentiation into cholinergic motor neurons that projected axons and formed contacts with co-cultured myotubes. When transplanted in the injured adult rat spinal cord, a model of acute motor neuron degeneration, the engineered precursors transiently proliferated, colonized the ventral horn, expressed motor neuron-specific differentiation markers and projected cholinergic axons in the ventral root. We conclude that genetic engineering can drive the differentiation of fetal neural precursors into motor neurons which efficiently engraft in the spinal cord. The strategy thus holds promise for cell replacement in motor neuron and related diseases.
What did these guys do? They enforced the expression of 3 genes associated with neuronal development in order to direct the development of motor neurons. Sonic hedgehog also played a role. So four genes played a role: 1. HB9 2. Nkx6.1 3. Ngn2 4. Sonic hedgehog. Are similar genes present in the Trichoplax genome? 1. HB9 (Mnx): Yes (see above). 2. Nkx6.1: Compare the human Nk6 gene with the Trichoplax version. 3. Ngn2: Compare the human neurogenin 2 (ngn2) gene with the Trichoplax version. A quick BLAST (blastp) against the human genome shows this sequence to be closely related to ngn2 (E-value = 3e-8). 4. Sonic hedgehog (shh): Here is the human shh gene. This gene appears to be absent from the Trichoplax genome; however, the presence of shh in Monosiga brevicollis (a unicellular eukaryote that diverged before Trichoplax) suggests the possibility of gene loss in this lineage. I wonder what will happen if shh is co-expressed together with mnx, Nk6 and ngn2 in Trichoplax, and whether these genes will function like their counterparts in higher animals. A complex array of neurologically associated developmental pathways is present in this eumetazoan that has no nerve, sensory or muscle cells, and there is more... Techne
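For readers unfamiliar with the E-value quoted above: it estimates the number of alignments scoring at least that well that would be expected by chance alone, so smaller is better. A minimal sketch of the standard Karlin-Altschul bit-score form, E = m · n · 2^(−S′); the function name and the example lengths below are my own illustrative choices, not numbers from the post.

```python
def blast_expect(bit_score, query_len, db_len):
    """Expected number of chance hits: E = m * n * 2**(-S')
    (Karlin-Altschul statistics in bit-score form)."""
    return query_len * db_len * 2.0 ** (-bit_score)

# Illustrative: the same search space, two different bit scores
e_small = blast_expect(60.0, 200, 10_000_000)  # tiny E: unlikely by chance
e_large = blast_expect(20.0, 200, 10_000_000)  # E >> 1: expected by chance
```

An E-value of 3e-8, as reported for the Trichoplax ngn2-like sequence, therefore means such a match would essentially never arise by chance in a search of that size.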
Is there any simple example you could give of front-loading in a biological organism or something?
Search for "nylonase". Make sure you understand what a frameshift is - that'll enable you to understand how unlikely the non-frontloaded story is. Stelios
Davescot, Re: Frontloading. I've always suspected that a measure of the duration that any specific instance of frontloading was intended to cover can be found in the size of the genome in question. Logically, there must be a finite amount of information that can be placed in advance. Of course, an objection to this is that some organisms have genomes that appear to bear no relationship to the complexity of the organism in question (witness the humble onion, for example), and so knowing the size of the genome does not relate directly to frontloading one way or the other. However, as we are seeing more and more with discoveries like the importance of epigenetics in the development of organisms, there are many layers yet to be peeled back. For example, if an organism's DNA sequence contains X bytes, it could be said that only X-Y bytes could be front-loaded, where Y is the number of bytes needed to build the organism in question. So X-Y is the maximal amount of potential front-loaded information, and if we can measure the speed at which the front-loaded information is enabled we can estimate that duration. From what I've read (assuming Lenski's results were trustworthy at any level) the rate appears to be very slow. Of course, it won't be for a long time that such a determination could be made, if ever. And when it is, my response will simply be "there's another layer you've not looked at". It might be the gluons and quarks making up the atoms that contain the information required; we simply don't know sufficient information at the moment to rule anything out, which is why it's so frustrating to see science on the path it currently is. OK, I doubt it'll be at the level of individual quarks, but you get my meaning. Sure, progress is being made, but how much faster could that progress be made if we take the prior commitment to "no intelligence required" away? Stelios
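The X − Y argument above can be put in toy numbers. Everything here is illustrative: 2 bits per base is the naive information content of DNA, and the "required" figure is a made-up placeholder, since, as the commenter notes, nobody knows Y.

```python
def frontload_capacity_bytes(genome_bases, required_bases):
    """Naive upper bound on 'spare' information, per the X - Y argument:
    2 bits per DNA base, minus what is needed to build the organism."""
    bits_per_base = 2
    spare_bits = bits_per_base * (genome_bases - required_bases)
    return spare_bits / 8  # convert bits to bytes

# E. coli K-12 genome is ~4.6 Mb; the 4.0 Mb 'required' figure is hypothetical
spare = frontload_capacity_bytes(4_600_000, 4_000_000)  # 150,000 bytes
```

Even this crude bound shows why the onion objection matters: without an independent handle on Y, genome size alone says little about front-loaded capacity.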
Davescot, Thanks for the reply. I think that is more or less what I was thinking, but I'm still having a bit of trouble grasping it. Is there any simple example you could give of front-loading in a biological organism or something? That might help me wrap my mind around it. Thanks again! Domoman
Interesting stuff, thanks for the link. It's interesting how the concept of front loading does not even enter consideration when events like this happen. I was rather hoping that the forthcoming election would put in place a leader prepared to instruct bodies such as PNAS to follow the evidence where it leads rather than exclude particular conclusions (front loading etc.) before they even write the grant cheque! It does not look like that will be the case, I'm afraid to say! The fat lady has not sung as yet, so there's still hope! When a potential leader of the world's foremost superpower can say
"Substantially more people in America believe in angels than they do in evolution."
and expect to maintain the respect of the majority of people who believe, and still be on track for election, then I'm afraid there's something rotten in the state of Denmark! Stelios
Domoman In engineered systems various possible contingencies are anticipated and processes are put in place to deal with them if and when any particular contingency actually arises in the future. These forward looking predetermined responses are "front loaded" - put in place before they are actually needed. Chance & necessity is a reactive process that cannot plan ahead. Intelligent design is a proactive process that can plan ahead. DaveScot
Davescot, What exactly does the term "front-loaded" mean? I've heard you talk about it a lot, but I'm still not quite sure what it entails. If you could explain, that'd be great! Interesting post too. Funny how evolutionists will tout this as evolution-in-action when in reality the E. coli already had the machinery capable of digesting citrate. lol Off topic: Jonathan Wells talks about the impossibility of life arising by chance on this video. I saw it again the other day and thought it was very interesting. He talks about it around 3 mins into the video: http://www.youtube.com/watch?v=IeDMOeuNwsQ&NR=1 Domoman
stelios It's been picked apart here too. http://www.google.com/search?sourceid=navclient&ie=UTF-8&rlz=1T4GPTB_enUS290US290&q=lenski+citrate+site%3auncommondescent%2ecom DaveScot
Lenski's paper has been picked apart over at Conservapedia already http://www.conservapedia.com/Lenski I doubt that any reasonable person, after reading the data and refutations at that site, would support Lenski's conclusions. For a detailed list of flaws in his paper go here http://www.conservapedia.com/Flaws_in_Richard_Lenski_Study Stelios
Milk and feces-living bacteria, just what I need to be reading right before lunch. Guess I'll pass on the Ovaltine this time. beancan5000
techne Quite right. The lactose changeover was so similar to the citrate changeover I confused the one with the other. I corrected the mistake by crossing out lactose and inserting citrate where noted in the article. Thanks for catching it! Excellent followup, by the way, describing Lenski's experiment in detail. Good stuff. I agree with your 600,000 year equivalency and this in fact highlights the major problem for evolution by mutation & selection. Selection works on the whole organism not individual traits. So while you can get a single small change such as lactose or citrate digestion over the course of 600,000 years, if that's all that's being selected for, it isn't all that's being selected for in the real world. The whole organism is being selected against a plethora of environmental challenges. The article quoted states that "evolution has come up with an ingenious solution". Really. I'm waiting on pins and needles to see exactly how mutation and selection, where selection is constrained by whole genome selection in the real world, manages to come up with uber-complex systems of many interdependent proteins with all sorts of contingency options within easy reach of one or just a few "throws of the dice". Non sequitur. And note how they unconsciously use a term like "ingenious". Is "ingenious" more suggestive of a reactive process or a proactive process? More suggestive of chance or design? Mike Gene in his book "The Design Matrix" surveyed biology literature for the use of terms that commonly appear in engineering literature and found that the use of those terms in biology has increased at an exponential rate in recent years while in other sciences the frequency of these kind of terms has remained constant. DaveScot
Hi Dave, It should say:
Remember the big stir about Lenski’s 20 year experiment with E. coli where the bugs “evolved” the ability to digest citrate and this was touted as overwhelming evidence of evolution?
Citrate, not lactose. Anyway, as far as I can gather, don't expect too much of an evolutionary leap in Lenski's experiment. My thoughts: E. coli are facultative anaerobes, meaning that in the presence of oxygen, pyruvate (from glycolysis) enters the Krebs (citric acid) cycle to produce energy, and in the absence of oxygen, pyruvate ferments to form lactate and/or ethanol (depending on the organism). Thus, E. coli can metabolize intracellular citrate just fine. All the machinery necessary to metabolize citrate is present in E. coli, and the biochemical processes/reactions are highly coordinated and complex. Therefore, the evolutionary "leap/jump" cited is not an example of a suddenly acquired ability to metabolize citrate, as E. coli possess the necessary intracellular machinery to do so. What did evolve, then? First look at the experimental conditions. The E. coli (subtype B) bacteria were grown for about 20 years in DM25, a minimal salts medium containing 139 µM glucose and 1,700 µM citrate. Meaning a lot of extracellular citrate and little extracellular glucose. This specific strain does not have a citrate symporter to import the extracellular citrate. Other strains of E. coli possess such membrane proteins (e.g. CitT); however, some of them are situated on plasmids, and the researchers made sure that horizontal transfer of plasmids was not possible in this experiment. Basically, the bacteria were swimming in an ocean of food but could only use a fraction of it because they could not get their "hands" on the goods. At around the 31,000-31,500th generation the first citrate "importers" arose. It will be fascinating to see the actual mutations that gave rise to this function; it is probably more than one (the article speculates on possibly three genetic events over many generations). E. coli has many other symporter membrane proteins for dicarboxylic acids and other molecules with properties similar to citrate (e.g. tartrate or alpha-ketoglutarate). Citrate is a tricarboxylic acid, and it is possible that a few key mutations to an existing symporter protein allowed citrate to be imported. If so, it would be interesting to see how the mutations affected the properties of the original symporter system. The researchers, however, have not tracked down the exact mutations. 30,000 generations is the equivalent of ±600,000 years of evolution in a species with an average generation of 20 years (like primates). Is this an example of a "leap/jump"? If you are lactose intolerant, it might very well take 600,000 years before a mutation makes your descendants lactose tolerant (a beneficial mutation in the lactase gene, assuming all your descendants are also lactose intolerant). So in the end it is more like an evolutionary stutter (without us knowing the full extent of the other mutations during the 30,000 generations), much like nylonase "evolution", which was a pre-existing esterase with a β-lactamase fold that had minimal nylon hydrolysis activity from the start. Techne
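The generation-time equivalence in the comment above is simple arithmetic, and it is worth contrasting with the actual wall-clock time in the flasks. The 20-year primate generation time is the commenter's assumption; the daily 100-fold dilution (about log2(100) ≈ 6.64 doublings per day) is the standard protocol reported for Lenski's long-term experiment.

```python
import math

GENERATIONS = 30_000           # generations elapsed in the experiment
PRIMATE_GENERATION_YEARS = 20  # assumed generation time for primates

# The equivalence cited in the comment:
equivalent_years = GENERATIONS * PRIMATE_GENERATION_YEARS  # 600,000 years

# Real time in the flask: a daily 100-fold dilution allows
# log2(100) ~= 6.64 doublings per day
doublings_per_day = math.log2(100)
real_years = GENERATIONS / doublings_per_day / 365.25      # roughly 12 years
```

So the same 30,000 generations that took roughly a dozen calendar years in the lab would correspond to about 600,000 years in a slow-breeding species, which is the commenter's point about the pace of the observed change.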
So they found the mutation that breaks the repressor gene? Or is it two mutations? Either way, it's a loss of information and a loss of two protein-DNA binding sites. I may actually go look up this paper. I'll bet you could calculate the odds of these mutations happening and compare them to the number of organisms it took to generate the mutations. Since the odds are probably worse than the probability resources provided by the experiment, there are most likely many other mutations that could cause the same loss of information. So take the difference between the odds required to generate those two specific mutations and the probability resources provided by the experiment and estimate how many other mutations could do the same thing. There you have an ID hypothesis that is testable. tragicmishap
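The calculation proposed above can be sketched in toy numbers. The mutation rate and population figures below are ballpark assumptions (E. coli point-mutation rates are often quoted around 10⁻¹⁰ per base per replication), not values from the paper, so the sketch shows the shape of the argument rather than its actual odds.

```python
MU = 1e-10            # assumed point-mutation rate per base per replication
POP_SIZE = 5e8        # assumed population per flask (illustrative)
GENERATIONS = 30_000  # generations elapsed in the experiment

def p_specific_point_mutations(n_sites, mu=MU):
    """Chance that one replication hits n specific bases simultaneously."""
    return mu ** n_sites

def probabilistic_resources(pop_size=POP_SIZE, generations=GENERATIONS):
    """Rough total number of replication 'trials' available."""
    return pop_size * generations

p_two = p_specific_point_mutations(2)        # ~1e-20 for two exact sites
trials = probabilistic_resources()           # ~1.5e13 replications
expected_double_hits = p_two * trials        # ~1.5e-7: essentially never
# ...whereas each single site is expected to be hit mu * trials ~ 1,500
# times, so sequential single mutations are the plausible route.
```

Under these assumptions, a simultaneous double hit is beyond the experiment's probabilistic resources while single hits are abundant, which is exactly the comparison the comment suggests testing.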