
Controlling the waves of dynamic, far from equilibrium states: the NF-kB system of transcription regulation.


I have recently commented on another thread:

about a paper that (very correctly) describes cells as dynamic, far from equilibrium systems, rather than as “traditional” machines.

That is true. But, of course, the cell implements the same functions as complex machines do, and much more. My simple point is that, to do that, you need much greater functional complexity than you need to realize a conventional machine.

IOWs, dynamic, far from equilibrium systems that can be as successful as a conventional machine, or more, must certainly be incredibly complex and amazing systems, systems that defy everything else that we already know and that we can conceive. They must not only implement their functional purposes, but they must do that by “harnessing” the constantly changing waves of change, of random noise, of improbability. I have commented on those ideas in the mentioned thread, at posts #5 and #8, and I have quoted at posts #11 and #12 a couple of interesting and pertinent papers, introducing the important concept of robustness: the ability to achieve reliable functional results in spite of random noise and disturbing variation.

In this OP, I would like to present in some detail a very interesting system that shows very well what we can understand, at present, of this kind of amazing system.

The system I will discuss here is an old friend: it is the NF-kB system of transcription factors (nuclear factor kappa-light-chain-enhancer of activated B cells). We are speaking, therefore, of transcription regulation, a very complex topic that I have already discussed in some depth here:

I will briefly recall here that transcription regulation is the very complex process that allows cells to be completely different while using the same genomic information: IOWs, each type of cell “reads” the genes in the common genome differently, and that is what allows both the differentiation of the different cell types and the different cell responses within the same cell type.

Transcription regulation relies on many different levels of control, which are summarized in the OP quoted above, but a key role is certainly played by Transcription Factors (TFs), proteins that bind DNA and act as activators or inhibitors of transcription at specific sites.

TFs are a fascinating class of proteins. There are a lot of them (1600 – 2000 in humans, almost 10% of all proteins), and they are usually medium sized proteins, about 500 AA long, containing at least one highly conserved domain, the DNA binding domain (DBD), and other, often less understood, functional components.

I quote again here a recent review about human TFs:

The Human Transcription Factors

The NF-kB system is a system of TFs. I have discussed it in some detail in the discussion following the Ubiquitin thread, but I will describe it in a more systematic way here.

In general, I will refer a lot to this very recent paper about it:

Considering Abundance, Affinity, and Binding Site Availability in the NF-kB Target Selection Puzzle

The NF-kB system relies essentially on 5 different TFs (see Fig. 1 A in the paper):

  1. RelA  (551 AAs)
  2. RelB  (579 AAs)
  3. c-Rel  (619 AAs)
  4. p105/p50 (968 AAs)
  5. p100/p52  (900 AAs)

Those 5 TFs work by forming dimers, homodimers or heterodimers, for a total of 15 possible combinations, all of which have been found to work in the cell, even if some of them are much more common than others.
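The count of 15 is simply the number of unordered pairs, with repetition, that can be formed from 5 subunits (5 homodimers plus 10 heterodimers). A minimal Python sketch, purely as an illustration of the arithmetic (using the mature subunit names p50 and p52), is:

from itertools import combinations_with_replacement

subunits = ["RelA", "RelB", "c-Rel", "p50", "p52"]
# unordered pairs with repetition: 5 homodimers + C(5,2) = 10 heterodimers
dimers = list(combinations_with_replacement(subunits, 2))
print(len(dimers))  # 15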

Then there are at least 4 inhibitor proteins, collectively called IkBs.

The mechanism is apparently simple enough. The dimers are inhibited by IkBs and therefore they remain in the cytoplasm in inactive form.

When an appropriate signal arrives at the cell and is received by a membrane receptor, the inhibitor (the IkB molecule) is phosphorylated by a protein complex called IKK, and is then ubiquitinated and detached from the complex. The free dimer can then migrate to the nucleus and localize there, where it can act as a TF, binding DNA.

This is the canonical activation pathway, summarized in Fig. 1. There is also a non canonical activation pathway, that we will not discuss for the moment.


Mechanism of NF-κB action. In this figure, the NF-κB heterodimer consisting of Rel and p50 proteins is used as an example. While in an inactivated state, NF-κB is located in the cytosol complexed with the inhibitory protein IκBα. Through the intermediacy of integral membrane receptors, a variety of extracellular signals can activate the enzyme IκB kinase (IKK). IKK, in turn, phosphorylates the IκBα protein, which results in ubiquitination, dissociation of IκBα from NF-κB, and eventual degradation of IκBα by the proteasome. The activated NF-κB is then translocated into the nucleus where it binds to specific sequences of DNA called response elements (RE). The DNA/NF-κB complex then recruits other proteins such as coactivators and RNA polymerase, which transcribe downstream DNA into mRNA. In turn, mRNA is translated into protein, resulting in a change of cell function.

Attribution: Boghog2 at English Wikipedia [Public domain]

Now, the purpose of this OP is to show, in greater detail, how this mechanism, apparently moderately simple, is indeed extremely complex and dynamic. Let’s see.

The stimuli.

First of all, we must understand what stimuli, arriving at the cell membrane, are capable of activating the NF-kB system. IOWs, what are the signals that work as inputs.

The main concept is: the NF-kB system is a central pathway activated by many stimuli:

  1. Inflammation
  2. Stress
  3. Free radicals
  4. Infections
  5. Radiation
  6. Immune stimulation

IOWs, a wide variety of aggressive stimuli can activate the system.

The extracellular signal usually arrives at the cell through specific cytokines, for example TNF or IL1, or through pathogen-associated molecules, like bacterial lipopolysaccharides (LPS). Of course, there are different and specific membrane receptors, in particular IL-1R (for IL1), TNF-R (for TNF), and many TLRs (Toll-like receptors, for pathogen-associated structures). A special kind of activation is implemented, in B and T lymphocytes, by the immune activation of the specific receptors for antigen epitopes (the B cell receptor, BCR, and the T cell receptor, TCR).

The process through which the activated receptor can activate the NF-kB dimer is rather complex: in the canonical pathway, it involves a macromolecular complex called the IKK (IkB kinase) complex, comprising two catalytic kinase subunits (IKKa and IKKb) and a regulatory protein (IKKg/NEMO), and it involves the ubiquitin system in multiple and complex ways. The non-canonical pathway is a variation of that. Finally, a specific protein complex (the CBM complex or CBM signalosome) mediates the transmission from the immune BCR or TCR to the canonical pathway. See Fig. 2:

From: NF-κB Activation in Lymphoid Malignancies: Genetics, Signaling, and Targeted Therapy – Scientific Figure on ResearchGate. Available from: https://www.researchgate.net/figure/Increased-activity-of-the-CARMA1-BCL10-MALT1-signalosome-drives-constitutive-NF-kB_fig2_324089636 [accessed 10 Jul, 2019]
Figure 3 – NF-κB Activation in Lymphoid Malignancies: Genetics, Signaling, and Targeted Therapy
available via license: Creative Commons Attribution 4.0 International

I will not go into further details about this part, but those interested can have a look at this very good paper:

TLR-4, IL-1R and TNF-R signaling to NF-kB: variations on a common theme

In particular, Figs. 1, 2, and 3.

In the end, as a result of the activation process, the IkB inhibitor is degraded by the ubiquitin system, and the NF-kB dimer is free to migrate to the nucleus.

An important concept is that this is a “rapid-acting” response system: the dimers are already present, in inactive form, in the cytoplasm, and do not need to be synthesized de novo, so the system is ready to respond to the activating signal.

The response.

But what is the cellular response?

Again, there are multiple and complex possible responses.

Essentially, this system is a major regulator of innate and adaptive immune responses. As such, it has a central role in the regulation of inflammation, in immunity, in autoimmune processes, and in cancer.

Moreover, the NF-kB system is rather ubiquitous, and is present and active in many different cell types. And, as we have seen, it can be activated by different stimuli, in different ways.

So, the important point is that the response to activation must be (at least):

  1. Lineage-specific
  2. Stimulus-specific

IOWs, different cells must be able to respond differently, and each cell type must respond differently to different stimuli. That gives a wide range of possible gene expression patterns at the transcription level.

The following paper is a good review of the topic:

Selectivity of the NF-κB Response

For example, IL2 is induced by NF-kB activation in T cells, but not in B cells (lineage-specific response). Moreover, specific cell types can undergo specific, and often different, cell destinies after NF-kB activation: for example, NF-kB is strongly involved in the control and regulation of T and B cell development.

From:

30 years of NF-κB: a blossoming of relevance to human pathobiology

“B and T lymphocytes induce NF-κB in adaptive immune responses through the CARD11:Bcl10:MALT1 (CBM) complex (Hayden and Ghosh, 2008). Newly expressed genes promote lymphocyte proliferation and specific immune functions including antibody production by B cells and the generation of cytokines and other anti-pathogen responses by T cells.”

And, in the same cell type, certain promoters regulated by NF-kB require additional signaling (for example, in human dendritic cells the promoters for Il6, Il12b, and MCP-1 require additional p38-dependent histone phosphorylation to be activated), while others can be activated directly (stimulus-specific response).

So, to sum up:

  1. A variety of stimuli can activate the system in different ways
  2. The system itself has its complexity (different dimers)
  3. The response can be widely different, according to the cell type where it happens, and to the type of stimuli that have activated the system, and probably according to other complex variables.
  4. The possible responses include a wide range of regulations of inflammation, of the immune system, of cell specifications or modifications, and so on.

How does it work?

So, what do we know about the working of such a system?

I will ignore, for the moment, the many complexities of the activation pathways, both canonical and non-canonical, the role of cytokines and receptors and IKK complexes, and the many facets of NEMO and of the involvement of the ubiquitin system.

For simplicity, we will start with the activated system: the IkB inhibitor has been released from the inactive complex in the cytoplasm, and some form of NF-kB dimer is ready to migrate to the nucleus.

Let’s remember that the purpose of this OP is to show that the system works as a dynamic, far from equilibrium system, rather than as a “traditional” machine. And that such a way to work is an even more amazing example of design and functional complexity.

To do that, I will rely mainly on the recent paper quoted at the beginning:

Considering Abundance, Affinity, and Binding Site Availability in the NF-kB Target Selection Puzzle

The paper is essentially about the NF-kB Target Selection Puzzle. IOWs, it tries to analyze what we know about the specificity of the response. How are specific patterns of transcription achieved after the activation of the system? What mechanisms allow the selection of the right genes to be transcribed (the targets) to implement the specific patterns according to cell type, context, and type of stimuli?

A “traditional” view of the system as a machine would try to establish rather fixed connections. For example, some type of dimer is connected to specific stimuli and evokes specific gene patterns. Or some other components modulate the effect of NF-kB, generating diversification and specificity of the response.

Well, those ideas are not completely wrong. In a sense, the system does work also that way. Dimer specificity has a role. Other components have a role. In a sense, but only in a sense, the system works as though it were a traditional machine, and uses some of the mechanisms that we find in the concept of a traditional biological machine.

But that is only a tiny part of the real thing.

The real thing is that the system really works as a dynamic, far from equilibrium system, harnessing huge random/stochastic components to achieve robustness, complexity, and flexibility of behavior in spite of all those non-finalistic parts.

Let’s see how that happens, at least for the limited understanding we have of it. It is important to consider that this is a system that has been studied a lot, for decades, because of its central role in so many physiological and pathological contexts, and so we know many things. But still, our understanding is very limited, as you will see.

So, let’s go back to the paper. I will try to summarize as simply as possible the main concepts. Anyone who is really interested can refer to the paper itself.

Essentially, the paper analyzes three important and different aspects that contribute to the selection of targets at the genomic level by our TFs (IOWs, our NF-kB dimers, ready to migrate to the nucleus). As the title itself summarizes, they are:

  1. Abundance
  2. Affinity
  3. Binding site availability

1. Abundance

Abundance here refers to two different variables: the abundance of NF-kB Binding Sites in the genome and the abundance of Nucleus-Localized NF-kB Dimers. Let’s consider them separately.

1a) Abundance of NF-kB Binding Sites in the genome:

It is well known that TFs bind specific sites in the genome. For NF-kB TFs, the following consensus kB site pattern has been found:

 5′-GGGRNWYYCC-3′

where R, W, Y, and N denote, respectively, purine, adenine or thymine, pyrimidine, and any nucleotide.

That simply means that any sequence corresponding to that pattern in the genome can, in principle, bind NF-kB dimers.

So the problem is: how many such sequences exist in the human genome?

Well, a study based on RelA has estimated about 10^4 consensus sequences in the whole genome, but as NF-kB dimers seem to bind even incomplete consensus sites, the total number of potential binding sites could be nearer to 10^6.
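Just to make the degenerate pattern concrete, here is a minimal Python sketch (my own illustration, not taken from the paper) that counts matches to the consensus on both strands of a DNA string; the genome-wide numbers quoted above come, of course, from real analyses, not from a toy scan like this:

import re

# 5'-GGGRNWYYCC-3' with R=[AG], N=[ACGT], W=[AT], Y=[CT];
# the lookahead allows overlapping matches to be counted.
KB_SITE = re.compile(r"(?=(GGG[AG][ACGT][AT][CT][CT]CC))")

def reverse_complement(seq):
    return seq.upper().translate(str.maketrans("ACGT", "TGCA"))[::-1]

def count_kb_sites(seq):
    seq = seq.upper()
    forward = sum(1 for _ in KB_SITE.finditer(seq))
    reverse = sum(1 for _ in KB_SITE.finditer(reverse_complement(seq)))
    return forward + reverse

# Toy example: GGGACTTTCC is one sequence matching the consensus.
print(count_kb_sites("ttatGGGACTTTCCatta"))  # 1

On a real genome one would also have to decide how to treat near-consensus and repetitive sequences, which is exactly why the estimates above span two orders of magnitude.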

1b) Abundance of Nucleus-Localized NF-kB Dimers:

An estimate of the abundance of dimers in the nucleus after activation of the system is about 1.5 × 10^5 molecules, but again that is derived from studies of RelA only. Moreover, the number of molecules and the type of dimer can probably vary considerably according to cell type.

So the crucial variable, the ratio between binding sites and available dimers, which could help us understand the degree of site saturation in the nucleus, remains rather undetermined, and it seems very likely that it varies a lot in different circumstances.
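As a back-of-the-envelope illustration of why the saturation question stays open, here is the simple arithmetic with the round numbers quoted above (assumed values, derived from RelA only; real numbers certainly vary by cell type and stimulus):

nuclear_dimers = 1.5e5        # estimated nucleus-localized dimers (RelA studies)
strict_consensus_sites = 1e4  # full consensus kB sites in the genome
relaxed_sites = 1e6           # including incomplete / non-consensus sites

print(nuclear_dimers / strict_consensus_sites)  # 15.0 -> apparent excess of dimers
print(nuclear_dimers / relaxed_sites)           # 0.15 -> clear shortage of dimers

Depending on which estimate of usable sites is closer to the truth, the same number of dimers goes from apparent excess to clear shortage, which is consistent with the lack of saturation discussed below.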

But there is another very interesting aspect about the concentration of dimers in the nucleus. According to some studies, NF-kB seems to generate oscillations of its nuclear concentration in some cell types, and those oscillations can be a way to generate specific transcription patterns:

NF-kB oscillations translate into functionally related patterns of gene expression

For example, this very recent paper:

NF-κB Signaling in Macrophages: Dynamics, Crosstalk, and Signal Integration

shows in Fig. 3 the occupancy curve of binding sites at the nuclear level after NF-kB activation in two different cell types.

In fibroblasts, the curve is a periodic oscillation, with a frequency that varies according to various factors, and translates into different transcription scenarios accordingly:

Gene expression dynamics scale with the period (g1) and amplitude (g2) of these oscillations, which are influenced by variables such as signal strength, duration, and receptor identity.


In macrophages, instead, the curve is rather:

a single, strong nuclear translocation event which persists for as long as the stimulus remains and tends to remain above baseline for an extended period of time.

In this case, the type of transcription will probably be regulated by the area under the curve, rather than by the period and amplitude of the oscillations, as happens in fibroblasts.
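To make the two readouts concrete, here is a small Python sketch with two hand-made nuclear NF-kB time courses, one oscillatory and one sustained (purely illustrative shapes and units, not the measured curves from the paper), each summarized by its peak amplitude and its area under the curve:

import numpy as np

t = np.linspace(0, 300, 601)  # minutes after stimulation, 30 s steps

# Illustrative shapes only: a damped ~100-minute oscillation vs a sustained plateau.
fibroblast_like = np.exp(-t / 200.0) * 0.5 * (1 - np.cos(2 * np.pi * t / 100.0))
macrophage_like = 1 - np.exp(-t / 20.0)

def summarize(x, t):
    auc = float(np.sum((x[1:] + x[:-1]) / 2 * np.diff(t)))  # trapezoidal area under the curve
    return {"peak": round(float(x.max()), 2), "auc": round(auc, 1)}

print("fibroblast-like:", summarize(fibroblast_like, t))
print("macrophage-like:", summarize(macrophage_like, t))

In the oscillatory case the period and amplitude carry the information; in the sustained case it is mainly the area under the curve that matters.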

Interestingly, while in previous studies it seemed that the concentration of nuclear dimers could be sufficient to saturate most or all binding sites, that has been found not to be the case in more recent studies. Again from the paper about abundance:

in fact, this lack of saturation of the system is necessary to generate stimulus- and cell-type specific gene expression profiles

Moreover, the binding itself seems to be rather short-lived:

Interestingly, it is now thought that most functional NF-kB interactions with chromatin—interactions that lead to a change in transcription—are fleeting… a subsequent study using FRAP in live cells expressing RelA-GFP showed that most RelA-DNA interactions are actually quite dynamic, with half-lives of a few seconds… Indeed, a recent study used single-molecule tracking of individual Halo-tagged RelA molecules in live cells to show that the majority (∼96%) of RelA undergoes short-lived interactions lasting on average ∼0.5 s, while just ∼4% of RelA molecules form more stable complexes with a lifetime of ∼4 s.

2. Affinity

Affinity of dimers for DNA sequences is not a clear cut matter. From the paper:

Biochemical DNA binding studies of a wide variety of 9–12 base-pair sequences have revealed that different NF-kB dimers bind far more sequences than previously thought, with different dimer species exhibiting specific but overlapping affinities for consensus and non-consensus kB site sequences.

IOWs, we have different dimers (15 different types) binding different DNA sequences (starting from the classical consensus sequence, but also including incomplete sequences) with varying affinity. Remember that those sequences are rather short (the consensus sequence is 10 nucleotides long), and that there are thousands of such sequences in the genome.

Moreover, different bindings can affect transcription differently. Again, from the paper:

How might different consensus kB sites modulate the activity of the NF-kB dimers? Structure-function studies have shown that binding to different consensus kB sites can alter the conformation of the bound NF-kB dimers, thus dictating dimer function. When an NF-kB dimer interacts with a DNA sequence, side chains of the amino acids located in the DNA-binding domains of dimers contact the bases exposed in the groove of the DNA. For different consensus kB site sequences different bases are exposed in this groove, and NF-kB seems to alter its conformation to maximize interactions with the DNA and maintain high binding affinity. Changes in conformation may in turn impact NF-kB binding to co-regulators of transcription, whether these are activating or inhibitory, to specify the strength and dynamics of the transcriptional response. These findings again highlight how the huge array of kB binding site sequences must play a key role in modulating the transcription of target genes.

Quite a complex scenario, I would say!

But there is more:

Finally, as an additional layer of dimer and sequence-specific regulation, each of the subunits can be phosphorylated at multiple sites with, depending on the site, effects on nearly every step of NF-kB activation.

IOWs, the 15 dimers we have mentioned can be phosphorylated in many different ways, and that changes their binding affinities and their effects on transcription.

This section of the paper ends with a very interesting statement:

Overall, when considering the various ways in which NF-kB dimer abundances and their affinity for DNA can be modulated, it becomes clear that with these multiple cascading effects, small differences in consensus kB site sequences and small a priori differences in interaction affinities can ultimately have a large impact on the transcriptional response to NF-kB pathway activation.

Emphasis mine.

This is interesting, because in some way it seems to suggest that the whole system acts like a chaotic system, at least at some basic level. IOWs, small initial differences, maybe even random noise, can potentially have a deep effect on the general working of the whole system.
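To illustrate what “chaotic” means here, the following toy Python sketch (a generic logistic map, not in any way a model of NF-kB) shows how two trajectories that start one part in a billion apart become completely different after a few dozen iterations:

def logistic(x0, r=3.9, steps=50):
    # x_{n+1} = r * x_n * (1 - x_n), in the chaotic regime for r = 3.9
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic(0.200000000)
b = logistic(0.200000001)
for n in (0, 10, 25, 50):
    print(n, abs(a[n] - b[n]))  # the 1e-9 initial difference grows to order one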

Unless, of course, there is some higher, powerful level of control.

3. Availability of high affinity kB binding sequences

We have seen that there is a great abundance and variety of binding sequences for NF-kB dimers in the human genome. But, of course, those sequences are not necessarily available. Different cell types will have a different scenario of binding sites availability.

Why?

Because, as we know, the genome and chromatin are a very dynamic system that can exist in many different states, continuously changing in different cell types and, in the same cell type, in different conditions.

We know rather well the many levels of control that affect DNA and chromatin state. In brief, they are essentially:

  1. DNA methylation
  2. Histone modifications (methylation, acetylation, etc)
  3. Chromatin modifications
  4. Higher levels of organization, including nuclear localization and TADs (Topologically Associating Domains)

For example, from the paper:

The promoter regions of early response genes have abundant histone acetylation or trimethylation prior to stimulation [e.g., H3K27ac, (67) and H4K20me3, (66)], a chromatin state “poised” for immediate activation…  In contrast, promoters of late genes often have hypo-acetylated histones, requiring conformational changes to the chromatin to become accessible. They are therefore unable to recruit NF-kB for up to several hours after stimulation (68), due to the slow process of chromatin remodeling.

We must remember that each wave of NF-kB activation translates into the modified transcription of a lot of different genes at the genome level. It is therefore extremely important to consider which genes are available (IOWs, which genes have promoters that can be reached by the NF-kB signal) in each cell type and cell state.

The paper concludes:

Taken together, chromatin state and chromatin organization strongly influence the selection of DNA binding sites by NF-kB dimers and, most likely, the selection of the target genes that are regulated by these protein-DNA interaction events. Analyses that consider binding events in the context of three-dimensional nuclear organization and chromatin composition will be required to generate a more accurate view of the ways in which NF-kB-DNA binding affects gene transcription.

This is the main scenario. But there are other components that I have not considered in detail for the sake of brevity, for example the competition between NF-kB dimers and the complex role and intervention of other co-regulators of transcription.

Does the system work?

But does the system work?

Of course it does. It is a central regulator, as we have said, of many extremely important biological processes, above all immunity. This is the system that decides how immune cells, T and B lymphocytes, have to behave, in terms of cell destiny and cell state. It is of huge relevance in all inflammatory responses, and in our defense against infections. It works, it works very well.

And what happens if it does not work properly?

Of course, like all very complex systems, errors can happen. Those interested can have a look at this recent paper:

30 years of NF-κB: a blossoming of relevance to human pathobiology

First of all, many serious genetic diseases have been linked to mutations in genes involved in the system. You can find a list in Table 1 of the above paper. Among them, for example, some forms of SCID, Severe combined immunodeficiency, one of the most severe genetic diseases of the immune system.

But, of course, a dysfunction of the NF-kB system has a very important role also in autoimmune diseases and in cancer.

Conclusions.

So, let’s try to sum up what we have seen here in the light of the original statement about biological systems that “are not machines”.

The NF-kB system is a perfect example. Even if we still understand very little of how it works, it is rather obvious that it is not a traditional machine.

A traditional machine would work differently. The signal would be transmitted from the membrane to the nucleus in the simplest possible way, without ambiguities and diversions. The Transcription Factor, once activated, would bind, at the level of the genome, very specific sites, each of them corresponding to a definite cascade of specific genes. The result would be clear cut, almost mechanical. Like a watch.

But that’s not the way things happen. There are myriads of variations, of ambiguities, of stochastic components.

The signal arrives at the membrane in multiple ways, very different from one another: IL1, IL17, TNF, bacterial LPS, and immune activation of the B cell receptor (BCR) or the T cell receptor (TCR) are all possible signals.

The signal is translated to the NF-kB proteins in very different ways: canonical or non canonical activation, involving complex protein structures such as:

The CBM signalosome, intermediate between immune activation of BCR or TCR and canonical activation of the NF-kB. This complex is made of at least three proteins, CARD11, Bcl10 and MALT1.

The IKK complex in canonical activation: this is made of three proteins, IKK alpha, IKK beta, and NEMO. Its purpose is to phosphorylate IkB, the inhibitor of the dimers, so that it can be ubiquitinated and released from the dimer. Then the dimer can relocate to the nucleus.

The non-canonical pathway: it involves the following phosphorylation cascade: NIK -> IKK alpha dimer -> RelB-p100 dimer -> RelB-p52 dimer (the final TF). It operates during the development of lymphoid organs and is responsible for the generation of B and T lymphocytes.

Different kinds of activated dimers relocate to the nucleus.

Different dimers, in varying abundance, interact with many different binding sites: complete or incomplete consensus sites, and probably others. The interaction is usually brief, and it can generate an oscillating pattern, or a more stable one.

Completely different sets of genes are transcribed in different cell types and in different contexts, because of the interaction of NF-kB TFs with their promoters.

Many other factors and systems contribute to the final result.

The chromatin state of the cell at the moment of the NF-kB activation is essential to determine the accessibility of different binding sites, and therefore the final transcription pattern.

All these events and interactions are quick, unstable, far from equilibrium. A lot of possible random noise is involved.

In spite of that amazing complexity and the potentially stochastic nature of the system, reliable transcription regulation and results are obtained in most cases. Those results are essential to immune cell differentiation, immune response, both innate and adaptive, inflammation, apoptosis, and many other crucial cellular processes.

So, let’s go back to our initial question.

Is this the working of a machine?

Of course it is! Because the results are purposeful, reasonably robust and reliable, and govern a lot of complex processes with remarkable elegance and efficiency.

But certainly, it is not a traditional machine. It is a lot more complex. It is a lot more beautiful and flexible.

It works with biological realities and not with transistors and switches. And biological realities are, by definition, far from equilibrium states, improbable forms of order that must continuously recreate themselves, fighting against the thermodynamic disorder and the intrinsic random noise that should apparently dominate any such scenario.

It is more similar to a set of extremely clever surfers who succeed in performing elegant and functional figures and motions in spite of the huge contrasting waves.

It is, from all points of view, amazing.

Now, Paley was absolutely right. No traditional machine, like a watch, could ever originate without design.

And if that is true of a watch, with its rather simple and fixed mechanisms, how much truer must it be for a system like NF-kB? Or, for that matter, for any complex cellular system?

Do you still have any doubts?

Added graphic: The evolutionary history, in terms of human conserved information, of the three proteins in the CBM signalosome.
On the y axis, homologies with the human protein as bits per aminoacid (bpa). On the x axis, approximate time of appearance in millions of years.
The graphic shows the big information jump in vertebrates for all three proteins, especially CARD11.
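For concreteness, a bits-per-aminoacid value of this kind can be computed as an alignment bit score (for example, from a BLASTP comparison between the human protein and its closest homolog in a given taxon) divided by the length of the human protein. A minimal sketch, with a purely hypothetical bit score (not a measured value) and the RelA length given above:

def bits_per_aa(bit_score, human_protein_length):
    # homology expressed as alignment bits per aminoacid of the human protein
    return bit_score / human_protein_length

# Hypothetical example: a bit score of 800 against the 551-AA human RelA
print(round(bits_per_aa(800.0, 551), 2))  # 1.45 bpa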


Added graphic: two very different proteins and their functional history


Added graphic (for Bill Cole). Functional history of Prp8, collagen, p53.
Comments
To all: Of course, I will make the clarifications about transposons as soon as possible.
gpuccio
July 15, 2019 at 08:03 AM PDT
Bornagain77: OK, I think I will leave it at that with you. Even if you don't.
gpuccio
July 15, 2019 at 08:02 AM PDT
Gp has, in a couple of instances now, tried to imply that I (and others) do not understand randomness. In regards to Shapiro Gp states,
Moreover, the reference to “statistically significant non-random patterns” could simply point to some necessity effect that modifies the probability distribution, like in the case of the loaded dice. As explained, that does not make the system “non-random”. And that has nothing to do with guidance, design or creation.
Might I suggest that it is Gp himself that does not understand randomness. As far as I can tell, Gp presupposes complete randomness within his model (completely free from 'loaded dice'), and this is one of the main reasons that he states that he can think of no "other possible explanation" to explain the sequence data. Yet, if 'loaded dice' are producing "statistically significant non-random patterns" within genomes then that, of course, falsifies Gp's assumption of complete randomness in his model. Like I stated before, 'directed' mutations (and/or 'loaded dice' to use Gp's term) are 'another possible explanation' that I can think of.
bornagain77
July 15, 2019 at 07:23 AM PDT
Gp 77 and 85 disingenuously claims that he is the one being 'scientific' while trying, as best he can, to keep God out of his science. Hogwash! His model specifically makes claims as to what he believes the designer, i.e. God, is and is not doing. i.e. Johnny Cash's 'One Piece at a Time". Perhaps Gp falsely believes that if he compromises his theology enough that he is somehow being more scientific than I am? Again Hogwash. As I have pointed out many times, assuming Methodologcal Naturalism as a starting assumption, (as Gp seems bent on doing in his model as far as he can do it without invoking God), results in the catastrophic epistemological failure of science itself. (See bottom of post for refutation of methodological naturalism) Bottom line, Gp, instead of being more scientific than I, as he is falsely trying to imply (much like Darwinists constantly try to falsely imply), has instead produced a compromised, bizarre, and convoluted, model. A model that IMHO does not stand up to even minimal scrutiny. And a model that no self respecting Theist or even Darwinist would ever accept as being true. A model that, as far as I can tell, apparently only Gp himself accepts as being undeniably true..
As I have pointed out several times now, assuming Naturalism instead of Theism as the worldview on which all of science is based leads to the catastrophic epistemological failure of science itself. Basically, because of reductive materialism (and/or methodological naturalism), the atheistic materialist is forced to claim that he is merely a ‘neuronal illusion’ (Coyne, Dennett, etc..), who has the illusion of free will (Harris), who has unreliable beliefs about reality (Plantinga), who has illusory perceptions of reality (Hoffman), who, since he has no real time empirical evidence substantiating his grandiose claims, must make up illusory “just so stories” with the illusory, and impotent, ‘designer substitute’ of natural selection (Behe, Gould, Sternberg), so as to ‘explain away’ the appearance (i.e. illusion) of design (Crick, Dawkins), and who must make up illusory meanings and purposes for his life since the reality of the nihilism inherent in his atheistic worldview is too much for him to bear (Weikart), and who must also hold morality to be subjective and illusory since he has rejected God (Craig, Kreeft). Bottom line, nothing is real in the atheist’s worldview, least of all, morality, meaning and purposes for life.,,, – Darwin’s Theory vs Falsification – video – 39:45 minute mark https://youtu.be/8rzw0JkuKuQ?t=2387 Thus, although the Darwinist may firmly believes he is on the terra firma of science (in his appeal, even demand, for methodological naturalism), the fact of the matter is that, when examining the details of his materialistic/naturalistic worldview, it is found that Darwinists/Atheists are adrift in an ocean of fantasy and imagination with no discernible anchor for reality to grab on to. It would be hard to fathom a worldview more antagonistic to modern science than Atheistic materialism and/or methodological naturalism have turned out to be. 2 Corinthians 10:5 Casting down imaginations, and every high thing that exalteth itself against the knowledge of God, and bringing into captivity every thought to the obedience of Christ;
bornagain77
July 15, 2019 at 06:38 AM PDT
ET at #83: "Yes, the algorithm would be more complex than the structure." OK. "So what? Where is the algorithm? With the Intelligent Designer." ??? What do you mean? I really don't understand. "A trace of it is in the structure itself." The structure allows us to infer design. I don't see what in the structure points to some specific algorithm. Can you help? "The algorithm attempts to answer the question of how ATP synthase was intelligently designed." OK, I am not saying that the designer did not use any algorithm. Maybe the designer is there in his lab, and has a lot of computers working for him in the process. But: a) he probably designed the computers too; b) his conscious cognition is absolutely necessary to reach the results. Computers do the computations, but it's consciousness that defines purposes, and finds strategies. However, design happens when the functional information is inputted into the material object we observe. So, if the designer inputs the information after having computed it in his lab, that is not really relevant. I thought that your mention of an algorithm meant something different. I thought you meant that the designer designs an algorithm and puts it in some existing organism (or place), and that such an algorithm then computes ATP synthase or whatever else. So, if that is your idea, again I ask: what facts support the existence of such an independent physical algorithm in physical reality? The answer is simple enough: none at all. "Of course an omnipotent intelligent designer wouldn't require that and could just design one from its mind." I have no idea if the biological designer is omnipotent, or if he designs things from his mind alone, or if he uses computers or watches or anything else in the process. I only know that he designs biological things, and must be conscious, intelligent and purposeful.
gpuccio
July 15, 2019 at 06:29 AM PDT
Bornagain77 at #82:
Gp in 77 tried to imply he was completely theologically neutral. That is impossible.
Emphasis mine. That's unfair and not true. I quote myself at #77: "One of my strong choices is that my philosophy of science (and my theology, too) tell me that my scientific reasonings must not (as far as it is humanly possible) be influenced by my theology. In any way. So, I really strive to achieve that (and it’s not easy)." No comments. You see, the difference between your position and my position is that you are very happy to derive your scientific ideas from your theology. I try as much as possible not to do that. As said, both are strong choices. And I respect choices. But that's probably one of the reasons why we cannot really communicate constructively about scientific things.gpuccio
July 15, 2019 at 06:16 AM PDT
Bornagain77 at #69 and #76 (and to all): OK, so some people apparently disagree with me. I will try to survive. But I would insist on the "apparently", because again, IMO, you make some confusion in your quotes and their intepretation. Let's see. At #69, you make 6 quote (excluding the internal reference to ET): 1. Shapiro. I don't think I can comment on this one. The quote is too short, and I have not the book to check the context. However, the reference to "genome change operator" is not very clear. Moreover, the reference to "statistically significant non-random patterns" could simply point to some necessity effect that modifies the probability distribution, like in the case of the loaded dice. As explained, that does not make the system "non-random". And that has nothing to do with guidance, design or creation. 2. Noble. That "genetic change is far from random and often not gradual" is obvious. It is not random because it is designed, and it is well known that it is not gradual. I perfectly agree. That has nothing to do with random mutations, because design is of course not implemented by random mutations. This is simply a criticism of model a. Another point is that some epigenetic modification can be inherited. Again, I have nothing against that. But of course I don't believe that such a mechanism can create complex functional information and body plans. Neither do you, I believe. You say you believe in the "creation of kinds". 3. and 4. Stermberg and the PLOS paper. These are about transposons. I will address this topic specifically at the end of this post. 5. The other PLOS paper. Here is the abstract:
Abstract Mutations drive evolution and were assumed to occur by chance: constantly, gradually, roughly uniformly in genomes, and without regard to environmental inputs, but this view is being revised by discoveries of molecular mechanisms of mutation in bacteria, now translated across the tree of life. These mechanisms reveal a picture of highly regulated mutagenesis, up-regulated temporally by stress responses and activated when cells/organisms are maladapted to their environments—when stressed—potentially accelerating adaptation. Mutation is also nonrandom in genomic space, with multiple simultaneous mutations falling in local clusters, which may allow concerted evolution—the multiple changes needed to adapt protein functions and protein machines encoded by linked genes. Molecular mechanisms of stress-inducible mutation change ideas about evolution and suggest different ways to model and address cancer development, infectious disease, and evolution generally.
This is simple. The paper, again, uses the term "random" and "not random" incorrectly. It is obvious in the first phrase. The authors complain that mutations do not occur "roughly uniformly" in the genome, and that would make them not random. But, as explained, the uniform distribution is only one of the many probability distributions that describe well natural phenomena. For example, many natural systems are well described, as well known, by a normal distribution, which has nothing to do with an uniform distribution. That does not mean that they are not random systems. The criticism to graduality I have already discussed: I obviously agree, but the only reason for non gradual variation is design. Indeed, neutral mutations are instead gradual, because they are not designed. And what's the problem with "environmental inputs"? We know very well that environmental inputs change the rate, and often the type, of mutation. Radiations, for example, do that. We have known that for decades. That is no reason to say that mutations are not random. They are random, and environmental inputs do modify the probability distribution. A lot. Are these authors really discovering, in 2019, that a lor of leukemias were caused by the bomb in Hiroshima? 6. Wells. He is discussing the interesting concept of somatic genomic variation. Here is the abstract of the paper to which he refers:
Genetic variation between individuals has been extensively investigated, but differences between tissues within individuals are far less understood. It is commonly assumed that all healthy cells that arise from the same zygote possess the same genomic content, with a few known exceptions in the immune system and germ line. However, a growing body of evidence shows that genomic variation exists between differentiated tissues. We investigated the scope of somatic genomic variation between tissues within humans. Analysis of copy number variation by high-resolution array-comparative genomic hybridization in diverse tissues from six unrelated subjects reveals a significant number of intra-individual genomic changes between tissues. Many (79%) of these events affect genes. Our results have important consequences for understanding normal genetic and phenotypic variation within individuals, and they have significant implications for both the etiology of genetic diseases such as cancer and for immortalized cell lines that might be used in research and therapeutics.
As you can see (if you can read that abstract impartially) the paper does not mention in any way anything that supports Wells' final (and rather gratuitous) statement: "From what I now know as an embryologist I would say that the truth is the opposite: Tissues and cells, as they differentiate, modify their DNA to suit their needs. It's the organism controlling the DNA, not the DNA controlling the organism." Indeed, the paper says the opposite: that somatic genomic variations are important to better understand "the etiology of genetic diseases such as cancer". Why? The reason is simple: because they are random mutations, often deleterious. Ah, and by the way: of course somatic mutations cannot be inherited, and therefore have no role in building the functional information in organisms. So, as you can see (but will not see) you are making a lot of confusion with your quotations. The only interesting topic is transposons. But it's late, so I will discuss that topic later, in the next post.
gpuccio
July 15, 2019 at 06:06 AM PDT
gpuccio:
Because, of course, the algorithm would be by far more complex than the result. And where is that algorithm? there is absolutely no trace of it.
Yes, the algorithm would be more complex than the structure. So what? Where is the algorithm? With the Intelligent Designer. A trace of it is in the structure itself. The algorithm attempts to answer the question of how ATP synthase was intelligently designed. Of course an omnipotent intelligent designer wouldn't require that and could just design one from its mind.
ET
July 15, 2019 at 06:05 AM PDT
Gp in 77 tried to imply he was completely theologically neutral. That is impossible. Besides science itself being impossible without basic Theological presuppositions (about the rational intelligibility of the universe and of our minds to comprehend it), any discussion of origins necessarily entails Theological overtones. It simply can't be avoided. Gp is trying to play politics instead of being honest. Perhaps next GP will try to claim that he is completely neutral in regards to breathing air. :)
bornagain77
July 15, 2019 at 05:48 AM PDT
Basically I believe one of Gp's main flaws in his model is that he believes that the genome is basically static and most all the changes to the genome that do occur are the result of randomness (save for when God intervenes at the family level to introduce ''some' new information whilst saving parts of the genome that have accumulated changes due to randomness). Yet the genome is now known to be dynamic and not to be basically static.
Neurons constantly rewrite their DNA - Apr. 27, 2015 Excerpt: They (neurons) use minor "DNA surgeries" to toggle their activity levels all day, every day.,,, "We used to think that once a cell reaches full maturation, its DNA is totally stable, including the molecular tags attached to it to control its genes and maintain the cell's identity," says Hongjun Song, Ph.D.,, "This research shows that some cells actually alter their DNA all the time, just to perform everyday functions.",,, ,,, recent studies had turned up evidence that mammals' brains exhibit highly dynamic DNA modification activity—more than in any other area of the body,,, http://medicalxpress.com/news/2015-04-neurons-constantly-rewrite-dna.html A Key Evidence for Evolution Involving Mobile Genetic Elements Continues to Crumble - Cornelius Hunter - July 13, 2014 Excerpt: The biological roles of these place-jumping, repetitive elements are mysterious. They are largely viewed (by Darwinists) as “genomic parasites,” but in this study, researchers found the mobile DNA can provide genetic novelties recruited as certain population-unique, functional enrichments that are nonrandom and purposeful. “The first shocker was the sheer volume of genetic variation due to the dynamics of mobile elements, including coding and regulatory genomic regions, and the second was amount of population-specific insertions of transposable DNA elements,” Michalak said. “Roughly 50 percent of the insertions were population unique.” http://darwins-god.blogspot.com/2014/07/a-key-evidence-for-evolution-involving.html Contrary to expectations, genes are constantly rearranged by cells - July 7, 2017 Excerpt: Contrary to expectations, this latest study reveals that each gene doesn’t have an ideal location in the cell nucleus. Instead, genes are always on the move. Published in the journal Nature, researchers examined the organisation of genes in stem cells from mice. They revealed that these cells continually remix their genes, changing their positions as they progress though different stages. https://uncommondescent.com/intelligent-design/researchers-contrary-to-expectations-genes-are-constantly-rearranged-by-cells/
And again, DNA is now, contrary to what is termed to be 'the central dogma', far more passive than it was originally thought to be. As Denis Noble stated, “The genome is an ‘organ of the cell’, not its dictator”
“The genome is an ‘organ of the cell’, not its dictator” - Denis Noble – President of the International Union of Physiological Sciences
Another main flaw in Gp's 'Johnny Cash model', and as has been pointed out already, is that he assumes 'randomness' to be a defining notion for changes to the genome. This is the same assumption that Darwinists make. In fact, Darwinists. on top of that, also falsely assume 'random thermodynamic jostling' to be a defining attribute of the actions within a cell. Yet, advances in quantum biology have now overturned that foundational assumption of Darwinists, The first part of the following video recalls an incident where 'Harvard Biovisions' tried to invoke 'random thermodynamic jostling' within the cell to undermine the design inference. (i.e. the actions of the cell, due to advances in quantum biology, are now known to be far more resistant to 'random background noise' than Darwinists had originally presupposed:)
Darwinian Materialism vs. Quantum Biology – Part II – video https://www.youtube.com/watch?v=oSig2CsjKbg
Of supplemental note:
How Quantum Mechanics and Consciousness Correlate – video (how quantum information theory and molecular biology correlate – 27 minute mark) https://youtu.be/4f0hL3Nrdas?t=1634
bornagain77
July 15, 2019 at 05:41 AM PDT
EugeneS: Hi, Eugene, welcome anyway to the discussion, even for an off-topic! :)
gpuccio
July 15, 2019 at 05:15 AM PDT
Upright Biped, An off-topic. You have mail as of a long time ago :) I apologise for my long silence. I have changed jobs twice and have been quite under stress. Because of this I was not checking my non-business emails regularly. Hoping to get back to normal.
EugeneS
July 15, 2019 at 04:57 AM PDT
Bornagain77 at #76: For "God reusing stuff", see my previous post. For the rest, mutations and similar, see my next post (I need a little time to write it).
gpuccio
July 15, 2019 at 04:49 AM PDT
Bornagain77: "I note that my model is Theologically modest in that I hold to traditional concepts of the omniscience of God and God creating ‘kinds’ that reproduce after themselves, whereas, humorously, your model is all over the place Theologically speaking." "And as ET pointed out, Gp’s presupposition also makes no sense theologically speaking" I have ignored this kind of objection, but as you (and ET) insist, I will say just a few words. I believe that you are theologically committed in your discussions about science. This is not a big statement, I suppose, because it is rather obvious in all that you say. And it is not a criticism, believe me. It is your strong choice, and I appreciate people who make strong choices. But, of course, I don't feel obliged to share those choices. You see, I too make my strong choices, and I like to remain loyal to them. One of my strong choices is that my philosophy of science (and my theology, too) tell me that my scientific reasonings must not (as far as it is humanly possible) be influenced by my theology. In any way. So, I really strive to achieve that (and it's not easy). This is, for me, an important question of principle. So, I will not answer any argument that makes any reference to theology, or even simply to God, in a scientific discussion. Never. So, excuse me if I will go on ignoring that kind of remarks from you or others. It's not out of discourtesy. It's to remain loyal to my principles.gpuccio
July 15, 2019 at 04:47 AM PDT
Gp claims:
neutral signatures accumulate as differences as time goes on, because there is physical continuity. Creation or design from scratch for each organism cannot explain that. This is the argument that BA seems not to understand.
To be clear, Gp is arguing for a very peculiar, even bizarre, form of UCD where God reuses stuff and does not create families de novo (which is where Behe now puts the edge of evolution). Hence my reference to Johnny Cash's song "One Piece at a Time". Earlier, Gp also claimed that he could think of no other possible explanation to explain the data. I pointed out that 'directed' mutations are another possible explanation. Gp then falsely claimed that there is no such thing as directed mutations. Specifically he claimed, "Most mutations that we observe, maybe all, are random." Gp, whether he accepts it or not, is wrong in his claim that "maybe all mutations are random". Thus, Gp's "Johnny Cash" model is far weaker than he imagines it to be.
JOHNNY CASH – ONE PIECE AT A TIME – CADILLAC VIDEO https://www.youtube.com/watch?v=Hb9F2DT8iEQ
bornagain77
July 15, 2019 at 04:40 AM PDT
ET at #71: As far as I can understand, the divergence of polar bears is probably simple enough to be explained as adaptation under environmental constraints. This is not ATP synthase. Not at all. I don't know the topic well, so mine is just an opinion. However, bears are part of the family Ursidae, so brown bears and polar bears are part of the same family. So, if we stick to Behe's very reasonable idea that the family is probably the level which still requires design, this is an inside-family divergence.
gpuccio
July 15, 2019 at 02:19 AM PDT
ET at #70:
Evolution by means of intelligent design is active design.
Yes, it is.
Genetic changes don’t have to produce some perceived advantage in order to be directed.
Of course. That's exactly my point. See my post #43, this statement about my model (model b): "There is no need for functional intermediates in the fossil record or in the genomes. What happens in the lab does not leave traces. We do not need big intermediate populations to be expanded by positive NS, to gain new huge probabilistic resources (as in model a). We just need a few samples, a few intermediates, in a limited time and space. There is no reason to expect any relevant trace from that process." Emphasis added.
And if genetic entropy has interfered with the directed mutation function then that could also explain what you observe.
In my model, it does. You see, for anything to explain the differences created in time by neutral variation (my point 1 at post #43, what I call "signatures of neutral variation in the conserved sequences, grossly proportional to the evolutionary time split"), you definitely need physical continuity between different organisms. Otherwise, nothing can be explained. IOWs, neutral signatures accumulate as differences as time goes on, because there is physical continuity. Creation or design from scratch for each organism cannot explain that. This is the argument that BA seems not to understand.
And yes, ATP synthase was definitely intelligently designed.
Definitely.
Why can’t it be that it was intelligently designed via some sort of real genetic algorithm?
Because, of course, the algorithm would be by far more complex than the result. And where is that algorithm? there is absolutely no trace of it. It is no good to explain things with mere imagination, We need facts. Look, we are dealing with functional information here, not with some kind of pseudo-order that can be generated by some simple necessity laws coupled to random components. IOWs, this is not something that self-organization can even start to do. Of course, an algorithm could do it. If I had a super-computer already programmed with all possible knowledge about ciochemistry, and the computing ability to anticipate top down how protein sequences will fold and what biochemical activity they will have, and with a definite plan to look for some outcome that can transform a proton gradient into ATP, possibly with at least a strong starting plan that it should be something like a water mill, then yes, maybe that super-computer could be, in time, elaborate some relatively efiicient project on that basis. Of course, that whole apparatus would be much more complex than what we want to obtain. After all, ATP synthase has only a few thousand bits of functional information. Here we are discussing probably many gigabytes for the algorithm. That's the problem, in the end. Functional information can be generated only in two was: a) Direct design by a consious, intelligen, purposeful agent. Of course that agent may have to use previous data or knowledge, but the point is that its cognitive abilities and its ability to have purposes will create those shortcuts that no non design system can generate. b) Indirect design through some designed system complex enough to include a good programming of how to obtain some results. As said, that can work, but it has severe limitaitons. The designed system is already very complex, and the further functional information that can be obtained is usually very limited and simple. Why? Because the system, not being open to a further intervention of conaciousness and intelligence, can only do what it has been progarmmed to do. Nothing else. The purposes are only those purposes that have already been embedded at the beginning. Nothing else. The computations, all the apparently "intelligent" activities, are merely passive executions of intelligent programs already designed. They can do what they have been programmed to do, but nothing else. So, let's say that I want to program a system that can find a good solution for ATP-synthase. OK, I can do that (not me, of course, let's say some very intelligent designer). But I must already be conscious that I will need ATP.synthase, ir something like that. I must put that purpose in my system. And of course all the knowledge and power needed to do what I want it to do. Or, of course, I can just design ATP synthase and introduce that design in the system (that I have already designed myself soem time ago) if and when it is needed. Which is more probably true? Again, facts and only facts must guide us. ATP synthase, in a form very similar to what we observe today, was alredy present billion of years ago, when reasonably only prokaryotes were living on our planet. Was a complex algorithm capable of that kind of knowledge and computations present on our planet before the appearance of ATP synthase? In what form? What fatcs have we that support such an idea The truth is very simple. 
For all that we can know and reasonably infer, at some time, very early after our planet became compatible with any form of life, ATP synthase appeared, very much similar to what it is today, in some bacteria-like form of life. There is nothing to suggest, or support, or even make credible or reasonable, that any complex algorithm capable of computing the necessary information for it was present at that time. No such algorithm, or any trace of it, exists today. If we wanted to compute ATP synthase today, we would not have the palest idea of how to do it. These are the simple facts. Then anyone is free to believe as he likes. As for me, I stick to my model, and am very happy with it.

gpuccio
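As a quick illustration of the "few thousand bits versus many gigabytes" comparison above, here is a minimal back-of-envelope sketch in Python. It assumes the usual -log2-style estimate in which each fully constrained amino-acid position contributes about log2(20) ≈ 4.3 bits; the 500 constrained positions assumed for ATP synthase's conserved core and the 5 GB size assumed for a hypothetical design algorithm are illustrative assumptions, not measured values.

```python
import math

AA_ALPHABET = 20  # standard amino-acid alphabet size

def functional_bits(constrained_positions, alphabet=AA_ALPHABET):
    """Rough functional-information estimate, in bits: each fully
    constrained position contributes log2(alphabet) bits."""
    return constrained_positions * math.log2(alphabet)

# Illustrative assumptions, not measured values:
atp_synthase_bits = functional_bits(500)   # assume ~500 strongly conserved positions
algorithm_bits = 5 * 8 * 10**9             # a hypothetical 5 GB design algorithm

print(f"ATP synthase (assumed 500 constrained AAs): {atp_synthase_bits:,.0f} bits")
print(f"Hypothetical 5 GB design algorithm:         {algorithm_bits:,} bits")
print(f"Ratio: about {algorithm_bits / atp_synthase_bits:,.0f} to 1")
```

On those assumed numbers the protein comes out at roughly 2,000 bits while the hypothetical algorithm carries on the order of twenty million times more information, which is the comparison being made in the comment above.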
July 15, 2019 at 02:11 AM PDT
Upright BiPed: Hi UB, nice to hear from you! :) "Once again, where are your anti-ID critics?" As usual, they seem to have other interests. :) Luckily, some friends are ready to be fiercely antagonistic! :) Which is good, I suppose...

gpuccio
July 15, 2019 at 01:37 AM PDT
Another excellent post, GP, thank you for writing it. Reading through it now. Once again, where are your anti-ID critics?

Upright BiPed
July 14, 2019 at 09:49 PM PDT
And those polar bears. The change in the structure of the fur didn't happen by chance. So either the original population(s) of bears already had that variation, or they had the information required to produce it, with that information being teased out by the environmental changes and built-in responses to environmental cues.

ET
July 14, 2019 at 06:04 PM PDT
Evolution by means of intelligent design is active design. Genetic changes don't have to produce some perceived advantage in order to be directed. And if genetic entropy has interfered with the directed mutation function, then that could also explain what you observe. And yes, ATP synthase was definitely intelligently designed. Why can't it be that it was intelligently designed via some sort of real genetic algorithm?

ET
July 14, 2019 at 05:32 PM PDT
Gp adamantly states,
I beg to differ. Most mutations that we observe, maybe all, are random.
And yet Shapiro adamantly begs to differ,,,
"It is difficult (if not impossible) to find a genome change operator that is truly random in its action within the DNA of the cell where it works. All careful studies of mutagenesis find statistically significant non-random patterns” James Shapiro - Evolution: A View From The 21st Century - (Page 82)
Noble also begs to differ
Physiology is rocking the foundations of evolutionary biology - Denis Noble - 17 MAY 2013 Excerpt: The ‘Modern Synthesis’ (Neo-Darwinism) is a mid-20th century gene-centric view of evolution, based on random mutations accumulating to produce gradual change through natural selection.,,, We now know that genetic change is far from random and often not gradual.,,, http://onlinelibrary.wiley.com/doi/10.1113/expphysiol.2012.071134/abstract - Denis Noble – President of the International Union of Physiological Sciences
Richard Sternberg also begs to differ
Discovering Signs in the Genome by Thinking Outside the BioLogos Box - Richard Sternberg - March 17, 2010 Excerpt: The scale on the x-axis is the same as that of the previous graph--it is the same 110,000,000 genetic letters of rat chromosome 10. The scale on the y-axis is different, with the red line in this figure corresponding to the distribution of rat-specific SINEs in the rat genome (i.e., ID sequences). The green line in this figure, however, corresponds to the pattern of B1s, B2s, and B4s in the mouse genome....
*The strongest correlation between mouse and rat genomes is SINE linear patterning.
*Though these SINE families have no sequence similarities, their placements are conserved.
*And they are concentrated in protein-coding genes.,,,
,,, instead of finding nothing but disorder along our chromosomes, we are finding instead a high degree of order. Is this an anomaly? No. As I'll discuss later, we see a similar pattern when we compare the linear positioning of human Alus with mouse SINEs. Is there an explanation? Yes. But to discover it, you have to think outside the BioLogos box. http://www.evolutionnews.org/2010/03/signs_in_the_genome_part_2032961.html

Beginning to Decipher the SINE Signal - Richard Sternberg - March 18, 2010 Excerpt: So for a pure neutralist model to account for the graphs we have seen, ~300,000 random mutation events in the mouse have to match, somehow, the ~300,000 random mutation events in the rat. What are the odds of that? http://www.evolutionnews.org/2010/03/beginning_to_decipher_the_sine032981.html
Another paper along that line,
Recent comprehensive sequence analysis of the maize genome now permits detailed discovery and description of all transposable elements (TEs) in this complex nuclear environment. . . . The majority, perhaps all, of the investigated retroelement families exhibited non-random dispersal across the maize genome, with LINEs, SINEs, and many low-copy-number LTR retrotransposons exhibiting a bias for accumulation in gene-rich regions. http://journals.plos.org/plosgenetics/article?id=10.1371/journal.pgen.1000732
and another paper
PLOS Paper Admits To Nonrandom Mutation In Evolution - May 31, 2019 Abstract: “Mutations drive evolution and were assumed to occur by chance: constantly, gradually, roughly uniformly in genomes, and without regard to environmental inputs, but this view is being revised by discoveries of molecular mechanisms of mutation in bacteria, now translated across the tree of life. These mechanisms reveal a picture of highly regulated mutagenesis, up-regulated temporally by stress responses and activated when cells/organisms are maladapted to their environments—when stressed—potentially accelerating adaptation. Mutation is also nonrandom in genomic space, with multiple simultaneous mutations falling in local clusters, which may allow concerted evolution—the multiple changes needed to adapt protein functions and protein machines encoded by linked genes. Molecular mechanisms of stress-inducible mutation change ideas about evolution and suggest different ways to model and address cancer development, infectious disease, and evolution generally.” (open access) – Fitzgerald DM, Rosenberg SM (2019) What is mutation? A chapter in the series: How microbes “jeopardize” the modern synthesis. PLoS Genet 15(4): e1007995. https://uncommondescent.com/evolution/plos-paper-admits-to-nonrandom-mutation-in-evolution/
And as Jonathan Wells noted, "I now know as an embryologist,,,Tissues and cells, as they differentiate, modify their DNA to suit their needs. It's the organism controlling the DNA, not the DNA controlling the organism."
Ask an Embryologist: Genomic Mosaicism - Jonathan Wells - February 23, 2015 Excerpt: humans have a "few thousand" different cell types. Here is my simple question: Does the DNA sequence in one cell type differ from the sequence in another cell type in the same person?,,, The simple answer is: We now know that there is considerable variation in DNA sequences among tissues, and even among cells in the same tissue. It's called genomic mosaicism. In the early days of developmental genetics, some people thought that parts of the embryo became different from each other because they acquired different pieces of the DNA from the fertilized egg. That theory was abandoned,,, ,,,(then) "genomic equivalence" -- the idea that all the cells of an organism (with a few exceptions, such as cells of the immune system) contain the same DNA -- became the accepted view. I taught genomic equivalence for many years. A few years ago, however, everything changed. With the development of more sophisticated techniques and the sampling of more tissues and cells, it became clear that genetic mosaicism is common. I now know as an embryologist,,,Tissues and cells, as they differentiate, modify their DNA to suit their needs. It's the organism controlling the DNA, not the DNA controlling the organism. http://www.evolutionnews.org/2015/02/ask_an_embryolo093851.html
And as ET pointed out, Gp's presupposition also makes no sense theologically speaking
Just think about it: a Designer went through all of the trouble to produce various living organisms and place them on a changing planet in a changing universe. But the Designer is then going to leave it mostly to chance how those organisms cope with the changes? It just makes more sense that organisms were intelligently designed with the ability to adapt and evolve, albeit with genetic entropy creeping in.
bornagain77
July 14, 2019 at 04:47 PM PDT
OLV and all: Here is a database of known human lncRNAs: https://lncipedia.org/ It includes, at present, data for 127,802 transcripts and 56,946 genes. A joy for the fans of junk DNA! :) Let's look at one of these strange objects. MALAT-1 is one of the lncRNAs described in the paper at the previous post. Here is what the paper says:
MALAT1 Metastasis-associated lung adenocarcinoma transcript 1 (MALAT1) is a highly conserved lncRNA whose abnormal expression is considered to correlate with the development, progression and metastasis of multiple cancer types. Recently we reported the role of MALAT1 in regulating the production of cytokines in macrophages. Using PMA-differentiated macrophages derived from the human THP1 monocyte cell line, we showed that following stimulation with LPS, a ligand for the innate pattern recognition receptor TLR4, MALAT1 expression is increased in an NF-kB-dependent manner. In the nucleus, MALAT1 interacts with both p65 and p50 to suppress their DNA binding activity and consequently attenuates the expression of two NF-kB-responsive genes, TNF-a and IL-6. This finding is in agreement with a report based on in silico analysis predicting that MALAT1 could influence NF-kB/RelA activity in the context of epithelial–mesenchymal transition. Therefore, in LPS-activated macrophages MALAT1 is engaged in the tight control of the inflammatory response through interacting with NF-kB, demonstrating for the first time its role in regulating innate immunity-mediated inflammation. As MALAT1 is capable of binding hundreds of active chromatin sites throughout the human genome, the function and mechanism of action so far uncovered for this evolutionarily conserved lncRNA may be just the tip of an iceberg.
Emphasis mine, as usual. Now, if we look for MALAT-1 in the database linked above, we find 52 transcripts. The first one, MALAT1:1, has a size of 12819 nucleotides. Not bad! :) 342 papers are quoted for this one transcript.

gpuccio
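For anyone who wants to repeat this kind of lookup offline, here is a minimal sketch, assuming a transcript FASTA export from lncipedia.org has been downloaded to a local file. The filename below is a placeholder, and the assumption that the gene name appears in each FASTA header may not match the actual export format.

```python
# Minimal sketch: count MALAT1 transcripts in a locally downloaded
# LNCipedia FASTA export and report their lengths.
# "lncipedia_transcripts.fasta" is a placeholder filename; download the
# transcript FASTA from https://lncipedia.org/ yourself, and note that the
# header format assumed here (gene name inside the header) may differ.

def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line[1:], []
            elif line:
                seq.append(line)
    if header is not None:
        yield header, "".join(seq)

malat1 = [(h, len(s)) for h, s in read_fasta("lncipedia_transcripts.fasta")
          if "MALAT1" in h.upper()]

print(f"MALAT1 transcripts found: {len(malat1)}")
for header, length in sorted(malat1, key=lambda x: -x[1])[:5]:
    print(f"{header}: {length} nt")
```

Run against the real export, this should reproduce the transcript count and lengths quoted above, provided the headers really do carry the gene name.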
July 14, 2019 at 03:51 PM PDT
OLV and all: This is another paper about lncRNAs and NF-kB: Long non-coding RNA: a versatile regulator of the nuclear factor-kB signalling circuit https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5343356/ This is open access.
SUMMARY
The nuclear factor-kB (NF-kB) family of transcription factors play an essential role for the regulation of inflammatory responses, immune function and malignant transformation. Aberrant activity of this signalling pathway may lead to inflammation, autoimmune diseases and oncogenesis. Over the last two decades great progress has been made in the understanding of NF-kB activation and how the response is counteracted for maintaining tissue homeostasis. Therapeutic targeting of this pathway has largely remained ineffective due to the widespread role of this vital pathway and the lack of specificity of the therapies currently available. Besides regulatory proteins and microRNAs, long non-coding RNA (lncRNA) is emerging as another critical layer of the intricate modulatory architecture for the control of the NF-kB signalling circuit. In this paper we focus on recent progress concerning lncRNA-mediated modulation of the NF-kB pathway, and evaluate the potential therapeutic uses and challenges of using lncRNAs that regulate NF-kB activity.
gpuccio
July 14, 2019 at 03:35 PM PDT
OLV: "Are you surprised?" No. :) But, of course, self-organization can easily explain all that! :)gpuccio
July 14, 2019 at 03:30 PM PDT
GP @52: "the levels of regulation and crosstalk of this NF-kB system grow each time I make a Pubmed search" Are you surprised? :) This crosstalk concept is very interesting indeed.

OLV
July 14, 2019 at 03:13 PM PDT
ET: "I doubt it. I would say most are directed and only some are happenstance occurrences". I beg to differ. Most mutations that we observe, maybe all, are random. Of course, if the functional information we observe in organisms was generated by mutations, those mutations were probably guided. But we cannot observe that process directly, or at least I am not aware that it has been observed. Instead, we observe a lot of more or less spontaneous mutations that are really random. Many of them generate diseases, often in real time. Radiation and toxic substances dramatically increase the rate of random mutations, and the frequency of certain diseases or malformations. We know that very well. And yet, no law can anticipate when and how those mutations will happen. We just know that they are more common. The system is still probabilistic, even if we can detect the effect of specific causes. I don't know Spetner in detail, but it seems that he believes that most functional information derives from some intelligent adaptation of existing organisms. Again, I beg to differ. It is certainly true that "all the evolution that has been actually observed and which is not accounted for by modern evolutionary theory" needs some explanation, but the explanation is active design, not adaptation. I am not saying that adaptation does not exist, or does not have some important role. We can see good examples, for example in bacteria (the plasmid system, just to mention one instance). Of course a complex algorithm can generate some new information by computing new data that come from the environment. but the ability to adapt depends on the specific functional information that is already in the system, and has therefore very strict limitations. Adaptation can never generate a lot of new original functional information. Let's make a simple example. ATP synthase, again. There is no adaptation system in bacteria that could have found the specific sequences of tha many complex components of the system. It is completely out of discussion. And yet, ATP synthase exists in bacteria from billion of years, and is still by far similar in humans. This is of course the result of design, not adaptation. The same can be said for body plans, all complex protein networks, and I agree with Behe that families of organisms are already levels of complexity that scream design. Adaptation, even for an already complex organism, cannot in any way explain those things. It is true that the mutations we observe are practically always random. It is true that they are often deleterious, or neutral. More often neutral or quasi neutral. We know that. We see those mutations happen all the time. Achondroplasia, for example, which is the most common cause of dwarfism, is a genetic disease that (I quote from Wikipedia for simplicity): "is due to a mutation in the fibroblast growth factor receptor 3 (FGFR3) gene.[3] In about 80% of cases this occurs as a new mutation during early development.[3] In the other cases it is inherited from one's parents in an autosomal dominant manner." IOWs, in 80% of cases the disease is due to a new mutation, one that was not present in the parents. If you look at the Exac site: http://exac.broadinstitute.org/ you will find the biggest database of variations in the human genome. Random mutations that generate neutral variation are facts. They can be observed, their rate can be measured with some precision. There is absolutely no scientific reason to deny that. 
So, to sum up: a) The mutations we observe every day are random, often neutral, sometimes deleterious. b) The few cases where those mutations generate some advantage, as well argued by Behe, are cases of loss of information in complex structures that, by chance, confers some advantage in specific environments. see antibiotic resistance. All those variations are simple. None of them generates any complex functional information. c) The few cases of adaptation by some active mechanism that are in some way documented are very simple too. Nylonase, for example, could be one of them. The ability of viruses to change at very high rates could be another one. d) None of those reasonings can help explain the appearance, throughout natural history, of new complex functional information, in the form of new functional proteins and protein networks, new body plans, new functions, new regulations. None of those reasonings can explain OOL, or eukaryogenesis, or the transition to vertebrates. None of them can even start to explain ATP synthase, ot the immune system, or the nervous system in mammals. And so on, and so on. e) All these things can only be explained by active design. This is my position. This is what I firmly believe. That said, if you want, we can leave it to that.gpuccio
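The point above, that mutation rates can be measured with good precision while no law anticipates any individual mutation, can be illustrated with a small simulation. This is only a sketch under assumed numbers: a diploid genome of roughly 6e9 sites, a baseline rate of about 1.2e-8 new mutations per site per generation (which gives the familiar figure of ~70 de novo mutations per person), and a purely hypothetical mutagen exposure that triples the rate.

```python
import numpy as np

rng = np.random.default_rng(0)

GENOME_SITES = 6.0e9      # diploid human genome, approximate
BASELINE_RATE = 1.2e-8    # assumed mutations per site per generation
MUTAGEN_FACTOR = 3.0      # hypothetical exposure tripling the rate

def simulate_counts(rate, n_individuals=10_000):
    """Draw per-individual new-mutation counts from a Poisson distribution."""
    return rng.poisson(rate * GENOME_SITES, size=n_individuals)

baseline = simulate_counts(BASELINE_RATE)
exposed = simulate_counts(BASELINE_RATE * MUTAGEN_FACTOR)

# The *rate* is easy to estimate from the population...
print(f"baseline: mean {baseline.mean():.1f} new mutations, sd {baseline.std():.1f}")
print(f"exposed:  mean {exposed.mean():.1f} new mutations, sd {exposed.std():.1f}")

# ...but no law predicts any single individual's count or where the hits land.
print("five individual baseline counts:", baseline[:5])
```

The population mean shifts in a measurable way when the cause (here, the hypothetical mutagen) is present, yet each individual count remains a probabilistic draw, which is the sense of "random" used in the comment above.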
July 14, 2019 at 02:56 PM PDT
Hazel: In a strict sense, a random system is one where the events cannot be anticipated by a definite law, but can be reasonably described by a probability distribution. Of course, it is absolutely true that in that case "there is no causal connection between the mutation and whatever eventual effects and possible benefits it might have for the organism". I would describe that aspect by saying that the system, as a whole, is blind to those results. Randomness is a concept linked to our way of describing the system. Random systems, like the tossing of a coin, are in essence deterministic, but we have no way to describe them deterministically. The only exception could be the intrinsic randomness of the wave function collapse in quantum mechanics, in the interpretations where it is really considered intrinsic.

gpuccio
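As a concrete toy version of the coin-toss point above (an illustration added here, not gpuccio's), here is a sketch in which a seeded pseudo-random generator plays the role of a deterministic "coin": rerunning it reproduces every toss exactly, yet the only useful description of its output is a probability distribution.

```python
import random

# A seeded PRNG is fully deterministic: rerun it and you get the same tosses.
rng = random.Random(42)
tosses = [rng.random() < 0.5 for _ in range(100_000)]

# Yet the only practical description of the output is statistical:
heads = sum(tosses)
print(f"heads frequency: {heads / len(tosses):.4f}  (expected ~0.5)")

# Deterministic check: an identically seeded generator reproduces every toss.
rng2 = random.Random(42)
assert tosses == [rng2.random() < 0.5 for _ in range(100_000)]
print("same seed, same sequence: deterministic, but described probabilistically")
```

The point of the sketch is only that "random" here describes our mode of description, not a lack of underlying causes.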
July 14, 2019 at 02:29 PM PDT
Thank you, bornagain77. And yes, the non-random evolutionary hypothesis featuring built-in responses to environmental cues.

ET
July 14, 2019 at 01:23 PM PDT
Excellent point at 59, ET. Isn't Spetner's model called the "Non-Random" Evolutionary Hypothesis?
Spetner goes through many examples of non-random evolutionary changes that cannot be explained in a Darwinian framework.
https://evolutionnews.org/2014/10/the_evolution_r/

Gloves Off — Responding to David Levin on the Nonrandom Evolutionary Hypothesis - Lee M. Spetner - September 26, 2016 Excerpt: In the book, I present my nonrandom evolutionary hypothesis (NREH) that accounts for all the evolution that has been actually observed and which is not accounted for by modern evolutionary theory (the Modern Synthesis, or MS). Levin ridicules the NREH but does not refute it. There is too much evidence for it. A lot of evidence is cited in the book, and there is considerably more that I could add. He ridicules what he cannot refute. Levin calls the NREH Lamarckian. But it differs significantly from Lamarkism. Lamarck taught that an animal acquired a new capability — either an organ or a modification thereof — if it had a need for it. He offered, however, no mechanism for that capability. Because Lamarck’s theory lacked a mechanism, the scientific community did not accept it. The NREH, on the other hand, teaches that the organism has an endogenous mechanism that responds to environmental stress with the activation of a transposable genetic element and often leads to an adaptive response. How this mechanism arose is obscure at present, but its operation has been verified in many species.,,, https://evolutionnews.org/2016/09/gloves_off_-_r/
bornagain77
July 14, 2019 at 01:15 PM PDT