Here’s the abstract of a new PNAS article:

There’s a new article in PNAS illustrating that thermodynamics determines the effects of future changes made to a protein molecule. Any one mutation alters the thermodynamic and statistical-mechanical properties of the protein, and these changes swim in a vast ocean of statistical possibilities, and consequent improbabilities, so much so that the effects of future mutations following upon any given mutation cannot be ascertained.

So, the next time you hear about how they were able to “reconstruct” an ancient protein, allowing the determination of possible pathways, don’t pay any attention to it.

Evolutionary prediction is of deep practical and philosophical importance.

Here we show, using a simple computational protein model, that protein evolution remains unpredictable, even if one knows the effects of all mutations in an ancestral protein background. We performed a virtual deep mutational scan—revealing the individual and pairwise epistatic effects of every mutation to our model protein—and then used this information to predict evolutionary trajectories. Our predictions were poor. This is a consequence of statistical thermodynamics. Proteins exist as ensembles of similar conformations. The effect of a mutation depends on the relative probabilities of conformations in the ensemble, which, in turn, depend on the exact amino acid sequence of the protein. Accumulating substitutions alter the relative probabilities of conformations, thereby changing the effects of future mutations. This manifests itself as subtle but pervasive high-order epistasis. Uncertainty in the effect of each mutation accumulates and undermines prediction. Because conformational ensembles are an inevitable feature of proteins, this is likely universal.
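The ensemble picture in the abstract can be illustrated with a toy two-conformation model. This is my own sketch, not the paper’s model, and all of the energies below are invented for illustration: the point is only that the same mutational penalty shifts the active-conformation population by very different amounts depending on the energetics of the sequence background it lands in.

```python
import math

# Toy two-conformation ensemble (illustrative only, not the paper's model).
# A mutation's effect depends on the Boltzmann weights of the conformations,
# which themselves depend on the sequence background already accumulated.
kT = 0.6  # kcal/mol at roughly 300 K (approximate)

def populations(e_active, e_inactive):
    """Boltzmann probabilities of two conformations with the given energies."""
    w = [math.exp(-e_active / kT), math.exp(-e_inactive / kT)]
    z = sum(w)
    return [x / z for x in w]

# The same mutation (+1.0 kcal/mol penalty to the active conformation)
# applied in two different backgrounds (energy values assumed):
for e_active in (-2.0, 0.0):  # stable vs. marginal background
    before = populations(e_active, 0.0)[0]
    after = populations(e_active + 1.0, 0.0)[0]
    print(f"background {e_active:+.1f}: active {before:.2f} -> {after:.2f}")
```

In the stable background the active population barely moves; in the marginal background the identical mutation collapses it. That background-dependence is the epistasis the authors describe.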

From the article itself:

A key point from our work is that unpredictability can arise even in this extraordinarily simple system. The problem of predicting evolution will only become harder as the complexity and realism of the models increase. Using a larger protein, for example, would increase the number of possible options and degeneracy of trajectories, making predictions more challenging. Likewise, constructing a more realistic evolutionary model—incorporating drift, for example—increases the number of available trajectories and makes evolutionary prediction more challenging than the strong selection case (SI Appendix).

So, NS is passé according to “evolutionary biologists”. Everything nowadays principally involves ‘genetic drift’; but here they show that drift only makes evolution “more” unpredictable. So how, exactly, do we climb Mt. Improbable?

And, then, this:

Surprisingly, we found that our predictions were quite poor.

We even added all pairwise epistatic effects of mutations to our predictive model, requiring a massive virtual deep mutational scan of all possible pairs of mutations. Even this did not allow robust predictions of evolutionary trajectories. We find that the unpredictability arises directly from the thermodynamic ensemble of conformations populated by macromolecules, revealing a profound link between protein physics and the evolutionary process.
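Why even a complete pairwise scan can fail is easy to see in a toy example (mine, not the paper’s; all effect sizes are invented): a model fitted to every single and double mutant reproduces those measurements exactly, yet still mispredicts a triple mutant whenever a hidden three-way (high-order) epistatic term is present.

```python
# Toy illustration (not the paper's model): an additive + pairwise model
# matches all single and double mutants exactly, but mispredicts a triple
# mutant carrying a third-order epistatic interaction.
a = {1: 0.5, 2: -0.3, 3: 0.2}                   # single-mutation effects (assumed)
e = {(1, 2): 0.1, (1, 3): -0.2, (2, 3): 0.05}   # pairwise epistasis (assumed)
e123 = 0.4                                      # hidden three-way epistasis

def true_fitness(muts):
    """Fitness including the third-order term."""
    f = sum(a[i] for i in muts)
    f += sum(v for pair, v in e.items() if set(pair) <= set(muts))
    if set(muts) == {1, 2, 3}:
        f += e123
    return f

def pairwise_prediction(muts):
    """What a model fit to all singles and pairs would predict."""
    f = sum(a[i] for i in muts)
    f += sum(v for pair, v in e.items() if set(pair) <= set(muts))
    return f

print(true_fitness({1, 2, 3}))         # 0.75
print(pairwise_prediction({1, 2, 3}))  # 0.35
```

The pairwise model is perfect on every mutant it was trained on and still misses the triple mutant by the full third-order term; no amount of additional single- or double-mutant data can repair that.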

Let me paraphrase this for you:

To evaluate protein evolution, statistical mechanics must be employed. What does this look like? S = k_B ln W, where W is the number of microstates. And how do we calculate that? By using the same formula that Wm. Dembski used to define information.

From Wikipedia:

In statistical mechanics, Boltzmann’s equation is a probability equation relating the entropy S of an ideal gas to the quantity W, the number of real microstates corresponding to the gas’s macrostate [N.B.: read “nucleotide sequence configuration” for “gas microstate”]:

S = k_B ln W    (1)

where k_B is the Boltzmann constant (also written simply as k), which is equal to 1.38065 × 10^−23 J/K.
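Equation (1) is a one-liner to evaluate. As a concrete instance, following the comment’s suggestion of counting nucleotide-sequence configurations, here W is taken as the number of configurations of a 10-nucleotide stretch (my example, chosen only to give a definite number):

```python
import math

# Boltzmann's formula S = k_B * ln(W), reading W as the number of
# nucleotide-sequence configurations, per the comment above.
k_B = 1.380649e-23        # J/K (exact SI value since 2019)
W = 4 ** 10               # configurations of a 10-nucleotide stretch (assumed example)
S = k_B * math.log(W)
print(f"W = {W}, S = {S:.3e} J/K")   # S ≈ 1.914e-22 J/K
```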

That is, if we want to know the **probability of any particular protein coming about** from some other protein, then **simply use a probability calculation.** So, if two mutations are needed, then, using an average length of 3 × 10^9 nucleotides/eukaryotic genome, **the probability of getting the right mutation at two specified locations is (1/4)(1/(3 × 10^9)) × (1/4)(1/(3 × 10^9)) ≈ 7 × 10^−21.** Assume a population size of 10^4 and a mutation rate of 180 mutations/animal/generation, and each generation can produce 1.8 × 10^6 possibilities. **Overcoming the calculated improbability would then require on the order of 1/(7 × 10^−21 × 1.8 × 10^6) ≈ 8 × 10^13 generations. That is, tens of trillions of generations. Impossible.**
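The arithmetic can be checked with a short script. All figures (genome size, population size, mutation rate, the 1/4 per-base factor) are the commenter’s assumptions, not mine; rerunning them gives a waiting time of about 8 × 10^13 generations, somewhat different from the figure quoted in some versions of this argument, though the order-of-magnitude conclusion is the same.

```python
# Probability model from the comment above: two specific nucleotide
# substitutions, each assumed to require hitting the right site in a
# 3e9-nucleotide genome (1/3e9) and the right base (1/4).
genome_size = 3e9
p_one = (1 / 4) * (1 / genome_size)      # one specific substitution
p_both = p_one ** 2                      # two independent substitutions

pop_size = 1e4                           # assumed population size
mut_rate = 180                           # assumed mutations/animal/generation
trials_per_gen = pop_size * mut_rate     # 1.8e6 new mutations per generation

expected_generations = 1 / (p_both * trials_per_gen)
print(f"P(both) ≈ {p_both:.1e}")                     # ≈ 6.9e-21
print(f"generations ≈ {expected_generations:.1e}")   # ≈ 8.0e13
```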

This is what ID has been saying for twenty years.

Interesting article. Thanks.

Can this OP be somehow associated with or related to the following link?

https://uncommondescent.com/intelligent-design/what-are-the-limits-of-random-variation-a-simple-evaluation-of-the-probabilistic-resources-of-our-biological-world/

This article does touch upon the two questions gpuccio asks at the bottom of his opening post.

There’s another article recently published along these lines which suggests that the entire approach to modern evolutionary theory needs to be rethought in light of its results. It seems that the entire stretch of nucleotides “reacts” to ‘any’ mutation occurring along its length, and this ‘reaction’ is based on thermodynamic properties: i.e., the ‘gene’ ‘adapts itself’ to any new mutation.

Fundamental thermodynamic properties of nucleotide sequences have a complex life of their own, one the simple formulas used by evolutionary biologists cannot capture. The more we learn, the more purely materialist answers fall apart.