I’m not aware of any policy restriction on providing links to TalkOrigins at UD. I’ve linked to TalkOrigins a lot since I’ve criticized it so much in the past.

In this case, I actually concur with one of Gordon Davisson’s essays there, and I think it rightly earned Post of the Month 14 years ago at TalkOrigins.

When I teach ID, I feel the ideas must be defensible to science undergraduate and graduate students and to science faculty (physics, chemistry, mathematics, engineering). The internet has been an opportunity to vet some of my ideas. Gordon has been of great assistance with his balanced and insightful critiques of my essays at UD.

I link to Gordon’s essay because we independently arrived at the same conclusion regarding the relationship of thermodynamic entropy and Shannon entropy, namely:

1 Joule/Kelvin = 1 / (1.381 x 10^{-23} x ln 2) Shannon bits

≈ 1.045 x 10^{23} Shannon bits

provided the microstates under consideration for Shannon entropy are the energy microstates.

In the 500 fair coins example, I typically say the Shannon entropy is 500 bits because I consider the heads/tails configurations to be the symbols generally under consideration, but if we consider instead the energy microstates at the molecular level of 500 coins, one will get on the order of 8.636 x 10^{25} bits, a number which dwarfs even the supposed Universal Probability Bound of 500 bits! 😯
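For readers who want to check the arithmetic, here is a short Python sketch of the conversion (my own illustration; the Boltzmann constant and the 500-coin figure come from the discussion above):

```python
import math

# Boltzmann constant (CODATA value; the post rounds it to 1.381e-23 J/K)
k_B = 1.380649e-23  # J/K

# Conversion factor: 1 J/K expressed in Shannon bits, assuming the
# microstates counted are the energy microstates.
bits_per_joule_per_kelvin = 1 / (k_B * math.log(2))
print(f"1 J/K = {bits_per_joule_per_kelvin:.4g} Shannon bits")  # ~1.045e23

# Shannon entropy of 500 fair coins at the heads/tails level:
# 500 independent binary symbols, 1 bit each.
coin_config_bits = 500 * math.log2(2)
print(f"500-coin configuration entropy: {coin_config_bits:.0f} bits")
```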

I made the derivation in two places. Mainly at:

Shannon Information, Entropy, Uncertainty in Thermodynamics and ID

and indirectly in:

Creation Evolution University: Clausius, Boltzmann, Dembski

Here is Gordon’s essay:

Information and Thermo-Entropy. I present his essay in gratitude for his many constructive criticisms of my work at UD.

Technical comments and corrections are welcome.

Sal, we have no internal policy restrictions on providing links to anything. We avoid sites that are *apparently* in violation of copyright (but are not Internet experts on that). Crude language/other unsuitability warnings are highly advisable as needed.

One problem is that, as the Internet is new to the judiciary, some court judgments have upheld the idea that linking to something illegal or objectionable constitutes an offense. That is hardly likely to apply to anything at the site you mention, or others similar.

Thanks for the link; I’m remembering a policy from maybe a decade ago, and might well be thinking of something from a completely different site. Since it’s no longer in force (if it ever was), I won’t worry about it. BTW, I’d also like to thank you for contributing some clarity to the debate about thermodynamics — my experience has been that most people (on both sides of the pro/anti-evolution debate) don’t really understand thermodynamics very well, but a disturbing number (again on both sides) nonetheless feel the need to argue about it. The result is lots of misunderstandings being hurled back and forth… So it’s distinctly gratifying when people on opposite sides can actually agree about something like this (and the agreement seems to stem from understanding, not just exhaustion).

There’s also no policy restriction against letting people discuss Gordon’s paper here.

If you don’t wish to allow comments in a thread you can turn them off.

Gordon Davisson:

But that’s a pretty important question to answer, isn’t it?

Arieh Ben-Naim writes:

Yep; entropy’s status as a state function is pretty central to how it’s used in thermodynamics, so the questions the information connection raises are … interesting.

To illustrate the problem, suppose we constructed an RNA strand with a random sequence of bases. From what I said in the article, the randomness of the sequence will add to its entropy. Now, suppose we replicated the strand. The second strand is also random, but the process that made it could only produce one sequence. So they’re identical, but the second has less entropy? And it has less entropy because of how it was made? That’s a clear violation of the principle that entropy only depends on a system’s current state, not how it got into that state.

So what sense can we make of this? At least in this case, I think it’s clear that they both have the same (higher) entropy, but that the entropy of the two together is less than the sum of their individual entropies. In information theory, we’d say their joint entropy is less than the sum of their separate entropies, and the difference, called their mutual information, is a measure of how much their messages (microstates) are correlated.
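Gordon’s point about joint versus separate entropies can be sketched numerically. This toy Python example (my own illustration, not from his essay) models a single random base copied exactly, so the two copies are perfectly correlated:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One base drawn uniformly from {A, C, G, U}: 2 bits of entropy.
marginal = [1 / 4] * 4

# The strand is then copied exactly, so the copy's base always equals
# the original's: only the 4 matching pairs (A,A) ... (U,U) can occur.
joint = [1 / 4] * 4  # probabilities of the 4 possible (base, copy) pairs

H_single = shannon_entropy(marginal)   # 2 bits for either strand alone
H_joint = shannon_entropy(joint)       # 2 bits for the pair, not 4
mutual_info = 2 * H_single - H_joint   # 2 bits: fully correlated
```

The joint entropy (2 bits) is less than the sum of the separate entropies (4 bits), and the difference is exactly the mutual information, matching the resolution described above.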

This solution works neatly in this case, but what about an RNA strand constructed by a process that produces a deterministic sequence? In that case, there might not be a clear other thing that you can say it’s correlated to. I think the approach still works; it’s just messy in practice. But I wouldn’t take that opinion too seriously. I haven’t thought this through all that thoroughly, and I’m far out of touch with the current research and literature on the subject. I also wouldn’t worry about it too much, since the “problem” is too small to matter in practice. In fact, I wouldn’t be too unhappy saying that entropy is only approximately a state function.

GD: A replication process for a DNA of relevant length is a manufacturing process of enormous implied functionally specific complex organisation and associated information. That is why it is wise to start considerations in a Darwin’s warm pond or equivalent with “plausible” initial molecules, then move up from there. I am sure you will agree that the config space for such a pond or ensemble will be huge, and there is a serious search space challenge to blindly — per chance and mechanical necessity — get to a gated, encapsulated, metabolising automaton with coded von Neumann self-replicating facility. Thereafter, to get to novel body plans poses an even bigger FSCO/I challenge, given that functional specificity and complexity imply isolated clusters of configs in the space. The result is, so far we have a theory of micro-evo within islands of smoothly varying function [well behaved fitness], rather than what is advertised, an empirically credible and analytically plausible scientific account of origin of main body plans [macro-evo], starting with the origin of the first. And, I include the first, as it is normal to include such in presentations to secondary or tertiary students and the general public, with an air of supreme, lab coat-clad confidence. KF

Gordon Davisson:

I don’t have a clear idea yet to what extent thermodynamics is connected to information theory, but certainly the entropy concept in thermodynamics is connected to the entropy concept in information theory.

Shannon’s Measure of Information (SMI) is defined for any probability distribution. The entropy concept in thermodynamics is defined only for some probability distributions. As such, entropy in thermodynamics is a special case of SMI (Ben-Naim 2013). Manfred Eigen (2013) takes it a step further:
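Ben-Naim’s distinction can be sketched in a few lines of Python (my own illustration; the temperature and energy levels are made-up toy values):

```python
import math

def smi(probs):
    """Shannon's Measure of Information (bits), defined for ANY distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# SMI applies to an arbitrary distribution:
arbitrary = [0.5, 0.25, 0.25]
print(smi(arbitrary))  # 1.5 bits

# Thermodynamic entropy corresponds to SMI only for particular
# distributions, e.g. a Boltzmann equilibrium distribution over
# energy levels (hypothetical toy values below):
k_B = 1.380649e-23              # J/K
T = 300.0                       # K, assumed temperature
energies = [0.0, 1e-21, 2e-21]  # J, hypothetical energy levels
weights = [math.exp(-E / (k_B * T)) for E in energies]
Z = sum(weights)
boltzmann = [w / Z for w in weights]

# S = k_B * ln(2) * SMI converts the bit count back to J/K.
S = k_B * math.log(2) * smi(boltzmann)
```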

Salvador:

What does it even mean to say that this burst of energy contributes entropy?