This essay is intended to give a short overview of the textbook understanding of Shannon information, entropy, and uncertainty in thermodynamics, and their relevance to Intelligent Design. Technical corrections are welcome.

The phrases “Shannon Information”, “Shannon Uncertainty”, and “Shannon Entropy” all refer to the same quantity. The most familiar everyday usage of the notion of Shannon information is in the world of computing and digital communication, where the fundamental unit is the bit.

When someone transmits 1 megabit of information, the sender is sending 1 megabit of Shannon information, the receiver is getting a reduction of 1 megabit of Shannon uncertainty, and the ensemble of all possible 1-megabit configurations has 1 million binary degrees of freedom, as described by 1 megabit of Shannon entropy.

The word “Information” is usually preferred over “Uncertainty” and “Entropy”, even though, as can be seen, they all yield essentially the same number.

The probability of any given configuration (microstate) can be converted to an information measure by the following formula:

2^{-I} = P

conversely

I = -log_{2}P

Where I is the information measure in bits and P is the probability of finding that particular configuration (microstate).
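As a sanity check, the two conversions above can be sketched in a few lines of Python (the function names are mine, chosen just for illustration):

```python
import math

def bits_from_probability(p):
    """Shannon information I = -log2(P) of a microstate with probability P."""
    return -math.log2(p)

def probability_from_bits(i):
    """Probability P = 2^(-I) of a microstate carrying I bits of information."""
    return 2.0 ** (-i)

# A fair coin flip: P = 1/2, so I = 1 bit.
print(bits_from_probability(0.5))  # 1.0
# One specific outcome of 500 fair coin flips: I = 500 bits.
print(bits_from_probability(2.0 ** -500))
```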

**Examples**:

The Shannon information of a specific configuration of 500 fair coins (say, all heads) is 500 bits, and the probability of that particular configuration is

P = 2^{-I} = 2^{-500}, i.e., 1 out of 2^{500}

But there are subtleties in the 500-coin example. We usually compute the Shannon entropy from the heads/tails configuration, but there are other observables to which we could apply Shannon metrics.

For example, we could look at the 4-digit year on a coin and consider the Shannon entropy associated with the year. If we use an *a priori* assumption that all possible years are equally probable, I calculate the Shannon entropy as:

log_{2} 10^{4}= 13.28771 bits

But this assumption is clearly too crude since we know not all years are equally probable!
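A quick sketch of that year-on-the-coin calculation, together with the general entropy formula that handles the non-uniform case (the 50-year distribution below is hypothetical, purely to illustrate the formula):

```python
import math

# Uniform a priori assumption: all 10^4 four-digit years equally probable.
uniform_bits = math.log2(10 ** 4)
print(round(uniform_bits, 5))  # 13.28771

# General Shannon entropy H = -sum(p * log2(p)) for any distribution.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: a coin known to be minted in one of 50 recent years,
# all equally likely -- far fewer bits than the crude 10^4-year assumption.
print(round(shannon_entropy([1 / 50] * 50), 5))  # 5.64386
```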

Further, we could consider the orientation angle of a coin on a table, based on the rounded whole number of degrees relative to some reference point. Each coin then has 360 possible orientation microstates corresponding to 360 degrees. The number of bits is thus:

log_{2}360 = 8.49185 bits

Notice the problem here: we could choose an even finer resolution of the orientation angle, say 0.1 degree, and thus get 3600 possible microstates! In that case, the Shannon information of any given orientation is:

log_{2}3600 = 11.81378 bits

If for example I focused on all the possible configurations of the orientations of each coin (rounded to a whole number of degrees) of a system of 500 coins, I’d get:

log_{2}360^{500} = 500 log_{2}360 ≈ 4245.9 bits
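The orientation numbers above are easy to verify with a short sketch (the resolution choices are the ones from the text):

```python
import math

per_coin_1deg = math.log2(360)    # whole-degree resolution, per coin
per_coin_01deg = math.log2(3600)  # 0.1-degree resolution, per coin
system_500 = 500 * per_coin_1deg  # 500 coins at whole-degree resolution

print(round(per_coin_1deg, 5))   # 8.49185
print(round(per_coin_01deg, 5))  # 11.81378
print(round(system_500, 1))      # 4245.9
```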

The point of these examples is to show there is no *one* Shannon entropy describing a system. It depends on the way we choose to recognize the possible microstates of the system! 😯

If we choose the energy microstates as the microstates for calculating Shannon entropy, the result is the thermodynamic entropy of the system expressed in Shannon bits, rather than in the traditional thermodynamic units of Joules/Kelvin.

Consider a system of 500 pure copper pennies. The standard molar thermodynamic entropy of copper is 33.2 Joules/Kelvin/mol. The number of mols of copper in a pure copper penny is 0.0498, thus the thermodynamic entropy of 500 coins is roughly:

S = 500 * 33.2 Joules/Kelvin/mol * 0.0498 mols = 826.68 J/K

where S is the thermodynamic entropy.

We can relate the thermodynamic entropy S_{Shannon}, expressed in Shannon bits, to the traditionally expressed thermodynamic entropy S_{Boltzmann} in Joules/Kelvin: simply divide S_{Boltzmann} by Boltzmann’s constant (k_{B}), and further divide by ln(2) (to convert from natural-log to log-base-2 information measures):

S_{Shannon} = S_{Boltzmann} / k_{B} / ln(2) =

(826.68 J/K) / (1.381 x 10^{-23} J/K) / 0.693147 ≈ 8.636 x 10^{25} Shannon bits
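The unit conversion above can be sketched as follows (constants are the ones given in the text; k_B is taken to four significant figures):

```python
import math

k_B = 1.381e-23  # Boltzmann's constant in J/K (value used in the text)

# Thermodynamic entropy of 500 pure copper pennies, from molar entropy.
S_boltzmann = 500 * 33.2 * 0.0498  # J/K; equals 826.68 J/K

# Convert J/K to Shannon bits: divide by k_B, then by ln(2).
S_shannon = S_boltzmann / k_B / math.log(2)

print(f"{S_shannon:.3e} Shannon bits")  # 8.636e+25 Shannon bits
```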

**Conclusion**:

Q. How much Shannon Entropy is in a system of 500 fair pure copper pennies?

A. Depends on the way you choose to recognize the microstates and associated probability of each microstate of the system! 😯

500 bits, 4245 bits, 8.636 x 10^{25} bits, etc. would be correct answers depending on how the observer chooses to recognize microstates and associated probability of each microstate!

How does this all relate to ID? I’ll let the readers decide….. 🙂

But my personal suggestion: treat information theory and thermodynamics with caution. If you can’t comfortably do calculations like those shown above, you might want to reconsider using information-theoretic and thermodynamic arguments. Expect your opponents in debate to demand that you provide similar calculations to defend your ID claims. That’s why I’ve stressed that ID proponents use basic probability arguments, not information theory and thermodynamic arguments. For that matter, I recommend Jonathan Wells’ Humpty Dumpty argument. Keep It Simple, Soldier (KISS).

**NOTES**

0. There has been debate about the validity of using Boltzmann’s constant, which is derived in the context of ideal monoatomic gases, to determine the degrees of freedom in solids or non-monoatomic gases. If temperature is the energy per degree of freedom, then it seems reasonable to use one constant for estimating the degrees of freedom based on temperature and energy alone. A diatomic gas, for example, will have more degrees of freedom, but as far as temperature and energy are concerned, you’ll get a given number of empirically measured microstates, so it seems one can generally use Boltzmann’s constant. At least that is my reading of the literature.

1. Standard Molar Entropies

2. Mole Calculations

3. Boltzmann’s Constant

4. Nat Measure of Information

5. Creation Evolution University: Clausius, Boltzmann, Dembski