I’m not aware of any policy restriction on providing links to TalkOrigins at UD. In fact, I’ve linked to TalkOrigins a lot, since I’ve criticized it so much in the past.
In this case, I actually concur with one of Gordon Davisson’s essays there, and I think it rightly earned Post of the Month 14 years ago at TalkOrigins.
When I teach ID, I feel the ideas must be defensible to science undergraduate and graduate students and science faculty (physics, chemistry, mathematics, engineering). The internet has been an opportunity to vet some of my ideas, and Gordon has been of great assistance with his balanced and insightful critiques of my essays at UD.
I link to Gordon’s essay because we independently arrived at the same conclusion regarding the relationship of thermodynamic entropy and Shannon entropy, namely:
1 Joule/Kelvin = 1 / (1.381 x 10^-23) / ln(2) Shannon bits
= 1.045 x 10^23 Shannon bits
provided the microstates under consideration for Shannon entropy are the energy microstates.
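As a quick sanity check of the conversion above, here is a minimal sketch that computes the number of Shannon bits per Joule/Kelvin directly from Boltzmann's constant (I use the CODATA value 1.380649 x 10^-23 J/K rather than the rounded 1.381 x 10^-23 in the formula):

```python
import math

# Boltzmann's constant in J/K (CODATA value)
k_B = 1.380649e-23

# One Shannon bit corresponds to k_B * ln(2) J/K of thermodynamic entropy,
# so 1 J/K = 1 / (k_B * ln 2) Shannon bits.
bits_per_joule_per_kelvin = 1.0 / (k_B * math.log(2))

print(f"{bits_per_joule_per_kelvin:.4g} Shannon bits per J/K")  # ~1.045e+23
```

This reproduces the 1.045 x 10^23 figure to four significant digits.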
In the 500 fair coins example, I typically say the Shannon entropy is 500 bits because I consider the heads/tails configurations to be the symbols generally under consideration, but if we consider instead the energy microstates at the molecular level of 500 coins, one will get on the order of 8.636 x 10^25 bits — a number which dwarfs even the supposed Universal Probability Bound of 500 bits! 😯
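To illustrate the gap between the two entropies, here is a sketch contrasting the 500-bit configurational Shannon entropy with the molecular-level thermodynamic entropy of the same coins. The coin mass (2.5 g) and material (copper, standard molar entropy ~33.2 J/(mol·K)) are illustrative values I'm supplying, not figures from the original derivation, so the result only shows the order of magnitude:

```python
import math

k_B = 1.380649e-23                       # Boltzmann constant, J/K
bits_per_JK = 1.0 / (k_B * math.log(2))  # ~1.045e23 Shannon bits per J/K

# Shannon entropy of the heads/tails configurations of 500 fair coins:
config_bits = 500 * math.log2(2)         # 500 bits

# Thermodynamic entropy of 500 coins, with assumed illustrative values:
molar_entropy_Cu = 33.2                  # J/(mol*K), standard entropy of Cu (assumed)
mass_g, molar_mass = 2.5, 63.55          # grams per coin, g/mol for Cu (assumed)
S_thermo = 500 * (mass_g / molar_mass) * molar_entropy_Cu   # total, J/K
thermo_bits = S_thermo * bits_per_JK

print(f"configurational: {config_bits:.0f} bits")
print(f"molecular-level: {thermo_bits:.3g} bits")  # on the order of 10^25-10^26
```

With these assumed values the molecular-level figure lands in the same 10^25–10^26 range as the 8.636 x 10^25 quoted above, some 23 orders of magnitude beyond the 500 configurational bits.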
I made the derivation in two places. Mainly at:
Shannon Information, Entropy, Uncertainty in Thermodynamics and ID
and indirectly in:
Here is Gordon’s essay:
Information and Thermo-Entropy. I present his essay in gratitude for his many constructive criticisms of my work at UD.
Technical comments and corrections are welcome.