In terms of textbook thermodynamics, a functioning Lamborghini has more thermal entropy than that same Lamborghini with its engine and other vital parts removed, simply because the intact car contains more matter and therefore admits more microstates. But intuitively we view such destruction of a functioning Lamborghini as an increase in entropy, not a decrease. Something about this example seems downright wrong…
To resolve this enigma, and to make the notion of entropy line up with our intuitions, I suggest the term “specified entropy” to describe the increase in disorganization. I derive the phrase from Bill Dembski’s notion of specified information. In the case of the Lamborghini having its vital parts removed, the specified entropy goes up by exactly the amount that the specified information goes down.
This isn’t a radical change to the ID literature, but it may help convey what ID proponents really mean when they say entropy is going up. What they mean is not thermal entropy, but specified entropy.
As I mentioned in the comments of another thread, I tutor college and high school students in math, physics, and chemistry. Many of them I met in church circles, and many are ID friendly. It would bother my conscience to say things that grate against what they are learning. Here is a simple example.
QUESTION: According to textbook chemistry, physics, and engineering, which system has more entropy, a simple virus or an adult human?
ANSWER: The adult human, if by entropy one means thermal entropy; and also the adult human, if by entropy one means Shannon entropy.
I invite any UD reader to agree or disagree with my answer, and to state from principles of physics why. Make your case just as you would when explaining it to a college chemistry, physics, or engineering student. Given all the discussion of entropy on the net, it is surely not unreasonable to pose a question that a college science student might consider. If the ID community wishes to help the next generation of ID-friendly science students, these are the sorts of basic science questions that are fair game.
So, paradoxically, if an intelligent designer evolved a human from a simple virus, the designer had to add MORE entropy to the human design than was in the virus. So if a science student asked me, “Which has more entropy, a virus or a human?” I’d say the human.
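The Shannon-entropy half of the answer can be sketched with back-of-the-envelope arithmetic. The genome lengths below are rough, round-number assumptions (a simple virus on the order of 10,000 bases, the human genome around 3.2 billion base pairs), and the figure computed is only an upper bound: at most 2 bits per base, which is reached when all four bases are equally likely.

```python
import math

def max_shannon_entropy_bits(genome_length_bp, alphabet_size=4):
    """Upper bound on the Shannon entropy of a genome, in bits:
    log2(alphabet_size) bits per base, attained only if every base
    is equiprobable and independent of its neighbors."""
    return genome_length_bp * math.log2(alphabet_size)

virus_bp = 10_000          # rough length of a simple viral genome (assumption)
human_bp = 3_200_000_000   # rough length of the human genome (assumption)

print(max_shannon_entropy_bits(virus_bp))  # 20000.0 bits
print(max_shannon_entropy_bits(human_bp))  # 6400000000.0 bits
```

Whatever the exact per-base statistics, the five-orders-of-magnitude gap in genome length is what drives the conclusion: the human carries vastly more Shannon entropy than the virus.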
Here is another question.
QUESTION: According to textbook college science, which state of a rat has more entropy: when it is warm and living, or when it is dead and frozen near absolute zero?
ANSWER: The warm, living rat has more entropy than the dead rat frozen near absolute zero. By the third law of thermodynamics, a system’s entropy approaches a minimum as its temperature approaches absolute zero.
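The thermal side of this answer can be illustrated with the textbook relation ΔS = C ln(T_final/T_initial) for cooling at constant heat capacity. This is a deliberately crude sketch: the mass and heat capacity of the “rat” are invented round numbers, and a real heat capacity falls toward zero near 0 K, so the true entropy removed is less extreme than the constant-C formula suggests. The sign of the result is the point.

```python
import math

def entropy_change(heat_capacity_J_per_K, t_initial_K, t_final_K):
    """Entropy change dS = C * ln(T_final / T_initial) for a body
    cooled (or warmed) at constant heat capacity C, in J/K.
    Real heat capacities drop toward zero near 0 K (third law),
    so this overstates the magnitude at very low temperatures."""
    return heat_capacity_J_per_K * math.log(t_final_K / t_initial_K)

# Toy model: ~0.3 kg of water-like material at c = 4184 J/(kg*K),
# so C ~ 1255 J/K (assumption, for illustration only).
C = 1255.0
dS = entropy_change(C, 310.0, 4.0)  # body temperature down to near 0 K
print(dS)  # negative: cooling removes thermal entropy from the rat
```

Freezing the rat drives its thermal entropy down, which is exactly why the warm living rat holds more thermal entropy than the frozen dead one.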
This seems downright wrong. What is the problem? The problem is that the word “entropy” is being equivocated. Are we talking about thermal entropy or about specified entropy (related to ID concepts)? Depending on which definition one works from, one will get a different answer.
Unless we adopt some convention for clarifying what type of entropy is being discussed, confusion will reign. One could mistakenly reason:
“A human has more entropy than a virus, and a living warm rat has more entropy than a frozen dead rat. Since the 2nd law says entropy is increasing, the 2nd law helps a low-entropy virus evolve into a high-entropy human, and helps low-entropy dead things evolve into higher-entropy living things.”
This conclusion would be wrong, and it is wrong because the notion of entropy is being equivocated, not to mention that system boundaries are being redrawn on the fly, which is not exactly a wholesome way of analyzing systems.
By contrasting the notion of “specified entropy” (measured in bits) with the notion of “thermal entropy” (measured in Joules/Kelvin), at least some of the equivocations and confusions might be counteracted. This surely isn’t the end of the matter, but we have to start somewhere. What do you want me to tell my students? Here is your chance to serve the ID community by providing science-based answers to science students; otherwise the villains might get hold of their minds.
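The two units are not unrelated: a standard bridge between them is S = N · k_B · ln 2, which assigns each bit of Shannon entropy a thermodynamic equivalent of about 10^-23 J/K (the Landauer/Boltzmann connection). The tiny conversion factor shows why the two notions of entropy live on such different scales. A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in the 2019 SI)

def bits_to_thermal_entropy(n_bits):
    """Thermodynamic entropy equivalent of n_bits of Shannon entropy,
    via S = N * k_B * ln(2). One bit corresponds to roughly
    9.57e-24 J/K, an almost negligible amount of thermal entropy."""
    return n_bits * K_B * math.log(2)

print(bits_to_thermal_entropy(1))              # one bit in J/K
print(bits_to_thermal_entropy(6_400_000_000))  # a genome's worth of bits, still tiny in J/K
```

Informationally enormous quantities are thermodynamically minuscule, which is one more reason the two kinds of “entropy” should not be conflated in an argument.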