# Confusion about 2LoT in regard to heat and information

A number of people conflate heat entropy with information entropy and then willy-nilly substitute one for the other. This is what the NeoDarwinists do when they point to the sun and say it makes the earth an open system to explain the evident way life violates information entropy. The sun is irrelevant in this situation as it is adding heat, not information. While heat and information entropy are closely related (they both behave according to 2LoT, which was originally formulated for heat alone) they are not the same thing and cannot be exchanged.

Heat diffuses in a closed system until maximum entropy is reached where the heat is uniform (everything is the same temperature). Likewise a dye will diffuse through a glass of water until its distribution is uniform (everything is the same color). Dye and heat are not the same thing and you can heat or chill that uniformly colored glass of water all you want and it won’t undistribute the dye. That’s because heat entropy and dye entropy are not the same thing.

The layman’s expression relating to this is you can’t unbake a cake. The reason why you can’t unbake it is it would violate 2LoT. However, that’s not quite right because a sufficiently advanced intelligence can unbake a cake. Intelligence can accomplish things that nature cannot and that includes violating 2LoT in relation to information entropy.

It’s intuitively obvious to me that 2LoT governed heat and information entropy aren’t the same thing. Sewell expresses this in a more rigorous manner in A Second Look at the Second Law.

DaveScot’s Evo-Creo 2LoT Corollary: As Intelligent Design arguments become more organized, NeoDarwinian arguments must become less organized. The end result is that ID becomes a completely coherent explanation of the facts while the NeoDarwinian narrative decays into a vast state of disarray.

It’s all a simple matter of physics you see. 😎

## 45 Replies to “Confusion about 2LoT in regard to heat and information”

1. 1
physicist says:

Dear Davescot, I’ll just post the same kind of comment as in the other 2nd law thread…

I’d be very interested to hear your precise statements of the second law, and how they distinguish ‘information entropy’ and ‘heat entropy’.
You might be using the words in a completely different way from me.

2. 2
MikeG says:

The sun does not add heat to the Earth, it adds energy (information) in the form of photons. Much of this energy is absorbed by matter and then immediately degrades into heat. However, photons that are absorbed by photosynthetic pigments are stored as excited electrons. This stored energy is then used in the construction of the molecules of life. Only when these molecules are broken and degraded is the photon energy released as heat.

I’m guilty of taking it for granted that people in a discussion such as this know that the energy in photons is measured by degrees Kelvin. And of course degrees Kelvin is a measure of temperature and temperature is synonymous with heat. Next time you decide to be argumentative I suggest you do a better job of it. -ds

3. 3
Gandalf says:

Information and energy are not synonyms.

It’s hard to believe some ostensibly well educated people honestly equate them. -ds

4. 4
physicist says:

Davescot

Let me just add that the motivation for statistical mechanics is to be able to understand macroscopic phenomena and quantities in terms of an underlying, microscopic theory.

The entropy in statistical mechanics can be defined as S = -k \sum_i \rho_i \log \rho_i (for states i and probabilities \rho_i) in, say, the canonical ensemble.

Is this the definition you’re calling information entropy? Because this really *is* the same quantity as the macroscopic entropy you’re calling heat (of course, really TdS is the heat transferred in some given process).

For a system that’s not closed, the entropy *can* decrease. I’m not sure what the controversy is. People in ID are aiming to argue that RM+NS couldn’t have produced the design we observe. Fair enough, but the second law of thermodynamics isn’t going to help the argument!

The classic definition of entropy is a heat gradient where a measurable amount of work can be performed across the gradient. 2LoT states that heat will diffuse throughout a closed system until there are no more gradients and no work can be performed – the definition of maximum entropy. It was later found that this applies to more than just heat. Matter can also be observed diffusing according to these principles – like solids sublimating into a vacuum. Even more recently we find information behaves according to these principles. However, you cannot exchange heat for information like they are the same thing. This appears to be what you are trying to do. Tell me how the information coded on a magnetic tape is in any way equivalent to or related to the heat gradients on the same tape – how making the tape a degree hotter or colder by exposing it to infrared radiation will change the information encoded on it. If you can do that then I’ll buy your story that photons from the sun changing heat gradients on the earth can increase information coded in the DNA molecule. Life doesn’t even require the sun. There are bacteria that get all their energy gradient from heat deep in the earth which is generated largely by radioactive decay and was never due to the sun. -ds
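As an aside for readers unfamiliar with the formula physicist gives in the comment above, here is a minimal sketch of the Gibbs/Shannon entropy -k Σ_i ρ_i log ρ_i in Python, with k set to 1 for illustration (the function and variable names are mine, not from the thread):

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum_i p_i * log(p_i); zero-probability states contribute nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 states maximizes the entropy, at k * log(4)
uniform = [0.25, 0.25, 0.25, 0.25]
# A sharply peaked distribution is more "ordered" and has lower entropy
peaked = [0.97, 0.01, 0.01, 0.01]

print(gibbs_entropy(uniform))   # 1.386... = log(4)
print(gibbs_entropy(peaked))    # about 0.168
```

With k equal to Boltzmann’s constant and the probabilities taken over microstates, this is the statistical-mechanics entropy; with k = 1 and base-2 logarithms it is Shannon’s, which is the formal overlap the thread keeps circling.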

5. 5
valerie says:

DaveScot wrote:
“The reason why you can’t unbake [a cake] is it would violate 2LoT. However, that’s not quite right because a sufficiently advanced intelligence can unbake a cake. Intelligence can accomplish things that nature cannot and that includes violating 2LoT in relation to information entropy.”

The 2nd law does not prohibit the unbaking of a cake. It says that if you unbake a cake, you have to pay the price in increased entropy in the surroundings. No intelligence, no matter how advanced, can unbake a cake without paying this price, unless it is a supernatural agent which is capable of violating natural law.

For more on this, see discussions of Maxwell’s Demon on Wikipedia and elsewhere.

Yes Valerie, it does prohibit the unbaking of a cake. I guarantee you can watch a cake almost forever and by itself it will never fall apart into neatly segregated measured amounts of flour, baking soda, water, sugar, and whatever else went into it. It *is* remotely possible it will unbake itself but the improbability is practically indistinguishable from impossible. This is 2LoT at work. You can pour all the blind energy (work) into it you want and it won’t unbake itself. However, an intelligent agent can conceivably unbake it (given some energy of course) because intelligence doesn’t use the energy in a blind manner. Intelligence can overcome 2LoT. This is what makes intelligence a unique property in the universe. Intelligence can change the natural outcome of events. It can overcome almost impossible odds easily and routinely. It boggles my mind that this concept seems beyond the grasp of many otherwise fairly bright people. Maybe Davison is right and there’s something genetic that blocks people from hearing Einstein’s “music of the spheres”. -ds

6. 6
valerie says:

“Life doesn’t even require the sun. There are bacteria that get all their energy gradient from heat deep in the earth which is generated largely by radioactive decay and was never due to the sun.”

The fact that their energy doesn’t come from the sun doesn’t mean that they’re violating the 2nd law. They still get their energy from an external source. Without that energy, the 2nd law prevents them from maintaining their low entropy state, and they die. If they were able to live without an external source of energy, *then* they would be violating the 2nd law, and we could rightly call them living perpetual motion machines.

7. 7
physicist says:

Davescot,

I am still asking you for a clear definition of what you call information entropy. I just gave you the standard stat mech definition but perhaps you mean something completely different. Certainly, the definition I gave is equivalent to the thermodynamic definition of entropy, as any stat mech book will explain.

My other question is whether you think any laws of physics are violated by the hypothesis that solely RM+NS is responsible for evolution.

If so, which laws are they? If not, then there is no need to invoke physics in the ID argument at all.

If you make me supply a link to basic information to you one more time it’ll be the last time. Capisce? -ds

8. 8
physicist says:

Davescot

Although there are probably clearer examples than baking cakes, I think you, me and valerie agree on the following:

“However, an intelligent agent can conceivably unbake it (given some energy of course) because intelligence doesn’t use the energy in a blind manner.”

What I don’t understand is why you think total entropy won’t have increased. I.e. what is your justification for:

“Intelligence can overcome 2LoT”

9. 9
secondclass says:

Dave, the 2nd Law makes no exceptions for intelligence. What you’re saying is that the 2nd Law is wrong.

You’re thinking of classic 2LoT which is heat entropy. Maxwell’s Demon is the classic thought experiment showing how heat entropy can’t be overcome with information, as the information takes as much useful energy to acquire as can be obtained by sorting more and less energetic particles. It’s information entropy we’re talking about here. Information and energy are not interchangeable quantities. 2LoT can be applied to information.

Shannon’s definition of entropy is closely related to thermodynamic entropy as defined by physicists and many chemists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, which became the inspiration for adopting the word entropy in information theory. There are relationships between thermodynamic and informational entropy. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information (needed to define the detailed microscopic state of the system) that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. (See article: MaxEnt thermodynamics). Similarly, Maxwell’s demon reverses thermodynamic entropy with information; but if it is itself bound by the laws of thermodynamics, getting rid of that information exactly balances out the thermodynamic gain the demon would otherwise achieve.

10. 10
Phed says:

It is true that neither information nor entropy is transferred by heat; in fact, neither information nor entropy is “transferred” at all within a system, except in a metaphorical way. There is no wave or particle responsible for the transfer of entropy. Entropy is a quantity measured within a given system and varies according to the system’s changes of state, and it’s those changes that require transfer of energy.
Entropy, in both its thermodynamic aspect and its mathematical aspect of information entropy, follows the second law. Take the sun-earth system: it’s not only a thermally heterogeneous system, but a highly ordered one, with most matter occupying a hot, hyperdense spot, dense enough to trigger nuclear fusion. This order is slowly diffused and lost as the sun radiates light (and radio waves, particles, neutrinos, etc.) and the entropy of the system increases as it heads towards a homogeneous state. As the state of the system slowly changes, information entropy can decrease at points; no law is violated because it increases by a larger factor in the whole system and order is lost.

Dave:
No, intelligence cannot unbake a cake: not in the way the example is used to demonstrate the second law.
We can try to unbake it. Hey, we even do that by eating it! However, no matter how sophisticated our method is: a) we will always spend more energy than was used to bake the cake, and b) we will never end up with the same quantity and quality of ingredients used to make it.
That is what the 2nd Law says- and it’s the same whether we put it inside the Unbaker Mk I, the dog eats it, or we leave it on top of some mountain. The difference is only in quantity. Intelligence does not violate the 2nd Law: Nothing does.

And after all, the main issue remains: If intelligence violated the 2nd Law, so would life itself (its entire existence, not just its evolution), so would the formation of stars and planets. Do we really want to argue whether the Universe is held in place by magic?

You are also guilty of equating information entropy with heat entropy. They are not the same, nor are they interchangeable. The same principles apply to both, just like gravity applies to apples and oranges. Apples and oranges are not interchangeable. Sewell goes through this very nicely. -ds

11. 11

May I just say what I said on the other thread?
“There is none so blind as he who will not see.”

Valerie, physicist, your arguments really amount to nothing more than hair-splitting.

The “force” of the 2nd Law (thank you, Valerie) has been universally observed to be successfully overcome *only* by systems of superiour forces designed and employed by intelligence.

Such observation is the nature of cold, hard science.

Hair-splitting simply won’t do.

12. 12
secondclass says:

Red Reader, if by the “force” of the 2nd Law you mean increasing entropy, surely you’re aware that localized decreases in entropy occur all the time without intelligent intervention.

13. 13
danb says:

“The “force” of the 2nd Law (thank you, Valerie) has been universally observed to be successfully overcome *only* by systems of superiour forces designed and employed by intelligence.”

But, the entropy is not overcome, it is just moved to a different form. For example, if you straighten your desk up, you are putting work in. This work creates heat (muscles get warm!) and warms up your environment. In the entire system (with you in it) the total entropy has increased. There is no way around this. And it’s not hair-splitting or nit-picking. It is a law of nature that can never be overcome.

Heat entropy, yes. Information entropy, no. Say you’re typing a letter. It takes the same amount of energy to type a sequence of 1000 keys regardless of the information content of the sequence. Information entropy and heat entropy are not interchangeable but they both obey the same 2LoT. The difference is that intelligent agency can violate the law that requires information entropy to increase in a closed system. This is the hallmark of intelligence – the ability to select between equally improbable events in the present in order to obtain a specified result in the future. -ds
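The typing example in the reply above can be made concrete: two sequences of 1000 keystrokes cost the same number of key presses but have very different empirical Shannon entropy. A minimal sketch (symbol-frequency entropy only; it makes no claim about thermodynamic cost, and the names are mine):

```python
from collections import Counter
import math
import random

def shannon_entropy_per_char(s):
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
alphabet = "abcdefghijklmnopqrstuvwxyz"
repetitive = "a" * 1000                                            # 1000 keystrokes, fully predictable
scrambled = "".join(random.choice(alphabet) for _ in range(1000))  # 1000 keystrokes, nearly uniform

# Same number of key presses either way...
print(len(repetitive), len(scrambled))          # 1000 1000
# ...but very different information content per character
print(shannon_entropy_per_char(repetitive))     # zero bits per character
print(shannon_entropy_per_char(scrambled))      # close to log2(26) ≈ 4.70 bits
```

Note that this measures statistical surprise, not meaning: a meaningful essay and random gibberish of the same letter frequencies score identically, which is part of why the thread’s participants keep talking past each other.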

14. 14
worldsoyster says:

I am not sure that this is good. Does a tree have more information than a seed and dirt and water? I do not know the answer.

I think that maybe it is the distribution of information that is important. I look at artificial life programs (Avida and Tierra) and they seem to make information (or maybe not; I do not know). But they make it in a smooth distribution. Life is not smooth. Kidneys are not mixed with liver; lungs are not mixed with heart. Some parts are complex, some parts are not complex. Can life programs do this? I do not think they can.

15. 15
hanseconomist says:

All you intelligent people who can overcome the 2nd law must have very low utility bills, since no doubt you have designed your very own perpetual motion machines that generate your energy for you.
Why give ID a bad name by employing such bogus physics arguments? Any physics undergraduate can defuse this nonsense. Name me one famous physicist who subscribes to these ideas and I shall hold my peace forever.

Der Hans

This is about information theory which is a field of mathematics. Physicists are not the go-to experts for this. Mathematicians are. Not surprisingly both Granville Sewell and William Dembski are mathematicians. You can hold your peace in moderation land for a while. -ds

16. 16
valerie says:

DaveScot wrote:
“It boggles my mind that this concept [the power of intelligence] seems beyond the grasp of many otherwise fairly bright people.”

Dave,

When you find yourself in disagreement with a large number of bright, scientifically trained people, it pays to consider the possibility that you are misunderstanding their positions or that they are aware of something that has eluded you. Sure, it’s *possible* that the physics community, the biology community, and the mechanical engineering community all have the same blind spot that you and Red Reader have somehow managed to avoid, but there is another, more likely possibility as well.

I am well aware of the power of intelligence, and I’m sure physicist and hanseconomist are also. All we are saying is that intelligence does not operate without a price, and the price is exacted by the 2nd law. If an intelligence unbakes a cake, the 2nd law is *not* violated. The entropy increase in the cake’s surroundings more than compensates for the entropy decrease due to unbaking.

The fact is that even Maxwell’s Demon, a lightning-fast intelligence, cannot violate the 2nd law. Why do you think other forms of intelligence can?

You’re about ready for a timeout for boring the moderator with argumentum ad populum. I suggest you slow down the torrid pace of your commenting here if you want comments approved in the near future. -ds

17. 17
worldsoyster says:

Valerie

Does thermodynamics work with gravity? I have read that the water at the bottom of a waterfall is warmer than the water at the top of a waterfall. The 2nd law can accept this?

18. 18
Deuce says:

I’ve generally tended to just write off the “2nd Law Of Thermodynamics disproves Darwinism” type arguments (usually, they’re just presented as “The 2LoT says order can’t come from disorder, so evolution is impossible, nyah, nyah, nyah”). However, I think Sewell has built a much better case than the norm here, with a much deeper understanding of the logic behind the 2nd Law. In particular, what he argues, based on the equations, on pages 3 and 4 deserves to be looked at. A couple parts quoted here:

Similarly, the increase in “carbon order” in an open system cannot be greater than the carbon order imported through the boundary, and the increase in “chromium order” cannot be greater than the chromium order imported through the boundary, and so on.

The fact that order is disappearing in the next room does not make it any easier for computers to appear in our room—unless this order is disappearing into our room, and then only if it is a type of order that makes the appearance of computers not extremely improbable, for example, computers. Importing thermal order will make the temperature distribution less random, and importing carbon order will make the carbon distribution less random, but neither makes the formation of computers more probable.

Now, it’s true that we can’t create perpetual motion machines, which was also my first thought when I read this. But, I think Sewell’s analysis goes deeper than that. If we were not here, the sun would still burn just as brightly, the solar entropy occurring at essentially the same rate. However, we wouldn’t be here creating carbon order, or computer order, etc. Hence, even if we are limited by the import of solar order, and even if the things we make are temporary, we are creating order that would otherwise not exist when we use it to increase the order in another medium. We could just use the same amount of heat entropy and not do anything interesting with it. If, as Dr. Sewell argues, the logic of the 2ndLoT applies to physical mediums generally, and not just heat, then we have indeed beaten, or at least resisted, the 2ndLoT by creating more total order in the history of the sun-earth system than would have otherwise existed.

I had just about given up hope for this thread until I read your comment, Deuce. Information is not a physical medium although I can’t see how it can exist without a physical medium to store it. 1LoT in information theory (IT): information can neither be created nor destroyed, it can only change state. 2LoT in IT: information gradients tend to diffuse in a closed system until information is equally distributed. Intelligence (i.e. Maxwell’s Demon) cannot overcome 1LoT and 2LoT in regard to energy but it can in regard to information. Information and energy are not the same thing but they behave according to the same principles of conservation and entropy as matter and energy. Information is an arrangement of matter that has subjective meaning. Neither arrangement need have more or less intrinsic matter/energy entropy. -ds

19. 19
ctaser says:

“Does thermodynamics work with gravity? I have read that the water at the bottom of a waterfall is warmer than the water at the top of a waterfall. The 2nd law can accept this?

Comment by worldsoyster — March 6, 2006 @ 7:04 pm”

what’s the problem? Why would the 2nd Law be challenged here?

20. 20
Deuce says:

We have created more order on earth than ever before (well, let’s stipulate so), but at the same time we’ve been burning lots of stuff to be able to do so, thus making sure there’s no net overall decrease in entropy. The same goes of course for other organisms that “create order”, say termites.

I didn’t say that we have increased the overall order over what it was. For instance, I’m not claiming that we’ve made it so that there is more overall order at time B than there was at time A. I’m pointing out that if one grants Sewell’s argument that entropy applies in physical mediums other than heat, then we’ve increased the total amount of order that there has been in history, since all that same stuff could have been burned without increasing order in any other mediums. Btw, I think there’s one possible way around this, though I think I’ll leave it to others to figure it out (I’m not sure if it holds up under scrutiny or not).

Why do you hate physics so much?

Umm… okay. Please, let’s not get all bent out of shape. I’m just putting out ideas that came to me and trying to discuss them. This wasn’t meant as an attack on you or anything you hold dear. Honestly, and I’m not trying to put you down here, but you should maybe step back a bit and take a break if you’re going to react this way.

21. 21
Jack Krebs says:

Nice comments, Valerie. Would you drop me a line at jkrebs@sunflower.com?

22. 22
danb says:

“Heat entropy, yes. Information entropy, no. Say you’re typing a letter. It takes the same amount of energy to type a sequence of 1000 keys regardless of the information content of the sequence. Information entropy and heat entropy are not interchangeable but they both obey the same 2LoT. The difference is that intelligent agency can violate the law that requires information entropy to increase in a closed system. This is the hallmark of intelligence – the ability to select between equally improbable events in the present in order to obtain a specified result in the future. -ds”

But the 2LoT of “information entropy” was not violated in writing that letter. Directly, that “information order” came from your brain. Neurons fired, interacted, and energy was consumed by your brain. Thinking takes energy. (Ever been exhausted after a day sitting at the desk?) Why else would so much of our body’s blood supply be devoted to our brain? Your brain had to do a lot of work to sort the information. And that energy came from digesting that delicious cake earlier. Sure, some people can write better than others, but that just means that they are more efficient in turning cake into information. It is MUCH easier for me to write gibberish than an essay. Why? It takes less energy, that’s why. Can you give me a scenario of an intelligent agency creating information that doesn’t require any energy?

I get exhausted sitting in the sun on my boat all day drinking adult beverages. Trust me, there’s not much thought required. “Choicing zygotes albeit nuclear anatomy” is meaningless and it took more energy for me to think of that than it did anything meaningful in this sentence. What’s up with that? -ds

23. 23
physicist says:

Davescot,

I think lots of other people have already said sensible things here. Let me just point out that you have not given any evidence for the following assertion:

“Intelligence can accomplish things that nature cannot and that includes violating 2LoT in relation to information entropy”

I don’t think disputing this is hairsplitting. It’s simply not true that ID arguments can be directly connected with the violation of any physical laws. If I am missing which physical laws are necessarily violated by the RM+NS hypothesis please do tell me, I’d honestly be very interested.

Of course, if you want to invent a new physical law, which is routinely violated in everyday life, and then show it is violated in the RM+NS hypothesis, you can—but it won’t be very convincing to any physicists. Is this what you mean by ‘the force of the second law’?

24. 24
Deuce says:

Hi Val,

On the other hand, if he really *does* believe that a growing plant requires constant supernatural intervention to help it violate the 2nd law, then we should be able to detect the violation scientifically.

I think you may have constructed a false dilemma here, which may rest on semantics rather than being a real conceptual problem. One could just as easily call the plant itself “supernatural” to solve the problem you presented. Clearly, some distinction or rationale would be needed for why the Law didn’t apply to us the same as everything else, but I don’t see that just because we call it a “natural law” therefore the rationale would have to be “supernatural” and therefore God. If indeed we are producing behavior that doesn’t follow the 2nd Law, I think the more apt distinction is between material and intentional causes, rather than “natural” and “supernatural” ones. Of course, further explanation would be needed for the plant’s case. Maybe I’ll elaborate on this later (It could easily get off-topic, and turn into a philosophy of mind issue).

It’s true that sunlight shining on Earth creates more order than it would shining on a barren planet, but in neither scenario is the 2nd law violated. If I have a bunch of steam, I can use it to run a factory machine, producing order, or I can simply vent it straight into the winter air. The first choice produces more order than the second, but in both cases, the amount of disorder produced exceeds the amount of order produced.

Yes, but the 2ndLoT equation requires more than just that the total amount of disorder produced exceeds the amount of order produced. Let’s look at a different example. Say I’m given a pile of 500 pennies. I could take those pennies and put them in a row, perhaps putting them all heads up, or encoding the list of starting prime numbers in binary, or whatever, to increase the “penny order”. Alternatively, I could use up the same amount of thermal entropy putting the pennies down randomly. While in the first case the thermal entropy is used to create the penny order, it’s not really correct to say that the thermal entropy was “converted into” penny order. The same amount of thermal entropy occurs either way, and it’s invisible, from the standpoint of the thermal order, whether it is used to create order or disorder in another medium. Meanwhile, the penny order is increased without being transported across the boundary as equation 5 requires. So, if one accepts that entropy applies to mediums other than heat, behavior that doesn’t follow the entropy equation has occurred, even if the total disorder at time B is greater than it was at time A.
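The penny example above lends itself to a quick microstate count in the spirit of Boltzmann’s S = k log W, here with k = 1 and base-2 logarithms so the entropy comes out in bits (an illustrative sketch of my own, not something from Sewell’s paper):

```python
import math

N = 500  # pennies, each showing heads or tails

# Number of distinct heads/tails arrangements ("microstates") of 500 pennies
W = 2 ** N

# Boltzmann-style entropy S = log2(W), in bits, if every arrangement is equally likely
random_arrangement_entropy_bits = math.log2(W)

# Any one fully specified arrangement (all heads, primes encoded in binary, ...) is a
# single microstate, so its probability under blind placement is 1 / 2^500
p_specified = 1 / W

print(random_arrangement_entropy_bits)  # 500.0
print(p_specified)                      # roughly 3e-151
```

The count shows why a specified arrangement is astronomically improbable under blind placement; whether picking one out “violates” anything is exactly the point the thread disputes.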

25. 25
MikeG says:

Editor;
You said in response to my post (#2):
“I’m guilty of taking it for granted that people in a discussion such as this know that the energy in photons is measured by degrees Kelvin”, and you give a reference to Wikipedia’s entry for black-body radiation.
You need to re-read the entry and get a better understanding of physics. The units of energy are kg*m^2/s^2, not degrees. The entry in Wikipedia describes the energetic emissions of a black body at a given temperature. It in no way says that the units of energy are degrees K.
Traditionally, the calorie, a unit of energy, was defined as the amount of energy required to raise 1 gram of water 1 degree Celsius. Heat is a form of energy and so are photons, but heat and photons are not the same thing.

What’s the difference between 3K electromagnetic radiation and 3000K electromagnetic radiation and what is the significance of these numbers? A wrong answer means you’re out of here. A right answer concedes my point. Enjoy. -ds

26. 26
physicist says:

Steve2005

I think davescot is more or less correct on this point. You can just use the Boltzmann constant k_b to convert between these units.

But I don’t think it’s the main issue!

Steve2005 was banished on the spot a minute ago for highly offensive language. And of course he doesn’t know his gluteus maximus from his elbow, as you politely point out here, but that’s not grounds for instant dismissal. -ds

27. 27
physicist says:

Mike G,

Davescot is certainly correct that one can trivially convert between these units—in stat mech sometimes it’s just conventional to consider temperature as having units of energy. Of course in your language this ‘temperature’ is just (k_b T).

Davescot,

I really don’t see how you can justify your assertion that ‘intelligence’ (whatever that nebulous concept is) can violate the second law. Or any other physical law, for that matter. Where is your argument or evidence for this extremely bold assertion?

28. 28
danb says:

Physicist says: “I really don’t see how you can justify your assertion that ‘intelligence’ (whatever that nebulous concept is) can violate the second law. Or any other physical law, for that matter. Where is your argument or evidence for this extremely bold assertion?”

All you need to do, Dave, is supply an example of intelligence that can act without expenditure of energy. (By expenditure, I mean that useful energy such as electricity, kinetic energy, potential energy, etc. gets turned into less useful energy, i.e. heat.) Energy is always conserved after all; it’s the fuel that we need to worry about.

We are talking about information, not energy. How much energy is in the pattern “DanB just doesn’t get it”? -ds

29. 29
DaveScot says:

Physicist

Do you somehow calculate any reasonable possibility that nature, absent intelligence, could have created the information in your computer’s memory subsystems? Do you think nature, absent intelligence, could have created the information represented by the particular arrangement of atoms in the chair you’re sitting in? 2LoT applied to information would have virtually denied the possibility of chance assemblage of that information. Intelligent agency routinely overcomes virtually impossible odds. That is the hallmark of intelligence.

30. 30
Gandalf says:

The individual plant does not violate the law of information entropy (if we can call it that) because the information to generate the plant in its entirety is already in place when the seed comes into existence.

The question is whether any information can come into existence — or increase in meaningful complexity — without a greater **information** input.

A key concept here is that duplicating information is not the same as creating it. I think that understanding would be central to evaluating information entropy compared to heat entropy.

31. 31
physicist says:

Davescot

If you can give me a clear and precisely worded example of an ‘intelligent’ agency causing a violation of the second law, please do.

Me writing this sentence. -ds

32.
secondclass says:

Dave,

– All entropy is information entropy, as entropy indicates the information content of a given state. I can point you to articles that formalize this idea, if you’d like. In return, I’m hoping you can point me to articles that formalize Sewell’s argument.

– If virtually impossible odds are routinely overcome by intelligence, then in what sense are they virtually impossible?

They are virtually impossible without intelligence. -ds

33.
valerie says:

physicist wrote:
“Davescot is certainly correct that one can trivially convert between these units—in stat mech sometimes it’s just conventional to consider temperature as having units of energy. Of course in your language this ‘temperature’ is just (k_B T).”

physicist,

Temperature is equivalent to the average energy per molecule of a substance. Boltzmann’s constant gives a way of converting between the two, going from joules to Kelvins and back. This is uncontroversial.

But this has nothing to do with blackbody radiation. You cannot multiply the blackbody temperature of radiation by Boltzmann’s constant to find out how much energy there is per photon. Doing so gives you the average energy per molecule in the black body, but not the average energy per photon in the radiation being emitted by the body.

Individual photons do not have a blackbody temperature.

I’ve had it with your crap, Valerie. This stuff is not hard to learn but you make no attempt, thinking you know it all already. Single photon calorimeter developed by NASA. Don’t bother commenting on UD for the next week. -ds

34.

However much the darwinites obfuscate this scientific problem with their ‘uphill’ theory…
the ‘downhill’ experience of everyday life contradicts mindless macroevolution.

35.
fnds says:

It seems clear even to a mere engineer like me that the point being made is that energy and information both follow the laws of entropy but are independent.

Thermodynamic entropy can be increased or decreased by heating or cooling.

Information entropy can be increased or decreased by “removing” or “adding” information.

These effects are disconnected, even though they can be observed on a given system at the same time.

After reading Sewell’s article, the conclusion would be that for information entropy to decrease on earth, producing DNA for instance, it would be necessary that the corresponding “amount” of information was being injected into the system through its boundaries. The loss of energy in the form of heat does not explain the decrease in information entropy observed, the same way cooling the water in the glass won’t make the dye undistribute, as stated in the initial post.

The critics should bring examples where decreasing energy entropy leads to a decrease in information entropy, without confusing the medium where the information is stored with the information content. Energy is spent in the storage, transmission, and manipulation of information, but that does not decrease the information entropy of a system.

36.
Edin Najetovic says:

“Do you think nature, absent intelligence, could have created the information represented by the particular arrangement of atoms in the chair you’re sitting in? 2LoT applied to information would have virtually denied the possibility of chance assemblage of that information. Intelligent agency routinely overcomes virtually impossible odds. That is the hallmark of intelligence.”

I have been meaning to ask this for some time. What makes intelligence so exalted above nature that we can reckon it not to be a part of it? Because that is what you seem to be implying with posts like these. You are making the creating intelligence supernatural, and that is not what ID should be doing if it wishes to be scientific. Because, as you no doubt know, any appeal to the supernatural destroys the scientific value of a theory. A dichotomy with intelligence on one side and ‘nature’ on the other reeks of supernaturalism. What can you do to overcome that barrier?

37.
avocationist says:

Edin,

How does it sound supernatural when he gives himself typing a coherent sentence or a human manufacturing a chair as examples of the difference intelligent input can make? Perhaps we can use the example of a beaver building a dam. Nature on its own would not form the dam, but the beaver is certainly part of nature.

It seems to me physicist and Valerie are focusing on the heat exchange and pressuring DS to agree that he does not get away without increasing entropy while inputting information. I don’t think that DS is really attempting to state that he can avoid the normal entropy of physical processes while engaging in intelligent input, rather he is stating that the very same minimal entropy increase to the environment that is involved in typing a sentence is about the same whether he types nonsense or very valuable and coherent words with meaning.

In other words, you get more value for your money.

In fact, I’m thinking this should probably be included as part of the definition of information, if it hasn’t already been.

38.
physicist says:

Dear Valerie

“But this has nothing to do with blackbody radiation. You cannot multiply the blackbody temperature of radiation by Boltzmann’s constant to find out how much energy there is per photon.”

No—basically, you can. In blackbody radiation characterised by temperature, T, the total (averaged in the grand canonical ensemble) number of photons is:

N ~ V*T^3

while the total energy is:

E ~ V*T^4

Hence the energy per photon does indeed go like E/N ~ T. The dimensionful part of the constant of proportionality is k_B, as usual. I might be misunderstanding what your point was, but you seemed to be saying the energy per photon in BB radiation *didn’t* grow linearly with the temperature of the radiation, which is wrong. Any stat mech book will have this calculation.

Maybe you just mean that there is a dimensionless number, too, which I agree with, but I think the question was more about units.

So Davescot was correct, unless I am misreading what you are both saying. However, I think it’s off the main topic of this debate!
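For anyone who wants to verify the E/N ~ T claim without opening a stat mech book, here is a quick numerical sketch using only the standard library. The two integrals below are the dimensionless Planck integrals behind E ~ V·T^4 and N ~ V·T^3, and the expected ratio π⁴/(30·ζ(3)) ≈ 2.701 is a textbook constant, not something asserted in the thread:

```python
import math

def integrate(f, a=1e-6, b=50.0, n=200_000):
    """Simple trapezoidal rule on [a, b]; the integrands below
    vanish at 0 and decay exponentially, so this range suffices."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    total += sum(f(a + i * h) for i in range(1, n))
    return total * h

# Planck integrals: total energy ~ integral of x^3/(e^x - 1),
# total photon number ~ integral of x^2/(e^x - 1)
energy_integral = integrate(lambda x: x**3 / math.expm1(x))   # = pi^4/15
number_integral = integrate(lambda x: x**2 / math.expm1(x))   # = 2*zeta(3)

c = energy_integral / number_integral
print(f"<E> per photon = {c:.3f} * k_B * T")  # c ≈ 2.701
```

The mean photon energy is thus a fixed multiple of k_B·T, exactly as physicist says: it grows linearly with the temperature of the radiation.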

39.
physicist says:

Davescot

“Physicist–If you can give me a clear and precisely worded example of an ‘intelligent’ agency causing a violation of the second law, please do.

Davescot–Me writing this sentence.”

No, as other people have pointed out, the overall entropy will have increased as you think about and write this sentence. One would need to take into account the very complicated system of your brain, which probably none of us have a desire to do!

If you want to convince any physicists, you should come up with a precise example where you can do the calculations and show that entropy has decreased in a closed system. I am skeptical.

I’m guessing, but it sounds like what you actually want to do is isolate the entropy in that sentence, call it information entropy and then say it has clearly decreased as the sentence is typed. Well, fair enough, but unfortunately this does not violate any statement of the second law!

Of course, if you want to invent a new law, which is routinely violated, well no one will be impressed when it is…routinely violated. I think you need to think more clearly about exactly what ‘law’ you are violating, and exactly who thinks this ‘law’ is a physical law.

40.
physicist says:

Dear Davescot (again)

“Physicist, Do you somehow calculate any reasonable possibility that nature, absent intelligence, could have created the information in your computer’s memory subsystems? Do you think nature, absent intelligence, could have created the information represented by the particular arrangement of atoms in the chair you’re sitting in? 2LoT applied to information would have virtually denied the possibility of chance assemblage of that information. Intelligent agency routinely overcomes virtually impossible odds. That is the hallmark of intelligence.”

I find your statements here completely unchallenging. This is because the intelligence you are referring to in making computers and chairs is human intelligence, and humans are a part of nature.

So essentially your statement comes down to: ‘if there were no humans, would you be surprised if things that are made by humans are no longer found?’

My answer is no, and moreover there is no challenge here to the second law.

41.
j says:

An attempt to clarify some understandings:

Thermodynamic entropy = Boltzmann’s constant x Shannon (information) entropy, or, S = kH

Energy = Boltzmann’s constant x temperature, or E = kT

Hence, thermo entropy is to info entropy as energy is to temperature
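The first identity above can be sanity-checked in a few lines. A minimal sketch (one caveat the shorthand hides: S = kH requires H measured in nats, not bits; for W equiprobable microstates it reduces to Boltzmann's S = k·ln W):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def shannon_nats(probs):
    """Shannon entropy in nats: H = -sum(p * ln p)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# W equally likely microstates: H = ln W, so S = k_B * H = k_B * ln W,
# recovering Boltzmann's formula S = k ln W.
W = 1000
H = shannon_nats([1 / W] * W)
S = K_B * H
print(H, math.log(W))  # both ≈ 6.908
```

So "thermodynamic entropy = k × information entropy" is just Boltzmann's formula read through Shannon's definition.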
__________

The entropy rate balance for a control volume (CV) is as follows:

[rate of change of entropy within the CV] =
+ [rate of entropy transfer associated with heat transfer across boundary] (1)
+ [rate of entropy transfer associated with mass transfer into the CV] (2)
– [rate of entropy transfer associated with mass transfer out of the CV] (3)
+ [rate of entropy production within the CV due to irreversibilities] (4)

(1) is positive when heat is transferred in, negative when heat is transferred out.
(2) and (3) are zero for closed systems (mass is not transferred into or out of a closed system).
(4) is never negative; it is zero only in the ideal limit of a reversible process.
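The balance above can be written down directly as code. A minimal sketch (the numbers in the example are invented purely for illustration):

```python
def entropy_rate_of_change(q_dot, t_boundary, s_in_flows, s_out_flows, sigma_dot):
    """
    Entropy rate balance for a control volume:
      dS/dt = sum(Q_dot/T) + sum(m_dot*s)_in - sum(m_dot*s)_out + sigma_dot
    q_dot, t_boundary: paired heat transfer rates (W) and boundary temps (K)
    s_in_flows, s_out_flows: entropy carried by mass flows (W/K)
    sigma_dot: entropy production from irreversibilities (W/K), never negative
    """
    assert sigma_dot >= 0, "entropy production is never negative"
    heat_term = sum(q / t for q, t in zip(q_dot, t_boundary))
    return heat_term + sum(s_in_flows) - sum(s_out_flows) + sigma_dot

# Closed system (no mass flow) losing heat: dS/dt can be negative
# when entropy leaves with the heat faster than irreversibilities create it.
rate = entropy_rate_of_change([-500.0], [300.0], [], [], 0.8)
print(rate)  # -500/300 + 0.8 ≈ -0.867 W/K
```

The example shows the point j makes next: a system's entropy can fall locally, provided the outbound transfer terms outpace the always-nonnegative production term.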
__________

Now, some of my musings:

By analogy, since the First Law of Thermodynamics is basically the Law of Conservation of Energy, perhaps the Law of (Imperfect) Conservation of (Thermodynamic) Entropy (i.e., 2LoT) is better thought of as the First Law of Infodynamics? (Jaynes suggested something similar in 1957.)

For a closed system, if more entropy is transferred out than is transferred in, and if this rate is greater than the rate of entropy production due to irreversibilities, then entropy will be reduced within the system. With undirected natural processes, “self-organizational” phenomena (e.g., the formation of ice crystals, vortices in flowing water, ripples on sand dunes) can reduce entropy faster than irreversibilities generate it for limited periods of time and space. But such phenomena are necessary, rather than contingent; also, they just don’t have the same “character” as the low-entropy phenomena in living things. E.g., they can’t create novel cell types, tissue types, organs, or body plans (credit: ds). (Things to ponder: Why are there self-organizational tendencies of matter? From where do they originate?)

The following two considerations suggest how the Second Law of Thermo may be violated: Consciousness (which would appear necessary for true, original intelligence, rather than stored intelligence as in computer programs) seems to be associated with quantum mechanical phenomena; there are “two unwritten assumptions about Shannon’s definition of information that may make it inapplicable as such to quantum mechanics: (1) the supposition that there is such a thing as an observable state (for instance the upper face of a die or a coin) before the observation begins; (2) the fact that knowing this state does not depend on the order in which observations are made (commutativity).” (en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory) Or, perhaps the Second Law is not universally valid, as Stephen Wolfram suggests.

Whether or not intelligence violates the entropy accountancy required by 2LoT, what it certainly can do is repeatedly and unpredictably create regions of indefinitely large amounts of low entropy, of arbitrary character. Undirected nature can’t.

42.
woody says:

davescot: “1LoT in information theory (IT): information can neither be created nor destroyed, it can only change state. 2LoT in IT: information gradients tend to diffuse in a closed system until information is equally distributed.”

Dave,

Do you have any justification to offer for either of these “laws”?

Of course. The first law is more controversial. In a deterministic universe information is never created or destroyed. In principle, sufficient knowledge of the state of any closed system enables perfect prediction of its past and future.

Before you say the universe is not deterministic, I suggest you at least know that it is perfectly deterministic at larger-than-quantum scales as far as physics informs us, and at quantum scales the jury is out on whether quantum uncertainty is an artifact of not having complete knowledge, such as a theory of quantum gravity.

Dembski proposes a law of conservation of information, but I didn’t know that until just now when I googled it. I figured it out on my own long ago as it’s an obvious logical consequence of a deterministic universe. The only question is whether the universe is truly deterministic. I suspect that it is, with the sole exception that intelligent life is non-deterministic. In other words, “free will” exists, and that alone can break the law of conservation of information.

There is little controversy about the second law in information theory.

Now I have a question for you. Why did I have to give you this information when it’s freely available on the internet with a simple google? I’m not here to do your homework for you. Next time you question me I expect you to have done a little research yourself first or you’ll be asking questions on a different blog.

Your “1LoT” is not true. Information can be destroyed (and is destroyed all the time). Logically irreversible processes destroy information. For example, a three-input OR gate destroys information — you cannot reproduce the values on the three inputs from the value on the single output.

John von Neumann suggested, and Rolf Landauer proved, that there is a minimum entropy increase (and hence energy cost) associated with the destruction of information by irreversible processes. The field of reversible computation theory is aimed at reducing the entropy increase by making the logical processes reversible, thus preventing the destruction of information. Our three-input OR gate becomes a three-input, three-output gate in a reversible computer.
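Both points — the irreversibility of the OR gate and the Landauer cost of the lost bits — are easy to make concrete. A sketch (the uniform-input distribution and the 300 K temperature are illustrative assumptions):

```python
import math
from itertools import product
from collections import defaultdict

# A 3-input OR gate maps 8 input states onto 2 output states, so it is
# logically irreversible: the inputs cannot be recovered from the output.
preimages = defaultdict(list)
for a, b, c in product([0, 1], repeat=3):
    preimages[a | b | c].append((a, b, c))

print(len(preimages[1]))  # 7 input states collapse onto output 1

# Bits destroyed (assuming uniform inputs): H_in - H_out
h_in = 3.0  # 8 equiprobable input states = 3 bits
p1 = 7 / 8  # probability the output is 1
h_out = -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))
lost_bits = h_in - h_out

# Landauer's bound: each erased bit dissipates at least k_B * T * ln 2
K_B = 1.380649e-23  # J/K
min_heat = lost_bits * K_B * 300 * math.log(2)  # joules, at an assumed 300 K
print(f"{lost_bits:.3f} bits lost, >= {min_heat:.2e} J dissipated")
```

A reversible version of the gate would carry all three inputs through to the outputs, losing nothing and in principle paying no Landauer cost.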

Regarding your “2LoT”, what do you even mean by an “information gradient” or the “diffusion of information”?

Woody

43.
jjj says:

Interpreting the laws of thermodynamics in terms of “information” (in the sense of a Shannon entropy) is fine (and not at all controversial, as you know — a statistical mechanics approach to thermodynamics has an analogous form), but to do so, one must equate energy and information, though obviously through some conceptual mechanism. And that means the second law of thermodynamics still applies to any change in the nature of information (or, equivalently, energy), however it might occur.

44.
j says:

The “conceptual mechanism”?: info entropy change = [integral(heat energy / temperature)] / k.

But 2LoT may not be applicable to QM, or otherwise may not be universally valid. If it’s not, then it’s moot.

45.
DaveScot says:

j said

Whether or not intelligence violates the entropy accountancy required by 2LoT, what it certainly can do is repeatedly and unpredictably create regions of indefinitely large amounts of low entropy, of arbitrary character. Undirected nature can’t.

That’s what I wanted to say but I guess I didn’t find the right words. I consider repeated and unpredictable creation of large regions of low entropy to be a violation of 2LoT. If a glass of water becomes warmer by sucking heat out of a cooler surrounding environment, that’s the creation of a region of low entropy – whether permanent or not, it’s a violation of 2LoT. Similarly, if a DNA molecule becomes more complex by sucking information out of a less information-dense environment, that also is a violation of 2LoT. The difference is that you will never see this kind of violation in heat transfer, and aside from living things you won’t see this violation in information transfer either. We know intelligent agency routinely causes large regions of decreased information entropy. The big question then becomes whether any unintelligent process can create large regions of decreased information entropy. Granville Sewell says there are none and I agree with him.

There also appears to be a lot of confusion between order and information. An example of increased order is wind ripples on sand. But order is not necessarily information in information-theoretic terms. There’s no specified information in those ripples, no message (or at least no message we know of). The information contained in them is objective, quantifiable, and not different in kind or quantity from the information contained in unrippled sand. What the ripples have more of is order, not information.

Contrast this with the digitally encoded information on the spine of the DNA molecule. There is specified information quite different in kind and quantity from the order you see in crystals or the objective information seen in an unspecified, arbitrary string of nucleic acids of similar length. Digitally encoded information that’s meaningful to a ribosome in building hideously complex self-replicating protein machines is a class of information apart from mere ordered arrangements of matter and energy. That class of information is complex specified information. Physicists don’t deal with this kind of order. This is information theory, not physics, but as Shannon demonstrated, information follows thermodynamic principles. And as Dembski and Sewell and others are attempting to show, when it comes to living things the thermodynamic principles that apply to information are evidently violated.