
Further to Origin of complex cells: Can energy create information? (Lane seems to think that energy can create or substitute for huge amounts of information. This seems wrong, but it is apparently acceptable to The Scientist.) Rob Sheldon, noting that readers’ thoughts were solicited, writes to say,
In thermodynamics, we have the fundamental thermodynamic relation, or defining equation, dU = dQ + dW = TdS - PdV, where U = internal energy, Q = heat, W = work, T = temperature, S = entropy, P = pressure, V = volume, and “d” means “the change of”. In a closed system that is “reversible” (no eddies, turbulence, etc.) and whose volume doesn’t change much (incompressible, like water), we can eliminate the work term and get the equation dQ = TdS, which is to say, the change in heat energy of our system is equal to the temperature times the change in entropy of the system. Or we can rewrite this as dQ/T = dS.
What does this mean?
Well, since S is the entropy, and using Shannon’s definition that Information = -S, then dQ/T = -d(Info).
So, addressing Lane’s book: heat energy is not information. Increasing the heat energy leads to decreasing the information. The same amount of heat energy at higher temperature has more information than the same heat at lower temperature (because, by dS = dQ/T, the same heat carries less entropy when delivered at a higher temperature). For life to extract information from a heat source, it must be a heat engine, extracting high-temperature energy and excreting low-temperature energy. Heat engines accomplish this feat by incorporating carefully constructed linkages and machinery to prevent the work from vanishing in the turbulent, diffusive, entropic working fluid. If the machinery has to be made out of the same material as the working fluid, then it is like saying “a high-information-density state can process heat energy to extract information”.
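To make that sign convention concrete, here is a rough numerical sketch of the dQ/T bookkeeping; the heat and temperature values are purely illustrative assumptions, not figures from Sheldon’s letter.

```python
# Minimal sketch of the dQ/T bookkeeping, under the convention Information = -S.
# The heat and temperature values below are illustrative assumptions only.

def entropy_change(dQ_cal, T_kelvin):
    """Entropy change (cal/K) for heat dQ added reversibly at temperature T."""
    return dQ_cal / T_kelvin

dS_cold = entropy_change(1.0, 300.0)  # 1 cal added at 300 K -> ~3.33e-3 cal/K
dS_hot  = entropy_change(1.0, 600.0)  # 1 cal added at 600 K -> ~1.67e-3 cal/K

# Under Information = -S, adding heat always decreases "information",
# but the decrease is smaller when the heat arrives at a higher temperature.
print(dS_cold > dS_hot)        # True: the hotter delivery produces less entropy
print(-dS_cold, -dS_hot)       # -0.00333 vs -0.00167 (cal/K of "information")
```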
Well, doesn’t this process produce more information than it started with? Wouldn’t this allow for an infinitely great amount of information inside the system, the “Borg” of cellular automata?
No, because entropy is a fluid property, and it wants to diffuse away. The more information that gets concentrated, the greater the gradients, and the more work is expended maintaining those gradients. Therefore, for a fixed energy flow, there is a maximum entropy gradient, a maximum entropy density, where all the effort is expended just staying alive. For heat engines like your car’s, this is set by the maximum temperature the engine can run at before it melts, and Sadi Carnot used the formula (Ti - Tf)/Ti, with Ti the hot-side temperature and Tf the cold-side temperature, to describe this “Carnot efficiency”. For cells, we’ll call this maximum achievable entropy gradient “the life efficiency”, and I think it is fair to argue that the more efficient a heat engine, the more information it must contain. (Have you looked at the number of sensor wires under the hood of a late-model car and compared it to, say, a VW Bug engine?)
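For reference, a minimal sketch of the Carnot-efficiency formula mentioned above; the temperatures are made-up illustrative values, not data about any particular engine.

```python
def carnot_efficiency(T_hot_K, T_cold_K):
    """Maximum fraction of input heat convertible to work: (T_hot - T_cold) / T_hot."""
    return (T_hot_K - T_cold_K) / T_hot_K

# Illustrative (assumed) operating temperatures, in kelvins:
print(carnot_efficiency(900.0, 300.0))  # 0.667: an engine that runs very hot
print(carnot_efficiency(400.0, 300.0))  # 0.25:  a much cooler-running engine
```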
But one thing it cannot do, cannot allow, is “a low information state can process heat energy to spontaneously become a high information state that processes heat energy”. This is the proverbial “perpetual motion machine” now operating on entropy; it is the “perpetual information machine”. For just as Clausius and Kelvin showed that heat engines that produced too much work (that were too efficient) could be linked together to produce a perpetual motion machine, so also fluid “life” machines that produce too much information could be linked together to produce perpetual motion. This proves that no such machine is possible, or else biology would long ago have become a perpetual motion machine that never eats.
So why do these erroneous views, of spontaneous information arising from energy gradients, keep propagating?
Because they fudge the bookkeeping. Entropy is notoriously hard to measure, and so, for example, they might underestimate the information in the cellular machinery and think that a temperature gradient has more than enough entropy to create the cellular machinery. Or, as Granville Sewell argues, they have an open system that allows entropy to arrive and disappear without proper accounting, so that information accumulates inside the cell, which they then misattribute to temperature gradients. But if any of these papers were even within hailing distance of being correct, then perpetual motion would be commonplace by now, and you and I would spend our days wondering what to do with all our free time.
Other readers’ thoughts?
See also: Granville Sewell’s important contribution to physics: Entropy-X
facepalm
The relationship (if any) between energy and information depends a great deal on what definition of information one chooses; there are many definitions available, and which one you should pick depends on what you’re actually trying to do. (Compare this with picking a definition of “size”: depending on what you’re trying to figure out, you might use length, height, depth, circumference, volume, surface area, mass, etc… or maybe several of them at once.)
Unfortunately, the definition Dr. Sheldon chose has pretty serious problems for almost any application.
First, a minor quibble: correct me if I’m wrong, but I don’t think Shannon ever used this definition. I’m not an expert on the history here, but Norbert Wiener was the first I know of to identify information with a decrease in Shannon entropy, though not thermodynamic entropy. AIUI, Leon Brillouin was the first to identify it with negative thermodynamic entropy.
But there are real problems here as well. For one thing, entropy (both Shannon and Boltzmann/Gibbs) is always positive (or at least nonnegative), and hence by this definition information is always negative (or at least never positive).
Now, normally this is dealt with by only looking at changes in entropy, and identifying information with decreases in entropy. But that still winds up being pretty seriously problematic. For instance, the entropy of a typical human is far far higher than that of any bacterium (mostly because of size: entropy is, other things being equal, proportional to size, and humans are far far bigger than bacteria); this would mean that turning a bacterium into a human would be a massive decrease of information. Does that make any sense to anyone at all?
(Side note: I used “size” above without clarifying which measure I meant. In this case, the number of atoms or molecules would probably be the most directly relevant measure.)
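To put rough numbers on that size comparison, here is a back-of-the-envelope sketch; the masses are rough figures I’m assuming (about 70 kg for a human, about a picogram for a bacterium like E. coli), and since both are mostly water, particle count, and hence, other things being equal, entropy, scales roughly with mass.

```python
# Back-of-the-envelope size comparison; the masses are rough assumed figures.
human_mass_kg  = 70.0
e_coli_mass_kg = 1e-15   # ~1 picogram

ratio = human_mass_kg / e_coli_mass_kg
print(f"roughly {ratio:.0e}x as many molecules in a human as in one bacterium")
# Since entropy is (other things being equal) roughly proportional to particle
# count, a human's thermodynamic entropy exceeds a bacterium's by a comparable
# factor; under Information = -S, that would count as a colossal "loss" of information.
```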
But it’s even worse than that, because according to this definition, simply cooling something off increases its information by a huge amount. Consider cooling one cc (about a thimbleful) of water off by one degree Centigrade (= 1.8 degrees Fahrenheit) from, say, 27° C to 26° C (absolute temperatures of 300.15 K and 299.15 K respectively). The amount of heat removed is (almost by definition) about 1 calorie, so ΔQ = -1 cal, and ΔInformation = -ΔQ/T ~= -(-1 cal) / 300 K = +3.33e-3 cal/K. To convert that from thermodynamic units to information units (bits), we need to divide by Boltzmann’s constant times the natural log of 2; that’s k_B * ln(2) = 3.298e-24 cal/K * 0.6931 = 2.286e-24 cal/K. Dividing that into the entropy change we got above gives an information increase of …wait for it… 1.46e21 bits.
That’s over a thousand billion billion bits of information just because a little water cooled off slightly.
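If anyone wants to check that arithmetic, here’s a quick script that reproduces it; the only inputs are the figures already stated above, with Boltzmann’s constant expressed in cal/K.

```python
import math

dQ_cal = -1.0               # heat removed from the water: 1 calorie
T_K = 300.15                # starting temperature, ~27 deg C
k_B_cal_per_K = 3.298e-24   # Boltzmann's constant in cal/K

dS = dQ_cal / T_K                          # ~ -3.33e-3 cal/K
d_info_cal_per_K = -dS                     # "information" gain under Info = -S
d_info_bits = d_info_cal_per_K / (k_B_cal_per_K * math.log(2))

print(f"{d_info_bits:.2e} bits")           # ~1.46e+21 bits
```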
I’m going to go ahead and claim that this definition has almost nothing to do with what most people mean by “information”.
BTW, I’ve discussed this here before; see this earlier comment for a treatment of the relation between information and Shannon entropy.
Also, setting aside the question of “information” for the moment:
I’d agree that the analysis in Daniel Styer’s “Entropy and Evolution” is significantly wrong, but Emory Bunn’s “Evolution and the second law of thermodynamics” has, as far as I know, only one minor error: he assumed the entropy associated with thermal radiation was E/T (where T is the temperature of the emitting body). It’s actually more than that in general (and 4E/3T for the special case of blackbody radiation). I did a version of the entropy flux calculation (quoted here) that takes this into account and got a lower bound of 3.3e14 W/K (compare with Styer’s 4.2e14 W/K). Can Dr. Sheldon point out any other errors in Styer’s analysis, or any at all in mine?
BTW, if you plug that value into -S = Information (with appropriate conversions), you get 3.4e37 bits per second of information increase.
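For anyone checking that conversion, a quick sketch using the 3.3e14 W/K figure from my calculation above and Boltzmann’s constant in SI units:

```python
import math

entropy_flux_W_per_K = 3.3e14       # lower bound from the calculation above
k_B_J_per_K = 1.380649e-23          # Boltzmann's constant

bits_per_second = entropy_flux_W_per_K / (k_B_J_per_K * math.log(2))
print(f"{bits_per_second:.1e} bits/s")   # ~3.4e+37 bits per second
```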
Finally, News (Denyse, I presume?) mentioned Sewell’s X-Entropy. As I’ve pointed out before, Sewell’s X-Entropy calculation is only valid in the specific case of diffusion through a solid (and even then, only in the absence of gravity). In many other situations, it falls apart completely (I give two examples here; I can give more).
The X-Entropy argument has been refuted. It’s done. Give it up.
Excellent choice of topic, Rob Sheldon.
I found this section from A. Dalela’s Moral Materialism, in which his semantic/informational theory of matter is applied to problems in statistical mechanics and thermodynamics, to be relevant to the discussion. Main point: he proposes that energy is information and that it exists on a continuum, from abstract (concepts) to contingent (objects). I highlighted the points I found most salient.
While I don’t know enough about the equations of thermodynamics to say one way or the other whether Dr. Sheldon’s claim, i.e. ‘Increasing the heat energy leads to decreasing the information’, is right from the mathematical, i.e. theoretical, perspective, I do know that his claim is right from the empirical perspective.
Simply put, just pouring raw energy into a system (as with the sun pouring energy onto the earth, or as with boiling water) actually increases the disorder of a prebiotic system, i.e., it yields less biological information.
Given the implausibility of ‘hot’ origin-of-life scenarios, some Darwinists have instead opted for promoting a ‘cold’ origin-of-life scenario involving ice. Yet the cold origin-of-life scenario is also found to be implausible, for several reasons.
Nick Lane himself notes a ‘significant conceptual flaw’ in some origin of life research regarding ‘equilibrium’.
In this regard, Nick Lane is right. Biological life is dramatically characterized by its thermodynamic disequilibrium with the environment, not by its equilibrium.
Professor Harold Morowitz has shown that the Origin of Life ‘problem’ escalates dramatically beyond the oft-quoted 1 in 10^40,000 figure when working from a thermodynamic perspective:
Dr. Morowitz did another probability calculation, working from the thermodynamic perspective with an already existing cell, and came up with this number:
Also of related interest is the information content that is derived for a ‘simple’ cell when working from a thermodynamic perspective:
Thus, for someone to suggest that there is no conflict between thermodynamics and the increase of non-trivial biological information is simply to be out of touch with the empirical realities of the situation.
So what is the definition of “information” in these cases? It’s being bandied about as if it’s some sort of widely-accepted unitary concept but it sounds nothing like what I would recognize as information.
Seversky, I believe that Dr. Sheldon and Gordon Davisson are talking about Shannon information whereas I’m looking for an increase in non-trivial functional biological information.
Huh? The same amount of heat energy at a higher temperature??? Did I miss something there?