Further to Origin of complex cells: Can energy create information? (Lane seems to think that energy can create or substitute for huge amounts of information. This seems wrong, but it is apparently acceptable to The Scientist.) Rob Sheldon, noting that readers’ thoughts were solicited, writes to say,
In thermodynamics, we have the fundamental thermodynamic relation or defining equation dU = dQ + dW = TdS – PdV, where U = internal energy, Q = heat, W = work, T = temperature, S = entropy, P = pressure, V = volume, and “d” means “the change of.” In a closed system that is “reversible” (no eddies, turbulence, etc.) and whose volume doesn’t change much (incompressible, like water), we can eliminate the work term and get the equation dQ = TdS, which is to say, the change in heat energy of our system is equal to the temperature times the change in entropy of the system. Or we can rewrite this as dQ/T = dS.
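The relation dS = dQ/T can be sketched numerically. This is a minimal illustration with made-up numbers, not anything from Sheldon's letter: the same parcel of heat raises entropy more when it is added at a lower temperature.

```python
# Illustrative sketch of dS = dQ/T for a small reversible heat transfer.
# All numbers are made up for illustration.

def entropy_change(dQ, T):
    """Entropy change (J/K) for heat dQ (J) added reversibly at temperature T (K)."""
    return dQ / T

dQ = 100.0                            # joules of heat
dS_cold = entropy_change(dQ, 300.0)   # added at 300 K
dS_hot = entropy_change(dQ, 600.0)    # added at 600 K

# The same heat produces a larger entropy change at low temperature
# than at high temperature.
assert dS_hot < dS_cold
```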
What does this mean?
Well, since S is the entropy, and using Shannon’s definition, Information = –S, then dQ/T = –d(Info).
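The sign convention here (information as negative entropy) can be illustrated with Shannon’s formula. A small sketch, with illustrative distributions of my own choosing: a sharply peaked distribution (a nearly certain outcome, i.e., high information) has low Shannon entropy, while a uniform distribution has maximal entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4                # maximally spread: high entropy, low information
peaked = [0.97, 0.01, 0.01, 0.01]   # nearly certain: low entropy, high information

# Lower entropy corresponds to higher information, and vice versa.
assert shannon_entropy(peaked) < shannon_entropy(uniform)
```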
So, addressing Lane’s book: heat energy is not information. Increasing the heat energy decreases the information. The same amount of heat energy at a higher temperature carries more information than the same heat at a lower temperature. For life to extract information from a heat source, it must be a heat engine, taking in high-temperature energy and excreting low-temperature energy. Heat engines accomplish this feat by incorporating carefully constructed linkages and machinery to prevent the work from vanishing into the turbulent, diffusive, entropic working fluid. If the machinery has to be made out of the same material as the working fluid, then it is like saying “a high-information-density state can process heat energy to extract information.”
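The heat-engine bookkeeping can be sketched as a second-law check. This is my own minimal illustration with made-up numbers: an engine draws heat Q_h at temperature T_h, rejects Q_c at T_c, and extracts the difference as work; the second law requires the entropy dumped into the cold sink (Q_c/T_c) to at least match the entropy drawn from the hot source (Q_h/T_h).

```python
def engine_ok(Q_h, T_h, Q_c, T_c):
    """Second-law check for a heat engine: the entropy rejected to the
    cold sink must at least match the entropy drawn from the hot source."""
    return Q_c / T_c >= Q_h / T_h

# Made-up numbers: draw 1000 J at 600 K, reject 600 J at 300 K.
Q_h, T_h, Q_c, T_c = 1000.0, 600.0, 600.0, 300.0
work = Q_h - Q_c   # 400 J of ordered (low-entropy) work extracted
assert engine_ok(Q_h, T_h, Q_c, T_c)

# An "engine" that rejects too little heat would destroy entropy overall,
# which the second law forbids.
assert not engine_ok(1000.0, 600.0, 100.0, 300.0)
```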
Well, doesn’t this process produce more information than at the beginning? Wouldn’t this allow for an infinitely great amount of information inside the system, the “Borg” of cellular automata?
No, because entropy is a fluid property, and it wants to diffuse away. The more information that gets concentrated, the greater the gradients, and the more work must be expended maintaining those gradients. Therefore, for a fixed energy flow, there is a maximum entropy gradient, a maximum entropy density, at which all the effort is expended just staying alive. For heat engines like your car, this limit is set by the maximum temperature the engine can run at before it melts, and Sadi Carnot used the formula (Ti – Tf)/Ti to describe this “Carnot efficiency.” For cells, we’ll call this maximum achievable entropy gradient “the life efficiency,” and I think it is fair to argue that the more efficient a heat engine, the more information it must contain. (Have you compared the number of sensor wires under the hood of a late-model car to, say, a VW Bug engine?)
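The Carnot formula quoted above is easy to check numerically. A minimal sketch with illustrative temperatures (Ti is the hot-side temperature, Tf the cold-side): the hotter the engine can run, the larger the fraction of heat it can turn into work.

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum fraction of heat convertible into work: (Ti - Tf) / Ti.
    Temperatures in kelvin."""
    return (T_hot - T_cold) / T_hot

# Illustrative numbers: a 600 K engine exhausting to a 300 K environment
# can convert at most half of its heat intake into work.
assert abs(carnot_efficiency(600.0, 300.0) - 0.5) < 1e-12

# Running hotter (before the machinery melts) raises the ceiling.
assert carnot_efficiency(900.0, 300.0) > carnot_efficiency(600.0, 300.0)
```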
But one thing it cannot do, cannot allow, is for “a low-information state to process heat energy and spontaneously become a high-information state that processes heat energy.” This is the proverbial “perpetual motion machine,” now operating on entropy: a “perpetual information machine.” For just as Clausius and Kelvin showed that heat engines producing too much work (too efficient) could be linked together to produce a perpetual motion machine, so also fluid “life” machines producing too much information could be linked together to produce perpetual motion. This proves that no such machine is possible, or else biology would long ago have become a perpetual motion machine that never eats.
So why do these erroneous views, in which information arises spontaneously from energy gradients, keep propagating?
Because they fudge the bookkeeping. Entropy is notoriously hard to measure, and so for example, they might underestimate the information in the cellular machinery and think that a temperature gradient has more than enough entropy to create the cellular machinery. Or as Granville Sewell argues, they have an open system that allows entropy to arrive and disappear without proper accounting, so that information accumulates inside the cell, which they then misattribute to temperature gradients. But if any of these papers were even within hailing distance of being correct, then perpetual motion would be commonplace by now, and you and I would spend our days wondering what to do with all our free time.
Other readers’ thoughts?
See also: Granville Sewell’s important contribution to physics: Entropy-X
Follow UD News at Twitter!