Uncommon Descent Serving The Intelligent Design Community

And once more: Life can arise naturally from chemistry!


Yet it isn’t happening, and we have no idea how it happened even once…

From science writer Michael Gross at Cell:

Rapid progress in several research fields relating to the origin of life brings us closer to the point where it may become feasible to recreate coherent and plausible models of early life in the laboratory. (paywall)

It’s a survey article. Along the way, it notes:

“One of the main new aspects of origins research is the growing effort to connect chemistry to geology,” Jack Szostak notes. “Finding reasonable geological settings for the origin of life is a critical aspect of understanding the whole pathway. We’ve moved beyond thinking that life emerged from the oceans or at deep sea hydrothermal vents. New ideas for surface environments that could allow organic materials to accumulate over time, so that prebiotic chemistry could happen in very concentrated solutions, are a big advance.”

And it concludes:

We can conclude from all of this that the emergence of life in a universe that provides a suitable set of conditions, like ours does, is an entirely natural process, on our own planet and on many others, and does not require the postulate of a miracle birth. (Current Biology 26, R1247–R1249, December 19, 2016) More.

Okay. “Moved beyond” is a way of saying that hydrothermal vents are not the answer after all.

Coherent and plausible models in the lab are not the same thing as knowing what happened. And the more of them there are, the more necessary it would become to explain why life isn’t coming into existence all over the place all the time.

And at times, we are not even sure what we mean. Do some viruses meet the criterion of being alive?

A friend writes to ask: “Imagine how it would sound if a study on any other topic had the words ‘does not require the postulate of a miracle’ in the conclusion. Somehow they seem to think that it is perfectly appropriate and natural when discussing the origin of life.”

Aw, let’s be generous, it’s New Year’s Eve: When people really haven’t got very far in a discipline for the better part of two centuries, they tend to think in terms of zero or miracle. That’s just what they do.

Another friend writes to say that the thesis seems to be: given enough time, anything can happen. If so, the proposition does not really depend on evidence. In 1954, Harvard biochemist George Wald wrote,

Time is in fact the hero of the plot. The time with which we have to deal is of the order of two billion years. What we regard as impossible on the basis of human experience is meaningless here. Given so much time, the “impossible” becomes possible, the possible probable, and the probable virtually certain. One has only to wait: time itself performs the miracles. (George Wald, “The Origin of Life,” Scientific American, August 1954, p. 48; via TalkOrigins)

Really? Physicist Rob Sheldon has doubts:

In physics, we discuss reversible and irreversible reactions. If entropy (or information) is unchanged, then the system is reversible. If entropy increases (loss of information), then the reaction cannot be reversed. Outside of Darwin’s theory of evolution, there are no irreversible reactions in which entropy decreases (information is gained), because that would enable a perpetual motion machine.

Thus time is of no benefit for evolution, since a perpetual motion machine is no more possible if it runs slowly than if it runs quickly. And while errors may persist in biology because it may be too complicated to be sure of the entropy, the same cannot be said of chemistry. So the biggest boondoggle of all is attributing to precise and exact chemistry the magical anti-entropy properties of inexact and imprecise biology simply because one is a materialist reductionist who thinks life is a substance. I’m not picking on chemists or biologists, because I’ve even heard physicists say that evolution causes the multiverse to spawn us. Evidently this anti-entropy magic is just too powerful to keep it bottled up in biology alone, the world needs more perpetual motion salesmen, they spontaneously generate.

Oh well, happy New Year.

See also: Researchers: Bacteria fossils predate the origin of oxygen

Rob Sheldon: Why the sulfur-based life forms never amounted to much

Welcome to “RNA world,” the five-star hotel of origin-of-life theories

and

What we know and don’t know about the origin of life

Follow UD News at Twitter!

Comments
Gordon Davisson, thanks for your comment. You selected one example there, but it seems you’re saying that “pretty much everything” BA77 posts “he’s significantly wrong about.” That’s a strong claim, I’d say. But in any case, he offered a number of arguments starting at post 14. I’ll select just a few:

... visible light is incredibly fine-tuned for life to exist on earth. “These specific frequencies of light (that enable plants to manufacture food and astronomers to observe the cosmos) represent less than 1 trillionth of a trillionth (10^-24) of the universe’s entire range of electromagnetic emissions.” Moreover, the light coming from the sun must be of the ‘right color’.

To say that photosynthesis defies Darwinian explanation is to make a dramatic understatement: Evolutionary biology: Out of thin air – John F. Allen & William Martin. The measure of the problem is here: “Oxygenic photosynthesis involves about 100 proteins that are highly ordered within the photosynthetic membranes of the cell.” http://www.nature.com/nature/j.....5610a.html

Of related note: ATP synthase is ‘unexpectedly’ found to be, thermodynamically, 100% efficient ...

It goes on, so I’d be interested in learning how this information is significantly wrong.

Silver Asiatic, January 3, 2017, 2:50 AM (PDT)
GD, Happy new year back to you. I must observe: nope, there is the issue of the underlying information content tied to functional organisation, and the issue of work from lucky noise. I argue that on this, the “oh, there is nothing to see there” argument fails.

Water freezing has long since been addressed in, say, Thaxton et al., TMLO, and before that in Orgel, 1973: ORDER is not the same as ORGANISATION. Here, crystal order is imposed by and large through the peculiarities of the polar water molecule -- and the roots of that in the core laws and framing of the cosmos are themselves of interest to design thinkers and theorists at another level -- but the point is, organisation requires assembly per what Wicken 1979 identified as a wiring plan; it is aperiodic and functionally constrained, and also consistently seen to be not driven by blind chance and/or mechanical forces. The issue on the latter, tied to the 2nd law, is why.

The point of the statistical form of the 2nd law [the relevant underlying root issue, deeply embedded in our understanding of it for over 100 years] is that it highlights a statistical search challenge to the point of unobservability. Once we see FSCO/I beyond 500 - 1,000 bits in accord with a functionally specific wiring plan, it is maximally implausible -- notice I specifically do not argue “improbable” -- that a sol-system or observed-cosmos scale blind chance and mechanical necessity search of the relevant configuration space will find a relevant island of function. And, empirically, such FSCO/I is routinely seen to originate, to the trillions of cases. Consistently, it is by design, that is, intelligently directed configuration. Per Newton’s vera causa principle, we are entitled to infer that this is the reliable cause of said phenomenon.

Yes, this has momentous consequences when we contemplate the phenomena of code [thus, TEXT and LANGUAGE], algorithms and associated molecular nanotech execution machinery in the living cell. Yes, it has further sobering consequences as we see the pattern of major body plans up to our own, involving 10 - 100 mn+ bits of further genetic information, per the genome-scale patterns. Yes, it is likewise when we contemplate the pattern of deeply isolated small-number fold domains in AA sequence space. And more.

But that is the challenge: we face complex functionally specific organisation that is information-rich and not at all the same as highly compressible order driven by mechanical forces, as with crystallisation. This phenomenon cries out for its own proper explanation, not for conflation with a different phenomenon. The only actually observed adequate cause is design, and the statistical underpinnings of the 2nd law indicate a very good reason for that observation. There is something big to see and to pause over here, and it is inextricably tied to the 2nd law and its underlying statistical frame of thought -- once we see and accept the patent distinction between order and organisation. KF

PS: And laser experiments etc. come about by: ______ ?

kairosfocus, January 3, 2017, 12:17 AM (PDT)
kairosfocus, happy new year, and thanks for your reply! I agree that heat flowing in at higher temperature and out at lower temperature is a necessary but not a sufficient condition for a heat engine (and a wide variety of other things). But it is sufficient for one important thing: it removes the second-law constraints against entropy decrease and/or free energy increase. Essentially, it doesn’t mean anything interesting will happen, but it does mean that the second law doesn’t forbid interesting things from happening. Or, to put it another way, if something (like an increase in FSCO/I) is forbidden on Earth, it’s forbidden by something other than the second law of thermodynamics.

But I’m getting a bit far from my original point. What’s your take on my objections to Rob Sheldon’s claims? Specifically:

- Rob said that entropy increases correspond to increases in information, and entropy decreases correspond to decreases in information. I said that this is only true if you adopt an unusual and not-relevant-to-ID definition of “information”. (BTW, this “unusual definition” is essentially the “missing information” view you described, but as far as I can see not closely related to CSI, FSCO/I, etc.)

- Rob said that evolution is unique in being an irreversible reaction in which entropy decreases; I said that such reactions are entirely common and unremarkable (although they all involve processes that couple to equal-or-larger entropy increases somewhere else).

So what’s your opinion on these two specific points? Do you think Rob is out to lunch, or that I’m out to lunch, or that we’re both off the mark? (And if you think I’m wrong on either point, I have a followup question/challenge for you: in your understanding, when water freezes, how does its entropy change? Does that correspond to a change in information?)

Gordon Davisson, January 2, 2017, 10:23 PM (PDT)
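[An editorial footnote on the closing question above, not part of the thread: the entropy change when water freezes is standard textbook thermodynamics, and can be sketched numerically. The figures below are the usual handbook values for water's latent heat of fusion and melting point.]

```python
# Entropy change when one mole of water freezes at its melting point.
# Handbook values: enthalpy of fusion ~6010 J/mol, T_m = 273.15 K.
dH_fus = 6010.0      # J/mol, magnitude of heat released on freezing
T_m = 273.15         # K, melting point of water

dS_water = -dH_fus / T_m          # entropy change of the water itself
dS_surroundings = dH_fus / T_m    # entropy gained by surroundings at T_m
                                  # (net increase requires surroundings below T_m)
print(f"{dS_water:.1f} J/(mol.K)")   # prints -22.0: the water's entropy drops,
                                     # compensated elsewhere
```

So the freezing water's entropy does decrease (by about 22 J per mole per kelvin), which is unremarkable precisely because the released latent heat raises the entropy of the colder surroundings by at least as much.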
Silver Asiatic, I wouldn’t be so impressed with ba77’s research unless you’ve checked it out for yourself; pretty much everything of his that I’ve checked out (or know about independently) he’s significantly wrong about. Take the quantum Zeno effect as an example. He says, “How in blue blazes can conscious observation put a freeze on entropic decay, unless consciousness was and is more foundational to reality than the 1 in 10^10^123 initial entropy is?” But if you actually look at the experiment that he cites, it didn’t involve conscious observation at all. It used a laser beam to put the freeze on decay. Far from showing something special about consciousness, it shows that (at least in this respect) a conscious observer can be replaced by a beam of light.

Gordon Davisson, January 2, 2017, 9:49 PM (PDT)
BA77, posts 14-19: thank you for bringing all of that research together into a fascinating and comprehensive argument.

Silver Asiatic, January 1, 2017, 5:20 PM (PDT)
As well it is interesting to note the primary source for increasing entropy in the universe:
Entropy of the Universe – Hugh Ross – May 2010   Excerpt: Egan and Lineweaver found that supermassive black holes are the largest contributor to the observable universe’s entropy. They showed that these supermassive black holes contribute about 30 times more entropy than what the previous research teams estimated. http://www.reasons.org/entropy-universe   
Kip Thorne gets across the destructive power of black holes in a fairly dramatic fashion:
"Einstein's equation predicts that, as the astronaut reaches the singularity (of the black-hole), the tidal forces grow infinitely strong, and their chaotic oscillations become infinitely rapid. The astronaut dies and the atoms which his body is made become infinitely and chaotically distorted and mixed-and then, at the moment when everything becomes infinite (the tidal strengths, the oscillation frequencies, the distortions, and the mixing), spacetime ceases to exist." Kip S. Thorne - "Black Holes and Time Warps: Einstein's Outrageous Legacy" pg. 476
Moreover, in the Shroud of Turin, in Jesus Christ's resurrection from the dead, we have evidence of God overcoming the entropic forces of Gravity:
A Quantum Hologram of Christ's Resurrection? by Chuck Missler Excerpt: “You can read the science of the Shroud, such as total lack of gravity, lack of entropy (without gravitational collapse), no time, no space—it conforms to no known law of physics.” The phenomenon of the image brings us to a true event horizon, a moment when all of the laws of physics change drastically. Dame Piczek created a one-fourth size sculpture of the man in the Shroud. When viewed from the side, it appears as if the man is suspended in mid air (see graphic, below), indicating that the image defies previously accepted science. The phenomenon of the image brings us to a true event horizon, a moment when all of the laws of physics change drastically. http://www.khouse.org/articles/2008/847 THE EVENT HORIZON (Space-Time Singularity) OF THE SHROUD OF TURIN. - Isabel Piczek - Particle Physicist Excerpt: We have stated before that the images on the Shroud firmly indicate the total absence of Gravity. Yet they also firmly indicate the presence of the Event Horizon. These two seemingly contradict each other and they necessitate the past presence of something more powerful than Gravity that had the capacity to solve the above paradox. http://shroud3d.com/findings/isabel-piczek-image-formation Turin shroud – (Particle Physicist explains event horizon) – video https://www.youtube.com/watch?v=HHVUGK6UFK8 The Resurrection of Jesus Christ as the 'Theory of Everything' (Entropic Concerns) https://www.youtube.com/watch?v=rqv4wVP_Fkc&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5&index=2
Thus, contrary to the claims of Darwinists that entropy presents no obstacle for Darwinian evolution, the fact of the matter is that not only is entropy not compatible with life, but entropy is found to be the primary source for death and destruction in this universe. In fact, Jesus Christ, in his defiance of gravity at his resurrection from the dead, apparently had to deal directly with the deadly force of entropy in his resurrection from the dead. Supplemental videos:
Special and General Relativity compared to Heavenly and Hellish Near Death Experiences https://www.youtube.com/watch?v=TbKELVHcvSI&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5&index=1 Resurrection of Jesus Christ as the Theory of Everything - Centrality Concerns https://www.youtube.com/watch?v=8uHST2uFPQY&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5&index=4
Verse:
Colossians 1:15-20 The Son is the image of the invisible God, the firstborn over all creation. For in him all things were created: things in heaven and on earth, visible and invisible, whether thrones or powers or rulers or authorities; all things have been created through him and for him. He is before all things, and in him all things hold together. And he is the head of the body, the church; he is the beginning and the firstborn from among the dead, so that in everything he might have the supremacy. For God was pleased to have all his fullness dwell in him, and through him to reconcile to himself all things, whether things on earth or things in heaven, by making peace through his blood, shed on the cross.
bornagain77, January 1, 2017, 6:06 AM (PDT)
Moreover, although Darwinists have many times appealed to the 'random thermodynamic jostling' in cells to try to say that life is not designed, (i.e. Carl Zimmer's “barely constrained randomness” remark for one example), the fact of the matter is that if anything ever gave evidence for the supernatural design of the universe it is the initial 1 in 10^10^123 entropy of the universe.
“The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature of the Big Bang (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the ‘source’ of the Second Law (Entropy).” Roger Penrose - The Physics of the Small and Large: What is the Bridge Between Them? How special was the big bang? – Roger Penrose Excerpt: This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123. (from The Emperor’s New Mind, Penrose, pp. 339-345, 1989)
This number, 1 in 10^10^123, is so large that, if it were written down in ordinary notation, it could not be written down even if you used every single particle of the universe to denote a decimal place. Moreover, the probability in question, 1 in 10^10^123, is so small that it drives atheistic materialism, via the Boltzmann Brain problem, into catastrophic epistemological failure:
Multiverse and the Design Argument - William Lane Craig Excerpt: Roger Penrose of Oxford University has calculated that the odds of our universe’s low entropy condition obtaining by chance alone are on the order of 1 in 10^10(123), an inconceivable number. If our universe were but one member of a multiverse of randomly ordered worlds, then it is vastly more probable that we should be observing a much smaller universe. For example, the odds of our solar system’s being formed instantly by the random collision of particles is about 1 in 10^10(60), a vast number, but inconceivably smaller than 1 in 10^10(123). (Penrose calls it “utter chicken feed” by comparison [The Road to Reality (Knopf, 2005), pp. 762-5]). Or again, if our universe is but one member of a multiverse, then we ought to be observing highly extraordinary events, like horses’ popping into and out of existence by random collisions, or perpetual motion machines, since these are vastly more probable than all of nature’s constants and quantities’ falling by chance into the virtually infinitesimal life-permitting range. Observable universes like those strange worlds are simply much more plenteous in the ensemble of universes than worlds like ours and, therefore, ought to be observed by us if the universe were but a random member of a multiverse of worlds. Since we do not have such observations, that fact strongly disconfirms the multiverse hypothesis. On naturalism, at least, it is therefore highly probable that there is no multiverse. — Penrose puts it bluntly “these world ensemble hypothesis are worse than useless in explaining the anthropic fine-tuning of the universe”. http://www.reasonablefaith.org/multiverse-and-the-design-argument Does a Multiverse Explain the Fine Tuning of the Universe? - Dr. Craig (observer selection effect vs. Boltzmann Brains) - video https://www.youtube.com/watch?v=pb9aXduPfuA
It is also important to note the broad explanatory scope of entropy for the universe:
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
In fact, entropy is the primary reason why our material, temporal, bodies grow old and die,,,
Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
Yet, despite entropy's broad explanatory scope for the universe, in the quantum zeno effect we find that “an unstable particle, if observed continuously, will never decay.”
Quantum Zeno Effect “The quantum Zeno effect is,, an unstable particle, if observed continuously, will never decay.” http://en.wikipedia.org/wiki/Quantum_Zeno_effect Interaction-free measurements by quantum Zeno stabilization of ultracold atoms – 14 April 2015 Excerpt: In our experiments, we employ an ultracold gas in an unstable spin configuration, which can undergo a rapid decay. The object—realized by a laser beam—prevents this decay because of the indirect quantum Zeno effect and thus, its presence can be detected without interacting with a single atom. http://www.nature.com/ncomms/2015/150414/ncomms7811/full/ncomms7811.html?WT.ec_id=NCOMMS-20150415 Quantum Zeno effect “It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics. He then obtained a masters in theoretical mathematics from the University of Maryland. After graduating from law school, magna cum laude, he became a prominent attorney.
This is just fascinating! How in blue blazes can conscious observation put a freeze on entropic decay, unless consciousness was and is more foundational to reality than the 1 in 10^10^123 initial entropy is? This finding rules out any possibility that my consciousness was and is the result of the thermodynamic processes of the universe and of my brain. In fact, I hold it to be proof that consciousness must precede material reality just as the Christian Theist presupposes. Perhaps the most compelling piece of evidence that there must be immaterial information constraining life to be so far out of thermodynamic equilibrium is to note that the thermodynamic processes of the universe are, for the most part, kept in check until the moment of death at which time the entropic forces kick in full force and, relatively quickly, disintegrate the approx. billion-trillion protein molecules of a single human body into dust. Stephen Talbott elucidates that 'fateful transition' as such:
The Unbearable Wholeness of Beings - Stephen L. Talbott - 2010 Excerpt: Virtually the same collection of molecules exists in the canine cells during the moments immediately before and after death. But after the fateful transition no one will any longer think of genes as being regulated, nor will anyone refer to normal or proper chromosome functioning. No molecules will be said to guide other molecules to specific targets, and no molecules will be carrying signals, which is just as well because there will be no structures recognizing signals. Code, information, and communication, in their biological sense, will have disappeared from the scientist’s vocabulary. ,,, the question, rather, is why things don’t fall completely apart — as they do, in fact, at the moment of death. What power holds off that moment — precisely for a lifetime, and not a moment longer? Despite the countless processes going on in the cell, and despite the fact that each process might be expected to “go its own way” according to the myriad factors impinging on it from all directions, the actual result is quite different. Rather than becoming progressively disordered in their mutual relations (as indeed happens after death, when the whole dissolves into separate fragments), the processes hold together in a larger unity. http://www.thenewatlantis.com/publications/the-unbearable-wholeness-of-beings Scientific evidence that we do indeed have an eternal soul (Elaboration on Talbott's question “What power holds off that moment — precisely for a lifetime, and not a moment longer?”)– video 2016 https://youtu.be/h2P45Obl4lQ
bornagain77, January 1, 2017, 6:06 AM (PDT)
In the following paper, Dr Andy C. McIntosh, who is professor of thermodynamics and combustion theory at the University of Leeds, holds that it is non-material information that constrains biological life to be so far out of thermodynamic equilibrium. Moreover, Dr. McIntosh holds that regarding information as independent of energy and matter ‘resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions’.
Information and Thermodynamics in Living Systems – Andy C. McIntosh – 2013 Excerpt: ,,, information is in fact non-material and that the coded information systems (such as, but not restricted to the coding of DNA in all living systems) is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which despite the efforts from alternative paradigms has not given a satisfactory explanation of the way information in systems operates.,,, http://www.worldscientific.com/doi/abs/10.1142/9789814508728_0008
And in support of Dr. McIntosh’s contention that it must be non-material information which constrains biological life to be so far out of thermodynamic equilibrium, information has now been experimentally shown to have a ‘thermodynamic content’:
Demonic device converts information to energy – 2010 Excerpt: “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform

Maxwell’s demon demonstration (knowledge of a particle’s position) turns information into energy – November 2010 Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the “Maxwell demon” thought experiment devised in 1867.,,, In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a “spiral-staircase-like” potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html

Information: From Maxwell’s demon to Landauer’s eraser – Lutz and Ciliberto – Oct. 25, 2015 – Physics Today Excerpt: The above examples of gedanken-turned-real experiments provide a firm empirical foundation for the physics of information and tangible evidence of the intimate connection between information and energy. They have been followed by additional experiments and simulations along similar lines. (See, for example, Physics Today, August 2014, page 60.) Collectively, that body of experimental work further demonstrates the equivalence of information and thermodynamic entropies at thermal equilibrium.,,, (2008) Sagawa and Ueda’s (theoretical) result extends the second law to explicitly incorporate information; it shows that information, entropy, and energy should be treated on equal footings. http://www.johnboccio.com/research/quantum/notes/Information.pdf

J. Parrondo, J. Horowitz, and T. Sagawa. Thermodynamics of information. Nature Physics, 11:131–139, 2015.
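[An editorial aside for scale: the “thermodynamic content” of information probed in these experiments is set by the Landauer bound of kT ln 2 joules per bit, which is easy to sketch numerically; room temperature is assumed below.]

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (CODATA exact value)
T = 300.0            # assumed room temperature, K

# Landauer bound: minimum heat dissipated to erase one bit of
# information, the same energy scale at play in the Maxwell-demon
# experiments quoted above.
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.2e} J per bit")   # prints 2.87e-21 J per bit
```

About 3 zeptojoules per bit at room temperature: tiny per bit, but nonzero, which is exactly why these nanoscale experiments could measure the conversion at all.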
It is also interesting to note just how much information is involved in keeping life so far out of thermodynamic equilibrium with the rest of the environment:
Biophysics – Information theory. Relation between information and entropy: - Setlow-Pollard, Ed. Addison Wesley Excerpt: Linschitz gave the figure 9.3 x 10^-12 cal/deg or 9.3 x 10^-12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz’s deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures. http://www.astroscu.unam.mx/~angel/tsb/molecular.htm

“a one-celled bacterium, e. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information in science jargon, this would be the same as 10^12 bits of information. In comparison, the total writings from classical Greek Civilization is only 10^9 bits, and the largest libraries in the world – The British Museum, Oxford Bodleian Library, New York Public Library, Harvard Widener Library, and the Moscow Lenin Library – have about 10 million volumes or 10^12 bits.” – R. C. Wysong
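[An editorial check on the Setlow–Pollard arithmetic: note that the cell-entropy figure must be 9.3 x 10^-12 cal/deg (not 10^12, a likely lost superscript minus sign in transcription) for the quoted 4 x 10^12-bit result to follow from H = S/(k ln 2).]

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K

# Linschitz's entropy figure for a bacterial cell (via Setlow-Pollard):
# 9.3e-12 cal/deg, converted to J/K at ~4.2 J/cal.
S = 9.3e-12 * 4.2           # J/K

# H = S / (k ln 2) converts thermodynamic entropy to bits.
H = S / (k_B * math.log(2))
print(f"{H:.1e} bits")      # ~4e12 bits, matching the quoted figure
```

With the corrected exponent the two numbers in the excerpt are mutually consistent; with 10^+12 cal/deg they would disagree by 48 orders of magnitude.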
Of related note to immaterial information having a ‘thermodynamic content’, classical digital information was found to be a subset of ‘non-local’, (i.e. beyond space and time), quantum entanglement/information by the following method which removed heat from a computer by the deletion of data:
Quantum knowledge cools computers: New understanding of entropy – June 2011 Excerpt: No heat, even a cooling effect; In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.” http://www.sciencedaily.com/releases/2011/06/110601134300.htm Scientists show how to erase information without using energy – January 2011 Excerpt: Until now, scientists have thought that the process of erasing information requires energy. But a new study shows that, theoretically, information can be erased without using any energy at all.,,, “Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it (information) is physical has a broader context than that.”, Vaccaro explained. http://www.physorg.com/news/2011-01-scientists-erase-energy.html
The preceding paper was experimentally verified:
What is Information - video https://youtu.be/2AvIOzVJMCM New Scientist astounds: Information is physical – May 13, 2016 Excerpt: Recently came the most startling demonstration yet: a tiny machine powered purely by information, which chilled metal through the power of its knowledge. This seemingly magical device could put us on the road to new, more efficient nanoscale machines, a better understanding of the workings of life, and a more complete picture of perhaps our most fundamental theory of the physical world. https://uncommondescent.com/news/new-scientist-astounds-information-is-physical/
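The energy scale these erasure experiments probe is set by Landauer's principle: erasing one bit must dissipate at least k_B·T·ln 2 of heat. A minimal Python sketch of that bound (editor's illustration; the 300 K room temperature is an assumption, not a figure from the articles above):

```python
import math

# Landauer limit: erasing one bit dissipates at least k_B * T * ln(2) of heat.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # assumed room temperature, K

e_min_per_bit = k_B * T * math.log(2)
print(f"Minimum erasure cost per bit at {T:.0f} K: {e_min_per_bit:.3e} J")
# Roughly 2.87e-21 J; ordinary electronics dissipate many orders of magnitude more.
```

The quantum-entanglement results quoted above concern how this classical bound can be sidestepped or even reversed when the memory is entangled with the data; the sketch only evaluates the classical limit itself.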
bornagain77
January 1, 2017 06:04 AM PDT
Of related note: ATP synthase was 'unexpectedly' found to be, thermodynamically, 100% efficient:
Your Motor/Generators Are 100% Efficient – October 2011 Excerpt: ATP synthase astounds again. The molecular machine that generates almost all the ATP (molecular “energy pellets”) for all life was examined by Japanese scientists for its thermodynamic efficiency. By applying and measuring load on the top part that synthesizes ATP, they were able to determine that one cannot do better at getting work out of a motor,,, The article was edited by noted Harvard expert on the bacterial flagellum, Howard Berg. http://crev.info/content/111014-your_motor_generators Thermodynamic efficiency and mechanochemical coupling of F1-ATPase - 2011 Excerpt: F1-ATPase is a nanosized biological energy transducer working as part of FoF1-ATP synthase. Its rotary machinery transduces energy between chemical free energy and mechanical work and plays a central role in the cellular energy transduction by synthesizing most ATP in virtually all organisms.,, Our results suggested a 100% free-energy transduction efficiency and a tight mechanochemical coupling of F1-ATPase. http://www.pnas.org/content/early/2011/10/12/1106787108.short?rss=1
As well, photosynthesis itself was 'unexpectedly' found to be astonishingly efficient. Moreover, photosynthesis achieves this astonishing efficiency by overcoming thermodynamic noise by way of 'quantum coherence':
Unlocking nature's quantum engineering for efficient solar energy - January 7, 2013 Excerpt: Certain biological systems living in low light environments have unique protein structures for photosynthesis that use quantum dynamics to convert 100% of absorbed light into electrical charge,,, "Some of the key issues in current solar cell technologies appear to have been elegantly and rigorously solved by the molecular architecture of these PPCs – namely the rapid, lossless transfer of excitons to reaction centres.",,, These biological systems can direct a quantum process, in this case energy transport, in astoundingly subtle and controlled ways – showing remarkable resistance to the aggressive, random background noise of biology and extreme environments. "This new understanding of how to maintain coherence in excitons, and even regenerate it through molecular vibrations, provides a fascinating glimpse into the intricate design solutions – seemingly including quantum engineering – ,,, and which could provide the inspiration for new types of room temperature quantum devices." http://phys.org/news/2013-01-nature-quantum-efficient-solar-energy.html Uncovering Quantum Secret in Photosynthesis - June 20, 2013 Excerpt: Photosynthetic organisms, such as plants and some bacteria, have mastered this process: In less than a couple of trillionths of a second, 95 percent of the sunlight they absorb is whisked away to drive the metabolic reactions that provide them with energy. The efficiency of photovoltaic cells currently on the market is around 20 percent.,,, Van Hulst and his group have evaluated the energy transport pathways of separate individual but chemically identical, antenna proteins, and have shown that each protein uses a distinct pathway. The most surprising discovery was that the transport paths within single proteins can vary over time due to changes in the environmental conditions, apparently adapting for optimal efficiency. 
"These results show that coherence, a genuine quantum effect of superposition of states, is responsible for maintaining high levels of transport efficiency in biological systems, even while they adapt their energy transport pathways due to environmental influences" says van Hulst. http://www.sciencedaily.com/releases/2013/06/130620142932.htm
Moreover, protein folding is not achieved by random thermodynamic jostling but by 'quantum transition':
Physicists Discover Quantum Law of Protein Folding – February 22, 2011 Quantum mechanics finally explains why protein folding depends on temperature in such a strange way. Excerpt: First, a little background on protein folding. Proteins are long chains of amino acids that become biologically active only when they fold into specific, highly complex shapes. The puzzle is how proteins do this so quickly when they have so many possible configurations to choose from. To put this in perspective, a relatively small protein of only 100 amino acids can take some 10^100 different configurations. If it tried these shapes at the rate of 100 billion a second, it would take longer than the age of the universe to find the correct one. Just how these molecules do the job in nanoseconds, nobody knows.,,, Today, Luo and Lo say these curves can be easily explained if the process of folding is a quantum affair. By conventional thinking, a chain of amino acids can only change from one shape to another by mechanically passing though various shapes in between. But Luo and Lo say that if this process were a quantum one, the shape could change by quantum transition, meaning that the protein could ‘jump’ from one shape to another without necessarily forming the shapes in between.,,, Their astonishing result is that this quantum transition model fits the folding curves of 15 different proteins and even explains the difference in folding and unfolding rates of the same proteins. That's a significant breakthrough. Luo and Lo's equations amount to the first universal laws of protein folding. That’s the equivalent in biology to something like the thermodynamic laws in physics. http://www.technologyreview.com/view/423087/physicists-discover-quantum-law-of-protein/
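The combinatorial arithmetic quoted in the excerpt above can be checked directly. A short Python sketch (editor's illustration; the 10^100 configurations and "100 billion a second" sampling rate are the figures quoted in the article, and the age of the universe is taken as ~13.8 billion years):

```python
# Levinthal-style estimate from the excerpt: a ~100-residue protein with
# ~10^100 candidate shapes, sampled at 10^11 shapes per second.
configurations = 10**100       # figure quoted in the excerpt
rate_per_second = 10**11       # "100 billion a second"
age_of_universe_s = 4.35e17    # ~13.8 billion years in seconds

seconds_needed = configurations / rate_per_second
times_age_of_universe = seconds_needed / age_of_universe_s

print(f"Exhaustive search would take ~{seconds_needed:.1e} s")
print(f"That is ~{times_age_of_universe:.1e} times the age of the universe")
```

The point of the arithmetic is only that exhaustive sequential search is ruled out; the quoted paper's proposal (quantum transitions between shapes) is one candidate explanation for how folding nonetheless completes in nanoseconds.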
Besides photosynthesis and protein folding unexpectedly not being subject to 'thermodynamic jostling', in the following paper the absence of 'random' collisions in a crowded cell came as a 'counterintuitive surprise' to researchers:
Proteins put up with the roar of the crowd - June 23, 2016 Excerpt: It gets mighty crowded around your DNA, but don't worry: According to Rice University researchers, your proteins are nimble enough to find what they need. Rice theoretical scientists studying the mechanisms of protein-DNA interactions in live cells showed that crowding in cells doesn't hamper protein binding as much as they thought it did.,,, If DNA can be likened to a library, it surely is a busy one. Molecules roam everywhere, floating in the cytoplasm and sticking to the tightly wound double helix. "People know that almost 90 percent of DNA is covered with proteins, such as polymerases, nucleosomes that compact two meters into one micron, and other protein molecules," Kolomeisky said.,,, That makes it seem that proteins sliding along the strand would have a tough time binding, and it's possible they sometimes get blocked. But the Rice team's theory and simulations indicated that crowding agents usually move just as rapidly, sprinting out of the way. "If they move at the same speed, the molecules don't bother each other," Kolomeisky said. "Even if they're covering a region, the blockers move away quickly so your protein can bind." In previous research, the team determined that stationary obstacles sometimes help quicken a protein's search for its target by limiting options. This time, the researchers sought to define how crowding both along DNA and in the cytoplasm influenced the process. "We may think everything's fixed and frozen in cells, but it's not," Kolomeisky said. "Everything is moving.",,, Floating proteins appear to find their targets quickly as well. "This was a surprise," he said. "It's counterintuitive, because one would think collisions between a protein and other molecules on DNA would slow it down. But the system is so dynamic (and so well designed?), it doesn't appear to be an issue." http://phys.org/news/2016-06-proteins-roar-crowd.html
In fact, in the following video at the 2:30 minute mark, Jim Al-Khalili states that, in regards to quantum mechanics, "Biologists, on the other hand, have got off lightly in my view":
",,and Physicists and Chemists have had a long time to try and get used to it (Quantum Mechanics). Biologists, on the other hand, have got off lightly in my view. They are very happy with their balls and sticks models of molecules. The balls are the atoms. The sticks are the bonds between the atoms. And when they can't build them physically in the lab, nowadays they have very powerful computers that will simulate a huge molecule.,, It doesn't really require much in the way of quantum mechanics to explain it." Jim Al-Khalili – Quantum biology – video https://www.youtube.com/watch?v=zOzCkeTPR3Q
At the 6:52 minute mark of the video, Jim Al-Khalili goes on to state that life has a certain order "that's very different from the random thermodynamic jostling of atoms and molecules in inanimate matter of the same complexity. In fact, living matter seems to behave in its order and its structure just like inanimate matter cooled down to near absolute zero":
“To paraphrase, (Erwin Schrödinger in his book “What Is Life”), he says at the molecular level living organisms have a certain order. A structure to them that’s very different from the random thermodynamic jostling of atoms and molecules in inanimate matter of the same complexity. In fact, living matter seems to behave in its order and its structure just like inanimate matter cooled down to near absolute zero. Where quantum effects play a very important role. There is something special about the structure, about the order, inside a living cell. So Schrödinger speculated that maybe quantum mechanics plays a role in life”. Jim Al-Khalili – Quantum biology – video https://www.youtube.com/watch?v=zOzCkeTPR3Q
And indeed, in regards to quantum biology, there is much evidence confirming the fact that current biologists working under the reductive materialistic framework of Darwinian evolution are not even using the correct theoretical framework to properly understand life in the first place:
Molecular Biology - 19th Century Materialism meets 21st Century Quantum Mechanics - video https://www.youtube.com/watch?v=rCs3WXHqOv8&index=3&list=PLtAP1KN7ahiYxgYCc-0xiUAhNWjT4q6LD
bornagain77
January 1, 2017 06:03 AM PDT
Moreover, even though the energy allowed to enter the atmosphere of the Earth is constrained, i.e. finely-tuned, to 1 trillionth of a trillionth of the entire electromagnetic spectrum, that still does not fully negate the disordering effects of pouring raw energy into an open system. This is made evident by the fact that objects left out in the sun age and deteriorate much more quickly than objects stored inside in cool conditions, away from the sun and heat. The following video clearly illustrates that just pouring raw energy into an 'open system' actually increases the disorder of the system:
Thermodynamic Arguments for Creation - Thomas Kindell (46:39 minute mark) - video https://www.youtube.com/watch?v=I1yto0-z2bQ&feature=player_detailpage#t=2799
To offset this disordering effect that the raw energy of sunlight has on objects, the raw energy from the sun must be processed further to be of biological utility. This is accomplished by photosynthesis which converts sunlight into ATP. To say that photosynthesis defies Darwinian explanation is to make a dramatic understatement:
Evolutionary biology: Out of thin air John F. Allen & William Martin: The measure of the problem is here: “Oxygenetic photosynthesis involves about 100 proteins that are highly ordered within the photosynthetic membranes of the cell." http://www.nature.com/nature/journal/v445/n7128/full/445610a.html The Elaborate Nanoscale Machine Called Photosynthesis: No Vestige of a Beginning - Cornelius Hunter - July 2012 Excerpt: "The ability to do photosynthesis is widely distributed throughout the bacterial domain in six different phyla, with no apparent pattern of evolution. Photosynthetic phyla include the cyanobacteria, proteobacteria (purple bacteria), green sulfur bacteria (GSB), firmicutes (heliobacteria), filamentous anoxygenic phototrophs (FAPs, also often called the green nonsulfur bacteria), and acidobacteria (Raymond, 2008)." http://darwins-god.blogspot.com/2012/07/elaborate-nanoscale-machine-called.html?showComment=1341739083709#c1202402748048253561 Enzymes and protein complexes needed in photosynthesis - with graphs http://elshamah.heavenforum.org/t1637-enzymes-and-protein-complexes-needed-in-photosynthesis#2527 The 10 Step Glycolysis Pathway In ATP Production: An Overview - video http://www.youtube.com/watch?v=8Kn6BVGqKd8
At the 14:00 minute mark of the following video, Chris Ashcraft, PhD – molecular biology, gives us an overview of the Citric Acid Cycle, which is, after the 10 step Glycolysis Pathway, also involved in ATP production:
Evolution vs ATP Synthase – Chris Ashcraft - video - citric acid cycle at 14:00 minute mark https://www.youtube.com/watch?feature=player_detailpage&v=rUV4CSs0HzI#t=746 The Citric Acid Cycle: An Overview - video http://www.youtube.com/watch?v=F6vQKrRjQcQ
Moreover, there is a profound 'chicken and egg' dilemma with ATP production for evolutionists:
Evolutionist Has Another Honest Moment as “Thorny Questions Remain” - Cornelius Hunter - July 2012 Excerpt: It's a chicken and egg question. Scientists are in disagreement over what came first -- replication, or metabolism. But there is a third part to the equation -- and that is energy. … You need enzymes to make ATP and you need ATP to make enzymes. The question is: where did energy come from before either of these two things existed? http://darwins-god.blogspot.com/2012/07/evolutionist-has-another-honest-moment.html
bornagain77
January 1, 2017 06:02 AM PDT
Granville Sewell asks:
Why Tornados Running Backward do not Violate the Second Law – Granville Sewell – May 2012 – article with video Excerpt: So, how does the spontaneous rearrangement of matter on a rocky, barren, planet into human brains and spaceships and jet airplanes and nuclear power plants and libraries full of science texts and novels, and supercomputers running partial differential equation solving software , represent a less obvious or less spectacular violation of the second law—or at least of the fundamental natural principle behind this law—than tornados turning rubble into houses and cars? Can anyone even imagine a more spectacular violation? https://uncommondescent.com/intelligent-design/why-tornados-running-backward-do-not-violate-the-second-law/
Granville Sewell further notes that “the very equations of entropy change upon which this compensation argument is based actually support, on closer examination, the common sense conclusion that “if an increase in order is extremely improbable when a system is isolated, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable.””
The Common Sense Law of Physics Granville Sewell – March 2016 Excerpt: (The) “compensation” argument, used by every physics text which discusses evolution and the second law to dismiss the claim that what has happened on Earth may violate the more general statements of the second law, was the target of my article “Entropy, Evolution, and Open Systems,” published in the proceedings of the 2011 Cornell meeting Biological Information: New Perspectives (BINP). In that article, I showed that the very equations of entropy change upon which this compensation argument is based actually support, on closer examination, the common sense conclusion that “if an increase in order is extremely improbable when a system is isolated, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable.” The fact that order can increase in an open system does not mean that computers can appear on a barren planet as long as the planet receives solar energy. Something must be entering our open system that makes the appearance of computers not extremely improbable, for example: computers. http://www.evolutionnews.org/2016/03/the_common_sens102725.html
Moreover Dr. Sewell has empirical evidence backing up his claim. Specifically, empirical evidence and numerical simulations tell us that “Genetic Entropy”, i.e. the tendency of biological systems to drift towards decreasing complexity, and decreasing information content, holds true as an overriding rule for biological adaptations over long periods of time:
“The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain – Michael Behe – December 2010 Excerpt: In its most recent issue The Quarterly Review of Biology has published a review by myself of laboratory evolution experiments of microbes going back four decades.,,, The gist of the paper is that so far the overwhelming number of adaptive (that is, helpful) mutations seen in laboratory evolution experiments are either loss or modification of function. Of course we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent.,,, I dub it “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain. http://behe.uncommondescent.com/2010/12/the-first-rule-of-adaptive-evolution/ Can Purifying Natural Selection Preserve Biological Information? – May 2013 – Paul Gibson, John R. Baumgardner, Wesley H. Brewer, John C. Sanford In conclusion, numerical simulation shows that realistic levels of biological noise result in a high selection threshold. This results in the ongoing accumulation of low-impact deleterious mutations, with deleterious mutation count per individual increasing linearly over time. Even in very long experiments (more than 100,000 generations), slightly deleterious alleles accumulate steadily, causing eventual extinction. These findings provide independent validation of previous analytical and simulation studies [2–13]. Previous concerns about the problem of accumulation of nearly neutral mutations are strongly supported by our analysis. 
Indeed, when numerical simulations incorporate realistic levels of biological noise, our analyses indicate that the problem is much more severe than has been acknowledged, and that the large majority of deleterious mutations become invisible to the selection process.,,, http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0010 Genetic Entropy – references to several peer reviewed numerical simulations analyzing and falsifying all flavors of Darwinian evolution,, (via John Sanford and company) http://www.geneticentropy.org/#!properties/ctzx
In their compensation argument, Darwinists claim that the second law does not contradict evolution as long as you have energy entering the 'open system'. In this case the open system is the Earth. Yet, what Darwinists do not tell you is that the energy allowed to enter the atmosphere of the Earth is constrained, i.e. finely-tuned, to 1 trillionth of a trillionth of the entire electromagnetic spectrum:
8:12 minute mark,,, "These specific frequencies of light (that enable plants to manufacture food and astronomers to observe the cosmos) represent less than 1 trillionth of a trillionth (10^-24) of the universe’s entire range of electromagnetic emissions." Fine tuning of Light, Atmosphere, and Water to Photosynthesis (etc..) – video (2016) - https://youtu.be/NIwZqDkrj9I?t=384
As the preceding video highlighted, visible light is incredibly fine-tuned for life to exist on earth. Though visible light is only a tiny fraction of the total electromagnetic spectrum coming from the sun, it happens to be the "most permitted" portion of the sun's spectrum allowed to filter through our atmosphere. All the other bands of electromagnetic radiation directly surrounding visible light happen to be harmful to organic molecules, and are almost completely absorbed by the earth's magnetic shield and the earth's atmosphere. The size of light's wavelengths and the constraints on the size allowable for the protein molecules of organic life strongly indicate that they were tailor-made for each other:
The visible portion of the electromagnetic spectrum (~1 micron) is the most intense radiation from the sun (Figure 1); has the greatest biological utility (Figure 2); and easily passes through the atmosphere of Earth (Figure 3) and water (Figure 4) with almost no absorption. It is uniquely this same wavelength of radiation that is ideal to foster the chemistry of life. This is either a truly amazing series of coincidences or else the result of careful design. – Walter Bradley, Is There Scientific Evidence for the Existence of God? How the Recent Discoveries Support a Designed Universe – http://www.leaderu.com/offices/bradley/docs/scievidence.html
Moreover, the light coming from the sun must be of the 'right color':
The " just right " relationship of the light spectrum and photosynthesis Excerpt: The American astronomer George Greenstein discusses this in The Symbiotic Universe, p 96: Chlorophyll is the molecule that accomplishes photosynthesis... The mechanism of photosynthesis is initiated by the absorption of sunlight by a chlorophyll molecule. But in order for this to occur, the light must be of the right color. Light of the wrong color won't do the trick. A good analogy is that of a television set. In order for the set to receive a given channel it must be tuned to that channel; tune it differently and the reception will not occur. It is the same with photosynthesis, the Sun functioning as the transmitter in the analogy and the chlorophyll molecule as the receiving TV set. If the molecule and the Sun are not tuned to each other-tuned in the sense of colour- photosynthesis will not occur. As it turns out, the sun's color is just right. One might think that a certain adaptation has been at work here: the adaptation of plant life to the properties of sunlight. After all, if the Sun were a different temperature could not some other molecule, tuned to absorb light of a different colour, take the place of chlorophyll? Remarkably enough the answer is no, for within broad limits all molecules absorb light of similar colours. The absorption of light is accomplished by the excitation of electrons in molecules to higher energy states, and (are) the same no matter what molecule you are discussing. Furthermore, light is composed of photons, packets of energy and photons of the wrong energy simply can not be absorbed… As things stand in reality, there is a good fit between the physics of stars and that of molecules. Failing this fit, however, life would have been impossible. The harmony between stellar and molecular physics that Greenstein refers to is a harmony too extraordinary ever to be explained by chance. 
There was only one chance in 10^25 of the Sun's providing just the right kind of light necessary for us and that there should be molecules in our world that are capable of using that light. This perfect harmony is unquestionably proof of Intelligent Design. http://elshamah.heavenforum.org/t1927-the-just-right-relationship-of-the-light-spectrum-and-photosynthesis
bornagain77
January 1, 2017 06:01 AM PDT
GD, A happy new year to you and to others. I passed by to see how UD got along for the overnight, and see your exchange with Mung. I think I should make a comment or a few.

First, your basic problem is to suggest that Earth is an entropy exporter, when in fact the primary issue is that it imports heat, and thus is subject to the disorganising impact of such heat. That there is exhaust of heat to reservoir at a lower temperature is a necessary but not a sufficient condition for a successful heat engine, especially when the issue is not merely to move to order -- e.g. a hurricane in this sense is a spontaneously formed heat engine that provides orderly motion of winds [and disorders just about everything it impacts] -- but functionally specific, complex organisation and associated information.

In general, assembly of complex, specifically functional organised systems is only observed on an assembly plan, i.e. on equally organised assembly. Which, is exactly what we find -- per actual OBSERVATION (not ideologically loaded inference) -- in life forms also. Think, protein synthesis if you doubt me. If life forms were as simplistic and orderly a system as a hurricane, we would not be likely to be seeing the sort of elaborate step by step construction of proteins under numerical control and algorithmic programs that we find in ribosomes and the like in the living cell. In short, there is something the living cell is trying to tell you.

The point that such FSCO/I points to is that once we have equivalent binary string to describe the specific, functional info going beyond 500 - 1,000 bits worth of info, the atomic and temporal resources of the observed cosmos become maximally unlikely to ever find such islands of function -- imposed by requisites of multi-part, matched arrangement and coupling to achieve a unified result -- in beyond astronomical config spaces.
That is, unobservable on 10^57 atoms [sol system scale], 10^17 s and 10^12 - 14 observations per atom per sec [~ fast org chem rxns] for the lower end, and similarly but with 10^80 atoms [observed cosmos] at the upper. At the upper end, the number of observations can be compared to a straw, and on this, the haystack to be searched will dwarf an observed cosmos of some 90 bn LY across. In short, there is a reason why on trillions of examples, FSCO/I is uniformly seen to result from intelligently directed configuration. Which brings to bear Newton's Vera Causa principle, that when we seek to explain what we have not directly observed, we must infer only to things we have observed to actually be able to cause the like result. (If we did that, the whole evolutionary materialist account of origin of life and or body plans would instantly collapse. Which would be to the good. Better to acknowledge ignorance than to pretend to know what we do not, or to impose ideological agendas by the back door of methodological naturalism. Then we can have a real look at the pivotal issue: the origin of required FSCO/I. ) That is, the inference to design as best causal explanation on seeing FSCO/I is strong, being empirically reliable [trillions of cases] and analytically grounded in a hopeless search challenge. Further to this, I find the following summary of what entropy is as a micro phenomenon is highly instructive; here taken from an admission against patent interest in a wiki article on informational views on entropy (as observed nigh on six years past and put up in my always linked note):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . 
in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
This 'missing info to specify' view is highly instructive. And, I find it particularly saddening to see how there is still resort to the notion that as the earth is an open system entropy can decrease by its exporting energy and so there is nothing to be explained, move along nothing to see. Sorry, that is a strawman argument. KF PS: I suggest you will find here on useful, and here on also, in that always linked note. Notice, in the latter, a discussion of Clausius' context of derivation of entropy and its extension to address the sort of issue raised by FSCO/I.
kairosfocus
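The search-space arithmetic earlier in this comment can be tallied in a few lines of Python (editor's sketch, using only the figures quoted: 10^57 or 10^80 atoms, 10^17 seconds, 10^14 observations per atom per second at the upper end, against a 500-bit configuration space):

```python
# Upper bounds on the number of observations at solar-system and
# observed-cosmos scale, using the figures quoted in the comment.
atoms_sol = 10**57
atoms_cosmos = 10**80
seconds = 10**17
obs_per_atom_per_s = 10**14

obs_sol = atoms_sol * seconds * obs_per_atom_per_s        # 10^88
obs_cosmos = atoms_cosmos * seconds * obs_per_atom_per_s  # 10^111

space_500_bits = 2**500   # ~3.3e150 configurations

print(f"Observations (solar system): 1e{len(str(obs_sol)) - 1}")
print(f"Fraction of 500-bit space sampled: {obs_sol / space_500_bits:.1e}")
```

The sketch only reproduces the arithmetic of the comment's "straw to haystack" comparison; whether that comparison is the right model of biological search is, of course, the disputed question in this thread.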
December 31, 2016 11:34 PM PDT
Mung:
There’s no such thing as a decrease in entropy.
There is absolutely such a thing as a decrease in entropy. Take the classical definition of entropy: dS = dQ_rev/T. That means for any reversible process, when there's heat leaving the system (dQ_rev < 0), there will be an entropy decrease (dS < 0). (Assuming T > 0. Negative absolute temperatures are weird, and I'm not going to worry about them here.)
Things do not get cooler because of any decrease in entropy, and water does not freeze because there’s a decrease in entropy.
The entropy decrease doesn't cause cooling or freezing. It's more accurate to say that cooling and freezing are processes that cause entropy to decrease. For example, a given amount of water in the liquid state has higher entropy than the same amount of ice. So when water freezes, it moves from a higher-entropy state to a lower-entropy state. That is a decrease in entropy.
Gordon Davisson
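The freezing example can be made quantitative. A short Python sketch (editor's illustration, using the standard handbook value for the molar enthalpy of fusion of ice) computes the entropy change when liquid water freezes at its melting point:

```python
# Entropy change of water on freezing at the melting point,
# via dS = dQ_rev / T with dQ_rev = -ΔH_fus (heat leaves the water).
delta_H_fus = 6010.0   # J/mol, enthalpy of fusion of ice (handbook value)
T_melt = 273.15        # K

delta_S = -delta_H_fus / T_melt
print(f"ΔS on freezing: {delta_S:.1f} J/(mol·K)")  # about -22.0 J/(mol·K)
```

The negative sign is the point: the water's entropy genuinely decreases, while the heat released into the surroundings raises their entropy by at least as much, keeping the second law intact.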
December 31, 2016 09:54 PM PDT
Mung: Since when is a decrease in entropy not a violation of the second law of thermodynamics? Gordon: When it’s coupled to an equal-or-larger entropy increase somewhere else. That’s why things can get cooler, water freeze, etc without violating the second law. There's no such thing as a decrease in entropy. Things do not get cooler because of any decrease in entropy, and water does not freeze because there's a decrease in entropy.
Mung
December 31, 2016 at 08:57 PM PDT
Gordon:
Or take the statistical mechanics approach, where entropy is defined in terms of the number of microscopically-distinct states the system might be in, where quantum mechanics gives you an absolute value.
The statistical approach agrees with the classical approach.Mung
December 31, 2016 at 08:43 PM PDT
Gordon:
Note that the classical definition of entropy only defines changes in entropy, not its absolute value.
I agree. Now apply that to your claims about the entropy of the earth.Mung
December 31, 2016 at 08:40 PM PDT
Gordon:
There are many ways of defining entropy.
How do we decide which definition applies?
There are many ways of defining entropy; the original thermodynamic definition (dS = dQ_rev/T) only applies to systems at equilibrium, but the statistical definitions can be applied to a much wider variety of systems.
How many statistical definitions of entropy are there?Mung
December 31, 2016 at 08:37 PM PDT
Mung:
The earth is not an isolated thermodynamic system at equilibrium. The entropy is not well-defined.
There are many ways of defining entropy; the original thermodynamic definition (dS = dQ_rev/T) only applies to systems at equilibrium, but the statistical definitions can be applied to a much wider variety of systems. But actually, if the Earth is what you're interested in, you don't need to go that far. Most of Earth can be broken up into pieces small enough that each one is very close to equilibrium; apply the classical definition to each one, then sum them to get a good approximation to the total entropy of Earth. But...
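The divide-and-sum procedure described here can be sketched as follows; the pieces and entropy values are purely illustrative placeholders, not geophysical data:

```python
# Approximate the total entropy of a non-equilibrium body by splitting it into
# pieces that are each close to local equilibrium, applying the classical
# definition to each piece, and summing. All numbers are illustrative only.
pieces_j_per_k = {
    "crust slab": 2.0e20,
    "mantle chunk": 9.0e22,
    "core sample": 4.0e22,
}

total_entropy = sum(pieces_j_per_k.values())
print(total_entropy)  # sum over near-equilibrium pieces approximates the whole
```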
How on earth do you propose to calculate the entropy change if you can’t calculate the entropy?
Note that the classical definition of entropy only defines changes in entropy, not its absolute value. To get absolute entropies, you need to add something like the assumption that an ordered crystal at a temperature of absolute zero has entropy zero (and then, to get the entropies of things in other states, find a reversible path between them and the ordered crystal and integrate dQ/T). Or take the statistical mechanics approach, where entropy is defined in terms of the number of microscopically-distinct states the system might be in, where quantum mechanics gives you an absolute value. But none of these are necessary to calculate entropy changes or apply the second law.
Since when is a decrease in entropy not a violation of the second law of thermodynamics?
When it's coupled to an equal-or-larger entropy increase somewhere else. That's why things can get cooler, water freeze, etc., without violating the second law.Gordon Davisson
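A worked version of the freezing case, assuming standard textbook water values (not figures from the thread): the freezing water's entropy drops, but the heat it dumps into colder surroundings raises their entropy by more.

```python
# Second-law bookkeeping for 1 g of water freezing at 273.15 K while the
# released heat flows into colder surroundings at 263.15 K.
# Standard latent heat of fusion assumed (textbook value, not from the thread).
L_FUSION = 333.55  # J/g
T_WATER = 273.15   # K, water/ice at the melting point
T_SURR = 263.15    # K, colder surroundings receiving the heat

q = 1.0 * L_FUSION            # heat released by 1 g of freezing water
dS_water = -q / T_WATER       # entropy decrease of the freezing water
dS_surroundings = q / T_SURR  # entropy increase of the surroundings
dS_total = dS_water + dS_surroundings
print(dS_total > 0)  # True: the coupled increase outweighs the decrease
```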
December 31, 2016 at 07:33 PM PDT
Gordon:
Entropy flux is essentially a way of tracking the interactions between systems that couple an entropy decrease in one system to an increase in another system.
This sets off immediate alarm bells. Since when is a decrease in entropy not a violation of the second law of thermodynamics?Mung
December 31, 2016 at 06:52 PM PDT
Gordon:
I don’t see how this is relevant to my point.
It's relevant in that it points out that you haven't performed the relevant calculations. For example, do you know the absolute entropy of the earth? Frankly, I think the question itself, what is the entropy of the earth, is nonsensical. The earth is not an isolated thermodynamic system at equilibrium. The entropy is not well-defined. Same with the surroundings. How on earth do you propose to calculate the entropy change if you can't calculate the entropy?Mung
December 31, 2016 at 06:49 PM PDT
Correct, I’m using a metaphor. Entropy isn’t really a “thing”, and hence cannot move from place to place. But it acts like a thing, so treating it as one provides a set of useful intuitions and metaphors about how it behaves.
I don't know of a single metaphor for entropy that isn't misleading. I agree that entropy is not a thing. I agree that entropy doesn't move from place to place. So what is entropy?Mung
December 31, 2016 at 06:39 PM PDT
Mung:
No one knows what the entropy of the earth is any more than they know what the entropy of the universe is.
I don't see how this is relevant to my point. Also, you could probably make a good estimate of the Earth's entropy, although you'd need to know quite a bit about its composition (hint: most of its entropy is going to be in its largest components, the mantle and core), temperature profile, etc., and the thermodynamic properties of those components. As for the universe, you could also probably make a reasonable estimate of its average entropy density, but the big unknown is going to be its overall size.
And entropy is not something that is “emitted.”
Correct, I'm using a metaphor. Entropy isn't really a "thing", and hence cannot move from place to place. But it acts like a thing, so treating it as one provides a set of useful intuitions and metaphors about how it behaves.

Technically, what I'm talking about is the Earth's entropy flux. Entropy flux is essentially a way of tracking the interactions between systems that couple an entropy decrease in one system to an increase in another system. For instance, if there's a near-equilibrium heat flow from one system to another, it'll be associated with an entropy flux of Q/T (amount of heat divided by the absolute temperature), meaning that the entropy of the system the heat is flowing from might decrease by up to Q/T, and the entropy of the system it's flowing to must increase by at least Q/T. So you can think of the heat flow as carrying Q/T of entropy from one system to the other, even though that's technically wrong.

In the case of Earth, there's an even simpler way to put it, since the entropy flux I'm describing is just the entropy of the light entering and leaving Earth. The sunlight reaching Earth each second has entropy 3.83e13 J/K. The entropy of the thermal radiation from Earth to deep space is hard to calculate exactly, but it's easy to get a lower bound of 3.7e14 J/K per second. That means the difference is at least 3.3e14 J/K per second. Details are here. (BTW, I cited the wrong figure in my first comment: I gave the entropy of the light leaving the Earth rather than the difference. 3.3e14 J/K per second is the figure I should have given.)
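The arithmetic behind those per-second figures, using the numbers quoted in the comment:

```python
# Net entropy export per second, using the figures quoted in the comment:
# incoming sunlight entropy vs. a lower bound on outgoing thermal radiation.
S_IN = 3.83e13      # J/K per second, entropy of sunlight reaching Earth
S_OUT_MIN = 3.7e14  # J/K per second, lower bound on Earth's outgoing radiation

net_export = S_OUT_MIN - S_IN
print(net_export)  # at least ~3.3e14 J/K per second leaves on net
```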
The earth is not an isolated system, nor is it at an equilibrium, nor are its surroundings.
I agree completely.Gordon Davisson
December 31, 2016 at 05:53 PM PDT
as I’ve pointed out repeatedly, the Earth as a whole emits more entropy to its surroundings I'm sorry, but this is just nonsense. No one knows what the entropy of the earth is any more than they know what the entropy of the universe is. And entropy is not something that is "emitted." The earth is not an isolated system, nor is it at an equilibrium, nor are its surroundings. Modern Thermodynamics Entropy: The Truth, the Whole Truth, and Nothing But the TruthMung
December 31, 2016 at 04:03 PM PDT
From Rob Sheldon:
In physics, we discuss reversible and irreversible reactions. If entropy (or information) is unchanged, then the system is reversible. If entropy increases (loss of information), then the reaction cannot be reversed. Outside of Darwin’s theory of evolution, there are no irreversible reactions in which entropy decreases (information is gained), because that would enable a perpetual motion machine.
Good grief. When are you going to stop peddling this nonsense? Do you even care how thoroughly it's been refuted? Ok, let's go over why it's wrong. Again.

First, identifying entropy increases with loss of information and entropy decreases with gain of information. This isn't actually wrong, but it requires you to use a very unusual definition of "information", and one that's pretty irrelevant to the usual information-based arguments for ID. To illustrate the problem, consider that cooling an object decreases its entropy, while heating an object increases its entropy. Does that mean that cooling an object increases information, and heating it loses information? If we accept Rob's view, it has to mean that.

As I said, this isn't actually wrong, you just need to adopt a very unusual definition of information. Essentially, you need to be looking at how much information you have about the precise physical state of the object. Take a simple example: when water freezes, the molecules that make it up move into a much more regular arrangement (a crystal), instead of the mostly-disordered mess they made in the liquid state. This means you know a lot more about the arrangement of the molecules in the solid (crystalline) state than you did in the liquid state. Since you learn something about each molecule, and there are a lot of molecules, this winds up being a huge amount of information; about 1.5 * 10^23 bits of information per gram of water.

Now, I chose freezing and melting as an example because the loss/gain of information is fairly easy (heh!) to understand in terms of knowing/not knowing about the arrangement of molecules, but the same thing happens with heating/cooling even when a phase change isn't involved. If you heat liquid water above freezing, the molecules become even less ordered, and so you lose even more information about their state. Similarly, cooling ice below freezing gains you even more information.
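As a sanity check on that per-gram figure, the entropy of fusion can be converted to "bits" by dividing by k_B ln 2. With standard textbook values (assumed here, not taken from the thread) this lands near 1.3e23 bits per gram, the same order of magnitude as the figure quoted above:

```python
import math

# Convert the entropy of fusion of water into "bits": bits = S / (k_B * ln 2).
# Standard textbook values assumed, not figures from the thread.
K_B = 1.380649e-23  # J/K, Boltzmann constant
L_FUSION = 333.55   # J/g, latent heat of fusion of water
T_MELT = 273.15     # K, melting point

dS_per_gram = L_FUSION / T_MELT                    # J/(K*g), entropy of fusion
bits_per_gram = dS_per_gram / (K_B * math.log(2))
print(f"{bits_per_gram:.2e}")  # on the order of 10^23 bits per gram
```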
In principle, if you could cool the water all the way to absolute zero, its absolute entropy would also reach zero, and you'd have complete knowledge of the precise physical state of all of the molecules in it. So, the question for readers: do you accept that this is a valid definition of "information"? If so, you've accepted that heat flows can produce huge changes in information -- far larger than any mere intelligent human could ever produce. If you reject it, then you have to toss out Rob's claim as nonsense. Now let me look at the last part of Rob's statement:
Outside of Darwin’s theory of evolution, there are no irreversible reactions in which entropy decreases (information is gained), because that would enable a perpetual motion machine.
This part is just flat-out wrong. Entropy decreases happen all over the place: objects cooling off, freezing, condensing, many chemical reactions, etc. And those are just the easy, obvious examples. This doesn't violate the second law or enable perpetual motion machines because all instances of entropy decrease are coupled to equal-or-larger entropy increases somewhere else. When water freezes, it gives off heat, which heats up... well, wherever the heat goes, thus increasing the entropy of that place by at least as much as the entropy decrease due to freezing.

As far as the second law is concerned, evolution works exactly the same way: organisms emit heat (and entropy in other forms), thus increasing the entropy of their surroundings by more than their entropy decreases (if it does -- it doesn't always). They take in energy in low-entropy forms and dump high-entropy forms of energy to their surroundings. This is what allows them to do a number of things that go against the general trend toward thermodynamic equilibrium, and thus superficially appear to violate the second law: they grow, reproduce, evolve, maintain their state despite changes in their surroundings, etc.

The same thing happens at larger scales as well: as I've pointed out repeatedly, the Earth as a whole emits more entropy to its surroundings (by at least 3.7e14 J/K per second -- which corresponds to 3.4e37 bits per second of "information" if you accept the negative information = entropy view). This powers a wide variety of away-from-equilibrium processes on Earth, even beyond the ones relating to living organisms. Rob's claim that evolution is unique in this respect is completely and utterly wrong.Gordon Davisson
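The bits-per-second figure is the same conversion applied at planetary scale; a quick check using the 3.3e14 J/K per second net-export figure given elsewhere in the thread:

```python
import math

# Planetary-scale entropy-to-"bits" conversion: bits = S / (k_B * ln 2),
# applied to the 3.3e14 J/K per second net entropy export quoted in the thread.
K_B = 1.380649e-23     # J/K, Boltzmann constant
entropy_flux = 3.3e14  # J/K per second, net entropy export

bits_per_second = entropy_flux / (K_B * math.log(2))
print(f"{bits_per_second:.1e}")  # roughly 3.4e37 bits per second
```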
December 31, 2016 at 03:08 PM PDT