Uncommon Descent Serving The Intelligent Design Community

A Maxwell Demon engine in action beyond the Carnot / "standard" second law limit


Maxwell’s Demon (sometimes, “Max”) has long been a fictional device for discussing how, given access to information, we can manipulate molecular-scale particles to extract work. Now, physics dot org is discussing a case:

>>Physicists have experimentally demonstrated an information engine—a device that converts information into work—with an efficiency that exceeds the conventional second law of thermodynamics. Instead, the engine’s efficiency is bounded by a recently proposed generalized second law of thermodynamics, and it is the first information engine to approach this new bound . . . . [R]ecent experimental demonstrations of information engines have raised the question of whether there is an upper bound on the efficiency with which an information engine can convert information into work. To address this question, researchers have recently derived a generalized second law of thermodynamics, which accounts for both energy and information being converted into work. However, no experimental information engine has approached the predicted bounds, until now.

The generalized second law of thermodynamics states that the work extracted from an information engine is limited by the sum of two components: the first is the free energy difference between the final and initial states (this is the sole limit placed on conventional engines by the conventional second law), and the other is the amount of available information (this part sets an upper bound on the extra work that can be extracted from information).

To achieve the maximum efficiency set by the generalized second law, the researchers in the new study designed and implemented an information engine made of a particle trapped by light at room temperature. Random thermal fluctuations cause the tiny particle to move slightly due to Brownian motion, and a photodiode tracks the particle’s changing position with a spatial accuracy of 1 nanometer. If the particle moves more than a certain distance away from its starting point in a certain direction, the light trap quickly shifts in the direction of the particle. This process repeats, so that over time the engine transports the particle in a desired direction simply by extracting work from the information it obtains from the system’s random thermal fluctuations (the free energy component here is zero, so it does not contribute to the work extracted).

One of the most important features of this system is its nearly instantaneous feedback response: the trap shifts in just a fraction of a millisecond, giving the particle no time to move further and dissipate energy. As a result, almost none of the energy gained by the shift is lost to heat, but rather nearly all of it is converted into work. By avoiding practically any information loss, the information-to-energy conversion of this process reaches approximately 98.5% of the bound set by the generalized second law.  [Govind Paneru, Dong Yun Lee, Tsvi Tlusty, and Hyuk Kyu Pak. January 12, 2018.  “Lossless Brownian Information Engine.” Physical Review Letters. DOI: 10.1103/PhysRevLett.120.020601]>>
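For a concrete feel of the feedback protocol just described, here is a toy simulation sketch (not the authors' code; the parameter values, and the convention of booking the potential-energy drop at each trap shift as work extracted from information, are illustrative assumptions):

```python
import numpy as np

# Toy overdamped Brownian particle in a harmonic (optical) trap with
# measurement-and-shift feedback, loosely modelled on the protocol above.
# All parameter values are arbitrary, chosen only for illustration.
rng = np.random.default_rng(0)

kT    = 1.0           # thermal energy k_B*T (work reported in these units)
k     = 1.0           # trap stiffness
gamma = 1.0           # drag coefficient
D     = kT / gamma    # diffusion constant (Einstein relation)
dt    = 1e-3          # integration time step
steps_per_cycle = 20  # steps between measurements (feedback is fast)
n_cycles  = 20_000
threshold = 0.5       # shift the trap only if the particle drifts this far forward

x, c = 0.0, 0.0       # particle position and trap centre
work_extracted = 0.0

for _ in range(n_cycles):
    # free relaxation in the trap between measurements (Euler-Maruyama)
    for _ in range(steps_per_cycle):
        noise = np.sqrt(2.0 * D * dt) * rng.standard_normal()
        x += -(k / gamma) * (x - c) * dt + noise
    # "measurement": if a thermal fluctuation carried the particle forward
    # past the threshold, shift the trap centre onto it; the potential
    # energy released is booked as work gained purely from information
    # (the free-energy change of the shift itself is zero).
    if x - c > threshold:
        work_extracted += 0.5 * k * (x - c) ** 2
        c = x

print(f"trap centre transported to {c:.1f} (units of sqrt(kT/k))")
print(f"work extracted from information: {work_extracted:.1f} kT")
```

Over many cycles the trap (and particle) ratchets steadily in one direction, with the work paid for entirely by measured thermal fluctuations, which is the qualitative point of the experiment.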

Of course, it can be realised that setting up all of this capability to extract work from information has its own energy and entropy costs. And, in a sense, this is unsurprising: a windmill routinely extracts a high fraction of the available energy, up to the Betz limit, precisely because the wind is an orderly flow of fluid.

What is highly relevant is that this is an inadvertent confirmation of the relevance of information to thermodynamics and to entropy. All of this fits right in with the informational thermodynamics approach championed by Jaynes and others since. Indeed, Harry S Robertson in Statistical Thermophysics (Prentice, 1993) observed that: “the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if  I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . . “ [pp. vii – viii.]
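Robertson's point can be made concrete with the textbook Szilard single-molecule engine (a standard illustration added here for clarity, not drawn from the OP's sources): one bit of information about which half of a box the molecule occupies can be cashed in for work.

```latex
% Szilard engine: a single molecule in a box of volume V, coupled to a heat
% bath at temperature T. A measurement reveals which half the molecule is in
% (1 bit). A piston then lets that half expand isothermally back to volume V:
W_{\max} \;=\; \int_{V/2}^{V} \frac{k_B T}{V'}\,\mathrm{d}V'
         \;=\; k_B T \ln 2
         \;\approx\; 2.9\times 10^{-21}\ \text{J at } T = 300\ \text{K}.
% Perceiving the "order" (which side the molecule is on) is exactly what lets
% work be extracted; without that information the same gas is mere heat.
```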

All of this then becomes highly relevant to the ID debates:

>>[T]he results may also lead to practical applications, which the researchers plan to investigate in the future.

“Both nanotechnology and living systems operate at scales where the interplay between thermal noise and information processing is significant,” Pak said. “One may think about engineered systems where information is used to control molecular processes and drive them in the right direction. One possibility is to create hybrids of biological systems and engineered ones, even in the living cell.”>>

Food for thought. END

Comments
J-M & BA77, pardon but this thread is on a major scientific topic and on an emerging issue. Namely, generalisation of the second law of thermodynamics to include information effects under non equilibrium circumstances. Sorry, necessarily mathematical. Could we return to and please keep focus? KFkairosfocus
January 21, 2018 at 09:34 AM PDT
Empirically I cannot directly comment on what was on the ark or on the subsequent diversity of life that followed from the animals on the ark. Nor, I believe, can anyone comment with any unquestionable empirical rigor on those questions. Though some have certainly tried:
Noah's Ark would have floated...even with 70,000 animals - 03 Apr 2014 Noah’s Ark would have floated even with two of every animal in the world packed inside, scientists have calculated. Although researchers are unsure if all the creatures could have squeezed into the huge boat, they are confident it would have handled the weight of 70,000 creatures without sinking. A group of master’s students from the Department of Physics and Astronomy at Leicester University studied the exact dimensions of the Ark, set out in Genesis 6:13-22. According to The Bible, God instructed Noah to build a boat which was 300 cubits long, 50 cubits wide and 30 cubits high – recommending gopher wood for the enormous lifeboat. The students averaged out the Egyptian and Hebrew cubit measurement to come up with 48.2cm, making the Ark around 144 metres long – about 100 metres shorter than Ark Royal. Using the dimensions, the Archimedes principle of buoyancy and approximate animal weights, they were astonished to find out that the Ark would have floated. Student Thomas Morris, 22, from Chelmsford, said: “You don’t think of the Bible necessarily as a scientifically accurate source of information, so I guess we were quite surprised when we discovered it would work. We’re not proving that it’s true, but the concept would definitely work.” http://www.telegraph.co.uk/news/science/science-news/10740451/Noahs-Ark-would-have-floated...even-with-70000-animals.html
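A rough back-of-envelope version of that buoyancy check (a sketch only; the hull thickness, wood density and mean animal mass below are assumed figures, not the students' actual inputs):

```python
# Back-of-envelope buoyancy check for the Ark dimensions quoted above.
cubit = 0.482                                    # m, averaged cubit from the article
L, B, H = 300 * cubit, 50 * cubit, 30 * cubit    # length, beam, height (box hull assumed)

plan_area = L * B                                # waterplane area, m^2
hull_area = 2 * (L * H + B * H) + 2 * plan_area  # sides + bottom + deck, rough
hull_mass = hull_area * 0.30 * 600.0             # assume 30 cm planking, 600 kg/m^3 wood
cargo_mass = 70_000 * 25.0                       # assume ~25 kg per animal on average

draft = (hull_mass + cargo_mass) / (plan_area * 1000.0)  # fresh water, 1000 kg/m^3
print(f"loaded mass ~ {(hull_mass + cargo_mass) / 1e6:.1f} thousand tonnes")
print(f"draft ~ {draft:.1f} m of {H:.1f} m hull depth -> "
      + ("floats" if draft < H else "sinks"))
```

With these assumptions the draft comes out at roughly a metre, far less than the hull depth, which is consistent with the students' conclusion.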
What I can do empirically is appeal to the geologic record to substantiate the Biblical claim for a fairly recent cataclysmic worldwide flooding circa 13,000 to 14,000 years ago. I can also appeal to the first human civilization. A few notes to that effect:
Humanpast.net Excerpt: Worldwide, we know that the period of 14,000 to 13,000 years ago, which coincides with the peak of abundant monsoonal rains over India, was marked by violent oceanic flooding - in fact, the first of the three great episodes of global superfloods that dominated the meltdown of the Ice Age. The flooding was fed not merely by rain but by the cataclysmic synchronous collapse of large ice-masses on several different continents and by gigantic inundations of meltwater pouring down river systems into the oceans. (124) What happened, at around 13,000 years ago, was that the long period of uninterrupted warming that the world had just passed through (and that had greatly intensified, according to some studies, between 15,000 years ago and 13,000 years ago) was instantly brought to a halt - all at once, everywhere - by a global cold event known to palaeo climatologists as the 'Younger Dryas' or 'Dryas III'. In many ways mysterious and unexplained, this was an almost unbelievably fast climatic reversion - from conditions that are calculated to have been warmer and wetter than today's 13,000 years ago, to conditions that were colder and drier than those at the Last Glacial Maximum, not much more than a thousand years later. From that moment, around 12,800 years ago, it was as though an enchantment of ice had gripped the earth. In many areas that had been approaching terminal meltdown full glacial conditions were restored with breathtaking rapidity and all the gains that had been made since the LGM were simply stripped away…(124) A great, sudden extinction took place on the planet, perhaps as recently as 11,500 years ago (usually attributed to the end of that last ice age), in which hundreds of mammal and plant species disappeared from the face of the earth, driven into deep caverns and charred muck piles the world over. Modern science, with all its powers and prejudices, has been unable to adequately explain this event. (83) http://humanpast.net/environment/environment11k.htm Study: Deep beneath the earth, more water than in all the oceans combined – June 16, 2014 Excerpt: And its a good thing, too, Jacobsen told New Scientist: “We should be grateful for this deep reservoir. If it wasn’t there, it would be on the surface of the Earth, and mountain tops would be the only land poking out.” https://www.washingtonpost.com/news/morning-mix/wp/2014/06/16/study-deep-beneath-north-america-theres-more-water-than-in-all-the-oceans-combined/ Genesis 7:11 In the six hundredth year of Noah’s life, in the second month, on the seventeenth day of the month, on the same day all the fountains of the great deep burst open, and the floodgates of the sky were opened.
In this following video lecture (on the ‘table of nations'), at around the 6:00 minute mark, we find that the first ‘advanced’ human civilization, (with the oldest archaeological evidence of metallurgy, agriculture, wine making, etc…), flourished near, or at, the Ankara area,,,(The Ankara area is called Anatolia in the video), which is close to where Noah’s Ark is said to have come to rest on a mountaintop:
Tracing your Ancestors through History - Paul James-Griffiths – video https://www.youtube.com/watch?v=YHL8UU0BEgk Ankara Excerpt: Centrally located in Anatolia, Ankara is an important commercial and industrial city. http://en.wikipedia.org/wiki/Ankara
Although, because of his Young Earth Biblical view, Paul James-Griffiths did not give the dating of the area, the dating of the first 'advanced' human civilization, around that area, is approx. 12,000 years before the present:
Stone Age Temple May Be Birthplace of Civilization Excerpt: The elaborate temple at Gobelki Tepe in southeastern Turkey, near the Syrian border, is staggeringly ancient: 11,500 years old, from a time just before humans learned to farm grains and domesticate animals. According to the German archaeologist in charge of excavations at the site, it might be the birthplace of agriculture, of organized religion — of civilization itself. http://www.freerepublic.com/tag/gobeklitepe/index Science Does Not Understand Our Consciousness of God, but Not for the Reasons We Might Think - Denyse O’Leary - February 14, 2017 Excerpt: Gobekli Tepe in Southeastern Anatolia, Turkey, discovered in 1994. Dated at 11,500 years ago, it seems to have been a massive worship site.,,, As science writer Charles C. Mann explained in National Geographic, “Gobekli Tepe was like finding that someone had built a 747 in a basement with an X-Acto knife.” The find suggests to many that religion, rather than agriculture, built civilization.,,, https://www.hbu.edu/news-and-events/2017/02/14/science-not-understand-consciousness-god-not-reasons-might-think/
bornagain77
January 21, 2018 at 06:14 AM PDT
BA77 I'm talking mainly about the variety of life forms that had to go to the ark. I agree that the majority of life is bacteria, microbes etc...
J-Mac
January 21, 2018 at 05:52 AM PDT
F/N2: As, DV, we are going to go into some further details (and yes, there will be more thermodynamics and math), let's set some further background on Helmholtz free energy, F -- NB: IUPAC recommends A, but F is common and familiar. Wiki again is handy for non-controversial topics needing a quick tutorial-level reference. I will be substituting F:
In thermodynamics, the Helmholtz free energy is a thermodynamic potential that measures the "useful" work obtainable from a closed thermodynamic system at a constant temperature and volume. The negative of the difference in the Helmholtz energy is equal to maximum amount of work that the system can perform in a thermodynamic process in which volume is held constant. If the volume is not held constant, part of this work will be performed as boundary work. The Helmholtz energy is commonly used for systems held at constant volume. Since in this case no work is performed on the environment, the drop in the Helmholtz energy is equal to the maximum amount of useful work that can be extracted from the system. For a system at constant temperature and volume, the Helmholtz energy is minimized at equilibrium . . . . While Gibbs free energy is most commonly used as a measure of thermodynamic potential, especially in the field of chemistry, it is inconvenient for some applications that do not occur at constant pressure. For example, in explosives research, Helmholtz free energy is often used, since explosive reactions by their nature induce pressure changes. It is also frequently used to define fundamental equations of state of pure substances . . . . The Helmholtz energy is defined as[3] F = U - T*S , where F is the Helmholtz free energy (SI: joules, CGS: ergs), U is the internal energy of the system (SI: joules, CGS: ergs), T is the absolute temperature (kelvins) of the surroundings, modelled as a heat bath, S is the entropy of the system (SI: joules per kelvin, CGS: ergs per kelvin).
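As a quick supporting step (standard textbook reasoning spelled out for convenience, not part of the Wiki clip), the maximum-work reading of F follows from the second law in the form δQ ≤ T dS:

```latex
% Closed system exchanging heat with a bath at constant temperature T,
% doing work W_by on its surroundings:
\mathrm{d}U = \delta Q - \delta W_{\text{by}}, \qquad \delta Q \le T\,\mathrm{d}S
\;\;\Rightarrow\;\;
\delta W_{\text{by}} \;\le\; T\,\mathrm{d}S - \mathrm{d}U \;=\; -\,\mathrm{d}(U - TS) \;=\; -\,\mathrm{d}F .
% Integrating at constant T and V gives W_by <= -Delta F, with equality only
% for a fully reversible process: the "maximum useful work" statement above.
```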
This then brings up the generalised Jarzynski expression used by Paneru et al., which their experiment approaches to the 98.5% level. Clipping the paper reported on in the OP's news piece:
Here, we examine a bound on demons, i.e., information engines, that follows from a generalization of a Jarzynski equality [28] to feedback-controlled systems [8,9,15,17,29]:

[Generalised Jarzynski:] ⟨ exp[ −β(W − ΔF) − (I − I_u) ] ⟩ = 1 . . . Eqn (1)

The exponent averaged in Eq. (1) augments the terms from the standard Jarzynski equality — the work performed on the system W and the free energy change ΔF (in k_B T = 1/β units) — with a contribution from the information circuitry: I is the information gathered by measurements, out of which a part I_u becomes unavailable due to the irreversibility of the feedback process [17].
[NB: I is measured in natural-log units (nats) rather than the more familiar bits, which use base-2 logarithms. Ref. 17 is: "General achievable bound of extractable work under feedback control" by Yuto Ashida et al.]
Applying Jensen’s inequality to Eq. (1) yields a generalized second law [17]:

−β⟨W⟩ ≤ −β ΔF [thermal-only term] + ⟨I⟩ − ⟨I_u⟩ [available-information term] . . . Eqn (2)

Namely, the work extracted from the information engine . . . cannot exceed the sum of the free energy difference between final and initial states . . . and the available information. In the absence of information, the inequality recaps the notion of the free energy as the maximal available work in an isothermal process . . . while the additional term sets an upper bound on extra work that can be gained from information on the system.
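Spelling out the step from Eq. (1) to Eq. (2), as a check for readers following the math: Jensen's inequality for the convex exponential, ⟨e^X⟩ ≥ e^⟨X⟩, applied to Eq. (1) gives

```latex
1 \;=\; \big\langle e^{-\beta(W-\Delta F)-(I-I_u)} \big\rangle
  \;\ge\; \exp\!\big\langle -\beta(W-\Delta F)-(I-I_u) \big\rangle
\;\;\Rightarrow\;\;
-\beta\,\langle W\rangle \;\le\; -\beta\,\Delta F \;+\; \langle I\rangle \;-\; \langle I_u\rangle .
% That is, the mean extracted work (-<W>, since W is work done ON the system)
% is bounded by the free-energy drop plus the available information, Eq. (2).
```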
This opens up a whole new world. KF
kairosfocus
January 21, 2018 at 01:07 AM PDT
F/N: I am wondering if it is registering just how significant the result in the OP is, i/l/o its proposal of a particular generalisation of the second law of thermodynamics through the impact of micro-particle level observability and linked information, allowing actual, not just theoretically discussed, work to be done. This shifts the balance towards the Jaynes et al informational approach to thermodynamics in highly significant ways, and it is also a second case of something that is inherently abstract having a demonstrably significant causal effect. The first is of course the impact of numbers and, more broadly, the logic of structure and quantity, AKA mathematics. KF

PS1: Notice, Paneru et al:
[R]ecent experimental demonstrations of information engines have raised the question of whether there is an upper bound on the efficiency with which an information engine can convert information into work. To address this question, researchers have recently derived a generalized second law of thermodynamics, which accounts for both energy and information being converted into work. However, no experimental information engine has approached the predicted bounds, until now. The generalized second law of thermodynamics [--> as proposed; note that this is one generalisation, and there are others on the table, esp. relating to gravity and black holes etc.] states that the work extracted from an information engine is limited by the sum of two components: the first is the free energy difference between the final and initial states (this is the sole limit placed on conventional engines by the conventional second law),
[--> think, Gibbs Free Energy, e.g., HT Wiki as handy source, (G = H-TS) (J in SI units) is "the maximum amount of non-expansion work that can be extracted from a thermodynamically closed system (one that can exchange heat and work with its surroundings, but not matter); this maximum can be attained only in a completely reversible process.]" H being Enthalpy, a measurement of energy in a thermodynamic system; equal to the internal energy of the system plus the product of pressure and volume, H = U + PV -- reflecting that P-V changes tend to affect both work done on and by a system and its internal energy, e.g. as pressurised air escapes a tyre when the valve is opened, it cools noticeably. T is Absolute Temperature and S is entropy. Also, "the thermodynamic potential that is minimized when a system reaches chemical equilibrium at constant pressure and temperature. Its derivative with respect to the reaction coordinate of the system vanishes at the equilibrium point. As such, a reduction in G is a necessary condition for the spontaneity of processes at constant pressure and temperature." That is, this embeds the entropy increase as a system moves to thermodynamic equilibrium.
. . . and the other is the amount of available information (this part sets an upper bound on the extra work that can be extracted from information).
And,
[T]he results may also lead to practical applications, which the researchers plan to investigate in the future. “Both nanotechnology and living systems operate at scales where the interplay between thermal noise and information processing is significant,” Pak said. “One may think about engineered systems where information is used to control molecular processes and drive them in the right direction. One possibility is to create hybrids of biological systems and engineered ones, even in the living cell.”
PS2: Notice, again Harry S Robertson in Statistical Thermophysics:
“the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . . “ [pp. vii – viii.]
Notice, thermodynamics as it has been developed pivots on ignorance of the specifics of microstate, so this raises huge questions as to what happens if info systems are able to act at molecular level, in effect converting some such entities into organised, observable entities. Which is exactly what is happening in life.

In turn that points to: where does such information come from, and at what entropy cost elsewhere? Likewise, what is the overall cost to bring that information to bear on the target system, i.e. is there an exporting of entropy to the rest of the world? In the case above, yes, that is obvious.

Think, too, about the ways in which life systems routinely create organised concentration gradients across membranes [and semiconductor entities also, which are now down to the sort of scale of interest]. Notice how, in both cases, there is a lot of drawing on energy drivers and storage, through things like ATP [thus the ATP synthase enzyme] and power supplies.

PS3: This brings us back to a point I long since clipped from Wiki for my always linked briefing note, on information and entropy:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
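As a small numerical illustration of two points in that clip, the k_B T ln 2 per-bit conversion factor and the sheer scale mismatch between thermodynamic and data-compression entropies (a back-of-envelope sketch; the latent-heat figure is a textbook value used only for illustration):

```python
import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K

# Landauer/Szilard conversion factor: energy per bit at temperature T
T = 300.0                          # K, room temperature (assumed)
per_bit = k_B * T * math.log(2)
print(f"k_B*T*ln2 at {T:.0f} K = {per_bit:.2e} J per bit")

# Scale comparison: entropy change of melting 1 g of ice, re-expressed as the
# Shannon information needed to pin down the extra microstate uncertainty.
latent_heat = 334.0                # J per gram of ice (textbook value)
T_melt = 273.15                    # K
dS = latent_heat / T_melt          # thermodynamic entropy change, J/K
bits = dS / (k_B * math.log(2))    # equivalent number of yes/no questions
print(f"dS = {dS:.2f} J/K  ~  {bits:.2e} bits  (~{bits / 8 / 1e21:.0f} zettabytes)")
```

So a gram of melting ice corresponds to on the order of 10^23 bits of "missing" microstate information, which is indeed "right off the scale" of anything in ordinary data processing.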
Food for thought.
kairosfocus
January 20, 2018 at 11:45 PM PDT
Of supplemental note: Even many Darwinists admit to surprisingly rapid recovery of life forms after mass extinction events in the deep past:
Terrestrial biodiversity recovered faster after Permo-Triassic extinction than previously believed - October 2011 Excerpt: While the cause of the mass extinction that occurred between the Permian and Triassic periods (250 million years ago) is still uncertain, two University of Rhode Island researchers collected data that show that terrestrial biodiversity recovered much faster than previously thought, potentially contradicting several theories for the cause of the extinction. ,,, David Fastovsky, URI professor of geosciences, and graduate student David Tarailo found that terrestrial biodiversity recovered in about 5 million years, compared to the 15- to 30-million year recovery period that earlier studies had estimated. http://www.physorg.com/news/2011-10-terrestrial-biodiversity-recovered-faster-permo-triassic.html
Of note: I believe 5 million years is very close to what is termed the 'geologic resolution time' for accurately dating these ancient time periods.
bornagain77
January 20, 2018 at 06:03 PM PDT
J-Mac, as I understand it, the number of 'billions' is controversial. Moreover, bacteria, on any count, make up by far the largest percentage. I believe that the evidence unequivocally shows, when looking at the whole geologic and/or fossil record instead of just a snapshot, that diversification of species 'within kind' upon the earth has been wrought by a top-down process of genetic entropy, and certainly has not been wrought by bottom-up Darwinian processes.
bornagain77
January 20, 2018 at 05:12 PM PDT
@BA77 How do you explain the variety of life forms reaching beyond billions after Noah's Flood? Some say it reaches a trillion...
J-Mac
January 20, 2018 at 04:15 PM PDT
@BA77 How do you explain the variety of life forms after Noah's Flood?
J-Mac
January 20, 2018 at 04:13 PM PDT
KF, as pointed out in this video,,,
Darwinian Materialism vs. Quantum Biology – video https://youtu.be/LHdD2Am1g5Y
The 'randomness' of biological systems is, contrary to Darwinian presuppositions, for all practical purposes non-existent. This lack of randomness, which Darwinists fully expected to see but which is not there, is directly attributable to the 'quantum information' that is constraining biological life to be so far out of thermodynamic equilibrium. A few excerpted notes from my referenced video to that effect:
Jim Al-Khalili, at the 2:30 minute mark of the following video states, ",,and Physicists and Chemists have had a long time to try and get use to it (Quantum Mechanics). Biologists, on the other hand have got off lightly in my view. They are very happy with their balls and sticks models of molecules. The balls are the atoms. The sticks are the bonds between the atoms. And when they can't build them physically in the lab nowadays they have very powerful computers that will simulate a huge molecule.,, It doesn't really require much in the way of quantum mechanics in the way to explain it." At the 6:52 minute mark of the video, Jim Al-Khalili goes on to state: “To paraphrase, (Erwin Schrödinger in his book “What Is Life”), he says at the molecular level living organisms have a certain order. A structure to them that’s very different from the random thermodynamic jostling of atoms and molecules in inanimate matter of the same complexity. In fact, living matter seems to behave in its order and its structure just like inanimate matter cooled down to near absolute zero. Where quantum effects play a very important role. There is something special about the structure, about the order, inside a living cell. So Schrodinger speculated that maybe quantum mechanics plays a role in life”. Jim Al-Khalili – Quantum biology – video https://www.youtube.com/watch?v=zOzCkeTPR3Q
And in confirmation of Al-Khalili, and Erwin Schrodinger's, contention that life acts like inanimate matter cooled down to near absolute zero, in the following experiment it was found that protein molecules do indeed act like inanimate matter cooled down to near absolute zero.
Quantum coherent-like state observed in a biological protein for the first time - October 13, 2015 Excerpt: If you take certain atoms and make them almost as cold as they possibly can be, the atoms will fuse into a collective low-energy quantum state called a Bose-Einstein condensate. In 1968 physicist Herbert Fröhlich predicted that a similar process at a much higher temperature could concentrate all of the vibrational energy in a biological protein into its lowest-frequency vibrational mode. Now scientists in Sweden and Germany have the first experimental evidence of such so-called Fröhlich condensation (in proteins).,,, The real-world support for Fröhlich's theory took so long to obtain because of the technical challenges of the experiment, Katona said. https://phys.org/news/2015-10-quantum-coherent-like-state-biological-protein.html
And the following paper also confirmed Erwin Schrödinger's contention that life is based on quantum mechanical principles:
Proteins ‘ring like bells’ - June 2014 Excerpt: As far back as 1948, Erwin Schrödinger—the inventor of modern quantum mechanics—published the book “What is life?” In it, he suggested that quantum mechanics and coherent ringing might be at the basis of all biochemical reactions. At the time, this idea never found wide acceptance because it was generally assumed that vibrations in protein molecules would be too rapidly damped. Now, scientists at the University of Glasgow have proven he was on the right track after all. Using modern laser spectroscopy, the scientists have been able to measure the vibrational spectrum of the enzyme lysozyme, ,,, The experiments show that the ringing motion lasts for only a picosecond or one millionth of a millionth of a second. Biochemical reactions take place on a picosecond timescale and,,, (are) optimised enzymes to ring for just the right amount of time. Any shorter, and biochemical reactions would become inefficient as energy is drained from the system too quickly. Any longer and the enzyme would simple oscillate forever: react, unreact, react, unreact, etc. The picosecond ringing time is just perfect for the most efficient reaction.,,, Klaas Wynne, Chair in Chemical Physics at the University of Glasgow said: “This research shows us that proteins have mechanical properties that are highly unexpected and geared towards maximising efficiency.”,,, http://www.gla.ac.uk/news/headline_334344_en.html
The video I referenced goes into more detail, but the rub is that quantum information, which is conserved, and is not reducible to any possible material explanation (i.e. non-locality), is the physically real entity, separate from matter and energy, that is constraining the molecules of biological systems to be so far out of thermodynamic equilibrium.
bornagain77
January 20, 2018 at 10:28 AM PDT
BA77, of course, if you can use information on observable state to extract work, you are in a different regime from dealing with a black box that has to be treated as random. KF
kairosfocus
January 20, 2018 at 09:51 AM PDT
KF, I posted your linked reference on Facebook, to which a friend commented:
"To my simple mind information is an abstract concept... How does one get work out of an abstraction? What am I missing?"
to which I replied:
"What am I missing?" What you are 'missing' is the fact that immaterial information is a physically real entity that has a 'thermodynamic content'. A physically real entity that, although capable of interacting with matter and energy, is separate from, and irreducible to, matter and energy:
As I did in post 5, I also referenced these videos for him
Information is physical (but not how Rolf Landauer meant) – video https://youtu.be/H35I83y5Uro Darwinian Materialism vs. Quantum Biology – video https://youtu.be/LHdD2Am1g5Y Darwinism vs Biological Form – video https://www.youtube.com/watch?v=JyNzNPgjM4w
and this following video is also relevant to the whole 'immaterial information is physically real' line of evidence:
Darwinian Evolution vs Mathematics - video https://www.youtube.com/watch?v=q3gyx70BHvA
bornagain77
January 20, 2018 at 08:50 AM PDT
Great stuff, KF.
tribune7
January 20, 2018 at 06:11 AM PDT
The finding that 'information is physical' and that information has a 'thermodynamic content' directly falsifies the reductive materialistic (i.e. Darwinian) presupposition that information is merely 'emergent' from a material basis.
Information is physical (but not how Rolf Landauer meant) - video https://youtu.be/H35I83y5Uro Demonic device converts information to energy – 2010 Excerpt: “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform Maxwell’s demon demonstration (knowledge of a particle’s position) turns information into energy – November 2010 Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the “Maxwell demon” thought experiment devised in 1867.,,, In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a “spiral-staircase-like” potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html Information: From Maxwell’s demon to Landauer's eraser - Lutz and Ciliberto - Oct. 25, 2015 - Physics Today Excerpt: The above examples of gedanken-turned-real experiments provide a firm empirical foundation for the physics of information and tangible evidence of the intimate connection between information and energy. They have been followed by additional experiments and simulations along similar lines.12 (See, for example, Physics Today, August 2014, page 60.) Collectively, that body of experimental work further demonstrates the equivalence of information and thermodynamic entropies at thermal equilibrium.,,, (2008) Sagawa and Ueda’s (theoretical) result extends the second law to explicitly incorporate information; it shows that information, entropy, and energy should be treated on equal footings. http://www.johnboccio.com/research/quantum/notes/Information.pdf J. Parrondo, J. Horowitz, and T. Sagawa. Thermodynamics of information. Nature Physics, 11:131-139, 2015. Matter, energy… knowledge: - May 11, 2016 Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos? https://www.newscientist.com/article/mg23030730-200-demon-no-more-physics-most-elusive-entity-gives-up-its-secret/ New Scientist astounds: Information is physical – May 13, 2016 Excerpt: Recently came the most startling demonstration yet: a tiny machine powered purely by information, which chilled metal through the power of its knowledge. 
This seemingly magical device could put us on the road to new, more efficient nanoscale machines, a better understanding of the workings of life, and a more complete picture of perhaps our most fundamental theory of the physical world. https://uncommondescent.com/news/new-scientist-astounds-information-is-physical/ The Quantum Thermodynamics Revolution – May 2017 Excerpt: the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.” In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory — “a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology — a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year — is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply. They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information — the abstract 1s and 0s by which physical states are distinguished and knowledge is measured.,,, Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”,,, https://www.quantamagazine.org/quantum-thermodynamics-revolution/
Put simply, Darwinists, with their reductive materialistic framework, a framework which denies the physical reality of immaterial information, are not even on the correct theoretical foundation to properly understand biological organisms in the first place.
Darwinian Materialism vs. Quantum Biology - video https://youtu.be/LHdD2Am1g5Y Darwinism vs Biological Form - video https://www.youtube.com/watch?v=JyNzNPgjM4w
Verse:
John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. He was with God in the beginning. Through him all things were made; without him nothing was made that has been made. In him was life, and that life was the light of all mankind.
bornagain77
January 20, 2018 at 04:32 AM PDT
GP, indeed. KF
kairosfocus
January 20, 2018 at 03:20 AM PDT
F/N: On waking back up I thought, h'mm, the Carnot reference may be a bit obscure so I put in "standard" second law [of thermodynamics]. Notice, this is discussing a generalisation of that law that incorporates information. Issues of info sources and propagation etc come into play. Not to mention, metrics. KF
kairosfocus
January 20, 2018 at 03:19 AM PDT
KF: Very interesting. Thank you! :)
gpuccio
January 20, 2018 at 02:53 AM PDT
A Maxwell Demon engine in action beyond the Carnot limit -- implications for ID debates.
kairosfocus
January 19, 2018 at 11:46 PM PDT
