Uncommon Descent Serving The Intelligent Design Community

The Second Law: In Force Everywhere But Nowhere?


I hope our materialist friends will help us with this one.

As I understand their argument, entropy is not an obstacle to blind watchmaker evolution, because entropy applies absolutely only in a “closed system,” and the earth is not a closed system because it receives electromagnetic radiation from space.

Fair enough. But it seems to me that under that definition of “closed system” only the universe as a whole is a closed system, because every particular place in the universe receives energy of some kind from some other place. And if that is so, it seems the materialists have painted themselves into a corner in which they must, to remain logically consistent, assert that entropy applies everywhere but no place in particular, which is absurd.

Now this seems like an obvious objection, and if it were valid the “closed system/open system” argument would have never gained any traction to begin with. So I hope someone will clue me in as to what I am missing.

Comments
Then, a little bit later, I learned that the delayed choice experiment had been extended:
The Experiment That Debunked Materialism – video – (delayed choice quantum eraser) http://www.youtube.com/watch?v=6xKUass7G8w (Double Slit) A Delayed Choice Quantum Eraser – updated 2007 Excerpt: Upon accessing the information gathered by the Coincidence Circuit, we the observer are shocked to learn that the pattern shown by the positions registered at D0 (Detector Zero) at Time 2 depends entirely on the information gathered later at Time 4 and available to us at the conclusion of the experiment. http://www.bottomlayer.com/bottom/kim-scully/kim-scully-web.htm
And then I learned the delayed choice experiment was refined yet again:
“If we attempt to attribute an objective meaning to the quantum state of a single system, curious paradoxes appear: quantum effects mimic not only instantaneous action-at-a-distance but also, as seen here, influence of future actions on past events, even after these events have been irrevocably recorded.” Asher Peres, Delayed choice for entanglement swapping. J. Mod. Opt. 47, 139-143 (2000). Quantum physics mimics spooky action into the past – April 23, 2012 Excerpt: The authors experimentally realized a “Gedankenexperiment” called “delayed-choice entanglement swapping”, formulated by Asher Peres in the year 2000. Two pairs of entangled photons are produced, and one photon from each pair is sent to a party called Victor. Of the two remaining photons, one photon is sent to the party Alice and one is sent to the party Bob. Victor can now choose between two kinds of measurements. If he decides to measure his two photons in a way such that they are forced to be in an entangled state, then also Alice’s and Bob’s photon pair becomes entangled. If Victor chooses to measure his particles individually, Alice’s and Bob’s photon pair ends up in a separable state. Modern quantum optics technology allowed the team to delay Victor’s choice and measurement with respect to the measurements which Alice and Bob perform on their photons. “We found that whether Alice’s and Bob’s photons are entangled and show quantum correlations or are separable and show classical correlations can be decided after they have been measured”, explains Xiao-song Ma, lead author of the study. According to the famous words of Albert Einstein, the effects of quantum entanglement appear as “spooky action at a distance”. The recent experiment has gone one remarkable step further. “Within a naïve classical world view, quantum mechanics can even mimic an influence of future actions on past events”, says Anton Zeilinger. http://phys.org/news/2012-04-quantum-physics-mimics-spooky-action.html
i.e. The preceding experiment clearly shows, and removes any doubt whatsoever, that the ‘material’ detector recording information in the double slit is secondary to the experiment, and that a conscious observer being able to consciously know the ‘which path’ information of a photon with local certainty is of primary importance in the experiment. A more complete explanation of the startling results of the experiment is given at the 9:11 minute mark of the following video:
Delayed Choice Quantum Eraser Experiment Explained – 2014 video http://www.youtube.com/watch?v=H6HLjpj4Nt4
And then, after the delayed choice experiments, I learned about something called Leggett’s Inequality. Leggett’s Inequality is, as far as I can tell, a mathematical inequality devised by Nobelist Anthony Leggett in an attempt to establish ‘realism’. Realism is the belief that an objective reality exists independently of a conscious observer looking at it. And, as usual when the predictions of Quantum Mechanics are challenged, the inequality was violated experimentally, by a stunning 80 orders of magnitude, thus once again, in an over-the-top fashion, highlighting the central importance of the conscious observer to quantum experiments:
A team of physicists in Vienna has devised experiments that may answer one of the enduring riddles of science: Do we create the world just by looking at it? – 2008 Excerpt: In mid-2007 Fedrizzi found that the new realism model was violated by 80 orders of magnitude; the group was even more assured that quantum mechanics was correct. Leggett agrees with Zeilinger that realism is wrong in quantum mechanics, but when I asked him whether he now believes in the theory, he answered only “no” before demurring, “I’m in a small minority with that point of view and I wouldn’t stake my life on it.” For Leggett there are still enough loopholes to disbelieve. I asked him what could finally change his mind about quantum mechanics. Without hesitation, he said sending humans into space as detectors to test the theory.,,, (to which Anton Zeilinger responded) When I mentioned this to Prof. Zeilinger he said, “That will happen someday. There is no doubt in my mind. It is just a question of technology.” Alessandro Fedrizzi had already shown me a prototype of a realism experiment he is hoping to send up in a satellite. It’s a heavy, metallic slab the size of a dinner plate. http://seedmagazine.com/content/article/the_reality_tests/P3/ Alain Aspect and Anton Zeilinger by Richard Conn Henry – Physics Professor – John Hopkins University Excerpt: Why do people cling with such ferocity to belief in a mind-independent reality? It is surely because if there is no such reality, then ultimately (as far as we can know) mind alone exists. And if mind is not a product of real matter, but rather is the creator of the “illusion” of material reality (which has, in fact, despite the materialists, been known to be the case, since the discovery of quantum mechanics in 1925), then a theistic view of our existence becomes the only rational alternative to solipsism (solipsism is the philosophical idea that only one’s own mind is sure to exist). (Dr. Henry’s referenced experiment and paper – “An experimental test of non-local realism” by S. Gröblacher et. al., Nature 446, 871, April 2007 – “To be or not to be local” by Alain Aspect, Nature 446, 866, April 2007 (Leggett’s Inequality: Verified to 80 orders of magnitude) http://henry.pha.jhu.edu/aspect.html
The following video gets across the general, and dramatic, point of what ‘giving up realism’ actually means:
Quantum Physics – (material reality does not exist until we look at it) – Dr. Quantum video http://www.youtube.com/watch?v=D1ezNvpFcJU
Further work confirms the initial results for Leggett's Inequality:
Macrorealism Emerging from Quantum Physics – Brukner, Caslav; Kofler, Johannes American Physical Society, APS March Meeting, – March 5-9, 2007 Excerpt: for unrestricted measurement accuracy a violation of macrorealism (i.e., a violation of the Leggett-Garg inequalities) is possible for arbitrary large systems.,, http://adsabs.harvard.edu/abs/2007APS..MARB33005B
But, as if all that was not enough to demonstrate consciousness’s centrality in quantum mechanics, I then learned about something called the ‘Quantum Zeno Effect’,,
Quantum Zeno Effect The quantum Zeno effect is,, an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect “It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics.
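For readers who want to see the arithmetic behind that claim, here is a minimal numeric sketch in Python (a toy model with hypothetical values, not taken from the sources above): for short times a quantum survival probability falls off roughly quadratically, P(t) ≈ 1 - (t/τ)², so n evenly spaced measurements over a fixed interval T leave a total survival probability of about [1 - (T/(nτ))²]^n, which climbs toward 1 as n grows.

import math

def survival_after_n_measurements(total_time, zeno_time, n):
    # Toy Zeno model: quadratic short-time decay P(t) ~ 1 - (t/tau)^2,
    # with the clock reset by each of n evenly spaced projective measurements.
    p_between = 1.0 - (total_time / (n * zeno_time)) ** 2
    return p_between ** n

T, tau = 1.0, 0.5  # hypothetical total time and 'Zeno time', arbitrary units
for n in (10, 100, 10_000):
    print(n, survival_after_n_measurements(T, tau, n))
# prints roughly 0.66, 0.96, 0.9996: ever more frequent observation pushes survival toward 1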
bornagain77
July 20, 2014 at 06:02 AM PDT
kf, as far as quantum mechanics is concerned, the effect of conscious observation on atoms and molecules is not so easily dismissed, and it can only be ignored on pain of an infinite number of many-worlds kairosfocuses all agreeing and disagreeing with me, i.e. on pain of hyper-irrationality! That consciousness is integral to quantum mechanics is fairly obvious to the unbiased observer (no pun intended). Like everybody else, I was at first shocked to learn that the observer could have any effect whatsoever in the double slit experiment:
Quantum Mechanics – Double Slit and Delayed Choice Experiments – video https://vimeo.com/87175892 Dr. Quantum – Double Slit Experiment – video https://www.youtube.com/watch?v=Q1YqgPAtzho Double Slit Experiment – Explained By Prof Anton Zeilinger – video http://www.metacafe.com/watch/6101627/ Quantum Mechanics – Double Slit Experiment. Is anything real? (Prof. Anton Zeilinger) – video http://www.youtube.com/watch?v=ayvbKafw2g0
Prof. Zeilinger makes this rather startling statement in the preceding video:
“The path taken by the photon is not an element of reality. We are not allowed to talk about the photon passing through this or this slit. Neither are we allowed to say the photon passes through both slits. All this kind of language is not applicable.” Anton Zeilinger
I think Feynman, who had high regard for the double slit experiment, sums the ‘paradox’ up extremely well,
…the “paradox” is only a conflict between reality and your feeling of what reality “ought to be.” Richard Feynman, in The Feynman Lectures on Physics, vol III, p. 18-9 (1965)
Dean Radin, who spent years at Princeton testing different aspects of consciousness (with positive results towards consciousness having causality), recently performed experiments testing the possible role of consciousness in the double slit. His results were, not so surprisingly, very supportive of consciousness’s central role in the experiment:
Consciousness and the double-slit interference pattern: six experiments – Radin – 2012 Abstract: A double-slit optical system was used to test the possible role of consciousness in the collapse of the quantum wavefunction. The ratio of the interference pattern’s double-slit spectral power to its single-slit spectral power was predicted to decrease when attention was focused toward the double slit as compared to away from it. Each test session consisted of 40 counterbalanced attention-toward and attention-away epochs, where each epoch lasted between 15 and 30 s (seconds). Data contributed by 137 people in six experiments, involving a total of 250 test sessions, indicate that on average the spectral ratio decreased as predicted (z = -4.36, p = 6·10^-6). Another 250 control sessions conducted without observers present tested hardware, software, and analytical procedures for potential artifacts; none were identified (z = 0.43, p = 0.67). Variables including temperature, vibration, and signal drift were also tested, and no spurious influences were identified. By contrast, factors associated with consciousness, such as meditation experience, electrocortical markers of focused attention, and psychological factors including openness and absorption, significantly correlated in predicted ways with perturbations in the double-slit interference pattern. The results appear to be consistent with a consciousness-related interpretation of the quantum measurement problem. http://www.deanradin.com/papers/Physics%20Essays%20Radin%20final.pdf
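As a quick arithmetic check on the statistics quoted above (using only the z values from the abstract; the choice of one-sided versus two-sided tails is an assumption, not stated in the excerpt), the normal-distribution tail areas can be reproduced in a couple of lines of Python:

import math

def upper_tail(z):
    # Upper-tail probability of a standard normal deviate: Q(z) = 0.5 * erfc(z / sqrt(2))
    return 0.5 * math.erfc(z / math.sqrt(2))

print(upper_tail(4.36))      # ~6.5e-06, close to the reported p = 6*10^-6 (one-sided)
print(2 * upper_tail(0.43))  # ~0.67, matching the reported control p = 0.67 (two-sided)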
Of course, atheists/materialists were/are in complete denial as to the obvious implications of mind in the double slit (invoking infinite parallel universes and such absurdity as that to try to get around the obvious implications of ‘Mind’). But personally, not being imprisoned in the materialist’s self-imposed box, my curiosity was aroused and I’ve been sort of poking around, finding out a little more here and there about quantum mechanics, and how the observer is central to it. One of the first interesting experiments in quantum mechanics I found after the double slit, one that highlighted the centrality of the observer to the experiment, was Wigner’s Quantum Symmetries. Here is Wigner commenting on the key experiment that led him to his Nobel Prize-winning work on quantum symmetries,,,
Eugene Wigner Excerpt: When I returned to Berlin, the excellent crystallographer Weissenberg asked me to study: why is it that in a crystal the atoms like to sit in a symmetry plane or symmetry axis. After a short time of thinking I understood:,,,, To express this basic experience in a more direct way: the world does not have a privileged center, there is no absolute rest, preferred direction, unique origin of calendar time, even left and right seem to be rather symmetric. The interference of electrons, photons, neutrons has indicated that the state of a particle can be described by a vector possessing a certain number of components. As the observer is replaced by another observer (working elsewhere, looking at a different direction, using another clock, perhaps being left-handed), the state of the very same particle is described by another vector, obtained from the previous vector by multiplying it with a matrix. This matrix transfers from one observer to another. http://www.reak.bme.hu/Wigner_Course/WignerBio/wb1.htm
Wigner went on to make these rather dramatic comments in regards to his work:
“It was not possible to formulate the laws (of quantum theory) in a fully consistent way without reference to consciousness.” Eugene Wigner (1902 -1995) from his collection of essays “Symmetries and Reflections – Scientific Essays”; Eugene Wigner laid the foundation for the theory of symmetries in quantum mechanics, for which he received the Nobel Prize in Physics in 1963. “It will remain remarkable, in whatever way our future concepts may develop, that the very study of the external world led to the scientific conclusion that the content of the consciousness is the ultimate universal reality” - Eugene Wigner – (Remarks on the Mind-Body Question, Eugene Wigner, in Wheeler and Zurek, p.169) 1961
Also of note:
Von Neumann–Wigner – interpretation Excerpt: The von Neumann–Wigner interpretation, also described as “consciousness causes collapse [of the wave function]“, is an interpretation of quantum mechanics in which consciousness is postulated to be necessary for the completion of the process of quantum measurement. http://en.wikipedia.org/wiki/Von_Neumann%E2%80%93Wigner_interpretation#The_interpretation “I think von Neumann’s orthodox QM gives a good way to understand the nature of the universe: it is tightly tied to the practical test and uses of our basic physical theory, while also accounting for the details of the mind-brain connection in a way that is rationally concordant with both our conscious experiences, and experience of control, and the neuroscience data.” Henry Stapp
Then, after I had learned about Wigner’s Quantum Symmetries, I stumbled across Wheeler’s Delayed Choice experiments, whose findings shocked me as to the central importance of the observer’s free-will choice in quantum experiments:
Alain Aspect speaks on John Wheeler’s Delayed Choice Experiment – video http://vimeo.com/38508798 “Thus one decides the photon shall have come by one route or by both routes after it has already done its travel” John A. Wheeler Wheeler’s Classic Delayed Choice Experiment: Excerpt: Now, for many billions of years the photon is in transit in region 3. Yet we can choose (many billions of years later) which experimental set up to employ – the single wide-focus, or the two narrowly focused instruments. We have chosen whether to know which side of the galaxy the photon passed by (by choosing whether to use the two-telescope set up or not, which are the instruments that would give us the information about which side of the galaxy the photon passed). We have delayed this choice until a time long after the particles “have passed by one side of the galaxy, or the other side of the galaxy, or both sides of the galaxy,” so to speak. Yet, it seems paradoxically that our later choice of whether to obtain this information determines which side of the galaxy the light passed, so to speak, billions of years ago. So it seems that time has nothing to do with effects of quantum mechanics. And, indeed, the original thought experiment was not based on any analysis of how particles evolve and behave over time – it was based on the mathematics. This is what the mathematics predicted for a result, and this is exactly the result obtained in the laboratory. http://www.bottomlayer.com/bottom/basic_delayed_choice.htm Genesis, Quantum Physics and Reality Excerpt: Simply put, an experiment on Earth can be made in such a way that it determines if one photon comes along either on the right or the left side or if it comes (as a wave) along both sides of the gravitational lens (of the galaxy) at the same time. However, how could the photons have known billions of years ago that someday there would be an earth with inhabitants on it, making just this experiment? ,,, This is big trouble for the multi-universe theory and for the “hidden-variables” approach. - per Greer “It begins to look as we ourselves, by our last minute decision, have an influence on what a photon will do when it has already accomplished most of its doing… we have to say that we ourselves have an undeniable part in what we have always called the past. The past is not really the past until is has been registered. Or to put it another way, the past has no meaning or existence unless it exists as a record in the present.” – John Wheeler – The Ghost In The Atom – Page 66-68
bornagain77
July 20, 2014 at 06:00 AM PDT
kf, I profoundly disagree with your opinion.
bornagain77
July 20, 2014 at 05:22 AM PDT
BA77: It is not conscious observation that does the freezing; rather, if something is in a constrained macrostate it has less entropy than in a less constrained one, i.e. fewer ways for mass and energy to be arranged consistent with the state. E.g. consider free expansion: Case A: || ***** | || Remove the internal barrier for the gas: Case B: || * * * * * || In B there is a rise in entropy, as the micro-particles have greater freedom consistent with the macro constraints (which are observable). The constraint comes first; the lack of info on the specific microstate stems from that. Our observation of the macrostate is just the usual way we do science at lab scale. In this case we observe two volumes, one much more confining. There is no necessary profound entanglement of the observer and the observation with the system state that makes a material difference. (This is not a quantum concept, though it can be extended to such -- Gibbs was pre-quantum.) KF
kairosfocus
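To put rough numbers on that free-expansion picture (illustrative values only, not from the comment itself): since the number of positional microstates available to N independent particles scales as V^N, removing the barrier and doubling the volume gives ΔS = k_B ln(W_B/W_A) = N k_B ln 2, which a few lines of Python turn into lab units:

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

def delta_S_free_expansion(n_particles, volume_ratio):
    # Ideal-gas free expansion: W scales as V^N, so
    # dS = k_B * ln(W2/W1) = N * k_B * ln(V2/V1)
    return n_particles * K_B * math.log(volume_ratio)

print(delta_S_free_expansion(N_A, 2.0))  # ~5.76 J/K for one mole doubling its volume (= R ln 2)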
July 20, 2014 at 04:53 AM PDT
kairosfocus, but why should conscious observation put a freeze on entropic decay from a mathematical point of view? i.e. since, as far as I can tell, math cannot possibly causally explain this effect from an entropic point of view, then why does the quantum Zeno effect even happen, unless mind/consciousness precedes both the entropy of the universe and the math that describes that entropy?
bornagain77
July 20, 2014 at 04:31 AM PDT
F/N: I want to note that this is the 45th anniversary of the first Moon landing, to the day and date. KF
kairosfocus
July 20, 2014 at 04:26 AM PDT
BA77: When, based on the macro-level state, we have a tightly constrained distribution of mass and energy at the micro level, that means there is low freedom to be in various states, hence low entropy. A living cell is in a highly constrained state, and is a macro-level observable in this sense. Prick it and allow its contents to diffuse via Brownian motion etc., and there is much less constraint. There is effectively zero likelihood of such a pricked cell spontaneously reassembling, on the gamut of time and resources of the observable cosmos across its lifespan. And BTW, this is also a strong argument against spontaneous OOL. KF
kairosfocus
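The "effectively zero likelihood" can be made concrete with a toy positional estimate (the numbers are hypothetical and chosen only for scale; real re-assembly would also demand the right bonds and configurations, making the odds far worse): if the diffused contents end up in a droplet a million times the cell's volume, the chance that N independently wandering molecules are all found back inside a cell-sized region at the same moment is (10^-6)^N.

import math

def log10_prob_all_back(n_molecules, volume_ratio):
    # log10 of the probability that n independently diffusing molecules all sit,
    # at one instant, inside a sub-volume 1/volume_ratio of the accessible volume
    return -n_molecules * math.log10(volume_ratio)

print(log10_prob_all_back(1e9, 1e6))  # -6e9: a probability of order 10^(-6,000,000,000)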
July 20, 2014 at 04:25 AM PDT
Notice how this work tries to get information for free for life:
One theme seems to characterize life: It is able to convert the thermodynamic information contained in food or in sunlight into complex and statistically unlikely configurations of matter. A flood of information-containing free-energy reaches the earth's biosphere in the form of sunlight. Passing through the metabolic pathways of living organisms, this information keeps the organisms far away from thermodynamic equilibrium (which is death). As the thermodynamic information flows through the biosphere, much of it is degraded into heat, but part is converted into cybernetic information and preserved in the intricate structures which are characteristic of life. The principle of natural selection ensures that as this happens, the configurations of matter in living organisms constantly increase in complexity, refinement and statistical improbability. This is the process which we call evolution, or in the case of human society, progress. [Information Theory and Evolution, 2nd Edn. John Scales Avery (World Scientific, Singapore, 2012). p.97]
If we look carefully, we will see that the error here is to imagine that the organisation needed to process energy and information can be acquired for free. The OOL threshold is vast, and the threshold to originate onward body plans is vaster yet. Absent organised coupling and configuring mechanisms that provide the required highly functionally specific, complex work to make the structures and arrangements of cell-based life, input energy will just go into randomising, aka heating up. In Maxwell's analysis of his demon, the demon got the information and the weightless door to gate the passage of molecules free of energy and information cost. But in the physical world, there will need to be specific, coupled mechanisms for the demon to do the work, and those will exact their own requisites to design, build and operate them. The same obtains in the wider case of OOL and the origin of body plans. But there is a widespread blindness to that, thus the sort of verbal 'hey presto, bang, it's there' we just saw. KF
kairosfocus
July 20, 2014 at 04:14 AM PDT
F/N: This may make useful reading also. KF
kairosfocus
July 20, 2014 at 03:50 AM PDT
kairosfocus per 79:
Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox). Boiling down, and with a dash of this clip from Jaynes courtesy HSR: “. . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly ‘objective’ quantity . . . it is a function of [those variables] and does not depend on anybody’s personality. There is no reason why it cannot be measured in the laboratory.” . . . that is, the entropy of a body is the suitably adjusted metric of the average missing info to specify its microstate, given only its macroscopic state definition on temp, pressure etc. https://uncommondescent.com/intelligent-design/the-second-law-in-force-everywhere-but-nowhere/#comment-507913
As to being measured in the laboratory, this has now been accomplished:
Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy - November 2010 Excerpt: Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html Demonic device converts information to energy - 2010 Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform "It is CSI that enables Maxwell's demon to outsmart a thermodynamic system tending toward thermal equilibrium" William Dembski, Intelligent Design, pg. 159
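For scale, the energy that one bit of positional information is "worth" in such an experiment is set by k_B·T·ln 2, the same quantity that appears in Landauer's bound; a quick Python calculation (room temperature assumed) shows why such conversions are only measurable in single-particle, nano-scale setups:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy_per_bit(temperature_kelvin):
    # Minimum energy associated with erasing (or maximum extractable from) one bit: k_B * T * ln 2
    return K_B * temperature_kelvin * math.log(2)

print(landauer_energy_per_bit(300.0))  # ~2.87e-21 J per bit at about room temperature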
I also like how the 'conscious observer' is brought in, in your quote, kairosfocus:
The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities
But my question for you, kairosfocus, is: what is the mathematical explanation for why the entropic decay of an unstable particle 'freezes' when a person has 'knowledge of its microstate'?
Quantum Zeno effect Excerpt: The quantum Zeno effect is,,, an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect “It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics.
kairosfocus, to me, being a novice, the implications for 'consciousness as fundamental (Planck)' seem fairly obvious, but I would be much interested in your opinion on the matter if it is not too much trouble.
bornagain77
July 20, 2014 at 02:45 AM PDT
Salvador:
The probabilities likewise involved in OOL are not the same probabilities associated with Shannon entropy nor thermodynamic entropy which only deals with energy microstates or position and momentum microstates, not things like the functional state of a system!
Sure. We'll just take your word for it. You being an authority in the probabilities involved in OOL, and an authority in the probabilities associated with Shannon entropy, and an authority in the probabilities associated with thermodynamic entropy. Are you admitting that these are all probabilistic? Salvador:
... not things like the functional state of a system!
You just don't get it, do you? Is it that functional states are probabilistically unlikely?
Mung
July 20, 2014 at 01:11 AM PDT
Salvador:
Shannon information is not the same as the information most important to OOL. There is for example prescriptive information, algorithmic information, specified information, etc.
You're just pontificating. These are just assertions, not an argument. Even if we grant the existence of prescriptive information, algorithmic information, specified information, etc., it does not follow that these are not subsumed under Shannon Information or that they are "more important" to OOL than Shannon Information. Salvador:
Shannon information is not the same as the information most important to OOL... So the information argument isn’t the same as the thermodynamic argument if one is discussing forms of information beyond Shannon. But that subtlety seems to have escaped you.
Even if we grant that Shannon Information is not the same as the information most important to OOL, it does not follow that the information argument isn’t the same as the thermodynamic argument. That's just a non-sequitur. Also, I don't recall saying anything about the thermodynamic argument. Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Salvador:
So the information argument isn’t the same as the thermodynamic argument if one is discussing forms of information beyond Shannon. But that subtlety seems to have escaped you.
Oh, but the information argument is the same as the thermodynamic argument if one isn’t discussing forms of information beyond Shannon. But that subtlety seems to have escaped you.
Mung
July 20, 2014 at 12:53 AM PDT
Salvador:
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL?
I thought I made it clear that there was no difference between the three. Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
You disagree. Please say why.
Mung
July 20, 2014 at 12:26 AM PDT
Salvador:
Would you care to correct my analysis of the relevant differential equations? Explain for the readers where I went materially wrong in my math?
Talk about false insinuations. I never claimed or even insinuated that your math was wrong, so why this challenge to show how your math was wrong? You're confused.
Mung
July 20, 2014 at 12:12 AM PDT
Salvador:
I derived these relations and posted them here and elsewhere. Don’t make false insinuations about me of not understanding the relationship of Shannon information and thermodynamic entropy.
You've got to be kidding. What false insinuation? I plainly stated that you understood the relationship, I even quoted you:
When I [Salvador] related Shannon, Dembski, Boltzmann and Clausius notions of entropy, I thought I demonstrated that entropy must necessarily increase for CSI to increase.
So my question to you is, what is your problem? Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
You objected. Why? You acknowledge the relationships.
Mung
July 20, 2014 at 12:04 AM PDT
kairosfocus:
So, while I doubt Mung has done a serious stat mech course, his reading and intuition on solid state and vacuum physics tied to electronics and to info theory will be in the right ballpark.
Man, you've got that right! I've no idea why the US Navy thought I'd be a good fit for their nuclear power program, lol! As it turns out, due to a propensity to steal rather than purchase (at age 17/18) I was ejected from that program and ended up in electronics instead, and thus communications. And here I am decades later still in the communications field. The point is that entropy is a statistical/probabilistic concept, and thus uniquely suited to analysis according to information theory.
Mung
July 19, 2014 at 11:46 PM PDT
Additional reading material for Salvador: Principles of Statistical Mechanics: The Information Theory Approach. From the dustjacket:
In this book the author approaches statistical mechanics in the uniquely consistent and unified way made possible by information theory ... Information theory ... has proved to be very powerful in clarifying the concepts of statistical mechanics and in uniting its various branches to one another and to the rest of physics.
From the Preface:
The material presented here reflects the work of many authors over a long period of time. ... (I should, however, mention E.T. Jaynes, who has pioneered the information theory approach to statistical mechanics.)
H.T. to kairosfocus, who also brought attention to Jaynes here at UD.
Mung
July 19, 2014 at 11:18 PM PDT
Ps 2: Further following on, and drawing on Robertson: ______________ >>Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more details, (pp. 3 - 6, 7, 36, cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Samura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously "plausible" primordial "soups." In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale. By many orders of magnitude, we don't get to even one molecule each of the required polymers per planet, much less bringing them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics, so let us now return to that focus; in particular addressing information in its functional sense, as the third step in this preliminary analysis. As the third major step, we now turn to information technology, communication systems and computers, which provides a vital clarifying side-light from another view on how complex, specified information functions in information processing systems: [In the context of computers] information is data -- i.e. digital representations of raw events, facts, numbers and letters, values of variables, etc. -- that have been put together in ways suitable for storing in special data structures [strings of characters, lists, tables, "trees" etc], and for processing and output in ways that are useful [i.e. functional]. . . . Information is distinguished from [a] data: raw events, signals, states etc represented digitally, and [b] knowledge: information that has been so verified that we can reasonably be warranted, in believing it to be true. [GEM, UWI FD12A Sci Med and Tech in Society Tutorial Note 7a, Nov 2005.] That is, we have now made a step beyond mere capacity to carry or convey information, to the function fulfilled by meaningful -- intelligible, difference making -- strings of symbols. In effect, we here introduce into the concept, "information," the meaningfulness, functionality (and indeed, perhaps even purposefulness) of messages -- the fact that they make a difference to the operation and/or structure of systems using such messages, thus to outcomes; thence, to relative or absolute success or failure of information-using systems in given environments. And, such outcome-affecting functionality is of course the underlying reason/explanation for the use of information in systems. [Cf. the recent peer-reviewed, scientific discussions here, and here by Abel and Trevors, in the context of the molecular nanotechnology of life.] Let us note as well that since in general analogue signals can be digitised [i.e. 
by some form of analogue-digital conversion], the discussion thus far is quite general in force. So, taking these three main points together, we can now see how information is conceptually and quantitatively defined, how it can be measured in bits, and how it is used in information processing systems; i.e., how it becomes functional. In short, we can now understand that: Functionally Specific, Complex Information [FSCI] is a characteristic of complicated messages that function in systems to help them practically solve problems faced by the systems in their environments. Also, in cases where we directly and independently know the source of such FSCI (and its accompanying functional organisation) it is, as a general rule, created by purposeful, organising intelligent agents. So, on empirical observation based induction, FSCI is a reliable sign of such design, e.g. the text of this web page, and billions of others all across the Internet. (Those who object to this, therefore face the burden of showing empirically that such FSCI does in fact -- on observation -- arise from blind chance and/or mechanical necessity without intelligent direction, selection, intervention or purpose.) Indeed, this FSCI perspective lies at the foundation of information theory: (i) recognising signals as intentionally constructed messages transmitted in the face of the possibility of noise, (ii) where also, intelligently constructed signals have characteristics of purposeful specificity, controlled complexity and system- relevant functionality based on meaningful rules that distinguish them from meaningless noise; (iii) further noticing that signals exist in functioning generation- transfer and/or storage- destination systems that (iv) embrace co-ordinated transmitters, channels, receivers, sources and sinks. That this is broadly recognised as true, can be seen from a surprising source, Dawkins, who is reported to have said in his The Blind Watchmaker (1987), p. 8: Hitting upon the lucky number that opens the bank's safe [NB: cf. here the case in Brown's The Da Vinci Code] is the equivalent, in our analogy, of hurling scrap metal around at random and happening to assemble a Boeing 747. [NB: originally, this imagery is due to Sir Fred Hoyle, who used it to argue that life on earth bears characteristics that strongly suggest design. His suggestion: panspermia -- i.e. life drifted here, or else was planted here.] Of all the millions of unique and, with hindsight equally improbable, positions of the combination lock, only one opens the lock. Similarly, of all the millions of unique and, with hindsight equally improbable, arrangements of a heap of junk, only one (or very few) will fly. The uniqueness of the arrangement that flies, or that opens the safe, has nothing to do with hindsight. It is specified in advance. [Emphases and parenthetical note added, in tribute to the late Sir Fred Hoyle. (NB: This case also shows that we need not see boxes labelled "encoders/decoders" or "transmitters/receivers" and "channels" etc. for the model in Fig. 1 above to be applicable; i.e. the model is abstract rather than concrete: the critical issue is functional, complex information, not electronics.)] Here, we see how the significance of FSCI naturally appears in the context of considering the physically and logically possible but vastly improbable creation of a jumbo jet by chance. 
Instantly, we see that mere random chance acting in a context of blind natural forces is a most unlikely explanation, even though the statistical behaviour of matter under random forces cannot rule it strictly out. But it is so plainly vastly improbable, that, having seen the message -- a flyable jumbo jet -- we then make a fairly easy and highly confident inference to its most likely origin: i.e. it is an intelligently designed artifact. For, the a posteriori probability of its having originated by chance is obviously minimal -- which we can intuitively recognise, and can in principle quantify. FSCI is also an observable, measurable quantity; contrary to what is imagined, implied or asserted by many objectors. This may be most easily seen by using a quantity we are familiar with: functionally specific bits [FS bits], such as those that define the information on the screen you are most likely using to read this note . . . >> _________________ In short, the interconnexions are there.kairosfocus
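To exercise the H-to-S correspondence in the Robertson excerpt numerically, here is a minimal Python sketch of a canonical (Boltzmann) distribution over two hypothetical energy levels; the values are chosen purely for illustration:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def canonical_entropy(energies_joules, temperature_kelvin):
    # p_i = exp(-E_i/kT) / Z, with Z the partition function;
    # Gibbs/Shannon entropy in thermodynamic units: S = -k_B * sum(p_i * ln p_i)
    beta = 1.0 / (K_B * temperature_kelvin)
    weights = [math.exp(-beta * e) for e in energies_joules]
    Z = sum(weights)
    probs = [w / Z for w in weights]
    return -K_B * sum(p * math.log(p) for p in probs)

gap = K_B * 300.0  # hypothetical two-level system with a gap of one thermal quantum at 300 K
print(canonical_entropy([0.0, gap], 300.0))  # ~8.0e-24 J/K, a bit under k_B*ln 2 (the one-bit maximum)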
July 19, 2014 at 04:26 AM PDT
PS 1: I clip from my always linked note, Sec A: _______________ >> The second major step is to refine our thoughts, through discussing the communication theory definition of and its approach to measuring information. A good place to begin this is with British Communication theory expert F. R Connor, who gives us an excellent "definition by discussion" of what information is:
From a human point of view the word 'communication' conveys the idea of one person talking or writing to another in words or messages . . . through the use of words derived from an alphabet [NB: he here means, a "vocabulary" of possible signals]. Not all words are used all the time and this implies that there is a minimum number which could enable communication to be possible. In order to communicate, it is necessary to transfer information to another person, or more objectively, between men or machines. This naturally leads to the definition of the word 'information', and from a communication point of view it does not have its usual everyday meaning. Information is not what is actually in a message but what could constitute a message. The word could implies a statistical definition in that it involves some selection of the various possible messages. The important quantity is not the actual information content of the message but rather its possible information content. This is the quantitative definition of information and so it is measured in terms of the number of selections that could be made. Hartley was the first to suggest a logarithmic unit . . . and this is given in terms of a message probability. [p. 79, Signals, Edward Arnold. 1972. Bold emphasis added. Apart from the justly classical status of Connor's series, his classic work dating from before the ID controversy arose is deliberately cited, to give us an indisputably objective benchmark.]
To quantify the above definition of what is perhaps best descriptively termed information-carrying capacity, but has long been simply termed information (in the "Shannon sense" - never mind his disclaimers . . .), let us consider a source that emits symbols from a vocabulary: s1,s2, s3, . . . sn, with probabilities p1, p2, p3, . . . pn. That is, in a "typical" long string of symbols, of size M [say this web page], the average number that are some sj, J, will be such that the ratio J/M --> pj, and in the limit attains equality. We term pj the a priori -- before the fact -- probability of symbol sj. Then, when a receiver detects sj, the question arises as to whether this was sent. [That is, the mixing in of noise means that received messages are prone to misidentification.] If on average, sj will be detected correctly a fraction, dj of the time, the a posteriori -- after the fact -- probability of sj is by a similar calculation, dj. So, we now define the information content of symbol sj as, in effect how much it surprises us on average when it shows up in our receiver: I = log [dj/pj], in bits [if the log is base 2, log2] . . . Eqn 1 This immediately means that the question of receiving information arises AFTER an apparent symbol sj has been detected and decoded. That is, the issue of information inherently implies an inference to having received an intentional signal in the face of the possibility that noise could be present. Second, logs are used in the definition of I, as they give an additive property: for, the amount of information in independent signals, si + sj, using the above definition, is such that: I total = Ii + Ij . . . Eqn 2 For example, assume that dj for the moment is 1, i.e. we have a noiseless channel so what is transmitted is just what is received. Then, the information in sj is: I = log [1/pj] = - log pj . . . Eqn 3 This case illustrates the additive property as well, assuming that symbols si and sj are independent. That means that the probability of receiving both messages is the product of the probability of the individual messages (pi *pj); so: Itot = log1/(pi *pj) = [-log pi] + [-log pj] = Ii + Ij . . . Eqn 4 So if there are two symbols, say 1 and 0, and each has probability 0.5, then for each, I is - log [1/2], on a base of 2, which is 1 bit. (If the symbols were not equiprobable, the less probable binary digit-state would convey more than, and the more probable, less than, one bit of information. Moving over to English text, we can easily see that E is as a rule far more probable than X, and that Q is most often followed by U. So, X conveys more information than E, and U conveys very little, though it is useful as redundancy, which gives us a chance to catch errors and fix them: if we see "wueen" it is most likely to have been "queen.") Further to this, we may average the information per symbol in the communication system thusly (giving in termns of -H to make the additive relationships clearer): - H = p1 log p1 + p2 log p2 + . . . + pn log pn or, H = - SUM [pi log pi] . . . Eqn 5 H, the average information per symbol transmitted [usually, measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.] 
Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) . . . >> _______________ This gets us to an information expression that is closely tied to the thermodynamics situation.kairosfocus
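To make Eqns 1 - 5 in the clip above concrete, a few lines of Python suffice (the letter probabilities are rough illustrative figures, not a careful English corpus):

import math

def surprisal_bits(p):
    # Information of one received symbol over a noiseless channel: I = -log2(p)
    return -math.log2(p)

def entropy_bits_per_symbol(probs):
    # Average information per symbol: H = -sum(p_i * log2 p_i)
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(surprisal_bits(0.127))                # 'E' at roughly 12.7% frequency -> about 3.0 bits
print(surprisal_bits(0.0015))               # 'X' at roughly 0.15% frequency -> about 9.4 bits
print(entropy_bits_per_symbol([0.5, 0.5]))  # two equiprobable symbols -> exactly 1 bit/symbol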
July 19, 2014 at 04:18 AM PDT
SalC (attn Mung): I suggest you should take a look at Harry S Robertson's Statistical Thermophysics, esp Ch 1. (L K Nash's classic Elements of Statistical Thermodynamics is a very good intro, too. I strongly suggest this Chemist's look at the area for newbies. I have never seen a better introductory presentation done by a Physicist. I wish I had gone over to the Chemistry section of the uni bookshop or library in my undergrad days . . . Mandl was and is a pig, and Sears et al improves on the reading with the years, similar to Milman-Halkias' Integrated Electronics. There are 1st time thru books and there are nth time through books. My u/grad rule of thumb was, expect to do three readings to get the point in most physics books of that era. I think that's because late occurring points are really needed to understand earlier ones. H'mm, maybe that's why I came to favour spiral curricula that start from a compressed overview and guided tour of key themes . . . esp. based on a key case study, then elaborate major aspect by aspect until a capstone integrative at a higher level can be done. Thus, too, a view on my foundationalist - coherentist -- elegant balance approach generally: what is the hard core things are built up from? In the case of the design inference that is abductive inference on induction and analysis to explain the causal origin of functionally specific, complex information. The only empirically grounded, analytically reasonable such, is design. But when you are up against entrenched ideology and indoctrination, that is going to be an uphill charge into the teeth of MG fire. You need tanks closely backed by infantry to take such on.) I think we would all find the following from Wiki [clipped Apr 2011 fr the Informational Entropy (i.e. Shannon Info . . . avg info per symbol in a message and message context) art] helpful (understanding that the informational approach to statistical thermodynamics has been controversial since Brillouin and Jaynes but has latterly been increasingly seen as having a serious point):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Boiling this down, and adding a dash of this clip from Jaynes, courtesy of Harry S. Robertson (HSR):
". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory."
. . . that is, the entropy of a body is a suitably scaled metric of the average missing information needed to specify its microstate, given only its macroscopic state definition in terms of temperature, pressure, etc. And of course a 1 mm cube of solid easily holds over 10^18 particles at the 1 nm scale [for a gas, more like 10^15 or so]. As a consequence of this connexion, we must understand that information is tied to the logarithm of probability measures on the one hand, and to entropy in the thermodynamic sense on the other, once the relevant multiplicative factors and units, such as the Boltzmann constant in J/K, are factored in. They are inextricably intertwined, in short. So, while I doubt Mung has done a serious stat mech course, his reading and intuition on solid state and vacuum physics, tied to electronics and to info theory, will be in the right ballpark. All that stuff about Fermi levels, thermionic emission and the photoelectric effect does tend to tie together over time.

That said, in an information era the easiest break-in point is probably information itself, starting with the idea of a 2-state switch storing one bit. Thence, a 4-state string position in a D/RNA molecule stores 2 bits raw (adjusted for relative frequency as relevant across G, C, A, T/U). The information is in effect stored in prong height, much as with the key of a classic Yale-type lock. (And BTW, that prong-height approach is what von Neumann put forward in his 1948 self-replicating machine thought exercise.) We can then point to the way proteins are synthesised, using a simplified model, drawings and videos, such as Vuk Nikolic's wonderful simulation. That can be correlated with the adapted form of the Shannon communication model: source -> encoder -> transmitter -> CHANNEL -> receiver -> decoder -> sink. This helps us see how many co-ordinated components are needed for a communication framework to work.

Onward, proteins, including enzymes, are the workhorse molecules of the cell. They rest on chained amino acid strings controlled by D/RNA codes and algorithms. To work properly they must fold into 3-D shapes partly controlled by the AA sequence and often partly assisted by chaperoning molecular machines; prions, after all, are misfolded, more stable but destructive structures. Thousands of proteins are routinely involved in the operation of the cell, and they form a coherent whole based on the right parts in the right place at the right time. There is even a cellular post office and a transport network that uses walking tractors. The integrated, functionally specific complexity of this easily overwhelms the blind chance and mechanical necessity resources of our observed cosmos, being far, far beyond 1,000 bits. (A toy illustration of the raw bit counting is sketched just below.) The only empirically warranted, analytically plausible explanation of the required FSCO/I is design.

This puts design at the table of discussion for origins science, right from the root of the tree of life, and there is therefore no reason to exclude it from discussion there or thereafter across the branches pointing to major body plans. Ideological a prioris rooted in evolutionary materialism and imposed on science and science education do not count as reasons. A similar point can be made regarding the fine tuning of physics to build a cosmos that supports the kind of C-chemistry, aqueous medium, gated metabolism, molecular nanomachine, code- and algorithm-using, cell-based life we see and experience. Ideological game over.
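A toy Python sketch of the raw bit arithmetic just mentioned (my own illustration: the function names are mine, the sequence is made up rather than a real gene, and a serious analysis would adjust for base frequencies and functional constraints):

```python
import math
from collections import Counter

def raw_bits(num_bases):
    """Raw storage capacity of a D/RNA string: 4 states per position = 2 bits per base."""
    return 2 * num_bases

def shannon_bits_per_base(sequence):
    """Average Shannon information per base, adjusted for observed base frequencies."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

FSCOI_THRESHOLD = 1000  # bits, the threshold figure used in the comment above

seq = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"  # made-up example string
print(raw_bits(len(seq)), "raw bits; exceeds threshold:", raw_bits(len(seq)) > FSCOI_THRESHOLD)
print(round(shannon_bits_per_base(seq), 3), "bits per base from observed frequencies")
```

On the 2-bits-per-base reckoning, any coding string longer than 500 bases already carries more than 1,000 raw bits, which is where the threshold talk comes from.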
Thing is, all of this is riddled with information and thermodynamics issues, tied onward to the likelihood that blind chance and mechanical necessity could successfully search vast configuration spaces. While the thermodynamics and information issues are not on the surface, they are just beneath it, and they come up as objector talking points are trotted out (apart from well-poisoning attacks). So I suggest the best answer is to acknowledge the formal connexions and the reasons behind them, but to emphasise what is easiest to see. KF

kairosfocus
July 19, 2014 at 04:07 AM PDT
Mung,
We shall see that the concept of entropy, central to information theory as treated here, has at least a formal equivalence with the entropy of thermodynamics. (Brillouin, 1956. Jaynes, 1959).
I derived these relations and posted them here and elsewhere. Don't make false insinuations that I do not understand the relationship between Shannon information and thermodynamic entropy. For example, on April 2, 2014 I provided a derivation relating information measures to thermodynamic entropy: Clausius, Boltzmann, Dembski. All you can do is quote; I can actually do the derivation. Do you think you could do the same? Would you care to correct my analysis of the relevant differential equations? Explain for the readers where I went materially wrong in my math? :-)
So what’s so controversial about that statement?
What argument are you talking about? The information argument about OOL, the probability argument about OOL, or the thermodynamic entropy argument about OOL? Shannon information is not the same as the information most important to OOL; there is, for example, prescriptive information, algorithmic information, specified information, etc. So the information argument isn't the same as the thermodynamic argument if one is discussing forms of information beyond Shannon. But that subtlety seems to have escaped you. Likewise, the probabilities involved in OOL are not the same as the probabilities associated with Shannon entropy or thermodynamic entropy, which deal only with energy microstates or position-and-momentum microstates, not with things like the functional state of a system! Feel free to do a derivation showing that thermodynamic entropy (as a college student in physics, engineering, or chemistry would derive it) necessarily indicates whether an object is functional. Suppose one object has entropy on the order of 10^7 J/K and another has only 10^1 J/K. Explain for the reader which object is functional and why. So what's the probability that the object with 10^7 J/K entropy is functional versus the object with 10^1 J/K entropy? After all, you claimed:
there is no difference between the entropy argument and the information argument and the probability argument.
So, based on the entropy score alone, what's the probability that the object with 10^7 J/K entropy is functional or not functional? (The arithmetic of the conversion you would need is sketched just below.) I'm giving you a chance to defend your confused claim. You won't be able to, because you're wrong.
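To make the conversion concrete, here is a minimal Python sketch (my own illustration, using the 10^7 J/K and 10^1 J/K figures above; the function name is mine). It turns a thermodynamic entropy into a Shannon bit count via S = N k_B ln 2, and that into a microstate "probability" 2^-N:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(entropy_j_per_k):
    """Shannon bits implied by a thermodynamic entropy S, via S = N * k_B * ln(2)."""
    return entropy_j_per_k / (K_B * math.log(2))

for s in (1e7, 1e1):  # the two entropy figures discussed above, in J/K
    n_bits = entropy_to_bits(s)
    # The chance of picking one specific microstate out of 2^N equiprobable ones is
    # 2**(-N), far too small to print directly, so report its base-10 exponent instead.
    log10_p = -n_bits * math.log10(2)
    print(f"S = {s:g} J/K  ->  {n_bits:.3g} bits  ->  p ~ 10^({log10_p:.3g})")
```

Both conversions are routine, and neither tells you whether the object in question is alive, functional, or a lump of rock, which is exactly the point.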
there is no difference between the entropy argument and the information argument and the probability argument.
So what's the probability that something with approximately 10^7 J/K entropy is functional? Given your claim, you should be able to take 10^7 J/K and convert it into a probability of whether it is functional or not, whether it is living or not. Go ahead, Mung. :roll: I could convert that number into Shannon bits if I wanted. I could convert it into a probability. Think you can do the same? :-) HINT: I did a comparable derivation in the link! Do you think that once you've converted that number into a probability, it is the same as the probability of functionality? HINT: NO! Why? The thermodynamic probabilities are associated with the energy microstates, which say little or nothing about functionality. But since you insist:
there is no difference between the entropy argument and the information argument and the probability argument.
By all means, show the reader you can actually put into practice what you claim. :roll:

scordova
July 19, 2014 at 02:04 AM PDT
More reading material for Salvador: Information Theory and Coding
We shall see that the concept of entropy, central to information theory as treated here, has at least a formal equivalence with the entropy of thermodynamics. (Brillouin, 1956. Jaynes, 1959).
Mung
July 19, 2014 at 12:30 AM PDT
Macroscopic thermodynamics is here reexamined from the perspective of atomic-molecular theory. The thermodynamic concepts of entropy and equilibrium are thus invested with new meaning and implication... - Leonard K. Nash
Chapter 1 introduces and develops a statistical analysis of the concept of distribution - culminating in a very simple derivation of the Boltzmann distribution law and a demonstration of the relation of statistical and thermodynamic entropies. - Leonard K. Nash
Compared with the first edition, the present second edition offers in its opening chapter an analysis that is both much simpler and more decisive. This chapter also provides a brief but convincing demonstration of one crucial point merely announced in the first edition, namely: the enormous odds by which some distributions are favored over others. - Leonard K. Nash
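For readers new to the material, the kind of result Nash's opening chapter builds toward can be stated compactly (a standard textbook sketch in LaTeX, not a quotation from Nash):

```latex
% Most probable ("Boltzmann") distribution over energy levels E_i at temperature T:
p_i = \frac{e^{-E_i / k_B T}}{Z}, \qquad Z = \sum_j e^{-E_j / k_B T}
% Statistical entropy, which is then related to the thermodynamic quantity:
S = -k_B \sum_i p_i \ln p_i
% (equal to k_B \ln W when all W accessible microstates are equiprobable).
```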
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
So what’s so controversial about that statement?

Mung
July 18, 2014 at 11:13 PM PDT
Reading material for Salvador:

Mathematical Foundations of Information Theory. The first section of this book consists of "The Entropy Concept in Probability Theory."

Mathematical Foundations of Statistical Mechanics. Chapter 4 in this book is "Reduction to the Problem of the Theory of Probability."

Elements of Statistical Thermodynamics. Don't miss the section starting on page 33, "The Concept of Entropy."

Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
So what's so controversial about that statement?

Mung
July 18, 2014 at 10:39 PM PDT
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Salvador:
Your assertion is confused. What entropy are you talking about? What probability are you talking about? Until you specify these parameters, your statement is only valid for special cases and not true in general.
My assertion is not confused. You admitted in your post @8 that it is correct and that you agree with it. Here's what you wrote:
When I related Shannon, Dembski, Boltzmann and Clausius notions of entropy, I thought I demonstrated that entropy must necessarily increase for CSI to increase.
How did you manage to relate Shannon, Dembski, Boltzmann and Clausius notions of entropy?
What entropy are you talking about? What probability are you talking about? Until you specify these parameters, your statement is only valid for special cases and not true in general.
Now you're contradicting yourself and showing that you are the one that is confused. My argument is that all three of the above are generalizable, so asking me to explain which entropy or which probability is just missing the point.
Until you specify these parameters, your statement is only valid for special cases and not true in general.
And that's just absurd. Weaver:
The quantity which uniquely meets the natural requirements that one sets up for "information" turns out to be exactly that which is known in thermodynamics as entropy.
Mung
July 18, 2014 at 09:55 PM PDT
Henry, you don't realize how an incubator works. It starts pre-loaded. Trust me, unless you seed it with life, life won't come.

revelator
July 16, 2014 at 02:11 PM PDT
F/N: And Mapou should recognise that we are dealing with conceptual systems in math, and so will have to take the infinite seriously and positively. KF

kairosfocus
July 16, 2014 at 03:53 AM PDT
SalC: I suggest, as can be seen, that there are fundamental connexions between information and thermodynamics concerns in the context of FSCO/I. While, as with mathematics, it is not a simple matter to explore such connexions, we do need to understand them in the face of a three- or four-way debate. Further, I suggest we should accept that there is a difference between the ABC basics for those getting a first exposure and the foundational but sometimes abstruse matters. However, the simplified picture of the connexions (and visuals help) is also important, because on a matter that has been subjected to misrepresentation, exploited equivocation, polarisation, wedge tactics and the like, there is an unhealthy emphasis among objectors on posing difficulties as attack points meant to discredit. The recent tactic of twisting the meaning and context of squares and circles, in order to suggest that it is not a reasonably plain point that a square circle is an impossible being, is a case in point. Addressing that required me, reluctantly, to go back to the empty-set cascade to get the reals [which, recall, involves place-value numbers being infinite series, in order to get the continuum], inject complex numbers to specify an orthogonal axis and thus a plane, then, on the basis of that definite plane, define squares and circles functionally and show from this the fundamental incompatibility. All along, simple common sense and respect for context would have been enough. And of course a year ago there was a related issue on whether it is self-evidently axiomatic that parallel lines (which implies a plane) never meet. That one required using the plane to specify equations of the form y = mx + c, with m the same and the c values different (a one-line demonstration is sketched below). The point is, ABC basics can be approached with common good sense where there is good will, but when toxic hyperskeptical agendas break in, fundamental analysis may need to be deployed, even at the 101 level. KF

kairosfocus
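For completeness, the parallel-lines point reduces to one line of algebra (a sketch in the notation above):

```latex
% Two lines in the plane with the same slope m but different intercepts, c_1 \neq c_2:
% if they met at some x, then
m x + c_1 = m x + c_2 \;\Rightarrow\; c_1 = c_2,
% contradicting c_1 \neq c_2; hence the lines share no point, i.e. they never meet.
```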
July 16, 2014 at 03:15 AM PDT
PPS: It looks like some redefinitions of the SI system are in prospect; here is a clip from the draft SI brochure:
Since the establishment of the SI in 1960, extraordinary advances have been made in relating SI units to truly invariant quantities such as the fundamental constants of physics and the properties of atoms. Recognising the importance of linking SI units to such invariant quantities, the XXth CGPM, in 20XX, adopted new definitions of the kilogram, ampere, kelvin, and mole in terms of fixed numerical values of the Planck constant h, elementary charge e, Boltzmann constant k, and Avogadro constant N_A, respectively.
Somehow, I preferred it the other way around. KF

kairosfocus
July 16, 2014 at 12:10 AM PDT
PS: Wiki on Boltzmann's expression, here. (I recall an incident when this first came up in a discussion, where an objector to design theory at another site assumed I did not know anything about what the expression meant, which speaks volumes about the assumptions of too many objectors, along the lines of Dawkins' prejudice-laced taunt that you must be ignorant, stupid, insane or wicked.)

kairosfocus
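For reference, the expression in question, written in modern notation:

```latex
S = k_B \ln W
% S: thermodynamic entropy; k_B: Boltzmann's constant; W: the number of microstates
% consistent with the given macrostate (Boltzmann's "thermodynamic probability").
```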
July 16, 2014 at 12:02 AM PDT