Uncommon Descent Serving The Intelligent Design Community

The Second Law: In Force Everywhere But Nowhere?


I hope our materialist friends will help us with this one.

As I understand their argument, entropy is not an obstacle to blind watchmaker evolution, because entropy applies absolutely only in a “closed system,” and the earth is not a closed system because it receives electromagnetic radiation from space.

Fair enough. But it seems to me that under that definition of “closed system” only the universe as a whole is a closed system, because every particular place in the universe receives energy of some kind from some other place. And if that is so, it seems the materialists have painted themselves into a corner in which they must, to remain logically consistent, assert that entropy applies everywhere but no place in particular, which is absurd.

Now this seems like an obvious objection, and if it were valid the “closed system/open system” argument would have never gained any traction to begin with. So I hope someone will clue me in as to what I am missing.

Comments
Mung,

So what probability are you talking about? Are you talking about the thermodynamic probability of a thermodynamic microstate, or about the probability of life? What probability do you think your quotes are talking about? The probability of an energy microstate, of life, or of something else, or can't you tell the difference between the two? :roll: So spell it out. What probability are you talking about:

A. the probability of an object being alive
B. the probability of finding an object in a particular energy microstate
C. the probability of something else
D. you don't know what probability you're talking about

If your answer is "C", then specify the probability you are talking about. Why are you avoiding answering a simple question? If you say "thermodynamic probability", you had better define what you think that means, because it should mean the probability of some "thing" being true, and what is that "thing"? HINT: thermodynamic probability usually means the probability that a particular microstate is found. But if that is the case, it's generally insufficient to say anything about something being alive.

scordova
July 25, 2014 at 09:48 PM PDT
Re-posting here, given Salvador's propensity for deleting my posts in any thread he has control over:
The laws of information had already solved the paradoxes of thermodynamics; in fact, information theory consumed thermodynamics. The problems in thermodynamics can be solved by recognizing that thermodynamics is, in truth, a special case of information theory. - Decoding the Universe, p. 87
Mung
July 25, 2014 at 09:38 PM PDT
Re-posting here, given Salvador's propensity for deleting my posts in any thread he has control over: Gordon Davisson:
Either way, you'll get the same result you would've from Boltzmann's formula...Either way, it doesn't affect the physics at all -- they're just different ways of looking at the same old familiar entropy.
Mung
July 25, 2014 at 09:37 PM PDT
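Gordon Davisson's point quoted above, that the Gibbs/Shannon and Boltzmann forms are just different ways of looking at the same entropy, can be checked numerically. A minimal Python sketch (the 1024-microstate system is an arbitrary illustration, not anything from the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def gibbs_entropy(probs):
    """Gibbs form: S = -k * sum(p * ln p) over microstate probabilities."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

def shannon_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For W equally likely microstates the Gibbs form reduces to Boltzmann's
# S = k * ln(W), and the thermodynamic and Shannon entropies differ only
# by the constant factor k * ln(2).
W = 1024
uniform = [1.0 / W] * W
assert math.isclose(gibbs_entropy(uniform), k_B * math.log(W))
assert math.isclose(shannon_bits(uniform), 10.0)  # log2(1024) = 10 bits
assert math.isclose(gibbs_entropy(uniform),
                    shannon_bits(uniform) * k_B * math.log(2))
```

The equivalence is purely formal: it relates two descriptions of the same probability distribution, which is consistent with what both sides of the thread say about the formulas themselves.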
there is no difference between the entropy argument and the information argument and the probability argument.
So what probability are you talking about? Are you talking about the thermodynamic probability of a thermodynamic microstate, or about the probability of life? If you're talking about the probability of a thermodynamic energy microstate, that is irrelevant to the probability of life. Your statement is meaningless at best, irrelevant to the question of life at worst.

scordova
July 25, 2014 at 09:25 PM PDT
Salvador:
For the reader’s benefit. The 2nd law can be stated in terms of entropy. As Gordon pointed out an alternative form of the 2nd law is:
For the reader's benefit:
The term entropy and its denotation by the letter S were introduced by Rudolf Julius Emanuel Clausius (1822-1888) in 1864 in his work on the foundations of classical thermodynamics, where he formulated the second law of thermodynamics in the form that the entropy of a closed system cannot decrease. Later, the concept of thermodynamic entropy was clarified and grounded by Ludwig Eduard Boltzmann (1844-1906), the famous formula for thermodynamic entropy is: S = k * ln W (3.2.1) where W is the thermodynamic probability that the system is in the state with the entropy S. - Theory Of Information: Fundamentality, Diversity and Unification
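The Boltzmann relation quoted here, S = k ln W, is simple enough to evaluate directly. Note that the "thermodynamic probability" W is a count of microstates, not a probability between 0 and 1. A minimal sketch (the microstate count below is an arbitrary illustrative value):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k * ln(W) for a system with W equally probable microstates."""
    return k_B * math.log(W)

def microstate_count(S):
    """Invert the relation: W = exp(S / k). Overflows for macroscopic S,
    where W is on the order of 10**(10**23)."""
    return math.exp(S / k_B)

S = boltzmann_entropy(1e6)  # a toy system with 10^6 microstates
print(S)  # ~1.9e-22 J/K
```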
Mung:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Mung
July 25, 2014 at 09:14 PM PDT
Mung, I think no one understands how to interpret what you said. So what probability are you talking about when you wrote this:
there is no difference between the entropy argument and the information argument and the probability argument.
You're only being asked to clarify your own statement, because as it stands, it's meaningless and irrelevant to the probability of life.

scordova
July 25, 2014 at 08:44 PM PDT
I opened another of my books on information theory and find the following quote:
It is impossible to defeat an ignorant man by argument. - William McAdoo
Mung
July 25, 2014 at 08:44 PM PDT
Salvador, pardon, but there's that little matter of your misrepresentations of what I claimed that needs to be cleared up.

Mung
July 25, 2014 at 07:54 PM PDT
SalC: Pardon, cf 79 on above. KF

kairosfocus
July 24, 2014 at 06:47 PM PDT
Mung, What probability are you talking about when you wrote this:
there is no difference between the entropy argument and the information argument and the probability argument.
Is it the probability of winning a lottery, the probability of Rubio being the next president, the probability of a coin being heads, the probability of life? Your statement is meaningless without specifying the probability you were talking about.

scordova
July 24, 2014 at 06:04 PM PDT
Salvador, Since you claim to be able to calculate the entropy of the origin of life, please do so for the benefit of our interested readers. Since you claim to be able to calculate the information of the origin of life, please do so for the benefit of our interested readers. Once you have done this, I will gladly calculate the probability of the origin of life. Salvador:
We were talking probability of life weren’t we?
Were you? Where? Salvador:
The 2nd law of Thermodynamics is not a obstacle to mindless OOL any more than 2nd law of motion by Newton.
It follows then that entropy is not an obstacle to mindless OOL. Salvador:
For the reader’s benefit. The 2nd law can be stated in terms of entropy.
Yes, most of us already knew this. In fact, I made the connection early in this thread. But thanks anyways. If the 2LOT is no obstacle, then neither is entropy. Right? Mung:
There is no difference between the entropy argument and the information argument and the probability argument.
It also follows that, according to you, the probability argument and the information argument are likewise no obstacle to OOL. So in a single post you manage to undermine all of ID, and you don't even know it. Yet you claim to be an ID'ist. This is a consistent pattern you've displayed for years. And for pointing it out I've been "banned" from your threads. But you can't ban me from this thread, so bring it on Salvador. Show us what you have when you can't control the debate by deleting dissenting opinion.

Mung
July 24, 2014 at 05:55 PM PDT
Mung wrote: there is no difference between the entropy argument and the information argument and the probability argument.
We were talking about the probability of life, weren't we? You can bail out by saying: "No, I wasn't talking about the probability of life." If that's the way you backtrack, then what probability are you talking about?

scordova
July 24, 2014 at 02:20 PM PDT
Salvador:
How about you show the readers that one can derive a probability number for life based on thermodynamic entropy. It would be a service to our readers if you did.
How about you show our readers where I claimed that I or anyone else could do so. Salvador:
You claim you can relate the thermodynamic entropy and the probability of the system being alive.
I ask again, where did I make that claim?

Mung
July 24, 2014 at 12:35 PM PDT
For the reader's benefit. The 2nd law can be stated in terms of entropy. As Gordon pointed out an alternative form of the 2nd law is:
* The entropy of an isolated system cannot decrease.
Here is a followup discussion of entropy and Shannon information: https://uncommondescent.com/physics/shannon-information-entropy-uncertainty-in-thermodynamics-and-id/

scordova
July 23, 2014 at 08:28 PM PDT
Mung, How about you show the readers that one can derive a probability number for life based on thermodynamic entropy. It would be a service to our readers if you did. If such a probability can't be derived based on thermodynamic entropy, then say so. State what you believe.

scordova
July 23, 2014 at 08:19 PM PDT
Salvador:
Suppose the thermodynamic entropy of an object is 50 bits (Shannon). Tell the readers what the probability is the object is alive.
What's really sad about this is that you seem to think this is a serious question that ought to be taken seriously and that other readers of this thread ought to agree with you. Or perhaps it's just another of your red herrings, like this one:
Think you are up to the calculation? Most college science students who study thermodynamic entropy would be able to answer that question. If you can’t, why are you giving people reading suggestions about thermodynamic entropy since you can’t even do a basic entropy calculation yourself?
Let's say, for the sake of argument, that I cannot perform even the most simple entropy calculation. So what? Your argument is that if I cannot perform a simple entropy calculation then none of the sources that I cite can either, and they can therefore be ignored as irrelevant, which is just absurd. Absurd, Salvador.
You claim you can relate the thermodynamic entropy and the probability of the system being alive.
Where did I make that claim?

Mung
July 23, 2014 at 08:04 PM PDT
Mung, Suppose the thermodynamic entropy of an object is 50 bits (Shannon). Tell the readers what the probability is that the object is alive. You claim you can relate the thermodynamic entropy and the probability of the system being alive. Go ahead. Demonstrate it. Do the calculation and defend your claim. This question is relevant to the discussion of the 2nd law and life. So how about it, Mung? If you can't relate a simply stated thermodynamic entropy expressed in Shannon bits to the probability that the system is a living system or not, then how do you expect readers to believe you? Can you do the calculation to prove your point? If you can't, say so. Are you capable of even doing the calculation? A simple yes or no will suffice.

scordova
July 22, 2014 at 09:47 PM PDT
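For what it's worth, the hypothetical "50 bits (Shannon)" in the comment above can at least be expressed in conventional thermodynamic units. This sketch does only the unit conversion; it says nothing about whether any object is alive:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def bits_to_joules_per_kelvin(bits):
    """Entropy stated in Shannon bits, converted to J/K: S = n * k * ln(2)."""
    return bits * k_B * math.log(2)

print(bits_to_joules_per_kelvin(50))  # ~4.8e-22 J/K
```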
Salvador @ 107:
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL?
Mung:
I thought I made it clear that there was no difference between the three.
Salvador:
You did no such thing.
So this exchange between the two of us was precipitated by an exchange you claim never took place. How do you explain the following? Mung @ 35:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
Mung
July 22, 2014 at 08:45 PM PDT
Salvador:
I related Shannon, Dembski, Boltzmann and Clausius notions of entropy
At first I took your word for it, but now I am beginning to doubt this. Salvador:
... why are you giving people reading suggestions about thermodynamic entropy since you can’t even do a basic entropy calculation yourself?
Which of the reading suggestions that I have offered do you disagree with, and why? Salvador:
...since you can’t even do a basic entropy calculation yourself...
Hilarious. You consistently delete my posts from any thread you author (continuing to the present day) and in the past have modified the content of my posts in threads you've authored to make it appear as if I had written something which I did not in fact write. It probably takes no stretch to imagine that you've just ignored any post in which I actually performed an entropy calculation. So perhaps you're not lying, but rather just willfully ignorant. In any event, this is just a red herring. You took umbrage at the following:
Salvador, there is no difference between the entropy argument and the information argument and the probability argument.
It's still a mystery why you took offense. I have cited numerous sources in support of my claim. So far all you have in rebuttal is the assertion that I can't do a simple entropy calculation, therefore I must be wrong.

Mung
July 22, 2014 at 08:37 PM PDT
Yet more reading for Salvador, since he still doesn't get it: A Farewell To Entropy: Statistical Thermodynamics Based on Information
...the aim of this book is to show that statistical thermodynamics will benefit from the replacement of the concept of entropy by "information" ...
From the book description on Amazon:
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the "driving force" of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy. It has been 140 years since Clausius coined the term "entropy"; almost 50 years since Shannon developed the mathematical theory of "information" -- subsequently renamed "entropy". In this book, the author advocates replacing "entropy" by "information", a term that has become widely used in many branches of science. The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, held until now by the term "entropy". The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes the "driving force" for which is analyzed in terms of information.
Mung
July 22, 2014 at 08:01 PM PDT
Mung, Liquid water has a standard molar entropy of 69.9 J/K per mole. You have a warm little pond of 1,000 gallons of water. Estimate the entropy amount in that warm little pond. Think you are up to the calculation? Most college science students who study thermodynamic entropy would be able to answer that question. If you can't, why are you giving people reading suggestions about thermodynamic entropy since you can't even do a basic entropy calculation yourself? But I expect you might find the answer on the internet. Suppose now you find that answer, translate your answer for that thermodynamic entropy into a probability for OOL in that warm little pond. :-) You claimed you should be able to, since you said there is no difference between thermodynamic entropy and OOL probability. :roll: Well, how about it Mung?

scordova
July 21, 2014 at 12:46 AM PDT
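The pond exercise above can be worked through directly. A sketch assuming pure liquid water at roughly room temperature with a density of 1 kg/L (both assumptions, since the comment specifies neither); the result lands on the order of 10^7 J/K, the same order of magnitude cited elsewhere in the thread:

```python
MOLAR_ENTROPY_WATER = 69.9   # J/(K*mol), standard molar entropy of liquid water
MOLAR_MASS_WATER = 18.015    # g/mol
GRAMS_PER_GALLON = 3.78541 * 1000.0  # liters per gallon times grams per liter

def pond_entropy(gallons):
    """Absolute (third-law) entropy of a pond of pure liquid water, in J/K."""
    moles = gallons * GRAMS_PER_GALLON / MOLAR_MASS_WATER
    return moles * MOLAR_ENTROPY_WATER

print(pond_entropy(1000))  # ~1.47e7 J/K for 1,000 gallons
```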
I asked:
What argument are you talking about? The information argument about OOL, the probability argument about OOL, the thermodynamic entropy argument about OOL?
Mung responded:
I thought I made it clear that there was no difference between the three.
You did no such thing. Suppose an object has thermodynamic entropy on the order of 10^7 J/K. Show for the reader how this amount of entropy translates into the probability of OOL. :roll: HINT: you can't, because they are generally unrelated; you cannot assert that the probability of OOL is the same as the probability related to thermodynamic entropy. Thermodynamic entropy deals with the probability of finding a system in a particular energy or position/momentum microstate; this can be equivalently described as the Shannon entropy for energy or position/momentum microstates. But not all possible Shannon entropies are necessarily thermodynamic entropies. You don't seem to understand that! The probability of seeing a particular energy or position/momentum microstate is not the same as the probability of OOL.
More reading suggestions for Salvador:...
Quit offering reading suggestions as if you really understand what you're talking about, because you don't. Take some grad-level statistical mechanics and thermodynamics before you start giving reading suggestions to those who have actually studied such material at the graduate level. :roll: You think:
there was no difference between the three (probability of OOL, thermodynamic entropy, information).
suggests you're the one who needs to read and comprehend. Take a given thermodynamic entropy level (not just 10^7 J/K) for an object and explain to the readers how that translates into a probability of OOL. If you can't make the derivation, then that shows you're just bloviating about stuff you have no clue about. So how about it, mung: if you're given the amount of thermodynamic entropy for a warm little pond, do you think you can state, based on the thermodynamic entropy alone, whether there is something alive in it? C'mon, mung, let's settle this once and for all. :-)

scordova
July 21, 2014 at 12:11 AM PDT
More reading suggestions for Salvador: Science and Information Theory
A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics.
Scientific Uncertainty, and Information

Mung
July 20, 2014 at 10:05 PM PDT
BA77, perhaps you should take a lesson from Salvador, who while constantly boasting of his four degrees still can't stand up to kf.

Mung
July 20, 2014 at 09:26 PM PDT
kairosfocus @102:
BA77, your privilege. I just invite you to ponder that Q-Mech is C20, but Stat Mech, per Gibbs (and Boltzmann) is C19. One of the simplest roads in is to think of boxes of bouncing marbles — quite classical, cf my App 1, the always linked. KF
That's one thing I love about the Amnon Katz text.
Professor Katz first sets up a general theory of statistical mechanics and then applies it to problems in and out of equilibrium. He carries out classical and quantum mechanical treatments in parallel and stresses the analogy between them.
Mung
July 20, 2014 at 09:22 PM PDT
kf, no disrespect, but your reply 'C20 C19 bouncing marbles etc' makes no sense whatsoever to me as to coherently explaining the quantum Zeno effect, nor any of the other quantum effects presented to you. Moreover, I can easily ask you, on the other hand, to ponder the irrationality that denying your 'mind' entails. If you refuse to budge from your position on this matter we are at a genuine impasse, since I find your position, (i.e. no causal effect for consciousness), to be genuinely incoherent. But so be it. I have too much respect for your tireless effort and work to gripe too much.

bornagain77
July 20, 2014 at 06:56 AM PDT
BA77, your privilege. I just invite you to ponder that Q-Mech is C20, but Stat Mech, per Gibbs (and Boltzmann) is C19. One of the simplest roads in is to think of boxes of bouncing marbles -- quite classical, cf my App 1, the always linked. KF

kairosfocus
July 20, 2014 at 06:37 AM PDT
Of related note as to tying up some 'philosophical loose ends': Christianity and Panentheism - (conflict or concordance?) - video https://www.youtube.com/watch?v=_xki03G_TO4&list=UU5qDet6sa6rODi7t6wfpg8g

bornagain77
July 20, 2014 at 06:28 AM PDT
Related notes on ‘interaction free’ measurement:
The Mental Universe – Richard Conn Henry – Professor of Physics John Hopkins University Excerpt: The only reality is mind and observations, but observations are not of things. To see the Universe as it really is, we must abandon our tendency to conceptualize observations as things.,,, Physicists shy away from the truth because the truth is so alien to everyday physics. A common way to evade the mental universe is to invoke “decoherence” – the notion that “the physical environment” is sufficient to create reality, independent of the human mind. Yet the idea that any irreversible act of amplification is necessary to collapse the wave function is known to be wrong: in “Renninger-type” experiments, the wave function is collapsed simply by your human mind seeing nothing. The universe is entirely mental,,,, The Universe is immaterial — mental and spiritual. Live, and enjoy. http://henry.pha.jhu.edu/The.mental.universe.pdf The Renninger Negative Result Experiment – video http://www.youtube.com/watch?v=C3uzSlh_CV0 Elitzur–Vaidman bomb tester Excerpt: In 1994, Anton Zeilinger, Paul Kwiat, Harald Weinfurter, and Thomas Herzog actually performed an equivalent of the above experiment, proving interaction-free measurements are indeed possible.[2] In 1996, Kwiat et al. devised a method, using a sequence of polarising devices, that efficiently increases the yield rate to a level arbitrarily close to one. http://en.wikipedia.org/wiki/Elitzur%E2%80%93Vaidman_bomb-testing_problem#Experiments Experimental Realization of Interaction-Free Measurement – Paul G. Kwiat; H. Weinfurter, T. Herzog, A. Zeilinger, and M. Kasevich – 1994 http://www.univie.ac.at/qfp/publications3/pdffiles/1994-08.pdf Interaction-Free Measurement – 1995 http://archive.is/AjexE Realization of an interaction-free measurement – 1996 http://bg.bilkent.edu.tr/jc/topics/Interaction%20free%20measurements/papers/realization%20of%20an%20interaction%20free%20measurement.pdf
Verse and Music:
Colossians 1:17 And he is before all things, and by him all things consist. Brooke Fraser- “C S Lewis Song” http://www.godtube.com/watch/?v=DL6LPLNX
Supplemental Notes:
The Galileo Affair and Life/Consciousness as the true “Center of the Universe” https://docs.google.com/document/d/1BHAcvrc913SgnPcDohwkPnN4kMJ9EDX-JJSkjc4AXmA/edit
Two very different eternities revealed by physics:
General Relativity, Special Relativity, Heaven and Hell https://docs.google.com/document/d/1_4cQ7MXq8bLkoFLYW0kq3Xq-Hkc3c7r-gTk0DYJQFSg/edit
bornagain77
July 20, 2014 at 06:05 AM PDT
The reason why I am very impressed with the Quantum Zeno effect as to establishing consciousness’s primacy in quantum mechanics is, for one thing, that Entropy is, by a wide margin, the most finely tuned of initial conditions of the Big Bang:
The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose Excerpt: “The time-asymmetry is fundamentally connected to with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the “source” of the Second Law (Entropy).” How special was the big bang? – Roger Penrose Excerpt: This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123. (from the Emperor’s New Mind, Penrose, pp 339-345 – 1989)
For another thing, it is interesting to note just how foundational entropy is in its explanatory power for actions within the space-time of the universe:
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
In fact, entropy is also the primary reason why our physical, temporal, bodies grow old and die,,,
Ageing Process – 85 years in 40 seconds – video http://www.youtube.com/watch?v=A91Fwf_sMhk *3 new mutations every time a cell divides in your body * Average cell of 15 year old has up to 6000 mutations *Average cell of 60 year old has 40,000 mutations Reproductive cells are ‘designed’ so that, early on in development, they are ‘set aside’ and thus they do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations still we find that,,, *60-175 mutations are passed on to each new generation. Per John Sanford Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
And yet, to repeat wikipedia,,,
Quantum Zeno effect Excerpt: The quantum Zeno effect is,,, an unstable particle, if observed continuously, will never decay. per wiki
This is just fascinating! Why in blue blazes should conscious observation put a freeze on entropic decay, unless consciousness was/is more foundational to reality than the 1 in 10^10^123 entropy is? Putting all the lines of evidence together, the argument for God from consciousness can now be framed like this:
1. Consciousness either preceded all of material reality or is a 'epi-phenomena' of material reality.
2. If consciousness is a 'epi-phenomena' of material reality then consciousness will be found to have no special position within material reality. Whereas conversely, if consciousness precedes material reality then consciousness will be found to have a special position within material reality.
3. Consciousness is found to have a special, even central, position within material reality.
4. Therefore, consciousness is found to precede material reality.

Four intersecting lines of experimental evidence from quantum mechanics that shows that consciousness precedes material reality (Wigner's Quantum Symmetries, Wheeler's Delayed Choice, Leggett's Inequalities, Quantum Zeno effect): https://docs.google.com/document/d/1G_Fi50ljF5w_XyJHfmSIZsOcPFhgoAZ3PRc_ktY8cFo/edit
bornagain77
July 20, 2014 at 06:04 AM PDT