
Thoughts on the Second Law


A couple of days ago Dr. Granville Sewell posted a video (essentially a summary of his 2013 Biocomplexity paper).  Unfortunately, he turned comments off (as usual), which prevents any discussion there, so I wanted to start a thread here in case anyone wants to discuss the issue.

Let me say a couple of things and then throw it open for comments.

1. I typically do not argue for design (or against the blind, undirected materialist creation story) by referencing the Second Law.  There is too much misunderstanding surrounding it, and discussions of the Second Law tend to generate more heat (pun intended) than light.  Dr. Sewell's experience demonstrates, I think, that it is an uphill battle to argue from the Second Law.

2. However, I agree with Dr. Sewell that many advocates of materialistic evolution have tried to support their case by arguing that the Earth is an open system, so I think his efforts to debunk that nonsense are worthwhile, and I applaud him for them.  Personally, I am astounded that he has had to spend so much time on the issue, as the idea of life arising and evolution proceeding because the Earth is an open system is so completely off the mark as to not even be worthy of much discussion.  Yet it raises its head from time to time.  Indeed, just two days ago on a thread here at UD, AVS made essentially this same argument.  Thus, despite having to wade into such preposterous territory, I appreciate Dr. Sewell valiantly pressing forward.

3. Further, whatever weaknesses the discussion of the Second Law may have, I believe Dr. Sewell makes a compelling case that the Second Law has been, and often is, understood in the field as relating to more than just thermal entropy.  He cites a number of examples and textbook treatments in which the Second Law is applied to broader categories of phenomena than just thermal flow, categories that could be applicable to designed objects.  This question about the range of applicability of the Second Law appears to be a large part of the battle.

Specifically, whenever someone suggests that evolution should be scrutinized in light of the Second Law, the discussion gets shut down with: "Hey, the Second Law only applies to heat/energy, not to information or the construction of functional mechanical systems."  Yet, ironically, some of those same objectors will then offer "the Earth is an open system, receiving heat and energy from the Sun" as an answer to the conundrum, thereby essentially invoking the Second Law to refute something to which they said the Second Law did not apply.

-----

I’m interested in others’ thoughts.

Can the Second Law be appropriately applied to broader categories, to more than just thermal entropy?  Can it be applied to information, to functional mechanical structures?

Is there an incoherence in saying the Second Law does not apply to OOL or evolution, but in the same breath invoking the “Earth is an open system” refrain?

What did others think of Dr. Sewell’s paper, and are there some avenues here that could be used productively to think about these issues?

Comments
1) I disagree with Dr Dembski. He wrote: "information is usually characterized as the negative logarithm to the base two of a probability (or some logarithmic average of probabilities, often referred to as entropy)." I say that entropy is NOT a probability, or a logarithm of a probability. Probabilities don't have units of Btus per degree. Entropy does.

2) Your statement of the Second Law, "Without a counteracting force, systems tend toward a stable state," is a false statement, on two counts. First, systems in metastable states will remain unchanged indefinitely; thus they do not "tend toward a stable state." Second, as shown by the tank of air, a system will eventually tend toward an unstable state, if you wait long enough.

3) On your objections to my tank of air: (i) Practically. So what? A Scientific Law is a statement of an observed regularity in the physical Created world. A statement of a Scientific Law should always be true. The example shows that other statements of the second law are false. (ii) Logically. The fact is that those unstable states will eventually show up. If a fact is "directly contradictory to what the Second Law describes," it is because the statement of the second law is false. The very low probability of the tank of air example occurring in the life of the universe is well covered by my definition of a Stable State: "A system is in a 'Stable State' if it is HOPELESSLY IMPROBABLE that it can do a measurable amount of work without a finite and permanent change in its environment."

chris haynes
April 2, 2014 at 11:01 AM PDT

niwraD, Thank you but AVS cannot offend me because he is beneath me. ;) He is just upset because I keep correcting him and exposing him as a poseur. His only recourse is to try to attack me personally but it ain't working out so well.

Joe
April 2, 2014 at 10:50 AM PDT

Aww, nimrod, that's so nice of you. It's reasonable when it's the truth, sorry.

AVS
April 2, 2014 at 10:45 AM PDT

AVS #71 Please, don't offend my friend Joe, thank you. To accuse anyone of us of lacking "a basic level of scientific knowledge" in no reasonable way helps your evolutionist cause.

niwrad
April 2, 2014 at 10:36 AM PDT

Does Mercury receive an influx of energy from the Sun? Yes.

Joe
April 2, 2014 at 10:35 AM PDT

AVS, no one takes you seriously, and seeing that you are a proponent of evolutionism you don't know anything about science.

Joe
April 2, 2014 at 10:33 AM PDT

Joe, I can no longer take you seriously. You have lost all respect, and I have lost all hope for you ever exhibiting a basic level of scientific knowledge. Bye.

AVS
April 2, 2014 at 10:05 AM PDT

So EA, in your opinion, which was the most damaging refutation of my thought experiment?

AVS
April 2, 2014 at 10:02 AM PDT

Mercury...

Joe
April 2, 2014 at 09:47 AM PDT

AVS:
What I had come up with was that the sun was a huge import source, driving an increase in order on Earth.
How is that coming along on Mercury?

Joe
April 2, 2014 at 09:29 AM PDT

AVS @38:
It’s funny, I actually made that comment after reading his paper. . . . My idea, I think relates to his statement that “thermal order can increase, but no faster than it is imported.” What I had come up with was that the sun was a huge import source, driving an increase in order on Earth. No one really came up with an argument against it to my knowledge. . . . It was a simple thought experiment, but apparently none of you wanted to think. Why am I not surprised?
It is good to see that you are backpedaling a bit and acknowledging that your assertion was "a simple thought experiment." But you might do well to fully dismount. Your idea is not new, and if you had actually read Sewell's paper carefully you would know that he spent a significant amount of time dealing with it, so your assertion that no one really came up with an argument against it doesn't wash.

Furthermore, more than one person, on this very thread, has shown why the idea of the Sun "driving an increase in order on Earth" is flawed, both in terms of actual thermodynamics and in terms of your particular statement regarding the "inevitable" origin and evolution of life on Earth. You need to be a bit more careful about saying things like "none of you wanted to think," particularly when it is evident there are many people here who have thought about this particular issue a lot longer and much more in depth than you have.

I apologize for the somewhat harsh nature of my assessment above. We welcome you to UD and hope that you will participate with a sincere desire to learn about ID and to engage the issues. To the extent you are willing to do that, we are grateful for your participation and, I trust, all can learn something from your contribution as well.

Eric Anderson
April 2, 2014 at 09:16 AM PDT

Entropy is a property of a system that is equal to the difference between i) the system’s energy, and ii) the maximum amount of work the system can produce in attaining a Stable State that is in equilibrium with an indefinitely large tank of ice water.
You stated the Clausius view. The Boltzmann view is conceptually more accurate (not necessarily practical) since it merely counts the ways the particles can be configured in terms of position and momentum or energy. The genius step was connecting the Clausius view with the Boltzmann view:

Clausius: S = Integral(dS), with dS = dQ/T
Boltzmann: S = k log W
Genius connection: S = Integral(dS) = k log W

The reason S = k log W has significance is that if we let k = 1 and use the symbol I instead of S, it looks like Shannon:

I = log W, where W is the number of equiprobable microstates

or equivalently

I = -log(P), where P = 1/W

which looks like Dembski's claim, I = -log2(P).

Example: the information of 3 fair coins. There are 2^3 = 8 microstates, so W = 8 and P = 1/W = 1/8:

I = log2(W) = log2(8) = 3 bits, or I = -log2(1/W) = -log2(1/8) = 3 bits.

Bill said as much here:
In the information-theory literature, information is usually characterized as the negative logarithm to the base two of a probability (or some logarithmic average of probabilities, often referred to as entropy). This has the effect of transforming probabilities into bits and of allowing them to be added (like money) rather than multiplied (like probabilities). Thus, a probability of one-eighths, which corresponds to tossing three heads in a row with a fair coin, corresponds to three bits, which is the negative logarithm to the base two of one-eighths. (Source: http://www.evolutionnews.org/2012/08/conservation_of063671.html)
More bits = more entropy. Since an increase in complexity requires an increase in bits, this also means entropy must INCREASE for complexity to increase. You have to INCREASE Shannon entropy, and you have to INCREASE thermal entropy, to increase complexity as a general rule. Most creationists and IDists have the idea exactly backward. A living human has 100 trillion times more Shannon and thermal entropy than a dead cell. The calculations bear this out.

I suppose one can define other kinds of entropy where the relationship is reversed from the traditional ones I've outlined, and I suppose one would have to expand the 2nd law to deal with those sorts of entropies. X-entropies were proposed by Dr. Sewell. But maybe it is helpful to see the traditional entropies first. I provided them:

1. Clausius
2. Boltzmann
3. Shannon
4. Dembski

scordova
April 2, 2014 at 09:06 AM PDT

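The three-coin arithmetic, and the Boltzmann connection scordova describes, are easy to check numerically. Here is a minimal Python sketch; the only value supplied from outside the comment above is the standard SI figure for Boltzmann's constant:

```python
import math

W = 2 ** 3                # 8 equiprobable microstates for 3 fair coins
P = 1 / W                 # probability of any one microstate: 1/8

I_from_W = math.log2(W)   # I = log2(W)
I_from_P = -math.log2(P)  # I = -log2(P); the same number by construction
print(I_from_W, I_from_P) # 3.0 3.0  (bits)

# Boltzmann form for the same state count, S = k log W:
k_B = 1.380649e-23        # Boltzmann constant, J/K
print(k_B * math.log(W))  # ~2.87e-23 J/K
```
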
scordova @39:
Order and disorder were introduced as “definitions” of entropy by a passing and erroneous remark by Boltzmann, the error has been restated in chem and physics text ever since, but appears nowhere in the actual calculations.
Sal, instead of "order" and "disorder", which I agree is a terrible way to think about the Second Law (or ID or designed systems, for that matter), I'm wondering if it is possible that what he was trying to describe is really better understood as "uniformity" or "non-uniformity"?

Eric Anderson
April 2, 2014 at 08:56 AM PDT

chris haynes @61 responding to Dr Moose:
If you wait a bozillion years, the molecules in the tank will assume every possible state, one of which has all the fast ones at one end and the slow ones at the other end. This is much like a turning drum of US and Canadian pennies. Sooner or later the US ones will be at one end, and the Canadian ones at the other end. I'm sorry I can't explain it better.
I'm not sure this is any better than the multiverse idea. :) What you are essentially saying is that if we have enough trials then eventually every possible state will have existed at some point, so we shouldn't be surprised at this very unusual state of affairs. This is problematic for a couple of reasons:

(i) Practically. We don't have a bazillion years for anything. We're interested in what can be done, here and now, within reasonable timeframes.

(ii) Logically. Your molecules example would seem to undercut the whole point of the Second Law. What you have essentially said is that the Second Law will force a system to a particular state . . . unless we wait long enough, in which case all states will eventually show up -- presumably even those that are unstable. That seems directly contradictory to what the Second Law describes. It is essentially a reliance on the chance occurrence of "every possible state" showing up eventually.

Eric Anderson
April 2, 2014 at 08:52 AM PDT

chris haynes: Thanks for your thoughtful comments. I still need to go through everything you've written to make sure I understand it fully. Just one point of clarification:
A complete statement of the second law is “Stable States exist”.
I'm not sure this is a helpful statement of the Second Law. Do stable states exist? Sure they do. But so do unstable states. The question is for how long, and in which direction things move.

We don't need a Second Law to know that stable states exist. We can understand that from simple observation. What a law should do is help us understand why we observe what we do, or how certain forces will interact, or what predictions we can make given initial conditions. So the Second Law is not simply a restatement of the basic observation that stable states exist. Rather it needs to be explaining or predicting or understanding directional movement.

If we were to reword it slightly, then it would be more substantive. For example, we might say: "Without a counteracting force, systems tend toward a stable state." Wording along those lines would be more meaningful and would be, I believe, more in line with what the Second Law is really trying to communicate.

Eric Anderson
April 2, 2014 at 08:41 AM PDT

Please let me correct an error in my definition of entropy: Entropy is a property of a system that is equal to the difference between i) the system's energy, and ii) the maximum amount of work the system can produce in attaining a Stable State that is in equilibrium with an indefinitely large tank of ice water.

chris haynes
April 2, 2014 at 06:15 AM PDT

Again, I thank you for your responses. Please forgive my not answering all of them.

Mr Mung asked the key question of this whole thread: "What, precisely, differentiates it (the second law) from information theory? The units of measurement?" My answer: I don't know. My gut feeling, reinforced by the units of measurement issue, is that they're connected only by a loose analogy, nothing more.

For those who disagree, I need to say I get nothing from your long essays, because I don't believe you understand the terms you throw around. I urge you to do three things that Dr Sewell hasn't done:
1) State the second law in terms that one can understand.
2) Define entropy precisely.
3) Explain what Btus and degrees of temperature have to do with information.

Based on my present understanding, here is what I would give:
1) Stable States Exist. A system is in a "Stable State" if it is hopelessly improbable that it can do measurable work without a finite and permanent change in its environment.
2) Entropy is a property of a system that is equal to the difference between i) the system's energy, and ii) the maximum amount of work the system can produce in coming to equilibrium with an indefinitely large tank of ice water.
3) Nothing.

Finally, Dr Moose asked about the tank of air. If you wait a bozillion years, the molecules in the tank will assume every possible state, one of which has all the fast ones at one end and the slow ones at the other end. This is much like a turning drum of US and Canadian pennies. Sooner or later the US ones will be at one end, and the Canadian ones at the other end. I'm sorry I can't explain it better.

chris haynes
April 2, 2014 at 06:11 AM PDT

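The "sooner or later" in the penny-drum analogy can be quantified. A sketch, assuming 1,000 pennies of each country in a row of 2,000 slots, with every arrangement equally likely after a good tumble (these specific numbers are illustrative, not from the comment):

```python
import math

# 1,000 US + 1,000 Canadian pennies in a row of 2,000 slots.
# Fully segregated patterns: 2 (US all left, or US all right) out of
# C(2000, 1000) equally likely type-patterns.
n = 1000
arrangements = math.comb(2 * n, n)

# Work in logs: the probability underflows an ordinary float.
log10_p = math.log10(2) - math.log10(arrangements)
print(f"log10(p) = {log10_p:.1f}")   # about -600.0
```

A chance of roughly 1 in 10^600 per tumble is the sense in which "sooner or later" never arrives on any physically available timescale.
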
PS: Just to get an idea, we may look at the number of arrangements for 10^6 items, imagined as a string, 1,000 each of 1,000 types:

[10^6]! / [1,000!]^1000 possible arrangements

Using accessible factorial values, that is [8.2639316883×10^5,565,708] / [4.0238726008×10^2,567]^1000. Take logs, to 4 sig figs:

log[x] ~ [5,565,708 + 0.9172] - [2,567,604 + 0.65] ~ 2,998,104 + 0.27

x ~ 1.9 * 10^2,998,104

If all 10^80 atoms were each furnished with a string of this type, and were to be blindly searching a new config every 10^-14 s for 10^17 s, they would scan 10^111 possibilities; moving up to working at Planck time rates for 10^25 s, we see 10^150. There is just too much haystack and too little resource to carry out a thorough enough search to find the needle in it. And that's just for the string of elements. 10^18! is in "computer reduced to slag" territory.

kairosfocus
April 2, 2014 at 03:37 AM PDT

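Those log values can be checked with the standard library's log-gamma function, since the factorials themselves are far too large to evaluate directly. A minimal sketch, assuming nothing beyond Python's math module:

```python
import math

def log10_factorial(n: int) -> float:
    # log10(n!) via the log-gamma function: ln(n!) = lgamma(n + 1)
    return math.lgamma(n + 1) / math.log(10)

# Arrangements of 10^6 items, 1,000 each of 1,000 types:
# [10^6]! / [1000!]^1000
log10_count = log10_factorial(10**6) - 1000 * log10_factorial(1000)

exponent = math.floor(log10_count)
mantissa = 10 ** (log10_count - exponent)
print(f"{mantissa:.2f} x 10^{exponent}")   # 1.85 x 10^2998104
```
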
CH (et al): One of the fallacies of probability reasoning is the confusion of a logical possibility with a credibly observable probability on the gamut of possible observations. This issue is often seen in the notion that someone "must" win a lottery. Actually, no, unless it is very carefully calibrated to be winnable. That is, there must be a sufficiently high likelihood of a winning ticket being bought, or else people will see that no one is winning and it will collapse.

In the case of thermodynamic systems, it is, for instance, logically possible for all the O2 molecules in the room in which you are sitting to go to one end, leaving you gasping. However, as there is no good reason to see a strong bias in favour of such a config, the utterly overwhelming bulk of more or less evenly mixed states means that even in a case where a room had no oxygen in it to begin with, if a tank of the gas were released in one corner, in a fairly short while it would spread throughout the room. In fact, it is overwhelmingly unlikely that we would ever observe the reverse, a bottle spontaneously filling up with pure O2, even once on the gamut of the observed cosmos, of credibly 10^80 atoms and perhaps c. 10^17 s duration, with chemical events taking maybe 10^-14 s.

What statistical thermodynamics informs us, in no uncertain terms, is that thermodynamic systems of any appreciable size will dampen out the relative effect of fluctuations, leading to a strong tendency of microstates to migrate to the bulk clusters of macrostates. It is in this context that we can see a connexion between entropy and information, as I linked on previously at 25 above. Namely, it can be seriously developed that entropy is an informational metric of the average missing info [in bits, or a measure convertible into bits under relevant conditions] about the particular microstate of a body of particles, in a circumstance wherein the known state is given as a macrostate. This ties to the Gibbs and Boltzmann formulations of entropy at statistical levels.

This leads to the point that if we were to decant the parts for a micro-scale UAV into a vat of 1 cu m of fluid -- say, 10^6 particles small enough to undergo diffusion, let's say micron size -- we would face 10^6 particles in a field of 10^18 one-micron cells. It is easy to see that random molecular interactions (of the sort responsible for Brownian motion) would overwhelmingly tend to disperse and mix the decanted components. The spontaneous clumping of these particles would be overwhelmingly unlikely: an unweaving of diffusion -- just do the ink-drop-in-a-glass-of-water exercise to see why. Beyond clumping, the spontaneous arrangement of the clumped particles into a flyable config would be further maximally unlikely, as the number of non-functional clumped configs vastly outnumbers the functional ones. Indeed, this can be seen as a second unweaving of diffusive forces. Here we also see the significance of Sewell's deduction that entropy discussions are closely connected to diffusion. And we can see that if we were to agitate the vat, it would not help our case. That is Sewell's point: if a config is overwhelmingly unlikely to be observed in an isolated or closed system, it is not going to be suddenly materially more likely if we simply open up the system to blind forces.

But, if we were to put into our vat a cooperative army of clumping nanobots and assembly bots working under a program, assembly of the UAV from components would be far more explicable. For work is forced ordered motion, so if we have organising instructions and implementing devices, we can search out, categorise, collect, then cluster and assemble the parts according to a plan. Such work of complex, functionally specific organisation is commonplace all around us, and it normally rests on organised effort working according to a plan based on an information-rich design, with required energy flows achieving shaft work while exhausting waste energy that contributes to the overall rise in entropy of the observed cosmos.

All of this is very closely connected to the too often wilfully obfuscated challenge of spontaneously getting to the origin of cell-based life in Darwin's warm little pond or the like. Clumping and assembling work have to be accounted for, along with the associated information, including the coded algorithmic info used in DNA, RNA and proteins. The compensation-argument fallacies and dismissiveness toward the likes of a Hoyle et al. do not even begin to scratch the surface of that challenge. But what is quite evident is confirmation bias of the most blatant sort, leading to clutching at straws and belief in materialistic magic that in imagination manufactures complex functionally specific organisation out of diffusive and known disorganising forces -- failing, spectacularly, the vera causa test that to explain unobserved OOL, we should first show that the forces and factors used are capable of the effect, on empirical observation here and now. Otherwise, all is materialist fantasy, driven by ideological a prioris. KF

kairosfocus
April 2, 2014 at 02:20 AM PDT

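The diffusion point (the first "unweaving") can be illustrated with a toy model. The sketch below assumes a 1-D random walk in a 100-site box rather than the 3-D vat described above, but the qualitative behaviour is the same: blind motion disperses an initially clumped population, and it never re-clumps on any observable timescale:

```python
import random

# Toy 1-D diffusion: all particles start clumped at position 0 in a
# box of 100 sites; each step, every particle hops +/-1 at random
# (reflecting walls at the box ends).
N_PARTICLES = 1000
N_STEPS = 5000
BOX = 100

positions = [0] * N_PARTICLES

def fraction_of_box_occupied(pos):
    return len(set(pos)) / BOX

for step in range(N_STEPS + 1):
    if step % 1000 == 0:
        print(f"step {step:5d}: {fraction_of_box_occupied(positions):.2f} of box occupied")
    for i in range(N_PARTICLES):
        positions[i] = min(BOX - 1, max(0, positions[i] + random.choice((-1, 1))))
```

The occupied fraction climbs from 0.01 toward 1.0 and stays there; runs in which the walkers spontaneously re-gather at one site simply never show up.
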
Greetings. Mung at 53: Maybe it's good you say "tend." From CMI:
No, I would not say that entropy/Second Law of Thermodynamics began at the Fall. The Second Law is responsible for a number of good things which involve increases in entropy, so are ‘decay’ processes in the thermodynamic sense but maybe not what most people would imagine are decay
http://creation.com/the-second-law-of-thermodynamics-answers-to-critics

seventrees
April 2, 2014 at 02:14 AM PDT

fossil @36:

The compensation argument may be of some relevance in a purely thermal sense when we are looking at systems that are close in space and time, are directly linked in some way, and are affected by each other. But even in that case it isn't so much an "explanation" as an observation. You don't get a decrease in entropy in a system simply by the injection of thermal energy, unless we fall back to the barest concept of entropy being nothing more than a measure of the total quantity of thermal energy in a system.

The compensation argument in regards to OOL and evolution is nonsensical because (i) OOL and evolution are not primarily thermal problems, (ii) even to the extent that energy is needed for OOL and evolution, simply pouring energy into the system isn't helpful; there needs to be a directing process to channel the energy in useful ways, and (iii) no-one doubts that there is plenty of energy available, whether it be lightning strikes, volcanic vents, the Sun, deep sea vents, or otherwise; energy (at least in terms of raw quantity) has never been the issue.

Furthermore, if we go beyond the purely thermal considerations and think of the Second Law a bit more broadly, as a number of scholars (in addition to Sewell) have done, then the compensation argument is simply irrational. When considering functional machines, for instance, can we possibly think that if I build a machine in my garage then -- as a direct consequence of my activity -- somewhere, in some undefined location in the universe, entropy increases? Sure, one could say that. But it is just nonsense. There is no possible mechanism for the entropy somewhere else in the universe to respond to what I am doing here in my garage.

And on the other side of the coin, if I later accidentally back over my machine with the car and destroy it, or accidentally set the house afire, can we with a straight face argue that somewhere else in the universe another machine must have been built or another house must have arisen from the ashes to "compensate" for what happened here and now in this locale? Or if I have a sentence made out of Scrabble letters and then mix the letters up, has a counteracting decrease in informational entropy happened elsewhere in the universe to compensate for it? The whole idea of compensation in these areas is wild fantasy and magic of the strangest sort.

Eric Anderson
April 1, 2014 at 10:44 PM PDT

DebianFanatic observed:
As I understand entropy / the second law, simply having an open system is only one third of what’s needed to overcome it. The input of energy onto my pile of scrap will only increase the entropy of that system, and will never, by itself, decrease it.
Nicely put. Still, if you blow up the pile of scrap, one piece might land on the roof. Since the pieces do fit together, Darwinian evolutionists extrapolate this stunning result, pinning their hopes on having enough scrap piles and explosions, so that eventually the result will be a fully functional lawn mower!

Actually, that's not right. Eventually you'll get an edger for the lawn as an intermediate result. Additional explosions on subsequent versions will make innovations and improvements. Finally, you'll get a tractor with a stereo and an Internet connection... all due to "the blind lawn equipment manufacturer"! To see this method in action, you just have to be willing to wait a billion years, which of course no one can do to verify their assertion. However, they did witness a gas explosion that sent a lawnmower up into a tree -- a triumph of their theory! ;-)

-Q

Querius
April 1, 2014 at 10:12 PM PDT

Chris Haynes, "It is a matter of simple Kinetic Theory that sooner or later a gas will self separate into hot and cold regions." I don't follow. It would appear to me that in the absence of gravity, the hot gas will share its heat with the cold, and all of the gas will become the same temperature. That certainly seems to be what happens when I drop a cup of hot water into a pail of cold.Moose Dr
April 1, 2014
April
04
Apr
1
01
2014
09:27 PM
9
09
27
PM
PDT
I'm not a math-whiz, or an engineer, or particularly smart; most of this conversation is above me. But something I know: a pile of scrap metal in my backyard, exposed to an input of energy (whether sunshine, or wind, or rain, or gasoline and a lit match, etc.), will never self-assemble into a lawn mower. To get a lawn mower out of that scrap, I need a minimum of three things:

1) a machine capable of turning that scrap into a mower
2) an input of energy (of the right type -- gasoline might work, as long as it's put into the tank of the mower-making machine and not poured onto the outside of the machine; solar might work; even coal might work; depends on the machine's requirements)
3) a program to control that machine (sometimes the program is "built into" the machine, in the form of certain-sized gears, and timing mechanisms, and piston-rod lengths, etc.)

As I understand entropy / the second law, simply having an open system is only one third of what's needed to overcome it. The input of energy onto my pile of scrap will only increase the entropy of that system, and will never, by itself, decrease it.

If I understand seventrees in 14, he said the same thing, saying: "In thermal entropy sense, a basic is that work needs to be done for heat transfer in the negative gradient. How is the work provided? This is the case for machinery. Another issue in machinery is how is the energy directed for it to perform its task?"

In other words, you need 1) energy, 2) a machine to convert that energy into useful work, and 3) a program that controls that machine. That's just common sense to me.

DebianFanatic
April 1, 2014 at 08:31 PM PDT

Creationists tend to associate the second law with the fall, which is just absurd. It's as if the second law did not exist prior to the fall but after the fall the second law was in control of everything.

Mung
April 1, 2014 at 07:49 PM PDT

Yes, the probability is very low. But as you noted, it's not zero. So any statement of the second law, such as those by Clausius, Planck, or Hawking, that says it's zero, is false. Myself, whatever Dr Hawking may think, I don't think it's okay to make a false statement of a Scientific Law. It isn't hard to state the second law correctly. Even a Creationist like myself did. So why sell yourself short?

chris haynes
April 1, 2014 at 07:27 PM PDT

chris haynes:
I’m happy you like my posts, indeed I’m flattered, but I really think they’re kind of obvious.
If you have an axe to grind you hide it well. :) But no, they are not obvious, which is why I attempted to draw you out on some of the things you raised. In particular this statement of yours:
The second law is not about absolutes. It is about probabilities.
That is NOT obvious. Even you seem to know it's not obvious. But my question to you is, if the second law is indeed about probabilities, as you assert, then what, precisely, differentiates it from information theory? The units of measurement?

Mung
April 1, 2014 at 07:23 PM PDT

Chris, I once read how far a typical (ideal) gas molecule travels at STP before colliding with another one. Not very far. So now you require trillions of collisions, all of which are segregated between higher-energy and lower-energy molecules (or a single collision each, at angles such that no other collisions occur before your gas molecules are segregated). Next, take the probability of success of one such molecule to the power of 6.02 x 10^23 to get the probability of the event. This probability is so small that even given the age of the universe times a trillion, it would be a miracle on the order of Moses parting the Red Sea or Jesus turning water into wine. Are you sure you want to incorporate this into your ideas? ;-)

-Q

Querius
April 1, 2014 at 07:06 PM PDT

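To put a rough number on that: assume, for illustration only, that each of a mole's worth of molecules independently has a 1/2 chance of being on its assigned side (the segregation itself is far harder than this, so the real figure is even smaller). Then:

```python
import math

N_A = 6.02214076e23              # Avogadro's number
log10_p = N_A * math.log10(0.5)  # log10 of (1/2)^N_A

print(f"p = 10^({log10_p:.3e})") # p = 10^(-1.813e+23)
```

The exponent itself is about -1.8 x 10^23, far beyond anything that "the age of the universe times a trillion" of trials could touch.
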
One last nitpick. The classic definition of entropy, dS = dQ/T, is a tautology: temperature cannot be defined without first defining either heat or entropy. To see this, consider systems with temperatures below absolute zero. An example is an electron spin system with slightly more spins at high energy than at low. The temperature will be on the order of -1,000,000 degrees Kelvin. How do you define temperature, without using either heat or entropy, so that you get 1 million degrees below absolute zero?

chris haynes
April 1, 2014 at 07:01 PM PDT

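The spin-system example can be made concrete. The sketch below assumes N independent two-level spins with energy gap eps (an illustrative value, not from the comment) and the statistical definition 1/T = dS/dE; with a population inversion (more than half the spins excited), the derived temperature comes out negative:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.0e-22        # energy gap per spin, J (illustrative value)

def spin_temperature(p: float) -> float:
    # Per spin: S(p) = -k_B * [p ln p + (1 - p) ln(1 - p)],  E(p) = p * eps,
    # so 1/T = dS/dE = (k_B / eps) * ln((1 - p) / p)
    return eps / (k_B * math.log((1 - p) / p))

for p in (0.25, 0.45, 0.55, 0.75):
    print(f"excited fraction {p:.2f}: T = {spin_temperature(p):8.1f} K")
# p < 0.5 -> positive T; p > 0.5 (population inversion) -> negative T
```

Note that the negative value falls directly out of dS/dE turning negative past the inversion point, which is the sense in which temperature here is derived from entropy rather than the other way around.
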
I'm happy you like my posts, indeed I'm flattered, but I really think they're kind of obvious. As far as information and thermal entropy being more than analogies, I don't see it. I just don't understand what the units of thermal entropy (Btus and degrees Fahrenheit) have to do with information. I remember Dr Sewell discussing this issue, but to me at least, what he said was so much hocus pocus. Perhaps you could do better.

chris haynes
April 1, 2014 at 06:37 PM PDT

