Uncommon Descent Serving The Intelligent Design Community

“Specified Complexity” and the second law


A mathematics graduate student in Colombia has noticed the similarity between my second law arguments (“the underlying principle behind the second law is that natural forces do not do macroscopically describable things which are extremely improbable from the microscopic point of view”) and Bill Dembski’s argument (in his classic work “The Design Inference”) that only intelligence can account for things that are “specified” (= macroscopically describable) and “complex” (= extremely improbable). Daniel Andres’ article can be found (in Spanish) here. If you read the footnote in my article A Second Look at the Second Law, you will notice that some of the counter-arguments addressed there are very similar to those used against Dembski’s “specified complexity.”

Every time I write on the topic of the second law of thermodynamics, the comments I see are so discouraging that I fully understand Phil Johnson’s frustration when he wrote me: “I long ago gave up the hope of ever getting scientists to talk rationally about the 2nd law instead of their giving the clichéd emotional and knee-jerk responses. I skip the words ‘2nd law’ and go straight to ‘information’.” People have found so many ways to corrupt the meaning of this law, to divert attention from the fundamental question of probability–primarily through the arguments that “anything can happen in an open system” (easily demolished, in my article) and “the second law only applies to energy” (though it is applied much more generally in most physics textbooks). But the fact is, the rearrangement of atoms into human brains and computers and the Internet does not violate any recognized law of science except the second law, so how can we discuss evolution without mentioning the one scientific law that applies?
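To make the probabilistic reading of the law concrete, Boltzmann's relation ties the entropy of a macrostate to the number of microstates consistent with it (a standard back-of-the-envelope formulation, added here for reference):

    S = k_B \ln W
    \frac{P_1}{P_2} = \frac{W_1}{W_2} = \exp\!\left(\frac{S_1 - S_2}{k_B}\right)

Because k_B is so small (about 1.38 x 10^-23 J/K), a macrostate whose entropy is lower by even a microscopic fraction of a joule per kelvin is suppressed by an exponential factor far beyond the probabilistic resources of the observed universe; that is the sense in which "extremely improbable" and "entropy-lowering" coincide for macroscopically describable outcomes.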

Comments
Joe, if a planet is turning, then a region of the surface is sometimes in day and sometimes in night. At night, that entire region will be cooling down, a process it achieves by "coupling" with space, so that it can "raw export" that energy. This all results in a lowering of entropy across the physical landscape. Of course, you could take the position that all planets were set up to do that by God, but then you might as well claim that evolution was set up to do that by God, or abiogenesis if you really want to go down that route.
The Pixie
April 5, 2007, 12:43 PM PDT
the Pixie, I'm not sure what your point is. Would a planet that is not turning also have huge regions cooling down? Does a star that turns have huge regions cooling down? How about a star that doesn't turn?
the Pixie: Every planet that is turning will have huge regions cooling down, and so decreasing in entropy.
I would say it would depend on the speed of rotation as well as the atmosphere. And then all that would also depend on how the rotation started.
Joseph
April 5, 2007, 12:14 PM PDT
Joe, I am not sure what your point is. Every planet that is turning will have huge regions cooling down, and so decreasing in entropy. This is not something unique to this "privileged planet".
The Pixie
April 5, 2007, 09:49 AM PDT
the Pixie: Non-intelligent systems repeatedly, but predictably create large regions of low entropy; as the planet turns whole continents cool down, losing entropy.
That just begs the question, as "The Privileged Planet" makes it clear that the planet and solar system were intelligently designed. IOW, you start off talking about "non-intelligent systems" when there isn't any data that would demonstrate this planet is such a system, and yet you appear to be using it as an example to support your claim.
Joseph
April 5, 2007, 09:00 AM PDT
kairosfocus, I am having problems with the filter, so have had to break my post up into smaller and smaller bits as I get to the problem.
All that happens in TMLO at that point [NB: I am looking at my paper copy, with my annotation “where dG/T LTE 0 for spontaneous changes”] is they point out that there is a difference between a random and a specified configuration of the molecules they have in mind:
What they are doing is wrong, but I do now realise that the way they are justifying it is the way you say, not the way I said originally. Yes, they say they are splitting deltaS into deltaS[thermal] and deltaS[config]. But deltaS[thermal] is the same as the deltaS that Gibbs used, and everyone since him has used. They slip deltaS[config] into deltaS, and then explicitly bring it out again as they go from 8.4b to 8.5, in effect introducing a whole new term, deltaS[config]. So the issue is that it makes no sense to split the entropy, S = S[thermal] + S[config]. Remember that thermodynamic entropy is a property we can measure in the laboratory. It is a routine analysis, simply involving heating a sample of known weight and measuring the energy input (say with a differential scanning calorimeter), then extrapolating back to absolute zero (where entropy is zero).
The Pixie
April 5, 2007, 06:54 AM PDT
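The calorimetric procedure described above corresponds to the standard third-law integral (stated here in the usual textbook form; the notation is supplied for clarity, not taken from the thread):

    S(T) = \int_0^{T} \frac{C_p(T')}{T'}\, dT'

Heat-capacity data from the calorimeter fix C_p(T'), the third law supplies S(0) = 0 for a perfect crystal, and the integral then yields an absolute entropy at the temperature of interest; this is how the tabulated standard entropies both sides of the thread appeal to are actually produced.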
kairosfocus
Of course, as you later note, the 2nd law is at micro level, founded on thermodynamic probabilities. That is where the basic problem comes in in your arguments on abiogenesis — the probabilities and equilibria likely to result are such that in the gamut of a planet with a prebiotic soup, the relevant macromolecules simply will not form in anything like the concentration or proximity to achieve either RNA world or metabolism first scenarios.
We agree that entropy must increase because it is too improbable for entropy to decrease. What I am objecting to is the reverse claim: that if something is improbable, then that is because entropy must be decreasing. It is improbable that I will win the lottery or get struck by lightning, but that has nothing to do with entropy. Personally I do not know enough about (supposed) prebiotic conditions to be able to estimate the probabilities (though if you have any figures, and can justify them, I would be interested to see them).
You skip over the key difference between a random DNA polymer and a bio-informational one. Both are complex if long enough, but one is functionally specified, the other is not. To move from the random state to the functional one plainly requires work, and shifts the entropy as there is now a quite discernibly different macrostate — just think about the effect of random changes in DNA in living systems.
So we have three DNA sequences of equal length. Sequence X is a simple repeating pattern, sequence Y is random and sequence Z is human DNA. Do they have the same thermodynamic entropy? Does it take the same energy input to synthesise them in the lab, one base at a time (does it make a difference if you specify the random sequence in advance)? If not, which will take more, and why? My position is that each sequence has the same entropy. As each base is added, the entropy of the sequence will change by the same amount, regardless of whether the sequence ultimately becomes X, Y or Z. It is my belief that at absolute zero all three sequences will have zero entropy (that is what the third law tells me, anyway).
6] Spontaneous: By chance + necessity only.
Well, that is the first time I have come across THAT definition of spontaneous. So, as I asked last time, is the combustion of coal spontaneous? Does entropy increase or decrease? Personally, I believe it is spontaneous, as the word is used in thermodynamics (but not under common usage), and entropy does increase, but I will be intrigued to see what your position is. For reference, see the Wiki entry for "spontaneous process" (the filter seems to choke on the link). Please note that it also relates the two terms of the Gibbs equation to entropy in the system and entropy in the surroundings, and not to "chemical work" and "entropy work", as Thaxton would have us believe.
All that happens in TMLO at that point [NB: I am looking at my paper copy, with my annotation “where dG/T LTE 0 for spontaneous changes”] is they point out that there is a difference between a random and a specified configuration of the molecules they have in mind:
What they are doing is wrong, but I do now realise that the way they are justifying it is the way you say, not the way I said originally. Yes, they say they are splitting deltaS into deltaS[thermal] and deltaS[config]. But deltaS[thermal] is the same as the deltaS that Gibbs used, and everyone since him has used. They slip deltaS[config] into deltaS, and then explicitly bring it out again as they go from 8.4b to 8.5, in effect introducing a whole new term, deltaS[config]. So the issue is that it makes no sense to split the entropy, S = S[thermal] + S[config]. Remember that thermodynamic entropy is a property we can measure in the laboratory. It is a routine analysis, simply involving heating a sample of known weight and measuring the energy input (say with a differential scanning calorimeter), then extrapolating back to absolute zero (where entropy is zero). I believe that if S[config] (if we suppose such a quantity) is zero at 0 K, and presuming the configuration does not change during warming to ambient, then S[config] is still zero at 25 °C. Bear in mind that chemists, physicists and engineers have been using the Gibbs equation for about a century to extend the second law to open systems, and in all that time no one has ever found an exception. I think that would be odd if there really were an additional term. Has no one any experimental data of processes going or not going that can only be explained by this new term? Why not?
The Pixie
April 5, 2007, 06:47 AM PDT
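One way to sharpen the X/Y/Z question is to note that the quantity that differs between the sequences is informational, not calorimetric: compressibility cleanly separates an ordered sequence from a random one. A small sketch of that distinction (illustrative code only, not part of the thread; the sequence lengths are arbitrary):

    import random
    import zlib

    # Sequence X: a simple repeating pattern; sequence Y: random bases.
    repeating = ("ACGT" * 250).encode()
    randomseq = "".join(random.choice("ACGT") for _ in range(1000)).encode()

    # Compressed size is a practical stand-in for algorithmic (Kolmogorov) complexity.
    print(len(zlib.compress(repeating)))   # ~ tens of bytes: highly ordered
    print(len(zlib.compress(randomseq)))   # ~ 300 bytes: near-incompressible at 2 bits/base

Compressibility separates X (highly ordered) from Y (incompressible), whereas the calorimetric entropy the comment discusses is indifferent to base order; whether that second, thermodynamic quantity also differs between the sequences is exactly what the two sides above dispute.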
J: 1] Thanks: Welcome. 2] Re Pixie on strawmen: Sadly, just as predicted at the head of the thread. 3] Pixie: re Gibbs free energy & entropy: Cf. TMLO ch. 7 and TBO's discussion in and around eqns 7.5 - 10b. (The next section on far-from-equilibrium systems is also interesting on components of S. Ch. 8, eqns 8.1 - 3c and their context are also interesting on the issue of random vs informational molecules and entropy. Eqns 8.4 - 5 follow in that context.) GEM of TKI
kairosfocus
April 5, 2007, 06:24 AM PDT
Me:
What intelligence certainly can do is repeatedly and unpredictably create regions of indefinitely large amounts of low entropy, of arbitrary character. Blind/dumb/purposeless processes can’t.
The Pixie:
I am not clear what you ...are claiming here. There are no known exceptions to the second law; throwing intelligence into the mix will not change that one bit (supernatural intelligence might, but I assume you are talking about human intelligence).
I agree, and have never claimed otherwise. There are no known exceptions to the 2nd law, including in cases involving intelligent agency. What intelligent (including human) agency can do, however, is generate results that are different from anything produced by blind/dumb/purposeless processes. If one holds that all natural processes are blind/dumb/purposeless, then this implies that intelligence is "supernatural." __________ The Pixie:
Non-intelligent systems repeatedly, but predictably create large regions of low entropy; as the planet turns whole continents cool down, losing entropy.
I agree, and have never claimed otherwise. There are tons of self-ordering phenomena in nature. You left out a few adjectives that make all the difference: "unpredictably," "indefinitely," "of arbitrary character." You are slaying strawmen. __________ kairosfocus:
As touching the more basic point, intelligence does not violate entropy, it intentionally creates low entropy systems.
Thanks.
j
April 5, 2007, 04:58 AM PDT
Patrick: Insomnia patrol. Thanks, appreciated. OFF-TOPIC: Congratulations. The advice I got in prep for mine was to remember this is my bride's big day, so once she has said yes, I need to say yes until the parson pronounces us man and wife. Subsequent to that, 17+ years ago now, I have found that it is often wise to reserve no for emergency use only! (But then I am a very blessed and very happy man. Not least, I get the chance to improve my image by being seen in good company every day!) Have a great wedding day, and may God grant you a blessed marriage. GEM of TKI PS: Pixie -- I forgot to mention, TMLO was favourably reviewed by Prof. Robert Shapiro, famous OOL researcher and chemist. [His recent Sci Am article will well repay a read.]
kairosfocus
April 4, 2007, 11:26 PM PDT
kairosfocus, Sorry about the delay... I've been busy preparing for my wedding, so I don't have the time to clear the moderation queue as often.
Patrick
April 4, 2007, 07:48 PM PDT
H'mm: I have put up a response on points, but it is in the filter. I hope that is not because I took time to discuss 8.4 - 5 using the equations. [Maybe there was a point to Pixie's spelling out . . .] Since that is the most serious claim, I note that in the step in question dH is constant in the equations; all TBO have done is to split up the increment in entropy into thermal and configurational parts. The chemistry per se does not distinguish a random from a biofunctional polymer, and the cell uses an algorithmic system to create the latter; but as S is a state function, such a split is equivalent, and analytically helpful. It will help to realise that Thaxton is a PhD chemist and Bradley a PhD polymer scientist specialising in fracture mechanics, which is of course riddled with thermodynamic considerations. GEM of TKI
kairosfocus
April 4, 2007, 06:19 PM PDT
H'mm: The thread is fairly active. I will note on several points:
1] J: re "Entropy accountancy": I stand corrected on terminology, having misread your term. As touching the more basic point, intelligence does not violate entropy, it intentionally creates low entropy systems. (In effect, we are dealing with specialised energy converters under algorithmic or direct intelligent control.) And, examples are quite commonly encountered. E.g. building a house to a plan, or a flyable jumbo jet -- compare the odds of such happening [spontaneously!] by a tornado passing through a junkyard.
2] Pixie: "I am going to ignore the probabilistic arguments that are not based on the second law . . ." Of course, as you later note, the 2nd law is at micro level, founded on thermodynamic probabilities. That is where the basic problem comes in in your arguments on abiogenesis -- the probabilities and equilibria likely to result are such that in the gamut of a planet with a prebiotic soup, the relevant macromolecules simply will not form in anything like the concentration or proximity to achieve either RNA world or metabolism first scenarios.
3] On "coupling": Obfuscation of what is plain enough. Have a look at the isolated system with subsystems again, and observe why B increases its entropy on receiving an increment of heat. As you know, systems that use heat are limited by Carnot, but those that couple energy can exceed that limit.
4] Rock cooling down: Distractor; the relevant bodies -- as discussed -- are energy receivers or converters.
5] DNA chains: You skip over the key difference between a random DNA polymer and a bio-informational one. Both are complex if long enough, but one is functionally specified, the other is not. To move from the random state to the functional one plainly requires work, and shifts the entropy as there is now a quite discernibly different macrostate -- just think about the effect of random changes in DNA in living systems. And how you get to that state starting with prebiotic conditions and through chance + necessity only is precisely the heart of my point, Dr Sewell's point and, for that matter, TBO's.
6] Spontaneous: By chance + necessity only. A refrigerator forces export of heat from colder body to hotter ones, but that is not at all a spontaneous process or system.
7] "Error" in TBO Ch 8, eqns 8.4b to 8.5: What are you talking about? All that happens in TMLO at that point [NB: I am looking at my paper copy, with my annotation "where dG/T LTE 0 for spontaneous changes"] is they point out that there is a difference between a random and a specified configuration of the molecules they have in mind:
8.4b: dG = dH - T dS
8.5: dG = dH - T dS_th - T dS_config
In short, the step 8.4b --> 8.5 simply splits up T dS:
T dS = T (dS_th + dS_config)
That is what I pointed to verbally in my point 5 just above. It also makes sense once we can macroscopically distinguish between the random and specified polymer, which we can. One is biofunctional, the other is [by overwhelming probability] not, and no prizes for spotting which is which. More directly, they do NOTHING to dH, the enthalpy [onlookers: roughly, heat content], and that makes sense there.
Going back to 8.4a: dG = dE + P dV - T dS. So, of course, they substitute dH = dE + P dV. Since we are not looking at significant pressure-volume work, and since the bonding energy in the relevant chains is independent of the order of DNA monomers, and since that more or less holds for proteins at the point of chaining [as opposed to folding!], dH is not changing between 8.4 and 8.5. There is no error, grave or minor, in the step. GEM of TKI
kairosfocus
April 4, 2007, 06:13 PM PDT
DaveScot
Thermodynamic principles are applied to far more than just energy. Keep in mind matter and energy are the same thing according to E=mc^2. In general 2LoT applies to gradients of all kinds. Gradients are areas of lowered entropy and 2LoT states that entropy tends to increase in ordered systems (order decreases). It also applies to information gradients.
Sorry, but I do not see how your Wiki reference supports your claim. It seems to just be saying that thermodynamics is important in a wide range of fields. From the start of the entry: Thermodynamics (from the Greek θερμη, therme, meaning "heat" and δυναμις, dunamis, meaning "power") is a branch of physics that studies the effects of changes in temperature, pressure, and volume on physical systems at the macroscopic scale by analyzing the collective motion of their particles using statistics. Roughly, heat means "energy in transit" and dynamics relates to "movement"; thus, in essence thermodynamics studies the movement of energy and how energy instills movement.
kairosfocus
You mention and link to chapter eight of The Mystery of Life's Origin (TBO). There is a serious flaw in their argument as they go from equation 8-4b to equation 8-5. Equation 8-4b is the Gibbs equation:
deltaG = deltaH - T deltaS
A funny thing happens to the Gibbs equation if you divide through by -T:
-deltaG/T = -deltaH/T + deltaS
deltaS is, of course, the entropy change in the system. -deltaH/T is the entropy change in the surroundings (from dS = dQ/T, when T is constant). So -deltaG/T is the total entropy change. The second law says that must be positive, so deltaG must therefore be negative. That, in reverse, is the derivation of the Gibbs function (see here). Thaxton et al have missed that, and it is apparent that they believe deltaH is the "chemical work" (rather than a measure of the entropy change in the surroundings) and T deltaS is the "thermal entropy work" (rather than a measure of the entropy change in the system). Gibbs actually had it all covered. He had accounted for the entropy change in the system, and for the entropy change outside the system. It makes no sense to introduce any new terms; what other entropy can there possibly be? Nevertheless, in equation 8-5, Thaxton et al add "configurational entropy work"!
J and kairosfocus
J: That said, however, I don’t maintain that there is nothing special about intelligence with regard to entropy. What intelligence certainly can do is repeatedly and unpredictably create regions of indefinitely large amounts of low entropy, of arbitrary character. Blind/dumb/purposeless processes can’t.
kairosfocus: Precisely.
I am not clear what you two are claiming here. There are no known exceptions to the second law; throwing intelligence into the mix will not change that one bit (supernatural intelligence might, but I assume you are talking about human intelligence). Non-intelligent systems repeatedly, but predictably, create large regions of low entropy; as the planet turns whole continents cool down, losing entropy.
The Pixie
April 4, 2007, 03:38 PM PDT
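Written out, the algebra the comment appeals to runs as follows (the standard constant-temperature, constant-pressure derivation, supplied here for clarity):

    \Delta G = \Delta H - T\,\Delta S_{sys}
    -\frac{\Delta G}{T} = -\frac{\Delta H}{T} + \Delta S_{sys} = \Delta S_{surr} + \Delta S_{sys} = \Delta S_{total}
    \Delta S_{total} \ge 0 \quad\Longleftrightarrow\quad \Delta G \le 0

since at constant temperature and pressure the heat delivered to the surroundings is -\Delta H, giving \Delta S_{surr} = -\Delta H / T.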
kairosfocus
But the story of Body A is here serving as a rhetorical distractor from the key issue, namely that unless energy is COUPLED into a system that imports it, raw importation of energy will naturally increase its entropy. And, of course the systems relevant to say the OOL are energy importers, not heat exporters, much less work exporters. Dumping raw energy into the prebiotic atmosphere and oceans does not credibly get us to the origins of the functionally specific, highly complex and integrated macromolecules of life. Not in the gamut of the observed universe across its entire lifetime. The probabilistic resources are just plain far too scanty. [And, resort to an unobserved quasi-infinite array of sub-universes simply lets the cat out of that bag.]
I am going to ignore the probabilistic arguments that are not based on the second law. Personally, I am just talking about the second law (and yes, I know that has a probabilistic argument at its root, but that does not imply all probabilistic arguments are connected to the second law). Abiogenesis involves a whole load of chemical reactions. It is my belief that each and every one is accompanied by an overall increase in entropy. For example, the coming together of two amino acids to form a dipeptide will give an increase in entropy under certain conditions (e.g. where something is happening to the resultant water). We do not know what those reactions were, so, yes, I am taking that on faith, if you like. Frankly, I see the word "coupled" as a rhetorical device meant to make us think of machinery. In what way were A and B "coupled"? Er, they were next to each other. How much design does that take? Body A manages to export heat - losing entropy in the process - and body B manages to import heat. No intelligence required.
Further to this, you will note I was careful to give a specific case of a natural heat engine [a tropical cyclone] and to point out just where the issue relevant to the design inference comes in. Namely, the ORIGIN of such energy coupling and conversion devices as manifest specified complexity, e.g. as in [a] a jet engine in a jumbo jet, or [b] the far more sophisticated DNA-RNA-ribosome-enzyme systems in a cell.
Any rock cooling down overnight is an example of entropy decreasing without any need for design or machinery.
Of course, we need to ask how snowflakes form. Answer: the water is a closed but not isolated system, and as latent heat of fusion is extracted, the anisotropy of the molecules lends itself to highly structured, orderly — but precisely not COMPLEX — crystallisation. [This very case is cogently discussed in TMLO, which BTW also makes reference to relevant values from thermodynamics tables and from more specific research on the chemistry of biopolymers and their precursors.] By sharpest contrast, DNA is highly aperiodic, which is how it can store information in a code. How much information is stored in . . . HOHHOHHOHHOH . . .? [I know that’s a 2-D rep of a 3-D pattern, but the point is not materially different.]
It is worth bearing in mind that entropy is about order (or disorder, strictly), not complexity. A simple repeating sequence of bases in a DNA chain has exactly the same thermodynamic entropy as a DNA chain of the same length that prescribes a man.
This simply dodges the issue of getting TO the functional complex information stored and processed in the cellular energy conversion device.
Sure, but we are talking about the second law. The second law says nothing about how you get from one state to another, about machinery, about intelligence. It just says the entropy is higher at the end. There may or may not be these obstacles, but that would be off-topic.
Predictably, we see a pouncing on a minor error, in a context where the material point was already noted and corrected: SPONTANEOUS. (So, I have reason to say that we see here the knocking over of a convenient strawman. The PCs and net are DESIGNED, and the DNA that largely controls the production of the human brain etc. evinces all the characteristics that we would at once infer to be designed, were not a worldview assertion in the way.)
I am not sure what your point is about "spontaneous". In thermodynamics the combustion of coal is "spontaneous", but you try lighting a coal fire with a match. And again, you miss the fact that the second law says nothing about how you get from one state to another, about machinery, about intelligence. It just says the entropy is higher at the end.
The Pixie
April 4, 2007, 02:40 PM PDT
kairosfocus, I specifically referred to entropy accountancy, not "energy accountancy." Per the 2nd law, the entropy rate balance for a control volume (CV) is as follows:
[rate of change of entropy within the CV] =
+ [rate of entropy transfer associated with heat transfer across the boundary] (1)
+ [rate of entropy transfer associated with mass transfer into the CV] (2)
- [rate of entropy transfer associated with mass transfer out of the CV] (3)
+ [rate of entropy production within the CV due to irreversibilities] (4)
(1) is positive when heat is transferred in, negative when heat is transferred out. (2) and (3) are zero for closed systems (mass is not transferred into or out of a closed system). (4) is always positive. Show me an experiment that demonstrates that intelligence can violate this. (In 100 words or less, please.)
j
April 4, 2007, 05:02 AM PDT
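In symbols, the same balance is usually written in the textbook form below (added for reference; s denotes specific entropy and \dot{\sigma} the production term):

    \frac{dS_{cv}}{dt} = \sum_j \frac{\dot{Q}_j}{T_j} + \sum_{in} \dot{m}_i s_i - \sum_{out} \dot{m}_e s_e + \dot{\sigma}_{cv}, \qquad \dot{\sigma}_{cv} \ge 0

The second law constrains only the production term \dot{\sigma}_{cv}; the three transfer terms can individually carry entropy in or out without limit, which is why the entropy inside a control volume can fall while the law still holds.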
H'mm: A follow up or two seems in order, even in the absence of Pixie:
1] DS: Wiki cite: Surprisingly good, though I have in significant measure lost respect for this encyclopedia due to censorship and bias. A pity, as it was a good idea.
2] J: "I know of no good reason to think that intelligence (or the development of any of the products thereof) violates the entropy accountancy required by the 2nd Law." First, the law that undertakes "energy accountancy" is the 1st, not the second. The latter is the one that is called "time's arrow" for a reason -- it is temporally asymmetric and so gives a driving constraint on change. Second, strictly the 2nd law does not "forbid" what is improbable -- on the statistical thermodynamics view. It simply lays out the facts on the numbers of microstates that are consistent with given macroscopic state descriptions, e.g. the pressure, volume and temperature of a body of gas. Those facts are that we see an overwhelmingly probable cluster of microstates, which leads us to expect that fluctuations from equilibrium will be small enough to more or less ignore for any sizable body of gas, for instance. (A directly related point, elaborated by Einstein in 1905 -- this is why we basically don't see Brownian motion for sufficiently large particles in a fluid. The transition zone for BM is about a micron, if memory serves.) The relevant issue -- as has been pointed out already, and is in Dr Sewell's main article -- is that highly informational configurations of matter are precisely tightly constrained and so are vastly improbable relative to less information-rich states. Have a look, e.g., at TBO's thermodynamic analysis of the chance origin of protein and DNA molecules here, to see the point. Their onward analysis [Ch 9] of the plight of OOL research can in that context be viewed as a prediction of experimental results -- one that has been amply borne out over these 23 years since the publication of their work. (Dr Robert Shapiro's recent Sci Am article tellingly updates the point -- though he, too, fails to see that his strictures on the RNA world hypothesis also cut across his own favoured metabolism-first model.) So, there IS empirical data that strongly supports Dr Sewell's point. Just, there is a paradigm in the way of seeing the confirmation for what it is.
3] "I don’t maintain that there is nothing special about intelligence with regard to entropy. What intelligence certainly can do is repeatedly and unpredictably create regions of indefinitely large amounts of low entropy, of arbitrary character. Blind/dumb/purposeless processes can’t." Precisely. In the cell [e.g. in the brain, etc.] and in the PC and Internet [etc.] we see regions of indefinitely large amounts of low entropy, of arbitrary character. We know empirically that intelligent agency routinely produces such zones of functionally specific, complex integrated information and information processing structures. On thermodynamic probability grounds, we know that blind, spontaneous forces are maximally unlikely to get to such states in the gamut of the observed cosmos across its estimated lifetime. [Dembski's threshold is the spontaneous generation of ~ 500 bits. Both PC technologies and cell-based life systems by far exceed that bound.] So -- reckoning with a couple of infelicities of expression -- what, apart from a worldview assumption, leads to the resort to the maximally improbable, instead of the obvious, empirically well-supported inference? Namely:
. . . the [spontaneous] rearrangement of atoms into human brains and computers and the Internet . . . [is vastly improbable, to the point where, within the probabilistic resources of the OBSERVED cosmos, such a spontaneous rearrangement of hydrogen etc. into humans and their information-rich artifacts is maximally unlikely relative to the hypothesis that chance and/or natural regularities are the only driving forces . . .]
Jus wonderin . . . GEM of TKI
kairosfocus
April 4, 2007, 01:37 AM PDT
Dr. Sewell: First, welcome to UD. I do enjoy your writings. However, I must take issue with you about the 2nd law of thermo. You state:
But the fact is, the rearrangement of atoms into human brains and computers and the Internet does not violate any recognized law of science except the second law.
I know of no good reason to think that intelligence (or the development of any of the products thereof) violates the entropy accountancy required by the 2nd Law. Do you know of any experiment that demonstrates this? That said, however, I don't maintain that there is nothing special about intelligence with regard to entropy. What intelligence certainly can do is repeatedly and unpredictably create regions of indefinitely large amounts of low entropy, of arbitrary character. Blind/dumb/purposeless processes can't.
j
April 3, 2007, 06:43 PM PDT
I'm curious as to what Dr. Dembski thinks on ID. Does he believe there was an actual Adam and Eve, and how does he view the creation of life? Does Dembski support front loading? Curious
sfg
April 3, 2007, 02:36 PM PDT
Pixie, Thermodynamic principles are applied to far more than just energy. Keep in mind matter and energy are the same thing according to E=mc^2. In general 2LoT applies to gradients of all kinds. Gradients are areas of lowered entropy and 2LoT states that entropy tends to increase in ordered systems (order decreases). It also applies to information gradients. Intelligent processes can create information gradients. For instance, absent intelligence, the Library of Congress, a highly ordered set of information, would not exist with only energy from the sun input into the system and no intelligent agency directing how that energy is utilized to decrease entropy. Similarly, buildings, roads, space shuttles, and all sorts of other highly ordered things wouldn't exist without intelligent agency. Thermodynamics
The starting point for most thermodynamic considerations are the laws of thermodynamics, which postulate that energy can be exchanged between physical systems as heat or work.[4] They also postulate the existence of a quantity named entropy, which can be defined for any system.[5] In thermodynamics, interactions between large ensembles of objects are studied and categorized. Central to this are the concepts of system and surroundings. A system is composed of particles, whose average motions define its properties, which in turn are related to one another through equations of state. Properties can be combined to express internal energy and thermodynamic potentials, which are useful for determining conditions for equilibrium and spontaneous processes. With these tools, thermodynamics describes how systems respond to changes in their surroundings. This can be applied to a wide variety of topics in science and engineering, such as engines, phase transitions, chemical reactions, transport phenomena, and even black holes. The results of thermodynamics are essential for other fields of physics and for chemistry, chemical engineering, aerospace engineering, mechanical engineering, cell biology, biomedical engineering, and materials science to name a few.[6][7]
DaveScot
April 3, 2007, 01:38 PM PDT
More specifically, this 4-page document: Can ANYTHING happen in an open system?
Atom
April 3, 2007, 08:37 AM PDT
RE Pixie @ 13: You seem not to have read Prof. Sewell's article on the second law. Since you did not mention boundary conditions (or maybe I read your post too fast and missed it), it seems that you just argue what his paper answers. In a sentence, entropy can only decrease in an open system as fast as you export it through the boundary. He shows this mathematically. If you get an increase in order in an open system, it is because you are importing order. I suggest reading the paper if you haven't had a chance: A Second Look at the Second Law
Atom
April 3, 2007, 08:19 AM PDT
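For reference, the boundary statement summarized above can be written as the standard open-system entropy balance (sketched here from the usual Clausius form, not copied from Sewell's paper):

    \frac{dS}{dt} \;\ge\; -\oint_{\partial\Omega} \frac{\mathbf{q}\cdot\mathbf{n}}{T}\, dA

with q the heat flux and n the outward normal: the entropy inside the region \Omega can fall no faster than entropy is carried out across its boundary, which is the inequality the "anything can happen in an open system" rejoinder has to reckon with.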
Dr Sewell: Perhaps it is as well I lost my own blog post for the morning -- I will have to reconstruct from memory -- through one of those annoying PC mess-ups. For, on telling Firefox to reconstruct, it updated this thread, which just happened to have been open in a tab. Lo and behold: Pixie aptly illustrates just the point made above by Profs Johnson and Sewell! FYI, Pixie:
1] The case no. 1: || A, Th --> d'Q --> B, Tc || This shows that a hotter body on losing heat will (of course -- the number of available microstates goes down) reduce its entropy, but also that the lost heat goes somewhere, and in so doing it net retains or increases the entropy of the overall system. But the story of Body A is here serving as a rhetorical distractor from the key issue, namely that unless energy is COUPLED into a system that imports it, raw importation of energy will naturally increase its entropy. And, of course, the systems relevant to say the OOL are energy importers, not heat exporters, much less work exporters. Dumping raw energy into the prebiotic atmosphere and oceans does not credibly get us to the origins of the functionally specific, highly complex and integrated macromolecules of life. Not in the gamut of the observed universe across its entire lifetime. The probabilistic resources are just plain far too scanty. [And, resort to an unobserved quasi-infinite array of sub-universes simply lets the cat out of that bag.] Further to this, you will note I was careful to give a specific case of a natural heat engine [a tropical cyclone] and to point out just where the issue relevant to the design inference comes in. Namely, the ORIGIN of such energy coupling and conversion devices as manifest specified complexity, e.g. as in [a] a jet engine in a jumbo jet, or [b] the far more sophisticated DNA-RNA-ribosome-enzyme systems in a cell. In all directly known cases, where such functionally specific, complex systems are observed, their origin is intelligent agency. Properly, it is those who argue that in effect a tornado in a junkyard can assemble, fuel and fly a 747 who have something to prove. Something that, after 150 years of trying, remains unproved to date.
2] "The second law DOES only apply to thermodynamic entropy . . ." A glance at, say, Brillouin, as linked through my handle, will show that entropy is far broader than just heat, through the informational implications of the statistical weight of macrostates. This insight is exploited in TBO's TMLO, chapter 8, also linked through my own outline discussion. Surely, as one familiar with Gibbs etc., you are aware of the informational approach to statistical thermodynamics that his work was a precursor to, as followed through from Jaynes on to, for instance, Harry Robertson [look up his Statistical Thermophysics, PHI, 1993], as I cite? [In short, there is a whole other school out there tracing to Gibbs . . . You may disagree, but these folks have a serious point.]
3] "Molecules of water will spontaneously form snowflakes (even in a system that has no water going in or out), so again 'water order' can decrease." Of course, we need to ask how snowflakes form. Answer: the water is a closed but not isolated system, and as latent heat of fusion is extracted, the anisotropy of the molecules lends itself to highly structured, orderly -- but precisely not COMPLEX -- crystallisation. [This very case is cogently discussed in TMLO, which BTW also makes reference to relevant values from thermodynamics tables and from more specific research on the chemistry of biopolymers and their precursors.] By sharpest contrast, DNA is highly aperiodic, which is how it can store information in a code. How much information is stored in . . . HOHHOHHOHHOH . . .? [I know that's a 2-D rep of a 3-D pattern, but the point is not materially different.]
4] "Evolution can happen because while entropy in the system is decreasing, the overall entropy, including the entropy of the surroundings is going up." This simply dodges the issue of getting TO the functional complex information stored and processed in the cellular energy conversion device. We have reason to believe that in the gamut of the observed cosmos, even so little as 500 bits of information will not spontaneously form, relative to a specified configuration. [E.g. tossing and reading once per second, how long will you have to wait on average to get 500 heads in a set of coins? Answer: far longer than the observed universe has existed to date.]
5] "The rearrangement of atoms into human brains and computers and the Internet clearly does not violate any law of nature, recognized or not." Predictably, we see a pouncing on a minor error, in a context where the material point was already noted and corrected: SPONTANEOUS. (So, I have reason to say that we see here the knocking over of a convenient strawman. The PCs and net are DESIGNED, and the DNA that largely controls the production of the human brain etc. evinces all the characteristics that we would at once infer to be designed, were not a worldview assertion in the way.) Predictable . . . and, sad. GEM of TKI
kairosfocus
April 3, 2007, 07:36 AM PDT
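A quick numerical check of the 500-coin waiting-time claim above (a sketch added for illustration; the figures are ordinary arithmetic, not from the thread):

    # Expected waiting time to see all 500 coins land heads, checking once per second.
    # Each check succeeds with probability 2**-500, so the mean wait is 2**500 seconds.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    mean_wait_years = 2**500 / SECONDS_PER_YEAR
    age_of_universe_years = 13.7e9  # commonly cited estimate

    print(f"mean wait: {mean_wait_years:.3e} years")
    print(f"ratio to age of universe: {mean_wait_years / age_of_universe_years:.3e}")

The mean wait works out to roughly 10^143 years, some 10^133 times the commonly cited age of the universe, which is the quantitative content of the "500 bits" threshold mentioned in the thread.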
Thermodynamics usually deals with substances which may be treated as continuous and homogeneous (even though they consist of discrete atoms and molecules), with uniform properties throughout. In contrast, living things are discontinuous -- there are individual cells, and then the cells have discontinuous structures inside. Furthermore, living things have a purposeful arrangement of parts. When a homogeneous substance at the macroscopic level consists of gadzillions of the same atoms or molecules, the substance's properties become uniform because the mathematics of the probabilities of large numbers takes over. If, for example, a coin is tossed ten times, five heads (or tails) is the most likely result, but the probability of getting exactly this result is small. However, if a coin is tossed a thousand times, the probability that the percentage of heads (or tails) will fall between 45 and 55 is large. Therefore, tossing a coin one thousand times -- as compared to only ten times -- has a more uniform "property" of producing results of approximately one-half heads (or tails). Also, the Second Law of Thermodynamics is often stated in physical or engineering terms that have nothing to do with biology. For example, a popular statement of the SLOT is the Kelvin statement: "It is impossible to construct an engine, operating in a cycle, whose sole effect is receiving heat from a single reservoir and the performance of an equivalent amount of work." I think that the SLOT makes poor arguments either for or against evolution. I think that SLOT arguments against evolution do nothing more than expose critics of evolution to ridicule from the Darwinists. As for a 4th Law of Thermodynamics, I have heard of 4th and higher-numbered laws of thermodynamics, but I think that only the first four laws, the Zeroth through the 3rd, are universally recognized. My blog has an article discussing the application of the SLOT to evolution theory: http://im-from-missouri.blogspot.com/2007/02/2nd-law-of-thermodynamics-and.html
Larry Fafarman
April 3, 2007, 06:06 AM PDT
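The two coin-toss probabilities in the comment above are easy to verify exactly (a short sketch, standard library only; the bounds 450 to 550 are the "45 to 55 percent" band):

    from math import comb

    # P(exactly 5 heads in 10 tosses): the most likely single outcome, yet not probable.
    p10 = comb(10, 5) / 2**10
    print(f"P(5 of 10) = {p10:.4f}")  # 0.2461

    # P(heads percentage falls between 45% and 55% in 1000 tosses).
    p1000 = sum(comb(1000, k) for k in range(450, 551)) / 2**1000
    print(f"P(450..550 of 1000) = {p1000:.4f}")  # about 0.9986

This bears out the comment: the modal outcome of ten tosses occurs less than a quarter of the time, while a thousand tosses land within five percentage points of even odds with near certainty.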
kairosfocus
1] Now, the classic e.g. no 1 in studying the 2nd law, is an isolated system having in it two thermally interacting closed systems, A at Thot, B at Tcold. {I am using the more usual physics terminology: closed systems exchange energy, but not matter, with their surroundings. Open ones exchange both, isolated ones exchange neither.)
What is interesting about this example is that it actually proves entropy can go down in a closed system! Just consider A; for the moment the system we are interested in comprises A alone. A is not an isolated system, as it exchanges energy with B, but we can still study it as a system. What happens when heat energy flows from A to B? The entropy in A decreases! And it can do that because the total entropy, the entropy in the system (i.e. A) plus the entropy in the surroundings (effectively B, as heat can go nowhere else), is increasing. Even when we consider open systems, the second law demands that total entropy goes up, but the entropy in the system might still go down. And it is worth noting that the entropy in A decreased without the aid of any machinery.
The Pixie
April 3, 2007, 04:22 AM PDT
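The arithmetic behind this classic two-body example, written out (illustrative numbers only):

    \Delta S_A = -\frac{Q}{T_h}, \qquad \Delta S_B = +\frac{Q}{T_c}, \qquad \Delta S_{total} = Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0 \ \text{ for } T_h > T_c

For instance, Q = 100 J passing from T_h = 400 K to T_c = 300 K gives \Delta S_A = -0.25 J/K and \Delta S_B = +0.33 J/K, a net +0.08 J/K: body A's entropy genuinely falls, and the law is satisfied only in the total.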
Granville Sewell
People have found so many ways to corrupt the meaning of this law, to divert attention from the fundamental question of probability–primarily through the arguments that “anything can happen in an open system” (easily demolished, in my article) and “the second law only applies to energy” (though it is applied much more generally in most physics textbooks). But the fact is, the rearrangement of atoms into human brains and computers and the Internet does not violate any recognized law of science except the second law, so how can we discuss evolution without mentioning the one scientific law that applies?
I am afraid I am going to have to disagree. The second law DOES only apply to thermodynamic entropy (a measure of the distribution of energy, defined by dS = dQ/T). It is, after all, the second law of thermodynamics. Furthermore, the fact that it is specified for a closed system, that is, one in which energy cannot get in or out, is a further hint that the second law is about energy. The second law does not apply to, say, "carbon order". The Earth is a closed system for "carbon order", and yet "carbon order" can and does increase every time a tree grows. It does not apply to "water order". Molecules of water will spontaneously form snowflakes (even in a system that has no water going in or out), so again "water order" can decrease. That is not to say that the arrangement of matter is unaffected by the second law, but matter is affected because it impacts on how energy is distributed. A snowflake is low in thermodynamic entropy because the individual atoms cannot move much; there is not much scope for randomly distributing energy around a system. When a snowflake forms, the entropy of the water decreases. But this is okay for the second law, because as the snowflake forms, it releases energy into the surroundings, and that increases the entropy of the surroundings. OVERALL thermodynamic entropy increases, as it always must. Which brings us to open systems. FORMALLY the second law applies only to closed systems, but the mathematics of Gibbs allows us to extend that to open systems. All you have to remember is that the total entropy, the entropy change in the system plus the entropy change in the surroundings (ignoring ALL other processes in the universe), must increase. Evolution can happen because while entropy in the system is decreasing, the overall entropy, including the entropy of the surroundings, is going up. The rearrangement of atoms into human brains and computers and the Internet clearly does not violate any law of nature, recognized or not. It happens. I have three kids, all with human brains. Ten years ago, before any of them were conceived, all those atoms were not in those human brains. And trust me, no one violated any laws of nature to get those atoms into those human brains.
Much of the disagreement and confusion about the second law is due to the fact that, unlike most other “laws” of science, there is not widespread consensus on exactly what it says. It was originally applied only to heat conduction, then gradually generalized more and more, until today many physics textbooks apply it to things like rusting computers and shattering glass. The key is that there is one and only one principle behind all applications, and that is the use of probability at the microscopic level to predict macroscopic change. Thus as far as I am concerned, this principle IS the second law.
The second law says entropy increases in a closed system, S[final] > S[initial]. That was what it said when it was originally applied to heat conduction. That is still what it says when applied to chemical and biological processes, to rusting computers and even black holes. And it is all the same entropy, dS = dQ/T. Boltzmann did indeed relate that entropy to "probability at the microscopic level", S = k ln W. But this equality is only justified if W is the number of energy microstates. Thermodynamic entropy is a "property of state". That means that a given substance at a given set of conditions always has the same entropy. You can look up that entropy for common materials at standard conditions (e.g. here is a table for engineers of the entropy of steam at various temperatures). The entropy of a material can be calculated from dS = dQ/T (given that entropy is zero at absolute zero), or from S = k ln W. You get the same value either way.
The Pixie
April 3, 2007, 03:52 AM PDT
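A worked number makes the "same value either way" point concrete (illustrative figures, not from the thread):

    \Delta S = \int_{T_1}^{T_2} \frac{m\, c_p}{T}\, dT = m\, c_p \ln\frac{T_2}{T_1} \approx 1\,\text{kg} \times 4186\,\frac{\text{J}}{\text{kg K}} \times \ln\frac{350}{300} \approx 645\,\frac{\text{J}}{\text{K}}

Heating a kilogram of water from 300 K to 350 K adds about 645 J/K of entropy whether one integrates dQ/T from calorimetric data or counts microstates statistically; for equilibrium states the two routes agree, which is the property-of-state claim in the comment.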
Ack! Calculus! Well, someone has to know how to do it. But leaving that aside for the moment, and of necessity... Whenever someone starts blithering on about the ancient earth being an open system receiving energy from the sun, which could be enough to drive evolutionary processes, I want to tell them to go and start living outdoors all the time and see what all that in-pouring energy from the sun will do to them. My poor non-tanning husband spent many days on the beach in his youth. The skin cancers started turning up in his very early 30s -- thankfully all BCCs so far. Or, talking about driving, how about next time they go to the petrol station they don't bother with pumping it into the system designed to feed the fuel into a chamber where (by design) its energy is extracted via a series of tiny, controlled explosions which (by design) move pistons whose up and down movements (by design) are translated into forward or backward movement which (by design) you can select for by choosing one of several gears. Instead, just pour it all over the car, light it, and see how far the car will go. Of course, I realise that these sorts of simple arguments carry no weight with those who wilfully choose to give them no weight. Some may not believe that there are fairies at the bottom of the garden, but they all believe in magic, even if they do call it abiogenesis followed by macroevolution.
Janice
April 3, 2007, 03:12 AM PDT
H'mm: Given the way minor points are often abused by evolutionary materialism advocates through obfuscatory debate tactics, to divert attention from the material point, I should note that a slight adjustment to Dr Sewell's quote may be helpful:
. . . the [spontaneous] rearrangement of atoms into human brains and computers and the Internet . . . [is vastly improbable, to the point of constituting a violation of what we reasonably expect under 2 LOT]
Of course, too, the correctly understood sense of "violates" here is that:
. . . within the probabilistic resources of the OBSERVED cosmos, such a spontaneous rearrangement of hydrogen etc. into humans and their information-rich artifacts is maximally unlikely relative to the hypothesis that chance and/or natural regularities are the only driving forces . . .
The most telling illustration of the force of that point is the commonly met with idea that there is a quasi-infinite array of sub-universes in the wider cosmos as a whole. So, with assumed randomly distributed laws and circumstances, the complex world we see described by Prof Sewell above has in fact happened by chance. This is of course both an ad hoc speculative assertion without basis in observational fact, and one that begs major metaphysical questions and debates. It shoots itself in the foot by what such a resort to the quasi-infinite and unobserved implies: the relevant probabilities are so small in a universe of up to 10^80 or so atoms and up to say 13.7 BY or so, that it is implausible to expect that the cosmos we see originated by chance + necessity only within that gamut. The now rising onward objection that such estimates of probability are incalculable or unproven or useless fails too. For instance, consider that the odds of, say, the value of the cosmological constant [~ the energy density of free space, the "yeast" that makes space itself expand] falling within a life-permitting range are sometimes estimated as 1 in 10^53, based on the physics of what ranges it could have in the raw, and what range it can reasonably have that is life-permitting. The dismissal of this estimate or the like rests on selectively ignoring or dismissing the basic facts, principles and factors in how probabilities (and related quantities) are calculated. That selective hyperskepticism starts with how we estimate the odds of throwing a 6 on a die as 1 in 6. That is, through the Laplacian principle of indifference and a comparison of the result in question to the range of possible results. For, as we have no reason to prefer any one face of the 6, the odds are 1 in 6 for a specified face. I further note that probabilities and associated expectations are routinely and reasonably estimated, and in effect indicate/model (sometimes quantitatively) the intuitive idea of the rational degree of confidence we can place in something occurring or not occurring "by chance." They are equally routinely used in important decision-making situations. I trust these notes will be helpful in addressing some of the usual diversionary talking points. Ah Gone . . . GEM of TKI
kairosfocus
April 3, 2007, 12:43 AM PDT