Uncommon Descent Serving The Intelligent Design Community

Are Darwinian claims for evolution consistent with the 2nd law of thermodynamics?


A friend wrote to ask because he came across a 2001 paper, "Entropy and Self-Organization in Multi-Agent Systems" by H. Van Dyke Parunak and Sven Brueckner, Proceedings of the International Conference on Autonomous Agents (Agents 2001), 124-130:

Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy “sink,” permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired. We make this metaphor precise by constructing a simple example of pheromone-based coordination, defining a way to measure the Shannon entropy at the macro (agent) and micro (pheromone) levels, and exhibiting an entropy-based view of the coordination.
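For readers wondering what "measuring the Shannon entropy" involves: for a discrete set of observations it is H = -sum(p_i * log2(p_i)), computed from how often each state occurs. A minimal sketch of that calculation (my own illustration of the formula, not the authors' code; the example bins are arbitrary):

import math
from collections import Counter

def shannon_entropy(samples):
    # H = -sum(p * log2(p)) over the observed states, in bits
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Macro level: agents clustered into a few grid cells -> low entropy.
# Micro level: pheromone "molecules" scattered over many cells -> high entropy.
agent_cells = ["A", "A", "A", "B"]
pheromone_cells = list(range(16))

print(shannon_entropy(agent_cells))      # ~0.81 bits
print(shannon_entropy(pheromone_cells))  # 4.0 bits

The paper's claim is that self-organization drives the macro (agent) number down while the pheromone dynamics drive the micro number up, the micro level acting as the "sink."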

The thought seems to be that entropy decreases here but somehow increases somewhere where we can’t see it.

I’ve (O’Leary for News) always thought that a fishy explanation, especially because I soon discovered that even raising the question is considered presumptive evidence of unsound loyalties. The sort I am long accustomed to hearing from authoritarians covering up a scandal.

So not only do I not believe it, but after that sort of experience I get the sense I shouldn’t believe it. Depending on where I am working, I might need to parrot it to keep my job, of course, but it would be best not to actually believe it.


Rob Sheldon told us both,

What you read is the “standard” physics response. It is misleading on many levels.

a) Physicists really, really can’t explain what goes on in biology. Neither their definition of entropy, nor their definition of information (Shannon, etc) work. Rather than admit that they don’t know what is going on, they simply extrapolate what they do know (ideal gases) to biology and make pronouncements.

b) While it is true that “open” systems may allow energy and matter to flow through them, which would change the information in the system, this does not and cannot explain biology. The best treatment of this is Granville Sewell’s articles on different types of entropy. Truly excellent. It explains why sunlight does not carry enough information to create life out of precursor molecules. And people who claim this are either: (i) deluded that physics entropy = biology entropy, or (ii) equivocating on the use of the word “entropy”, or (iii) unable to handle basic math, or most likely, (iv) all the above.

c) This paper suggests that the cell has machinery for converting sunlight to information, e.g. photosynthesis. While true, this machinery must be even more complicated than the carbohydrates it produces. Ditto for self-replicating machinery, etc. So if we permit some high level of information to enter the system, then low-level information can be created from energy sources. This argument really is indistinguishable from ID, though they may not realize it.

In conclusion, the violation of the 2nd Law remains true for biology, and there still is no good physics explanation for it.

It’s a good thing they didn’t realize it. They won’t have to issue some embarrassing repudiation of their work.

And I don’t have to believe something for which we have no evidence just to protect the tenurebots’ theory.

Follow UD News at Twitter!

Comments
Box: Just like the question of whether ID is “science” ultimately just depends on your definition of “science” and is irrelevant to its truthfulness, debating whether Sewell’s argument is based on what you consider the “Second Law” is an irrelevant distraction to its truthfulness.

It makes a difference because when Sewell says it's the 2nd law of thermodynamics, then the arguments raised concern the 2nd law of thermodynamics. (See reply to halloschlaf later in this comment.) Maybe he means to propose a law *analogous* to the 2nd law of thermodynamics. Call it the 2nd law of something-something, but don't call it the 2nd law of thermodynamics.

Box: The four fundamental forces are pretty creative indeed, however intuition informs us that there is a definite boundary to what they can do.

Intuition is a powerful tool, but doesn't substitute for scientific support.

Seversky: The conditions in the primordial singularity would seem to have been so unimaginably extreme that nothing – not matter, not information, not order, not regularity – could possibly survive intact.

While there's a lot unknown, much of the order is due to symmetry breaking, quantum fluctuations, and the resulting history of interactions.

Zachriel: Adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics. Life and evolution are consistent with the 2nd Law of Thermodynamics, as are refrigerators and designer genes.
Mung: And your knowledge of these alleged “facts” comes from where?

They're scientific findings.

Zachriel: Are you saying the manufacture of computers violates the 2nd law of thermodynamics?
halloschlaf: No. Because in this case manifested intelligence in the form of robots, machines, and workers has transgressed the border of the open system to form cybernetic systems.

It's energy that crosses the boundary that keeps it in conformity with the 2nd law. Intelligence doesn't impact the 2nd law of thermodynamics, no matter how inventive the engineer. That's the reason the law was originally formulated. Perhaps you are referring to the 2nd law of something-something. Zachriel
Box, yes and yes again. That is why I took time to directly confront objectors with the case of the Abu 6500 C3 reel, to show how emergent functionality rooted in specific, configuration dependent interaction of parts is an undeniable fact of the real -- reel? -- world. Then, we can extend to molecular nanotech in cells that shows the same and ask pointed questions on where the empirical evidence points to on origins, given the needle in haystack search challenge. The very challenge that the statistical analytical framework integral to a modern understanding of 2LOT, poses. Where -- ever since Paley's thought exercise on a self-replicating functional watch in Ch 2 of NT showed -- the origin of the cell's von Neumann self replication facility is an additional critical case of FSCO/I that should increase our recognition of supreme artifice or contrivance. In the face of such, evasiveness and question begging circularities such as CH posed above stand out like a sore thumb and flag their inadequacies. KF kairosfocus
Mung, cf 72 above, which I have been fruitlessly trying to draw to the attention of objectors to design thought and the thermodynamics connexion. Names like Brillouin, Jaynes and Robertson as well as Gilbert N Lewis have weighed in long since. Just, it seems such cuts across a favoured talking point line. KF kairosfocus
Zachriel:
Adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics. Life and evolution are consistent with the 2nd Law of Thermodynamics, as are refrigerators and designer genes.
And your knowledge of these alleged "facts" comes from where? Mung
#137 All that was needed was for the Universe to expand and cool. The expanding universe allowed the excess of thermal energy to escape, satisfying the 2nd LoT as physical interactions led to the emergence of structures (see Hangonasec at #85). Piotr
rvb8 died a mysterious death. The death of cowards. Mung
We make this metaphor precise by constructing a simple example of pheromone-based coordination, defining a way to measure the Shannon entropy at the macro (agent) and micro (pheromone) levels, and exhibiting an entropy-based view of the coordination.
What is Shannon entropy and what is its relationship to thermodynamic entropy? News? Anyone? Mung
Are young earth creationist claims consistent with the second law of thermodynamics? Mung
Piotr, The four fundamental forces are pretty creative indeed; however, intuition informs us that there is a definite boundary to what they can do. CS3 on this matter:
I think it is fair to say that the Second Law says that the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability. The only problem is that, outside of thermal entropy, which is easily quantifiable, allowing one to compute with certainty which arrangement is more probable, in other cases, we usually can only use intuition to guess which arrangement has greater probability. And intuition can sometimes be wrong, but that doesn’t mean it isn’t sometimes pretty obvious.

To use my magnetized coin flipping example again, I can believe that natural forces could cause all heads to come up, even though that would be an extremely improbable macrostate (with only one microstate) based on random chance alone, because that is consistent with the type of “order” I could expect based on my knowledge and experience with the natural forces (and in fact, would happen if I placed a magnet under the table). However, I could not believe that natural forces could cause the coins to come up as a bit-wise representation of a great novel (without, say, an intelligent human setting initial conditions to assure that happens, for example, by placing small magnets selectively under certain coins), even though that macrostate is no more improbable (assuming only chance) than the all heads macrostate (in fact there are more microstates in this case), because that is not a type of “order” consistent with my understanding or experience of what the natural forces can do. Could they land such that every 2i+5 is heads? I would think probably not, but maybe there is a way that could happen that I just haven’t thought of. It is not always easy to say exactly what the four unintelligent natural forces can and can’t do, but that doesn’t mean, in my opinion, that there aren’t some cases that are obvious. (Disclaimer: obviously all of these statements are made assuming a reasonable limit on the total number of flips.)

That said, I agree that, apart from the refutation of the compensation argument (which was the content of the paper itself), these other arguments are essentially equivalent to the CSI arguments made, with more rigor, by Dembski and others. Thus, I would not expect anyone unconvinced by those arguments to be convinced by Sewell’s arguments in that regard.
Piotr: clouds collapsed gravitationally into large-scale structures, eventually forming dense objects like stars and protoplanetary discs… etc.
which reminds me of Eric Anderson's article on the accretion hypothesis, which shows how much uncertainty there is on these apparent 'basic' matters. Box
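CS3's magnetized-coin illustration quoted above can be made concrete with a short calculation: the "all heads" macrostate has exactly one microstate, while the "roughly half heads" macrostate has an enormous number of them. A minimal sketch (the 100-coin count and the 45-55 band are my own choices for illustration, not CS3's):

import math, random

n = 100
p_all_heads = 1 / 2**n                                            # one microstate out of 2^100
p_near_half = sum(math.comb(n, k) for k in range(45, 56)) / 2**n  # many microstates

print(f"P(all {n} heads)      = {p_all_heads:.2e}")   # ~7.9e-31
print(f"P(45-55 heads of {n}) = {p_near_half:.2f}")   # ~0.73

# A quick simulation of fair flips lands in the 45-55 band about as often:
runs = 10_000
hits = sum(45 <= sum(random.randint(0, 1) for _ in range(n)) <= 55 for _ in range(runs))
print(hits / runs)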
The four fundamental forces are evidence for an Intelligently Designed universe. Joe
Piotr @ 136
“The four fundamental forces” are pretty creative. About one nanosecond after the Big Bang the uniform quark soup of the earliest universe curdled into baryons. A few minutes later some protons and neutrons were already combining into deuterium and helium nuclei....
This has always struck me as one of the greatest mysteries. The conditions in the primordial singularity would seem to have been so unimaginably extreme that nothing - not matter, not information, not order, not regularity - could possibly survive intact. But if that is so, whence came the order, regularity, information that emerged after the singularity went "Bang"? Was it still intrinsic, surviving somehow where nothing should have survived? Or was it extrinsic, entering from outside? But if that is the case, what "outside" could it have come from? Seversky
#133 Box, "The four fundamental forces" are pretty creative. About one nanosecond after the Big Bang the uniform quark soup of the earliest universe curdled into baryons. A few minutes later some protons and neutrons were already combining into deuterium and helium nuclei. Thousands of years later, nuclei recombined with electrons to form neutral atoms; atoms formed simple molecules; simple molecules formed larger molecules; gravity made baryonic matter condense into clouds of gas; clouds collapsed gravitationally into large-scale structures, eventually forming dense objects like stars and protoplanetary discs... etc. A lot of order was generated in perfect agreement with the second law of thermodynamics. Piotr
Kairosfocus, Every time this subject is brought up I realize more and more how fundamental it is in relation to ID, FSCO/I and holism. BTW thanks for mentioning E.T.Jaynes, I've acquired "Probability Theory - the logic of science". Box
Box, yes. KF kairosfocus
Excellent summation of Sewell's argument by CS3:
Sewell’s argument basically boils down to two statements:
1) Natural forces do not do things that are macroscopically describable that are extremely improbable from the microscopic point of view.
2) Statement 1 holds whether a system is isolated or open; when it is open, you just have to also consider what is entering or leaving the system when deciding what is or is not extremely improbable.
Statement 1 derives from two sources: the principle that particles obey the four fundamental forces, and the idea that, of all the microstates equally likely given the constraints of the four fundamental forces, macrostates with more microstates are more probable. For example, when only diffusion is operative, all positions within the volume are equally likely, so a uniform distribution is the most probable macrostate. A macrostate with few microstates will be achieved only if the four fundamental forces make that microstate not improbable – for example, a magnet moves magnetic particles initially uniformly distributed in a volume all to one side of the volume.
Statement 2 derives from logic and common sense, although he also proves it analytically for the simple case of diffusion.
This statement by Halloschlaf is also very clear:
Halloschlaf: So if you want to show the aggregation of microparticles to form cybernetic systems, you actually have to look for adequate forces or principles; otherwise the pool of microparticles will be ruled by the second Law, i.e. the trend to higher entropy states.
Box
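The diffusion example in the quoted summary is easy to check numerically: start all the particles on one side, let them random-walk, and the occupancy drifts toward the uniform macrostate simply because that macrostate has overwhelmingly more microstates. A minimal sketch (a 1-D walk standing in for diffusion; the particle and bin counts are arbitrary choices of mine):

import random
from collections import Counter

def diffuse(n_particles=1000, n_steps=2000, n_bins=10):
    # All particles start in bin 0; each takes an unbiased +/-1 step per tick,
    # clamped to stay inside the n_bins cells.
    positions = [0] * n_particles
    for _ in range(n_steps):
        for i in range(n_particles):
            positions[i] = min(max(positions[i] + random.choice((-1, 1)), 0), n_bins - 1)
    return Counter(positions)

print(diffuse())  # roughly 100 particles per bin: the uniform macrostate wins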
@zachriel Question 1: Of course. They tend to fall apart. Question 2: No. Because in this case manifested intelligence in the form of robots, machines, and workers has transgressed the border of the open system to form cybernetic systems. halloschlaf
halloschlaf: We don't argue about the ruling of the second law over cybernetic systems, we argue about their origin.

So you agree that cybernetic systems follow the 2nd law of thermodynamics? Electronic computers can act as cybernetic systems. Are you saying the manufacture of computers violates the 2nd law of thermodynamics? Zachriel
@zachriel I don't understand you. We don't argue about the ruling of the second law over cybernetic systems, we argue about their origin. halloschlaf
halloschlaf: Additionally, it has also been obvious for decades that cybernetic systems and their parts are definitely not arranged by Coulomb forces.

Cybernetic systems still follow the 2nd law of thermodynamics. Zachriel
@ zachriel and hangonasec

The formation of crystals is of course no problem for the second Law regarding X-Entropy, because the binding forces are system-inherent. It's the same fallacy as with the first thought experiment by Davisson, in which it was overlooked that gravity has already transgressed the border of the open system. Of course you will, under special conditions, see ordering effects with respect to weight when there is a gravity field; that's trivial. The same is true of the coupling or binding forces of microparticles. These forces have already transgressed the border of the open system and will obviously show up under special conditions. The energy levels of the coupling forces will actually become effective when the kinetic energies are too small and are therefore overruled. But there is no doubt that without the Coulomb forces there will be no binding. The Coulomb forces themselves obey the second Law in the same way and tend towards lower energy states, i.e. higher entropy.

So if you want to show the aggregation of microparticles to form cybernetic systems, you actually have to look for adequate forces or principles; otherwise the pool of microparticles will be ruled by the second Law, i.e. the trend to higher entropy states. This means: the strongest bindings will be preferred. It's only a question of energy states. I hope you will admit this.

It's the same with the gravity thought experiment. The particles will definitely sink to the bottom of the jar in perfect accordance with the existing gravity field. On the bottom of the jar there is the lowest energy state, and there they will stay. In the same way it would be essential to show that the assumed replicator is at "the bottom" of the inherent Coulomb energy states. Otherwise its realization is forbidden by the second law. That replicators are of this kind of system type has never been shown to be true. In fact the opposite seems to be obvious.

Additionally, it has also been obvious for decades that cybernetic systems and their parts are definitely not arranged by Coulomb forces. Actually it's the very purpose of cybernetic systems not to be dominated by Coulomb or gravity forces; otherwise there would be no freedom/variability, which is the sign of cybernetic systems. This is especially crucial for nanosystems. Cybernetic systems have to be arranged in a way that there will be decision nodes, which are per definitionem force-free. Just look for example at the arrangement of DNA as a part of a cybernetic system. In force-free systems, which are needed to form cybernetic systems, the second Law rules with an iron fist. If there is no adequate force which transgresses the border of the observed open system to form decision nodes, cybernetic systems won't come into existence. halloschlaf
kairosfocus: There is no need to further belabour such in time wasting circles of back and forth. That's completely up to you. Do you agree that the manufacture and use of a computer do not violate the 2nd law of thermodynamics? Zachriel
Z, it is obvious that you are not open to reckon with the informational and config or phase space blind search challenges and linked plausibility (reducible to probability under some cases per approaches of statistical thermodynamics) that are involved. There is no need to further belabour such in time-wasting circles of back and forth. I therefore simply refer the serious onlooker to the remarks at 72 above and onwards. I further note to such onlookers that these remarks in some form have been linked through my handle for every comment I have ever made at UD -- that is what in the end the violently hostile objectors have been trying to undermine when they have done everything they could to play the red herrings led away to ad hominem soaked strawman caricatures set alight with accusatory rhetoric games that have been such an unfortunate feature of the debates on design. Z and a few others have not stooped to such levels, but the evasiveness at key points is plain. KF kairosfocus
kairosfocus: again, I point out that above I indicated how crystalline ordered states occur. So you agree that ordered states can occur under the 2nd law of thermodynamics, atoms can be sorted and arranged into specific configurations? It's as if you flipped trillions of coins, and they all ended up on heads. And you apparently arrive at that conclusion because you understand and accept the mechanisms involved. kairosfocus: There is no analogue to that which would spontaneously “crystallise” or polymerise the hundreds of proteins and other molecules needed to form a living cell. Yes, we understand you reject the proposed mechanisms, but those proposed mechanisms are not in violation of the 2nd law of thermodynamics any more than the manufacture and use of a computer are a violation of the 2nd law of thermodynamics. Zachriel
Z, again, I point out that above I indicated how crystalline ordered states occur. I also pointed out how organised functionally specific states occur. Neither are inconsistent with 2LOT. But what is contrary to the reasoning involved is that blind chance forces are being expected to spontaneously come up with organisation. In the case of say water molecules, there is a polar structure and geometry that gives rise to ice once there is a process that removes the thermal agitation sufficiently. There is no analogue to that which would spontaneously "crystallise" or polymerise the hundreds of proteins and other molecules needed to form a living cell; including of course a neuron . . . much less the embryological program to create a brain much less a human one and onward issues about programming the neural networks. Indeed, high contingency as opposed to mechanically necessary order is a requisite of the informational nature of DNA and RNA, and thus proteins assembled based on such; the chemistry of chaining does not force an ordering, and the side chains do not force a chaining in a particular sequence either. Such would be self-defeating. That, too, is a very good reason why we observe such being assembled in living cells based on execution of highly specific instructions that are based on a code and algorithms with execution machinery. So the comparison you have tried to set up with your artful question -- with all due respect -- is little more than a way of avoiding making and reckoning with due, empirically grounded distinctions. KF kairosfocus
Box, 2LOT does inextricably have the context that Sewell has spoken to, post Josiah Willard Gibbs and others who founded statistical mechanics or statistical thermodynamics. It so happens that it is one of the laws of physics with multiple formulations, having been arrived at by multiple people from multiple directions. The pivotal issue is the logic and linked plausibility of forming rare clusters of states spontaneously. Just consider: temperature is a macro-observable that traces to the fact that, at molecular etc. scale, there is a random distribution of energy and mass across the set of possible configurations, i.e. degrees of freedom. And in fact I have found that the informational view that Jaynes et al brought to full form gives particularly relevant insights. I find the astonishing refusal to acknowledge that well known fact quite revealing that something is deeply, deeply wrong. KF kairosfocus
kairosfocus, You forgot to answer the questions we posed. So you agree that order is not unexpected to occur because of the 2nd law of thermodynamics, as we can easily point to many natural, ordered phenomena, none of which violate the 2nd law? Which has less entropy (thermodynamic ‘order’), a human brain or a like mass of diamonds? Zachriel
Z, on the contrary, it does; that is the point of the underlying microstate picture issue: functionally specific organisation (especially at molecular levels) is a special and macroscopically recognisable form of order. The very same issue of the overwhelming bulk of states that statistically grounds 2LOT and its applications in ever so many domains of work is the very one that grounds the issue that spontaneous arrival at FSCO/I rich configs is maximally implausible. And, the microstate picture dates back to Gibbs et al in C19. That is what 72 above addresses. To use a simple example, consider a string of slots to be filled with bits at random, of length 1000 bits. The very same statistics that indicates that arriving by chance flipping at states such as 1111 . . . 1 or 000 . . . 0 or 0101 . . . 01 is maximally implausible, and that instead the system will all but certainly gravitate to near 50-50 1/0 in no particular order, also makes it utterly unlikely that we would get the ASCII characters for the first 143 or so characters of this post by the same means. This is the same statistics that I illustrated by the example of micro-jet parts diffusing in a vat of liquid and forming or failing to form a flyable jet by chance forces, and more, much more. The string problem is directly related to issues of formation of proteins and D/RNA in a Darwin's pond or the like environment, once we go to realistic lengths such as 300 AA for the protein. KF kairosfocus
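The 1000-bit illustration above can be quantified directly. A minimal sketch (the choice of "within 50 bits of an even split" as the near-50-50 band is mine, for illustration only):

import math

n = 1000
p_specific = 1 / 2**n    # probability of hitting any ONE pre-specified 1000-bit string
p_near_even = sum(math.comb(n, k) for k in range(450, 551)) / 2**n  # 1s-count within 50 of 500

print(f"{p_specific:.2e}")    # ~9.3e-302
print(f"{p_near_even:.4f}")   # about 0.999 of all strings lie in the near-even band

The figures only illustrate the relative statistical weight of the two clusters of outcomes; they say nothing by themselves about any particular mechanism.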
Box: Just like the question of whether ID is “science” ultimately just depends on your definition of “science” and is irrelevant to its truthfulness, debating whether Sewell’s argument is based on what you consider the “Second Law” is an irrelevant distraction to its truthfulness. How many legs does a dog have if you call the tail a leg? Box: If you prefer, assume he is just using logic and probability, not the “Second Law.” The 2nd law of Sewell.
Sewell: It is {not} widely argued that the spectacular local decreases in something-something that occurred on Earth as a result of the origin and evolution of life and the development of human intelligence are not inconsistent with the second law of Sewell, because the Earth is an open system and something-something can decrease in an open system, provided the decrease is compensated by something-something increases outside the system. http://bio-complexity.org/ojs/index.php/main/article/download/BIO-C.2013.2/BIO-C.2013.2
Then Sewell goes on to state the 2nd law of thermodynamics from "Classical and Modern Physics". Then he quotes a bunch of physicists talking about the 2nd law of thermodynamics. Then he provides examples of the 2nd law of thermodynamics. It's impossible to read his paper as anything but a fallacious understanding of the 2nd law of thermodynamics. Perhaps you could link to the "logic and probability" argument without the something-something mumbo jumbo. Zachriel
CS3 on the annoying "but this is not really the 2nd law of thermodynamics" defense:
CS3: Just like the question of whether ID is “science” ultimately just depends on your definition of “science” and is irrelevant to its truthfulness, debating whether Sewell’s argument is based on what you consider the “Second Law” is an irrelevant distraction to its truthfulness. If you want to define the “Second Law” to exclude anything not explicitly about energy, and dismiss the countless textbooks and journal papers that use a broader definition as “confused”, fine. If you prefer, assume he is just using logic and probability, not the “Second Law.”
Box
Timaeus @102 Thanks for that link! harry
Box: You claim that the second law should be confined to “heat”.

Not at all. In classical terms, entropy is a measure of the unavailable energy per unit temperature in a system. In statistical terms, entropy is a measure of the number of possible microscopic states of a system in equilibrium (more precisely, proportional to its logarithm). However, it is important to note that these measures are equivalent.

Box: forum member CS3 shows you dead wrong by offering citations from several general university physics textbooks and several prominent physicists, which all indicate that Sewell’s alleged “conflation” is indeed universally recognized.

No. The examples just highlight how Sewell has conflated the analogy with the thing being described. While statistical thermodynamics is based on probability, not all probability is statistical thermodynamics. To take one of the examples provided, while a deck of cards that is shuffled will almost certainly not return to its sorted state by continued shuffling, the deck of cards has virtually* the same thermodynamic entropy whether shuffled or sorted. (*Shuffling may cause some of the bonds in the paper to break.) Sewell needs to abandon his use of the term 2nd law of thermodynamics with regard to his probability argument. He doesn't, presumably because he wants to obliquely garner the imprimatur of "law" for his claim. In any case, the manufacture and use of computers does not violate the 2nd law of thermodynamics. Do you disagree? Zachriel
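For what it's worth, the card-deck comparison can be put in rough numbers. A minimal sketch (the deck mass and paper heat capacity are illustrative assumptions, not measured values):

import math

k_B = 1.380649e-23               # Boltzmann constant, J/K

# If the ordering of the 52 cards were counted as thermodynamic entropy:
# S = k_B * ln(52!)
s_ordering = k_B * math.lgamma(53)        # lgamma(53) = ln(52!) ~ 156.4
print(f"{s_ordering:.2e} J/K")            # ~2.2e-21 J/K

# Thermal entropy change of the same deck warming by just 1 K
# (assumed: ~0.1 kg of paper with specific heat ~1340 J/(kg*K)):
mass, c_p, T = 0.1, 1340.0, 298.0
s_thermal = mass * c_p * math.log((T + 1) / T)
print(f"{s_thermal:.2f} J/K")             # ~0.45 J/K, about 10^20 times larger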
box
Here, in 2013, forum member CS3 shows you dead wrong by offering citations from several general university physics textbooks and several prominent physicists, which all indicate that Sewell’s alleged “conflation” is indeed universally recognized.
If that is the case what are the formula and units used to calculate the 'entropy' of these other applications? franklin
Zachriel #83, You claim that the second law should be confined to "heat".
Zachriel: How is Sewell’s conflation “universally recognized”? What “other applications”?
Here, in 2013, forum member CS3 shows you dead wrong by offering citations from several general university physics textbooks and several prominent physicists, which all indicate that Sewell's alleged "conflation" is indeed universally recognized. Box
Box: Are you, in effect, arguing that, just as ice crystals are what the actions of the four fundamental forces predict will form from a drop of water when entering the lower entropy state, maybe computers are what the four fundamental forces predict will form from a pile of rubble when entering a lower entropy state? Our claim, which is obvious, is that the manufacturing and use of computers does not violate the 2nd law of thermodynamics. Zachriel
Zachriel: Halloschlaf’s statement seemed to imply that order is not expected to occur because of the 2nd law of thermodynamics, when we can easily point to many natural, ordered phenomena, none of which violate the 2nd law.
Are you, in effect, arguing that, just as ice crystals are what the actions of the four fundamental forces predict will form from a drop of water when entering the lower entropy state, maybe computers are what the four fundamental forces predict will form from a scrap yard when entering a lower entropy state? Are you saying that human brains, space ships, and encyclopedias are what the four unintelligent fundamental forces predict will form from a barren planet when sunlight enters it? Box
kairosfocus: order is not functionally specific, typically aperiodic interactive, configuration and coupling sensitive organisation No, but that has nothing to do with the 2nd law of thermodynamics, the topic of the thread. Which has more entropy (thermodynamic 'disorder'), a human brain or a like mass of diamonds? Zachriel
Z, order is not functionally specific, typically aperiodic interactive, configuration and coupling sensitive organisation -- a distinction drawn by Orgel and Wicken in the 1970's. The issue is not that the existence of FSCO/I violates 2LOT, but that per the underlying phase or configs space view and relative statistical weights of clusters of microstates, FSCO/I is maximally implausible to emerge by spontaneous action of blind chance and mechanical necessity. The only empirically warranted source of the constructive work to create such is intelligently directed configuration, not the sort of forces responsible for phenomena like diffusion . . . a concept central to all of the discussions above. Again, I point you to 72 above. By responding to it, you will allow us to see the real issues at stake. KF kairosfocus
kairosfocus: if you don’t understand how say ice crystallises by having ordering forces traceable to its polar molecules being in a context of the random thermal agitation that lies behind temperature or how metals crystallise due to packing factors etc, We do understand how crystals form. So you agree that order is not unexpected to occur because of the 2nd law of thermodynamics, as we can easily point to many natural, ordered phenomena, none of which violate the 2nd law? Zachriel
CHartsil:
Complexity alone is not a metric of design and never has been.
This is true:
Functional complexity already has come about mindlessly on earth.
Evidence please. Joe
Z, if you don't understand how say ice crystallises by having ordering forces traceable to its polar molecules being in a context of the random thermal agitation that lies behind temperature or how metals crystallise due to packing factors etc, then I suggest a reading of TMLO, esp. ch 7. KF PS: My discussion here on in my always linked may help too. The box of idealised, classical marbles thought exercise will help develop an intuitive feel for wider implications of thermodynamic reasoning. It's not just about "heat." PPS: Your interlocutor comes across as knowing relevant statistical thermodynamics but is evidently not a native English speaker. kairosfocus
kairosfocus: we both know that circumstances on which crystals form have to do with the ordering mechanical forces overcoming sufficiently low disorder linked to especially thermal agitation. Have no idea what you are trying to say. Halloschlaf's statement seemed to imply that order is not expected to occur because of the 2nd law of thermodynamics, when we can easily point to many natural, ordered phenomena, none of which violate the 2nd law. Zachriel
halloschlaf:
I’m sorry, but in my opinion you haven’t understood Sewell either. There’s no discussion about the probability of the formation of an hydrogen molecule under the dominion of the 2nd Law.
Agreed, there isn't. That was my jumping-off point to talk about entropy as it applies to biological systems, which is the continual passage to lower-energy molecular states due largely to following gradients provided by fields of charge and filling orbitals. That is the principal barrier a living system must overcome - not the generation of an 'improbable' complex state out of nothing, but the coupling of rechargeable electron donors to replication (I know many here will see that as the same thing!).
The real problem is the emergence of highly improbable macrostates like kybernetic systems out of random microstates under the reign of the second law.
I don't think we can really know if the macrostates at the OoL were improbable or not. The configuration that works may be impossible or likely. Obviously, we haven't found it yet and maybe never will. But again, I see a confusion between the statistical-mechanical treatment of entropy and the concept of chemical free energy. What we look for at the OoL is essentially a replicator, potentially a single molecule. Appearing fully-formed, we see it as improbable (and may be right) but there is no rigorous way to test this.
The second law is correctly extended by Sewell and others to the whole microworld, because it’s a statistical law.
Well ... it can be treated as a statistical law. But actually, chemical free energy at the molecular level is much less 'statistical'. Get molecules close enough and they will react with significant inevitability. One of the distinctive characteristics of Life is the 'surgical' manner in which energy is applied. Things are not bathed in low-grade energy; 'energetic' molecules convey energy to specific locations (into which they fall by following thermodynamic gradients). Now, I wouldn't trivialise the problem of getting these molecules charged in the first place, and turning them into a minimally replicating system, but as soon as replication is ignited with an exponent > 1, the system has developed a unique buffer against 'informational' decay. Rather than having all one's eggs in a single molecular basket, multiple copies are made, some of which will degrade, but not all. I think this is vital.
Thermic energy which may transgress the border will be of no use in these scenarios[...]
Thermic energy plays little part in biology, and the same, I think, goes for the OoL. Indeed, I would relegate photons to a secondary role too. It's all about the chemical energy, and (I think) early life was entirely chemically sustained, with no input from the sun.
The same is true with random impulses which may transgress the border. Because random impulses lead exactly to the statistical consequences of the second law that means higher entropy
Agreed, but I don't appeal to those either. I can only speculate, but I think a potential first step may be the self-selection of short complementary nucleic acid strands. They bind preferentially (by following thermodynamic gradients, exactly the way PCR primers find their targets today). This allows greater persistence over free single strands in solution (by increasing thermodynamic robustness). Thermodynamics can certainly not be ignored in OoL considerations. Tapping into thermodynamic gradients is one of the defining characteristics of Life - not random impulses, but a continuous supply. That which degrades also sustains. Hangonasec
Z, pardon me but we both know that circumstances on which crystals form have to do with the ordering mechanical forces overcoming sufficiently low disorder linked to especially thermal agitation. To melt a crystalline solid, we HEAT it, as a rule. KF kairosfocus
Z, again, kindly look at the statistical foundations of the 2LOT. As I pointed out in 72 above (per fair comment, now conspicuously tip-toed around again and again . . . ), injection of raw uncoupled energy is utterly unlikely to perform constructive work at nano or macro scales, particularly work . . . forced, ordered motion per dW = F*dx . . . that produces FSCO/I rich entities; e.g. by step by step co-ordinated assembly processes. And that is at the heart of the OOL issue. Open systems dismissive arguments miss the pivotal point. KF kairosfocus
halloschlaf: The microstates of small particles which are governed by the second law will definitely show a trend towards higher entropy, which means that "the stuff will mix up".

Crystals are highly improbable arrangements of molecules, definitely not "mixed up". So you're saying crystals can't form naturally. Zachriel
harry: Thanks for your able comments at 91. I was relying too much on memory for what Sewell said. I might add that some of the best comments on Sewell's argument that I have seen were posted on this site by "cs3", on this thread: https://uncommondesc.wpengine.com/intelligent-design/where-is-the-difference-here/ Sewell liked the comments of cs3 and said they were clearer than his own original article. I would agree with that. Timaeus
@hangonasec I hope you can understand my English, because English is not my mother tongue. I'm sorry, but in my opinion you haven't understood Sewell either. There's no discussion about the probability of the formation of a hydrogen molecule under the dominion of the 2nd Law. The real problem is the emergence of highly improbable macrostates like cybernetic systems out of random microstates under the reign of the second law.

The second law is correctly extended by Sewell and others to the whole microworld, because it's a statistical law. This extension seems totally plausible and is of course testable. The microstates of small particles which are governed by the second law will definitely show a trend towards higher entropy, which means that "the stuff will mix up". That is absolutely analogous to the thermodynamic effects which can be observed in classical thermodynamic experiments.

Only if a capable force or a capable principle, one that can operate on the microparticles in a distinctive way so that improbable macrostates will be formed, transgresses the border of the open system will improbable macrostates occur. Thermal energy which may transgress the border will be of no use in these scenarios, because it provides no directed stimulus, which is needed to form a cybernetic system as an example of a highly improbable macrostate. The same is true of random impulses which may transgress the border, because random impulses lead exactly to the statistical consequences of the second law, that is, higher entropy. halloschlaf
Box @87,
“Apparent ordering” … sure. - On the free energy from the sun …
I don't think you grasped my point, and seem to have completely skipped over the part about chemotrophy. The sun has certainly had a part to play, but I invited readers to consider the bare bones of an 'apparent ordering' system, where 2 atoms become 'ordered' molecular hydrogen because they can shed energy by doing so. There is no 'local decrease' in entropy; the system simply spreads out by movement of energy from a local to a dissipated state. It's electrons that really drive this, for present purposes. The same applies to protein folding, and nucleic acid complementary binding for example. The 'ordering' process is adoption of a lower free energy state. It is ALWAYS coupled to a net dissipation of energy - ie, the 2nd Law is not violated. This 'ordered' molecular state is not locally improbable, but almost inevitable.

Of course, higher-order states - computers and suchlike - are individually less probable, but their existence depends nevertheless upon the energy flux through those lower ordered states. Of course, what Life does - and it is pretty nifty - is 'recharge' by various means so that permanent equilibrium is not achieved. But there is no sudden improbable, against-entropy step in this cycle. It needs fuel, in the form of electronegative molecules, or photons.

Photosynthesis is a substantial source of energy flux for sure. But more fundamental is the chemical tendency of electrons to follow gradients (and shed energy as they do so). It's really that which photosynthesis exploits as well. It elevates the energy of electrons to permit the molecule in which they sit to function as an electron donor. Electron donation is key.

Chemical entropy has next to nothing to do with probability, or statistical mechanics. Chemistry is pretty deterministic, as it goes, and I am no fan of the 'entropy = disorder' pedagogical approach. The quote from Sewell, meanwhile ... well, how can one have a discussion with a quote? There is no 'compensation'. One is not 'borrowing probability', like a quantum-tunnelling electron or a 'local improbability drive'. Computers are made by people, some time after a decent (and not free) lunch. They may have been a priori improbable on a bare earth, but they violate no physical laws. Hangonasec
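Hangonasec's hydrogen example can be put in numbers with standard-table values (H2 bond energy ~436 kJ/mol, standard molar entropies of H(g) and H2(g); treat the figures as approximate). A minimal sketch of the bookkeeping:

T = 298.15                    # K
dH = -436e3                   # J/mol of heat released when 2 H -> H2
S_H, S_H2 = 114.7, 130.7      # J/(mol*K), standard molar entropies of H(g) and H2(g)

dS_system = S_H2 - 2 * S_H    # the hydrogen itself becomes more "ordered"
dS_surroundings = -dH / T     # heat shed into the surroundings raises their entropy
dS_universe = dS_system + dS_surroundings

print(round(dS_system, 1))        # ~ -98.7 J/(mol*K)
print(round(dS_surroundings, 1))  # ~ +1462.4 J/(mol*K)
print(round(dS_universe, 1))      # net increase, as the 2nd law requires

The 'apparent ordering' of the molecule is paid for many times over by the heat dumped into the surroundings.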
CHartsil @98
Complexity alone is not a metric of design and never has been.
So what? Intricacy and complexity are common. Functional complexity is another matter.
Functional complexity already has come about mindlessly on earth.
Yes. The nanotechnology of life and functionally complex technology created by humanity are present on Earth. Life, like our own technology, is the result of intelligent agency. What is your example of mindlessly arrived at significant functional complexity (that isn't life itself)? harry
"The point is that there is nothing about energy flow into an open system that makes the emergence of functional complexity significantly less unlikely. It may make it more ordered if there is a mechanism to constructively harness the energy, but significant functional complexity doesn’t seem to ever come about mindlessly." Complexity alone is not a metric of design and never has been. Functional complexity already has come about mindlessly on earth. CHartsil
harry: The point is that there is nothing about energy flow into an open system that makes the emergence of functional complexity less unlikely. There's nothing in the 2nd law of thermodynamics that precludes it either. Zachriel
CHartsil @93 Let me put it the way McIntosh cited by Sewell put it, cited above:
However, all these authors are making the same assumption — viz. that all one needs is sufficient energy flow into a [non-isolated] system and this will be the means of increasing the probability of life developing in complexity and new machinery evolving.
The point is that there is nothing about energy flow into an open system that makes the emergence of functional complexity significantly less unlikely. It may make it more ordered if there is a mechanism to constructively harness the energy, but significant functional complexity doesn't seem to ever come about mindlessly. harry
Harry, You must forgive Zachriel who holds on to a very narrow and outdated understanding of the second law. Experience informs us that it's no use trying to change his mind. Box
harry: The emergence of gemstones does not in any way establish that sunshine made the mindless, accidental emergence of life on Earth any less unlikely. However, the 2nd law of thermodynamics doesn't preclude it. harry: if such an entropy decreasing mechanism is present, all it does is just that: creates a more ordered system. Order in itself does not force the emergence of functional complexity. However, the 2nd law of thermodynamics doesn't preclude it. Zachriel
Zachriel @92
Actually, increases in entropic order (i.e. local decreases in entropy) are very common in nature, e.g. the formation of gemstones.
And as I said:
if such an entropy decreasing mechanism is present, all it does is just that: creates a more ordered system. Order in itself does not force the emergence of functional complexity.
The emergence of gemstones is an example of increasing order, but not functional complexity. Yes, that commonly happens. harry
"The entrance of energy makes a decrease in entropy possible, provided there is a mechanism to constructively harness it, but it in no way makes that decrease probable. And if such an entropy decreasing mechanism is present, all it does is just that: creates a more ordered system." You seem to have a fundamental misunderstanding of entropy. Entropy in physics is just a measure of energy in a system unable to do physical work. If you think the sun adding energy to the earth doesn't help, try living on Pluto CHartsil
harry: The entrance of energy makes a decrease in entropy possible, provided there is a mechanism to constructively harness it, but it in no way makes that decrease probable. Actually, increases in entropic order (i.e. local decreases in entropy) are very common in nature, e.g. the formation of gemstones. Zachriel
Timaeus @65:
The way he wrote it up, he ended up having to defend two very large theses: (1) that the second law is actually merely an instance of a more general law; (2) that this more general law forbids evolution. It's always better if you can restrict yourself to one thesis per article; it's not only easier for the reader to tell what you are talking about -- it's easier to defend one radical thesis than two.
I don't think Sewell had to establish "that the second law is actually merely an instance of a more general law." It is widely regarded as just that. For example, some thoughts of Stephen Hawking on the 2LT:
It is a matter of common experience, that things get more disordered and chaotic with time. This observation can be elevated to the status of a law, the so-called Second Law of Thermodynamics. This says that the total amount of disorder, or entropy, in the universe, always increases with time. However, the Law refers only to the total amount of disorder. The order in one body can increase, provided that the amount of disorder in its surroundings increases by a greater amount. -- http://www.hawking.org.uk/life-in-the-universe.html
Or as Sewell himself points out:
Consider, for example, three common statements of the second law from the textbook Classical and Modern Physics [2: p. 618]:
1. In an isolated system, thermal entropy cannot decrease.
2. In an isolated system, the direction of spontaneous change is from order to disorder.
3. In an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability.
-- Sewell G (2013) Entropy and evolution. BIO-Complexity 2013 (2): 1-5.
Yet it is true that when it is pointed out that the "more general law" presents a problem for the theory of mindless, accidental evolution, that is typically countered with arguments relying on the narrow definition of the 2LT. Sewell later points this out:
But not everyone finds this line of argument convincing. Andy McIntosh offers this critique of the Styer [5] and Bunn [6] papers in a recent article [8]:
Both Styer and Bunn calculate by slightly different routes a statistical upper bound on the total entropy reduction necessary to 'achieve' life on earth. This is then compared to the total entropy received by the Earth for a given period of time. However, all these authors are making the same assumption -- viz. that all one needs is sufficient energy flow into a [non-isolated] system and this will be the means of increasing the probability of life developing in complexity and new machinery evolving. But as stated earlier this begs the question of how a local system can possibly reduce the entropy without existing machinery to do this.
Indeed, the compensation argument is predicated on the idea that there is no content to the second law apart from a prohibition of net entropy decreases in isolated systems, and moreover that the universal currency for entropy is thermal entropy.
Indeed, "not everyone finds this line of reasoning convincing." It seems obvious that energy entering an open system is going to do nothing but increase entropy in that system unless it is constructively harnessed by some mechanism. The entrance of energy makes a decrease in entropy possible, provided there is a mechanism to constructively harness it, but it in no way makes that decrease probable. And if such an entropy decreasing mechanism is present, all it does is just that: creates a more ordered system. Order in itself does not force the emergence of functional complexity.

Functional complexity -- technology -- only comes about through the activity of an intelligent agent. Technology is commonly understood to be the application of scientific knowledge for practical purposes. The digital-information-based nanotechnology of life is as obviously that as it is that our own technology is that.

Most of us, most of the time, find it easy to immediately distinguish between technology and natural phenomena that were accidentally brought about by the mindless forces of nature. One of the reasons we can easily make that distinction, whether we are conscious of it or not, is that we have a sense of what is inevitable and what isn't. Phenomena like television sets and electric can-openers are not inevitable; one wouldn't say that the laws of physics applied to a given material environment will inevitably produce a television set. One might say that about a phenomenon like a star or a planet, but not about a laptop PC. Technology is not inevitable because it is unlikely.

The nanotechnology of life is far, far more unlikely to come about mindlessly and accidentally than is a laptop PC. It is a tribute to the power of indoctrination that so many would scoff at the notion that a laptop PC might come about accidentally, but insist that the nanotechnology of life, the functional complexity of which is light years beyond our own technology, came about mindlessly and accidentally. harry
Box: It’s perfectly fine for you to discuss heat distribution. We're discussing the topic of the original post. Spaceships do not violate the 2nd law of thermodynamics. Zachriel
Zachriel, It's perfectly fine for you to discuss heat distribution. However I'm not interested, so don't direct your posts to me. Box
Box: Please retract yourself from this thread and read up. This discussion is not about heat distribution. Please read up. This discussion has the title of "Are Darwinian claims for evolution consistent with the 2nd law of thermodynamics?" Box (quoting): Thus unless we are willing to argue that the influx of solar energy into the Earth makes the appearance of spaceships, computers and the Internet not extremely improbable, we have to conclude that at least the basic principle behind the second law has in fact been violated here. Those processes do not violate the 2nd law of thermodynamics. Nothing humans have done or can do violates the 2nd law of thermodynamics. That's rather the whole point. Zachriel
Hangonasec, Humans, cars, high-speed computers, libraries full of science texts and encyclopedias, TV sets, airplanes and spaceships.
Hangonasec: Which simply means that all apparent ordering is accompanied by the shedding of free energy.
"Apparent ordering" ... sure. - On the free energy from the sun ...
Granville Sewell: Thus unless we are willing to argue that the influx of solar energy into the Earth makes the appearance of spaceships, computers and the Internet not extremely improbable, we have to conclude that at least the basic principle behind the second law has in fact been violated here.
... which boils down to the "compensation" argument:
Granville Sewell: It is widely argued that the spectacular local decreases in entropy that occurred on Earth as a result of the origin and evolution of life and the development of human intelligence are not inconsistent with the second law of thermodynamics, because the Earth is an open system and entropy can decrease in an open system, provided the decrease is compensated by entropy increases outside the system. I refer to this as the compensation argument, and I argue that it is without logical merit, amounting to little more than an attempt to avoid the extraordinary probabilistic difficulties posed by the assertion that life has originated and evolved by spontaneous processes. To claim that what has happened on Earth does not violate the fundamental natural principle behind the second law, one must instead make a more direct and difficult argument. (...) Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of "compensating" events elsewhere. According to this reasoning, the second law does not prevent scrap metal from reorganizing itself into a computer in one room, as long as two computers in the next room are rusting into scrap metal -- and the door is open. (Or the thermal entropy in the next room is increasing, though I am not sure how fast it has to increase to compensate for computer construction!)
Box
H, chemical issues and Gibbs Free energy are significant but I would add points in 72 above, there is more to thermodynamics once informational issues come to bear. Gotta go. KF kairosfocus
a) Physicists really, really can’t explain what goes on in biology. Neither their definition of entropy, nor their definition of information (Shannon, etc) work. Rather than admit that they don’t know what is going on, they simply extrapolate what they do know (ideal gasses) to biology and make pronouncements.
I'd say biology owes more to concepts of chemical thermodynamics, which is of course rooted in physics, but nothing to do with ideal gases. Free energy can be measured, and no movement is found against a free energy gradient, granted that there is frequent coupling to give a net 'downhill' slope. Simply: systems that can shed energy do so. In doing so, entropy increases. But entropy increase can be associated with an increase in apparent 'order', and this is the source of frequent confusion.

Is molecular hydrogen more 'ordered' than atomic hydrogen? Kind of, but in strict entropic terms, the energy has to be accounted for as well. Two atoms of hydrogen locally tethered experience force. Allow that system to equilibrate, and energy is released, unrecoverably. The system, with its quanta of 'lost' energy, is less ordered, even though the part we detect is not.

Biology does no more. Electrons pass down gradients of serial electronegativity. Sunlight is but one mechanism of elevating electrons to enable them to descend such a chain. In the atomic species found on earth, there are some already furnished with more, and some with less, electronegativity. Those with less can act as electron donors, those with more as electron acceptors. There are entire ecosystems which receive not one joule of energy from the sun. Of course, like any entropic system, they are slowly running down - moving towards equilibrium. But none of it violates the 2nd Law of thermodynamics. Which simply means that all apparent ordering is accompanied by the shedding of free energy. Hangonasec
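The 'electrons falling down gradients' picture can be quantified with the standard relation dG = -nF*dE. A minimal sketch for the textbook hydrogen-oxygen couple (standard-table values; my illustration, not Hangonasec's own example):

F = 96485.0        # Faraday constant, C/mol
n = 2              # electrons transferred per H2 oxidised
dE = 1.23          # V, standard cell potential for H2 + 1/2 O2 -> H2O

dG = -n * F * dE   # Gibbs free energy change per mole of H2, in J
print(round(dG / 1000))   # ~ -237 kJ/mol shed as the electrons fall to oxygen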
Zach, Please retract yourself from this thread and read up. This discussion is not about heat distribution. Box
Box: However it’s totally irrelevant, because who said they are? You did.
Zachriel: Drawing ten straight flushes in a row from a fair, well-shuffled deck is not plausible Box: And why is not plausible? Exactly because of the second law
Sewell: While the first formulations were all about heat, it is now universally recognized that the second law of thermodynamics can be used, in a quantifiable way, in many other applications. Probability theory dates to the 17th century. How is Sewell's conflation "universally recognized"? What "other applications"? Zachriel
Zach: The odds of a royal flush are not an example of thermodynamics.
True. However it's totally irrelevant, because who said they are? Are you playing stupid with me now, or what?
Sewell: While the first formulations were all about heat, it is now universally recognized that the second law of thermodynamics can be used, in a quantifiable way, in many other applications.
Box
Box: Indeed. And why is it not plausible? Exactly because of the second law, which is all about probability. Um, no. The odds of a royal flush are not an example of thermodynamics. The caloric cost of shuffling and dealing and playing cards is paid for by the alcohol. Zachriel
Z, kindly cf 72 above. KF kairosfocus
Zachriel: That is incorrect.
Indeed. "Anything is possible in an open system", often proclaimed by your ilk, is totally nonsensical. Thank you for the admission.
Zachriel: Drawing ten straight flushes in a row from a fair, well-shuffled deck is not plausible (...)
Indeed. And why is it not plausible? Exactly because of the second law, which is all about probability. Box
Timaeus: But that, in itself, doesn’t make Sewell’s argument wrong. It just means that his choice of words creates some confusion about what exactly he is arguing. From paper Box cited: "It is widely argued that the spectacular local decreases in entropy that occurred on Earth as a result of the origin and evolution of life and the development of human intelligence are not inconsistent with the second law of thermodynamics, because the Earth is an open system and entropy can decrease in an open system, provided the decrease is compensated by entropy increases outside the system. I refer to this as the compensation argument, and I argue that it is without logical merit" http://bio-complexity.org/ojs/index.php/main/article/view/70 There is no ambiguity in that statement. Sewell is wrong about a fundamental finding in physics. And that's just the very first lines! Box: This boils down to the “compensation argument” aka ‘anything is possible in an open system’. That is incorrect. There are all sorts of physical restrictions on what is possible and what is plausible. The 2nd law merely restricts in one manner what is possible. Granville Sewell: Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of “compensating” events elsewhere. Just because something is consistent with the 2nd law of thermodynamics doesn't mean it is possible or plausible. Drawing ten straight flushes in a row from a fair, well-shuffled deck is not plausible, but is consistent with the 2nd law of thermodynamics. Zachriel
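Zachriel's card example is easy to put numbers on. A minimal sketch (Python, standard 5-card poker combinatorics; nothing here is specific to either side of the dispute):

from math import comb

hands          = comb(52, 5)   # 2,598,960 possible 5-card hands
straight_flush = 40            # 10 ranks x 4 suits (royal flushes included)

p_one = straight_flush / hands
p_ten = p_one ** 10            # ten independent deals from well-shuffled decks

print(f"P(one straight flush) = {p_one:.2e}")   # about 1.5e-05
print(f"P(ten in a row)       = {p_ten:.2e}")   # about 7e-49

The arithmetic itself is uncontroversial; whether that kind of improbability is an instance of the second law, or merely an analogy to it, is precisely what Box and Zachriel are disputing in the comments above and below.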
Gordon Davisson: There’s a direct relationship between entropy and probability only in the case of an isolated system fluctuating around thermodynamic equilibrium. In open systems fluctuating around equilibrium there’s a more complicated relationship, and for non-equilibrium systems (e.g. pretty much everything relating to life) the relationship breaks down entirely.
This boils down to the “compensation argument” aka 'anything is possible in an open system'. Peter Urone: "it is always possible for the entropy of one part of the universe to decrease, provided the total change in entropy of the universe increases."
Granville Sewell: Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of "compensating" events elsewhere. According to this reasoning, the second law does not prevent scrap metal from reorganizing itself into a computer in one room, as long as two computers in the next room are rusting into scrap metal--and the door is open.
Box
F/N: Thaxton et al at the end of Ch 7, TMLO, as they were about to bridge to the discussion on spontaneous origin of proteins and D/RNA (an onward discussion that has of course been further developed in the past 30 years):
While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The "evolution" from biomonomers to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme, namely the formation of protein and DNA from their precursors. It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today.11 Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered . . .
Again, the focus on functionally specific interactive organisation and linked information in the context of high complexity should be plain. KF PS: I could not but notice how Rational Wiki popped up in the Google search and tried to dismiss a technical, scientific work as a "religious" book. This speaks volumes and belies the "rational" part of the title. TMLO is one of the foundational technical works behind design theory, period. kairosfocus
Me_Think #74, You are right. Every 12 years or so I make a tiny mistake and this must be it. I should have linked to "Entropy and evolution" - Granville Sewell (2013). The quote I provided in post #70 is on page 4 of the paper. Box
Box @ 70 You probably linked the wrong paper. What you quoted is not there in the paper Me_Think
PS: The underlying roots of my emphasis on config spaces and the implication that COMPLEX, INTERACTIVE AND SPECIFIC FUNCTION SHARPLY CONSTRAINS THE POSSIBLE CONFIGS TO ISLANDS OF FUNCTION (LEADING TO NEEDLE IN HAYSTACK SEARCH CHALLENGES) should be clear. kairosfocus
F/N: As I have pointed out above several times, Sewell is looking at the phase and/or configuration space, informational view of thermodynamics. I link and clip my always linked note . . . at minimum so the onlooker will not be carried away by dismissive remarks above and elsewhere: _______________ http://www.angelfire.com/pro/kairosfocus/resources/Info_design_and_science.htm#shnn_info >> . . . we may average the information per symbol in [a] communication system thusly (giving in terms of -H to make the additive relationships clearer): - H = p1 log p1 + p2 log p2 + . . . + pn log pn or, H = - SUM [pi log pi] . . . Eqn 5 H, the average information per symbol transmitted [usually, measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more detail (pp. 3 - 6, 7, 36; cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Samura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously "plausible" primordial "soups." In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale. By many orders of magnitude, we don't get to even one molecule each of the required polymers per planet, much less bringing them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics . . . >> ________________ So, in fact, there is a legitimate physical view on thermodynamics that brings out the force of Sewell's point. Which, we may clip (yet again, one of the all too typical Darwinist debate tactics is to ignore or "forget" clarifications or corrective information so that the same strawman caricatures keep on coming up):
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative [--> a clear context . . . ]. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur. [--> again, a clear context, the statistical weights of clusters of possible microstates define probabilities to all relevant intents] The discovery that life on Earth developed through evolutionary "steps," coupled with the observation that mutations and natural selection -- like other natural forces -- can cause (minor) change, is widely accepted in the scientific world as proof that natural selection -- alone among all natural forces -- can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic. In a recent Mathematical Intelligencer article ["A Mathematician's View of Evolution," The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way.1 . . . . What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in "Can ANYTHING Happen in an Open System?","order [--> including organisation, and from context, FSCO/I] can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door.... If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth's atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here." Evolution is a movie running backward, that is what makes it special. THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn't, that atoms would rearrange themselves into spaceships and computers and TV sets . . .
The point is a serious one and the first level it must be addressed at is the claimed spontaneous organisation and origin of cell based life in Darwin's warm pond or the like. KF kairosfocus
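The Shannon/Gibbs formulas excerpted in the comment above are simple to compute. Here is a minimal sketch (Python; the four-outcome distribution is made up purely for illustration) of H = - SUM pi log pi, together with the Boltzmann-constant conversion that the Jaynes/Robertson view uses to relate it to thermodynamic entropy:

import math

k_B = 1.380649e-23  # J/K, Boltzmann constant

def shannon_entropy_bits(p):
    # H = -sum p_i log2 p_i, in bits per symbol (terms with p_i = 0 contribute 0)
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    # S = -k_B * sum p_i ln p_i, the statistical-mechanics form of the same sum
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]  # illustrative distribution (an assumption)

print(f"H = {shannon_entropy_bits(p):.3f} bits/symbol")  # 1.750
print(f"S = {gibbs_entropy(p):.3e} J/K")                 # about 1.67e-23

As the excerpt itself notes, the two quantities share the same mathematical form and differ only by the factor k_B ln 2; whether that formal link carries the physical weight that Sewell and kairosfocus place on it is the point in dispute in this thread.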
Timaeus: Gordon Davisson has no problem distinguishing between the original notion and Sewell’s “extended” notion.
That is probably because your idea that Sewell has an innovative "extended notion" of the second law is nonsense. The existence of different kinds of entropy finds universal acceptance these days.
Sewell: While the first formulations were all about heat, it is now universally recognized that the second law of thermodynamics can be used, in a quantifiable way, in many other applications.
Box
Gordon Davisson: Sewell claims that the second law applies separately to each different kind of entropy, but this is not true in general.
Sewell does NOT claim that AT ALL. In fact he explicitly states that, in general, even in solids different kinds of entropy can affect each other. From the paper (p.4):
G. Sewell: He [Bob Lloyd] wrote that the idea that my X-entropies are always independent of each other was “central to all of the versions of his argument.” Actually, I never claimed that: (...) But even in solids, the different X-entropies can affect each other under more general assumptions. Simple definitions of entropy are only useful in simple contexts. [My Emphasis]
Box
Timaeus, What Sewell has presented is a hunch. That's fine for what it is, but without some way to test his idea, it's no better than string theory. rhampton7
Timaeus at 55 Thanks for expressing something I have been thinking for a while. I feel like there is some principle that science just hasn't fully discovered yet. Here is an analogy: In the middle ages, scientists could tell you quite a bit about how light worked. Everyone could see the difference between colors and brightness/shadow. But no one could tell you what "red" was or what "color" is. But later it was discovered that colors are light waves of different frequencies. People knew that light existed, but they couldn't prove what light was, scientifically, prior to the 19th century. Here I think there is an intuitive feel that the "order" or "organizing principle" found in cells, computers, pocket watches, eyeballs and brains does not come about without forethought and abstract reasoning. This "order" is not to be confused with the term "order" in the 2nd law, but it is "something" that we can detect. CSI, irreducible complexity, Sewell's argument, they all "get at" this thing. Maybe we should just call it "the Word." Collin
Timaeus: But that, in itself, doesn’t make Sewell’s argument wrong. It just means that his choice of words creates some confusion about what exactly he is arguing. How many legs does a dog have if you call the tail a leg? Zachriel
And wanting them to get next week's winning lotto numbers falls under psychic experiences, not under life after death or veridical NDEs. They do have people that have brought back new information as far as science is concerned, but again that falls under another category. And mos wallstreeter43
Zachriel: "Then it’s not the 2nd law of thermodynamics." Nolo contendere. But that, in itself, doesn't make Sewell's argument wrong. It just means that his choice of words creates some confusion about what exactly he is arguing. But once one gets by the title, and reads his explanation why the 2nd law needs to be "extended" outside its original sphere, one can understand what he is driving at. Gordon Davisson has no problem distinguishing between the original notion and Sewell's "extended" notion. And Gordon then does the intellectually appropriate thing, i.e., to grant for the moment Sewell's special use of terms, but show that even the extended principle which Sewell employs -- whether it's called the 2nd law or something else -- does not contradict evolution. I do think that Sewell has some important things to say about the creation of order; however, as a point of literary strategy, I would not have chosen to use the term "second law of thermodynamics" in my argument. To write as if you are reviving or restating the second law argument calls up past arguments which have been soundly defeated, and then puts you in the position of having to explain "Well, my argument isn't based on thermodynamics as the term is most commonly understood..." He would have been better to stay away from that terminology and invent his own term for the type of order and disorder that he was talking about. And if he felt it absolutely necessary to point out some relationship between his new principle and the original second law, he could have done that in a footnote at the end of the piece, or in a speculative "aftermath" section in his conclusion, leaving the notion of thermodynamics out of the title and the main argument of the article. The way he wrote it up, he ended up having to defend two very large theses: (1) that the second law is actually merely an instance of a more general law; (2) that this more general law forbids evolution. It's always better if you can restrict yourself to one thesis per article; it's not only easier for the reader to tell what you are talking about -- it's easier to defend one radical thesis than two. Timaeus
SA @42
ID opponent: What do you mean by the terms “intelligent,” “design,” “chance,” “complex,” “functional,” “specified” and “information.” These terms are so vague as to render your argument meaningless. One can be certain that DDD is being employed when a person involved in a debate displays a convenient lapse of understanding of even the most common terms. In extreme cases ID opponents have even claimed that a term they themselves injected into the debate has no clear meaning.
Cuts both ways, I must say. Hangonasec
Joe
Me: Fine, GAs don’t model unguided evolution AND they demonstrate that unguided evolution leads to genetic entropy. Joe: Wrong. They CAN be used to demonstrate unguided evolution leads to genetic entropy.
Do you understand what the word 'wrong' means, Joe? Examine my sentence again, and yours. Hangonasec
""I want them to have knowledge they couldn’t have had otherwise. That seems pretty reasonable to me"" The 57 year old social worker did have knowledge that he couldn't have had before. This is the whole point of the study since strict controls were taken. The patient had no way to access the info of his veridical nde that he eerie maces . Ur just in denial my friend wallstreeter43
Box: Here is a thought experiment for you: try to imagine a more spectacular violation than what has happened on our planet. Those processes do not violate the 2nd law of thermodynamics. Nothing humans have done or can do violates the 2nd law of thermodynamics. Zachriel
So, how does the spontaneous rearrangement of matter on a rocky, barren, planet into human brains and spaceships and jet airplanes and nuclear power plants and libraries full of science texts and novels, and super computers running partial differential equation solving software, represent a less obvious or less spectacular violation of the second law -- or at least of the fundamental natural principle behind this law -- than tornados turning rubble into houses and cars? Here is a thought experiment for you: try to imagine a more spectacular violation than what has happened on our planet.
[Sewell] Box
@ Gordon Davisson Sorry, but your view on the problem of decreasing entropy in open systems is not very persuasive. In fact Sewell has already considered your little but defunct thought experiment, because your experiment – your jar – is actually open to an ordering principle with respect to weight: gravity. Put your jar into a gravity-free environment and nothing will happen. That’s exactly the principle which Sewell is discussing. Something has to cross the boundary of the open system to create an X-entropy-lowering effect. Next thought experiment, please. halloschlaf
Timaeus: Sewell is arguing not from the standard “energy” notion of the second law, but from what he believes to be a deeper principle that underlies the second law. Then it's not the 2nd law of thermodynamics. Zachriel
T, the underlying issue is the origin of FSCO/I-rich energy conversion devices that carry forward the onward work of coupling energy sources to create entities of interest. The spontaneous origin of such devices on a raw inflow of energy is maximally implausible. And the reasoning, on needle-in-haystack search challenges for clusters of special configs in the abstract space of possibilities, is close to that which grounds the statistical form of the Second Law, as has been drawn out since Thaxton et al in TMLO in the early-mid 1980's. I outline and point onwards in 9 above. KF kairosfocus
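kairosfocus's "needle in a haystack" framing is, at bottom, a counting claim. A minimal sketch of the kind of arithmetic being invoked (Python; the 500-bit space and the round figures for atoms, interaction rates, and available time are the commonly quoted illustrative assumptions, not measurements):

from math import log10

n_bits  = 500                     # illustrative configuration-space size (assumption)
configs = 2 ** n_bits             # number of distinct 500-bit configurations

atoms   = 1e80                    # rough count of atoms in the observable universe (assumption)
rate    = 1e14                    # states sampled per atom per second, fast-chemistry scale (assumption)
seconds = 1e17                    # rough age of the universe in seconds (assumption)
samples = atoms * rate * seconds  # a generous upper bound on blind trials

print(f"configurations        : about 10^{log10(configs):.0f}")
print(f"possible blind trials : about 10^{log10(samples):.0f}")
print(f"fraction sampleable   : about 10^{log10(samples / configs):.0f}")

Critics, of course, dispute whether biological change is a blind draw from such a space at all; the sketch only shows the counting that the "islands of function" argument appeals to.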
Box: Simple logic. So you're claiming development requires an injection of information, something not in the original cells. If so, whence the energy? You do realize that we have quite a bit of knowledge about how development occurs? The developmental pattern in plants can be manipulated with growth regulators (e.g. auxin or gibberellin), and these growth regulators are naturally modulated by environmental conditions. For instance, phytochrome in the presence of light causes expression of meristem genes. Hence, cell differentiation requires nothing that is not already present in the genome other than exposure to a suitable environment. Box: Entangled particles are influenced by some force which is at the moment entirely unmeasurable There's no work involved, and it doesn't violate the 2nd law of thermodynamics. Zachriel
I agree with those who say that the second law of thermodynamics -- as that law has traditionally been presented in science classes, i.e., as a rule governing the flow of heat or energy generally -- does not in itself contradict the idea of evolution. The earth is not a closed system. The sun inputs immense amounts of energy into the system. However, as Gordon Davisson pointed out, Sewell's argument is more subtle than that. Sewell is arguing not from the standard "energy" notion of the second law, but from what he believes to be a deeper principle that underlies the second law. One might have argued in 1860 that the principles of the USA, interpreted narrowly, allowed for slavery, since the States had constitutional rights to run their own affairs. But against that, one might argue that lying beneath that "freedom" of the States from authoritarian federal tyranny is a deeper principle, i.e., that no government has the right to take away the freedom of a human being. Thus, the very "freedom" for which the southern States fought was grounded in a deeper principle -- and slavery was incompatible with that deeper principle, even if it did not appear to conflict with the principle of "States rights." So it *might* be the case that not the second law of thermodynamics, but some deeper and broader law of which the Second Law is only a limited expression, is incompatible with Darwinian evolution. Gordon Davisson recognizes this. Not everyone who has bashed Sewell has read Sewell as carefully as Gordon has. Timaeus
Zach,
Zach: Energy is required. What do you mean that ‘information’ must be provided? How is ‘information’ being injected into a daisy as it grows?
When a daisy consists of, let's say, only 16 undifferentiated identical cells, then these cells need information about what to do next. This information cannot be present in the cells because they are all the same, and each individual cell lacks the overview and authority to direct operations. So there has to be another source of information which directs the growing process. Simple logic.
Zach: That suggests you aren’t making a scientific claim.
Entangled particles are influenced by some force which is currently entirely unmeasurable; it seems to be beyond space and time. Is quantum physics making "not a scientific claim" when it states that such a force nevertheless exists? Box
harry: There appear to be many bodies in our solar system that get plenty of sunshine and remain lifeless. It must not be as simple as that. The 2nd law of thermodynamics is a limitative law. It doesn't explain life, however, it doesn't act as a barrier either. Zachriel
Box: Life persists only as long as information is provided for work to be done. Energy is required. What do you mean that 'information' must be provided? How is 'information' being injected into a daisy as it grows? Box: If the blunt instruments available to science cannot measure it, so much the worse for the present state of science. That suggests you aren't making a scientific claim. Zachriel
Zach: Life persists only as long as energy is provided for work to be done. Cut off the energy, and life winds down like all processes.
Life persists only as long as information is provided for work to be done. Cut off the information, and life winds down like all processes. As clear as day there is - in each organism - a unifying power that keeps numerous parts in functional submission for exactly a lifetime. If the blunt instruments available to science cannot measure it, so much the worse for the present state of science. Box
If the natural Universe is a closed system then sustaining decreasing entropy indefinitely is impossible. In spite of any local, temporary decreases in entropy that might occur, matter in the Universe, on the whole, inexorably disintegrates into a more likely, disordered state. Each instance of a local, temporary decrease in entropy was only allowed by a greater increase in entropy overall. Furthermore, without some mechanism to harness available energy constructively, when it is expended it only further increases entropy. Absent such a mechanism, a release of energy tends to disintegrate an assemblage of matter, not further integrate and order the given assemblage. This is why tornados destroy buildings rather than remodel them. The reason I point this out is that mentioning that the second law of thermodynamics presents a problem for any theory of the mindless, accidental development of life is often immediately countered with a flippant, dismissive assertion such as: "Of course life in its complexity would run counter to the second law of thermodynamics if it were operating in a closed system. But it's not, it's in an open system in which the sun's energy runs down as life's complexity runs 'up-hill'." There appear to be many bodies in our solar system that get plenty of sunshine and remain lifeless. It must not be as simple as that. A few more things are required for life to get started, one of which is a mechanism provided by the material environment to constructively harness available energy. Assuming for the moment that it is even possible for digital-information-based nanotechnology to come about mindlessly -- a patently absurd notion -- the question then becomes, how did the very unlikely environment that would be required for that to happen itself come about mindlessly and accidentally? Something far more likely to get slopped together accidentally than the digital-information-based nanotechnology of life would be something consisting of really crude technology in comparison to that of life, like robotic equipment. It would be far easier to explain how robotic equipment might come about accidentally than it would be to plausibly explain how life came about mindlessly and accidentally. The requirement that the accidentally arrived at robotic equipment also be self-replicating could even be ignored. Maybe atheistic science should figure out how robotic equipment might come about accidentally first, and from the experience gained from that exercise, tackle the problem the mindless, accidental origin of life presents it with. In the meantime, true science that has remained relentlessly objective, not allowing the religious/philosophical implications of its discoveries to disturb its cold rationality, will follow the evidence wherever it leads, even if that path continues to provide overwhelming evidence that there must be a non-material reality -- an intellect -- that preexisted and was responsible for the Universe and the life within it. harry
"Chartsil your response on Nde’s is incredibly ridiculous and shows that you are disingenuous about dealing with the evidence ." I want them to have knowledge they couldn't have had otherwise. That seems pretty reasonable to me "This is me ::this is compelling evidence that conscious awareness can happen without a functioning brain." Let me guess, there were some people in masks dressed in white, a linoleum floor, a tray of surgical instruments. I must have had an NDE to have all that persuasive knowledge. "Most normal people can see the disconnect from reality and rational thinking , but you probably don’t and this is why we shouldn’t hold it against you, it’s just the way you think." >Coming from the guy that thinks someone having knowledge of the goings on of an OR is evidence of the supernatural. Psychological projection is a psychological defense mechanism where a person subconsciously denies his or her own attributes, thoughts, and emotions, which are then ascribed to the outside world, usually to other people. Thus, projection involves imagining or projecting the belief that others originate those feelings. CHartsil
Box: “Override” is a better choice of words. Life is a good example, it overrides the second law’s overall effect to drive everything in the universe towards disorder. Changing the word doesn't change the heat law. Life persists only as long as energy is provided for work to be done. Cut off the energy, and life winds down like all processes. Zachriel
F/N: Given the relevant scope of configuration spaces and the known randomising effects of diffusion and the like, raw energy is not a good explanation of functionally specific complex organisation and associated information. Information in the relevant sense is best understood as constrained configurations of contingent elements where the configuration can be used to convey or store data, control messages and the like. Note, the info is tied to the configuration and to the onward linked rules that give meaning. KF kairosfocus
Chartsil your response on NDEs is incredibly ridiculous and shows that you are disingenuous about dealing with the evidence. What the man experienced can't be explained away as coincidence or something coming from the brain. This is your way of moving the goalposts back and shows me that your atheistic beliefs are emotional and not truly intellectual, but then again we both knew this ;) I think it's high time that you think about joining one of the new atheist mega churches that have been opening up. This is me ::this is compelling evidence that conscious awareness can happen without a functioning brain. Chartsil:: but this is clearly false because this guy did not bring back information about next week's winning lotto pick lol Most normal people can see the disconnect from reality and rational thinking, but you probably don't, and this is why we shouldn't hold it against you; it's just the way you think. wallstreeter43
Zach: There are no known processes that violate the 2nd law of thermodynamics.
"Override" is a better choice of words. Life is a good example, it overrides the second law’s overall effect to drive everything in the universe towards disorder.
Zachriel: By the way, you never answered. What is the origin of the requisite energy to rearrange matter to create organisms?
I would say that its origin is positioned in the same realm as the origin of the universe itself. Box
"Chartsil I believe that almost all csi comes from a mind as our experience as human beings has shown, but I found one rare exception ." It being specified is what you're trying to demonstrate. Just saying that it's specified is circular logic and question begging. "Also Chartsil why did u run away from the evidence for the afterlife ?" Run? I asked for a very specific set of evidences from NDEs and none were given. >A man has an NDE during surgery and comes back 'able to describe' the operating room. No >A man has an NDE during surgery and comes back with a prediction of a natural disaster right down to the date and time. Yes CHartsil
Chartsil I believe that almost all CSI comes from a mind, as our experience as human beings has shown, but I found one rare exception. Actually it's super rare. I found that this extremely rare message is an example of a message that couldn't possibly come from a mind, therefore I believe it arose by blind chance and chemical interactions. "Atheism is an intellectual worldview and not an emotional one so na an an on you" Also Chartsil why did you run away from the evidence for the afterlife? I understand, if I were an atheist it would also make me run away from it as well ;) I wish more people were aware of AWARE ;) Diogenes, wherefore art thou, Diogenes. ::crickets:: wallstreeter43
Definition Deficit Disorder
Definition Deficit Disorder (“DDD”), also known as the “me no speaka the English distraction” and “definition derby,” is a form of sophistry by obfuscation that demands that one’s opponent fulfil unreasonable or even impossible definitional criteria, not to advance the debate but to avoid the debate by claiming one’s opponent cannot adequately define their terms. An example: ID advocate: Intelligent design theory asserts chance causes cannot account for the generation of novel macroevolutionary features and that the best explanation for complex, functionally specified information beyond a reasonable chance threshold is the artifact of an intelligent agent. ID opponent: What do you mean by the terms “intelligent,” “design,” “chance,” “complex,” “functional,” “specified” and “information.” These terms are so vague as to render your argument meaningless. One can be certain that DDD is being employed when a person involved in a debate displays a convenient lapse of understanding of even the most common terms. In extreme cases ID opponents have even claimed that a term they themselves injected into the debate has no clear meaning. Silver Asiatic
"Information – including the information in every post on this forum – cannot be produced by blind processes." Define information CHartsil
By the way, you never answered. What is the origin of the requisite energy to rearrange matter to create organisms? Zachriel
Box: The origin of the information is a entirely different story. There are no known processes that violate the 2nd law of thermodynamics. Zachriel
Zach,
Zach: If you think the manufacture of refrigerators violates the 2nd law of thermodynamics, then you are sorely mistaken.
Does the manufacture also include the design process? If so, see below.
Zach: The energy to power your body come from food sources that derive their energy from the sun.
Likely, but irrelevant. The origin of the information is an entirely different story. Surely you are sorely mistaken if you think that the origin is the sun. Information - including the information in every post on this forum - cannot be produced by blind processes. Box
Box: Hahaha. Good joke! Not an argument. If you think the manufacture of refrigerators violates the 2nd law of thermodynamics, then you are sorely mistaken. No known process violates the 2nd law of thermodynamics, no matter how smart the engineer. Box: I am also referring to living organisms and their creation. The energy to power your body comes from food sources that derive their energy from the sun. From the time you strike the key through its transmission across the internet, the energy comes from electrical power with multiple possible origins. But every step of the process is consistent with the 2nd law of thermodynamics. No known process violates the 2nd law of thermodynamics, no matter how smart the engineer. Zachriel
Zachriel: Yes, it’s called energy and work.
Hahaha. Good joke!
Zachriel: Are you claiming that the intelligence somehow rearranges matter to inject ‘information’?
Box: Yup. Consider any post on this forum.
Zachriel: We were referring to living organisms and their creation.
I am also referring to living organisms and their creation. Box
Box: You must think that you have an explanation for the decrease of entropy by the appearance of refrigerators, humans, cars, high-speed computers, libraries full of science texts and encyclopedias, TV sets, airplanes and spaceships. Yes, it's called energy and work. Zachriel: Are you claiming that the intelligence somehow rearranges matter to inject ‘information’? If so, there is cost measured in work. What is the origin of the requisite energy? Box: Yup. Consider any post on this forum. We were referring to living organisms and their creation. What is the origin of the requisite energy to rearrange matter? Zachriel
My $0.02 for this very important discussion:
Journal of Theoretical Biology, Volume 359, 21 October 2014, Pages 192–198
Two faces of entropy and information in biological systems
Yuriy Mitrokhin
Highlights
• Thermodynamic and information entropy are considered as two forms of total entropic process in biosystems.
• The origination of complexity cannot be compensated only by thermodynamic entropy.
• When and where in the past the entropy has been produced that is a payment for biological organization at present?
• The idea is discussed that the genetic information is an instrument of entropy disproportioning in time.
• The Second Law realization today cannot be without taking into account the information entropy in past generations.
Abstract: The article attempts to overcome the well-known paradox of contradictions between the emerging biological organization and entropy production in biological systems. It is assumed that quality, speculative correlation between entropy and antientropy processes taking place both in the past and today in the metabolic and genetic cellular systems may be perfectly authorized for adequate description of the evolution of biological organization. So far as thermodynamic entropy itself cannot compensate for the high degree of organization which exists in the cell, we discuss the mode of conjunction of positive entropy events (mutations) in the genetic systems of the past generations and the formation of organized structures of current cells. We argue that only the information which is generated in the conditions of the information entropy production (mutations and other genome reorganization) in genetic systems of the past generations provides the physical conjunction of entropy and antientropy processes separated from each other in time generations. It is readily apparent from the requirements of the Second law of thermodynamics.
Keywords: Information entropy; Generating of new information; Disproportionation of entropy; Unsteadiness in genetic systems; Compensation for antientropic processes
http://www.sciencedirect.com/science/article/pii/S0022519314003610 Enezio E. De Almeida Filho
Zach: We were referring to the workings of a refrigerator,
That’s fine, but utterly irrelevant to the discussion.
Zach: though the manufacture of refrigerators are completely in accord with the 2nd law of thermodynamics.
A bold statement. You must think that you have an explanation for the decrease of entropy by the appearance of refrigerators, humans, cars, high-speed computers, libraries full of science texts and encyclopedias, TV sets, airplanes and spaceships.
Zach: In addition, there are many cooling mechanisms in the natural world.
Irrelevant.
Zach: In any case, adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics.
Indeed, because the information enters from the outside, like I said.
Zach: Are you claiming that the intelligence somehow rearranges matter to inject ‘information’?
Yup. Consider any post on this forum. You seem to disagree, state your case. Box
Crap man, AIG doesn't even use the SLoT argument anymore. In fact, they advise against it. So basically, you're lagging behind Ken Ham's understanding of science. CHartsil
Box: So electrical energy brings a refrigerator into existence? You do realize the 2nd law of thermodynamics came about because engineers discovered limits to how efficient they could make a heat engine — no matter how smart the engineer? That discovery includes manufacturing and operating refrigerators. Zachriel
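Zachriel's point about engineered limits can be stated quantitatively: the Carnot bounds on heat-engine efficiency and on refrigerator performance depend only on the reservoir temperatures, not on the cleverness of the engineer. A minimal sketch (Python; the temperatures are illustrative):

# Carnot limits depend only on the reservoir temperatures (illustrative values).

# A heat engine running between a 600 K boiler and a 300 K environment:
T_h, T_c = 600.0, 300.0
eta_max = 1 - T_c / T_h              # best possible thermal efficiency
print(f"max engine efficiency: {eta_max:.0%}")   # 50%

# A refrigerator keeping 275 K inside against a 300 K room:
T_room, T_box = 300.0, 275.0
cop_max = T_box / (T_room - T_box)   # best possible coefficient of performance
print(f"max refrigerator COP : {cop_max:.1f}")   # 11.0

No design, however intelligent, beats these bounds; the thread's dispute is over what, if anything, that implies about the origin of the designs themselves.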
Box: So electrical energy brings a refrigerator into existence? We were referring to the workings of a refrigerator, though the manufacture of refrigerators are completely in accord with the 2nd law of thermodynamics. In addition, there are many cooling mechanisms in the natural world. In any case, adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics. Are you claiming that the intelligence somehow rearranges matter to inject ‘information’? If so, there is cost measured in work. What is the origin of the requisite energy? Zachriel
Zachriel,
Zach: With a refrigerator, it’s not ‘information’ that enters, but electrical energy.
So electrical energy brings a refrigerator into existence? Oh, aha, you are simply assuming the existence of a refrigerator. I thought you were trying to make a case for the idea that the coming into existence of a refrigerator - by e.g. a tornado - doesn't violate the second law ....
Zachriel: Are you claiming that the intelligence somehow rearranges matter to “inject information”?
Yup. Consider any post on this forum. Box
Zachriel:
With a refrigerator, it’s not ‘information’ that enters, but electrical energy.
So electrical energy cools?
In any case, adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics.
It allows there to be order out of disorder. It allows for the order to be maintained longer than if nature operated freely.
Are you claiming that the intelligence somehow rearranges matter to “inject information”?
That seems to be what we do. Joe
Box: Yes it is indeed consistent, because the information enters from the outside. With a refrigerator, it's not 'information' that enters, but electrical energy. In any case, adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics. Are you claiming that the intelligence somehow rearranges matter to inject 'information'? If so, there is cost measured in work. What is the origin of the requisite energy? Zachriel
Hangonasec:
Fine, GAs don’t model unguided evolution AND they demonstrate that unguided evolution leads to genetic entropy.
Wrong. They CAN be used to demonstrate unguided evolution leads to genetic entropy. Joe
Zachriel: Life and evolution are consistent with the 2nd Law of Thermodynamics, as are refrigerators and designer genes.
Yes it is indeed consistent, because the information enters from the outside.
Sewell: (...) we conclude that the fact that order can increase in an open system does not mean that tornados can turn rubble into houses and cars without violating the second law. And it does not mean that computers can appear on a barren planet as long as the planet receives solar energy. Something must be entering from outside which makes the appearance of computers not extremely improbable, for example, computers.
Box
Adding intelligence, as in Intelligent Design, does not allow one to violate the 2nd Law of Thermodynamics. Life and evolution are consistent with the 2nd Law of Thermodynamics, as are refrigerators and designer genes. Zachriel
Biological Information - Entropy, Evolution and Open Systems, 11-15-2014 by Paul Giem - on YouTube. I notice some silly 'arguments' in this thread, e.g. Gordon Davisson, who seems to argue that there is a meaningful relationship between gravity and the organizational order that Granville Sewell is talking about - humans, cars, high-speed computers, libraries full of science texts and encyclopedias, TV sets, airplanes and spaceships. One has to wonder if, by "Oh good grief, not this nonsense again," Gordon really addresses his own drivel. Box
Hey Joe, if you don't mind being logically inconsistent ... what am I saying? You're Joe! Fine, GAs don't model unguided evolution AND they demonstrate that unguided evolution leads to genetic entropy. That's logic, that is! Hangonasec
Hangonasec:
If they don’t model unguided evolution, Sanford’s GA cannot be used to support the contention that unguided evolution leads to ‘genetic entropy’
Cuz you say so? Joe
Joe - and therefore ... ? If they don't model unguided evolution, Sanford's GA cannot be used to support the contention that unguided evolution leads to 'genetic entropy'. That thing with a hole in it - that's your foot, Joe. :D Hangonasec
Hangonasec:
As for Sanford – too funny; GAs suddenly work just fine, with no accusations of ‘smuggling’, when one appears to dismantle evolution.
GAs do work fine. They just do not model unguided evolution. Joe
Of related note, entropy is the ultimate cause of death for our temporal, i.e. material, bodies:
Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
John Sanford on (Genetic Entropy) - Down, Not Up - 2-4-2012 (at Loma Linda University) - video http://www.youtube.com/watch?feature=player_detailpage&v=PHsu94HQrL0#t=1040s
Notes from John Sanford's preceding video:
*3 new mutations every time a cell divides in your body
*Average cell of 15 year old has up to 6000 mutations
*Average cell of 60 year old has 40,000 mutations
Reproductive cells are 'designed' so that, early on in development, they are 'set aside' and thus they do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations still we find that,,,
*60-175 mutations are passed on to each new generation.
This following video brings the point personally home to us about the effects of genetic entropy:
Aging Process - 85 years in 40 seconds - video http://www.youtube.com/watch?v=A91Fwf_sMhk
Also of interest is where the greatest source of entropy is found in the universe:
Entropy of the Universe – Hugh Ross – May 2010 Excerpt: Egan and Lineweaver found that supermassive black holes are the largest contributor to the observable universe’s entropy. They showed that these supermassive black holes contribute about 30 times more entropy than what the previous research teams estimated. http://www.reasons.org/entropy-universe “Einstein’s equation predicts that, as the astronaut reaches the singularity (of the black-hole), the tidal forces grow infinitely strong, and their chaotic oscillations become infinitely rapid. The astronaut dies and the atoms which his body is made become infinitely and chaotically distorted and mixed-and then, at the moment when everything becomes infinite (the tidal strengths, the oscillation frequencies, the distortions, and the mixing), spacetime ceases to exist.” Kip S. Thorne – “Black Holes and Time Warps: Einstein’s Outrageous Legacy” pg. 476
Also of note, Christ overcame gravity, and thus entropy, in his resurrection from death:
A Quantum Hologram of Christ’s Resurrection? by Chuck Missler Excerpt: “You can read the science of the Shroud, such as total lack of gravity, lack of entropy (without gravitational collapse), no time, no space—it conforms to no known law of physics.” The phenomenon of the image brings us to a true event horizon, a moment when all of the laws of physics change drastically. Dame Piczek created a one-fourth size sculpture of the man in the Shroud. When viewed from the side, it appears as if the man is suspended in mid air (see graphic, below), indicating that the image defies previously accepted science. http://www.khouse.org/articles/2008/847 THE EVENT HORIZON (Space-Time Singularity) OF THE SHROUD OF TURIN. – Isabel Piczek – Particle Physicist Excerpt: We have stated before that the images on the Shroud firmly indicate the total absence of Gravity. Yet they also firmly indicate the presence of the Event Horizon. These two seemingly contradict each other and they necessitate the past presence of something more powerful than Gravity that had the capacity to solve the above paradox. http://shroud3d.com/findings/isabel-piczek-image-formation Turin shroud – (Particle Physicist explains event horizon) – video https://www.youtube.com/watch?v=HHVUGK6UFK8 The Center Of The Universe Is Life (Jesus) - General Relativity, Quantum Mechanics, Entropy and The Shroud Of Turin - video http://vimeo.com/34084462
Verses and Music:
John 8:23-24 - But he continued, "You are from below; I am from above. You are of this world; I am not of this world. I told you that you would die in your sins; if you do not believe that I am he, you will indeed die in your sins."
Matthew 10:28 - "Do not fear those who kill the body but are unable to kill the soul; but rather fear Him who is able to destroy both soul and body in hell."
Colossians 1:15-20 - The Son is the image of the invisible God, the firstborn over all creation. For in him all things were created: things in heaven and on earth, visible and invisible, whether thrones or powers or rulers or authorities; all things have been created through him and for him. He is before all things, and in him all things hold together. And he is the head of the body, the church; he is the beginning and the firstborn from among the dead, so that in everything he might have the supremacy. For God was pleased to have all his fullness dwell in him, and through him to reconcile to himself all things, whether things on earth or things in heaven, by making peace through his blood, shed on the cross.
Evanescence – The Other Side (Music-Lyric Video)
http://www.vevo.com/watch/evanescence/the-other-side-lyric-video/USWV41200024?source=instantsearch
bornagain77
LoL! @ Me_Think - Read Moran. Joe
Bob O'H, as to this question,
As a layman, if there truly is no conflict between evolution and thermodynamics, then why did Dr. Behe formulate the first rule,
you state in response to that question:
Because what Behe wrote doesn’t involve thermodynamics.
Really??? And yet Dr. Behe states that 'even the great majority of helpful mutations degrade the genome to a greater or lesser extent':
Of course we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent.,,, I dub it “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain.
So, Bob O'H, in your book this tendency towards degradation, i.e. disorder, instead of towards order, which Behe found in laboratory evolution experiments on microbes going back four decades no less, does not reflect the Second Law's tendency to drive everything in the universe towards disorder??? What, pray tell, is this 'other' law operating in the universe, besides the second law, that is driving living systems towards disorder? And why can't we separate that unknown law's overall effect from the second law's overall effect of driving everything in the universe towards disorder?
A 'flat universe', which is actually another very surprising finely-tuned 'coincidence' of the universe, means this universe, left to its own present course of accelerating expansion due to Dark Energy, will continue to expand forever, thus fulfilling the thermodynamic equilibrium of the second law to its fullest extent (entropic 'Heat Death' of the universe).
The Future of the Universe (Heat Death)
Excerpt: After all the black holes have evaporated, (and after all the ordinary matter made of protons has disintegrated, if protons are unstable), the universe will be nearly empty. Photons, neutrinos, electrons and positrons will fly from place to place, hardly ever encountering each other. It will be cold, and dark, and there is no known process which will ever change things. --- Not a happy ending.
http://spiff.rit.edu/classes/phys240/lectures/future/future.html
Shining Light on Dark Energy – October 21, 2012
Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe,,,
http://crev.info/2012/10/shining-light-on-dark-energy/
"We have the sober scientific certainty that the heavens and earth shall 'wax old as doth a garment'.... Dark indeed would be the prospects of the human race if unilluminated by that light which reveals 'new heavens and a new earth.'" - Lord Kelvin
Psalm 102:25-27 - Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end.
Bob O'H, as to this question,
and why are mutations overwhelmingly detrimental,
you answer:
because (a) most of the fitness effects are tiny, so their effects are too small to notice,
Really??? Better inform these guys of that:
Unexpectedly small effects of mutations in bacteria bring new perspectives - November 2010
Excerpt: Most mutations in the genes of the Salmonella bacterium have a surprisingly small negative impact on bacterial fitness. And this is the case regardless of whether they lead to changes in the bacterial proteins or not.,,, Using extremely sensitive growth measurements, doctoral candidate Peter Lind showed that most mutations reduced the rate of growth of bacteria by only 0.500 percent. No mutations completely disabled the function of the proteins, and very few had no impact at all. Even more surprising was the fact that mutations that do not change the protein sequence had negative effects similar to those of mutations that led to substitution of amino acids. A possible explanation is that most mutations may have their negative effect by altering mRNA structure, not proteins, as is commonly assumed.
http://www.physorg.com/news/2010-11-unexpectedly-small-effects-mutations-bacteria.html
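As a rough illustration of what a 0.5% growth-rate deficit amounts to over many generations (a toy calculation assuming a constant per-generation disadvantage, not the study's own analysis):

    # Relative abundance of a mutant growing 0.5% slower than wild type,
    # assuming a constant per-generation disadvantage s = 0.005 (illustrative only).
    s = 0.005
    for g in (10, 100, 500, 1000):
        print(g, (1 - s) ** g)   # about 0.95, 0.61, 0.08 and 0.007 of the wild-type level

Whether selection in a real, finite population actually 'notices' an effect of this size depends on factors such as population size, which is part of what is being argued over here.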
And as to this question, Bob,
and why does all evidence support Dr. Sanford’s contention of Genetic Entropy?
you answer:
(b) it doesn’t.
Really??? Man, you Darwinists really ought to share some of the evidence that only you guys seem privy to! The evidence for the detrimental nature of mutations in humans is overwhelming, for scientists have already cited over 100,000 mutational disorders.
Inside the Human Genome: A Case for Non-Intelligent Design - Pg. 57 - By John C. Avise
Excerpt: "Another compilation of gene lesions responsible for inherited diseases is the web-based Human Gene Mutation Database (HGMD). Recent versions of HGMD describe more than 75,000 different disease causing mutations identified to date in Homo-sapiens."
I went to the mutation database website cited by John Avise and found: Mutation total (as of 2014-05-02) - 148,413
http://www.hgmd.cf.ac.uk/ac/
Human Genetic Variation Recent, Varies Among Populations - (Nov. 28, 2012)
Excerpt: Nearly three-quarters of mutations in genes that code for proteins -- the workhorses of the cell -- occurred within the past 5,000 to 10,000 years,,, "One of the most interesting points is that Europeans have more new deleterious (potentially disease-causing) mutations than Africans,",,, "Having so many of these new variants can be partially explained by the population explosion in the European population. However, variations that occur in genes that are involved in Mendelian traits and in those that affect genes essential to the proper functioning of the cell tend to be much older." (A Mendelian trait is controlled by a single gene. Mutations in that gene can have devastating effects.) The amount of variation or mutation identified in protein-coding genes (the exome) in this study is very different from what would have been seen 5,000 years ago,,, The report shows that "recent" events have a potent effect on the human genome. Eighty-six percent of the genetic variation or mutations that are expected to be harmful arose in European-Americans in the last five thousand years, said the researchers. The researchers used established bioinformatics techniques to calculate the age of more than a million changes in single base pairs (the A-T, C-G of the genetic code) that are part of the exome or protein-coding portion of the genomes (human genetic blueprint) of 6,515 people of both European-American and African-American descent.,,,
http://www.sciencedaily.com/releases/2012/11/121128132259.htm
Scientists Discover Proof That Humanity Is Getting Dumber, Smaller And Weaker - By Michael Snyder, April 29th, 2014
Excerpt: An earlier study by Cambridge University found that mankind is shrinking in size significantly. Experts say humans are past their peak and that modern-day people are 10 percent smaller and shorter than their hunter-gatherer ancestors. And if that's not depressing enough, our brains are also smaller. The findings reverse perceived wisdom that humans have grown taller and larger, a belief which has grown from data on more recent physical development. The decline, said scientists, has happened over the past 10,000 years.
http://thetruthwins.com/archives/scientists-discover-proof-that-humanity-is-getting-dumber-smaller-and-weaker
Human mutation rate slower than thought - Moms and dads not equal in passing down genetic typos - June 2011
Excerpt: the rate indicates that, on average, about one DNA chemical letter in every 85 million gets mutated per generation through copying mistakes made during sperm and egg production. The new rate means each child inherits somewhere in the neighborhood of 30 to 50 new mutations.,,,
http://www.sciencenews.org/view/generic/id/331194/title/Human_mutation_rate_slower_than_thought
Interestingly, this 'slightly detrimental' mutation rate of 30 per generation is far greater than what even evolutionists agree is an acceptable mutation rate, since detrimental mutations accumulate far faster than 'selection' can eliminate them from any given genome:
Human evolution or extinction - discussion on acceptable mutation rate per generation (with clips from Dr. John Sanford) - video http://www.youtube.com/watch?v=aC_NyFZG7pM
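To see the bookkeeping behind the 'faster than selection can eliminate them' claim, here is a minimal toy simulation (a sketch under assumed parameters, not Sanford's model): each offspring inherits its parent's count of slightly deleterious mutations plus a fixed number of new ones, and parents are drawn in proportion to a multiplicative fitness.

    import random

    N, U, s, GENS = 1000, 30, 0.001, 500   # population size, new mutations per generation,
                                           # per-mutation cost, generations (all assumed)
    pop = [0] * N                          # deleterious-mutation count carried by each individual

    for gen in range(GENS + 1):
        if gen % 100 == 0:
            print(gen, sum(pop) / N)              # mean mutation count per genome
        weights = [(1 - s) ** k for k in pop]     # multiplicative fitness (1 - s)^k
        parents = random.choices(pop, weights=weights, k=N)
        pop = [k + U for k in parents]            # each offspring gains U new mutations

With these assumed values the mean count climbs throughout the run, since selection removes mutations more slowly than they arrive; whether the assumed effect size and mutation rate are realistic is, of course, the very point in dispute.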
bornagain77
BA77 @5
As a layman, if there truly is no conflict between evolution and thermodynamics, then why did Dr. Behe formulate the first rule, and why are mutations overwhelmingly detrimental, and why does all evidence support Dr. Sanford's contention of Genetic Entropy?
Why on earth would anyone think detrimental mutations involve more entropy change than beneficial ones? Apart, that is, from people with barely a clue what entropy is. I'll let Behe off; as a biochemist, he probably does know what entropy is, and therefore wisely steers clear of this most bogus of arguments, BA's co-option of his name notwithstanding. As for Sanford - too funny; GAs suddenly work just fine, with no accusations of 'smuggling', when one appears to dismantle evolution. Hangonasec
Joe @ 12
Darwinian claims for evolution are consistent with astrology’s claims for the future. Astrology is science :-) Read Behe
Me_Think
rvb8: You say you are not a maths and science major (which was no secret to anyone here, given the way you've always avoided detailed discussion of anything close to science), but you also say that you like Gordon Davisson's presentation because it's written in a way that you can understand. Really? So you understood this?: "calculate the X-entropy change as a block of X is compressed by 0.1% (i.e. compressed to 0.999 of its original volume)." And this?: "If you integrate the final X-entropy over the original volume of the block, you'll find that the change in X-entropy in the 0.1% that no longer contains X … diverges to negative infinity. If you integrate only over the final volume, the X-entropy change depends on a constant of integration … which is undefined."
I'm not knocking Gordon's presentation, which seems to me to be pretty good (which is not to say I entirely agree with his conclusions, but only to give credit where it's due); I am questioning why you would agree with paragraphs which, given your self-admitted lack of training in the area, you very likely do not comprehend. Or do you automatically agree with anything that seems to denounce ID, whether you understand it or not?
It certainly *looks* as if you have read only the refutations of Sewell, and not Sewell's argument which is being refuted. If so, you have made up your mind without first hearing both sides. Charles Darwin directly stated that in order to get a fair intellectual result, one must hear both sides of any controversy. So you operate against the spirit of Darwin himself. (Which is not uncommon, for modern Darwinians.) Timaeus
Darwinian claims for evolution are consistent with astrology's claims for the future. ;) Joe
As a layman, if there truly is no conflict between evolution and thermodynamics, then why did Dr. Behe formulate the first rule,
Because what Behe wrote doesn't involve thermodynamics.
and why are mutations overwhelmingly detrimental, and why does all evidence support Dr. Sanford's contention of Genetic Entropy?
because (a) most of the fitness effects are tiny, so their effects are too small to notice, and (b) it doesn't. Bob O'H
RVB8, if you're hanging onto the words of Nick Matzke, you're not in very good company; we are still waiting for his book that proves macro-evolution. You might want to reconsider what you consider a trustworthy source... Andre
F/N: Sewell's core point -- never mind the many dismissals -- is right. Per Clausius' formulation, the way we get the expression dS >= d'Q/T is that we consider two interacting bodies A and B within an isolated overall system, with A hotter than B. On passing an increment of heat d'Q from A to B, taking the ratio d'Q/T for each body and adding up, the heat-importing body B INCREASES its entropy by an amount sufficient to over-compensate for the loss of entropy in A on giving up that increment of heat.
Crossing over into an informational, microscopic view (objectors tend to denigrate or dismiss this perspective), the added increment allows mass and energy in B to be distributed and arranged in more possible ways. This effectively means that the average missing information needed to specify the particular microstate of B consistent with its macro-scale [P, V, T etc.] state has increased sharply. That can be measured as the length of the chain of yes/no questions required to describe the state.
This means, in effect (and in first approximation terms), that a body that imports raw, uncorrelated energy not coupled to mechanisms that perform work will tend to move towards a less ordered state. The notion that we may reasonably expect functionally specific, complex organisation and associated information [FSCO/I] relevant to cell based life to spontaneously emerge from such a raw energy injection fails. It fails because of the very premise of the statistical form of the 2nd law of thermodynamics: the probability (rooted in statistical weights) of special ordered -- much less, organised -- states on blind, undirected needle-in-haystack search is vanishingly small relative to the bulk of configurational possibilities.
In short, on first principles, it is not reasonable to expect a warm sun-irradiated and lightning-struck pond or the like to spontaneously open the gateway to C-chemistry, aqueous medium, encapsulated, gated, code and algorithm using cell based life. But we live in an age where to say such is ever so politically incorrect, and it will be denounced and attacked. In answer I say the same as I say to string theorists, multiverse aficionados and the like: SHOW me. I can readily show how raw energy importation tends to ADD to entropy and to disorganise, starting with the impacts of diffusion and destructive chemistry etc. If you want to argue for a successful Darwin's pond or the like and insist that thermodynamics and its underlying considerations have no constraining relevance, SHOW us empirically. I guarantee that once the exaggerated headlines and hype are set aside, it will be seen that it's froth, not mauby. KF
PS: Interested onlookers may wish to look here http://www.angelfire.com/pro/kairosfocus/resources/Info_design_and_science.htm#shnn_info . . . and here in my always linked briefing note: http://www.angelfire.com/pro/kairosfocus/resources/Info_design_and_science.htm#thermod kairosfocus
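To make the Clausius bookkeeping above concrete, here is a tiny numeric check with illustrative values only: the entropy gained by the cooler, heat-importing body B exceeds the entropy given up by the hotter body A, so the total does not decrease.

    Q = 100.0                 # joules of heat passed from hot body A to cooler body B (assumed)
    T_A, T_B = 400.0, 300.0   # temperatures in kelvin (assumed), with T_A > T_B

    dS_A = -Q / T_A           # entropy change of A: -0.25 J/K
    dS_B = +Q / T_B           # entropy change of B: about +0.33 J/K
    print(dS_A, dS_B, dS_A + dS_B)   # total is about +0.08 J/K, i.e. non-negative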
Holy moley, BA77!! You're like Mr. Universe in the movie "Serenity". You have access to ALL the information! Those are some great links. William J Murray
All of the tired 2nd Law nonsense is beautifully exposed at a new science-friendly site compiled by the very smart James Downard. It is Trouble In Paradise (TIP), and is available at http://tortucan.wordpress.com. Described by Matzke as a voluminous history of all forms of creationism, its contents cover all the shenanigans up to 2004 and the eve of Dover, and apparently he is currently compiling 2004 to the present. Downard has culled over 30,000 documents, and all the embarrassing missteps of the past are easily and glaringly available; a new and uneditable resource for science history. rvb8
Sewell's papers have been discussed at TSZ a number of times. Here are the technical threads:
* A Second Look at the Second Law…
* Granville Sewell vs Bob Lloyd
* Granville Sewell Doubles Down
The bottom line: Sewell does not understand statistical physics and thermodynamics all that well. skram
As a layman, if there truly is no conflict between evolution and thermodynamics, then why did Dr. Behe formulate the first rule, and why are mutations overwhelmingly detrimental, and why does all evidence support Dr. Sanford's contention of Genetic Entropy?
“The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain - Michael Behe - December 2010
Excerpt: In its most recent issue The Quarterly Review of Biology has published a review by myself of laboratory evolution experiments of microbes going back four decades.,,, The gist of the paper is that so far the overwhelming number of adaptive (that is, helpful) mutations seen in laboratory evolution experiments are either loss or modification of function. Of course we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent.,,, I dub it “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain.
http://behe.uncommondescent.com/2010/12/the-first-rule-of-adaptive-evolution/
Multiple Overlapping Genetic Codes Profoundly Reduce the Probability of Beneficial Mutation - George Montañez, Robert J. Marks II, Jorge Fernandez and John C. Sanford - May 2013
Excerpt: It is almost universally acknowledged that beneficial mutations are rare compared to deleterious mutations [1–10].,, It appears that beneficial mutations may be too rare to actually allow the accurate measurement of how rare they are [11].
1. Kibota T, Lynch M (1996) Estimate of the genomic mutation rate deleterious to overall fitness in E. coli. Nature 381:694–696.
2. Charlesworth B, Charlesworth D (1998) Some evolutionary consequences of deleterious mutations. Genetica 103:3–19.
3. Elena S, et al (1998) Distribution of fitness effects caused by random insertion mutations in Escherichia coli. Genetica 102/103:349–358.
4. Gerrish P, Lenski R N (1998) The fate of competing beneficial mutations in an asexual population. Genetica 102/103:127–144.
5. Crow J (2000) The origins, patterns, and implications of human spontaneous mutation. Nature Reviews 1:40–47.
6. Bataillon T (2000) Estimation of spontaneous genome-wide mutation rate parameters: whither beneficial mutations? Heredity 84:497–501.
7. Imhof M, Schlotterer C (2001) Fitness effects of advantageous mutations in evolving Escherichia coli populations. Proc Natl Acad Sci USA 98:1113–1117.
8. Orr H (2003) The distribution of fitness effects among beneficial mutations. Genetics 163:1519–1526.
9. Keightley P, Lynch M (2003) Toward a realistic model of mutations affecting fitness. Evolution 57:683–685.
10. Barrett R, et al (2006) The distribution of beneficial mutation effects under strong selection. Genetics 174:2071–2079.
11. Bataillon T (2000) Estimation of spontaneous genome-wide mutation rate parameters: whither beneficial mutations? Heredity 84:497–501.
http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0006
Genetic Entropy - Dr. John Sanford - Evolution vs. Reality - video (Notes in description)
http://vimeo.com/35088933
"Moreover, there are strong theoretical reasons for believing there are no truly neutral nucleotide positions. By its very existence, a nucleotide position takes up space, affects spacing between other sites, and affects such things as regional nucleotide composition, DNA folding, and nucleosome building.
If a nucleotide carries absolutely no (useful) information, it is, by definition, slightly deleterious, as it slows cell replication and wastes energy. Therefore, there is no way to change any given site without some biological effect, no matter how subtle." - John Sanford - Genetic Entropy and The Mystery of The Genome - pg. 21 - Inventor of the 'Gene Gun'
"The neo-Darwinians would like us to believe that large evolutionary changes can result from a series of small events if there are enough of them. But if these events all lose information they can't be the steps in the kind of evolution the neo-Darwin theory is supposed to explain, no matter how many mutations there are. Whoever thinks macroevolution can be made by mutations that lose information is like the merchant who lost a little money on every sale but thought he could make it up on volume." - Lee Spetner (Ph.D. Physics - MIT - Not By Chance)
Now you may still insist there is no conflict between thermodynamics and evolution, but the empirical evidence itself tells me otherwise! Not being versed in math myself, I think I will follow the evidence instead of you. Hope you don't mind.
The Scientific Method - Richard Feynman - video
Quote: "If it disagrees with experiment, it's wrong. In that simple statement is the key to science. It doesn't make any difference how beautiful your guess is, it doesn't matter how smart you are, who made the guess, or what his name is… If it disagrees with experiment, it's wrong. That's all there is to it."
https://www.youtube.com/watch?v=OL6-x0modwY
Of related note: Classical Information in the cell has now been physically measured and is shown to correlate to the thermodynamics of the cell:
Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy – November 2010
Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the "Maxwell demon" thought experiment devised in 1867.,,, In Maxwell's thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information.
http://www.physorg.com/news/2010-11-maxwell-demon-energy.html
Demonic device converts information to energy – 2010
Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski.
http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform
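For a sense of scale, the textbook Szilard/Landauer figure for the maximum work obtainable per bit of information is k_B*T*ln(2); the Sano experiment reportedly recovered only a fraction of this bound. A quick calculation at an assumed room temperature:

    import math

    k_B = 1.380649e-23             # Boltzmann constant, J/K
    T   = 300.0                    # assumed room temperature, K
    w_bit = k_B * T * math.log(2)  # maximum work extractable per bit of information
    print(w_bit)                   # about 2.9e-21 J, roughly 0.018 eV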
Moreover, Dr. McIntosh, who is Professor of Thermodynamics and Combustion Theory at the University of Leeds, holds that regarding information as independent of energy and matter 'resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions'.
Information and Thermodynamics in Living Systems – Andy C. McIntosh – 2013 Excerpt: ,,, information is in fact non-material and that the coded information systems (such as, but not restricted to the coding of DNA in all living systems) is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which despite the efforts from alternative paradigms has not given a satisfactory explanation of the way information in systems operates.,,, http://www.worldscientific.com/doi/abs/10.1142/9789814508728_0008
Here is a recent video by Dr. Giem that gets the main points of Dr. McIntosh's paper across very well, in an easy-to-understand manner for the layperson:
Biological Information – Information and Thermodynamics in Living Systems 11-22-2014 by Paul Giem (A. McIntosh) – video https://www.youtube.com/watch?v=IR_r6mFdwQM
bornagain77
Thank you, Gordon. Not being a scientist or maths major, I always find it pleasant when explanations are given in a way I can follow. Like you, I exclaimed, 'this hoary old chestnut again, hasn't it been refuted, proved wrong, laughed at and then spanked for presumption dozens of times?' I knew it had, but couldn't be bothered with their tired reclamation of barren thought. Cheers! rvb8
Oh good grief, not this nonsense again. Look, I know it's tempting to think there should be a conflict between evolution and thermodynamics, but nobody's actually been able to point one out. That certainly includes Granville Sewell: in the first place, he doesn't even claim to have found a conflict with the second law, only that he thinks there's some sort of conflict (that he can't precisely point out) between evolution and what he sees as the fundamental principle behind the second law (which is to say, not the second law itself). In the second place, his attempts to explain this fundamental principle are pretty obviously wrong.
Let me point out two examples of how he gets it wrong, specifically problems with his X-Entropies (see e.g. section 2 of his article "Entropy, Evolution and Open Systems" in "BIO-Complexity and Biological Information: New Perspectives"). Essentially, he defines a family of entropy functions, one for each diffusing substance (e.g. the carbon-entropy), and winds up claiming:
Furthermore, equation (5) does not simply say that the X-entropy cannot decrease in an isolated system; it also says that in a non-isolated system, the X-entropy cannot decrease faster than it is exported through the boundary, because the boundary integral there represents the rate at which X-entropy is exported across the boundary.
This is true in the situation he considered, but not true in general. I gave a simple counterexample back in 2011, and I'll repeat it here:
Consider a jar containing nitrogen gas and some powdered graphite (a form of carbon). Let me start with the jar's contents thoroughly mixed: the graphite is uniformly scattered throughout the volume inside the jar, and it and the gas are all at the same temperature. What happens if the jar is isolated (except for gravity), and just left to sit for a while? All of the graphite will settle to the bottom; its arrangement is then more ordered, and in fact the carbon entropy within the jar has decreased. This does not, however, violate the second law of thermodynamics (as I'll explain in a bit).
So what's happening in this example is what Sewell claims to have shown is impossible: a carbon-entropy decrease in an isolated system. Apparently, gravity is all it takes to violate his version of the second law. Now, you could complain that I'm talking about gasses and powders, while Sewell's calculations only apply to diffusion through a solid, but the same basic thing'll happen in solids as well (denser components diffusing toward the bottom, lighter ones toward the top). And besides, if Sewell's calculations only apply to solids, they're pretty thoroughly irrelevant to life, aren't they?
Now, the next bit of my earlier post is also relevant, because it turns out that this is actually an example of what Denyse thinks is fishy: an entropy decrease of one sort being compensated for by an increase of another sort of entropy:
Something even more interesting happened to the distribution of thermal energy within the jar. As the graphite particles settle to the bottom, their gravitational energy is converted to kinetic energy (their downward motion), and then that's converted to heat (both by friction as they fall, and the inelastic collisions when they hit the bottom of the jar). More of this heat is released near the bottom of the jar than the top, so the jar's contents will become warmer at the bottom than the top. Intuitively (to me at least), this means that the heat has become more ordered. But the thermal entropy went up, because there is more heat than at the beginning, and this outweighs the entropy decrease from nonuniformity. (As with human vs. bacterium, it seems to me that the best way to think of this is that since there's more heat than there was at the beginning, it's not a contradiction that the heat can be both more ordered and more disordered than it was at the beginning.) (BTW, the heat will eventually even out, removing the thermal order and increasing the thermal entropy even further.)
So how does this fit with the second law of thermodynamics? It turns out that the increase in thermal entropy is larger than the decrease in carbon entropy, so the total entropy has increased, and the second law is satisfied. Sewell claims that the second law applies separately to each different kind of entropy, but this is not true in general. The only reason it works out that way in Sewell's math is that there's no coupling between the distribution of heat and carbon when they're diffusing through a solid; anytime there's any coupling between them, you can have conversion from one form of entropy to another.
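Here is a rough, order-of-magnitude version of that bookkeeping (all numbers are assumed for illustration, and the grains are treated as independent particles for the configurational term):

    import math

    k_B  = 1.380649e-23           # Boltzmann constant, J/K
    g, T = 9.8, 300.0             # gravity (m/s^2) and temperature (K), assumed

    m, h    = 1e-3, 0.05          # 1 g of graphite powder falling an average of 5 cm (assumed)
    grain_r = 5e-6                # assumed 10-micron-diameter grains
    grain_m = 2200 * (4/3) * math.pi * grain_r**3   # graphite density ~2200 kg/m^3
    N       = m / grain_m                           # number of grains, ~1e9

    dS_thermal = m * g * h / T              # entropy from the heat released as the grains settle
    dS_config  = -N * k_B * math.log(100)   # grains confined to ~1/100 of the jar volume (assumed)
    print(dS_thermal, dS_config, dS_thermal + dS_config)

With these assumptions the thermal entropy increase is about eight orders of magnitude larger than the configurational decrease from the carbon settling, which is the sense in which the total entropy still goes up even though the carbon has become more ordered.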
I haven't read the Parunak and Brueckner paper yet, but from the abstract this seems to be the same sort of coupling they're talking about (although my example is certainly a lot simpler).
I promised to point out two examples of problems with Sewell's analysis, so here's another. Well, actually, let me phrase this as a challenge for anyone who thinks his definition of X-entropy makes mathematical sense: calculate the X-entropy change as a block of X is compressed by 0.1% (i.e. compressed to 0.999 of its original volume). This'll obviously require some knowledge of calculus, but anyone with the requisite expertise is welcome to take a stab at it (Sewell? Rob Sheldon? I'd suggest Sal Cordova, but he knows enough physics to realize the entropy arguments are bogus). The problem you're going to run into if you try to run the calculation is that the math falls apart. I don't mean it gives wrong answers, I mean it doesn't give meaningful answers. If you integrate the final X-entropy over the original volume of the block, you'll find that the change in X-entropy in the 0.1% that no longer contains X ... diverges to negative infinity. If you integrate only over the final volume, the X-entropy change depends on a constant of integration ... which is undefined. Either way, Sewell's math falls apart even for this very simple situation.
Actually, I'll add a third problem (which also shows up in Jim Smith's comment #2): he misunderstands the relationship between entropy and probability. There's a direct relationship between entropy and probability only in the case of an isolated system fluctuating around thermodynamic equilibrium. In open systems fluctuating around equilibrium there's a more complicated relationship, and for non-equilibrium systems (e.g. pretty much everything relating to life) the relationship breaks down entirely. Gordon Davisson
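On that third point, the textbook case where entropy and probability do line up is an isolated system at equilibrium: each macrostate's probability is proportional to its multiplicity W, and S = k_B ln W. A toy illustration with an assumed 100-particle, two-state system:

    from math import comb, log

    N = 100                          # an isolated toy system of 100 two-state particles (assumed)
    for n in (50, 60, 70):           # macrostates: n particles in the "up" state
        W = comb(N, n)               # multiplicity of the macrostate
        print(n, log(W), W / 2**N)   # S/k_B = ln(W), and probability if all 2^N microstates are equally likely

Higher-entropy macrostates come out exponentially more probable here precisely because the system is isolated and at equilibrium; outside that regime the simple identification of entropy with probability breaks down, as Gordon notes.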
Entropy is a fancy word for probability. Even in an open system, you have to add up the probabilities to prove your point; you can't just wave your hands or make a declaration. For example, the fact that the earth is an open system does not allow tornadoes to turn rubble into buildings. You may find this relevant:
http://bio-complexity.org/ojs/index.php/main/article/download/BIO-C.2013.2/BIO-C.2013.2
Entropy and Evolution - Granville Sewell, Mathematics Department, University of Texas, El Paso, Texas, USA
Abstract: It is widely argued that the spectacular local decreases in entropy that occurred on Earth as a result of the origin and evolution of life and the development of human intelligence are not inconsistent with the second law of thermodynamics, because the Earth is an open system and entropy can decrease in an open system, provided the decrease is compensated by entropy increases outside the system. I refer to this as the compensation argument, and I argue that it is without logical merit, amounting to little more than an attempt to avoid the extraordinary probabilistic difficulties posed by the assertion that life has originated and evolved by spontaneous processes. To claim that what has happened on Earth does not violate the fundamental natural principle behind the second law, one must instead make a more direct and difficult argument. Jim Smith
OT: Stephen Meyer Talks ID on New Zealand's Leighton Smith Show, pt. 1 (February 2015) https://www.youtube.com/watch?v=bhsVUMvMAvM part 2 https://www.youtube.com/watch?v=yvHBSC5RYc4 bornagain77
