Uncommon Descent Serving The Intelligent Design Community

Evidence of Decay is Evidence of Progress?


It’s called entropy, and it applies to everything. If you’re a pianist and don’t practice on a regular basis you don’t stay the same, you get worse, and it takes extra discipline, effort, and dedication to get better.

Natural selection is a buffer against decay that is constantly operating in nature. Natural selection throws out bad stuff in a competitive environment but has no creative powers. Since decay is the norm, and random errors, statistically speaking, essentially always result in decay, a creature living underground will lose its eyes because the informational cost of producing eyes is high.

Thus, a crippled, decayed creature in a pathologically hostile environment will have a survival advantage. This is devolution, not evolution.

This phenomenon is not only logically obvious; Michael Behe and others have demonstrated that it is also empirically the case.

Belief in the infinitely creative powers of natural selection is illogical, empirically falsified, and essentially represents, in my view, a cult-like mindset.

When evidence of decay is presented as evidence of progress, one must wonder what is going on in the minds of such people.

Comments
The Second Law of Thermodynamics states that the entropy of an isolated system, such as the universe, that is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium. The Third Law of Thermodynamics states that as the temperature approaches absolute zero, the entropy of a system approaches a constant. Fortunately for us, the temperature of the universe is not zero. It is moving that way each moment, but it is not there yet. - ICR
Q1: Is entropy highest or lowest at equilibrium? Why? Q2: Is entropy highest or lowest at absolute zero? Why?
Mung
May 15, 2011, 01:05 PM PDT
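A small numerical sketch of the answers, using a toy two-state model (an illustration only, not from the discussion): the multiplicity W(n) = C(N, n) of a macrostate with n of N particles "up" peaks at n = N/2, so S = k ln W is highest at the most mixed, equilibrium macrostate (Q1); at absolute zero a perfect crystal has a single accessible microstate, so S approaches a constant, namely zero (Q2).

```python
# Toy illustration (not from the original discussion): entropy S = k*ln(W)
# for N two-state particles, where W(n) = C(N, n) counts the microstates
# with n particles in the "up" state.  S peaks at n = N/2 (equilibrium, Q1)
# and vanishes when only one microstate is accessible (W = 1), which is the
# Third-Law limit for a perfect crystal at absolute zero (Q2).
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_up: int, n_total: int) -> float:
    """Boltzmann entropy of the macrostate with n_up particles 'up'."""
    w = math.comb(n_total, n_up)      # multiplicity of the macrostate
    return K_B * math.log(w)

N = 100
for n in (0, 10, 25, 50, 75, 90, 100):
    print(f"n_up = {n:3d}  S = {entropy(n, N):.3e} J/K")
# The printed entropy peaks at n_up = 50 (the 50/50 equilibrium macrostate)
# and is zero at n_up = 0 or 100, where exactly one microstate is possible.
```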
Mung, bornagain77: Ok, FWIW, I'll take a stab at clarifying how "entropy gain is information loss":

Note carefully that "information" is not the same as the "physical medium" in which it is encoded. "Information" per se (e.g. 111100001111; a sequence of 4 on bits, 4 off bits, and 4 on bits) is an abstract intangible (not physical), but the medium in which the information is encoded is physical and tangible and can be transistors on a chip, voltages on a bus, cans on a fence rail, or nucleotides in a DNA strand.

If the encoding system is closed or isolated, then even though the energy within is conserved, the physical states will move to equilibrium having greater "multiplicity" and greater entropy, which state(s) for the purposes of encoding information have a commensurately greater disorder, i.e. a loss of encoded information. To restore order or encode information requires energy input, which by definition requires an open system (not isolated) into which encoding energy can be input.

From the Physorg link http://www.physorg.com/news/2011-01-scientists-erase-energy.html bornagain77 posted: "For example, Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it is physical has a broader context than that." A more precise statement would be that information is abstract yet always encoded in physical states, which physical states invariably require energy input to alter those physical states such that information is re-encoded (i.e. erased; erasure being a specific information state, e.g. all 0's, or all 1's, or a pseudo-random bit pattern, etc.).

Earlier from the same Physorg link: "theoretically, information can be erased without using any energy at all. Instead, the cost of erasure can be paid in terms of another conserved quantity, such as spin angular momentum." Erasure (of information) still costs something, though I've not thought through (yet) how that cost can be paid with spin angular momentum.

"Gain in entropy always means loss of information, and nothing more." Because intangible information is encoded in physical mediums (e.g. molecular lattice in a crystal), when the physical medium becomes disordered (e.g. the crystal melts) then its entropy has increased even though the encoding material remains (albeit in a different state), but its encoded information (the lattice) is lost. The entropy (of the crystal) increased and information (the lattice) is lost.

See as background on entropy and disorder:
http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entropcon.html
http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html
http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop2.html
http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/seclaw.html#c1
Charles
May 15, 2011, 12:55 PM PDT
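A minimal sketch of the point above in code (a toy model, not Charles's own example): treat random degradation of a stored bit pattern as a binary symmetric channel with flip probability p. As p grows, the per-bit noise entropy H(p) rises and the recoverable information per bit, 1 - H(p), falls, so entropy gain in the medium is exactly information loss for the message.

```python
# Toy model (an illustration, not from the thread): random degradation of a
# stored bit pattern viewed as a binary symmetric channel with flip
# probability p.  The per-bit noise entropy H(p) grows as errors accumulate,
# and the recoverable information per bit, 1 - H(p), shrinks accordingly.
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a biased coin with P(flip) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.01, 0.05, 0.1, 0.25, 0.5):
    h = binary_entropy(p)
    print(f"flip prob {p:4.2f}: noise entropy {h:.3f} bits/bit, "
          f"recoverable information {1 - h:.3f} bits/bit")
# At p = 0.5 the medium is maximally disordered and retains no trace of the
# original pattern: recoverable information per bit has dropped to zero.
```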
DrBot, this may be of interest to you:

Evolutionary Algorithms: Are We There Yet? - Ann Gauger
Excerpt: In the recent past, several papers have been published that claim to demonstrate that biological evolution can readily produce new genetic information, using as their evidence the ability of various evolutionary algorithms to find a specific target. This is a rather large claim.,,,,, As perhaps should be no surprise, the authors found that ev uses sources of active information (meaning information added to the search to improve its chances of success compared to a blind search) to help it find its target. Indeed, the algorithm is predisposed toward success because information about the search is built into its very structure. These same authors have previously reported on the hidden sources of information that allowed another evolutionary algorithm, AVIDA [3-5], to find its target. Once again, active information introduced by the structure of the algorithm was what allowed it to be successful. These results confirm that there is no free lunch for evolutionary algorithms. Active information is needed to guide any search that does better than a random walk.
http://biologicinstitute.org/2010/12/17/evolutionary-algorithms-are-we-there-yet/
bornagain77
May 15, 2011, 12:22 PM PDT
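To make the "active information" claim concrete, here is a toy comparison (a sketch of the general idea; it is not the ev or AVIDA code the Gauger piece discusses): a blind search samples 32-bit strings at random, while a guided search is handed a fitness function that reports how close a candidate is to the target. The fitness function is the added information; without it, the expected cost is on the order of 2^32 samples.

```python
# Toy sketch (not the ev/AVIDA programs discussed above): compare a blind
# random search for a 32-bit target with a search that is given a fitness
# function measuring closeness to that target.  The fitness function is the
# "active information" -- remove it and the expected blind-search cost is
# about 2**32 samples.
import random

L = 32
TARGET = [random.randint(0, 1) for _ in range(L)]

def fitness(candidate):
    """Number of bits matching the target -- information about the search."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def guided_search(max_steps=100_000):
    """Single-parent hill climber using the fitness function as a guide."""
    current = [random.randint(0, 1) for _ in range(L)]
    for step in range(1, max_steps + 1):
        child = current[:]
        child[random.randrange(L)] ^= 1          # flip one random bit
        if fitness(child) >= fitness(current):   # keep it if no worse
            current = child
        if current == TARGET:
            return step
    return None

print("guided search found target in", guided_search(), "mutations")
print("blind search would need ~2**32 =", 2**32, "samples on average")
```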
Mung, you ask: "So you agree your second source contradicted your first source?" NO!
bornagain77
May 15, 2011, 11:56 AM PDT
Mung, I posted the second resource for a very technical reason having to do with how Landauer was trying to make information merely a emergent property of material particles.
So you agree your second source contradicted your first source? How was anyone reading your post supposed to understand your intent? So are you now arguing that information is non-physical? What about entropy, is it also non-physical?
Gain in entropy always means loss of information, and nothing more.
Why? How so? On the one hand we have a physical system. On the other hand we have something called information. How does entropy tie the two together?
None-the-less I’m glad you agree with the basic premise of the Landuaer resource and am glad you ‘finally’ see how deeply the Lewis quote describes the relation between entropy and information.
Funny guy.
Mung
May 15, 2011, 11:47 AM PDT
With regard to information: I wonder if this is the insight John von Neumann had when he suggested to Shannon that he should name his term entropy. (And no, I don't mean because no one knows what it is.) But as something mathematical, a certain kind of quantity.
Mung
May 15, 2011, 11:34 AM PDT
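For reference, the formal parallel usually attributed to von Neumann's suggestion is just the standard pair of textbook formulas (nothing beyond them):

```latex
H = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon entropy, bits per symbol)}

S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs entropy, J/K)}

S = (k_B \ln 2)\, H \quad \text{for the same distribution } \{p_i\}
```

That is, the two quantities differ only by a constant factor and the choice of logarithm base.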
Mung, I posted the second resource for a very technical reason having to do with how Landauer was trying to make information merely an emergent property of material particles. One that I certainly do not want to go through with you. Nonetheless, I'm glad you agree with the basic premise of the Landauer resource and am glad you 'finally' see how deeply the Lewis quote describes the relation between entropy and information.
bornagain77
May 15, 2011, 11:11 AM PDT
Indeed, when it comes to that, in the end energy is just as mysterious a concept as entropy. Feynman in his famous Lectures:
There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no known exception to this law—it is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens. It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is the same. [The Feynman Lectures on Physics]
Beautiful. Can we therefore say almost exactly the same thing about entropy and/or information? How would we phrase it? With regard to entropy: It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is either the same, or has increased. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not decrease when something happens.
Mung
May 15, 2011, 11:07 AM PDT
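Put side by side in the barest textbook form, the two bookkeeping statements being compared read:

```latex
\Delta E_{\text{isolated}} = 0 \qquad \text{(first law: the number does not change)}

\Delta S_{\text{isolated}} \ge 0 \qquad \text{(second law: the number does not decrease)}
```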
ME:
Yet you cannot even begin to explain why the statement is true, probably don’t even know what it means, and yet you continue to post it anyways.
bornagain77:
pretty mean spirited mung!
I want you to go back through your posts and look at the comments you've made to me and about me and what I think to be the case. I don't think what I wrote was at all mean spirited. I am asking you to demonstrate that you understand the material you're quoting. I also don't think you're following along with what I've said, both in this thread and in others that explore entropy and information. If anything, I've been agreeing with Lewis and am trying to make the same case. But it's important to me to understand why. You mention "Landauer’s principle." In another thread I pretty much made the same point without even having heard of it. But then you post another quote that seems to directly contradict the previous one.
But a new study shows that, theoretically, information can be erased without using any energy at all.
Just sayin'
Mung
May 15, 2011, 10:53 AM PDT
...even when you are wrong.
lol. Like I wrote elsewhere, if wrong were exercise I'd be slim and trim. The point is that we're all learning. Would it even be possible to learn without right and wrong? Hmm... Perhaps a topic for another thread.
Mung
May 15, 2011, 10:14 AM PDT
DrBOT states; 'Evolving machine code will always be slightly easier than evolving a high level language like C because there is no compiler enforcing strict syntactic rules.' And DrBOT, exactly where is the compiler enforcing strict syntactic rules to explain the 'high level language' in DNA that is far, far more advanced than any computer code man has ever written????; ------- "The manuals needed for building the entire space shuttle and all its components and all its support systems would be truly enormous! Yet the specified complexity (information) of even the simplest form of life - a bacterium - is arguably as great as that of the space shuttle." J.C. Sanford - Geneticist - Genetic Entropy and the Mystery Of the Genome 'The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopedia Britannica." Carl Sagan, "Life" in Encyclopedia Britannica: Macropaedia (1974 ed.), pp. 893-894 Also of interest is that a cell apparently seems to be successfully designed along the very stringent guidelines laid out by Landauer's principle of 'reversible computation' in order to achieve such amazing energy efficiency, something man has yet to accomplish in any meaningful way for computers: Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon - Charles H. Bennett Excerpt: Of course, in practice, almost all data processing is done on macroscopic apparatus, dissipating macroscopic amounts of energy far in excess of what would be required by Landauer’s principle. Nevertheless, some stages of biomolecular information processing, such as transcription of DNA to RNA, appear to be accomplished by chemical reactions that are reversible not only in principle but in practice.,,,, http://www.hep.princeton.edu/~mcdonald/examples/QM/bennett_shpmp_34_501_03.pdf The Coding Found In DNA Surpasses Man's Ability To Code - Stephen Meyer - video http://www.metacafe.com/watch/4050638 DNA - Evolution Vs. Polyfuctionality - video http://www.metacafe.com/watch/4614519/ DNA - Poly-Functional Complexity equals Poly-Constrained Complexity http://docs.google.com/Doc?docid=0AYmaSrBPNEmGZGM4ejY3d3pfMjdoZmd2emZncQ etc.. etc.. The Capabilities of Chaos and Complexity - David L. Abel - 2009 Excerpt: "A monstrous ravine runs through presumed objective reality. It is the great divide between physicality and formalism. On the one side of this Grand Canyon lies everything that can be explained by the chance and necessity of physicodynamics. On the other side lies those phenomena than can only be explained by formal choice contingency and decision theory—the ability to choose with intent what aspects of ontological being will be preferred, pursued, selected, rearranged, integrated, organized, preserved, and used. Physical dynamics includes spontaneous non linear phenomena, but not our formal applied-science called “non linear dynamics”(i.e. language,information). http://www.mdpi.com/1422-0067/10/1/247/pdfbornagain77
May 15, 2011, 10:08 AM PDT
The problem is not with the analogy to natural systems, it is the character of the search space which the programme is designed to explore.
What are the characteristics of the search spaces that biological evolution explores? There certainly seem to be plenty of viable solutions around, and many smooth gradients between each solution. Take a look at ring species for an example: http://en.wikipedia.org/wiki/Ring_species
The discrete search therefore comes to approximate a continuous gradient/steepest descent problem in continuous variables where the final answer “falls out” like water running downhill.
I'm not sure that is strictly correct. Certainly GAs do not work for every search space, but they can be very effective in large spaces with few fitness peaks, much more effective than gradient descent.
Try evolving the code in any normal programme and you won’t get that behaviour.
Do you know why? Like I said, GAs do not work in any search space or on any system. There is a term, 'evolvability', that can be used to describe different systems. Not all systems are evolvable, and the topology of the fitness landscape, for example the degree of neutrality, plays a major part. Evolving machine code will always be slightly easier than evolving a high level language like C because there is no compiler enforcing strict syntactic rules.
DrBot
May 15, 2011, 09:32 AM PDT
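A toy sketch of the landscape-topology point above (an illustration, not DrBot's own code): the same mutate-and-select loop that quickly climbs a smooth "count the matching bits" landscape stalls on a needle-in-a-haystack landscape that is flat everywhere except at the target, because selection has no gradient to follow.

```python
# Sketch (an illustration only): one hill climber, two fitness landscapes.
# "smooth" rewards partial matches, so selection has a gradient to climb;
# "needle" is flat except at the exact target, so selection is blind and the
# search degenerates into a random walk.  Evolvability depends on topology.
import random

L = 24
TARGET = tuple(random.randint(0, 1) for _ in range(L))

def smooth_fitness(x):
    return sum(a == b for a, b in zip(x, TARGET))   # gradient toward target

def needle_fitness(x):
    return 1 if tuple(x) == TARGET else 0           # no gradient at all

def evolve(fitness, max_steps=200_000):
    x = [random.randint(0, 1) for _ in range(L)]
    for step in range(1, max_steps + 1):
        y = x[:]
        y[random.randrange(L)] ^= 1                 # flip one random bit
        if fitness(y) >= fitness(x):                # keep it if no worse
            x = y
        if tuple(x) == TARGET:
            return step
    return None                                     # gave up

print("smooth landscape:", evolve(smooth_fitness), "steps")
print("needle landscape:", evolve(needle_fitness), "steps (None = not found)")
```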
DrBot: I go away for a day and there's a ton of new comments. I did make a quick mention about self-evolving programmes and genetic algorithms. The problem is not with the analogy to natural systems, it is the character of the search space which the programme is designed to explore. It succeeds because the number of viable solutions is large enough compared to the total possible to make the search for better solutions possible in a reasonable time. The discrete search therefore comes to approximate a continuous gradient/steepest descent problem in continuous variables where the final answer "falls out" like water running downhill. No intelligence required, except perhaps for some short hops to avoid local minima. Try evolving the code in any normal programme and you won't get that behaviour.
SCheesman
May 15, 2011, 06:30 AM PDT
Mung, this may be of help:

Landauer's principle. Of note: "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase ,,, Specifically, each bit of lost information will lead to the release of an (specific) amount (at least kT ln 2) of heat.,,,"
http://en.wikipedia.org/wiki/Landauer%27s_principle

Also of technical interest: Scientists show how to erase information without using energy - January 2011. Excerpt: "Until now, scientists have thought that the process of erasing information requires energy. But a new study shows that, theoretically, information can be erased without using any energy at all. Instead, the cost of erasure can be paid in terms of another conserved quantity, such as spin angular momentum.,,, 'Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it is physical has a broader context than that.', Vaccaro explained."
http://www.physorg.com/news/2011-01-scientists-erase-energy.html

i.e. "Gain in entropy always means loss of information, and nothing more." Gilbert Newton Lewis
bornagain77
May 15, 2011, 03:05 AM PDT
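For scale, the kT ln 2 bound quoted above is a very small amount of energy; a quick back-of-envelope at roughly room temperature:

```python
# Back-of-envelope: the minimum heat released when one bit is irreversibly
# erased, E = k_B * T * ln 2, per Landauer's principle as quoted above.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # roughly room temperature, K

e_per_bit = K_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_per_bit:.2e} J per bit erased")
# ~2.9e-21 J per bit -- far below what practical logic circuits dissipate
# per bit operation, which is why the bound is of theoretical rather than
# engineering significance today.
```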
Mung:
Can we replace entropy with information-theoretic terminology?
For some things, and that is done. For other things, it is sensible to retain both for different contexts. For instance, there is a reason why the classical form of the 2nd law of thermodynamics is stated in terms of entropy, temperature and heat flow, in the key Clausius statement. Namely, in an isolated system where we have heat flow d'Q between hot body A and colder body B:

dS ≥ d'Q/T

(The premises of classical thermodynamics can be grounded in stat mech and/or kinetic theory, but that is not about to relegate classical thermo-D to the ash heap of history. No more than is the rise of relativity and quantum theory about to push Newtonian dynamics off the stage. Both are simply far too useful and helpful in their own spheres.)

If you will look at the statement of Newton's three laws of motion, for a comparison, the second is really F = dP/dt, and the first law is directly implied: F = 0 => dP/dt = 0. But no-one is calling for dismissal of NL1. The significance and utility of understanding that inertia implies that an undisturbed body will keep its present velocity: speed and direction of motion -- including the special case of being at rest relative to an onlooker in an inertial frame of reference -- is enough for it to need stating.

Entropy is quantitatively and conceptually related to information, but that does not make it simply reducible to information. Ben-Naim is probably best understood as making a rhetorical point on the -- until very recently quite controversial -- informational view of thermodynamics. Indeed, when it comes to that, in the end energy is just as mysterious a concept as entropy. Feynman in his famous Lectures:
There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no known exception to this law—it is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens. It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is the same. [The Feynman Lectures on Physics]
GEM of TKI
PS: When a system is opened up and energy is dumped in, there is a tendency for the entropy to INCREASE. That BTW is why the typical "open systems" answer to the issue of entropy vs OOL and OO body plans is misdirected. Which is the point Prof. Sewell has been underscoring.
kairosfocus
May 14, 2011, 09:20 PM PDT
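Written out in standard notation, the two comparisons drawn in the comment above are:

```latex
dS \ge \frac{\delta Q}{T}
\qquad \text{(Clausius inequality; equality holds for a reversible transfer)}

\vec{F} = \frac{d\vec{p}}{dt},
\qquad \vec{F} = 0 \;\Rightarrow\; \frac{d\vec{p}}{dt} = 0
\quad \text{(second law of motion, with the first law as the special case)}
```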
Mung, Thanks for the levity and great discussion. Your contributions are greatly appreciated, inspire thoughtful consideration, and spice things up, even when you are wrong. :-)
GilDodgen
May 14, 2011, 08:58 PM PDT
GilDodgen: "... As Granville puts it, the fact that a system is open doesn't mean that the highly improbable automatically becomes probable. To put it in simple terms, shining light on dirt doesn't increase the probability that dirt will spontaneously generate life." Or, to put it in even simpler/blunter terms, setting off a nuclear device in a sleepy backwater town does not turn it into a bustling metropolis.
Ilion
May 14, 2011, 08:26 PM PDT
What chance does a dog have against a cougar?
Chiquita the chihuahua takes on a cougar with surprising results.
Mung
May 14, 2011, 07:49 PM PDT
Sorry, I'm a Huskies fan.
Them's fightin' words. What chance does a dog have against a cougar?
GilDodgen
May 14, 2011, 07:37 PM PDT
Mung, you state: 'Yet you cannot even begin to explain why the statement is true, probably don't even know what it means, and yet you continue to post it anyways.'
Pretty mean spirited, Mung! Since I take Gilbert Newton Lewis's word over yours in this matter, why should I not post it? You have done nothing but flail about in trying to disprove the reality of entropy and its relation to information. Perhaps I do, perhaps I don't have a basic understanding of what he means by the statement, but why should I expend the effort to elucidate to someone who is unwilling to pick up the very basics in the first place??? What exactly is my payoff for enduring such unreasonableness???
bornagain77
May 14, 2011, 07:16 PM PDT
“Gain in entropy always means loss of information, and nothing more.”
Can we replace entropy with information-theoretic terminology?
Mung since you don’t believe in Entropy.
Well, hey, maybe I will eventually come to not believe in entropy.
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term “entropy” with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the “driving force” of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.
A FAREWELL TO ENTROPY
Mung
May 14, 2011, 07:11 PM PDT
Mung, I find this statement,,,; “Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis ,,,To be a take home statement, written by a man that very well could be without peer on the subject.
Yet you cannot even begin to explain why the statement is true, probably don't even know what it means, and yet you continue to post it anyways. Can you even tell me what kind of entropy he was talking about? Can you show how and why it applies to "genetic entropy" (whatever that is)? As the office of Hugh Ross is becoming more disordered, who or what is losing information, and why? If Ross straightens up his office does he gain information, or lose it?
Mung, the intelligent design objection to Natural Selection is that it always reduces information. It never creates information. ,,, Information is the whole key.
It does that by increasing entropy?
Mung, If you are not denying the reality of entropy, in the preceding statements, then please tell me exactly what in blue blazes you are doing???
Seeking clarity.
Mung
May 14, 2011, 07:00 PM PDT
Sorry, I'm a Huskies fan. I don't suppose you'd happen to have a pdf copy of his The Symmetry of Time In Physics. I'd sure like to know the context of his quote.
Mung
May 14, 2011, 06:47 PM PDT
“Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis
My father named me after Gilbert Newton Lewis, with whom my father worked on the Manhattan A-bomb project.
Gilbert
GilDodgen
May 14, 2011, 05:47 PM PDT
Mung, I find this statement,,,; “Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis ,,,To be a take home statement, written by a man that very well could be without peer on the subject. A statement that always applies in whatever field of physical science that you may happen to be in.,,, GilDodgen has given you some very clear examples to help you, and the videos I listed by Perry Marshall are very helpful as well.
bornagain77
May 14, 2011, 05:44 PM PDT
bornagain77:
“Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis
Why? And if that's all it means, and nothing more, then it follows that it doesn't mean what people think it means.
Mung
May 14, 2011, 04:49 PM PDT
To put it in simple terms, shining light on dirt doesn’t increase the probability that dirt will spontaneously generate life.
I have Sewell's In The Beginning so I have a good idea what you are talking about and do not dispute that.
Mung
May 14, 2011, 04:44 PM PDT
Is Entropy a Measure of "Disorder"? Let us dispense with at least one popular myth: "Entropy is disorder" is a common enough assertion, but commonality does not make it right. Entropy is not "disorder", although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see Insight into entropy by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased "order", quite impossible in the entropy is disorder worldview. And also keep in mind that "order" is a subjective term, and as such it is subject to the whims of interpretation. This too mitigates against the idea that entropy and "disorder" are always the same, a fact well illustrated by Canadian physicist Doug Craigen, in his online essay "Entropy, God and Evolution".
Entropy is also sometimes confused with complexity, the idea being that a more complex system must have a higher entropy. In fact, that is in all likelihood the opposite of reality. A system in a highly complex state is probably far from equilibrium and in a low entropy (improbable) state, where the equilibrium state would be simpler, less complex, and higher entropy.
http://www.tim-thompson.com/entropy1.html#what
Mung
May 14, 2011, 04:35 PM PDT
Mung: Don't we have a physicist who posts here at times?
You are probably referring to Granville Sewell, who is a professor of mathematics and a fellow of the Evolutionary Informatics Lab, along with yours truly.
I'm not going to watch an hour long video. What does he say that's relevant to this discussion and where in the video does he say it?
The video (try starting at 44:00) makes, in slightly different terms, the point Granville has made: in an open system (like the earth, which has free energy available from the sun) the Second Law cannot automatically be overcome simply as a result of the availability of free energy. That energy must be directed and harnessed to produce a local entropy decrease. As Granville puts it, the fact that a system is open doesn't mean that the highly improbable automatically becomes probable. To put it in simple terms, shining light on dirt doesn't increase the probability that dirt will spontaneously generate life.
GilDodgen
May 14, 2011, 04:32 PM PDT
bornagain77:
Mung, Entropy is the tendency of things to decay in this universe, and is considered by many to be the MOST irrefutable law of science
Do you even read the material you cut and paste? How do you get that from what Eddington wrote, or from any of the other quotes you posted? Eddington writes that entropy is a measure. So does Ross. From the Penrose video (0:07):
that's a slightly misleading way of looking at it
From the Penrose video (2:55):
...with regard to the matter it (the initial state) was maximum entropy..
Ross:
Entropy measures the amount of decay or disorganization in a system as the system moves continually from order to chaos.
Ross is just wrong. No wonder people are confused. He got "is a measure" right, but entropy is not a measure of the amount of decay or disorganization in a system as the system moves continually from order to chaos. Ross again:
Physical life is possible because universal entropy increases. All organisms take advantage of increasing entropy to run metabolic reactions, such as digestion and energy production. Work is possible because of the universe’s increasing entropy. And because work is possible, human creative activity is possible. With this miraculous creation in mind, our family spends some time on Thanksgiving Day expressing gratefulness to God for making the universe as entropic as He did.
On this I agree with Ross. But following his line of argument, without death and decay life is not possible. That's not quite the view of a lot of fundamentalist Christians who think death and decay did not enter the world until after the fall. More Ross:
The entropy measure of the universe is important for several other reasons. It determines which features of the universe are reversible and which are not.
More sloppy language. Entropy tells us whether a process is reversible. It does not determine it.
Mung
May 14, 2011, 04:21 PM PDT
