
Evidence of Decay is Evidence of Progress?


It’s called entropy, and it applies to everything. If you’re a pianist and don’t practice on a regular basis you don’t stay the same, you get worse, and it takes extra discipline, effort, and dedication to get better.

Natural selection is a buffer against decay that operates constantly in nature. It throws out bad stuff in a competitive environment, but it has no creative powers. Since decay is the norm, and random errors, statistically speaking, essentially always result in decay, a creature living underground will tend to lose its eyes: the informational cost of producing eyes is high, and in the dark there is no selective penalty for the errors that degrade them.

Thus, a crippled, decayed creature in a pathologically hostile environment will have a survival advantage. This is devolution, not evolution.

This phenomenon is not only logically obvious; Michael Behe and others have also demonstrated that it holds empirically.

Belief in the infinitely creative powers of natural selection is illogical, empirically falsified, and essentially represents, in my view, a cult-like mindset.

When evidence of decay is presented as evidence of progress, one must wonder what is going on in the minds of such people.

Comments
Sorry it's taken me a while to reply to your comment, Charles.
Gordon Davisson:
The comparison was entirely within scenario #2, where the high bit went from being 0 with 100% probability, to being 0 with 50% probability and 1 with 50% probability. The multiplicity and entropy are based on the probabilities of various states, not what the encoding allows.
Charles:
The inconsistency is that there never was, by your definition, any information in the hi-order bit. All the information is in the lo-order 7 bits. That information has a multiplicity of 2: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0
Huh? I'm really not sure what you mean here. Let me run the math on a (very simple) example.

Suppose that at a certain point in the English message, if you look at the message up to that point, you can make a pretty good guess at what the next letter will be (based on syntax, grammar, and semantics). Specifically, suppose you can estimate that the next letter has an 80% chance of being an "e", a 10% chance of being an "a", and a 5% chance each of being "t" or "o". As a result, before randomization of the high bit, the byte that encodes that next letter has an 80% chance of being (in hexadecimal) 65 (the ASCII code for "e"), a 10% chance of being 61 (ASCII "a"), and a 5% chance each of being 74 (ASCII "t") or 6f (ASCII "o"). If you plug these into the formula for Shannon-entropy, H in bits = sum over each possibility i of -P(i)*log2(P(i)) = -0.8*log2(0.8) - 0.1*log2(0.1) - 0.05*log2(0.05) - 0.05*log2(0.05) = 0.2575 + 0.3322 + 0.2161 + 0.2161 = 1.0219, so the initial entropy of this byte (given the ones before it) is a hair over 1 bit.

Now, take a look at the probabilities after the high bit is randomized: the byte has a 40% chance each of being 65 or e5 (65 with the high bit set), a 5% chance each of being 61 or e1, and a 2.5% chance each of being 74, f4, 6f, or ef. Plug these probabilities into the same formula (I'll skip writing out the math this time), and you'll get 2.0219 bits of entropy.

BTW, while the example above is unrealistic (the probabilities were chosen to make calculations easy, not for realism), this is essentially how the 1 bit of entropy per letter figure was derived. If you read some text, stop at an arbitrary point, and try to guess the next letter, you'll sometimes have no good idea, and sometimes be able to guess the entire next word or phrase. 1 bit per letter is a rough overall average.
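For anyone who wants to check that arithmetic, here is a minimal Python sketch (the probabilities are the made-up illustrative ones above, not measured values):

from math import log2

def shannon_bits(probs):
    # H = sum over outcomes of -P * log2(P)
    return sum(-p * log2(p) for p in probs if p > 0)

# illustrative next-letter probabilities from the example: "e", "a", "t", "o"
before = [0.80, 0.10, 0.05, 0.05]
# randomizing the high bit splits each outcome into two equally likely bytes
after = [p / 2 for p in before for _ in range(2)]

print(shannon_bits(before))  # ~1.0219 bits
print(shannon_bits(after))   # ~2.0219 bits (exactly 1 bit more)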
You can’t reasonably argue no change of information on the lo-order 7-bits while arguing an entropy change on all 8-bits. Compare before and after information and entropy on only 7-bits (apples) or compare before and after information and entropy on all 8-bits (oranges), but don’t conflate the two comparisons as disproving Lewis.
I'm comparing both on all 8 bits. All of the meaningful information happens to be concentrated in the low 7 bits, with the 8th bit being meaningless (both before and after randomization). But the entire byte corresponds to the same English letter both before and after randomization. BTW, I can give you lots more examples like this -- anytime you have a redundant encoding that can tolerate some errors (Hamming codes, Reed-Solomon codes, etc.), errors within its correction limits will increase Shannon-entropy without damaging any of the encoded information.
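A toy sketch of that point (the message string is just a placeholder): randomize the 8th bit of every byte of some ASCII text, then recover the text exactly by masking the high bit off.

import random

msg = "HELLO WORLD"                      # any 7-bit ASCII text
noisy = bytes(b | (random.getrandbits(1) << 7) for b in msg.encode("ascii"))

# the 8th bit carried no meaning, so masking it off recovers the message exactly
recovered = bytes(b & 0x7F for b in noisy).decode("ascii")
assert recovered == msg
print(recovered)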
There’s only one way to encode a given English message, but there are many possible English messages.
This is precisely the inconsistency to be avoided. Compare the loss of information when losing a given message with losing another given message (apples) or the loss of information when losing all possible messages with all possible messages (oranges), but don’t conflate them.
There are a variety of ways to define "information"; one way that's particularly relevant here is that information corresponds to a choice of one particular option (e.g. a specific message) from a set of possibilities (e.g. the set of all possible messages). After randomizing the 8th bit (scenario #2), the author's choice of what to say can still be determined from the memory's contents (the information -- the author's choice -- is still there). After the memory is either entirely randomized (#1) or erased (#3), the author's choice cannot be determined from the memory's contents (the information is gone).
If you power a DRAM chip on and start reading from it without writing first, you will get some pattern of 0s and 1s out. It’s undefined only in the sense that nobody will make any particular promises about what that pattern will be;
Please. Without any promise as to what the pattern will be is as "undefined" as it gets. Undefined means undefined; no promises, no guarantees, no defined meaning.
And yet, promises or no, if you actually do the experiment it'll come out the same way time after time (as long as you left it off long enough). And anytime you have one outcome with probability 100%, its Shannon-entropy is going to be 0.
in practice, if the chip has been off long enough, it’ll be whatever the read/store circuitry treats the all-discharged state as.
And an “off” chip can’t be read. Its values are unknowable, it has no defined information. OTOH, as previously noted, if the memory is designed to retain value without power (e.g. bubble or magnetic) then its defined state doesn’t change without power and there is no loss of information, so that comparison remains moot.
I think it can be defined (the charge level on the capacitor exists even when you don't read it), but I have a feeling this may be one of those philosophical questions like "if a tree falls in the forest and nobody hears it, does it make a sound?" and we may just have to disagree. In any case, it doesn't matter to my argument: all I need is that after you turn it back on, it's in a predictable state and hence has lost its Shannon-entropy.
BTW, DRAM can drain even when powered on; each row must be read frequently to refresh its charges.
Indeed. All volatile memory has some designed-in "refresh" logic to ensure memory retains its "defined" values. It would be useless otherwise.
(This is irrelevant to the main discussion, I'm just being picky here.) Actually, the old-school DRAM I worked with had no built-in refresh logic; it had to be handled entirely by logic outside the DRAM chips themselves. Some modern chips have part of the logic onboard, but it (at least as I understand it) still needs to be externally triggered with a CAS before RAS cycle.
BTW, needing to add energy to get back to the original state does not confirm an entropy decrease. If anything, it does the opposite, as adding energy will generally increase entropy.
Adding energy is the only way to restore the previous “order”. Adding energy is the only way to move from a large multiplicity to a small multiplicity.
This is a very common misunderstanding (I found versions of it in both the hyperphysics pages you linked and the Motion Mountain book KF linked [near the bottom of page 299]). Nonetheless, it is a misunderstanding; in order to decrease entropy, you have to have heat or matter (or some more obscure things) leaving the system.

I think the reason this misunderstanding occurs is that machines that reduce entropy (like refrigerators) or stay at constant entropy despite doing things that produce entropy (like computers) need both a supply of nonthermal energy ("work" in thermo terminology) and a way to dispose of waste heat. Heat is generally easy to get rid of, but work is in limited supply, so we tend to concentrate on work as the critical element for keeping the machine operating. But it's actually the waste heat that's responsible for carrying away entropy.

To see this clearly, you need to consider situations where work input and heat output don't occur together. Take, for example, an inert object that's warmer than its surroundings. As it loses heat to its surroundings, its entropy will drop. If its surroundings are cold enough, it'll approach a temperature of absolute zero, and its entropy will also approach absolute zero. Now, take a situation where work is done on the system, but no heat flows: lifting a weight is a simple example. Its gravitational energy goes up, but there's no change in multiplicity or entropy.

Mind you, I'm talking about what's needed for an entropy decrease; that may or may not be the same as an increase in order. "Order" is not a well-defined term in thermodynamics, and our intuitive sense of "order" doesn't correspond well to any particular thermodynamic quantity.
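A rough numeric illustration of that distinction (the temperatures, heat, and weight are arbitrary illustrative values):

# entropy changes track heat flow, not work
T = 300.0                  # object temperature in kelvin (illustrative)
Q_out = 300.0              # heat leaving the object, in joules (illustrative)
dS_object = -Q_out / T     # ~ -1.0 J/K: losing heat lowers the object's entropy
                           # (approximate, for a small loss at nearly constant T)

m, g, h = 10.0, 9.81, 2.0  # lifting a 10 kg weight by 2 m: work done, no heat flow
work_in = m * g * h        # ~196 J of added gravitational energy
dS_lift = 0.0              # no change in multiplicity, so no entropy change
print(dS_object, work_in, dS_lift)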
Even in your “cold boot attack” example, after 5 minutes of power off there is no retrievable information. Regardless, in any/every case, energy must be input to retrieve whatever might be left from an undefined DRAM state, but no promises.
Computer memory requires an energy input to read data, store data, erase data, or even preserve existing data unchanged; surely not all of these operations decrease entropy? Gordon Davisson
Gordon Davisson:
The comparison was entirely within scenario #2, where the high bit went from being 0 with 100% probability, to being 0 with 50% probability and 1 with 50% probability. The multiplicity and entropy are based on the probabilities of various states, not what the encoding allows.
The inconsistency is that there never was, by your definition, any information in the hi-order bit. All the information is in the lo-order 7 bits. That information has a multiplicity of 2: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0. You can't reasonably argue no change of information on the lo-order 7-bits while arguing an entropy change on all 8-bits. Compare before and after information and entropy on only 7-bits (apples) or compare before and after information and entropy on all 8-bits (oranges), but don't conflate the two comparisons as disproving Lewis.
If you power a DRAM chip on and start reading from it without writing first, you will get some pattern of 0s and 1s out. It's undefined only in the sense that nobody will make any particular promises about what that pattern will be;
Please. Without any promise as to what the pattern will be is as "undefined" as it gets. Undefined means undefined; no promises, no guarantees, no defined meaning.
in practice, if the chip has been off long enough, it'll be whatever the read/store circuitry treats the all-discharged state as.
And an "off" chip can't be read. Its values are unknowable, it has no defined information. OTOH, as previously noted, if the memory is designed to retain value without power (e.g. bubble or magnetic) then its defined state doesn't change without power and there is no loss of information, so that comparison remains moot.
BTW, DRAM can drain even when powered on; each row must be read frequently to refresh its charges.
Indeed. All volatile memory has some designed-in "refresh" logic to ensure memory retains its "defined" values. It would be useless otherwise.
BTW, needing to add energy to get back to the original state does not confirm an entropy decrease. If anything, it does the opposite, as adding energy will generally increase entropy.
Adding energy is the only way to restore the previous "order". Adding energy is the only way to move from a large multiplicity to a small multiplicity.
Note further your response to
But further note there is no information entropy definable for powered off RAM.
It’s reasonably well-defined in terms of what would happen if the memory were powered back on and read (see the cold boot attack link above for examples).
Precisely. Energy must be input to reduce entropy, i.e. to restore order from disorder, to restore a defined state from an undefined state. Even in your "cold boot attack" example, after 5 minutes of power off there is no retrievable information. Regardless, in any/every case, energy must be input to retrieve whatever might be left from an undefined DRAM state, but no promises.
There's only one way to encode a given English message, but there are many possible English messages.
This is precisely the inconsistency to be avoided. Compare the loss of information when losing a given message with losing another given message (apples) or the loss of information when losing all possible messages with all possible messages (oranges), but don't conflate them. Charles
Charles:
There seem to be some inconsistencies in your definitions/comparisons: By your definition, the “information” is the 7-bit ASCII letter (you explicitly affirm this by the phrase “the information was all in the lower 7 bits, and is intact”). Agreed. But further, the information is actually in 2 states: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0
If I understand what you're getting at here, a small clarification may resolve the inconsistency: I was assuming that the initial state had 0s in the high (8th) bit of each byte (I just neglected to mention this assumption). The situation you're describing only occurs in scenario #2 after the high bit is randomized.
[In scenario #2] ... no information was lost by randomizing the hi-order bit, but you state “an increase in entropy with no information loss”. But, to what other entropy state are you comparing? [...] If the comparison is strictly within scenario 2, wherein the hi-order bit being randomized is compared with the hi-order bit not being randomized, there are regardless the same 2 valid informational states having a multiplicity of 2 (as noted above), which is the same entropy, *not* an increase.
The comparison was entirely within scenario #2, where the high bit went from being 0 with 100% probability, to being 0 with 50% probability and 1 with 50% probability. The multiplicity and entropy are based on the probabilities of various states, not what the encoding allows.
In “Scenario #3: The RAM is powered off, and all of the charges storing the 1 bits drain gradually away. The memory is now all 0s,” As a practical matter of the example chosen, the state “without power” is in digital systems actually an undefined condition. Powerless dynamic RAM has neither 1's nor 0's. [...] But draining power from dynamic memory causes a loss of information. Even erased memory that was all 0's loses its “erased” condition when the power is turned off, and must be reinitialized to all 0's when power is restored.
If you power a DRAM chip on and start reading from it without writing first, you will get some pattern of 0s and 1s out. It's undefined only in the sense that nobody will make any particular promises about what that pattern will be; in practice, if the chip has been off long enough, it'll be whatever the read/store circuitry treats the all-discharged state as.

I'll give you two authorities for this: first, my personal experience as a computer repairman. Some of the really early Macintosh models used a section of main memory (which was DRAM) as their video buffer. Normally, when the Mac was powered on, it'd initialize that memory before the picture tube warmed up enough to display its contents; but if something prevented this (dead CPU, bad ROM, stuck reset line...) the video circuitry would read from the uninitialized DRAM and you'd get a characteristic pattern of jagged black-and-white blocks (the exact pattern depended on what memory chips were installed, and how their read/store circuitry was laid out). Second, computer security researchers have recently been looking into recovering remnant data from powered-off DRAM (google keyphrase: "cold boot attack"). They have some nice images and video here showing an image of the Mona Lisa gradually draining to black and white bars showing exactly the phenomenon I'm talking about.

BTW, DRAM can drain even when powered on; each row must be read frequently to refresh its charges.
The thermodynamic entropy [S=klnW] of the ‘powered off’ RAM increased as its heat dissipates and its temperature drops. This is further confirmed by the amount of energy input required to restore RAM to its powered-on “erased” state.
Actually, when you turn it off and its temperature drops, its entropy will drop as well. As long as there's nothing else going on to complicate things, adding heat to a system always(*) increases its entropy, and removing heat always(*) decreases its entropy. In fact, that's how thermodynamic entropy is defined: for a reversible process, DeltaS (change in entropy) = DeltaQ/T (change in heat divided by absolute temperature). (* There is one sort-of exception: systems at negative absolute temperatures. But they're deeply weird and it's not clear they even strictly exist, so we should probably ignore them.)

BTW, needing to add energy to get back to the original state does not confirm an entropy decrease. If anything, it does the opposite, as adding energy will generally increase entropy. If you want to reverse an entropy increase you need something associated with an entropy flux out of the system: heat leaving the system, matter leaving the system, etc.
The information entropy [H=lnM] did not change because the information entropy of the single bit pattern comprising an ASCII letter is no different than the information entropy of the single bit pattern comprising all 0's; either can be encoded in only a single way. Note that M=1 (because there is only 1 way to encode all 0's or only 1 way to encode a specific ASCII letter) is the number of “Messages” (whose probability actually occurred) out of a 7-bit Message space (all possible ASCII letters).
There's only one way to encode a given English message, but there are many possible English messages. It's rather hard to determine exactly how many possible messages there are of a given length (and what their probabilities are), but the estimates I've seen put the Shannon-entropy of English at around 1 bit per letter. Since each message corresponds to a unique sequence of bytes of ASCII, the Shannon-entropy of the encoded message will be around 1 bit per byte. Put another way: the multiplicity of English messages 1000 characters long is around 2^1000; therefore the multiplicity of a 1000-byte block of memory containing an ASCII-encoded English message (with the high bits all 0) is also around 2^1000, and so its Shannon-entropy is log2(2^1000) bits = 1000 bits.
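A one-line version of that multiplicity-to-bits conversion (1 bit per letter is the rough estimate quoted above):

from math import log2

length = 1000                      # message length in characters
multiplicity = 2 ** length         # ~2^1000 plausible English messages (rough estimate)
entropy_bits = log2(multiplicity)  # back to 1000 bits, i.e. about 1 bit per ASCII byte
print(entropy_bits)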
But comparatively, the information entropy is insignificant relative to the thermodynamic entropy (because of the relatively huge number of semiconductor ‘microstates’ and Boltzmann’s constant).
...I'm in complete agreement here...
But further note there is no information entropy definable for powered off RAM.
It's reasonably well-defined in terms of what would happen if the memory were powered back on and read (see the cold boot attack link above for examples). If that's too hypothetical, you could define it in terms of the charge on the individual capacitors: if a cap is more than half charged, consider it a 1, less than half charged consider it a 0. Note that depending on where the read circuit's threshold is, these two definitions may be equivalent.
If however you intended to simply re-initialize or “erase” all bits to 0's, magically without an expenditure of energy input, there is still an inconsistency of definition. The original memory definition was 8-bit bytes, the hi-order bit being ignored and the lo-order 7 bits containing specific information (an ASCII letter). The multiplicity of the original ASCII letter state (ignoring the hi-order bit) is 1 (there is only a single bit pattern that comprises the ASCII letter). But the multiplicity of the all 0's state is also 1 (there is only a single bit pattern that comprises all 0's). To change from the specific ASCII letter state to the specific all 0's state involves no net change in multiplicity, hence no net change in information entropy, but the state change is irreversible in that the information state of the ASCII letter is irretrievably lost.
Again, there are many possible English messages, so the initial multiplicity is much higher than 1 and the initial entropy higher than 0.
Landauer argued:
… there is no thermodynamic objection to a logically reversible operation potentially being achieved in a physically reversible way in the system. It is only logically irreversible operations — for example, the erasing of a bit to a known state, or the merging of two computation paths — which must be accompanied by a corresponding entropy increase. When information is physical, all processing of its representations, i.e. generation, encoding, transmission, decoding and interpretation, are natural processes where entropy increases by consumption of free energy.[4]
So in fact, scenario 3 is essentially “erasure” and represents an irreversible loss of the ASCII letter information state but with no net change in information entropy because the multiplicity of the ASCII and erased states are the same, but a real world gain of thermodynamic entropy when RAM is powered off (as confirmed by energy input being required to restore prior state).
This is indeed erasure, but there is a decrease of information entropy (as I argued above). There need not be a gain of thermodynamic entropy. There must be an increase in thermal (or spin or...) entropy, but only enough to compensate for the loss of information entropy and avoid an overall (thermodynamic) entropy decrease. Here's how Landauer puts it (note that he assumes erasing to all 1s instead of all 0s):
Consider a statistical ensemble of bits in thermal equilibrium [i.e. half 0s and half 1s -GD]. If these are all reset to ONE, the number of states covered in the ensemble has been cut in half. The entropy therefore has been reduced by k log_e 2 = 0.6931 k per bit [this is a change in what I'm calling the information component of entropy -GD]. The entropy of a closed system, e.g., a computer with its own batteries, cannot decrease; hence this entropy must appear elsewhere as a heating effect, supplying 0.6931 kT per restored bit to the surroundings. This is, of course, a minimum heating effect, and our method of reasoning gives no guarantee that this minimum is in fact achievable.
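Putting numbers on that quote (a small sketch, assuming roughly room temperature; the constants are standard):

from math import log

k = 1.380649e-23                       # Boltzmann's constant, J/K
T = 300.0                              # roughly room temperature, K (assumed)

entropy_per_bit = k * log(2)           # ~9.57e-24 J/K -- the "0.6931 k" in the quote
min_heat_per_bit = entropy_per_bit * T # ~2.9e-21 J dissipated per erased bit, at minimum
print(entropy_per_bit, min_heat_per_bit)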
[I'll skip the sections of your summary I've dealt with above...]
In scenario 1 [complete randomization -GD], we are agreed on “an information loss linked to an increase in Shannon-entropy” but not as to a thermodynamic entropy decrease, as some energy input is required to effect a new (randomized) equilibrium. The system is not isolated, as energy (electric or magnetic) must be input to effect the state change.
I don't see any inherent requirement for an energy input here (nor any obstacle to this occurring in isolation). Quite the opposite: if you reverse the reasoning in the Landauer quote I gave above, you'll see that the increase in entropy inherent in increasing the number of states the memory could be in could, at least in theory, be coupled to a decrease in thermal entropy, in the form of some heat being converted to free energy. In the paper I cited by Charles Bennett, he actually describes a (theoretical) mechanism to randomize memory while converting heat to work. (His mechanism requires starting with completely blank memory, but it should in principle be generalizable to work with semi-predictable memory like something containing English text.)
Lewis noted: “Gain in entropy always means loss of information, and nothing more”. In scenarios 1 and 3 there was a loss of information and a related gain in thermodynamic entropy. In scenario 2, there was no loss of information, but neither was there a loss of thermodynamic entropy for the lo-order 7 bits comprising the ASCII letter. In all three scenarios, Lewis’s point has not been disproven.
I'll stand by my original claims, but add an additional one: none of the three scenarios necessarily involve an increase in overall thermodynamic entropy. In each case, the change in information entropy could (at least in theory) be coupled to an equal-and-opposite change in thermal entropy, leaving the total unchanged. In practice, of course, there'll always be an overall increase in entropy; but there's no particular limit on how small this increase could be. Gordon Davisson
Mung:
While it is true that thermodynamic entropy and thermal entropy are not the same thing, would you agree or disagree that entropy in both has the same meaning if entropy is understood as “degrees of freedom” in the system? Can we connect “degrees of freedom” with “multiplicity”, and can both be described by the same equation?
"Degrees of freedom" are ways of factoring a system's multiplicity. Hmm, that reads a lot like gibberish, right? Let me give an example that'll hopefully clarify: Suppose we have a billiard ball on a billiard table. There are a large number of possible locations the ball might be in. If we wanted to calculate the number of possible positions, we could figure out how many possibilities there are for the ball's position's X coordinate (the width of the table divided by some quantum limit) and similarly how many Y coordinates (length, divided by quantum limit) and multiply them together to get the total number of positions. The X and Y coordinates are examples of what I'm calling degrees of freedom. If there are Nx possible X coordinates and Ny possible Y coordinates (and a total of Nx*Ny possible positions), then then X-entropy will be k*ln(Nx), the Y-entropy will be k*ln(Ny), and the total entropy of the ball's position will be k*ln(Nx*Ny) = X-entropy + Y-entropy. We've broken the ball's entropy into horizontal and vertical components! There are a couple of possible complications I've ducked: it might be that not all coordinate combinations are possible (e.g. there are bumpers in the middle of the table, and the ball can't occupy the same space as them); in that case, there'll be fewer total possibilities than this calculation gives, and hence less total entropy. In that case, we'd say that the degrees of freedom are not independent, and so the calculation breaks down. The other complication is that some possibilities are more likely than others; this makes the math messier, but as long as the degrees of freedom are statistically independent, the total entropy will still be the sum of the X- and Y-entropies. Also, I've also only looked at the ball's position. The ball also might be in a large number of orientations, so if you want to know how many states the ball might be in you'd also have to figure out how many possible orientations the ball could have (No), and the multiply No into the multiplicity and add k*ln(No) into the entropy. And the ball might be moving, so there'll be a range of X- and Y-velocities the ball might have, as well as a range of rotational velocities. (Note that the rotational velocity tends to be at-least-mostly determined by the linear velocity, so it's not going to be an independent degree of freedom.) Again, calculate the number of possibilities, and it multiplies into the total multiplicity and its entropy adds to the total entropy. In general, one doesn't bother to track each degree of freedom individually (as I've mostly done above), but categorize them into convenient groups. One might, for example, group the configuration (position and orientation) degrees together, and and the motion (linear and rotational) together, and then it'd make sense to think of the configurational and motion entropies as contributions to the total entropy (although since they aren't properly independent, their total will be a little more than the total entropy). There's also an elephant in the room that I've been carefully ignoring: all of the degrees of freedom of the atoms within the ball. While each atom has a very limited range of positions and momenta it might have, there are a ridiculously huge number of them, so the entropy contribution from these microscopic degrees of freedom completely swamp the macroscopic contributions I've been talking about. One can try to categorize these degrees of freedom (e.g. 
configurational entropy has to do with how many ways the polymer strands could tangle together), but usually they're all too tightly linked and it's easier to just lump them all together as thermal entropy (because they mostly have to do with exactly how the system's thermal energy is scattered among the various atoms and their degrees of freedom). Is that a bit clearer?
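A tiny numeric sketch of the "multiplicities multiply, entropies add" point, with made-up counts of states:

from math import log

k = 1.380649e-23              # Boltzmann's constant, J/K

Nx, Ny = 10**6, 5**8          # made-up counts of distinguishable X and Y positions
S_x = k * log(Nx)             # "X-entropy"
S_y = k * log(Ny)             # "Y-entropy"
S_total = k * log(Nx * Ny)    # entropy of the combined position

# independent degrees of freedom: multiplicities multiply, entropies add
assert abs(S_total - (S_x + S_y)) < 1e-30
print(S_x, S_y, S_total)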
And is it possible, if there is an increase in a system’s thermal degrees of freedom or an increase in all of the system’s degrees of freedom, that there is a loss of something that can be associated with the concept of information? If there is an increase in freedom is there also a corresponding increase in uncertainty in some sense?
Usually, the degrees of freedom themselves don't change, but the multiplicity of states within them does. For example, if you add heat to a system, you get an entropy increase because you have more energy scattered among the same degrees of freedom (and hence more ways to scatter it).

There are ways to relate entropy to information, but they don't have a great deal to do with the sort of information most people think of. The thermodynamic entropy of a system is proportional to the amount of information needed to fully describe the exact microstate of that system. But this is a huge amount of uninteresting information (think about book after book cataloging the location of every atom...) -- there might be some interesting information mixed in, but it's going to be completely swamped by the thermal component. (Note that you can think of this either as positive information -- information inherent in the system's state -- or negative information -- information we don't have about the system's state. So it's a little hard to define what's an increase vs. what's a decrease.)

To give you some idea what I mean by "swamped", consider that the absolute entropy of 1 gram of water (= 1 cc = about a thimbleful) at 1 atmosphere of pressure and 25 degrees Celsius (= 77 Fahrenheit) is 3.88 Joules/Kelvin, which corresponds to 4.05e23 bits (= 50.7 zettabytes) of information. That's safely more than the total data processed by the world's servers last year. Heating that 1 gram of water by 1 degree Celsius (= 1.8 degrees Fahrenheit) would increase its entropy by 0.0140 Joules/Kelvin, which would correspond to 1.46e21 bits of additional information needed to specify its microstate. Cooling the water would decrease its entropy by about the same amount. But none of this corresponds to anything most people would recognize as information.... Gordon Davisson
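The unit conversion behind those figures, as a short Python sketch (the entropy values are the ones quoted above):

from math import log

k = 1.380649e-23               # Boltzmann's constant, J/K
S_water = 3.88                 # absolute entropy of 1 g of water at 25 C, J/K (figure above)

bits = S_water / (k * log(2))  # ~4.05e23 bits
zettabytes = bits / 8 / 1e21   # ~50.7 ZB
print(bits, zettabytes)

dS_heating = 0.0140                     # entropy added by heating that gram 1 C, J/K (figure above)
print(dS_heating / (k * log(2)))        # ~1.46e21 additional bits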
Mung: Why doesn’t Shannon Information require a measurement?
Because, by definition, Shannon's own definition to be precise, information is in units of binary digits (bits), and in digital systems (i.e. information systems) or their examples (e.g. coin toss, dice) we count the number of bits, which is how we take a measurement: we simply count. We can count bits because all we need to agree on is the definition of what determines the binary states (i.e. 1's and 0's, heads or tails, black or white, etc.) and then we count or compute. We don't need to "measure" with some instrument the value of either 1 or 0, or longer strings thereof.
But in analog systems, there is an infinitude of continuous values. In an analog context we are concerned with more than pure black or white (which was an arbitrary binary choice); rather we become concerned with all the shades of grey in between and what is their reflectivity, absorption, granularity, resolution, and because we can no longer simply count binary states we now have to resort to measurements with various instruments. In the analog world, one seldom finds either perfect extreme of black or white, but instead some mixture of grey, and measurement is the only way to know its state or value, and even then measurements will be imperfect. The answer to your question basically lies in the difference between analog and digital.
Accordingly, the answer to
How much information is there in a two-sided coin, and why?
is 1 bit, because as opposed to the monetary realm wherein a coin may have the values of 1 cent, 5 cents, 10 cents, 25 cents, 50 cents, 1 dollar, 20 dollars, 50 dollars, etc. (considering only US coins), in the digital realm, by definition, a coin only has two values (heads or tails) and those two values can be represented with a single bit having a value of either 1 or 0.
If we substitute a 6 sided die, now what is the information content, and why? What are the probability distributions, and how are they related to the information content?
I gave you a link earlier that showed this: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop2.html It answered many of your questions.
I agree that we need to consider the number of states, but just what, exactly, are we talking about? Microstates. Macrostates. Probabilities. Distributions?
In binary digital systems, a "state" is simply the value of the bit, byte, word, etc. For example, a bit can have either of 2 states: 1 or 0. A byte can be in any of 256 states because 8 bits can encode the values 0-255. A "macrostate" generally refers to a particular "combination" while a "microstate" generally refers to a "permutation". See again the above link.
We randomize 8 bits. That gives those 8 bits the maximum Shannon entropy. Why?
Because a randomized 8 bits can have any of 256 possible values (the "byte" can be in any of 256 possible states). The maximum Shannon entropy (or maximum information entropy) corresponds to the maximum possible number of states, which for an 8-bit byte is 256 states, i.e. log2(256) = 8 bits.
How is it that any physical system could increase or decrease in information unless we were asking a question or taking a measurement?
How is it that a tree can fall in the forest if no one is present to see it? Charles
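A quick check of those counts in Python (for equally likely states, information in bits is the base-2 log of the number of states):

from math import log2

print(log2(2))     # two-sided coin: 1 bit
print(log2(6))     # 6 sided die: ~2.585 bits
print(log2(256))   # fully randomized 8-bit byte: 8 bits, the maximum for a byte

# asking only "even or odd?" about the die is a 2-state question,
# so its answer carries just log2(2) = 1 bit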
Scenario #1: The contents of the RAM are randomized. Its Shannon-entropy rises to the maximum possible, 8 bits per byte. (This is very similar to what happens in Charles’ example with S-O-S encoded in cans.)
I love talking 1's and 0's. I can understand 1's and 0's. On/Off, Yes/No, True/False. Black/White. Two-sided coin. Heads/Tails. How much information is there in a two-sided coin, and why? If we substitute a 6 sided die, now what is the information content, and why? What are the probability distributions, and how are they related to the information content? Surely our uncertainty is lower with a two-sided coin. I mean, only two options, right? Heads or tails. And if our uncertainty is lower, is the information content higher? But is our uncertainty really higher with the 6-sided die? What if all we are interested in is even/odd? 2/4/6 vs 1/3/5?
By your definition, the “information” is the 7-bit ASCII letter (you explicitly affirm this by the phrase “the information was all in the lower 7 bits, and is intact”). Agreed. But further, the information is actually in 2 states: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0
I agree that we need to consider the number of states, but just what, exactly, are we talking about? Microstates. Macrostates. Probabilities. Distributions? We randomize 8 bits. That gives those 8 bits the maximum Shannon entropy. Why? Why doesn't Shannon Information require a measurement? How is it that any physical system could increase or decrease in information unless we were asking a question or taking a measurement? Mung
BA77:
But one thing I do note is that Landauer's principle is fairly rigorously established, and the technical violation of it, which I listed in 75, was only stated 'as possible', i.e. they did not actually violate the principle but only showed it to be possible! Thus, I'm very reserved to accept your conclusion.
I don't think any of my conclusions depend on the spin paper, and I'm not advocating a violation of Landauer's principle (other than maybe generalizing it to include spin entropy as well as thermal). Only the third of my scenarios involved erasure (in the strict sense where Landauer applies), and there I certainly agree there'll need to be a compensating entropy increase of some sort. Gordon Davisson
Charles, for someone just 'taking a stab at it', I would say that is a very admirable 'stab' at clarifying. Thanks once again! bornagain77
Gordon Davisson: I'll take a stab at clarifying the applicability of Lewis' point.
Between the three scenarios, we’ve got information loss with an increase in Shannon-entropy [scenario #1], an information loss with a decrease in entropy [scenario #3], and an increase in entropy with no information loss [scenario #2].
There seem to be some inconsistencies in your definitions/comparisons: By your definition, the "information" is the 7-bit ASCII letter (you explicitly affirm this by the phrase "the information was all in the lower 7 bits, and is intact"). Agreed. But further, the information is actually in 2 states: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0.
Between the three scenarios, we’ve got ... information loss with an increase in Shannon-entropy
This is "Scenario #1: The contents of the RAM are randomized." Memory is randomized, information is lost and multiplicity (entropy) is maximum (because of the 128 ways the lo-order 7-bits can be randomized, 127 do not retain the original ASCII letter).
... an increase in entropy with no information loss
This is "Scenario #2: Only the 8th bit of each byte of RAM is randomized ... But the Shannon-entropy has now risen to 2 bits per byte." Yes, you have a Shannon entropy of 2 because, as per your definitions, the high-order bit is not used in the ASCII letter, hence there are 2 valid states containing the ASCII letter (as noted above). So, agreed, no information was lost by randomizing the hi-order bit, but you state "an increase in entropy with no information loss". But, to what other entropy state are you comparing? In scenario 1 all 8-bits were randomized, representing the highest multiplicity (or entropy) possible, and by comparison of scenario 1 with scenario 2, scenario 2 has lower entropy than scenario 1. If the comparison is strictly within scenario 2, wherein the hi-order bit being randomized is compared with the hi-order bit not being randomized, there are regardless the same 2 valid informational states having a multiplicity of 2 (as noted above), which is the same entropy, *not* an increase.
... an information loss with a decrease in entropy, ...
In "Scenario #3: The RAM is powered off, and all of the charges storing the 1 bits drain gradually away. The memory is now all 0s," As a practical matter of the example chosen, the state "without power" is in digital systems actually an undefined condition. Powerless dynamic RAM has neither 1's nor 0's. If you're going to argue bubble or magnetic memory that doesn't need power to maintain state, then draining power is of no consequence because the bits won't change state (ignoring long-term decay) and no information is lost. But draining power from dynamic memory causes a loss of information. Even erased memory that was all 0's loses its "erased" condition when the power is turned off, and must be reinitialized to all 0's when power is restored.
The thermodynamic entropy [S=klnW] of the 'powered off' RAM increased as its heat dissipates and its temperature drops. This is further confirmed by the amount of energy input required to restore RAM to its powered-on "erased" state. The information entropy [H=lnM] did not change because the information entropy of the single bit pattern comprising an ASCII letter is no different than the information entropy of the single bit pattern comprising all 0's; either can be encoded in only a single way. Note that M=1 (because there is only 1 way to encode all 0's or only 1 way to encode a specific ASCII letter) is the number of "Messages" (whose probability actually occurred) out of a 7-bit Message space (all possible ASCII letters). But comparatively, the information entropy is insignificant relative to the thermodynamic entropy (because of the relatively huge number of semiconductor 'microstates' and Boltzmann's constant). But further note there is no information entropy definable for powered off RAM.
If however you intended to simply re-initialize or "erase" all bits to 0's, magically without an expenditure of energy input, there is still an inconsistency of definition. The original memory definition was 8-bit bytes, the hi-order bit being ignored and the lo-order 7 bits containing specific information (an ASCII letter). The multiplicity of the original ASCII letter state (ignoring the hi-order bit) is 1 (there is only a single bit pattern that comprises the ASCII letter). But the multiplicity of the all 0's state is also 1 (there is only a single bit pattern that comprises all 0's). To change from the specific ASCII letter state to the specific all 0's state involves no net change in multiplicity, hence no net change in information entropy, but the state change is irreversible in that the information state of the ASCII letter is irretrievably lost.
Landauer argued:
... there is no thermodynamic objection to a logically reversible operation potentially being achieved in a physically reversible way in the system. It is only logically irreversible operations — for example, the erasing of a bit to a known state, or the merging of two computation paths — which must be accompanied by a corresponding entropy increase. When information is physical, all processing of its representations, i.e. generation, encoding, transmission, decoding and interpretation, are natural processes where entropy increases by consumption of free energy.[4]
So in fact, scenario 3 is essentially "erasure" and represents an irreversible loss of the ASCII letter information state but with no net change in information entropy because the multiplicities of the ASCII and erased states are the same, but a real-world gain of thermodynamic entropy when RAM is powered off (as confirmed by energy input being required to restore the prior state).
So where are we now? In scenario #3, we have an information loss linked to a decrease in Shannon-entropy and an increase in thermal or spin or some other component of the thermodynamic entropy. In #1, we have an information loss linked to an increase in Shannon-entropy, and possibly a decrease in thermal, spin, etc entropy. In #2, we have no loss of information, but an increase in Shannon-entropy, and again possibly a decrease in thermal, spin, etc entropy. As before, I don’t see any direct link between information loss and any particular change in any particular type of entropy.
Summarizing my agreement and disagreement, and taking Shannon entropy to mean information entropy, then:
In scenario 3 we disagree, as there is both an irreversible loss of the ASCII information and also a gain of "all 0's" information (an equally probable albeit different logical state), but with an increase in thermodynamic entropy. I've not yet taken a position on whether that cost can be paid with spin.
In scenario 1, we are agreed on "an information loss linked to an increase in Shannon-entropy" but not as to a thermodynamic entropy decrease, as some energy input is required to effect a new (randomized) equilibrium. The system is not isolated, as energy (electric or magnetic) must be input to effect the state change.
In scenario 2, we are agreed there is no information loss, but disagree as neither is there an increase in Shannon entropy nor any possible decrease in thermodynamic entropy, as there was no change to the lo-order 7-bits encoding the ASCII letter, but some energy input is required to effect a new equilibrium (randomize only the hi-order bit), as again the system is not isolated and energy (electric or magnetic) must be input to effect the randomizing state change.
Lewis noted: "Gain in entropy always means loss of information, and nothing more". In scenarios 1 and 3 there was a loss of information and a related gain in thermodynamic entropy. In scenario 2, there was no loss of information, but neither was there a loss of thermodynamic entropy for the lo-order 7 bits comprising the ASCII letter. In all three scenarios, Lewis's point has not been disproven. Charles
Gordon Davisson @106:
(Note: there’s an opportunity for confusion here: thermodynamic entropy and thermal entropy are not the same thing. Thermal entropy is just the entropy in a system’s thermal degrees of freedom, while the thermodynamic entropy is the total entropy in all of the system’s degrees of freedom. Thermal entropy is one contribution to the thermodynamic entropy, but not the whole thing.)
Hi Gordon. Thanks for your posting. While it is true that thermodynamic entropy and thermal entropy are not the same thing, would you agree or disagree that entropy in both has the same meaning if entropy is understood as "degrees of freedom" in the system? Can we connect "degrees of freedom" with "multiplicity", and can both be described by the same equation? And is it possible, if there is an increase in a system’s thermal degrees of freedom or an increase in all of the system’s degrees of freedom, that there is a loss of something that can be associated with the concept of information? If there is an increase in freedom is there also a corresponding increase in uncertainty in some sense? Mung
Gordon this tidbit may interest you: Maxwell's demon demonstration turns information into energy - November 2010 Excerpt: Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html bornagain77
Gordon, not that I am all that well versed on the technicalities of Landauer's principle or even the technicalities of Maxwell's demon which accompanies it, and indeed I scarcely followed some of the technical points of your own argument. But one thing I do note is that Landauer's principle is fairly rigorously established, and the technical violation of it, which I listed in 75, was only stated 'as possible', i.e. they did not actually violate the principle but only showed it to be possible! Thus, I'm very reserved to accept your conclusion. bornagain77
“Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis
I think the relation between information and entropy is rather less direct than this. To illustrate, consider three scenarios. All will start with the same thing: some computer memory containing ASCII-encoded English text. English generally has around 1 bit of Shannon (information-theoretic) entropy per letter, so the Shannon-entropy of this computer memory starts at 1 bit per byte of RAM.

Scenario #1: The contents of the RAM are randomized. Its Shannon-entropy rises to the maximum possible, 8 bits per byte. (This is very similar to what happens in Charles' example with S-O-S encoded in cans.)

Scenario #2: Only the 8th bit of each byte of RAM is randomized. Since ASCII only uses 7 bits (as opposed to more modern encodings that use all 8 bits to allow more characters), the information was all in the lower 7 bits, and is intact. But the Shannon-entropy has now risen to 2 bits per byte.

Note that neither of the above is particularly realistic. Computer memory (specifically dynamic RAM) stores information as the presence or absence of an electric charge in a very small capacitor (I'll assume 1=charged, 0=uncharged, although it actually depends on the read/store circuitry). For the third scenario, let me look at what actually happens…

Scenario #3: The RAM is powered off, and all of the charges storing the 1 bits drain gradually away. The memory is now all 0s, so there's only one possible state, and the Shannon-entropy is zero. (Note: if the memory doesn't follow the uncharged=0 convention, the contents won't be zero, but there'll still be only one possible state, so the entropy will still be zero.)

Between the three scenarios, we've got information loss with an increase in Shannon-entropy, an information loss with a decrease in entropy, and an increase in entropy with no information loss. Based on this, I don't think you can make any direct correlation between information loss and entropy change.

Now, to make things more complicated, let me try adding thermodynamic entropy to the mix. According to theory (at least as I understand it), information stored in a physical system contributes to the thermodynamic entropy of the system at the rate of 1 bit of Shannon entropy = k (Boltzmann's constant) * ln(2) = 9.57e-24 Joule/Kelvin of thermodynamic entropy. This contribution is very small, and as far as I know it's never actually been measured; but I find the arguments for it convincing, so I'll assume it's real.

What does this do with the scenarios above? Well, in scenario #3, it means that the decrease in Shannon-entropy corresponds to a (very small) decrease in thermo-entropy, which (per the second law) can only occur if it's coupled to an at-least-as-large increase in some other entropy component, so thermo-entropy overall doesn't go down (this is Landauer's principle, as BA77 mentioned in #75). And in real life, that's certainly what happens: as the charge in the memory cells drains away, its energy is converted to heat, increasing the thermal component of entropy by E/T (that is, the drained energy divided by the temperature). As long as the energy per bit is much larger than kT (which it is), this'll outweigh the decrease due to the lost Shannon-entropy, and the second law is satisfied.

(Note: there's an opportunity for confusion here: thermodynamic entropy and thermal entropy are not the same thing. Thermal entropy is just the entropy in a system's thermal degrees of freedom, while the thermodynamic entropy is the total entropy in all of the system's degrees of freedom. Thermal entropy is one contribution to the thermodynamic entropy, but not the whole thing.)

If I understand the research BA77 linked in #75 (I haven't read the original paper), the entropy decrease could also be compensated by an increase in spin entropy. This isn't really all that surprising, as various forms of entropy are generally interconvertible, as long as the total doesn't decrease.

Now, in scenarios #1 and #2, there's an increase in Shannon-entropy; this could theoretically be coupled to a decrease in some other component of entropy (thermal, spin, whatever). Charles Bennett has proposed an information-powered heat engine that'd do more-or-less this, although it's never been built (see section 5 of Bennett, Charles H. (1982), "The thermodynamics of computation -- a review", _International Journal of Theoretical Physics_, v. 21, pp. 905-940).

So where are we now? In scenario #3, we have an information loss linked to a decrease in Shannon-entropy and an increase in thermal or spin or some other component of the thermodynamic entropy. In #1, we have an information loss linked to an increase in Shannon-entropy, and possibly a decrease in thermal, spin, etc entropy. In #2, we have no loss of information, but an increase in Shannon-entropy, and again possibly a decrease in thermal, spin, etc entropy. As before, I don't see any direct link between information loss and any particular change in any particular type of entropy. Gordon Davisson
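To put rough numbers on the three scenarios, here is a small Python sketch (the 1000-byte block size is arbitrary; the bits-per-byte figures are the ones given above):

from math import log

k = 1.380649e-23                       # Boltzmann's constant, J/K
n_bytes = 1000                         # arbitrary block size for illustration

scenarios = {
    "start (ASCII English text)": 1.0, # ~1 bit of Shannon-entropy per byte
    "#1 fully randomized":        8.0,
    "#2 high bits randomized":    2.0,
    "#3 drained to all 0s":       0.0,
}

for name, bits_per_byte in scenarios.items():
    bits = bits_per_byte * n_bytes
    thermo = bits * k * log(2)         # contribution to thermodynamic entropy, J/K
    print(f"{name}: {bits:.0f} bits (~{thermo:.2e} J/K)")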
Mung: Does [S = log W] then make sense? Yes. Though, more precisely, entropy is the natural log of multiplicity, and is also sometimes (historically) expressed as Boltzmann's constant times the natural log of multiplicity. Charles
If all that’s needed for an increase in entropy is an increase in multiplicity, why isn’t that alone sufficient to bring about a loss of information? It is.
So on the cover of A Farewell to Entropy is the following: S = log W Does that then make sense? Mung
Mung: It assumes that you take entropy to be a physical quantity.
Entropy is a physical quantity; whether derived from a system's heat and temperature, or from a system's multiplicity, it is a physical quantity.
What is the physical/material quantity that is being changed in your cans example
Their physical orientation. The physical directions in which they face.
Changing the orientation of the cans is an increase in entropy?
In the example I gave, yes. There is a very large "multiplicity" of orientations which have no informational meaning but only 1 orientation which means S-O-S. A change of state from the "multiplicity" of 1 meaning S-O-S to any of the very much larger "multiplicity" of orientation states having no informational meaning is an increase in entropy.
And if we change them back to spell out SOS again, is that a decrease in entropy?
Yes. And that requires energy input to the system.
You don’t think that’s a fruitful idea to explore? That there might be a relationship between multiplicity and information such that an increase in the one would count as a loss in the other?
Its fruit has been explored and even indexed by Google, however novel you may find it.
If all that’s needed for an increase in entropy is an increase in multiplicity, why isn’t that alone sufficient to bring about a loss of information?
It is. Charles
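A small sketch of that multiplicity argument, assuming (arbitrarily) that each can is distinguishable in 4 orientations:

from math import log2

orientations_per_can = 4           # assumed: each can distinguishable in 4 positions
cans = 9

W = orientations_per_can ** cans   # multiplicity of all arrangements: 4^9 = 262144
print(log2(W))                     # 18 bits of maximum entropy for the fence rail

# only 1 of those arrangements spells S-O-S, so the ordered state has multiplicity 1
# and entropy log2(1) = 0; a gust moves the cans toward the huge disordered region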
You had asked “how can a change in a physical/material quantity cause a loss of something non-physical/non-material?"
We're talking about entropy. How can an increase in entropy cause a loss of information. My question should be understood in that context. Here's what I wrote:
IOW, if entropy is physical, and information is not, how can a change in a physical/material quantity cause a loss of something non-physical/non-material?
It assumes that you take entropy to be a physical quantity. I don't think that a message written in the snow that disappears when the snow melts is what Lewis had in mind. Do you? What is the physical/material quantity that is being changed in your cans example and my snow example? Can it be shown to be equivalent to the change in entropy? Changing the orientation of the cans is an increase in entropy? And if we change them back to spell out SOS again, is that a decrease in entropy? Or do we also get information gain through increase in entropy? You wrote:
So an increase in multiplicity is an increase in entropy.
You don't think that's a fruitful idea to explore? That there might be a relationship between multiplicity and information such that an increase in the one would count as a loss in the other? You even go on to write:
...which state(s) for the purposes of encoding information have a commensurately greater disorder...
So you even make the relationship between encoding and multiplicity. In the cans and Morse code example, what would correspond to an increase in multiplicity? An increase in possible encodings? An increase in possible messages? Lewis (1930):
Gain in entropy always means loss of information, and nothing more.
Or to connect this with your words:
So an increase in multiplicity is an increase in entropy, which means loss of information, and nothing more.
If all that's needed for an increase in entropy is an increase in multiplicity, why isn't that alone sufficient to bring about a loss of information? Mung
Mung: "What you are talking about, essentially, seems to be noise entering a communication channel." No, I gave a clear example of how a change in a physical encoding medium can lose non-physical information. The original signal (S-O-S) was not obscured by noise; the signal itself (the cans) was disordered. The same disordering could be caused by you deliberately changing the orientation of the cans yourself. The mechanism by which the cans are disordered/re-ordered is irrelevant. The point remains that the information conveyed in their ordering/orientation is intangible, abstract and immaterial while the encoding medium is physical. You had asked "how can a change in a physical/material quantity cause a loss of something non-physical/non-material?", not how can noise (a physical phenomenon) obscure a signal (another physical phenomenon). Charles
P.S. Getting back to multiplicity. I think there is a direct correlation between the coin example I've offered and a physical system in which there is an increased number of microstates (from 8 to 16) consistent with a given macrostate. You now "know less" than you did before about which box contains the coin. Mung
Assume, hypothetically...
Hi Charles, I'm not sure how this relates to an increase in entropy. Assume, hypothetically, that I wrote HELP in the snow and the sun came out and melted the snow. (increase in entropy?). But then I wouldn't need help. :) To be sure Lewis (1930) wrote before Shannon (1948), but he also wrote after Hartley (1928). One wonders what he had in mind. What you are talking about, essentially, seems to be noise entering a communication channel. I don't see that as an increase in entropy in either a physical or a communications theory sense of the term. Mung
How much information would you require?
In my "coin in a box" example in @95 above, consider expressing how much information is required in terms of bits, i.e. as the number of yes/no questions you would need to have answered in order to locate the coin.
Information must not be confused with meaning.... To be sure, this word information in communication theory relates not so much to what you do say, as to what you could say. That is, information is a measure of one's freedom of choice when one selects a message. If one is confronted with a very elementary situation where he has to choose one of two alternative messages, then it is arbitrarily said that the information, associated with this situation, is unity.... The amount of information is defined, in the simplest cases, to be measured with the logarithm of the number of available choices. (Shannon and Weaver 1949, 8-9)
Mung
Mung: "IOW, if entropy is physical, and information is not, how can a change in a physical/material quantity cause a loss of something non-physical/non-material?" Assume, hypothetically, that your phone and power are out at your mountain cabin, in winter, your vehicle is stuck in snow drifts, and temps are sub-zero and will be for weeks. To passersby, you signal an "S-O-S" in Morse code (3 short, 3 long, 3 short) by putting 9 tin cans on a fence rail in front of your cabin: 3 cans horizontal end-on facing the road, 3 horizontal length-wise, and 3 more horizontal end-on facing the road. Obviously your call for help (information) encoded in the cans (physical medium) is only understood by those who know both Morse code and the meaning of S-O-S (an intangible mental agreement on meaning). Assume further that a wind gust disturbs the cans so they all randomly spin around such that some are at an angle and some have the opposite orientation they had previously. The S-O-S (information) is no longer encoded; it exists (metaphysically?) only in your mind (encoded in neurons, yes/no?) but all 9 cans (the medium) remain with every physical quantity and attribute (metallurgical composition, density, mass, reflectivity, tensile strength, static charge, etc.) exactly as before on the fence rail. Information was lost but the physical encoding medium remains, although its "state" has changed (from an ordered position to a disordered position), and energy is required to go back outside and twist them back into the proper, ordered S-O-S orientation, i.e. to re-encode the information as before. Charles
Mung: "Q3: As a system approaches absolute zero, does it move closer to equilibrium?" Wouldn't it be a truism that it does? Is it not the understanding that at absolute zero all molecular motion ceases; or, to put it another way, all change ceases. Would that not a description of be absolute equilibrium? "Q4: If we begin at absolute zero and add heat, does entropy increase or decrease?" Whence comes the heat? (i.e. entropy increases) Ilion
Note carefully that “information” is not the same as the “physical medium” in which it is encoded ... “Information” per se ... is an abstract intangible (not physical)
Hi Charles, thanks for the comments and links. Must information be encoded, or must it only be encoded if it is to be transferred to some physical medium? What do you think? Is entropy a physical quantity? IOW, if entropy is physical, and information is not, how can a change in a physical/material quantity cause a loss of something non-physical/non-material? (Maybe you already answered that and I just need to read your post again.)
... the physical states will move to equilibrium having greater “multiplicity” and greater entropy, which state(s) for the purposes of encoding information have a commensurately greater disorder, i.e. a loss of encoded information.
So an increase in multiplicity is an increase in entropy. But I don't think that it's "the increase in disorder" which causes a "loss of information," but rather the increase in multiplicity. Say I have 8 boxes and one coin, and the coin is in one of the boxes, and I ask you to determine which box contains the coin. The coin has the same probability of being in any one of the boxes. How much information would you require? Now say I double the number of boxes, but still one coin. Now how much information will you require to locate the coin? I'm sure you understand that this can all be expressed mathematically. I think this is a more general case, because (as far as I can tell) there is no issue of encoding. Mung
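A quick check of the box-counting arithmetic, as a minimal sketch (the box counts are the ones used in the example above; base-2 logarithms are the standard convention for measuring in bits):

    import math

    def bits_to_locate_coin(boxes):
        # Information needed to locate one coin hidden with equal probability
        # in one of `boxes` boxes: log2 of the number of equally likely cases,
        # i.e. the number of optimal yes/no (halving) questions.
        return math.log2(boxes)

    print(bits_to_locate_coin(8))    # 3.0 bits
    print(bits_to_locate_coin(16))   # 4.0 bits

Doubling the number of boxes adds exactly one bit: the multiplicity of possibilities went up, and the amount you would have to be told in order to locate the coin went up with it.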
The Second Law of Thermodynamics states that the entropy of an isolated system, such as the universe, that is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium. The Third Law of Thermodynamics states that as the temperature approaches absolute zero, the entropy of a system approaches a constant. Fortunately for us, the temperature of the universe is not zero. It is moving that way each moment, but it is not there yet. – ICR
Q3: As a system approaches absolute zero, does it move closer to equilibrium? Q4: If we begin at absolute zero and add heat, does entropy increase or decrease? Mung
For what it's worth, and to those interested in the topic of information, I am really enjoying The Information: A History, A Theory, A Flood Mung
Sorry to hear about that kf. Mung
bornagain77: "nice links and commentary on how deep entropy goes." Thanks... just trying to be helpful when I can. "And though I agree with your comment that 'classical information' is for the most part 'intangible', there has recently been discovered 'quantum information', on a massive scale, in molecular biology." "Quantum information" is simply information encoded in a quantum transmission "medium", albeit a poorly understood "medium" at present. In fact the quantum entanglement experiments, the theories for dark energy, and the universe seeming to be infinite and flat all suggest that we don't, in fact, know the "boundaries" of our universe well enough to declare it "isolated" for the purposes of applying the 2nd Law of Thermo. "The Second Law of Thermodynamics states that the entropy of an isolated system, such as the universe," Actually we don't know the universe is "isolated" in thermodynamic terms. We can pose "what if" scenarios, make various assumptions, propose different models and theories, and try experimentally to confirm them (which is what gave rise to the search for "dark energy" and the discovery of quantum entanglement, as examples), but as yet we don't factually know the universe is "isolated"; we certainly don't know where its boundaries are or how they work. The boundaries of thermodynamically "isolated" systems are well understood, quantifiable, and measurable. We can measure energy movement across or reflected at such boundaries. We know where the boundaries are and how they work. However, we cannot at present say the same about the "boundaries" of our universe, not at the quantum scale (as quantum entanglement and the quantum vacuum suggest) nor at the cosmological scale, as we can't even probe those limits as yet. We only know the universe seems to have zero (within 2%) curvature, or is "flat", but may have infinite expanse. None of which tells us about the nature of the universe's boundaries for thermodynamic purposes. Charles
Okay: Been busy doing a data recovery today, I guess a practical lesson on entropy. Just a note: when "erasure" is used it is actually resetting. Beyond that, I think the voltage and volume should be turned down G kairosfocus
Charles, nice links and commentary on how deep entropy goes. ,,, And though I agree with your comment that 'classical information' is for the most part 'intangible', there has recently been discovered 'quantum information', on a massive scale, in molecular biology. Information that has a lot more 'tangibility' to it than 'classical information. Quantum Information/Entanglement In DNA & Protein Folding - short video http://www.metacafe.com/watch/5936605/ Quantum entanglement holds together life’s blueprint - 2010 Excerpt: “If you didn’t have entanglement, then DNA would have a simple flat structure, and you would never get the twist that seems to be important to the functioning of DNA,” says team member Vlatko Vedral of the University of Oxford. http://neshealthblog.wordpress.com/2010/09/15/quantum-entanglement-holds-together-lifes-blueprint/ The relevance of continuous variable entanglement in DNA – June 21, 2010 Abstract: We consider a chain of harmonic oscillators with dipole-dipole interaction between nearest neighbours resulting in a van der Waals type bonding. The binding energies between entangled and classically correlated states are compared. We apply our model to DNA. By comparing our model with numerical simulations we conclude that entanglement may play a crucial role in explaining the stability of the DNA double helix. http://arxiv.org/abs/1006.4053v1 Quantum no-hiding theorem experimentally confirmed for first time - March 2011 Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed. http://www.physorg.com/news/2011-03-quantum-no-hiding-theorem-experimentally.html Moreover Charles, there is persuasive evidence that this quantum information is what is the main entity that is constraining living systems to be so far out of thermodynamic equilibrium; Information and entropy – top-down or bottom-up development in living systems? A.C. McINTOSH – May 2010 Excerpt: It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate. http://journals.witpress.com/journals.asp?iid=47 Does DNA Have Telepathic Properties?-A Galaxy Insight Excerpt: DNA has been found to have a bizarre ability to put itself together, even at a distance, when according to known science it shouldn't be able to. Explanation: None, at least not yet.,,, The recognition of similar sequences in DNA’s chemical subunits, occurs in a way unrecognized by science. There is no known reason why the DNA is able to combine the way it does, and from a current theoretical standpoint this feat should be chemically impossible. http://www.dailygalaxy.com/my_weblog/2009/04/does-dna-have-t.html Quantum states in proteins and protein assemblies: The essence of life? - Hameroff Excerpt: The unitary oneness and ineffability of living systems may depend on mesoscopic/macroscopic quantum states in protoplasm. http://www.tony5m17h.net/SHJTQprotein.pdf and this,,, The Unbearable Wholeness of Beings - Steve Talbott Excerpt: Virtually the same collection of molecules exists in the canine cells during the moments immediately before and after death. 
But after the fateful transition no one will any longer think of genes as being regulated, nor will anyone refer to normal or proper chromosome functioning. No molecules will be said to guide other molecules to specific targets, and no molecules will be carrying signals, which is just as well because there will be no structures recognizing signals. Code, information, and communication, in their biological sense, will have disappeared from the scientist’s vocabulary. http://www.thenewatlantis.com/publications/the-unbearable-wholeness-of-beings The ‘Fourth Dimension’ Of Living Systems https://docs.google.com/document/pub?id=1Gs_qvlM8-7bFwl9rZUB9vS6SZgLH17eOZdT4UbPoy0Y bornagain77
The Second Law of Thermodynamics states that the entropy of an isolated system, such as the universe, that is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium. The Third Law of Thermodynamics states that as the temperature approaches absolute zero, the entropy of a system approaches a constant. Fortunately for us, the temperature of the universe is not zero. It is moving that way each moment, but it is not there yet. - ICR
Q1: Is entropy highest or lowest at equilibrium? Why? Q2: Is entropy highest or lowest at absolute zero? Why? Mung
Mung, bornagain77: Ok, FWIW, I'll take a stab at clarifying how "entropy gain is information loss": Note carefully that "information" is not the same as the "physical medium" in which it is encoded: "Information" per se (e.g. 111100001111; a sequence of 4 on bits, 4 off bits, and 4 on bits) is an abstract intangible (not physical), but the medium in which the information is encoded is physical and tangible and can be transistors on a chip, voltages on a bus, cans on a fence rail, or nucleotides in a DNA strand. If the encoding system is closed or isolated, then even though the energy within is conserved, the physical states will move to equilibrium having greater "multiplicity" and greater entropy, which state(s) for the purposes of encoding information have a commensurately greater disorder, i.e. a loss of encoded information. To restore order or encode information requires energy input, which by definition requires an open system (not isolated) into which encoding energy can be input. From the Physorg link http://www.physorg.com/news/2011-01-scientists-erase-energy.html bornagain77 posted: For example, Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it is physical has a broader context than that.” A more precise statement would be that information is abstract yet always encoded in physical states, which physical states invariably require energy input to alter those physical states such that information is re-encoded (i.e. erased; erasure being a specific information state e.g. all 0's, or all 1's, or a pseudo-random bit pattern, etc.). Earlier from the same Physorg link: theoretically, information can be erased without using any energy at all. Instead, the cost of erasure can be paid in terms of another conserved quantity, such as spin angular momentum. Erasure (of information) still costs something, though I've not thought through (yet) how that cost can be paid with spin angular momentum. “Gain in entropy always means loss of information, and nothing more.” Because intangible information is encoded in physical mediums (e.g. molecular lattice in a crystal), when the physical medium becomes disordered (e.g. the crystal melts) then its entropy has increased even though the encoding material remains (albeit in a different state), but its encoded information (the lattice) is lost. The entropy (of the crystal) increased and information (the lattice) is lost. See as background on entropy and disorder: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entropcon.html http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop2.html http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/seclaw.html#c1 Charles
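One small numerical bridge between the informational and thermodynamic vocabularies, offered as a sketch rather than anything taken from the sources linked above: the thermodynamic entropy corresponding to one bit of missing information is k_B ln 2, with k_B the Boltzmann constant.

    import math

    k_B = 1.380649e-23                       # Boltzmann constant, J/K

    def thermodynamic_entropy(bits):
        # S = k_B * ln(2) * H, converting a Shannon entropy in bits to J/K
        return k_B * math.log(2) * bits

    print(thermodynamic_entropy(1))          # ~9.57e-24 J/K per bit

The number is tiny, which is why informational contributions to entropy are negligible in everyday thermodynamics yet become the whole story in discussions of erasure like Landauer's.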
DrBOT; this may be of interest to you; Evolutionary Algorithms: Are We There Yet? - Ann Gauger Excerpt: In the recent past, several papers have been published that claim to demonstrate that biological evolution can readily produce new genetic information, using as their evidence the ability of various evolutionary algorithms to find a specific target. This is a rather large claim.,,,,, As perhaps should be no surprise, the authors found that ev uses sources of active information (meaning information added to the search to improve its chances of success compared to a blind search) to help it find its target. Indeed, the algorithm is predisposed toward success because information about the search is built into its very structure. These same authors have previously reported on the hidden sources of information that allowed another evolutionary algorithm, AVIDA [3-5], to find its target. Once again, active information introduced by the structure of the algorithm was what allowed it to be successful. These results confirm that there is no free lunch for evolutionary algorithms. Active information is needed to guide any search that does better than a random walk. http://biologicinstitute.org/2010/12/17/evolutionary-algorithms-are-we-there-yet/ bornagain77
Mung you ask; So you agree your second source contradicted your first source? NO! bornagain77
Mung, I posted the second resource for a very technical reason having to do with how Landauer was trying to make information merely an emergent property of material particles.
So you agree your second source contradicted your first source? How was anyone reading your post supposed to understand your intent? So are you now arguing that information is non-physical? What about entropy, is it also non-physical?
Gain in entropy always means loss of information, and nothing more.
Why? How so? On the one hand we have a physical system. On the other hand we have something called information. How does entropy tie the two together?
None-the-less I'm glad you agree with the basic premise of the Landauer resource and am glad you 'finally' see how deeply the Lewis quote describes the relation between entropy and information.
Funny guy. Mung
wrt information. I wonder if this is the insight John von Neumann had when he suggested to Shannon that he should name his term entropy. (And no, I don't mean because no one knows what it is.) But as something mathematical, a certain kind of quantity. Mung
Mung, I posted the second resource for a very technical reason having to do with how Landauer was trying to make information merely an emergent property of material particles. One that I certainly do not want to go through with you. None-the-less I'm glad you agree with the basic premise of the Landauer resource and am glad you 'finally' see how deeply the Lewis quote describes the relation between entropy and information. bornagain77
Indeed, when it comes to that, in the end energy is just as mysterious a concept as entropy. Feynman in his famous Lectures:
There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no known exception to this law—it is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens. It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is the same. [The Feynman Lectures on Physics]
Beautiful. Can we therefore say almost exactly the same thing about entropy and/or information? How would we phrase it? wrt entropy: It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is either the same, or has increased. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not decrease when something happens. Mung
ME:
Yet you cannot even begin to explain why the statement is true, probably don’t even know what it means, and yet you continue to post it anyways.
bornagain77:
pretty mean spirited mung!
I want you to go back through your posts and look at the comments you've made to me and about me and what I think to be the case. I don't think what I wrote was at all mean spirited. I am asking you to demonstrate that you understand the material you're quoting. I also don't think you're following along with what I've said, both in this thread and in others that explore entropy and information. If anything, I've been agreeing with Lewis and am trying to make the same case. But it's important to me to understand why. You mention "Landauer’s principle." In another thread I pretty much made the same point without even having heard of it. But then you post another quote that seems to directly contradict the previous one.
But a new study shows that, theoretically, information can be erased without using any energy at all.
Just sayin' Mung
...even when you are wrong.
lol. Like I wrote elsewhere, if wrong were exercise I'd be slim and trim. The point is that we're all learning. Would it even be possible to learn without right and wrong? Hmm... Perhaps a topic for another thread. Mung
DrBOT states; 'Evolving machine code will always be slightly easier than evolving a high level language like C because there is no compiler enforcing strict syntactic rules.' And DrBOT, exactly where is the compiler enforcing strict syntactic rules to explain the 'high level language' in DNA that is far, far more advanced than any computer code man has ever written????; ------- "The manuals needed for building the entire space shuttle and all its components and all its support systems would be truly enormous! Yet the specified complexity (information) of even the simplest form of life - a bacterium - is arguably as great as that of the space shuttle." J.C. Sanford - Geneticist - Genetic Entropy and the Mystery Of the Genome 'The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopedia Britannica." Carl Sagan, "Life" in Encyclopedia Britannica: Macropaedia (1974 ed.), pp. 893-894 Also of interest is that a cell apparently seems to be successfully designed along the very stringent guidelines laid out by Landauer's principle of 'reversible computation' in order to achieve such amazing energy efficiency, something man has yet to accomplish in any meaningful way for computers: Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon - Charles H. Bennett Excerpt: Of course, in practice, almost all data processing is done on macroscopic apparatus, dissipating macroscopic amounts of energy far in excess of what would be required by Landauer’s principle. Nevertheless, some stages of biomolecular information processing, such as transcription of DNA to RNA, appear to be accomplished by chemical reactions that are reversible not only in principle but in practice.,,,, http://www.hep.princeton.edu/~mcdonald/examples/QM/bennett_shpmp_34_501_03.pdf The Coding Found In DNA Surpasses Man's Ability To Code - Stephen Meyer - video http://www.metacafe.com/watch/4050638 DNA - Evolution Vs. Polyfuctionality - video http://www.metacafe.com/watch/4614519/ DNA - Poly-Functional Complexity equals Poly-Constrained Complexity http://docs.google.com/Doc?docid=0AYmaSrBPNEmGZGM4ejY3d3pfMjdoZmd2emZncQ etc.. etc.. The Capabilities of Chaos and Complexity - David L. Abel - 2009 Excerpt: "A monstrous ravine runs through presumed objective reality. It is the great divide between physicality and formalism. On the one side of this Grand Canyon lies everything that can be explained by the chance and necessity of physicodynamics. On the other side lies those phenomena than can only be explained by formal choice contingency and decision theory—the ability to choose with intent what aspects of ontological being will be preferred, pursued, selected, rearranged, integrated, organized, preserved, and used. Physical dynamics includes spontaneous non linear phenomena, but not our formal applied-science called “non linear dynamics”(i.e. language,information). http://www.mdpi.com/1422-0067/10/1/247/pdf bornagain77
The problem is not with the analogy to natural systems, it is the character of the search space which the programme is designed to explore.
What are the characteristics of the search spaces that biological evolution explores? There certainly seem to be plenty of viable solutions around, and many smooth gradients between each solution. Take a look at ring species for an example: http://en.wikipedia.org/wiki/Ring_species
The discrete search therefore comes to approximate a continuous gradient/steepest descent problem in continuous variables where the final answer “falls out” like water running downhill.
I'm not sure that is strictly correct. Certainly GA's do not work for every search space but they can be very effective in large spaces with few fitness peaks - much more effective than gradient descent.
Try evolving the code in any normal programme and you won’t get that behaviour.
Do you know why? Like I said, GA's do not work in just any search space or on just any system. There is a term, 'evolvability', that can be used to describe different systems. Not all systems are evolvable, and the topology of the fitness landscape (for example, the degree of neutrality) plays a major part. Evolving machine code will always be slightly easier than evolving a high level language like C because there is no compiler enforcing strict syntactic rules. DrBot
DrBot: I go away for a day and there's a ton of new comments. I did make a quick mention about self-evolving programmes and genetic algorithms. The problem is not with the analogy to natural systems, it is the character of the search space which the programme is designed to explore. It succeeds because the number of viable solutions is large enough compared to the total possible to make the search for better solutions possible in a reasonable time. The discrete search therefore comes to approximate a continuous gradient/steepest descent problem in continuous variables where the final answer "falls out" like water running downhill. No intelligence required, except perhaps for some short hops to avoid local minima. Try evolving the code in any normal programme and you won't get that behaviour. SCheesman
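For readers who want to see what "random mutation hill climbing" looks like in the smallest possible form, here is a toy sketch; the one-max fitness function, genome length, and step count are arbitrary illustrative choices, not a model of any biological claim made in this thread:

    import random

    random.seed(1)
    GENOME_LENGTH = 40

    def fitness(genome):
        # Toy "one-max" landscape: fitness is simply the number of 1 bits.
        return sum(genome)

    genome = [0] * GENOME_LENGTH             # start from all zeros
    for step in range(2000):
        mutant = genome[:]
        i = random.randrange(GENOME_LENGTH)
        mutant[i] ^= 1                       # flip one randomly chosen bit
        if fitness(mutant) >= fitness(genome):
            genome = mutant                  # keep the change only if it is no worse

    print(fitness(genome), "of", GENOME_LENGTH)

On a smooth single-peak landscape like this one the climber reaches the top easily, which is the "water running downhill" behaviour described above; on a rugged or mostly flat landscape the same loop stalls, which is where the disagreement over evolvability and the character of the search space lies.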
Mung this may be of help; Landauer's principle Of Note: "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase ,,, Specifically, each bit of lost information will lead to the release of an (specific) amount (at least kT ln 2) of heat.,,, http://en.wikipedia.org/wiki/Landauer%27s_principle ============= also of technical interest; Scientists show how to erase information without using energy - January 2011 Excerpt: Until now, scientists have thought that the process of erasing information requires energy. But a new study shows that, theoretically, information can be erased without using any energy at all. Instead, the cost of erasure can be paid in terms of another conserved quantity, such as spin angular momentum.,,, "Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it is physical has a broader context than that.", Vaccaro explained. http://www.physorg.com/news/2011-01-scientists-erase-energy.html i.e. “Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis bornagain77
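For a sense of scale, the kT ln 2 figure from Landauer's principle works out to a very small amount of heat per erased bit; a back-of-the-envelope sketch (the 300 K room temperature is an assumed value):

    import math

    k_B = 1.380649e-23                     # Boltzmann constant, J/K
    T = 300.0                              # assumed room temperature, K

    heat_per_bit = k_B * T * math.log(2)   # minimum heat released per erased bit
    print(heat_per_bit)                    # ~2.87e-21 J

Practical electronics dissipate many orders of magnitude more than this per bit operation, which is why the limit matters in principle long before it matters in engineering.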
Mung:
Can we replace entropy with information-theoretic terminology?
For some things, and that is done. For other things, it is sensible to retain both for different contexts. For instance, there is a reason why the classical form of the 2nd law of thermodynamics is stated in terms of entropy, temperature and heat flow, in the key Clausius statement. Namely, in an isolated system where we have heat flow d'Q between hot body A and colder body B: dS >= d'Q/T (The premises of classical thermodynamics can be grounded in stat mech and/or kinetic theory, but that is not about to relegate classical thermo-D to the ash heap of history. No more than is the rise of relativity and quantum theory about to push Newtonian dynamics off the stage. Both are simply far too useful and helpful in their own spheres.) If you will look at the statement of Newton's three laws of motion, for a comparison, the second is really F = dP/dt, and the first law is directly implied: F = 0 => dP/dt = 0. But, no-one is calling for dismissal of NL1. The significance and utility of understanding that inertia implies that an undisturbed body will keep its present velocity -- speed and direction of motion, including the special case of being at rest relative to an onlooker in an inertial frame of reference -- is enough for it to need stating. Entropy is quantitatively and conceptually related to information, but that does not make it simply reducible to information. Ben-Naim is probably best understood as making a rhetorical point on the -- until very recently quite controversial -- informational view of thermodynamics. Indeed, when it comes to that, in the end energy is just as mysterious a concept as entropy. Feynman in his famous Lectures:
There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no known exception to this law—it is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens. It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is the same. [The Feynman Lectures on Physics]
GEM of TKI PS: When a system is opened up and energy is dumped in, there is a tendency for the entropy to INCREASE. That BTW is why the typical "open systems" answer to the issue of entropy vs OOL and OO body plans is misdirected. Which is the point Prof Sewell has been underscoring. kairosfocus
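To make the Clausius statement above concrete, a small worked sketch with arbitrary illustrative numbers: when heat d'Q leaves hot body A and enters colder body B, A's entropy falls by d'Q/T_A, B's rises by d'Q/T_B, and the net change for the isolated pair is positive.

    dQ = 100.0       # J of heat transferred (assumed value)
    T_A = 400.0      # K, hot body (assumed value)
    T_B = 300.0      # K, colder body (assumed value)

    dS_A = -dQ / T_A             # entropy change of the hot body
    dS_B = +dQ / T_B             # entropy change of the colder body
    dS_total = dS_A + dS_B       # net change for the isolated pair

    print(dS_A, dS_B, dS_total)  # -0.25, +0.333..., +0.083... J/K: a net increase

The inequality becomes an equality only in the reversible limit, where T_A and T_B approach one another.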
Mung, Thanks for the levity and great discussion. Your contributions are greatly appreciated, inspire thoughtful consideration, and spice things up, even when you are wrong. :-) GilDodgen
GilDodgen: "... As Granville puts it, the fact that a system is open doesn't mean that the highly improbable automatically becomes probable. To put it in simple terms, shining light on dirt doesn't increase the probability that dirt will spontaneously generate life." Or, to put it in even simpler/blunter terms, setting off a nuclear device in a sleepy backwater town does not turn it into a bustling metropolis. Ilion
What chance does a dog have against a cougar?
Chiquita the chihuahua takes on a cougar with surprising results.
Mung
Sorry, I’m a Huskies fan. Them's fightin' words. What chance does a dog have against a cougar? GilDodgen
Mung you state, 'Yet you cannot even begin to explain why the statement is true, probably don't even know what it means, and yet you continue to post it anyways.' Pretty mean spirited, Mung! Since I take Gilbert Newton Lewis's word over yours in this matter, why should I not post it? You have done nothing but flail about in trying to disprove the reality of entropy and its relation to information. Perhaps I do, perhaps I don't have a basic understanding of what he means by the statement, but why should I expend the effort to elucidate to someone who is unwilling to pick up the very basics in the first place??? What exactly is my payoff for enduring such unreasonableness??? bornagain77
“Gain in entropy always means loss of information, and nothing more.”
Can we replace entropy with information-theoretic terminology?
Mung since you don’t believe in Entropy.
Well, hey, maybe I will eventually come to not believe in entropy.
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term “entropy” with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the “driving force” of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.
A FAREWELL TO ENTROPY Mung
Mung, I find this statement,,,; “Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis ,,,To be a take home statement, written by a man that very well could be without peer on the subject.
Yet you cannot even begin to explain why the statement is true, probably don't even know what it means, and yet you continue to post it anyways. Can you even tell me what kind of entropy he was talking about? Can you show how and why it applies to "genetic entropy" (whatever that is)? As the office of Hugh Ross is becoming more disordered, who or what is losing information, and why? If Ross straightens up his office does he gain information, or lose it?
Mung, the intelligent design objection to Natural Selection is that it always reduces information. It never creates information. ,,, Information is the whole key.
It does that by increasing entropy?
Mung, If you are not denying the reality of entropy, in the preceding statements, then please tell me exactly what in blue blazes you are doing???
Seeking clarity. Mung
Sorry, I'm a Huskies fan. I don't suppose you'd happen to have a pdf copy of his The Symmetry of Time In Physics. I'd sure like to know the context of his quote. Mung
“Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis My father named me after Gilbert Newton Lewis, with whom he worked on the Manhattan A-bomb project. Gilbert GilDodgen
Mung, I find this statement,,,; “Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis ,,,To be a take home statement, written by a man that very well could be without peer on the subject. A statement that always applies in whatever field of physical science that you may happen to be in.,,, GilDodgen has given you some very clear examples to help you, and the videos I listed by Perry Marshall are very helpful as well. bornagain77
bornagain77:
“Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis
Why? And if that's all it means, and nothing more, then it follows that it doesn't mean what people think it means. Mung
To put it in simple terms, shining light on dirt doesn’t increase the probability that dirt will spontaneously generate life.
I have Sewell's In The Beginning so I have a good idea what you are talking about and do not dispute that. Mung
Is Entropy a Measure of "Disorder"? Let us dispense with at least one popular myth: "Entropy is disorder" is a common enough assertion, but commonality does not make it right. Entropy is not "disorder", although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see Insight into entropy by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased "order", quite impossible in the entropy is disorder worldview. And also keep in mind that "order" is a subjective term, and as such it is subject to the whims of interpretation. This too mitigates against the idea that entropy and "disorder" are always the same, a fact well illustrated by Canadian physicist Doug Craigen, in his online essay "Entropy, God and Evolution".
Entropy is also sometimes confused with complexity, the idea being that a more complex system must have a higher entropy. In fact, that is in all likelihood the opposite of reality. A system in a highly complex state is probably far from equilibrium and in a low entropy (improbable) state, where the equilibrium state would be simpler, less complex, and higher entropy.
http://www.tim-thompson.com/entropy1.html#what Mung
Mung: "Don't we have a physicist who posts here at times?" You are probably referring to Granville Sewell, who is a professor of mathematics and a fellow of the Evolutionary Informatics Lab, along with yours truly. "I'm not going to watch an hour long video. What does he say that's relevant to this discussion and where in the video does he say it?" The video (try starting at 44:00) makes the point Granville has made, in slightly different terms, which is that in an open system (like the earth, which has free energy available from the sun) the Second Law cannot automatically be overcome simply as a result of the availability of free energy. That energy must be directed and harnessed to produce a local entropy decrease. As Granville puts it, the fact that a system is open doesn't mean that the highly improbable automatically becomes probable. To put it in simple terms, shining light on dirt doesn't increase the probability that dirt will spontaneously generate life. GilDodgen
bornagain77:
Mung, Entropy is the tendency of things to decay in this universe, and is considered by many to be the MOST irrefutable law of science
Do you even read the material you cut and paste? How do you get that from what Eddington wrote, or from any of the other quotes you posted? Eddington writes that entropy is a measure. So does Ross. From the Penrose video (0:07):
that's a slightly misleading way of looking at it
From the Penrose video (2:55):
...with regard to the matter it (the initial state) was maximum entropy..
Ross:
Entropy measures the amount of decay or disorganization in a system as the system moves continually from order to chaos.
Ross is just wrong. No wonder people are confused. He got "is a measure" right, but entropy is not a measure of the amount of decay or disorganization in a system as the system moves continually from order to chaos. Ross again:
Physical life is possible because universal entropy increases. All organisms take advantage of increasing entropy to run metabolic reactions, such as digestion and energy production. Work is possible because of the universe’s increasing entropy. And because work is possible, human creative activity is possible. With this miraculous creation in mind, our family spends some time on Thanksgiving Day expressing gratefulness to God for making the universe as entropic as He did.
On this I agree with Ross. But following his line of argument, without death and decay life is not possible. That's not quite the view of a lot of fundamentalist Christians who think death and decay did not enter the world until after the fall. More Ross:
The entropy measure of the universe is important for several other reasons. It determines which features of the universe are reversible and which are not.
More sloppy language. Entropy tells us whether a process is reversible. It does not determine it. Mung
The first part of this video has an example of artificially introduced entropy; THIRD DAY - YOU ARE MINE - LIVE http://www.youtube.com/watch?v=Plwpdgae6UI This is a short video that clearly shows what entropy/noise looks like to information; Random Mutations Destroy Information - Perry Marshall - video http://www.metacafe.com/watch/4023143/ This also may be of interest as to illustrating one strict constraint for 'evolvability'; Shannon Information - Channel Capacity - Perry Marshall - video http://www.metacafe.com/watch/5457552/ bornagain77
BA77:
I don’t know Mung. Entropy has been called the ‘second LAW of thermodynamics’ since it was developed in the 18th century, and yet you declare it is not a law and then tell me to do some research on the second law and then speak. Mung, you really impress me with some of your insights sometimes, but other times, and this is one of those other times, you really do seem like you have no clue what in the world you are talking about.
No, entropy is not called the ‘second law of thermodynamics’. The second law of thermo is a law about entropy, but it is not the only law about entropy (the third law of thermo is an obvious example), and it's not just about entropy (for example, it can also be formulated as a law about free energy). Entropy is not a law, tendency, or state. It is a property like mass, volume, density, or temperature (although it's generally harder to measure than any of those). Gordon Davisson
bornagain77:
Mung, here is a good video that may give you some better understanding:
This video file is a 62 minute seminar by Thomas Kindell, which was recorded during the Seattle Creation Conference
I'm not going to watch an hour long video. What does he say that's relevant to this discussion and where in the video does he say it? Mung
GilDodgen: "Belief in the infinitely creative powers of natural selection is illogical, empirically falsified, and essentially represents, in my view, a cult-like mindset." They unwittingly imbue a euphemistic term with an "active" power which at bottom is really just a "passive" criterion. "Natural Selection" is merely a euphemism for a dead-or-alive outcome imposed on whatever was able to reproduce previously and whatever fails to reproduce again. Reproduction takes place at several successive levels (genomic, cellular, organism and population) and Natural Selection euphemistically "filters out" failures and "filters in" successes at each level. The reasons for failure are numerous and varied, but the reasons for success are essentially two: lesser environmental pressure or greater fitness. Natural Selection does not destroy, create or modify the reasons for failure or success. Natural Selection is like nature's "Quality Control"; it doesn't make or break anything, it only 'reports' anything reproduced and surviving a previous step as qualified for the next step. Anything that died isn't even evaluated, having already been "selected out". Natural Selection is not active; it is a passive, after-the-fact consequence of life or death. It is the effect, it is *not* the cause of life or death itself. Natural Selection doesn't create; mutations/variations and reproduction do that. Nor does Natural Selection eliminate; death in all its manifestations does that. Natural Selection is merely the euphemism assigned to the consequential result of surviving or dying. As Darwinists often confuse cause with effect in their search for cause, and are notoriously imprecise in their "theories", it is not surprising they mistake the effect or results of selection for the cause and means of selection. Charles
To build up the physics background at 101 level, I suggest Schiller’s Motion Mountain.
awesome. thanks. Mung
What is natural selection? Natural selection is nothing more than differential reproduction due to heritable variation. Differential reproduction just means that some will (may) out-reproduce others. And if that differential reproduction is due to some heritable variation then you have natural selection. The heritable variation doesn't even have to be genetic as behavioral characteristics can be passed down also. And all of that depends on the environment as what is beneficial or working good enough in one environment may not be beneficial nor working good enough in another environment. And that brings us to another point- whatever works "good enough" gets kept as natural selection basically eliminates that which doesn't work good enough. Does natural selection have a direction? Only if survivability is a direction. Joseph
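The definition given above can be read off a toy calculation; the starting numbers and reproduction rates below are made up purely for illustration:

    # Two heritable types that breed true, differing only in average offspring number.
    pop = {"A": 100.0, "B": 100.0}
    offspring_per_parent = {"A": 1.05, "B": 1.00}   # assumed rates, for illustration

    for generation in range(100):
        for t in pop:
            pop[t] *= offspring_per_parent[t]

    total = sum(pop.values())
    for t in pop:
        print(t, round(100 * pop[t] / total, 1), "% of population")

Nothing new is produced in this sketch; two pre-existing variants simply change in relative frequency, which is all that "differential reproduction due to heritable variation" asserts by itself.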
Mung, so now you deny the 'observed fact' of the expansion of the universe throughout its cosmic history, just as you deny the 'observed fact' of entropy through that same cosmic history. And Mung, exactly why am I supposed to take your 'gut feeling' over observed fact when it comes to dealing with science??? bornagain77
Where is the centre of the universe?: Excerpt: There is no centre of the universe! According to the standard theories of cosmology, the universe started with a "Big Bang" about 14 thousand million years ago and has been expanding ever since. Yet there is no centre to the expansion; it is the same everywhere. The Big Bang should not be visualized as an ordinary explosion. The universe is not expanding out from a centre into space; rather, the whole universe is expanding and it is doing so equally at all places, as far as we can tell. http://math.ucr.edu/home/baez/physics/Relativity/GR/centre.html bornagain77
BA77:
Mung, for me the most common sense proof of Entropy, besides the fact that we all grow old and die, is that all things in the universe are headed for thermodynamic equilibrium, is the fact that the universe is expanding. i.e. is the fact that the universe is ‘spreading out’ equally in all places
More silliness. So the universe is expanding, like a gas, to fill the available volume? Mung
BA77: 'Entropy is the tendency of things to decay in this universe' Mung: 'No, it isn’t.' BA77 ' …and is considered by many to be the MOST irrefutable law of science;' Mung; 'Entropy is not a law.' Mung, If you are not denying the reality of entropy, in the preceding statements, then please tell me exactly what in blue blazes you are doing??? bornagain77
BA77:
Mung since you don’t believe in Entropy.
This is just silly. Mung
Well DrBOT, seeing as you are not so confident as to say information can be generated so as to prove OOL feasible, perhaps you can falsify this null hypothesis which I listed earlier with your comrades' computer program; The Law of Physicodynamic Insufficiency – Dr David L. Abel – November 2010 Excerpt: “If decision-node programming selections are made randomly or by law rather than with purposeful intent, no non-trivial (sophisticated) function will spontaneously arise.”,,, After ten years of continual republication of the null hypothesis with appeals for falsification, no falsification has been provided. The time has come to extend this null hypothesis into a formal scientific prediction: “No non trivial algorithmic/computational utility will ever arise from chance and/or necessity alone.” http://www.scitopics.com/The_Law_of_Physicodynamic_Insufficiency.html bornagain77
ba77. We were discussing evolution, not OOL. How are those portable goalposts holding up, they certainly do a lot of traveling ;) DrBot
DrBOT, if you really are so confident that you, and your comrades, are reflecting reality and are actually generating prescriptive information over and above what was already present, instead of playing games with smoke and mirrors, then by-golly submit the computer program for a 1 million dollar prize!!!! "The Origin-of-Life Prize" ® (hereafter called "the Prize") will be awarded for proposing a highly plausible natural-process mechanism for the spontaneous rise of genetic instructions in nature sufficient to give rise to life. The explanation must be consistent with empirical biochemical, kinetic, and thermodynamic concepts as further delineated herein, and be published in a well-respected, peer-reviewed science journal(s). http://lifeorigin.info/ bornagain77
All of this research is a form of artificial selection, not natural selection. Choice for potential function at decision nodes, prior to the realization of that function, is always artificial, never natural.
Yes, and random mutations are also artificial when generated in an artificial system. Weather in a weather simulator is artificial - does the simulation reflect reality? A simulation of erosion is not real - it is artificial erosion. And whilst I'm at it:
The computer is programmed from the outset to converge on the solution.
the solution? There are often many solutions, and sometimes none - not all GA's find a solution, and many find solutions that were unexpected - take a look at the work on evolvable hardware. DrBot
further notes of related interest; The immune system is a comparable situation to what neo-Darwinists are trying to claim for DESIGNED computer programs that hill climb to a predetermined goal; Falk’s fallacy - Feb. 2010 Excerpt: This (the immune system) is one of the most amazing processes ever described.,,, Whatever may be said about it, it is a highly regulated, specified, directed and choreographed process. It is obviously the product of overwhelmingly brilliant design,,, https://uncommondesc.wpengine.com/intelligent-design/falks-falacy/ Response to Kathryn Applegate - Caroline Crocker PhD.- cell biologist and immunologist - October 2010 Excerpt: Diversity of antibodies generated by B cells is due to deliberate, cell-engineered changes in the DNA sequence, not random mutations. In fact, I have never before heard the process whereby functional antibodies are formed (before they encounter antigen) described as mutation. And it is well-known that the appearance of functionality as a result of a mistake-mutation is extremely rare. Of course, after encountering antigen the hypervariable regions of the antibody DNA do undergo somatic hypermutation, but again this is in particular places and is controlled by enzymes.,,, https://uncommondesc.wpengine.com/intelligent-design/comments-on-kathryn-applegate%E2%80%99s-may-posts-on-biologos/#more-15176 Generation of Antibody Diversity is Unlike Darwinian Evolution - microbiologist Don Ewert - November 2010 Excerpt: The evidence from decades of research reveals a complex network of highly regulated processes of gene expression that leave very little to chance, but permit the generation of receptor diversity without damaging the function of the immunoglobulin protein or doing damage to other sites in the genome. http://www.evolutionnews.org/2010/11/response_to_edward_max_on_talk040661.html "A Masterful Feat of Courtroom Deception": Immunologist Donald Ewert on Dover Trial - audio http://intelligentdesign.podomatic.com/player/web/2010-12-20T15_01_03-08_00 bornagain77
What would be surprising DrBOT is if the operating system of the computer program which was designed to ‘evolve’ the code, was itself opened up to the random mutation, natural selection process. DrBOT, how far do you think the program would get if it actually reflected reality like that???
Yes, that would be surprising and I don't think it would get very far! I'm not sure why you think that would reflect reality though? As far as I am aware the laws of physics do not mutate at random - if they did the universe would break! DrBot
What would be surprising DrBOT is if the operating system of the computer program which was designed to 'evolve' the code, was itself opened up to the random mutation, natural selection process. DrBOT, how far do you think the program would get if it actually reflected reality like that??? bornagain77
DrBOT, excerpt from your article: 'In this arrangement, the host PC begins the run by creating the initial random population (with the XC6216 waiting). Then, for generation 0 (and each succeeding generation), the PC creates the necessary configuration bits to enable the XC6216 to measure the fitness of the first individual program in the population (with the XC6216 waiting). Thereafter, the XC6216 measures the fitness of one individual. Note that the PC can simultaneously prepare the configuration bits for the next individual in the population and poll to see if the XC6216 is finished. After the fitness of all individuals in the current generation of the population is measured, the genetic operations (reproduction, crossover, and mutation) are performed (with the XC6216 waiting). This arrangement is beneficial because the computational burden of creating the initial random population and of performing the genetic operations is small in comparison with the fitness measurement task.' and the problem with that DrBOT is,,, Constraints vs. Controls - Abel - 2010 Excerpt: Classic examples of the above confusion are found in the faulty-inference conclusions drawn from many so-called “directed evolution,” “evolutionary algorithm,” and computer-programmed “computational evolutionary” experimentation. All of this research is a form of artificial selection, not natural selection. Choice for potential function at decision nodes, prior to the realization of that function, is always artificial, never natural. http://www.bentham.org/open/tocsj/articles/V004/14TOCSJ.pdf Evolutionary Synthesis of Nand Logic: Dissecting a Digital Organism - Dembski - Marks - Dec. 2009 Excerpt: The effectiveness of a given algorithm can be measured by the active information introduced to the search. We illustrate this by identifying sources of active information in Avida, a software program designed to search for logic functions using nand gates. Avida uses stair step active information by rewarding logic functions using a smaller number of nands to construct functions requiring more. Removing stair steps deteriorates Avida’s performance while removing deleterious instructions improves it. http://evoinfo.org/publications/evolutionary-synthesis-of-nand-logic-avida/ The Capabilities of Chaos and Complexity - David L. Abel Excerpt: "To stem the growing swell of Intelligent Design intrusions, it is imperative that we provide stand-alone natural process evidence of non trivial self-organization at the edge of chaos. We must demonstrate on sound scientific grounds the formal capabilities of naturally-occurring physicodynamic complexity. Evolutionary algorithms, for example, must be stripped of all artificial selection and the purposeful steering of iterations toward desired products. The latter intrusions into natural process clearly violate sound evolution theory." http://www.mdpi.com/1422-0067/10/1/247/pdf In computer science we recognize the algorithmic principle described by Darwin - the linear accumulation of small changes through random variation - as hill climbing, more specifically random mutation hill climbing. However, we also recognize that hill climbing is the simplest possible form of optimization and is known to work well only on a limited class of problems. Watson R.A. - 2006 - Compositional Evolution - MIT Press - Pg. 
272 In the following podcast, Robert Marks gives a very informative talk as to the strict limits we can expect from any evolutionary computer program: Darwin as the Pinball Wizard: Talking Probability with Robert Marks - podcast http://www.idthefuture.com/2010/03/darwin_as_the_pinball_wizard_t.html Here are a few quotes from Robert Marks from the preceding podcast, as well as link to further quotes by Dr. Marks: * [Computer] programs to demonstrate Darwinian evolution are akin to a pinball machine. The steel ball bounces around differently every time but eventually falls down the little hole behind the flippers. * It's a lot easier to play pinball than it is to make a pinball machine. * Computer programs, including all of the models of Darwinian evolution of which I am aware, perform the way their programmers intended. Doing so requires the programmer infuse information about the program's goal. You can't write a good program without [doing so]. Robert J. Marks II - Distinguished Professor of Electrical and Computer Engineering at Baylor University http://en.wikiquote.org/wiki/Robert_J._Marks_II Signature In The Cell - Review Excerpt: There is absolutely nothing surprising about the results of these (evolutionary) algorithms. The computer is programmed from the outset to converge on the solution. The programmer designed to do that. What would be surprising is if the program didn't converge on the solution. That would reflect badly on the skill of the programmer. Everything interesting in the output of the program came as a result of the programmer's skill-the information input. There are no mysterious outputs. Software Engineer - quoted to Stephen Meyer http://www.scribd.com/full/29346507?access_key=key-1ysrgwzxhb18zn6dtju0 bornagain77
Efficient Evolution of Machine Code for CISC Architectures using Blocks and Homologous Crossover - Peter Nordin, Wolfgang Banzhaf and Frank Francone

This chapter describes recent advances in genetic programming of machine code. Evolutionary program induction of binary machine code is one of the fastest GP methods and the most well studied linear approach. The technique has previously been known as Compiling Genetic Programming System (CGPS) but to avoid confusion with methods using an actual compiler and to separate the system from the method, the name has been changed to Automatic Induction of Machine Code with Genetic Programming (AIM-GP). AIM-GP stores individuals as a linear string of native binary machine code, which is directly executed by the processor. The absence of an interpreter and complex memory handling allows increased speed of several orders of magnitude. AIM-GP has so far been applied to processors with a fixed instruction length (RISC) using integer arithmetic. This chapter describes several new advances to the AIM-GP method which are important for the applicability of the technique. Such advances include enabling the induction of code for CISC processors such as the most widespread computer architecture, INTEL x86, as well as JAVA and many embedded processors. The new technique also makes AIM-GP more portable in general and simplifies the adaptation to any processor architecture. Other additions include the use of floating point instructions, control flow instructions, ADFs and new genetic operators, e.g. aligned homologous crossover. We also discuss the benefits and drawbacks of register machine GP versus tree-based GP. This chapter is meant to be directed towards the practitioner who wants to extend AIM-GP to new architectures and application domains.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.88.9589&rep=rep1&type=pdf DrBot
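As a rough illustration of the "aligned homologous crossover" mentioned in that abstract, here is a toy sketch in Python. The instruction names are made-up stand-ins for machine code, and the function is not the AIM-GP implementation - just the general idea of exchanging a segment between the same positions in two equal-length linear genomes.

# Toy sketch of aligned (homologous) crossover on linear genomes: the same
# segment boundaries are used in both parents, so instructions are swapped
# position-for-position. Instruction names are invented placeholders.
import random

def homologous_crossover(parent_a, parent_b):
    assert len(parent_a) == len(parent_b)
    i, j = sorted(random.sample(range(len(parent_a) + 1), 2))
    child_a = parent_a[:i] + parent_b[i:j] + parent_a[j:]
    child_b = parent_b[:i] + parent_a[i:j] + parent_b[j:]
    return child_a, child_b

a = ["mov", "add", "mul", "sub", "ret"]
b = ["mov", "xor", "shl", "add", "ret"]
print(homologous_crossover(a, b))

Because the cut points line up, offspring keep the same length and overall layout as their parents, which is the point of calling the operator "homologous."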
Mung, you are basically making the same exact claim that neo-Darwinists make if you think 'non-crashed' computer programs can evolve into better computer programs. Moreover, as SCheesman pointed out, the fatal flaw of your software analogy is that no one has ever seen a computer program write even a single piece of programming code (prescriptive information).

Natural Selection, Genetic Mutations and Information - EXPELLED
http://www.metacafe.com/watch/4036840/

Dr. David Berlinski: Accounting for Variations (Clip 3)
http://www.youtube.com/watch?v=aW2GkDkimkE

The Law of Physicodynamic Insufficiency - Dr David L. Abel - November 2010
Excerpt: “If decision-node programming selections are made randomly or by law rather than with purposeful intent, no non-trivial (sophisticated) function will spontaneously arise.”,,, After ten years of continual republication of the null hypothesis with appeals for falsification, no falsification has been provided. The time has come to extend this null hypothesis into a formal scientific prediction: “No non trivial algorithmic/computational utility will ever arise from chance and/or necessity alone.”
http://www.scitopics.com/The_Law_of_Physicodynamic_Insufficiency.html

The Capabilities of Chaos and Complexity: David L. Abel - Null Hypothesis For Information Generation - 2009
To focus the scientific community’s attention on its own tendencies toward overzealous metaphysical imagination bordering on “wish-fulfillment,” we propose the following readily falsifiable null hypothesis, and invite rigorous experimental attempts to falsify it: "Physicodynamics cannot spontaneously traverse The Cybernetic Cut: physicodynamics alone cannot organize itself into formally functional systems requiring algorithmic optimization, computational halting, and circuit integration." A single exception of non trivial, unaided spontaneous optimization of formal function by truly natural process would falsify this null hypothesis.
http://www.mdpi.com/1422-0067/10/1/247/pdf

Can We Falsify Any Of The Following Null Hypothesis (For Information Generation)
1) Mathematical Logic
2) Algorithmic Optimization
3) Cybernetic Programming
4) Computational Halting
5) Integrated Circuits
6) Organization (e.g. homeostatic optimization far from equilibrium)
7) Material Symbol Systems (e.g. genetics)
8) Any Goal Oriented bona fide system
9) Language
10) Formal function of any kind
11) Utilitarian work
http://mdpi.com/1422-0067/10/1/247/ag
bornagain77
Mung: I'll say it a different way. Not a single piece of code you have ever produced was obtained by the random mutation of a previous piece of code. I can say that with absolute certainty. How can you even speak of code "evolving" into something more useful or complex (or "fit") by that process, given your own experience writing it? SCheesman
Mung - sorry I just came upon this thread. I am also a software developer, and came upon your comment:
But it’s obvious that a running program has more opportunity to complete than a non-running program. So if these programs were ‘evolving’ by duplicating themselves and competing for processing power, which program would have more opportunities to evolve? The program that crashed, or the program that did not crash?
This, in a nutshell, sums up the whole problem with explaining evolution as the result of selecting beneficial mutations. Software does not "evolve" itself unless it is carefully designed to do so, and then only in a tightly bound scope. Introduce any sort of mutation and it won't even compile, let alone do something new and useful that would give it a global advantage over the previous version. The universe is not large enough, nor time long enough, for one programme to evolve a better one by random substitution of code primitives. This is why you design code, instead of closing your eyes and pecking keys at random, then opening them to pick out the version that runs better. Doesn't happen. Won't happen. SCheesman
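SCheesman's compile-failure point can be put to a quick numerical test. The sketch below (Python, with a made-up two-line function standing in for "a program") randomly replaces one character at a time and asks Python's built-in compile() whether the mutant still parses. Note that passing a syntax check says nothing about the mutant doing anything useful, so this test, if anything, understates the damage a mutation does.

# Mutate one character of a tiny source snippet and count how often the
# result still passes a syntax check. The snippet and trial count are
# arbitrary illustrative choices.
import random, string

SOURCE = "def add(a, b):\n    return a + b\n"

def mutate(src):
    i = random.randrange(len(src))
    return src[:i] + random.choice(string.printable) + src[i + 1:]

def still_compiles(src):
    try:
        compile(src, "<mutant>", "exec")   # syntax check only
        return True
    except SyntaxError:
        return False

trials = 10_000
ok = sum(still_compiles(mutate(SOURCE)) for _ in range(trials))
print(f"{ok}/{trials} single-character mutants still compile")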
Folks: As a 101 on entropy and related topics in the context of ID issues, you may consider reading here. Note too, the discussion in the ID foundation series here at UD. To build up the physics background at 101 level, I suggest Schiller's Motion Mountain. I point to his vol I, chs on the specific topic. The secret to how entropy can be reduced locally while organisation is built up, is that we have directed energy conversion devices that do work according to a program. That allows entropy to be exported to the environment as energy and materials waste -- ever wondered why power plants have cooling systems and why cities have junkyards, or why houses have bathrooms? Cf the way cells are set up as self-replicating entities and how there is a bodybuilding program that moves zygotes to complete functioning organisms. Beyond that, entropy is a quantity [or, these days, a cluster of 3 - 4 dozen more or less related quantities: TYPE-X entropy . . . where TYPE_X is a name or a group of names as a rule] and the second law discusses how it tends to increase, with stat mech providing the statistical underpinnings. Fundamentally, once there is some freedom for systems to spontaneously develop at micro-levels, they tend to move towards states where energy and mass are in clusters of microstates that have the biggest relative statistical weight. To stop that from happening in something like a living system, there is complex functional organisation that creates metabolic and self-replicating processes. These are of course highly unusual and specific, often algorithm-effecting structures. The best, empirically warranted explanation for such is design. GEM of TKI (FWIW, Applied Physicist) kairosfocus
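To put rough numbers on the "export entropy to the environment" bookkeeping kairosfocus describes, here is a bare-bones sketch. Both bodies are idealized as fixed-temperature heat reservoirs, no work term is included, and all the figures are invented; the only point is that a local entropy decrease is paid for by a larger increase elsewhere.

# Toy entropy bookkeeping: a warm subsystem dumps heat into a cooler
# environment. Its own entropy drops, the environment's rises by more,
# and the total still increases.
Q     = 1000.0   # joules of heat leaving the subsystem (made-up figure)
T_sys = 310.0    # subsystem temperature, kelvin
T_env = 290.0    # environment temperature, kelvin

dS_sys = -Q / T_sys          # local decrease
dS_env = +Q / T_env          # export to the surroundings
print(dS_sys, dS_env, dS_sys + dS_env)   # total is positive: about +0.22 J/K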
Mung, for me the most common sense proof of entropy, i.e. that all things in the universe are headed for thermodynamic equilibrium, besides the fact that we all grow old and die, is the fact that the universe is expanding, i.e. that the universe is 'spreading out' equally in all places;

Every 3D Place Is Center In This Universe – expanding 4D space/time – video
http://www.metacafe.com/watch/3991873/

History of The Universe Timeline - Graph Image
http://www.astronomynotes.com/cosmolgy/CMB_Timeline.jpg

In fact the accelerating 'cosmic inflation' of the universe, if unchecked in its present course, is hypothesized to do the following,,,

Big Rip
Excerpt: The Big Rip is a cosmological hypothesis first published in 2003, about the ultimate fate of the universe, in which the matter of the universe, from stars and galaxies to atoms and subatomic particles, is progressively torn apart by the expansion of the universe at a certain time in the future. Theoretically, the scale factor of the universe becomes infinite at a finite time in the future.
http://en.wikipedia.org/wiki/Big_Rip

i.e. Entropic decay towards thermodynamic equilibrium seems to be tied semi-directly to the rate of the expansion of the universe,,, But as Christians we have a promised hope for a 'new' future,,,

Revelation 21:1-4
Then I saw “a new heaven and a new earth, for the first heaven and the first earth had passed away, and there was no longer any sea. I saw the Holy City, the new Jerusalem, coming down out of heaven from God, prepared as a bride beautifully dressed for her husband. And I heard a loud voice from the throne saying, “Look! God’s dwelling place is now among the people, and he will dwell with them. They will be his people, and God himself will be with them and be their God. ‘He will wipe every tear from their eyes. There will be no more death’ or mourning or crying or pain, for the old order of things has passed away.”

of related interest:

A Quantum Hologram of Christ's Resurrection? by Chuck Missler
Excerpt: “You can read the science of the Shroud, such as total lack of gravity, lack of entropy (without gravitational collapse), no time, no space—it conforms to no known law of physics.” The phenomenon of the image brings us to a true event horizon, a moment when all of the laws of physics change drastically. Dame Piczek created a one-fourth size sculpture of the man in the Shroud. When viewed from the side, it appears as if the man is suspended in mid air (see graphic, below), indicating that the image defies previously accepted science.
http://www.khouse.org/articles/2008/847

The Way - Fastball - music video
http://www.metacafe.com/watch/4193448/
bornagain77
It has always seemed to me that people (in general, and irrespective of their stance about Darwinism) frequently use terms like 'order' (and 'disorder') and 'entropy' equivocally. For instance, 'order' (and 'disorder') is used, frequently simultaneously, to denote two very opposite states or conditions: living systems are 'orderly' and crystals are 'orderly' (maximally so, in fact), yet is not a crystal about as opposite from a living system as one can get? Ilion
Mung: "I’ve come to be among those who do not feel a need to absolve God of being the author of “evil.” Am I alone?" Nope. And, for the record, God does not shy from taking the responsibility. Ilion
Mung: "Historically, has Christianity ever been seen to be in opposition to an eternal universe?" But, of course. How can you not know this. Would not an "eternal universe" be one that is not caused? Christianity has always said that "the universe" is caused; is this not equivalent to saying that "the universe" is not eternal? At the same time, looking at the question for a different perspective, there is no such thing as "the universe" (which is why I keep putting the term in quote marks). The concept 'the universe' is just that, a concept; it is not a concrete physically-existing thing. To speak of "the universe" as though it were an actual entity is like speaking of 'humanity' as though it were an actual entity. Ilion
Mung: "I’ll probably never understand how Darwin’s theory ever came to be interpreted as a theory which excludes design." Because that was the whole point from the start. Ilion
correction; so that THINGS like broken eggs put themselves back together again,,, bornagain77
Mung, since you don't believe in Entropy, I take it that you don't believe the universe is heading for maximum entropy either, i.e. thermodynamic equilibrium?

notes,,,

Did the Universe Hyperinflate? – Hugh Ross – April 2010
Excerpt: Perfect geometric flatness is where the space-time surface of the universe exhibits zero curvature (see figure 3). Two meaningful measurements of the universe’s curvature parameter, ½k, exist. Analysis of the 5-year database from WMAP establishes that -0.0170 < ½k < 0.0068. Weak gravitational lensing of distant quasars by intervening galaxies places -0.031 < ½k < 0.009. Both measurements confirm the universe indeed manifests zero or very close to zero geometric curvature,,,
http://www.reasons.org/did-universe-hyperinflate

A 'flat universe', which is actually another very surprising finely-tuned 'coincidence' of the universe, means this universe, left to its own present course of accelerating expansion due to Dark Energy, will continue to expand forever, thus fulfilling the thermodynamic equilibrium of the second law to its fullest extent (entropic 'Heat Death' of the universe).

The Future of the Universe
Excerpt: After all the black holes have evaporated, (and after all the ordinary matter made of protons has disintegrated, if protons are unstable), the universe will be nearly empty. Photons, neutrinos, electrons and positrons will fly from place to place, hardly ever encountering each other. It will be cold, and dark, and there is no known process which will ever change things. --- Not a happy ending.
http://spiff.rit.edu/classes/phys240/lectures/future/future.html

Psalm 102:25-27
Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end.

Mung, since you are making the grand claim that Entropy does not exist, i.e. that things do not tend to decay towards thermodynamic equilibrium, I guess the burden is on you to prove it. To demonstrate that you are right and everybody else is wrong, all you have to do is to make time flow backwards, so that that thinks like broken eggs put themselves back together and all things old become new again!!! :) bornagain77
B1 paragwinn
What's in a name? In the case of Shannon's measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: "My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.'" - Tribus and McIrvine
Mung
Entropy cont.
By doing this, rather than extracting a name from the body of the current language (say: lost heat), he succeeded in coining a word that meant the same thing to everybody: nothing. - Leon Cooper
Mung
Back to Entropy:
I prefer going to the ancient languages for the names of important scientific quantities, so that they mean the same thing in all living tongues. I propose, accordingly, to call S the entropy of a body, after the Greek word "transformation." I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to me helpful. - Rudolf Clausius
Mung
Long ago I was taught that in thermodynamics, the term "entropy" basically means thermal energy that is unavailable to do work. That definition seemed to make sense to me in the context of thermodynamics. From what I read here, it seems that there is some other concept of "entropy" beyond the one I was taught for thermodynamics. Could someone here please explain that different usage to me? It would be most appreciated. ocbouvieri
While I find ID fascinating, even encouraging, at the same time I’ve never really felt that natural selection – properly understood – is in the running against ‘design’.
Raised Protestant but who cares, lol. I'd take communion with a Catholic any day. Natural selection was designed to absolve God of making the choice between who wins and who loses. That is a theological question, not a design question. I've come to be among those who do not feel a need to absolve God of being the author of "evil." Am I alone? Mung
Kansas – Dust In the Wind
Even dust in the wind follows certain laws. Mung
Mung, I have to admit, I find your approach interesting. And really, mine is similar. While I find ID fascinating, even encouraging, at the same time I've never really felt that natural selection - properly understood - is in the running against 'design'. Really, it just seems like yet another implementation OF design to me. Always has. Always does. Mind you, I've dived into the topic with a habit of ignoring people telling me what I'm supposed to think of evolution. Granted, in my (catholic) upbringing there was no YEC voice, so the main people I experienced pushing that were atheists, despite learning about evolution early on. When I finally encountered Dawkins going on about a blind watchmaker, all I could think is "How do you know what's blind and what's sighted, you smug hairball?" But hey, that's my take on it, at least in part. nullasalus
The law that entropy always increases-the second law of thermodynamics-holds, I think, the supreme position among the laws of Nature. The second law of thermodynamics is a law. If you can formulate the law of entropy or supply a link to the law of entropy I'm interested. Mung
Kansas - Dust In the Wind http://www.youtube.com/watch?v=1qxSwJC3Ly0 bornagain77
Mung, with all due respect (and I appreciate your feedback at UD), this is hyper-illogical.

Thank you, and yet, not thank you. ;) Hopefully if one looks at my overall contribution here it is clear that:
1. I am a Christian
2. I support Intelligent Design
And yes, I state things in that order. If I were not a Christian I couldn't care less about ID. But as a Christian FIRST, it is incumbent upon us to not bring the Gospel of Jesus Christ into disrepute. To be clear, I do not think that ID does that. But I honestly do not understand how I am being hyper-illogical.
Natural selection does not create opportunities.
This is simply the negation of what I stated. It's not a rebuttal. It's like saying, "you're wrong."
Your thesis is equivalent to saying that a computer program that crashes creates an opportunity for a program that doesn’t, when no programmers are involved.
My thesis is programmer agnostic. A program that crashes is "out of the running." If I were a designer I'd choose the program that didn't crash. But it's obvious that a running program has more opportunity to complete than a non-running program. So if these programs were 'evolving' by duplicating themselves and competing for processing power, which program would have more opportunities to evolve? The program that crashed, or the program that did not crash? Mung
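Mung's thought experiment is easy to caricature in a few lines of Python. The sketch below is exactly that - a caricature with invented numbers, not a model of any real system: two kinds of self-copying "programs" compete for a fixed number of CPU slots, and one kind sometimes crashes before it can copy itself.

# Toy replicator competition: "stable" programs always finish and copy
# themselves; "crasher" programs crash half the time before copying.
# Slot count, population sizes, and crash rate are made-up figures.
import random

SLOTS = 1000
population = ["stable"] * 500 + ["crasher"] * 500

for generation in range(20):
    offspring = []
    for prog in population:
        crashed = (prog == "crasher") and (random.random() < 0.5)
        if not crashed:
            offspring.append(prog)       # finished its run...
            offspring.append(prog)       # ...so it gets to copy itself
    random.shuffle(offspring)
    population = offspring[:SLOTS]       # only so much processing power to go around

print(population.count("stable"), population.count("crasher"))

Run it and the non-crashing kind takes over the slots within a handful of generations, which is all the comment above claims: the program that completes gets more opportunities.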
Mung you state; 'Entropy is not a law.' And yet Eddington states:

"The law that entropy always increases-the second law of thermodynamics-holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations-then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation-well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” (Eddington, A.S., “The Nature of the Physical World,” [1928], The Gifford Lectures 1927, Cambridge University Press: Cambridge UK, 1933, reprint, pp.74-75. Emphasis original).

I don't know, Mung. Entropy has been called the 'second LAW of thermodynamics' since it was developed in the 19th century, and yet you declare it is not a law and then tell me to do some research on the second law and then speak. Mung, you really impress me with some of your insights sometimes, but other times, and this is one of those other times, you really do seem like you have no clue what in the world you are talking about. bornagain77
BA77:
Entropy is the tendency of things to decay in this universe
No, it isn't.
...and is considered by many to be the MOST irrefutable law of science;
Entropy is not a law. I offer you the following advice, research the subject, then speak. Mung
Au contraire. Natural selection creates opportunities for “the good stuff” by getting rid of “the bad stuff.” Mung, with all due respect (and I appreciate your feedback at UD), this is hyper-illogical. Natural selection does not create opportunities. Your thesis is equivalent to saying that a computer program that crashes creates an opportunity for a program that doesn't, when no programmers are involved. GilDodgen
But I do appreciate you not trying to swamp me with links and quotes.
Slap me for speaking too soon. Mung
Mung, here is a good video that may give you some better understanding:
Better understanding of what? Entropy? Not likely (pun intended). But I do appreciate you not trying to swamp me with links and quotes. What is the relevance of the material at the link you posted?
Thermodynamic Arguments for Creation
I don't deny creation (sigh). Does it answer any of the questions I posed in #2 and #3 above? Historically, has Christianity ever been seen to be in opposition to an eternal universe? Mung
Mung, the intelligent design objection to Natural Selection is that it always reduces information. It never creates information. ,,, Information is the whole key. bornagain77
Mung, Entropy is the tendency of things to decay in this universe, and is considered by many to be the MOST irrefutable law of science;

"The practical measure of the random element which can increase in the universe but can never decrease is called entropy. Measuring by entropy is the same as measuring by the chance explained in the last paragraph, only the unmanageably large numbers are transformed (by a simple formula) into a more convenient scale of reckoning. Entropy continually increases. We can, by isolating parts of the world and postulating rather idealised conditions in our problems, arrest the increase, but we cannot turn it into a decrease. That would involve something much worse than a violation of an ordinary law of Nature, namely, an improbable coincidence. The law that entropy always increases-the second law of thermodynamics-holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations-then so much the worse for Maxwell's equations. If it is found to be contradicted by observation-well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." (Eddington, A.S., "The Nature of the Physical World," [1928], The Gifford Lectures 1927, Cambridge University Press: Cambridge UK, 1933, reprint, pp.74-75. Emphasis original).

* A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Therefore the deep impression that classical thermodynamics made upon me. It is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts.
- Albert Einstein (author), Paul Arthur Schilpp (editor). Autobiographical Notes. A Centennial Edition. Open Court Publishing Company. 1979. p. 31 [As quoted by Don Howard, John Stachel. Einstein: The Formative Years, 1879-1909 (Einstein Studies, vol. 8). Birkhäuser Boston. 2000. p. 1]

As well Mung, Entropy was found by Roger Penrose to be far and away the most extremely fine tuned initial condition of the universe;

,,,Although 1 part in 10^120 for the expansion of the universe and 1 part in 10^60 for the mass density of the universe far exceed, by many orders of magnitude, the highest tolerance ever achieved in any man-made machine, which is 1 part in 10^22 for a gravity wave detector, according to esteemed British mathematical physicist Roger Penrose (1931-present), the odds of one particular individual constant, the 'original phase-space volume' of the universe, required such precision that the "Creator’s aim must have been to an accuracy of 1 part in 10^10^123”. This number is gargantuan. If this number were written out in its entirety, 1 with 10^123 zeros to the right, it could not be written on a piece of paper the size of the entire visible universe, even if a number were written down on each sub-atomic particle in the entire universe, since the universe only has 10^80 sub-atomic particles in it.

Roger Penrose discusses initial entropy of the universe. - video
http://www.youtube.com/watch?v=WhGdVMBk6Zo

The Physics of the Small and Large: What is the Bridge Between Them? - Roger Penrose
Excerpt: "The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the "source" of the Second Law (Entropy)."

How special was the big bang? - Roger Penrose
Excerpt: This now tells us how precise the Creator's aim must have been: namely to an accuracy of one part in 10^10^123. (from the Emperor’s New Mind, Penrose, pp 339-345 - 1989)

As well, contrary to speculation of 'budding universes' arising from Black Holes, Black Hole singularities are completely opposite the singularity of the Big Bang in terms of the ordered physics of entropic thermodynamics. In other words, Black Holes are singularities of destruction and disorder rather than singularities of creation and order.

Roger Penrose - How Special Was The Big Bang?
“But why was the big bang so precisely organized, whereas the big crunch (or the singularities in black holes) would be expected to be totally chaotic? It would appear that this question can be phrased in terms of the behaviour of the WEYL part of the space-time curvature at space-time singularities. What we appear to find is that there is a constraint WEYL = 0 (or something very like this) at initial space-time singularities-but not at final singularities-and this seems to be what confines the Creator’s choice to this very tiny region of phase space.”

Entropy of the Universe - Hugh Ross - May 2010
Excerpt: Egan and Lineweaver found that supermassive black holes are the largest contributor to the observable universe’s entropy. They showed that these supermassive black holes contribute about 30 times more entropy than what the previous research teams estimated.
http://www.reasons.org/entropy-universe

Evolution is a Fact, Just Like Gravity is a Fact! UhOh!
Excerpt: The results of this paper suggest gravity arises as an entropic force, once space and time themselves have emerged.
https://uncommondesc.wpengine.com/intelligent-design/evolution-is-a-fact-just-like-gravity-is-a-fact-uhoh/

This 1 in 10^10^123 number, for the time-asymmetry of the initial state of the 'ordered entropy' for the universe, also lends strong support for 'highly specified infinite information' creating the universe since;

"Gain in entropy always means loss of information, and nothing more." Gilbert Newton Lewis - Eminent Chemist
bornagain77
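As a side note, the "could not be written down" remark is simple exponent arithmetic; the sketch below just puts the two figures quoted above next to each other (the 10^80 particle count is the commonly cited rough estimate, nothing more).

# Rough arithmetic behind the "could not be written down" claim (illustrative only).
digits_needed = 10**123   # writing "1 followed by 10^123 zeros" takes about 10^123 digits
particles     = 10**80    # commonly cited rough count of particles in the observable universe
print(digits_needed // particles)   # about 10^43: each particle would have to carry ~10^43 digits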
The one thing natural selection obviously can’t do is create anything new, but this is what is perpetually claimed for it.
Well, like I said, what natural selection creates is opportunity. That's hardly an anti-design proposition. And as I stated in response to nullasalus, the objection to NS is primarily theological. If you can provide an "Intelligent Design" objection to natural selection I'd like to hear it. Mung
If I’m reading you right, you’re saying that ID theorists should start to include some ‘mainstream’ findings (natural selection) under the ‘design’ header.
If they make sense from a design perspective, absolutely. As a software designer, I often get rid of my "poor designs" and continue on and improve my "better designs." Finding the same principle operating in nature only strengthens the design inference, imo. But NS is blind! Or is it. IMO, the major problem that Creationists (not labeling anyone here) have with Natural Selection is the elimination of some at the expense of others. That's hardly Godlike, right? But it is very "designer like." I'll probably never understand how Darwin's theory ever came to be interpreted as a theory which excludes design. Mung
Gil at 5:
It is interesting that natural selection can both preserve good stuff that already exists and preserve bad stuff that happens by accident, if the bad stuff results in a survival advantage in a particular situation.
Good gosh. If something results in a survival advantage, then it's not bad. 'Good' and 'bad' in this context are completely relative. A new trait can be a 'good' trait if it helps its owner survive; that same trait can be a 'bad' trait if it is a detriment to survival.
The one thing natural selection obviously can’t do is create anything new, but this is what is perpetually claimed for it.
It is mutations that create novelty. Natural selection promotes this novelty if it's advantageous, and removes it if it's disadvantageous. If it's neutral, it may stick around. Surely you must know this. Who has ever claimed that natural selection by itself [with no variation to act on] creates anything? I'm all for someone questioning evolutionists. But please, question what they actually believe.
Since decay is the norm, and random errors, statistically speaking, essentially always result in decay,
It sounds like you're saying almost all mutations are detrimental. If so, this is incorrect; most are neutral - they have little to no effect on survival whatsoever. jurassicmac
Mung, If I'm reading you right, you're saying that ID theorists should start to include some 'mainstream' findings (natural selection) under the 'design' header. Am I right? nullasalus
It is interesting that natural selection can both preserve good stuff that already exists and preserve bad stuff that happens by accident, if the bad stuff results in a survival advantage in a particular situation. The one thing natural selection obviously can't do is create anything new, but this is what is perpetually claimed for it. This is so transparently obvious that I'm constantly bewildered by people in academia with Piled Higher and Deeper credentials who can't seem to figure this out. GilDodgen
Mung, here is a good video that may give you some better understanding: Thermodynamic Arguments for Creation - Dr. Thomas Kindell http://www.youtube.com/watch?v=Ki0QB5i_gkg bornagain77
Natural selection is a buffer against decay that is constantly operating in nature.
So why isn't it embraced by design theorists? Could "the designer" have come up with a better design?
Natural selection throws out bad stuff in a competitive environment but has no creative powers.
Au contraire. Natural selection creates opportunities for "the good stuff" by getting rid of "the bad stuff."
Since decay is the norm, and random errors, statistically speaking, essentially always result in decay, a creature living underground will lose its eyes because the informational cost of producing eyes is high.
Do you mean the informational cost of maintaining eyes? Presumably, these critters evolved from some other critters that had eyes.
Thus, a crippled, decayed creature in a pathologically hostile environment will have a survival advantage. This is devolution, not evolution.
No, that is evolution. A creature surviving in a pathologically hostile environment by shedding unnecessary and even hindering attributes can only be deemed crippled and decayed by twisted and tortuous logic. Mung
Hi Gil, Thanks for opening a thread on entropy! Hopefully this will turn out to be an interesting debate. Don't we have a physicist who posts here at times? But seriously, not getting better at playing the piano is due to entropy? How so? Doesn't getting better at playing the piano increase entropy? So can I blame my poor piano playing on someone who is practicing too much? Is entropy a cause of anything at all? When Carbon 14 decays, is that caused by entropy? Does the decay of Carbon 14 cause an increase in entropy? Does entropy really apply to everything?
When evidence of decay is presented as evidence of progress, one must wonder what is going on in the minds of such people.
Decay is part of a cycle. Without decay would the cycle of life on earth continue? I don't have a problem seeing decay as evidence of progress, and from what I know of your background, neither should you. I'll let you think on that. Hint: What does it mean "to progress"? Mung
Ahhh, but evolution doesn't equal progess. And obviously you don't understand the power of magical mystery mutations. So there. :razz: :cool: Joseph
