Uncommon Descent Serving The Intelligent Design Community

Evidence of Decay is Evidence of Progress?


It’s called entropy, and it applies to everything. If you’re a pianist and don’t practice regularly, you don’t stay the same; you get worse, and it takes extra discipline, effort, and dedication to get better.

Natural selection is a buffer against decay that is constantly operating in nature. Natural selection throws out bad stuff in a competitive environment but has no creative powers. Since decay is the norm, and random errors, statistically speaking, essentially always result in decay, a creature living underground will lose its eyes because the informational cost of producing eyes is high.

Thus, a crippled, decayed creature in a pathologically hostile environment will have a survival advantage. This is devolution, not evolution.

This phenomenon is not only logically obvious, but Michael Behe and others have demonstrated that it is also empirically the case.

Belief in the infinitely creative powers of natural selection is illogical, empirically falsified, and essentially represents, in my view, a cult-like mindset.

When evidence of decay is presented as evidence of progress, one must wonder what is going on in the minds of such people.

Comments
Sorry it's taken me a while to reply to your comment, Charles.
Gordon Davisson:
The comparison was entirely within scenario #2, where the high bit went from being 0 with 100% probability, to being 0 with 50% probability and 1 with 50% probability. The multiplicity and entropy are based on the probabilities of various states, not what the encoding allows.
Charles:
The inconsistency is that there never was, by your definition, any information in the hi-order bit. All the information is in the lo-order 7 bits. That information has a multiplicity of 2: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0
Huh? I'm really not sure what you mean here. Let me run the math on a (very simple) example. Suppose that at a certain point in the English message, if you look at the message up to that point, you can make a pretty good guess at what the next letter will be (based on syntax, grammar, and semantics). Specifically, suppose you can estimate that the next letter has an 80% chance of being an "e", a 10% chance of being an "a", and a 5% chance each of being "t" or "o". As a result, before randomization of the high bit, the byte that encodes that next letter has an 80% chance of being (in hexadecimal) 65 (the ASCII code for "e"), a 10% chance of being 61 (ASCII "a"), and a 5% chance each of being 74 (ASCII "t") or 6f (ASCII "o").

If you plug these into the formula for Shannon-entropy, H in bits = sum over each possibility i of -P(i)*log2(P(i)) = -0.8*log2(0.8) - 0.1*log2(0.1) - 0.05*log2(0.05) - 0.05*log2(0.05) = 0.2575 + 0.3322 + 0.2161 + 0.2161 = 1.0219, so the initial entropy of this byte (given the ones before it) is a hair over 1 bit. Now, take a look at the probabilities after the high bit is randomized: it has a 40% chance each of being 65 or e5 (65 with the high bit set), a 5% chance each of being 61 or e1, and a 2.5% chance each of being 74, f4, 6f, or ef. Plug these probabilities into the same formula (I'll skip writing out the math this time), and you'll get 2.0219 bits of entropy.

BTW, while the example above is unrealistic (the probabilities were chosen to make calculations easy, not for realism), this is essentially how the 1 bit of entropy per letter figure was derived. If you read some text, stop at an arbitrary point, and try to guess the next letter, you'll sometimes have no good idea, and sometimes be able to guess the entire next word or phrase. 1 bit per letter is a rough overall average.
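For anyone who wants to check this arithmetic, here is a minimal Python sketch of the same calculation; the probabilities are the illustrative ones assumed above, not measurements of real English text.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p*log2(p) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed next-letter probabilities from the example above: e 80%, a 10%, t 5%, o 5%.
before = [0.80, 0.10, 0.05, 0.05]

# Randomizing the high bit splits each byte value into two equally likely values
# (e.g. 0x65 vs 0xE5), so each probability is halved and the outcome count doubles.
after = [p / 2 for p in before for _ in range(2)]

print(round(shannon_entropy(before), 4))  # ~1.0219 bits
print(round(shannon_entropy(after), 4))   # ~2.0219 bits (exactly 1 bit more)
```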
You can’t reasonably argue no change of information on the lo-order 7-bits while arguing an entropy change on all 8-bits. Compare before and after information and entropy on only 7-bits (apples) or compare before and after information and entropy on all 8-bits (oranges), but don’t conflate the two comparisons as disproving Lewis.
I'm comparing both on all 8 bits. All of the meaningful information happens to be concentrated in the low 7 bits, with the 8th bit being meaningless (both before and after randomization). But the entire byte corresponds to the same English letter both before and after randomization. BTW, I can give you lots more examples like this -- anytime you have a redundant encoding that can tolerate some errors (Hamming codes, Reed-Solomon codes, etc.), errors within its correction limits will increase Shannon-entropy without damaging any of the encoded information.
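A quick sketch of the scenario #2 point, assuming a hypothetical 7-bit ASCII message: randomizing the 8th bit scrambles the stored bytes, yet masking that bit off recovers the text exactly.

```python
import random

random.seed(0)  # arbitrary seed, just for repeatability
message = b"Gain in entropy always means loss of information"  # any 7-bit ASCII text

# Scenario #2: randomize only the high (8th) bit of each byte.
scrambled = bytes(b | (random.randint(0, 1) << 7) for b in message)

# The low 7 bits still identify each character, so masking the high bit
# recovers the original message exactly -- no information was lost.
recovered = bytes(b & 0x7F for b in scrambled)
assert recovered == message
print(recovered.decode("ascii"))
```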
There’s only one way to encode a given English message, but there are many possible English messages.
This is precisely the inconsistency to be avoided. Compare the loss of information when losing a given message with losing another given message (apples) or the loss of information when losing all possible messages with all possible messages (oranges), but don’t conflate them.
There are a variety of ways to define "information"; one way that's particularly relevant here is that information corresponds to a choice of one particular option (e.g. a specific message) from a set of possibilities (e.g. the set of all possible messages). After randomizing the 8th bit (scenario #2), the author's choice of what to say can still be determined from the memory's contents (the information -- the author's choice -- is still there). After the memory is either entirely randomized (#1) or erased (#3), the author's choice cannot be determined from the memory's contents (the information is gone).
If you power a DRAM chip on and start reading from it without writing first, you will get some pattern of 0s and 1s out. It’s undefined only in the sense that nobody will make any particular promises about what that pattern will be;
Please. Without any promise as to what the pattern will be, it is as "undefined" as it gets. Undefined means undefined: no promises, no guarantees, no defined meaning.
And yet, promises or no, if you actually do the experiment it'll come out the same way time after time (as long as you left it off long enough). And anytime you have one outcome with probability 100%, its Shannon-entropy is going to be 0.
in practice, if the chip has been off long enough, it’ll be whatever the read/store circuitry treats the all-discharged state as.
And an “off” chip can’t be read. Its values are unknowable, it has no defined information. OTOH, as previously noted, if the memory is designed to retain value without power (e.g. bubble or magnetic) then its defined state doesn’t change without power and there is no loss of information, so that comparison remains moot.
I think it can be defined (the charge level on the capacitor exists even when you don't read it), but I have a feeling this may be one of those philosophical questions like "if a tree falls in the forest and nobody hears it, does it make a sound?" and we may just have to disagree. In any case, it doesn't matter to my argument: all I need is that after you turn it back on, it's in a predictable state and hence has lost its Shannon-entropy.
BTW, DRAM can drain even when powered on; each row must be read frequently to refresh its charges.
Indeed. All volatile memory has some designed-in "refresh" logic to ensure memory retains its "defined" values. It would be useless otherwise.
(This is irrelevant to the main discussion, I'm just being picky here.) Actually, the old-school DRAM I worked with had no built-in refresh logic; it had to be handled entirely by logic outside the DRAM chips themselves. Some modern chips have part of the logic onboard, but it (at least as I understand it) still needs to be externally triggered with a CAS before RAS cycle.
BTW, needing to add energy to get back to the original state does not confirm an entropy decrease. If anything, it does the opposite, as adding energy will generally increase entropy.
Adding energy is the only way to restore the previous “order”. Adding energy is the only way to move from a large multiplicity to a small multiplicity.
This is a very common misunderstanding (I found versions of it in both the hyperphysics pages you linked and the Motion Mountain book KF linked [near the bottom of page 299]). Nonetheless, it is a misunderstanding; in order to decrease entropy, you have to have heat or matter (or some more obscure things) leaving the system.

I think the reason this misunderstanding occurs is that machines that reduce entropy (like refrigerators) or stay at constant entropy despite doing things that produce entropy (like computers) need both a supply of nonthermal energy ("work" in thermo terminology) and a way to dispose of waste heat. Heat is generally easy to get rid of, but work is in limited supply, so we tend to concentrate on work as the critical element for keeping the machine operating. But it's actually the waste heat that's responsible for carrying away entropy.

To see this clearly, you need to consider situations where work input and heat output don't occur together. Take, for example, an inert object that's warmer than its surroundings. As it loses heat to its surroundings, its entropy will drop. If its surroundings are cold enough, it'll approach a temperature of absolute zero, and its entropy will also approach absolute zero. Now, take a situation where work is done on the system, but no heat flows: lifting a weight is a simple example. Its gravitational energy goes up, but there's no change in multiplicity or entropy.

Mind you, I'm talking about what's needed for an entropy decrease; that may or may not be the same as an increase in order. "Order" is not a well-defined term in thermodynamics, and our intuitive sense of "order" doesn't correspond well to any particular thermodynamic quantity.
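A rough numerical sketch of the "inert object cooling" case; the heat capacity and temperatures are made-up round numbers chosen only for illustration, with the surroundings treated as a constant-temperature reservoir.

```python
import math

# Made-up round numbers for illustration (not measurements):
C = 1000.0        # heat capacity of the object, J/K (assumed constant)
T_hot = 350.0     # initial object temperature, K
T_cold = 300.0    # surroundings temperature, K; the object ends up here

# Object: integrate dS = C*dT/T from T_hot down to T_cold.
dS_object = C * math.log(T_cold / T_hot)      # negative: the cooling object's entropy drops

# Surroundings (constant-temperature reservoir) absorb the released heat.
Q = C * (T_hot - T_cold)
dS_surroundings = Q / T_cold                  # positive

print(round(dS_object, 1))                    # ~ -154.2 J/K
print(round(dS_surroundings, 1))              # ~ +166.7 J/K
print(round(dS_object + dS_surroundings, 1))  # ~ +12.5 J/K: the total still rises
```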
Even in your “cold boot attack” example, after 5 minutes of power off there is no retrievable information. Regardless in any/every case, energy must be input to retrieve whatever might be left from an undefined DRAM state, but no promises.
Computer memory requires an energy input to read data, store data, erase data, or even preserve existing data unchanged; surely not all of these operations decrease entropy?
Gordon Davisson
May 26, 2011 at 02:16 PM PDT
Gordon Davisson: The comparison was entirely within scenario #2, where the high bit went from being 0 with 100% probability, to being 0 with 50% probability and 1 with 50% probability. The multiplicity and entropy are based on the probabilities of various states, not what the encoding allows.

The inconsistency is that there never was, by your definition, any information in the hi-order bit. All the information is in the lo-order 7 bits. That information has a multiplicity of 2: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0. You can't reasonably argue no change of information on the lo-order 7-bits while arguing an entropy change on all 8-bits. Compare before and after information and entropy on only 7-bits (apples) or compare before and after information and entropy on all 8-bits (oranges), but don't conflate the two comparisons as disproving Lewis.

If you power a DRAM chip on and start reading from it without writing first, you will get some pattern of 0s and 1s out. It’s undefined only in the sense that nobody will make any particular promises about what that pattern will be;

Please. Without any promise as to what the pattern will be, it is as "undefined" as it gets. Undefined means undefined: no promises, no guarantees, no defined meaning.

in practice, if the chip has been off long enough, it’ll be whatever the read/store circuitry treats the all-discharged state as.

And an "off" chip can't be read. Its values are unknowable, it has no defined information. OTOH, as previously noted, if the memory is designed to retain value without power (e.g. bubble or magnetic) then its defined state doesn't change without power and there is no loss of information, so that comparison remains moot.

BTW, DRAM can drain even when powered on; each row must be read frequently to refresh its charges.

Indeed. All volatile memory has some designed-in "refresh" logic to ensure memory retains its "defined" values. It would be useless otherwise.

BTW, needing to add energy to get back to the original state does not confirm an entropy decrease. If anything, it does the opposite, as adding energy will generally increase entropy.

Adding energy is the only way to restore the previous "order". Adding energy is the only way to move from a large multiplicity to a small multiplicity. Note further your response to:
But further note there is no information entropy definable for powered off RAM.
It’s reasonably well-defined in terms of what would happen if the memory were powered back on and read (see the cold boot attack link above for examples).
Precisely. Energy must be input to reduce entropy, i.e., to restore order from disorder, to restore a defined state from an undefined state. Even in your "cold boot attack" example, after 5 minutes of power off there is no retrievable information. Regardless, in any/every case, energy must be input to retrieve whatever might be left from an undefined DRAM state, but no promises.

There’s only one way to encode a given English message, but there are many possible English messages.

This is precisely the inconsistency to be avoided. Compare the loss of information when losing a given message with losing another given message (apples) or the loss of information when losing all possible messages with all possible messages (oranges), but don't conflate them.
Charles
May 24, 2011 at 01:01 PM PDT
Charles:
There seem some inconsistencies in your definitions/comparisons: By your definition, the “information” is the 7-bit ASCII letter (you explicitly affirm this by the phrase “the information was all in the lower 7 bits, and is intact”). Agreed. But further, the information is actually in 2 states: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0
If I understand what you're getting at here, a small clarification may resolve the inconsistency: I was assuming that the initial state had 0s in the high (8th) bit of each byte (I just neglected to mention this assumption). The situation you're describing only occurs in scenario #2 after the high bit is randomized.
[In scenario #2] ... no information was lost by randomizing the hi-order bit, but you state “an increase in entropy with no information loss”. But, to what other entropy state are you comparing? [...] If the comparison is strictly within scenario 2, wherein the hi-order bit being randomized is compared with the hi-order bit not being randomized, there are regardless the same 2 valid informational states having a multiplicity of 2 (as noted above), which is the same entropy, *not* an increase.
The comparison was entirely within scenario #2, where the high bit went from being 0 with 100% probability, to being 0 with 50% probability and 1 with 50% probability. The multiplicity and entropy are based on the probabilities of various states, not what the encoding allows.
In “Scenario #3: The RAM is powered off, and all of the charges storing the 1 bits drain gradually away. The memory is now all 0s,” As a practical matter of the example chosen, the state “without power” is in digital systems actually an undefined condition. Powerless dynamic RAM has neither 1's nor 0's. [...] But draining power from dynamic memory causes a loss of information. Even erased memory that was all 0's loses its “erased” condition when the power is turned off, and must be reinitialized to all 0's when power is restored.
If you power a DRAM chip on and start reading from it without writing first, you will get some pattern of 0s and 1s out. It's undefined only in the sense that nobody will make any particular promises about what that pattern will be; in practice, if the chip has been off long enough, it'll be whatever the read/store circuitry treats the all-discharged state as. I'll give you two authorities for this.

First, my personal experience as a computer repairman. Some of the really early Macintosh models used a section of main memory (which was DRAM) as their video buffer. Normally, when the Mac was powered on, it'd initialize that memory before the picture tube warmed up enough to display its contents; but if something prevented this (dead CPU, bad ROM, stuck reset line...) the video circuitry would read from the uninitialized DRAM and you'd get a characteristic pattern of jagged black-and-white blocks (the exact pattern depended on what memory chips were installed, and how their read/store circuitry was laid out).

Second, computer security researchers have recently been looking into recovering remnant data from powered-off DRAM (google keyphrase: "cold boot attack"). They have some nice images and video here showing an image of the Mona Lisa gradually draining to black and white bars showing exactly the phenomenon I'm talking about. BTW, DRAM can drain even when powered on; each row must be read frequently to refresh its charges.
The thermodynamic entropy [S = k ln W] of the ‘powered off’ RAM increased as its heat dissipates and its temperature drops. This is further confirmed by the amount of energy input required to restore RAM to its powered-on “erased” state.
Actually, when you turn it off and its temperature drops, its entropy will drop as well. As long as there's nothing else going on to complicate things, adding heat to a system always(*) increases its entropy, and removing heat always(*) decreases its entropy. In fact, that's how thermodynamic entropy is defined: for a reversible process, DeltaS (change in entropy) = DeltaQ/T (change in heat divided by absolute temperature). (* There is one sort-of exception: systems at negative absolute temperatures. But they're deeply weird and it's not clear they even strictly exist, so we should probably ignore them.)

BTW, needing to add energy to get back to the original state does not confirm an entropy decrease. If anything, it does the opposite, as adding energy will generally increase entropy. If you want to reverse an entropy increase you need something associated with an entropy flux out of the system: heat leaving the system, matter leaving the system, etc.
The information entropy [H = ln M] did not change because the information entropy of the single bit pattern comprising an ASCII letter is no different than the information entropy of the single bit pattern comprising all 0's; either can be encoded in only a single way. Note that M=1 (because there is only 1 way to encode all 0's or only 1 way to encode a specific ASCII letter) is the number of “Messages” (whose probability actually occurred) out of a 7-bit Message space (all possible ASCII letters).
There's only one way to encode a given English message, but there are many possible English messages. It's rather hard to determine exactly how many possible messages there are of a given length (and what their probabilities are), but the estimates I've seen put the Shannon-entropy of English at around 1 bit per letter. Since each message corresponds to a unique sequence of bytes of ASCII, the Shannon-entropy of the encoded message will be around 1 bit per byte. Put another way: the multiplicity of English messages 1000 characters long is around 2^1000; therefore the multiplicity of a 1000-byte block of memory containing an ASCII-encoded English message (with the high bits all 0) is also around 2^1000, and so its Shannon-entropy is log2(2^1000) bits = 1000 bits.
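For a sense of where such estimates come from, here is a small sketch using rough single-letter frequencies for English (approximate textbook values, not exact data). Single-letter statistics alone already pull the figure well below log2(26); conditioning on context pulls it down further, toward the ~1 bit per letter used above.

```python
import math

# Rough single-letter frequencies for English text (approximate values, in percent).
freq = {'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7, 's': 6.3,
        'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8, 'u': 2.8, 'm': 2.4,
        'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0, 'p': 1.9, 'b': 1.5, 'v': 1.0,
        'k': 0.8, 'j': 0.15, 'x': 0.15, 'q': 0.1, 'z': 0.07}
total = sum(freq.values())
probs = [f / total for f in freq.values()]

H_uniform = math.log2(26)                                # ~4.7 bits if all letters were equally likely
H_single_letter = -sum(p * math.log2(p) for p in probs)  # ~4.2 bits from letter frequencies alone

print(round(H_uniform, 2), round(H_single_letter, 2))
# Spelling, grammar, and meaning reduce the per-letter entropy further,
# toward the ~1 bit per letter figure discussed in this thread.
```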
But comparatively, the information entropy is insignificant relative to the thermodynamic entropy (because of the relatively huge number of semiconductor ‘microstates’ and Boltzmann’s constant).
...I'm in complete agreement here...
But further note there is no information entropy definable for powered off RAM.
It's reasonably well-defined in terms of what would happen if the memory were powered back on and read (see the cold boot attack link above for examples). If that's too hypothetical, you could define it in terms of the charge on the individual capacitors: if a cap is more than half charged, consider it a 1, less than half charged consider it a 0. Note that depending on where the read circuit's threshold is, these two definitions may be equivalent.
If however you intended to simply re-initialize or “erase” all bits to 0's, magically without an expenditure of energy input, there is still an inconsistency of definition. The original memory definition was 8-bit bytes, the hi-order bit being ignored and the lo-order 7 bits containing specific information (an ASCII letter). The multiplicity of the original ASCII letter state (ignoring the hi-order bit) is 1 (there is only a single bit pattern that comprises the ASCII letter). But the multiplicity of the all 0's state is also 1 (there is only a single bit pattern that comprises all 0's). To change from the specific ASCII letter state to the specific all 0's state involves no net change in multiplicity, hence no net change in information entropy, but the state change is irreversible in that the information state of the ASCII letter is irretrievably lost.
Again, there are many possible English messages, so the initial multiplicity is much higher than 1 and the initial entropy higher than 0.
Landauer argued:
… there is no thermodynamic objection to a logically reversible operation potentially being achieved in a physically reversible way in the system. It is only logically irreversible operations — for example, the erasing of a bit to a known state, or the merging of two computation paths — which must be accompanied by a corresponding entropy increase. When information is physical, all processing of its representations, i.e. generation, encoding, transmission, decoding and interpretation, are natural processes where entropy increases by consumption of free energy.[4]
So in fact, scenario 3 is essentially “erasure” and represents an irreversible loss of the ASCII letter information state but with no net change in information entropy because the multiplicity of the ASCII and erased states are the same, but a real world gain of thermodynamic entropy when RAM is powered off (as confirmed by energy input being required to restore prior state).
This is indeed erasure, but there is a decrease of information entropy (as I argued above). There need not be a gain of thermodynamic entropy. There must be an increase in thermal (or spin or...) entropy, but only enough to compensate for the loss of information entropy and avoid an overall (thermodynamic) entropy decrease. Here's how Landauer puts it (note that he assumes erasing to all 1s instead of all 0s):
Consider a statistical ensemble of bits in thermal equilibrium [i.e. half 0s and half 1s -GD]. If these are all reset to ONE, the number of states covered in the ensemble has been cut in half. The entropy therefore has been reduced by k log_e 2 = 0.6931 k per bit [this is a change in what I'm calling the information component of entropy -GD]. The entropy of a closed system, e.g., a computer with its own batteries, cannot decrease; hence this entropy must appear elsewhere as a heating effect, supplying 0.6931 kT per restored bit to the surroundings. This is, of course, a minimum heating effect, and our method of reasoning gives no guarantee that this minimum is in fact achievable.
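The quoted figure is easy to check numerically; at an assumed room temperature of 300 K, the Landauer minimum works out to a few zeptojoules of waste heat per bit erased.

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # assumed room temperature, K

E_min = k * T * math.log(2)
print(E_min)       # ~2.87e-21 J: minimum waste heat per bit erased at this temperature
```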
[I'll skip the sections of your summary I've dealt with above...]
In scenario 1 [complete randomization -GD], we are agreed on “an information loss linked to an increase in Shannon-entropy” but no, as to a thermodynamic entropy decrease as well as some energy input is required to effect a new (randomized) equilibrium. The system is not isolated as energy (electric or magnetic) must be input to effect the state change.
I don't see any inherent requirement for an energy input here (nor any obstacle to this occurring in isolation). Quite the opposite: if you reverse the reasoning in the Landauer quote I gave above, you'll see that the increase in entropy inherent in increasing the number of possible states of the memory could, at least in theory, be coupled to a decrease in thermal entropy, in the form of some heat being converted to free energy. In the paper I cited by Charles Bennett, he actually describes a (theoretical) mechanism to randomize memory while converting heat to work. (His mechanism requires starting with completely blank memory, but it should in principle be generalizable to work with semi-predictable memory like something containing English text.)
Lewis noted: “Gain in entropy always means loss of information, and nothing more”. In scenarios 1 and 3 there was a loss of information and a related gain in thermodynamic entropy. In scenario 2, there was no loss of information, but neither was there a loss of thermodynamic entropy for the lo-order 7 bits comprising the ASCII letter. In all three scenarios, Lewis’s point has not been disproven.
I'll stand by my original claims, but add an additional one: none of the three scenarios necessarily involves an increase in overall thermodynamic entropy. In each case, the change in information entropy could (at least in theory) be coupled to an equal-and-opposite change in thermal entropy, leaving the total unchanged. In practice, of course, there'll always be an overall increase in entropy; but there's no particular limit on how small this increase could be.
Gordon Davisson
May 22, 2011 at 11:43 PM PDT
Mung:
While it is true that thermodynamic entropy and thermal entropy are not the same thing, would you agree or disagree that entropy in both has the same meaning if entropy is understood as “degrees of freedom” in the system? Can we connect “degrees of freedom” with “multiplicity” and can both be described by the same equation?
"Degrees of freedom" are ways of factoring a system's multiplicity. Hmm, that reads a lot like gibberish, right? Let me give an example that'll hopefully clarify: Suppose we have a billiard ball on a billiard table. There are a large number of possible locations the ball might be in. If we wanted to calculate the number of possible positions, we could figure out how many possibilities there are for the ball's position's X coordinate (the width of the table divided by some quantum limit) and similarly how many Y coordinates (length, divided by quantum limit) and multiply them together to get the total number of positions. The X and Y coordinates are examples of what I'm calling degrees of freedom. If there are Nx possible X coordinates and Ny possible Y coordinates (and a total of Nx*Ny possible positions), then then X-entropy will be k*ln(Nx), the Y-entropy will be k*ln(Ny), and the total entropy of the ball's position will be k*ln(Nx*Ny) = X-entropy + Y-entropy. We've broken the ball's entropy into horizontal and vertical components! There are a couple of possible complications I've ducked: it might be that not all coordinate combinations are possible (e.g. there are bumpers in the middle of the table, and the ball can't occupy the same space as them); in that case, there'll be fewer total possibilities than this calculation gives, and hence less total entropy. In that case, we'd say that the degrees of freedom are not independent, and so the calculation breaks down. The other complication is that some possibilities are more likely than others; this makes the math messier, but as long as the degrees of freedom are statistically independent, the total entropy will still be the sum of the X- and Y-entropies. Also, I've also only looked at the ball's position. The ball also might be in a large number of orientations, so if you want to know how many states the ball might be in you'd also have to figure out how many possible orientations the ball could have (No), and the multiply No into the multiplicity and add k*ln(No) into the entropy. And the ball might be moving, so there'll be a range of X- and Y-velocities the ball might have, as well as a range of rotational velocities. (Note that the rotational velocity tends to be at-least-mostly determined by the linear velocity, so it's not going to be an independent degree of freedom.) Again, calculate the number of possibilities, and it multiplies into the total multiplicity and its entropy adds to the total entropy. In general, one doesn't bother to track each degree of freedom individually (as I've mostly done above), but categorize them into convenient groups. One might, for example, group the configuration (position and orientation) degrees together, and and the motion (linear and rotational) together, and then it'd make sense to think of the configurational and motion entropies as contributions to the total entropy (although since they aren't properly independent, their total will be a little more than the total entropy). There's also an elephant in the room that I've been carefully ignoring: all of the degrees of freedom of the atoms within the ball. While each atom has a very limited range of positions and momenta it might have, there are a ridiculously huge number of them, so the entropy contribution from these microscopic degrees of freedom completely swamp the macroscopic contributions I've been talking about. One can try to categorize these degrees of freedom (e.g. 
configurational entropy has to do with how many ways the polymer strands could tangle together), but usually they're all too tightly linked and it's easier to just lump them all together as thermal entropy (because they mostly have to do with exactly how the system's thermal energy is scattered among the various atoms and their degrees of freedom). Is that a bit clearer?
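A tiny sketch of the additivity point, with made-up counts of distinguishable X and Y positions standing in for the quantum-limited counts described above.

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

# Made-up counts of distinguishable positions along each axis (illustration only).
Nx = 10**6
Ny = 2 * 10**6

S_x = k * math.log(Nx)
S_y = k * math.log(Ny)
S_total = k * math.log(Nx * Ny)

# For independent degrees of freedom, multiplicities multiply and entropies add.
print(math.isclose(S_total, S_x + S_y))   # True
```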
And is it possible, if there is an increase in a system’s thermal degrees of freedom or an increase in all of the system’s degrees of freedom, that there is a loss of something that can be associated with the concept of information? If there is an increase in freedom is there also a corresponding increase in uncertainty in some sense?
Usually, the degrees of freedom themselves don't change, but the multiplicity of states within them does. For example, if you add heat to a system, you get an entropy increase because you have more energy scattered among the same degrees of freedom (and hence more ways to scatter it).

There are ways to relate entropy to information, but they don't have a great deal to do with the sort of information most people think of. The thermodynamic entropy of a system is proportional to the amount of information needed to fully describe the exact microstate of that system. But this is a huge amount of uninteresting information (think about book after book cataloging the location of every atom...) -- there might be some interesting information mixed in, but it's going to be completely swamped by the thermal component. (Note that you can think of this either as positive information -- information inherent in the system's state -- or negative information -- information we don't have about the system's state. So it's a little hard to define what's an increase vs. what's a decrease.)

To give you some idea what I mean by "swamped", consider that the absolute entropy of 1 gram of water (= 1 cc = about a thimbleful) at 1 atmosphere of pressure and 25 degrees Celsius (= 77 Fahrenheit) is 3.88 Joules/Kelvin, which corresponds to 4.05e23 bits (= 50.7 zettabytes) of information. That's safely more than the total data processed by the world's servers last year. Heating that 1 gram of water by 1 degree Celsius (= 1.8 degrees Fahrenheit) would increase its entropy by 0.0140 Joules/Kelvin, which would correspond to 1.46e21 bits of additional information needed to specify its microstate. Cooling the water would decrease its entropy by about the same amount. But none of this corresponds to anything most people would recognize as information...
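The conversion behind those numbers is just division by k*ln(2); a short sketch using the figures quoted above:

```python
import math

k = 1.380649e-23                     # Boltzmann's constant, J/K
bits_per_JK = 1 / (k * math.log(2))  # bits of microstate information per J/K of entropy

S_water = 3.88    # J/K, absolute entropy of ~1 g of water at 25 C (figure quoted above)
dS_heat = 0.0140  # J/K, entropy change from heating it by 1 C (figure quoted above)

print(S_water * bits_per_JK)   # ~4.05e23 bits (~50 zettabytes)
print(dS_heat * bits_per_JK)   # ~1.46e21 bits
```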
Gordon Davisson
May 21, 2011 at 08:38 AM PDT
Mung: Why doesn’t Shannon Information require a measurement?

Because, by definition, Shannon's own definition to be precise, information is in units of binary digits (bits) and in digital systems (i.e. information systems) or their examples (e.g. coin toss, dice) we count the number of bits, which is how we take a measurement: we simply count. We can count bits because all we need to agree on is the definition of what determines the binary states (i.e. 1's and 0's, heads or tails, black or white, etc.) and then we count or compute. We don't need to "measure" with some instrument the value of either 1 or 0, or longer strings thereof. But in analog systems, there is an infinitude of continuous values. In an analog context we are concerned with more than pure black or white (which was an arbitrary binary choice); rather we become concerned with all the shades of grey in between and what is their reflectivity, absorption, granularity, resolution, and because we can no longer simply count binary states we now have to resort to measurements with various instruments. In the analog world, one seldom finds either perfect extreme of black or white, but instead some mixture of grey, and measurement is the only way to know its state or value, and even then measurements will be imperfect. The answer to your question basically lies in the difference between analog and digital.

Accordingly, the answer to "How much information is there in a two-sided coin, and why?" is 1 bit, because as opposed to the monetary realm wherein a coin may have the values of 1 cent, 5 cents, 10 cents, 25 cents, 50 cents, 1 dollar, 20 dollars, 50 dollars, etc. (considering only US coins), in the digital realm, by definition, a coin only has two values (heads or tails) and those two values can be represented with a single bit having a value of either 1 or 0.

If we substitute a 6 sided die, now what is the information content, and why? What are the probability distributions, and how are they related to the information content?

I gave you a link earlier that showed this: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop2.html It answered many of your questions.

I agree that we need to consider the number of states, but just what, exactly, are we talking about? Microstates. Macrostates. Probabilities. Distributions?

In binary digital systems, a "state" is simply the value of the bit, byte, word etc. For example, a bit can have either of 2 states: 1 or 0. A byte can be in any of 256 states because 8-bits can encode the values 0-255. A "macrostate" generally refers to a particular "combination" while a "microstate" generally refers to a "permutation". See again the above link.

We randomize 8 bits. That gives those 8 bits the maximum Shannon entropy. Why?

Because a randomized 8-bits can have any of 256 possible values (the "byte" can be in any of 256 possible states). The maximum Shannon entropy (or maximum information entropy) is just the maximum possible number of states, which for an 8-bit byte is 256.

How is it that any physical system could increase or decrease in information unless we were asking a question or taking a measurement?

How is it that a tree can fall in the forest if no one is present to see it?
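For equally likely states, the Shannon entropy is the base-2 logarithm of the number of states, so the 256 possible byte values correspond to 8 bits; a minimal sketch of the counting:

```python
import math

for label, states in [("two-sided coin", 2), ("six-sided die", 6), ("8-bit byte", 256)]:
    print(label, math.log2(states), "bits")
# two-sided coin: 1.0 bit, six-sided die: ~2.585 bits, 8-bit byte: 8.0 bits
```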
Charles
May 21, 2011 at 06:16 AM PDT
Scenario #1: The contents of the RAM are randomized. Its Shannon-entropy rises to the maximum possible, 8 bits per byte. (This is very similar to what happens in Charles’ example with S-O-S encoded in cans.)
I love talking 1's and 0's. I can understand 1's and 0's. On/Off, Yes/No, True/False. Black/White. Two-sided coin. Heads/Tails. How much information is there in a two-sided coin, and why? If we substitute a 6 sided die, now what is the information content, and why? What are the probability distributions, and how are they related to the information content? Surely our uncertainty is lower with a two-sided coin. I mean, only two options, right? Heads or tails. And if our uncertainty is lower, is the information content higher? But is our uncertainty really higher with the 6-sided die? What if all we are interested in is even/odd? 2/4/6 vs 1/3/5?
By your definition, the “information” is the 7-bit ASCII letter (you explicitly affirm this by the phrase “the information was all in the lower 7 bits, and is intact”). Agreed. But further, the information is actually in 2 states: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0
I agree that we need to consider the number of states, but just what, exactly, are we talking about? Microstates. Macrostates. Probabilities. Distributions? We randomize 8 bits. That gives those 8 bits the maximum Shannon entropy. Why? Why doesn't Shannon Information require a measurement? How is it that any physical system could increase or decrease in information unless we were asking a question or taking a measurement?
Mung
May 20, 2011 at 07:50 PM PDT
BA77:
But one thing I do note is that Landauer’s principle is fairly rigorously established, and the technical violation of it, which I listed in 75, was only stated ‘as possible’ i.e. they did not actually violate the principle but only showed it to be possible! Thus, I’m very reserved to accept your conclusion.
I don't think any of my conclusions depend on the spin paper, and I'm not advocating a violation of Landauer's principle (other than maybe generalizing it to include spin entropy as well as thermal). Only the third of my scenarios involved erasure (in the strict sense where Landauer applies), and there I certainly agree there'll need to be a compensating entropy increase of some sort.
Gordon Davisson
May 20, 2011 at 04:13 PM PDT
Charles, for someone just 'taking a stab at it', I would say that is a very admirable 'stab' at clarifying. Thanks once again!
bornagain77
May 20, 2011 at 11:12 AM PDT
Gordon Davisson: I'll take a stab at clarifying the applicability of Lewis' point.

Between the three scenarios, we’ve got information loss with an increase in Shannon-entropy [scenario #1], an information loss with a decrease in entropy [scenario #3], and an increase in entropy with no information loss [scenario #2].

There seem some inconsistencies in your definitions/comparisons: By your definition, the "information" is the 7-bit ASCII letter (you explicitly affirm this by the phrase "the information was all in the lower 7 bits, and is intact"). Agreed. But further, the information is actually in 2 states: a) ASCII letter with the hi-order bit=1 and b) ASCII letter with the hi-order bit=0

Between the three scenarios, we’ve got ... information loss with an increase in Shannon-entropy

This is "Scenario #1: The contents of the RAM are randomized." Memory is randomized, information is lost and multiplicity (entropy) is maximum (because there are 128 ways the lo-order 7-bits can be randomized and not retain the original ASCII letter).

... an increase in entropy with no information loss

This is "Scenario #2: Only the 8th bit of each byte of RAM is randomized ... But the Shannon-entropy has now risen to 2 bits per byte." Yes, you have a Shannon entropy of 2 because as per your definitions, the high-order bit is not used in the ASCII letter, hence there are 2 valid states containing the ASCII letter (as noted above). So, agreed, no information was lost by randomizing the hi-order bit, but you state "an increase in entropy with no information loss". But, to what other entropy state are you comparing? In scenario 1 all 8-bits were randomized, representing the highest multiplicity (or entropy) possible, and by comparison of scenario 1 with scenario 2, scenario 2 has lower entropy than scenario 1. If the comparison is strictly within scenario 2, wherein the hi-order bit being randomized is compared with the hi-order bit not being randomized, there are regardless the same 2 valid informational states having a multiplicity of 2 (as noted above), which is the same entropy, *not* an increase.

... an information loss with a decrease in entropy, ...

In "Scenario #3: The RAM is powered off, and all of the charges storing the 1 bits drain gradually away. The memory is now all 0s," As a practical matter of the example chosen, the state "without power" is in digital systems actually an undefined condition. Powerless dynamic RAM has neither 1's nor 0's. If you're going to argue bubble or magnetic memory that doesn't need power to maintain state, then draining power is of no consequence because the bits won't change state (ignoring long-term decay) and no information is lost. But draining power from dynamic memory causes a loss of information. Even erased memory that was all 0's loses its "erased" condition when the power is turned off, and must be reinitialized to all 0's when power is restored.

The thermodynamic entropy [S = k ln W] of the 'powered off' RAM increased as its heat dissipates and its temperature drops. This is further confirmed by the amount of energy input required to restore RAM to its powered-on "erased" state. The information entropy [H = ln M] did not change because the information entropy of the single bit pattern comprising an ASCII letter is no different than the information entropy of the single bit pattern comprising all 0's; either can be encoded in only a single way. Note that M=1 (because there is only 1 way to encode all 0's or only 1 way to encode a specific ASCII letter) is the number of "Messages" (whose probability actually occurred) out of a 7-bit Message space (all possible ASCII letters). But comparatively, the information entropy is insignificant relative to the thermodynamic entropy (because of the relatively huge number of semiconductor 'microstates' and Boltzmann's constant). But further note there is no information entropy definable for powered off RAM.

If however you intended to simply re-initialize or "erase" all bits to 0's, magically without an expenditure of energy input, there is still an inconsistency of definition. The original memory definition was 8-bit bytes, the hi-order bit being ignored and the lo-order 7 bits containing specific information (an ASCII letter). The multiplicity of the original ASCII letter state (ignoring the hi-order bit) is 1 (there is only a single bit pattern that comprises the ASCII letter). But the multiplicity of the all 0's state is also 1 (there is only a single bit pattern that comprises all 0's). To change from the specific ASCII letter state to the specific all 0's state involves no net change in multiplicity, hence no net change in information entropy, but the state change is irreversible in that the information state of the ASCII letter is irretrievably lost.

Landauer argued:
... there is no thermodynamic objection to a logically reversible operation potentially being achieved in a physically reversible way in the system. It is only logically irreversible operations — for example, the erasing of a bit to a known state, or the merging of two computation paths — which must be accompanied by a corresponding entropy increase. When information is physical, all processing of its representations, i.e. generation, encoding, transmission, decoding and interpretation, are natural processes where entropy increases by consumption of free energy.[4]
So in fact, scenario 3 is essentially "erasure" and represents an irreversible loss of the ASCII letter information state but with no net change in information entropy because the multiplicity of the ASCII and erased states are the same, but a real world gain of thermodynamic entropy when RAM is powered off (as confirmed by energy input being required to restore prior state). So where are we now? In scenario #3, we have an information loss linked to a decrease in Shannon-entropy and an increase in thermal or spin or some other component of the thermodynamic entropy. In #1, we have an information loss linked to an increase in Shannon-entropy, and possibly a decrease in thermal, spin, etc entropy. In #2, we have no loss of information, but an increase in Shannon-entropy, and again possibly a decrease in thermal, spin, etc entropy. As before, I don’t see any direct link between information loss and any particular change in any particular type of entropy. Summarizing my agreement and disagreement and taking Shannon entropy to mean information entropy, then: In scenario 3 we disagree, as there is both an irreversible loss of the ASCII information and also a gain of "all 0's" information (an equally probable albeit different logical state), but with an increase in thermodynamic entropy. I've not yet taken a position on whether that cost can be paid with spin. In scenario 1, we are agreed on "an information loss linked to an increase in Shannon-entropy" but no, as to a thermodynamic entropy decrease as well as some energy input is required to effect a new (randomized) equilibrium. The system is not isolated as energy (electric or magnetic) must be input to effect the state change. In scenario 2, we are agreed there is no information loss, but disagree as neither is there an increase in Shannon entropy nor any possible decrease in thermodynamic entropy as there was no change to the lo-order 7-bits encoding the ASCII letter, but some energy input required to effect a new equilibrium (randomize only the hi-order bit), as again the system is not isolated as energy (electric or magnetic) must be input to effect the randomizing state change. Lewis noted: "Gain in entropy always means loss of information, and nothing more". In scenarios 1 and 3 there was a loss of information and a related gain in thermodynamic entropy. In scenario 2, there was no loss of information, but neither was there a loss of thermodynamic entropy for the lo-order 7 bits comprising the ASCII letter. In all three scenarios, Lewis's point has not been disproven.Charles
May 20, 2011 at 10:58 AM PDT
Gordon Davisson @106:
(Note: there’s an opportunity for confusion here: thermodynamic entropy and thermal entropy are not the same thing. Thermal entropy is just the entropy in a system’s thermal degrees of freedom, while the thermodynamic entropy is the total entropy in all of the system’s degrees of freedom. Thermal entropy is one contribution to the thermodynamic entropy, but not the whole thing.)
Hi Gordon. Thanks for your posting. While it is true that thermodynamic entropy and thermal entropy are not the same thing, would you agree or disagree that entropy in both has the same meaning if entropy is understood as "degrees of freedom" in the system? Can we connect "degrees of freedom" with "multiplicity" and can both be described by the same equation? And is it possible, if there is an increase in a system’s thermal degrees of freedom or an increase in all of the system’s degrees of freedom, that there is a loss of something that can be associated with the concept of information? If there is an increase in freedom is there also a corresponding increase in uncertainty in some sense?
Mung
May 20, 2011 at 09:11 AM PDT
Gordon this tidbit may interest you: Maxwell's demon demonstration turns information into energy - November 2010 Excerpt: Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html
bornagain77
May 19, 2011 at 06:04 PM PDT
Gordon, not that I am all that well versed on the technicalities of Landauer's principle or even the technicalities of Maxwell's demon which accompanies it, and indeed I scarcely followed some of the technical points of your own argument. But one thing I do note is that Landauer's principle is fairly rigorously established, and the technical violation of it, which I listed in 75, was only stated 'as possible' i.e. they did not actually violate the principle but only showed it to be possible! Thus, I'm very reserved to accept your conclusion.
bornagain77
May 19, 2011 at 04:34 PM PDT
“Gain in entropy always means loss of information, and nothing more.” Gilbert Newton Lewis
I think the relation between information and entropy is rather less direct than this. To illustrate, consider three scenarios. All will start with the same thing: some computer memory containing ASCII-encoded English text. English generally has around 1 bit of Shannon (information-theoretic) entropy per letter, so the Shannon-entropy of this computer memory starts at 1 bit per byte of RAM.

Scenario #1: The contents of the RAM are randomized. Its Shannon-entropy rises to the maximum possible, 8 bits per byte. (This is very similar to what happens in Charles' example with S-O-S encoded in cans.)

Scenario #2: Only the 8th bit of each byte of RAM is randomized. Since ASCII only uses 7 bits (as opposed to more modern encodings that use all 8 bits to allow more characters), the information was all in the lower 7 bits, and is intact. But the Shannon-entropy has now risen to 2 bits per byte.

Note that neither of the above is particularly realistic. Computer memory (specifically dynamic RAM) stores information as the presence or absence of an electric charge in a very small capacitor (I'll assume 1=charged, 0=uncharged, although it actually depends on the read/store circuitry). For the third scenario, let me look at what actually happens…

Scenario #3: The RAM is powered off, and all of the charges storing the 1 bits drain gradually away. The memory is now all 0s, so there's only one possible state, and the Shannon-entropy is zero. (Note: if the memory doesn't follow the uncharged=0 convention, the contents won't be zero, but there'll still be only one possible state, so the entropy will still be zero.)

Between the three scenarios, we've got information loss with an increase in Shannon-entropy, an information loss with a decrease in entropy, and an increase in entropy with no information loss. Based on this, I don't think you can make any direct correlation between information loss and entropy change.

Now, to make things more complicated, let me try adding thermodynamic entropy to the mix. According to theory (at least as I understand it), information stored in a physical system contributes to the thermodynamic entropy of the system at the rate of 1 bit of Shannon entropy = k (Boltzmann's constant) * ln(2) = 9.57e-24 Joule/Kelvin of thermodynamic entropy. This contribution is very small, and as far as I know it's never actually been measured; but I find the arguments for it convincing, so I'll assume it's real.

What does this do with the scenarios above? Well, in scenario #3, it means that the decrease in Shannon-entropy corresponds to a (very small) decrease in thermo-entropy, which (per the second law) can only occur if it's coupled to an at-least-as-large increase in some other entropy component, so thermo-entropy overall doesn't go down (this is Landauer's principle, as BA77 mentioned in #75). And in real life, that's certainly what happens: as the charge in the memory cells drains away, its energy is converted to heat, increasing the thermal component of entropy by E/T (that is, the drained energy divided by the temperature). As long as the energy per bit is much larger than kT (which it is), this'll outweigh the decrease due to the lost Shannon-entropy, and the second law is satisfied.

(Note: there's an opportunity for confusion here: thermodynamic entropy and thermal entropy are not the same thing. Thermal entropy is just the entropy in a system's thermal degrees of freedom, while the thermodynamic entropy is the total entropy in all of the system's degrees of freedom. Thermal entropy is one contribution to the thermodynamic entropy, but not the whole thing.)

If I understand the research BA77 linked in #75 (I haven't read the original paper), the entropy decrease could also be compensated by an increase in spin entropy. This isn't really all that surprising, as various forms of entropy are generally interconvertible, as long as the total doesn't decrease.

Now, in scenarios #1 and #2, there's an increase in Shannon-entropy; this could theoretically be coupled to a decrease in some other component of entropy (thermal, spin, whatever). Charles Bennett has proposed an information-powered heat engine that'd do more-or-less this, although it's never been built (see section 5 of Bennett, Charles H. (1982), "The thermodynamics of computation -- a review", _International Journal of Theoretical Physics_, v. 21, pp. 905-940).

So where are we now? In scenario #3, we have an information loss linked to a decrease in Shannon-entropy and an increase in thermal or spin or some other component of the thermodynamic entropy. In #1, we have an information loss linked to an increase in Shannon-entropy, and possibly a decrease in thermal, spin, etc entropy. In #2, we have no loss of information, but an increase in Shannon-entropy, and again possibly a decrease in thermal, spin, etc entropy. As before, I don't see any direct link between information loss and any particular change in any particular type of entropy.
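To put rough numbers on the "energy per bit is much larger than kT" point: the cell capacitance and voltage below are assumed order-of-magnitude values chosen for illustration, not datasheet figures.

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # assumed temperature, K

# Assumed order-of-magnitude DRAM cell: a few tens of femtofarads at roughly a volt.
C_cell = 30e-15    # F (assumption for illustration)
V_cell = 1.5       # V (assumption for illustration)
E_bit = 0.5 * C_cell * V_cell**2   # energy stored in one charged cell, J

dS_thermal = E_bit / T             # thermal entropy produced when that charge drains away as heat
dS_shannon = k * math.log(2)       # thermodynamic equivalent of the one bit of Shannon-entropy lost

print(E_bit, dS_thermal, dS_shannon)
print(dS_thermal / dS_shannon)     # ~1e7: the thermal term swamps the information term
```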
Gordon Davisson
May 19, 2011 at 04:11 PM PDT
Mung: Does [S = log W] then make sense? Yes. Though, more precisely, Entropy is the natural log of Multiplicity, and is also sometimes (historically) expressed as Boltzmann's constant times the natural log of Multiplicity.
Charles
May 18, 2011 at 01:27 PM PDT
If all that’s needed for an increase in entropy is an increase in multiplicity, why isn’t that alone sufficient to bring about a loss of information? It is.
So on the cover of A Farewell to Entropy is the following: S = log W. Does that then make sense?
Mung
May 18, 2011 at 11:03 AM PDT
Mung: It assumes that you take entropy to be a physical quantity.

Entropy is a physical quantity; whether derived from a system's heat and temperature, or from a system's multiplicity, it is a physical quantity.

What is the physical/material quantity that is being changed in your cans example

Their physical orientation. The physical directions in which they face.

Changing the orientation of the cans is an increase in entropy?

In the example I gave, yes. There is a very large "multiplicity" of orientations which have no informational meaning but only 1 orientation which means S-O-S. A change of state from the "multiplicity" of 1 meaning S-O-S to any of the very much larger "multiplicity" orientation states having no informational meaning is an increase in entropy.

And if we change them back to spell out SOS again, is that a decrease in entropy?

Yes. And that requires energy input to the system.

You don't think that's a fruitful idea to explore? That there might be a relationship between multiplicity and information such that an increase in the one would count as a loss in the other?

Its fruit has been explored and even indexed by Google, however novel you may find it.

If all that's needed for an increase in entropy is an increase in multiplicity, why isn't that alone sufficient to bring about a loss of information?

It is.
Charles
May 17, 2011 at 06:39 AM PDT
You had asked “how can a change in a physical/material quantity cause a loss of something non-physical/non-material?"
We're talking about entropy. How can an increase in entropy cause a loss of information. My question should be understood in that context. Here's what I wrote:
IOW, if entropy is physical, and information is not, how can a change in a physical/material quantity cause a loss of something non-physical/non-material?
It assumes that you take entropy to be a physical quantity. I don't think that a message written in the snow that disappears when the snow melts is what Lewis had in mind. Do you? What is the physical/material quantity that is being changed in your cans example and my snow example? Can it be shown to be equivalent to the change in entropy? Changing the orientation of the cans is an increase in entropy? And if we change them back to spell out SOS again, is that a decrease in entropy? Or do we also get information gain through increase in entropy? You wrote:
So an increase in multiplicity is an increase in entropy.
You don't think that's a fruitful idea to explore? That there might be a relationship between multiplicity and information such that an increase in the one would count as a loss in the other? You even go on to write:
...which state(s) for the purposes of encoding information have a commensurately greater disorder...
So you even make the relationship between encoding and multiplicity. In the cans and morse code example, what would correspond to an increase in multiplicity? An increase in possible encodings? An increase in possible messages? Lewis (1930):
Gain in entropy always means loss of information, and nothing more.
Or to connect this with your words:
So an increase in multiplicity is an increase in entropy, which means loss of information, and nothing more.
If all that's needed for an increase in entropy is an increase in multiplicity, why isn't that alone sufficient to bring about a loss of information?

Mung
May 16, 2011 at 10:36 PM PDT
Mung: What you are talking about, essentially, seems to be noise entering a communication channel.

No, I gave a clear example of how a change in a physical encoding medium can lose non-physical information. The original signal (S-O-S) was not obscured by noise; the signal itself (the cans) was disordered. The same disordering could be caused by you deliberately changing the orientation of the cans yourself. The mechanism by which the cans are disordered/re-ordered is irrelevant. The point remains: the information conveyed in their ordering/orientation is intangible, abstract, and immaterial, while the encoding medium is physical. You had asked "how can a change in a physical/material quantity cause a loss of something non-physical/non-material?", not how can noise (a physical phenomenon) obscure a signal (another physical phenomenon).

Charles
May 16, 2011 at 06:38 PM PDT
p.s. Getting back to multiplicity: I think there is a direct correlation between the coin example I've offered and a physical system in which there is an increased number of microstates (from 8 to 16) consistent with a given macrostate. You now "know less" than you did before about which box contains the coin.

Mung
May 16, 2011 at 03:08 PM PDT
Assume, hypothetically...
Hi Charles, I'm not sure how this relates to an increase in entropy. Assume, hypothetically, that I wrote HELP in the snow and the sun came out and melted the snow (an increase in entropy?). But then I wouldn't need help. :)

To be sure, Lewis (1930) wrote before Shannon (1948), but he also wrote after Hartley (1928). One wonders what he had in mind. What you are talking about, essentially, seems to be noise entering a communication channel. I don't see that as an increase in entropy in either a physical or a communications-theory sense of the term.

Mung
May 16, 2011 at 03:02 PM PDT
How much information would you require?
In my "coin in a box" example in @95 above, consider answering the question of how much information is needed, in bits, as the number of yes/no questions you would need answered in order to locate the coin (counted in the sketch below).
Information must not be confused with meaning.... To be sure, this word information in communication theory relates not so much to what you do say, as to what you could say. That is, information is a measure of one's freedom of choice when one selects a message. If one is confronted with a very elementary situation where he has to choose one of two alternative messages, then it is arbitrarily said that the information, associated with this situation, is unity.... The amount of information is defined, in the simplest cases, to be measured with the logarithm of the number of available choices. (Shannon and Weaver 1949, 8-9)
Mung
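One way to do that counting, sketched in code (assuming the questioner always asks a question that rules out half of the remaining boxes, which is the best strategy when the coin is equally likely to be in any box):

def questions_needed(boxes):
    # Count the yes/no questions needed to find one coin among `boxes`
    # equally likely boxes, halving the candidates with each answer.
    questions = 0
    while boxes > 1:
        boxes = (boxes + 1) // 2    # each answer eliminates half the boxes
        questions += 1
    return questions

print(questions_needed(8))    # 3 questions, i.e. 3 bits
print(questions_needed(16))   # 4 questions, i.e. 4 bits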
May 16, 2011 at 02:47 PM PDT
Mung: IOW, if entropy is physical, and information is not, how can a change in a physical/material quantity cause a loss of something non-physical/non-material?

Assume, hypothetically, that your phone and power are out at your mountain cabin, in winter, your vehicle is stuck in snow drifts, and temps are sub-zero and will be for weeks. To passersby, you signal an "S-O-S" in morse code (3 short, 3 long, 3 short) by putting 9 tin cans on a fence rail in front of your cabin: 3 cans horizontal end-on facing the road, 3 horizontal length-wise, and 3 more horizontal end-on facing the road. Obviously your call for help (information) encoded in the cans (physical medium) is only understood by those who know both morse code and the meaning of S-O-S (an intangible mental agreement on meaning).

Assume further that a wind gust disturbs the cans so they all randomly spin around, such that some are at an angle and some have the opposite orientation they had previously. The S-O-S (information) is no longer encoded; it exists (metaphysically?) only in your mind (encoded in neurons, yes/no?), but all 9 cans (the medium) remain with every physical quantity and attribute (metallurgical composition, density, mass, reflectivity, tensile strength, static charge, etc.) exactly as before on the fence rail.

Information was lost but the physical encoding medium remains, although its "state" has changed (from an ordered position to a disordered position), and energy is required to go back outside and twist them back into the proper, ordered S-O-S orientation, i.e. to re-encode the information as before.

Charles
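A small sketch of that encoding convention (the particular orientation-to-morse mapping here is an assumption for illustration only):

# Encode S-O-S as can orientations on the fence rail:
# a short (dot) is a can end-on to the road, a long (dash) is a can lengthwise.
MORSE = {"S": "...", "O": "---"}                 # only the two letters needed here
ORIENTATION = {".": "end-on", "-": "lengthwise"}

message = "SOS"
cans = [ORIENTATION[symbol] for letter in message for symbol in MORSE[letter]]
print(cans)   # 9 cans: 3 end-on, 3 lengthwise, 3 end-on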
May 16, 2011 at 12:54 PM PDT
Mung: Q3: As a system approaches absolute zero, does it move closer to equilibrium?

Wouldn't it be a truism that it does? Is it not the understanding that at absolute zero all molecular motion ceases; or, to put it another way, all change ceases? Would that not be a description of absolute equilibrium?

Q4: If we begin at absolute zero and add heat, does entropy increase or decrease?

Whence comes the heat? (i.e. entropy increases)

Ilion
May 16, 2011 at 09:45 AM PDT
Note carefully that "information" is not the same as the "physical medium" in which it is encoded ... "Information" per se ... is an abstract intangible (not physical)
Hi Charles, thanks for the comments and links. Must information be encoded, or must it only be encoded if it is to be transferred to some physical medium? What do you think? Is entropy a physical quantity? IOW, if entropy is physical, and information is not, how can a change in a physical/material quantity cause a loss of something non-physical/non-material? (Maybe you already answered that and I just need to read your post again.)
... the physical states will move to equilibrium having greater “multiplicity” and greater entropy, which state(s) for the purposes of encoding information have a commensurately greater disorder, i.e. a loss of encoded information.
So an increase in multiplicity is an increase in entropy. But I don't think that it's "the increase in disorder" which causes a "loss of information," but rather the increase in multiplicity.

Say I have 8 boxes and one coin, and the coin is in one of the boxes, and I ask you to determine which box contains the coin. The coin has the same probability of being in any one of the boxes. How much information would you require? Now say I double the number of boxes, but still one coin. Now how much information will you require to locate the coin? I'm sure you understand that this can all be expressed mathematically. I think this is a more general case, because (as far as I can tell) there is no issue of encoding.

Mung
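Expressed mathematically, as the comment suggests, and assuming the coin is equally likely to be in any box, the information required is log2 of the number of boxes; a minimal sketch:

import math

for boxes in (8, 16):
    bits = math.log2(boxes)    # information needed to locate the coin
    print(boxes, "boxes:", bits, "bits")
# Doubling the boxes from 8 to 16 raises the requirement from 3 bits to 4 bits:
# the multiplicity went up, and with it the amount you still have to learn.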
May 16, 2011 at 09:23 AM PDT
The Second Law of Thermodynamics states that the entropy of an isolated system, such as the universe, that is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium. The Third Law of Thermodynamics states that as the temperature approaches absolute zero, the entropy of a system approaches a constant. Fortunately for us, the temperature of the universe is not zero. It is moving that way each moment, but it is not there yet. – ICR
Q3: As a system approaches absolute zero, does it move closer to equilibrium?

Q4: If we begin at absolute zero and add heat, does entropy increase or decrease?

Mung
May 16, 2011 at 09:04 AM PDT
For what it's worth, and to those interested in the topic of information, I am really enjoying The Information: A History, A Theory, A Flood.

Mung
May 15, 2011 at 05:26 PM PDT
Sorry to hear about that, kf.

Mung
May 15, 2011 at 04:29 PM PDT
bornagain77: nice links and commentary on how deep entropy goes.

Thanks... just trying to be helpful when I can.

bornagain77: And though I agree with your comment that 'classical information' is for the most part 'intangible', there has recently been discovered 'quantum information', on a massive scale, in molecular biology.

"Quantum information" is simply information encoded in a quantum transmission "medium", albeit a poorly understood "medium" at present. In fact, the quantum entanglement experiments, the theories for dark energy, and the universe seemingly being infinite and flat all suggest that we don't know the "boundaries" of our universe well enough to declare it "isolated" for the purposes of applying the 2nd Law of Thermodynamics.

The Second Law of Thermodynamics states that the entropy of an isolated system, such as the universe,

Actually, we don't know the universe is "isolated" in thermodynamic terms. We can pose "what if" scenarios, we can make various assumptions, and we can propose different models and theories and try experimentally to confirm them (which is what gave rise to the search for "dark energy" and the discovery of quantum entanglement, as examples), but as yet we don't factually know that the universe is "isolated"; we certainly don't know where its boundaries are or how they work.

The boundaries of thermodynamically "isolated" systems are well understood, quantifiable, and measurable. We can measure energy movement across or reflected at such boundaries. We know where the boundaries are and how they work. However, we cannot at present say the same about the "boundaries" of our universe, not at the quantum scale (as quantum entanglement and the quantum vacuum suggest) nor at the cosmological scale, since we can't even probe those limits as yet. We only know the universe seems to have zero (within 2%) curvature, or is "flat", but may have infinite expanse. None of which tells us about the nature of the universe's boundaries for thermodynamic purposes.

Charles
May 15, 2011 at 04:10 PM PDT
Okay: I've been busy doing a data recovery today, which I guess is a practical lesson on entropy. Just a note: when "erasure" is used, it is actually resetting. Beyond that, I think the voltage and volume should be turned down. G

kairosfocus
May 15, 2011 at 02:48 PM PDT
Charles, nice links and commentary on how deep entropy goes. ,,, And though I agree with your comment that 'classical information' is for the most part 'intangible', there has recently been discovered 'quantum information', on a massive scale, in molecular biology. Information that has a lot more 'tangibility' to it than 'classical information'.

Quantum Information/Entanglement In DNA & Protein Folding - short video
http://www.metacafe.com/watch/5936605/

Quantum entanglement holds together life's blueprint - 2010
Excerpt: "If you didn't have entanglement, then DNA would have a simple flat structure, and you would never get the twist that seems to be important to the functioning of DNA," says team member Vlatko Vedral of the University of Oxford.
http://neshealthblog.wordpress.com/2010/09/15/quantum-entanglement-holds-together-lifes-blueprint/

The relevance of continuous variable entanglement in DNA - June 21, 2010
Abstract: We consider a chain of harmonic oscillators with dipole-dipole interaction between nearest neighbours resulting in a van der Waals type bonding. The binding energies between entangled and classically correlated states are compared. We apply our model to DNA. By comparing our model with numerical simulations we conclude that entanglement may play a crucial role in explaining the stability of the DNA double helix.
http://arxiv.org/abs/1006.4053v1

Quantum no-hiding theorem experimentally confirmed for first time - March 2011
Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed.
http://www.physorg.com/news/2011-03-quantum-no-hiding-theorem-experimentally.html

Moreover Charles, there is persuasive evidence that this quantum information is the main entity that is constraining living systems to be so far out of thermodynamic equilibrium;

Information and entropy - top-down or bottom-up development in living systems? A.C. McINTOSH - May 2010
Excerpt: It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate.
http://journals.witpress.com/journals.asp?iid=47

Does DNA Have Telepathic Properties? - A Galaxy Insight
Excerpt: DNA has been found to have a bizarre ability to put itself together, even at a distance, when according to known science it shouldn't be able to. Explanation: None, at least not yet.,,, The recognition of similar sequences in DNA's chemical subunits occurs in a way unrecognized by science. There is no known reason why the DNA is able to combine the way it does, and from a current theoretical standpoint this feat should be chemically impossible.
http://www.dailygalaxy.com/my_weblog/2009/04/does-dna-have-t.html

Quantum states in proteins and protein assemblies: The essence of life? - Hameroff
Excerpt: The unitary oneness and ineffability of living systems may depend on mesoscopic/macroscopic quantum states in protoplasm.
http://www.tony5m17h.net/SHJTQprotein.pdf

and this,,,

The Unbearable Wholeness of Beings - Steve Talbott
Excerpt: Virtually the same collection of molecules exists in the canine cells during the moments immediately before and after death. But after the fateful transition no one will any longer think of genes as being regulated, nor will anyone refer to normal or proper chromosome functioning. No molecules will be said to guide other molecules to specific targets, and no molecules will be carrying signals, which is just as well because there will be no structures recognizing signals. Code, information, and communication, in their biological sense, will have disappeared from the scientist's vocabulary.
http://www.thenewatlantis.com/publications/the-unbearable-wholeness-of-beings

The 'Fourth Dimension' Of Living Systems
https://docs.google.com/document/pub?id=1Gs_qvlM8-7bFwl9rZUB9vS6SZgLH17eOZdT4UbPoy0Y

bornagain77
May 15, 2011 at 02:39 PM PDT