
Intelligent Design Basics – Information – Part III – Shannon


In this post I want to consider another aspect of information.  Specifically, I want to consider the concept of “Shannon information.”

First of all, I admit to having ruffled a few feathers when I mentioned in passing in a prior post that “Shannon information is not really information.”  As I have also written before in comments on UD, I don’t begrudge anyone referring to the Shannon metric as “information.”  That terminology has entered the English language and is regularly used in information theory.  So, no, I am not going to police everyone who puts the words “Shannon” and “information” next to each other.

However, no small amount of misunderstanding has resulted from the unfortunate term “Shannon information.”  In particular, as it relates to intelligent design, some critics have seized on the idea of Shannon information and have argued that because this or that computer program or natural process can produce a complex string or sequence, such a program or process is therefore producing new complex “information.”  This proves, the argument goes, that purely natural processes can produce new and large amounts of information, contra the claims of intelligent design.

Such thinking demonstrates a lack of understanding of CSI – in particular the need for specification.  However, a large part of the problem results from the use of the word “information” in reference to the Shannon metric.  As I have stated before, somewhat provocatively, we would all have been better off if instead of “Shannon information” the concept were referred to as the “Shannon measurement” or the “Shannon metric.”

Claude Shannon published a paper entitled “A Mathematical Theory of Communication” in two parts, in the July and October 1948 issues of The Bell System Technical Journal.  This paper is available online here and is considered a foundational work, not only for Shannon’s subsequent research on the topic, but for information theory generally.  To be sure, there are many other aspects of information theory and many other individuals worthy of acclaim in the field, but Shannon is perhaps justifiably referred to as the father of information theory.

But before delving into other details in subsequent posts, time permitting, I want to relate a short experience and then a parable.  Consider this a primer, a teaser, if you will.

The Warehouse

When I was a teenager in high school, one of my part-time jobs was working in a warehouse that housed and sold equipment and materials for the construction industry.  On a regular weekly schedule we would load a truck with supplies at the main warehouse and drive the truck to a smaller warehouse in a different city to supply the needs in that locale.  The day of the week was fixed (if memory serves, it was generally a Friday), and the sending warehouse foreman made sure that there were enough people on hand in the morning to pull inventory and load the truck, while the receiving warehouse foreman in turn ensured that there were enough people on hand in the afternoon to unload the truck and stock the inventory.

Due to the inevitable uneven customer demand in the receiving city, the needs of the receiving warehouse would vary.  With good inventory management, a large portion of the receiving warehouse’s needs could be anticipated up front.  However, it was not uncommon for the receiving warehouse to have a special order at the last minute that would necessitate removing a large crate or some boxes from the truck that had already been loaded in order to make room for the special order.  At other times when no large orders had been made, we would finish loading all the supplies and find that we still had room on the truck.  In this latter case, the sending foreman would often decide to send some additional supplies – usually a high turnover item that he knew the receiving warehouse would likely need shortly anyway.

In either case, the goal was to make the most efficient use of the truck and driver already slated to head to the other town – taking the best possible advantage of the previously-allocated sunk costs, if you will.  Ensuring that the shipment container (in this case a truck) made the best use of the available capacity was a key to efficient operations.

I want to now take this experience and turn it into a parable that relates to Shannon information.

The Parable of the Fruit Truck

Let’s assume that instead of heating and cooling equipment and supplies, the warehouse sells fruit directly to customers.   Let’s further assume that the various kinds of fruit are shipped in different-sized boxes – the watermelons in one size of box, the pineapples in another, the apples in another, and the strawberries in yet another.

Now, for simplicity, let’s suppose that customers purchase the fruit on a long-term contract with a pre-set price, so the primary variable expense of the warehouse is the expense of operating the truck.  The warehouse would thus be highly incentivized to maximize the efficiency of the truck – sending it out on the road only as often as needed, and maximizing the carrying capacity of the truck.

The dock workers in our parable, however, are not particularly sharp.  As the fruit comes in from the farms, the dock workers, without confirming the contents, simply start packing the boxes at the front of the truck, working their way to the back.  Invariably, there are gaps and open spaces as the various-sized boxes do not precisely conform to the internal capacity of the truck.  Some days are better than others by dint of luck, but the owner quickly realizes that the packing of the truck is inefficient.  Worse still, customers regularly complain that (i) the truck is arriving only partly filled, (ii) boxes contain the wrong kind of fruit, or (iii) in particularly egregious cases, the boxes contain rotten fruit or no fruit at all.

As a result, the warehouse owner decides to hire a sharp young man fresh from the university whose sole job it is to figure out the best way to pack the truck, to create the most efficient and time-saving way to deliver as much fruit as possible given the carrying capacity of the truck.

Let’s say this young man’s name is, oh, I don’t know, perhaps “Shannon.”

Now our hero of the parable, Shannon, works in the office, not the loading dock, and is unable to confirm the actual contents of the boxes that are loaded on the truck.  Further, he quite reasonably assumes the dock workers should be doing that part of the job.  Notwithstanding those limitations, Shannon is a sharp fellow and quickly comes up with a formula that gives the owner a precise calculation of the truck’s carrying capacity and the exact number of each type of fruit box that can be loaded on the truck to ensure that every square inch of the truck is filled.

Elated with the prospect of putting all the customer complaints behind him, the warehouse owner hands down the instruction to the dock workers: henceforth the truck will be packed with so many watermelon boxes, so many pineapple boxes, so many apple boxes and so on.  Furthermore, they will be packed according to Shannon’s carefully worked out order and placement of the boxes.

After the next week’s shipments, the owner is surprised to receive a number of customer complaints.  Although not a single customer complains that the truck was only partly full (it was packed tightly to the brim in all cases), several customers still complain that (i) boxes contain the wrong kind of fruit, or (ii) in particularly egregious cases, the boxes contain rotten fruit or no fruit at all.

Furious, the owner marches to Shannon’s desk and threatens to fire him on the spot.  “I hired you to figure out the best way to pack the truck to create the most efficient approach to delivering as much fruit as possible!  But I am still swamped by customer complaints,” he fumes as he throws down the list of customer complaints on Shannon’s desk.  Unfazed, Shannon calmly looks at the customer complaints and says, “I understand you used to get complaints that the truck was only partially filled, but I notice that not a single customer has complained about that problem this week.  You hired me to find the most efficient delivery method, to ensure that the truck was maximizing its carrying capacity of boxes.  I did that.  And that is all I have ever claimed to be able to do.”

“But some of the customers got the wrong fruit or got no fruit at all,” sputters the owner.  “Based on your work we told them they would be receiving a specific quantity of specific types of fruit each week.”

“I’m sorry to hear that,” retorts Shannon, “but you should not have promised any specific fruit or any particular quantity of fruit based on my formula alone.  From my desk I have no way of knowing what is actually in the boxes.  The supplier farms and dock workers can answer for that.  What is in the boxes – what is actually delivered to the customer – has nothing to do with me.  I have no ability from where I am sitting, nor frankly any interest, in guaranteeing the contents of the boxes.  My only task, the only thing I have ever claimed to be able to do, is calculate the maximum carrying capacity of the truck with the given boxes.”

The Analogy

The fruit truck is obviously but a simple and fun analogy.  However, it does, I believe, help newcomers get a feel for what Shannon can do (analyze maximum carrying capacity of a delivery channel) and what Shannon cannot do (analyze, confirm, understand or quantify the underlying substance).  We’ll get into more details later, but let’s kick it off with this analogy.
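To make the limitation concrete, here is a minimal sketch (in Python; my own illustration, not anything taken from Shannon’s paper) of the metric applied to a message and to the same message reversed.  The reversed string is gibberish, yet the metric assigns it exactly the same number of bits, because it depends only on symbol frequencies – the size and mix of the boxes – and never looks inside them:

```python
import math
from collections import Counter

def shannon_bits(s):
    """Length of s times the entropy (bits per symbol) of its
    empirical symbol frequencies: the 'carrying capacity' view."""
    n = len(s)
    return n * -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

message = "THE TRUCK LEAVES AT NOON ON FRIDAY"
reversed_msg = message[::-1]  # same symbols and frequencies, meaning destroyed

print(shannon_bits(message))       # identical values: the metric weighs
print(shannon_bits(reversed_msg))  # the boxes but never opens them
```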

What similarities and differences are there between our parable of the fruit truck and Shannon information?  What other analogies are you familiar with or perhaps have yourself used to help bring these rather intangible concepts down to earth in a concrete way for people to understand?

 

Comments
information [in-fer-mey-shun] noun
1. knowledge acquired through experience or study
2. knowledge of specific and timely events; news
3. the act of informing or the condition of being informed
4. in computing: (a) the meaning given to data by the way in which it is interpreted; (b) another word for data
Synonyms: data, facts, intelligence, advice.

Neil suggests @55 that it is best to leave things like understanding and purposeful informing and meaning "out of the issue." In other words, if we strip the word "information" of everything that makes it information, if we strip it of all ordinary meaning (no pun intended), then we can use the word to mean something other than what is normally meant. Shoot, if we're going that far, why not just use the word "dogs" instead? We could strip it of all its normal meaning and say that we are now dealing with "Shannon dogs." Just as rational.
What makes something information, is our choice to use Shannon’s theory to mathematically model it.
Sorry. Shannon was a great guy, but he doesn't get to completely strip the word "information" of everything that the word has meant for hundreds of years and say, in effect, "anything that can be modeled with my theory is information, everything else isn't." Furthermore, Shannon never made any such claim. Again, I don't begrudge anyone using the term "Shannon information" as long as they are clear what they are talking about. So far, it is wholly unclear what you mean.
The reason I like Shannon information, is that it gives us that stark picture that nothing actually is information.
Wait a minute. Any string, every string, can be modeled with the Shannon metric if we know the relevant parameters. Previously you said it was all information. Now you're saying nothing is?
Rather, we count something as information based on how we use it.
Oh, I see. We count something as information if it is usable to convey something of substance, if it has meaning, if it informs. I certainly agree with that. And it takes us right back to the old dictionary definition.
And sure, we have people claiming that the Universe itself is just information. Whatever it is that they mean by “information”, it isn’t anything that I can make sense of.
I think we probably have the same uneasy skepticism here. Not real clear to me either, though I do understand the general thrust of information being at the heart of something or being the source of something.

Eric Anderson – July 1, 2014 at 7:53 AM PDT
Hi all,

This will be my last comment. I have irritated our host and come off as a know-it-all. This is important to me, I guess. I think that a lot of confusion, argument and new vocabulary could be avoided if ID advocates knew information theory a little better. Perhaps this summary may help a little:

Shannon divided the information transmission problem into two pieces: the encode/decode problem and the transmit/receive problem. His famous channel capacity theorem is the solution to the second problem, and that is the domain our host and most others have been working in. But the nature of information, what it is, and how best to represent it is the subject of the encode/decode problem. Here, information is defined mainly by the receiver, and meaning is the most important aspect of the problem.

In his encode/decode theorem, Shannon showed how a message with redundancy (an example of redundancy is how u nearly always follows q in English) could be encoded into the minimum-sized message possible ("pure" information, if you will). We are all familiar with the technique: zip and gzip files reduce size by removing redundancy. In this first problem domain, information is simply what informs the receiver. Nothing more or less.

Once the information is reduced to its smallest generic form, it is prepared to be sent through the channel, and we enter the second problem domain. Here a special form of redundancy (error-correction codes) is added to the generic messages and they are transmitted. In this domain the meaning of the messages is irrelevant, hence the comments to this effect. But once the messages are received, they are decoded and meaning is once again assigned to each message. We are back in the domain where meaning matters.

It is as simple as that: information is the stuff the receiver sees as information. Complex patterns in mud do not inform us much... they don't contain much information. But we have learned a great deal from DNA. It is tremendously information-rich to both us and the molecular machines that process it.

We are privileged to live in a time when the Shannon channel capacity has been practically reached. We have technology that is fast and cheap enough to transmit information error-free at essentially the maximum theoretical rate. I believe the entire universe was set up so we are able to figure out, in every detail, exactly how it was all done. A magnificent classroom or lab, if you will. What marvelous discoveries await us!

GBDixon – July 1, 2014 at 6:51 AM PDT
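A minimal sketch of the redundancy-removal (encode/decode) step GBDixon describes, using Python’s standard zlib module (the example strings are assumptions for illustration):

```python
import os
import zlib

redundant = b"qu" * 500          # 1000 bytes in which 'u' always follows 'q'
patternless = os.urandom(1000)   # 1000 bytes with no exploitable redundancy

# Source coding squeezes the redundancy out of the first message...
print(len(zlib.compress(redundant)))    # a few dozen bytes
# ...but can do almost nothing with the second.
print(len(zlib.compress(patternless)))  # roughly 1000 bytes, sometimes more
```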
It's best to leave whatever Neil sez out of the discussion, as it has become very obvious that Neil doesn't know jack about information. BTW, marks on a paper that look like letters could very well be information. As I said, Neil is the wrong person to discuss information with, as he thinks communication can take place with "meaningless words" – whatever those are.

Joe – July 1, 2014 at 5:23 AM PDT
Just to be clear, are you saying that something entered into a communication system has to be a valid communication (i.e., meaningful, understandable, purposeful, or whatever similar term we want to use here)?
No. It is best to leave those ideas out of the issue. A video camera transmitting data could still count as a communication system, even if it is entirely mechanical and no part of the system has a capacity for meaning or understanding.

You can look at it as in my reply to Mung. What makes something information is our choice to use Shannon's theory to mathematically model it. Your string of letters is Shannon information, in the sense that I took you as communicating that string in a way that can be modeled as Shannon information. However, in completely objective metaphysical terms, your string of letters is ink marks on paper or illuminated dots on a screen.

The reason I like Shannon information is that it gives us that stark picture that nothing actually is information. Rather, we count something as information based on how we use it. And sure, we have people claiming that the Universe itself is just information. Whatever it is that they mean by "information", it isn't anything that I can make sense of.

Neil Rickert – June 30, 2014 at 10:01 PM PDT
Eric (#51):
Well, that depends on what we mean by information.
This may be the most important point of the whole discussion. The word "information" is used in multiple conflicting ways. Unfortunately, people talk as if information were an objective or metaphysical entity. We would do better to understand it as an attribution appropriate for a particular use.

Neil Rickert – June 30, 2014 at 9:46 PM PDT
Neil @50: Thanks.
This is why it is reasonable to say that Shannon information is always specified (by being entered into a communication system).
Just to be clear, are you saying that something entered into a communication system has to be a valid communication (i.e., meaningful, understandable, purposeful, or whatever similar term we want to use here)? Such underlying meaning would indeed be a specification. Or are you arguing that the simple act of entering a string into the system (whether or not that string has any underlying meaning or substance) makes that string a specification?

Eric Anderson – June 30, 2014 at 9:35 PM PDT
Mung (#38):
https://uncommondescent.com/design-inference/intelligent-design-basics-information-part-iii-shannon/#comment-505811
Strictly speaking, that is correct. It is a mathematical theory. One chooses whether to use the theory to model an actual communication system. However, people find it more convenient to say "x is Shannon information" than "x can be modeled as Shannon information."

Neil Rickert – June 30, 2014 at 9:34 PM PDT
Hmmm . . . Slipped in a #48 while I was typing, eh?
Are we then in agreement that Shannon theory cannot tell us what is information and what is not information?
Well, that depends on what we mean by information. I'm inclined to think that information has to be meaningful, but other folks (as we have seen), argue that any old string of letters can be information. That is part of the definitional confusion that results from the term "Shannon information."
That Shannon theory cannot tell us which messages are meaningful and which are not meaningful?
Quite true. Shannon himself said as much. Indeed, he used the word "irrelevant" when talking about the meaning of a message in the context of his theory.
Because Shannon theory lacks this capability, does it in any way logically follow from Shannon theory that information can be meaningless?
Again, this is a definitional issue. Shannon theory simply cannot say whether we are dealing with meaning or not. So if information necessarily has meaning, as you appear to be arguing (and with which I would provisionally be inclined to agree), then – by definition – Shannon theory cannot tell us whether we are dealing with information or not. At least not in the underlying substance. Thus the so-called "Shannon information" must be about something entirely different from the substance of the message, the substance of the string, the substance of the communication. The metric is measuring something separate from and apart from the underlying information. That is part of my whole point.

Eric Anderson – June 30, 2014 at 9:31 PM PDT
That being the case, perhaps you can clarify for me what the difference is between Shannon information and strings of characters?
Strictly speaking, it is Shannon information if it is part of a communication system or an information processing system. Marks on paper that just happen to look like letters would not count. This is why it is reasonable to say that Shannon information is always specified (by being entered into a communication system).

Neil Rickert – June 30, 2014 at 9:27 PM PDT
Mung @41-47: Whoa, slow down there cowboy! Let Neil get a word in edgewise! :) I know you're already fired up about this! @47:
Unfortunately for ID, that is also the view presented by Stephen Meyer in Signature in the Cell.
Do you have a cite for this? I could maybe look it up, but I'm too lazy and I might not find the actual passage that you are referring to, so if you have a quote handy, that would be helpful. If what Meyer is saying is that a string of characters can be meaningless, then I'd have to agree. If what you're saying is that a string of characters (given relevant parameters, of course) can contain a certain number of bits of Shannon information and that, therefore, the Shannon information is "meaningful" because it is "about" something, then I'd say that is trivially true . . . and also wholly uninteresting for purposes of ID, which is no doubt what Meyer was focusing on. Anyway, if you have the quote handy, that would be great.

Eric Anderson – June 30, 2014 at 9:21 PM PDT
Eric:
The fruit truck is obviously but a simple and fun analogy. However, it does, I believe, help newcomers get a feel for what Shannon can do (analyze maximum carrying capacity of a delivery channel) and what Shannon cannot do (analyze, confirm, understand or quantify the underlying substance).
Are we then in agreement that Shannon theory cannot tell us what is information and what is not information? That Shannon theory cannot tell us which messages are meaningful and which are not meaningful? Because Shannon theory lacks this capability, does it in any way logically follow from Shannon theory that information can be meaningless? Say the fruit truck confused Avocado with Almond. If the symbol "A" could mean either one, what are the implications?

Mung – June 30, 2014 at 9:17 PM PDT
Eric:
Shannon information = string of characters
This is what Elizabeth Liddle thought. Toss a coin x number of times and you get y bits of "Shannon information." The coin tosses were meaningless. When asked what they were about, she had no answer. But because y bits of Shannon Information could be calculated, it somehow meant that information could be meaningless. Unfortunately for ID, that is also the view presented by Stephen Meyer in Signature in the Cell.

Mung – June 30, 2014 at 8:35 PM PDT
Neil Rickert:
This is not the ID kind of specification, as in “Oh, it looks specified, so it just must be.” These are detailed engineering specification that you can find in libraries and probably on the internet.
Right. These are real specifications. But they have nothing to do with "the ID kind of specification." Right.

Mung – June 30, 2014 at 8:11 PM PDT
GBDixon:
The set of valid messages among the total of all possible messages that can be received defines the information. Channel capacity has no meaning without this message list and information entropy and the channel capacity cannot be calculated without knowing the number of valid messages among the number of possible messages.
Mung – June 30, 2014 at 8:05 PM PDT
Neil Rickert:
Just to be clear, the metric is not the information.
The measure is not that which is being measured? I could not agree more. Confusing the two is a source of much confusion. Is it not also the case that if the measure is "an amount of information," it does not logically follow that it is information that is being measured?

Neil Rickert:

Just to be clear, the metric is not the information.

It does not follow that because a measurement can be taken in "amounts of information" that what is being measured is information. It also does not follow that because a measurement can be taken in "amounts of information" that what is being measured is meaningless information.

Neil Rickert:

The metric is a way of measuring the amount of information (or the channel capacity).

The amount of information of what? All that's being "measured" is the capacity to process symbols. The symbols may or may not convey meaning. Whether or not they convey meaning is irrelevant to the mathematical problem. From this, it does not logically follow that there is or can be such a thing as "meaningless information."

Mung – June 30, 2014 at 7:52 PM PDT
Nota bene: I drafted this post more than two months ago
The Shannon Information has degraded accordingly!

Mung – June 30, 2014 at 7:35 PM PDT
hi Eric,

When I said the question was meaningless, I meant that the question did not have an answer according to Shannon's theory. It's like asking what is the Shannon information of the sky. Like asking whether a red sky has more or less Shannon information than a blue sky. Yes, Shannon Information is a measure, but it's a measure of probabilities, not a measure of information in the classical sense.

Let me try to give an example. Say we could create a symbol generator that could generate an infinite number of different symbols, each with the same probability. Could Shannon theory be used to measure the information in bits? Say we could create a symbol generator that could generate a finite number of different symbols but with no predictable probability for any given symbol. Could Shannon theory be used to measure the information in bits?

If you have a "coin tossing" machine and the two symbols it generates are H and T respectively and they are equiprobable, then when you see an H you can say your uncertainty has been reduced by a certain amount, and we can call this "information." But if we randomly change the probability of the T or the H, or if we randomly insert other symbols, then how do we measure the probability, or the reduction in uncertainty, or the "Shannon Information"?

Cheers

Mung – June 30, 2014 at 7:30 PM PDT
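Mung's coin machine is easy to put in numbers. A sketch (Python; the probabilities are assumed for illustration) of the per-toss entropy for a fair and a biased coin:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a finite distribution with known probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(entropy_bits([0.9, 0.1]))  # biased coin: ~0.469 bits per toss
# If the probabilities shift unpredictably, there is no distribution to
# plug in, and the calculation has no answer: Mung's point above.
```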
Joe:
What is information without meaning if not meaningless information?
Give me an example of meaningless information.
The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.
So when he says information in the second sentence, you don't think he means information [as used] in this theory ... in a special mathematical sense? Why should information have a special sense in this theory but be nonsense elsewhere? If information can be meaningless, why is Shannon Information exempt? Is Shannon Information not information? Do you think he means Shannon Information can be meaningless? All Weaver is saying is what I said previously: Shannon Information is mathematical; it doesn't tell you whether or not the message is meaningful. It does not logically follow that information can be meaningless. In fact, it is not logically possible for information to be meaningless. But you did not address the specific challenge I raised: how does one get "meaningless information" from Shannon's theory? Given that it's a mathematical theory of communication, I expect to see some maths.
A fundamental, but a somehow forgotten fact, is that information is always information about something. – The Mathematical Theory of Information
Mung – June 30, 2014 at 7:13 PM PDT
Mung: Thanks for your comments. I definitely want your input, as you have been very vocal about this for a long time. I do trust that after we've had a chance to mull on your questions for a day you'll treat us to an actual explanation and not just let us flounder at sea while you ask provoking questions. :)
That may not be a “trick” question, but it is a meaningless question.
It is not meaningless at all. I'm trying to step back to square one to understand what Neil is referring to when he talks about "Shannon information." The example is very much on point. Yes, I could have said the letters had equal probability if we wanted to make the calculation simpler. That impact on the calculation is not directly important, however, to the question I posed. Hopefully, my follow-up @39 will help flesh out precisely what Neil views as "Shannon information" so that I can understand it and be on the same page.

Eric Anderson – June 30, 2014 at 6:37 PM PDT
Neil @31:
The string itself is the information.
Thanks. Let me make sure I'm understanding you. With a string, say, ETTTBSNHOBOORTUEHTISEEATQTNOIO, you are saying that the string itself is "Shannon information." Presumably, then, the string HTISEEATQTNOIOETTTBSNHOBOORTUE would also be "Shannon information." And the string HTISEEATQSNHOBOORTUETNOIOETTTB would also constitute "Shannon information," and so on.

In other words, under your definition any string of characters constitutes "Shannon information." And a single letter, a whole page of characters, or a whole book also each constitute "Shannon information." That being the case, perhaps you can clarify for me what the difference is between Shannon information and strings of characters? Under your definition they seem to be equivalent. If you are right, then we can say: Shannon information = string of characters. Which is to say, if we have a string of characters, then the Shannon information concept tells us that we have . . . a string of characters. What does the term "Shannon information" bring to the table in your definition, if it simply is another way of saying that it is "the string itself"?

Eric Anderson – June 30, 2014 at 6:28 PM PDT
Don't mean to pick on you, Neil, because I haven't yet read all your comments in the current thread, but you make some fundamental errors which will hopefully be instructive.
The string itself is the information.
No, it isn't. Shannon's theory does not tell you whether this message constitutes information or not.
If you intended to ask about the amount of information (the metric), I guess that’s about 141 bits (the number of letters, times the log of 26 to the base 2).
Is that assuming an alphabet of 26 symbols, each having an equal probability? IOW, not English? I say "assuming" because how did you know? Did Shannon's theory tell you?

Eric Anderson:
Take the following string: ETTTBSNHOBOORTUEHTISEEATQTNOIO What is the Shannon information?
That may not be a "trick" question, but it is a meaningless question. In the English language there are a specific number of letters, and certain letters appear more frequently than others. Is that a message in the English language? I think not. So how then are we able to reduce the uncertainty?

Mung – June 30, 2014 at 6:24 PM PDT
Mung:
But meaningless information? Really? How does anyone get that from Shannon’s theory? Please. Speak up.
Warren Weaver, one of Shannon’s collaborators:
The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.
What is information without meaning if not meaningless information?

Joe – June 30, 2014 at 6:23 PM PDT
1.) Shannon information is information.
2.) Shannon was completely correct not to refer to it as a measurement or a metric, because what he explicitly denies is that it tells you whether or not you are measuring information!
3.) People who deny #1 are mistaken. They fail to understand what Shannon information is information about. It is explicitly not about measuring information!
4.) People who do not understand #2 and #3 often mistakenly claim that Shannon information demonstrates that information can be meaningless.

We get our fair share of both here at UD. Allen MacNeill and Elizabeth Liddle spring to mind. But meaningless information? Really? How does anyone get that from Shannon's theory? Please. Speak up. So, Eric, thanks again for treading where angels fear :)

Questions for all to ponder:

Q1: What is Shannon Information?
Q2: What is Shannon Information about?
Q3: Is Shannon Information independent of the information content (or lack thereof) of the message?
Q4: Does Shannon Information tell us whether the message is meaningful or not?
Q5: If Shannon Information cannot tell us whether the message is meaningful (or not), can it therefore be deduced from Shannon's theory that information can be meaningless?
Q6: What would "meaningless Shannon Information" look like?

Mung – June 30, 2014 at 6:06 PM PDT
I'm really curious at what level the OP understands the subtle implications of information theory; my guess is that he probably doesn't. Conversely, it seems the OP thinks that information theory is an attempt to perform some function that he thinks it should, but that it somehow comes up short of its founder's aspirations, maybe because of a misunderstanding of the subtleties. For example, because of the OP's little scenario it seems to me that he thinks there should be some type of quality factor attached to information, as if an artistic work could somehow be measured and shoveled in there. Or somehow the quality of the decision making by the young manager on the job.

The OP seems also to be confused by what Shannon was referring to as capacity. In a channel operating near bit-rate capacity, error-correction codes are embedded such that the message can be recovered WITHOUT ERROR due to the employment of the error-correction code (redundant information). It is the amazing result of the Shannon–Hartley theorem that it takes into account the success of any error-correction code, and it is one of those mysterious results from applied math that the theorem MUST take into account the action of error correction whether or not the originators of the theorem intended this to be the case. And if you read the 1948 paper as I have, Shannon discusses this in entertaining terms.

All communications will produce errors, but the amazing thing is that error correction can reduce the errors to zero as long as the recovered information bit rate does not exceed that specified by the Shannon–Hartley theorem, regardless of the efficiency of the error-correction code employed (efficiency measured as the number of errors corrected per second per additional bandwidth required for the redundant bits of the error-correction code).

groovamos – June 30, 2014 at 5:15 PM PDT
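The capacity groovamos describes comes from a one-line formula. A sketch (Python; the channel numbers are illustrative assumptions):

```python
import math

def shannon_hartley(bandwidth_hz, snr):
    """Maximum error-free bit rate of a noisy channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr)

# A 3 kHz telephone-grade channel at 30 dB SNR (S/N = 1000):
print(shannon_hartley(3000, 1000))  # ~29,902 bits per second
```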
– facepalm –

Neil now wants to hook up a random generator to the input channel of the system in order to "specify" randomness to the output. By doing this, he has concocted a scenario where he can save his claim that "all Shannon information is specified."

(I believe I'll move along now)

Upright BiPed – June 30, 2014 at 4:18 PM PDT
Not only do words have to be defined, but ball-park definitions, at the very least, have to be agreed upon. As a matter of fact, agreement/usage is paramount, much to the chagrin of the Académie Française. So, you could scarcely be more comprehensively mistaken, Neil.

Axel – June 30, 2014 at 3:13 PM PDT
Neil Rickert:
People were successfully communicating with words thousands of years before there were written languages or definitions or dictionaries.
So those words had no meaning then. What, exactly, were they communicating with those meaningless words? And if words don't have any meaning, how can you tell if they are words? We are right back to: If we didn't define words, i.e., if words were not specified, communication would be impossible. Definitions are word specifications. The definitions don't have to be written down, Neil.

Joe – June 30, 2014 at 2:30 PM PDT
Take the following string: ETTTBSNHOBOORTUEHTISEEATQTNOIO What is the Shannon information?
The string itself is the information. If you intended to ask about the amount of information (the metric), I guess that's about 141 bits (the number of letters, times the log of 26 to the base 2).

Neil Rickert – June 30, 2014 at 2:12 PM PDT
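Neil's arithmetic checks out. A quick sketch (Python); note that, as Mung observes above, the figure assumes a 26-symbol alphabet with all letters equiprobable, not English letter frequencies:

```python
import math

s = "ETTTBSNHOBOORTUEHTISEEATQTNOIO"
print(len(s) * math.log2(26))  # 30 letters x ~4.70 bits = ~141 bits
```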
The context was communication using words, Neil.
People were successfully communicating with words thousands of years before there were written languages or definitions or dictionaries.

Neil Rickert – June 30, 2014 at 2:08 PM PDT
Shannon information does not consider any specification in the communication channel; consequently, unspecified noise in the communication channel is included in Shannon information – just as Claude Shannon states in the second paragraph of his paper.
This is plainly wrong. Shannon was concerned about noise, and does not count noise as information.

Suppose I have a random generator, set to generate random strings. I take the output and put that up on a web page. If you read that web page, you will see that particular string, exactly as specified by my random generator. You won't see a different string from what was specified. Maybe it looks like noise to you, but it is what the random generator specified. And the theory is concerned with transmitting that correctly (i.e., as specified).

There's a lot of electrical noise around. But you still see the string that my random generator produced (and thereby specified). You do not see any sign of the noise.

Neil Rickert – June 30, 2014 at 2:05 PM PDT