Uncommon Descent Serving The Intelligent Design Community

Intelligent Design Basics – Information – Part III – Shannon


In this post I want to consider another aspect of information.  Specifically, I want to consider the concept of “Shannon information.”

First of all, I admit to having ruffled a few feathers when I mentioned in passing in a prior post that “Shannon information is not really information.”  As I have also written before in comments on UD, I don’t begrudge anyone referring to the Shannon metric as “information.”  That terminology has penetrated the English language and has become regularly used in information theory.  So, no, I am not going to police everyone who puts the words “Shannon” and “information” next to each other.

However, no small amount of misunderstanding has resulted from the unfortunate term “Shannon information.”  In particular, as it relates to intelligent design, some critics have seized on the idea of Shannon information and argued that because this or that computer program or natural process can produce a complex string or sequence, such a program or process is therefore producing new complex “information.”  This proves, the argument goes, that purely natural processes can produce new and large amounts of information, contra the claims of intelligent design.

Such thinking demonstrates a lack of understanding of CSI – in particular the need for specification.  However, a large part of the problem results from the use of the word “information” in reference to the Shannon metric.  As I have stated before, somewhat provocatively, we would all have been better off if instead of “Shannon information” the concept were referred to as the “Shannon measurement” or the “Shannon metric.”

Claude Shannon published a paper entitled “A Mathematical Theory of Communication” in the July 1948 issue of The Bell System Technical Journal.  The paper is available online here and is considered foundational, not only for Shannon’s subsequent research on the topic but for information theory generally.  To be sure, there are many other aspects of information theory and many other individuals worthy of acclaim in the field, but Shannon is perhaps justifiably referred to as the father of information theory.

But before delving into other details in subsequent posts, time permitting, I want to relate a short experience and then a parable.  Consider this a primer, a teaser, if you will.

The Warehouse

When I was a teenager in high school, one of my part time jobs was working in a warehouse that housed and sold equipment and materials for the construction industry.  On a regular weekly schedule we would load a truck with supplies at the main warehouse and drive the truck to a smaller warehouse in a different city to supply the needs in that locale.  The day of the week was fixed (if memory serves, it was generally a Friday) and the sending warehouse foreman made sure that there were enough people on hand in the morning to pull inventory and load the truck, while the receiving warehouse foreman in turn ensured that there were enough people on hand in the afternoon to unload the truck and stock the inventory.

Due to the inevitable uneven customer demand in the receiving city, the needs of the receiving warehouse would vary.  With good inventory management, a large portion of the receiving warehouse’s needs could be anticipated up front.  However, it was not uncommon for the receiving warehouse to have a special order at the last minute that would necessitate removing a large crate or some boxes from the truck that had already been loaded in order to make room for the special order.  At other times when no large orders had been made, we would finish loading all the supplies and find that we still had room on the truck.  In this latter case, the sending foreman would often decide to send some additional supplies – usually a high turnover item that he knew the receiving warehouse would likely need shortly anyway.

In either case, the goal was to make the most efficient use of the time and expense of the truck and driver that were already slated to head to the other town – taking the best possible advantage of the previously allocated sunk costs, if you will.  Ensuring that the shipment container (in this case a truck) made the best use of the available capacity was a key to efficient operations.

I want to now take this experience and turn it into a parable that relates to Shannon information.

The Parable of the Fruit Truck

Let’s assume that instead of heating and cooling equipment and supplies, the warehouse sells fruit directly to customers.   Let’s further assume that the various kinds of fruit are shipped in different-sized boxes – the watermelons in one size of box, the pineapples in another, the apples in another, and the strawberries in yet another.

Now, for simplicity, let’s suppose that customers purchase the fruit on a long-term contract with a pre-set price, so the primary variable expense of the warehouse is the expense of operating the truck.  The warehouse would thus be highly incentivized to maximize the efficiency of the truck – sending it out on the road only as often as needed, and maximizing the carrying capacity of the truck.

The dock workers in our parable, however, are not particularly sharp.  As the fruit comes in from the farms, the dock workers, without confirming the contents, simply start packing the boxes at the front of the truck, working their way to the back.  Invariably, there are gaps and open spaces as the various-sized boxes do not precisely conform to the internal capacity of the truck.  Some days are better than others by dint of luck, but the owner quickly realizes that the packing of the truck is inefficient.  Worse still, customers regularly complain that (i) the truck is arriving only partly filled, (ii) boxes contain the wrong kind of fruit, or (iii) in particularly egregious cases, the boxes contain rotten fruit or no fruit at all.

As a result, the warehouse owner decides to hire a sharp young man fresh from the university whose sole job it is to figure out the best way to pack the truck, to create the most efficient and time-saving way to deliver as much fruit as possible given the carrying capacity of the truck.

Let’s say this young man’s name is, oh, I don’t know, perhaps “Shannon.”

Now our hero of the parable, Shannon, works in the office, not the loading dock, and is unable to confirm the actual contents of the boxes that are loaded on the truck.  Further, he quite reasonably assumes the dock workers should be doing that part of the job.  Notwithstanding those limitations, Shannon is a sharp fellow and quickly comes up with a formula that gives the owner a precise calculation of the truck’s carrying capacity and the exact number of each type of fruit box that can be loaded on the truck to ensure that every square inch of the truck is filled.

Elated with the prospect of putting all the customer complaints behind him, the warehouse owner hands down the instruction to the dock workers: henceforth the truck will be packed with so many watermelon boxes, so many pineapple boxes, so many apple boxes and so on.  Furthermore, they will be packed according to Shannon’s carefully worked out order and placement of the boxes.

After the next week’s shipments, the owner is surprised to receive a number of customer complaints.  Although not a single customer complains that the truck was only partly full (it was packed tightly to the brim in all cases), several customers still complain that (i) boxes contain the wrong kind of fruit, or (ii) in particularly egregious cases, the boxes contain rotten fruit or no fruit at all.

Furious, the owner marches to Shannon’s desk and threatens to fire him on the spot.  “I hired you to figure out the best way to pack the truck to create the most efficient approach to delivering as much fruit as possible!  But I am still swamped by customer complaints,” he fumes as he throws down the list of customer complaints on Shannon’s desk.  Unfazed, Shannon calmly looks at the customer complaints and says, “I understand you used to get complaints that the truck was only partially filled, but I notice that not a single customer has complained about that problem this week.  You hired me to find the most efficient delivery method, to ensure that the truck was maximizing its carrying capacity of boxes.  I did that.  And that is all I have ever claimed to be able to do.”

“But some of the customers got the wrong fruit or got no fruit at all,” sputters the owner.  “Based on your work we told them they would be receiving a specific quantity of specific types of fruit each week.”

“I’m sorry to hear that,” retorts Shannon, “but you should not have promised any specific fruit or any particular quantity of fruit based on my formula alone.  From my desk I have no way of knowing what is actually in the boxes.  The supplier farms and dock workers can answer for that.  What is in the boxes – what is actually delivered to the customer – has nothing to do with me.  From where I am sitting I have no way of guaranteeing the contents of the boxes, nor frankly any interest in doing so.  My only task, the only thing I have ever claimed to be able to do, is calculate the maximum carrying capacity of the truck with the given boxes.”

The Analogy

The fruit truck is obviously but a simple and fun analogy.  However, it does, I believe, help newcomers get a feel for what Shannon can do (analyze maximum carrying capacity of a delivery channel) and what Shannon cannot do (analyze, confirm, understand or quantify the underlying substance).  We’ll get into more details later, but let’s kick it off with this analogy.
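To make the "capacity, not content" point concrete, here is a minimal sketch (my own illustration, not part of the original post). It treats two equal-length strings the way the Shannon metric does under the simplifying assumption of a uniform 27-symbol alphabet; the strings themselves are made up.

```python
# A minimal sketch (my illustration, not from the post): the Shannon metric
# treats a meaningful message and same-length gibberish identically when every
# symbol is assumed equally likely. Strings and alphabet size are assumptions.
import math

ALPHABET_SIZE = 27  # assumed: 26 letters plus the space character
BITS_PER_SYMBOL = math.log2(ALPHABET_SIZE)

def shannon_metric_bits(message: str) -> float:
    """Bits needed to specify the message; never looks at what it means."""
    return len(message) * BITS_PER_SYMBOL

meaningful = "ship more watermelon boxes"
gibberish = "xqzv jrrp kwtng hfdbl oeiu"  # same length, no meaning

print(shannon_metric_bits(meaningful))  # ~123.6 bits
print(shannon_metric_bits(gibberish))   # ~123.6 bits: identical "carrying capacity"
```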

What similarities and differences are there between our parable of the fruit truck and Shannon information?  What other analogies are you familiar with or perhaps have yourself used to help bring these rather intangible concepts down to earth in a concrete way for people to understand?

 

Comments
Shannon Information: A Misleading Analogy

From the OP:
My only task, the only thing I have ever claimed to be able to do, is calculate the maximum carrying capacity of the truck with the given boxes.
Shannon's Capacity Theorem

See also:
The Shannon limit or Shannon capacity of a communications channel is the theoretical maximum information transfer rate of the channel, for a particular noise level. - Noisy-channel coding theorem
In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free digital data (that is, information) that can be transmitted with a specified bandwidth in the presence of the noise interference, assuming that the signal power is bounded, and that the Gaussian noise process is characterized by a known power or power spectral density. The law is named after Claude Shannon and Ralph Hartley. - Shannon–Hartley theorem
In electrical engineering, computer science and information theory, channel capacity is the tightest upper bound on the rate of information that can be reliably transmitted over a communications channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability. Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution - Channel capacity
There's more to Shannon's paper than just the channel capacity, and it's this "something more" that people have in mind when they think of Shannon Information. It's in Section 6 (Choice, Uncertainty and Entropy) starting on page 392. So to expand the analogy, Shannon didn't just compute the carrying capacity of the truck, he developed a way to quantify (measure) the "sizes" (or average size) of the boxes.Mung
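As a rough sketch (my own, not from the comment) of the two quantities this comment distinguishes: the Shannon–Hartley channel capacity quoted above, and the entropy of Section 6, which in the parable's terms measures the average "size" of a box. The bandwidth, signal-to-noise ratio, and box frequencies below are invented numbers for illustration only.

```python
# A rough sketch of channel capacity versus entropy. All numbers are assumed.
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def entropy_bits(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2 p): in the parable, the average
    'size' of a box given how often each box type turns up."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Roughly a telephone line: 3 kHz bandwidth, SNR of 1000 (30 dB).
print(shannon_hartley_capacity(3000, 1000))     # ~29,900 bits/s
# Watermelon, pineapple, apple, strawberry boxes with assumed frequencies.
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per box on average
```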
August 2, 2014, 08:29 PM PDT
"He [MacKay] proposed that both Shannon and Bavelas were concerned with what he called 'selective-information,' that is information calculated by considering the selection of message elements from a set." - http://www.physics.utoronto.ca/people/homepages/logan/ch2wii2.pdfMung
July 30, 2014, 07:26 PM PDT
"The meaning of information is given by the processes that interpret it." - Ed FredkinMung
July 30, 2014, 07:25 PM PDT
Shannon information is not really information.
Shannon Information really is information.
...a large part of the problem results from the use of the word “information” in reference to the Shannon metric. As I have stated before, somewhat provocatively, we would all have been better off if instead of “Shannon information” the concept were referred to as the “Shannon measurement” or the “Shannon metric.”
So let's call it Shannon's Measure of Information (SMI). Eric Anderson @ 102:
What is Shannon information? Please define.
Your OP is titled "Intelligent Design Basics – Information – Part III – Shannon" Did you fail to define Shannon Information in the OP? Have you defined Shannon Information anywhere in this thread? Perhaps we can work together to define Shannon Information. Assuming, of course, that you are willing to dispense with the claim that Shannon Information is not really information.
A fundamental, but somehow forgotten fact, is that information is always information about something. - Jan Kahre, The Mathematical Theory of Information
What is Shannon's Measure of Information (SMI) about? To assert that Shannon information is "not really information" is to assert that there is no "aboutness" to Shannon Information. This assertion ought to be discarded as absurd on its face.Mung
July 29, 2014, 07:31 PM PDT
Shannon Information is still information. It's not as if with the appearance of Shannon's Theorems information gained a little brother, a distinct and separate entity. However, it is confined to something specific within the greater realm of information: probability distributions.Mung
July 29, 2014, 05:19 PM PDT
Let's identify another mistaken ID argument about Shannon Information.
Shannon's classical information theory concerns itself only with statistical relationships of material symbols found within the code of Universal Information. This was because nothing more was necessary in order to address the technical issues of information transmission and storage. While Shannon stated this point clearly in his landmark paper, most modern day evolutionary theorists champion his definition primarily because it allows for the creation of 'information' by randomly assembling symbols. This makes creation of biological information trivial, and separates biological information from biological functionality. The attempt to define biological information in this way is clearly ideologically driven and is obviously not sufficient, since no thinking person would exclude meaning and purpose from biological (functional) information. - Biological Information: New Perspectives. p. 16
There are many things I find wrong here, but I'll focus on just one. Notice the clear connection to the same mistake that Meyer makes. Shannon's theory does not and cannot tell us what is required for the creation of information. It does not and cannot 'allow' for the creation of 'information' by randomly assembling symbols.Mung
July 27, 2014, 06:02 PM PDT
Hey Eric, in case you ever find time to get back to this thread: I was reading a book recently that indicated that there are four different Shannon entropies. Would it follow that there are four different measures, and thus perhaps four different definitions of "Shannon Information"?Mung
July 27, 2014, 05:35 PM PDT
P1 The Elementary Problem: What Is Information? Information can be viewed from three perspectives... Many extensionalist approaches to the definition of information as reality or about reality provide different starting points for answering P1. 1. the information theory approach (mathematical theory of codification and communication of data/signals, Shannon and Weaver (1949 rep. 1988) defines [Shannon] information in terms of probability space distribution; Each extensionalist approach can be given an intentionalist reading, by interpreting the relevant space as a doxastic space, in which information is seen as a reduction in the degree of uncertainty or level of surprise in an informee, given the state of information in the informee. Information theory in (1) approaches information as a physical phenomenon, syntactically. It is not interested in the usefulness, relevance, meaning, interpretation, or aboutness of data, but in the level of detail and frequency in the uninterpreted data (signals or messages). It provides a successful mathematical theory because its central problem is whether and how much data, not what information is conveyed. - Luciano Floridi, The Philosophy of Information. p. 30-31Mung
July 26, 2014, 03:47 PM PDT
More on the definition of Shannon Information. Formal-mathematical information A third class of information-theoretic notions includes the formal concepts of information, which have been initially introduced as mathematical tools for measuring the performance of communications devices. The classical notion, in this category, was introduced by the mathematical theory of communication of Shannon (1948) and Shannon and Weaver (1949). In the latter, [Shannon] information is a measure of one's freedom of choice when one selects a message (the logarithm of the number of available choices or of probabilities). - Nicolas J. Bullot. Attention, Information, and Epistemic Perception. In Information and Living Systems: Philosophical and Scientific Perspectives. George Terzis and Robert Arp, eds.Mung
July 26, 2014, 03:45 PM PDT
Serious difficulties arise when scientists try to separate the idea of probability from the idea of information, because the first cannot be defined without the help of the second. In Shannon's theory, entropy is a probability distribution, assigning various probabilities to a set of possible messages. But entropy is also a measure of what the person receiving a message does not know about it before it arrives. Entropy is an index of his uncertainty as to what to expect. - Grammatical Man: Information, Entropy, Language and Life, p. 92-93
Mung
July 22, 2014, 09:27 PM PDT
Eric, do we have a definition yet of Shannon Information? If not, why not?Mung
July 20, 2014, 09:40 PM PDT
: Information: The New Language of Science : Hans Christian von Baeyer : Chapter 12 : Randomness: The flip side of information : p. 99 Shannon's technical definition of the information content of a message - the number of digits when the message is written in the binary code of the computer - doesn't distinguish between sense and nonsense.Mung
July 20, 2014, 09:30 PM PDT
: Information: The New Language of Science : Hans Christian von Baeyer : Chapter 4 : Counting Bits: The scientific measure of information : p. 28 In contrast to the vague verbal definition of information, the technical definition, though skeletal, is a model of specificity and succinctness. Claude Shannon, the founder of information theory, invented a way to measure 'the amount of information' in a message without defining the word 'information' itself, nor even addressing the question of the meaning of the message. He produced, in effect, an operational definition like that of temperature, except that his measuring device - a simple recipe - is not a physical apparatus, like a thermometer, but a conceptual tool. Shannon's information measure is most easily applied to a message that consists of a string of binary possibilities - yes or no, heads or tails, zero or one - each of which is equally likely at every step along the string. According to Shannon, each choice corresponds to one bit (short for 'binary digit') of information. Communicating to a friend the outcome of three consecutive tosses of a penny, for example, requires three bits of information, which would define eight possible strings of heads and tails. More generally, Shannon's recipe is simple: To find the information content of any message, translate the message into the binary code of the computer and count the digits of the resulting string of zeros and ones. The number so obtained is called 'Shannon information', and the technique is known, somewhat dismissively, as bit-counting.Mung
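A small sketch of the bit-counting recipe von Baeyer describes, assuming (as the quote does) equally likely binary outcomes; it simply reproduces the three-coin-toss numbers from the passage.

```python
# Bit-counting for equally likely binary outcomes, per the quoted recipe.
import math
from itertools import product

n_tosses = 3
outcomes = list(product("HT", repeat=n_tosses))  # every string of heads and tails
print(len(outcomes))             # 8 possible strings
print(math.log2(len(outcomes)))  # 3.0 bits: one bit per equally likely binary choice
```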
July 20, 2014, 09:30 PM PDT
In terms of the colloquial meaning of information, Shannon's paper deals with the carriers of information - symbols - and not with information itself. It deals with communication and the means of communication rather than that elusive end product of communication - information. - Information Theory and Coding, p. 2
We shall learn that the symbols must obey certain laws if they are to be capable of transmitting information... - Information Theory and Coding, p. 2
Mung
July 18, 2014, 11:34 PM PDT
RE: 101 How does Meyer know that these characters are from the English alphabet and not from the Latin alphabet?
Since both are composed of the same 26-letter English alphabet, the amount of uncertainty eliminated by each letter (or space) is identical.
The fact that symbols may appear to be from the same alphabet doesn't determine the frequency or probability of their occurrence in a sequence of characters or symbols, and that is what determines how "informative" a given symbol is.
The probability of producing each of those two sequences at random is identical.
But letters in the English language are not equally probable, so different letters convey different amounts of Shannon Information. "Therefore, both sequences have an equal amount of Shannon information as measured by Shannon's theory." The conclusion does not follow. It's a non sequitur. There's no reason to believe both sequences were produced at random. In fact, I think we can be pretty sure the first one was not produced at random. Perhaps the second sequence came from a source which uses an alphabet identical to that of the first sequence (say one containing 26 symbols plus a space), and perhaps each symbol was equally likely to be transmitted by the source with the same probability (though given the number of spaces I have my doubts), and each character was selected randomly and transmitted. What is the average info per symbol according to Shannon's theory? But I find it extremely difficult to believe that the first sequence came from the same source. And why should I? And if it did not come from a source with the same characteristics, then it doesn't have "an equal amount of Shannon information as measured by Shannon's theory."Mung
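One way to see the point being argued here is a small sketch (my own illustration; the letter frequencies are invented, not a real corpus model). Under a uniform 27-symbol source the two sequences from the Meyer quote score identically; under a source with English-like letter frequencies they do not.

```python
# Comparing two source models for the two sequences quoted from Meyer.
import math

ALPHABET = "abcdefghijklmnopqrstuvwxyz "
UNIFORM = {c: 1 / len(ALPHABET) for c in ALPHABET}

# Toy English-like frequencies (assumed); remaining symbols share the leftover mass.
common = {" ": 0.18, "e": 0.10, "t": 0.07, "a": 0.06, "o": 0.06, "n": 0.05, "s": 0.05}
leftover = (1 - sum(common.values())) / (len(ALPHABET) - len(common))
ENGLISH_LIKE = {c: common.get(c, leftover) for c in ALPHABET}

def total_surprisal_bits(text: str, model: dict) -> float:
    """Sum of -log2 p(symbol) over the string, for a given source model."""
    return sum(-math.log2(model[c]) for c in text.lower() if c in model)

s1 = "Four score and seven years ago"
s2 = "nenen ytawoi jll sn mekhdx nnx"

print(total_surprisal_bits(s1, UNIFORM), total_surprisal_bits(s2, UNIFORM))
# identical (~142.6 bits each) under the equiprobable assumption
print(total_surprisal_bits(s1, ENGLISH_LIKE), total_surprisal_bits(s2, ENGLISH_LIKE))
# different once the source model reflects unequal letter probabilities
```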
July 17, 2014, 08:51 PM PDT
Eric, see my response at 99. You don't agree with Meyer's definition of Shannon Information or you don't think he is explaining what Shannon Information is in the quoted material?Mung
July 17, 2014, 08:14 PM PDT
Mung, you are a genius at writing riddles. Less so at clear exposition of an answer. What is Shannon information? Please define.Eric Anderson
July 15, 2014, 08:42 AM PDT
Consider two sequences of characters: "Four score and seven years ago" "nenen ytawoi jll sn mekhdx nnx" Both of these sequences have an equal number of characters. Since both are composed of the same 26-letter English alphabet, the amount of uncertainty eliminated by each letter (or space) is identical. The probability of producing each of those two sequences at random is identical. Therefore, both sequences have an equal amount of Shannon information as measured by Shannon's theory. - Signature in the Cell p. 90
Meyer is mistaken, but how (or in how many ways) and why?Mung
July 14, 2014, 09:18 PM PDT
But this brings us back to the very nuanced point: Does a string “contain” Shannon information?
It is a very nuanced point and many people seem to be unaware of the nuance or even of the possibility of the necessity for any nuanced view. The answer is no. Strings of characters do not "contain" or "carry" or "convey" Shannon Information. Think of Shannon Information as meta-information.
Shannon information = string of characters
Sorry, I just needed someplace to start and that seemed to be convenient at the time. Didn't mean to imply that was your PoV. :)Mung
July 11, 2014, 07:23 PM PDT
Meyer provides a good starting point for discussing Shannon Information.
Shannon's theory of information was based upon a fundamental intuition: information and uncertainty are inversely related. - Signature in the Cell p. 88
Uncertainty, of course, implies uncertainty about something. So Shannon Information in no way reduces the requirement for the aboutness of information. In fact, it is inherent in the very concept of Shannon Information.
Claude Shannon wanted to develop a theory that could quantify the amount of information stored in or conveyed across a communication channel. He did this first by linking the concepts of information and uncertainty and then by linking these concepts to measures of probability. According to Shannon, the amount of information conveyed (and the amount of uncertainty reduced) in a series of symbols or characters is inversely proportional to the probability of a particular event, symbol, or character occurring. - Signature in the Cell p. 88
The amount of uncertainty about what? One wonders why Shannon Uncertainty isn't as popular a concept as Shannon Information. :)
By equating information with the reduction of uncertainty, Shannon's theory implied a mathematical relationship between information and probability. Specifically, it showed that the amount of information conveyed by an event is inversely proportional to the probability of its occurrence. The greater the number of possibilities, the greater the improbability of any one being actualized, and thus the more information that is transmitted when a particular possibility occurs. - Signature in the Cell p. 89
To calculate the amount of Shannon Information one must be able to specify the amount of Shannon Uncertainty. And when that is done, it becomes readily apparent what Shannon Information is about. (And what it is not about.)Mung
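A minimal sketch of the relationship described in the quotes, as it is usually formalized (my illustration; the probabilities below are arbitrary): the Shannon information, or surprisal, of an outcome is the negative log of its probability, so less probable outcomes carry more bits.

```python
# Self-information (surprisal): I(x) = -log2 p(x). Example probabilities are assumed.
import math

def self_information_bits(p: float) -> float:
    """Bits associated with an outcome of probability p."""
    return -math.log2(p)

print(self_information_bits(0.5))     # 1.0 bit: a fair coin toss
print(self_information_bits(1 / 26))  # ~4.7 bits: one of 26 equiprobable letters
print(self_information_bits(0.001))   # ~10 bits: a rare, highly surprising outcome
```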
July 11, 2014, 07:11 PM PDT
Mung, Meyer's point by underscoring that ten digits may not even constitute a valid phone number is pretty clear. The whole point of those couple of pages is to get people thinking about the difference between mere complexity and specified complexity. His simple example is perfectly suited to that task.
. . . Smith's sequence exhibits what has been called specified complexity, while Jones's exhibits mere complexity. - p. 107
This is true regardless of whether we define Shannon information as being a property of the string itself or the resulting measurement number of the Shannon calculation. Again, I'm not sure why you are hung up on this. More interesting to me, though, is how you would define "Shannon information."Eric Anderson
July 10, 2014, 08:44 AM PDT
Eric:
Meyer didn’t “admit” anything on that front. It was part of his point. So you should have said, “As Meyer underscores.”
What was Meyer's point that he underscores by admitting that the ten digits may not even constitute a phone number? Was it that a random string of characters can provide meaningless information?Mung
July 9, 2014, 06:07 PM PDT
Neil Rickert:
I’m saying that meaning is inherently subjective.
So you're NOT saying that Shannon Information is meaningless, just that it's subjective. Thanks for clearing that up. Whew! Neil Rickert:
If you insist on tying information to meaning, then that makes information subjective.
Make up your mind, please. Information is inherently tied to meaning. There is no such thing as meaningless information. The very idea is absurd. Are you asserting that Shannon Information is not tied to meaning? If it isn't, then what use is it? Or are you saying Shannon Information is subjective? If so, so what?Mung
July 9, 2014, 05:51 PM PDT
Mung, thanks for taking time for a few additional comments. I hope you aren't quoting me for the notion that: Shannon information = string of characters. That was my assessment of Neil's confused approach @31. -----
Meyer is simply confused here (and confusing). By simply arranging ten characters into a string of characters “at random” one does not create information, not even Shannon Information.
I think you are taking your criticism a bridge too far. If we are saying that "Shannon information" is the resulting number that is spit out when we plug a string into a Shannon calculation, then, yes, Meyer should have been more careful in his light-hearted analogy. But this brings us back to the very nuanced point: Does a string "contain" Shannon information? If we are defining Shannon information as the mere numerical result spit out of a calculation, then perhaps no. If we are defining the result spit out of a calculation as a measure of something (say, information carrying capacity), then perhaps yes.
And his new ten-digit phone number may not even be a phone number! As Meyer admits.
Meyer didn't "admit" anything on that front. It was part of his point. So you should have said, "As Meyer underscores." ----- We seem to be circling around a definitive definition, and I suspect we are very much in agreement. However, what I would love to hear from you (notwithstanding your wonderful and energetic Socratic method of asking lots of questions), is your definition of Shannon information. Let's say I approach you on the street and ask you for a straight-up, unambiguous, no-hidden-nuances, non-rhetorical definition of "Shannon information," what would you say?Eric Anderson
July 9, 2014, 02:03 PM PDT
Eric (#91):
You are confusing two different things. Just because something is created by an intelligence doesn’t mean it is subjective.
You seem to have missed the point. I'm saying that meaning is inherently subjective. Whether or not meaning is created by an intelligence, or is just a natural part of organic life, is not relevant to the point that meaning itself is subjective. If you insist on tying information to meaning, then that makes information subjective. Again, whether or not the information is created by an intelligence is not relevant. It's the fact that information, as you use that term, is tied to meaning, together with the fact that meaning is subjective.Neil Rickert
July 9, 2014, 12:20 PM PDT
Neil @ 2:
Just to be clear, the metric is not the information.
You're simply mistaken. The metric is the information. Else it would not be informative, it would not be a metric, it would not be objective, it would be meaningless.
The metric is a way of measuring the amount of information (or the channel capacity).
You're simply mistaken. The metric does not "measure the amount of information." There is no measure for information.Mung
July 8, 2014, 09:08 PM PDT
Hi Eric, sorry, I've been occupied with so many other things lately. I haven't had the time to spend on this that I would like. Perhaps things will improve later this week and into next. But I wanted to get back to you on something. Eric:
Shannon information = string of characters
Mung @ 47:
This is what Elizabeth Liddle thought. Toss a coin x number of times you get y bits of “Shannon information.” The coin tosses were meaningless. When asked what they were about she had no answer. But because y bits of Shannon Information could be calculated it somehow meant that information could be meaningless. Unfortunately for ID, that is also the view presented by Stephen Meyer in Signature in the Cell.
Eric:
Do you have a cite for this? I could maybe look it up, but I’m too lazy and I might not find the actual passage that you are referring to, so if you have a quote handy, that would be helpful.
For example:
He [Jones] knows that all long-distance phone numbers have ten characters and are constructed from the same ten digits. He writes down the digits 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, on another piece of paper. He gives a sigh of relief. Now he has the raw materials, the building blocks, to generate Smith's ten digit phone number. He quickly arranges the digits on his list at random into a sequence of ten characters. Now he has some information, some Shannon information. Being mathematically inclined, he quickly calculates how much information he has using Shannon's familiar formula. He's relieved and impressed. His new ten-digit phone number has a lot of information - 33.2 bits to be exact. - Signature in the Cell p. 105
I can come up with more, and will, time permitting. Meyer is simply confused here (and confusing). By simply arranging ten characters into a string "at random" one does not create information, not even Shannon Information. And his new ten-digit phone number may not even be a phone number! As Meyer admits.Mung
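For what it's worth, the 33.2-bit figure in the quoted passage can be checked directly, assuming ten positions each drawn from ten equally likely digits (a sketch of the arithmetic, not a comment on whether the result deserves to be called information):

```python
# Ten positions, each one of ten equiprobable digits: 10 * log2(10) bits.
import math

bits = 10 * math.log2(10)
print(round(bits, 1))  # 33.2
```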
July 8, 2014, 08:50 PM PDT
Neil: You are confusing two different things. Just because something is created by an intelligence doesn't mean it is subjective. If I discover a code and can determine what the symbols refer to, then I have learned something real about the code -- objectively, just like with mathematics or the genetic code. There is nothing "subjective" about it. If I discover an artifact or a machine and by reverse engineering am able to ascertain its function, there is nothing subjective about that function -- it exists, it is real. Don't make the category mistake of lumping everything that is produced by a mind or by an intelligence into the category of "subjective" (which can then be blithely dismissed without applying intellectual effort to the question at hand). Furthermore, the attempt to paint the design inference as dealing with things that are "subjective" is no more becoming than the attempt to paint it as dealing with things that are "supernatural." Both claims are but a rhetorical attempt to dismiss the substance -- you know, if it is "supernatural" it isn't "science"; if it is "subjective" it isn't objective, and therefore not worth considering. Not an intellectually worthy approach to take.Eric Anderson
July 7, 2014, 09:45 AM PDT
Eric (#89):
And because there is a symbolic connection between the real-world meaning and the string, we recognize that the string contains representation, meaning, substance.
No, there is no such symbolic connection, as far as I can tell. The string is symbolic, but not the connection. The only people that I know who would claim (without proof) that the connection is symbolic, are those AI proponents at the extremes of materialism, who claim that humans are just symbol processing computers.
With that in mind, can you nevertheless appreciate that meaning of a communication can be of importance, in its own right and irrespective of whatever optimal number of bits gets spit out of a Shannon calculation? Can you accept that ID is interested primarily in the former?
In effect, that amounts to saying that ID is a quest to find an objective account of the subjective. That, too, puts you in agreement with those at the extremes of materialism. Personally, I see any such quest as doomed to fail.Neil Rickert
July 6, 2014, 05:21 AM PDT
Neil @86:
Shannon’s theory is a theory of communication, so specification ought to mean the specification of what is to be communicated.
Again, all your "specification" means is the sequence of the string that is plugged into the Shannon calculation. Yes, we can call that a "specification," but it doesn't have anything to do with a substantive concept of specification, which is what ID is about.
Neither string has any meaning. We contribute the meaning. Meaning is not a property of a string. Rather, meaning comes from us.
Of course an intelligent being was involved in assigning meaning to a string. And because there is a symbolic connection between the real-world meaning and the string, we recognize that the string contains representation, meaning, substance. And we also know, just speaking of ID for a moment, that such a situation is only known to arise by the activity of intelligent beings. If you're arguing that meaningful information/communication doesn't arise without the activity of an intelligent being to provide that meaning, then I agree. Welcome to the ID side! :)
If you want an objective science of communication or of information, then you must leave meaning out of that science.
Look, Shannon was interested in quantifying bits for transmission and communication purposes. And his theory is not interested in the meaning of the underlying communication. We seem to agree on that point. With that in mind, can you nevertheless appreciate that the meaning of a communication can be of importance, in its own right and irrespective of whatever optimal number of bits gets spit out of a Shannon calculation? Can you accept that ID is interested primarily in the former? ----- The upshot of all this is that the primary point of my post is underscored: generating random strings that have a high Shannon calculation result does not -- and by definition, cannot -- invalidate the design inference, because the design inference is not based on simply having a high Shannon calculation number.Eric Anderson
July 5, 2014, 08:24 PM PDT
