
Intelligent Design Basics – Information – Part III – Shannon


In this post I want to consider another aspect of information: specifically, the concept of “Shannon information.”

First of all, I admit to having ruffled a few feathers when I mentioned in passing in a prior post that “Shannon information is not really information.”  As I have also written before in comments on UD, I don’t begrudge anyone referring to the Shannon metric as “information.”  That terminology has penetrated the English language and has become standard usage in information theory.  So, no, I am not going to police everyone who puts the words “Shannon” and “information” next to each other.

However, no small amount of misunderstanding has resulted from the unfortunate term “Shannon information.”  In particular, as it relates to intelligent design, some critics have seized on the idea of Shannon information and have argued that because this or that computer program or natural process can produce a complex string or sequence, such a program or process is therefore producing new complex “information.”  This proves, the argument goes, that purely natural processes can produce new and large amounts of information, contra the claims of intelligent design.

Such thinking demonstrates a lack of understanding of CSI – in particular the need for specification.  However, a large part of the problem results from the use of the word “information” in reference to the Shannon metric.  As I have stated before, somewhat provocatively, we would all have been better off if instead of “Shannon information” the concept were referred to as the “Shannon measurement” or the “Shannon metric.”

Claude Shannon published a paper entitled “A Mathematical Theory of Communication” in the July 1948 volume of The Bell System Technical Journal.  This paper is available online and is considered foundational, not only for Shannon’s subsequent research on the topic but for information theory generally.  To be sure, there are many other aspects of information theory and many other individuals worthy of acclaim in the field, but Shannon is perhaps justifiably referred to as the father of information theory.
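For readers who like to see the nuts and bolts, here is a minimal Python sketch (mine, purely illustrative) of the measure Shannon's paper introduces: H = -Σ p_i log2 p_i, estimated here from the symbol frequencies of a string. Note what the code never looks at: whether the string means anything.

```python
import math
from collections import Counter

def shannon_bits_per_symbol(message: str) -> float:
    """Shannon's H = -sum(p_i * log2(p_i)), estimated from symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The metric depends only on symbol statistics, never on meaning.
print(round(shannon_bits_per_symbol("THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG"), 3))
```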

But before delving into other details in subsequent posts, time permitting, I want to relate a short experience and then a parable.  Consider this a primer, a teaser, if you will.

The Warehouse

When I was a teenager in high school, one of my part-time jobs was working in a warehouse that housed and sold equipment and materials for the construction industry.  On a regular weekly schedule we would load a truck with supplies at the main warehouse and drive the truck to a smaller warehouse in a different city to supply the needs in that locale.  The day of the week was fixed (if memory serves, it was generally a Friday), and the sending warehouse foreman made sure that there were enough people on hand in the morning to pull inventory and load the truck, while the receiving warehouse foreman in turn ensured that there were enough people on hand in the afternoon to unload the truck and stock the inventory.

Due to the inevitable uneven customer demand in the receiving city, the needs of the receiving warehouse would vary.  With good inventory management, a large portion of the receiving warehouse’s needs could be anticipated up front.  However, it was not uncommon for the receiving warehouse to have a special order at the last minute that would necessitate removing a large crate or some boxes from the truck that had already been loaded in order to make room for the special order.  At other times when no large orders had been made, we would finish loading all the supplies and find that we still had room on the truck.  In this latter case, the sending foreman would often decide to send some additional supplies – usually a high turnover item that he knew the receiving warehouse would likely need shortly anyway.

In either case, the goal was to make the most efficient use of the time and expense of the truck and driver that were already slated to head to the other town – taking the best possible advantage of the previously-allocated sunk costs, if you will.  Ensuring that the shipment container (in this case a truck) made best use of the available capacity was a key to efficient operations.

I want to now take this experience and turn it into a parable that relates to Shannon information.

The Parable of the Fruit Truck

Let’s assume that instead of heating and cooling equipment and supplies, the warehouse sells fruit directly to customers.   Let’s further assume that the various kinds of fruit are shipped in different-sized boxes – the watermelons in one size of box, the pineapples in another, the apples in another, and the strawberries in yet another.

Now, for simplicity, let’s suppose that customers purchase the fruit on a long-term contract with a pre-set price, so the primary variable expense of the warehouse is the expense of operating the truck.  The warehouse would thus be highly incentivized to maximize the efficiency of the truck – sending it out on the road only as often as needed, and maximizing the carrying capacity of the truck.

The dock workers in our parable, however, are not particularly sharp.  As the fruit comes in from the farms, the dock workers, without confirming the contents, simply start packing the boxes at the front of the truck, working their way to the back.  Invariably, there are gaps and open spaces, as the various-sized boxes do not precisely conform to the internal capacity of the truck.  Some days are better than others by sheer luck, but the owner quickly realizes that the packing of the truck is inefficient.  Worse still, customers regularly complain that (i) the truck is arriving only partly filled, (ii) boxes contain the wrong kind of fruit, or (iii) in particularly egregious cases, the boxes contain rotten fruit or no fruit at all.

As a result, the warehouse owner decides to hire a sharp young man fresh from the university whose sole job it is to figure out the best way to pack the truck, to create the most efficient and time-saving way to deliver as much fruit as possible given the carrying capacity of the truck.

Let’s say this young man’s name is, oh, I don’t know, perhaps “Shannon.”

Now our hero of the parable, Shannon, works in the office, not the loading dock, and is unable to confirm the actual contents of the boxes that are loaded on the truck.  Further, he quite reasonably assumes the dock workers should be doing that part of the job.  Notwithstanding those limitations, Shannon is a sharp fellow and quickly comes up with a formula that gives the owner a precise calculation of the truck’s carrying capacity and the exact number of each type of fruit box that can be loaded on the truck to ensure that every square inch of the truck is filled.

Elated with the prospect of putting all the customer complaints behind him, the warehouse owner hands down the instruction to the dock workers: henceforth the truck will be packed with so many watermelon boxes, so many pineapple boxes, so many apple boxes and so on.  Furthermore, they will be packed according to Shannon’s carefully worked out order and placement of the boxes.

After the next week’s shipments, the owner is surprised to receive a number of customer complaints.  Although not a single customer complains that the truck was only partly full (it was packed tightly to the brim in all cases), several customers still complain that (i) boxes contain the wrong kind of fruit, or (ii) in particularly egregious cases, the boxes contain rotten fruit or no fruit at all.

Furious, the owner marches to Shannon’s desk and threatens to fire him on the spot.  “I hired you to figure out the best way to pack the truck to create the most efficient approach to delivering as much fruit as possible!  But I am still swamped by customer complaints,” he fumes as he throws down the list of customer complaints on Shannon’s desk.  Unfazed, Shannon calmly looks at the customer complaints and says, “I understand you used to get complaints that the truck was only partially filled, but I notice that not a single customer has complained about that problem this week.  You hired me to find the most efficient delivery method, to ensure that the truck was maximizing its carrying capacity of boxes.  I did that.  And that is all I have ever claimed to be able to do.”

“But some of the customers got the wrong fruit or got no fruit at all,” sputters the owner.  “Based on your work we told them they would be receiving a specific quantity of specific types of fruit each week.”

“I’m sorry to hear that,” retorts Shannon, “but you should not have promised any specific fruit or any particular quantity of fruit based on my formula alone.  From my desk I have no way of knowing what is actually in the boxes.  The supplier farms and dock workers can answer for that.  What is in the boxes – what is actually delivered to the customer – has nothing to do with me.  I have no ability from where I am sitting, nor frankly any interest, in guaranteeing the contents of the boxes.  My only task, the only thing I have ever claimed to be able to do, is calculate the maximum carrying capacity of the truck with the given boxes.”

The Analogy

The fruit truck is, of course, just a simple and fun analogy.  However, it does, I believe, help newcomers get a feel for what Shannon can do (analyze the maximum carrying capacity of a delivery channel) and what Shannon cannot do (analyze, confirm, understand or quantify the underlying substance).  We’ll get into more details later, but let’s kick things off with this analogy.
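To put a number on the “boxes versus fruit” distinction, here is a hedged Python sketch (the strings are my own illustrative choices): a meaningful sentence and a random shuffle of its letters have identical symbol frequencies, so the Shannon measure cannot tell them apart.

```python
import math
import random
from collections import Counter

def entropy_bits_per_symbol(s: str) -> float:
    counts = Counter(s)
    return -sum((n / len(s)) * math.log2(n / len(s)) for n in counts.values())

meaningful = "INTELLIGENT DESIGN REQUIRES SPECIFICATION"
shuffled = list(meaningful)
random.shuffle(shuffled)
gibberish = "".join(shuffled)

# Same letters, same frequencies, same Shannon measure --
# the metric sees only the "boxes", never the "fruit".
print(round(entropy_bits_per_symbol(meaningful), 6))
print(round(entropy_bits_per_symbol(gibberish), 6))
```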

What similarities and differences are there between our parable of the fruit truck and Shannon information?  What other analogies are you familiar with or perhaps have yourself used to help bring these rather intangible concepts down to earth in a concrete way for people to understand?

 

Comments
Neil Rickert at #12 said:
The problem with semantics (or meaning), is that it cannot be specified. If you want to insist that “information” is a reference to semantics, then you must give up on the “specified” part of CSI. It is the nature of ordinary language, that meanings are inherently subjective and unspecifiable.
In other words he agrees with the OP exactly as written. Thanks for the confirmation Neil!

johnp
June 30, 2014, 01:54 PM PDT
Neil Rickert:
Successful communication was going on thousands of years before there were definitions or dictionaries or written language.
The context was communication using words, Neil. But thanks for proving you prefer obfuscation over education.
Birds manage to communicate with their bird songs.

How do you know they are communicating?

Joe
June 30, 2014, 12:00 PM PDT
Neil:

Take the following string:

ETTTBSNHOBOORTUEHTISEEATQTNOIO

What is the Shannon information? This is not any kind of trick question. I'm sincerely trying to make sure we aren't talking past each other and that I understand your point. Assume the foregoing is based on a 26-letter alphabet, all caps, no other symbols. I think we're on the same page that meaning and content are not part of Shannon information, so I want to see if we can get on the same page as to what is Shannon information. Thanks,

Eric Anderson
June 30, 2014, 11:54 AM PDT
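A worked illustration of the question above, under the stated assumptions (26-letter alphabet, all caps, no other symbols): treating each letter as equiprobable, the Shannon measure of the 30-character string is simply its length times log2(26), roughly 141 bits; an empirical estimate from the string's own letter frequencies comes out lower. A minimal sketch:

```python
import math
from collections import Counter

s = "ETTTBSNHOBOORTUEHTISEEATQTNOIO"

# Assuming all 26 letters are equiprobable: each symbol carries log2(26) bits.
equiprobable_bits = len(s) * math.log2(26)

# Empirical estimate using the string's own letter frequencies.
counts = Counter(s)
empirical_bits = -sum(n * math.log2(n / len(s)) for n in counts.values())

print(f"{equiprobable_bits:.1f} bits assuming a uniform 26-letter source")
print(f"{empirical_bits:.1f} bits using observed frequencies")
# Neither number says anything about whether the string means anything.
```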
Gpuccio: The protein model seems very reasonable. Thank you.

GBDixon
June 30, 2014, 11:43 AM PDT
Neil: Let's not get all hung up on a definitional battle. I've already said that if someone wants to use the term "Shannon information," fine. I'm not going to change that unfortunate term this late in the game. But they need to understand that when they use the term "Shannon information" they are not talking about meaning, or content, or specification, or function. Shannon himself makes this quite clear.

We could call the Shannon metric something like "statistical information" if people insist on calling it "information" at all. And we could call the other stuff "substantive information." The particular definition doesn't matter, as long as people are clear that there is a real, fundamental, substantive distinction between the two.

Eric Anderson
June 30, 2014, 11:40 AM PDT
If you wanted to discuss something other than Shannon information, this thread seems to be the wrong place for that discussion.
Neil, you made a claim about Shannon information that is patently wrong. You were corrected. Shannon information does not consider any specification in the communication channel; consequently, unspecified noise in the communication channel is included in Shannon information - just as Claude Shannon states in the second paragraph of his paper.

Upright BiPed
June 30, 2014, 11:40 AM PDT
If we didn’t define words, i.e. if words were not specified, communication would be impossible. Definitions are word specifications.
Successful communication was going on thousands of years before there were definitions or dictionaries or written language. Birds manage to communicate with their bird songs. Where are the definitions that you claim would be needed?

Neil Rickert
June 30, 2014, 11:36 AM PDT
"The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning." - Warren Weaver, one of Shannon's collaborators
And from data:
Data as an abstract concept can be viewed as the lowest level of abstraction, from which information and then knowledge are derived. ..... Beynon-Davies uses the concept of a sign to distinguish between data and information; data are symbols while information occurs when the symbols are used to refer to something.
Joe
June 30, 2014, 11:03 AM PDT
Neil Rickert:
A definition of a word is an entry into a circular chain of dictionary lookups which never gets you to a specification of meaning.
Yes, Neil, we already understand that you choose obfuscation rather than education. If we didn't define words, i.e. if words were not specified, communication would be impossible. Definitions are word specifications.

Joe
June 30, 2014, 10:55 AM PDT
Responding to Joe (#16):
Neil- a definition of a word specifies that word's meaning.
A definition of a word is an entry into a circular chain of dictionary lookups which never gets you to a specification of meaning.

Neil Rickert
June 30, 2014, 10:45 AM PDT
Replying to UB's comment (#14):
Eric made a comment about the specification inherent in information – the “aboutness” of information.
Aboutness (intentionality) is not specification.
I then point out to you that your comments are in direct opposition to what Shannon himself is saying regarding the “aboutness” of information (its specification) – i.e. its “correlation according to some system with certain physical or conceptual entities”.
Again, aboutness is not specification. I don't recall Shannon mentioning "aboutness", though he did mention meaning and semantics. A specification ought to be specific. Meaning is never specific -- it is always subjective and dependent on subjective interpretation.
... when Shannon was discussing the selection of a message for its specificity (its “correlation according to some system with certain physical or conceptual entities”) ...
Specification and correlation are not the same thing at all. When a topic claims to be about Shannon information, then it ought to be about Shannon information. Meaning and semantics, in the ordinary language sense, are not any part of Shannon information. Typically, Shannon information is a sequence of symbols. Specification, in the context of Shannon information, can only mean specification of symbols. If you wanted to discuss something other than Shannon information, this thread seems to be the wrong place for that discussion.

Neil Rickert
June 30, 2014, 10:41 AM PDT
GBDixon: Thank you for your clear summary of some basic concepts about Shannon's theory.

I would like to mention here how the concepts of Shannon are applied in Durston's model. There, the random state of a protein sequence of a certain length is considered as the highest uncertainty (highest H). The constraints given by the functional state (derived from the alignment of the known variants of that protein in the course of evolution) determine a reduction of uncertainty, which corresponds to the functional information in the protein sequence.

In general, for a protein sequence, the total number of sequences which exhibit the molecular function is the "list" of meaningful "messages". Their meaning is their functional activity, and can be defined independently. In itself, the function has nothing to do with the calculation, except for its constraints on the sequence and the consequent reduction of uncertainty.

So, an AA position which always remains the same will give a reduction of uncertainty of a little more than 4 bits, while an AA position which can assume randomly any value will give no reduction of uncertainty (0 functional information). All intermediate conditions can be found. The sum of the functional information at each position is the global functional information of that protein.

gpuccio
June 30, 2014, 10:32 AM PDT
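A minimal sketch of the per-position calculation gpuccio describes above, assuming a uniform 20-amino-acid ground state (log2(20) is about 4.32 bits) and using invented alignment columns purely for illustration: the functional information at a position is the ground-state uncertainty minus the empirical entropy of the residues observed there.

```python
import math
from collections import Counter

GROUND_STATE_BITS = math.log2(20)  # ~4.32 bits: a fully unconstrained position

def functional_bits(column_residues: str) -> float:
    """Reduction in uncertainty at one alignment position (Durston-style fits)."""
    n = len(column_residues)
    counts = Counter(column_residues)
    h_functional = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return GROUND_STATE_BITS - h_functional

# Hypothetical columns from an alignment of known variants of a protein:
print(round(functional_bits("GGGGGGGG"), 2))  # fully conserved: ~4.32 bits
print(round(functional_bits("GAVLIFMW"), 2))  # highly variable: far less
# Summing over all positions gives the protein's global functional information.
```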
Neil- a definition of a word specifies that word's meaning. And the entire world uses the word "information" in a way that information = meaning. Textbooks are full of meaningful information. Shannon did not care about meaning because, guess what, the machines that transmit and receive the signal don't care about it.

Joe
June 30, 2014, 10:11 AM PDT
Hi Eric,

In your first example of a four-bit code: The sender and receiver decide 1001 and 0110 are valid messages. This is the information. The sender only sends one of these two messages when she sends. The channel possibly corrupts the messages into any of the other possible fourteen combinations of four bits, or leaves the messages the same. The fourteen combinations are not found in the message list and are NOT information. These invalid messages plus the valid messages are the set of all possible four-bit combinations that can be received, and with these two parameters we can calculate the entropy of the information. Without an agreed-upon message list and the number of invalid messages that are possible, entropy cannot be calculated.

Note that we could have chosen any combination of four bits as a valid message. This is what Shannon means when he says the content of the messages does not matter. We may attach meaning to a message ("launch the water balloon") but as fundamental information, only what is in the valid message list is information according to Shannon.

I attach semantics and go off on tangents to illustrate that information depends on sender and receiver context. Here, the 1001 and 0110 messages are meaningless to us, but we can assert they are information because they are in the valid message list. They presumably have pithy meaning to both the sender and receiver, but we as channel engineers are not privy to that meaning and don't care what it is: only that 1001 and 0110 are valid messages and the rest are not (we discard invalid messages and ask for a resend or do something else). Attaching semantics to valid messages does not alter any of the calculations or what constitutes a valid message, but if we wish to know why 1001 is a valid message, we must consult the sender or receiver and find out what the message means.

We have generalized the concept of information, as Shannon did (Shannon actually built on the work of Hartley). Shannon did indeed generalize and formalize the concept of information. I think you may be stuck on Shannon's channel capacity theorem, but he generated two fundamental theorems: the other is called the "source coding theorem" and deals with what information is. Many very smart people had trouble with Shannon's concepts at first. I recommend John R. Pierce, "An Introduction to Information Theory," for a good explanation. It is cheap at Amazon.

Eric says:

Focus on (i) the size of the truck, (ii) the sizes of the different packets that can be carried on the truck, and (iii) the contents of the packets. Those are the key.

I agree with this as it relates to channel capacity, Eric, but aren't we mainly talking about what constitutes information and not how to send it?

Where we appear to disagree is the idea I assert that the receiver (and usually the sender) determines what the information is and what it means. Regardless of what they choose, the information can be distilled into a set of messages that are generic in nature and are sent across a possibly noisy channel to be received, checked for validity, decoded and interpreted by the receiver (again, in cells the receivers are the molecular machines that interpret the DNA sequences). Information cannot be defined without a receiver to interpret it, and information changes with receiver context.

GBDixon
June 30, 2014, 09:52 AM PDT
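A minimal sketch of the four-bit example discussed above, assuming the two valid messages are sent with equal probability: the source entropy is log2(2) = 1 bit per message, while fourteen of the sixteen receivable patterns sit outside the agreed message list.

```python
import math
from itertools import product

valid = {"1001", "0110"}                                        # agreed message list
possible = {"".join(bits) for bits in product("01", repeat=4)}  # all 16 patterns
invalid = possible - valid                                      # 14 non-messages

# Assuming both valid messages are equiprobable, the source entropy
# is log2(len(valid)) = 1 bit per message.
print(math.log2(len(valid)), "bit(s) per message;", len(invalid), "invalid patterns")
```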
Good grief Neil. Let's run this down:

Eric made a comment about the specification inherent in information – the "aboutness" of information. You then turn around to object, saying "no", all Shannon information is specified, 'noise isn't considered Shannon information.'

I then point out to you that your comments are in direct opposition to what Shannon himself is saying regarding the "aboutness" of information (its specification) – i.e. its "correlation according to some system with certain physical or conceptual entities".

So now you turn around and say the "specification" you're talking about is about the system instead, not the information itself. Moreover, you want to pretend that when Shannon was discussing the selection of a message for its specificity (its "correlation according to some system with certain physical or conceptual entities"), he was talking about the system specification as well. You are just deeply confused.

Finally, after having bastardized what Shannon has said, you want to use the opportunity to take a juvenile slap at ID for having an appropriate interest in the aspects of information that are "correlated according to some system with certain physical or conceptual entities". Obviously, you take a shot at ID on an issue you clearly do not understand. Give it a rest.

Upright BiPed
June 30, 2014, 09:22 AM PDT
A few related notes that may be of interest to the reader:

Measuring the functional sequence complexity of proteins - Kirk K Durston, David KY Chiu, David L Abel and Jack T Trevors - 2007
Excerpt: We have extended Shannon uncertainty by incorporating the data variable with a functionality variable. The resulting measured unit, which we call Functional bit (Fit), is calculated from the sequence data jointly with the defined functionality variable. To demonstrate the relevance to functional bioinformatics, a method to measure functional sequence complexity was developed and applied to 35 protein families.
http://www.tbiomed.com/content/4/1/47

At the 17 minute mark of the following video, Winston Ewert speaks on how functional information is measured in proteins:
Proposed Information Metric: Conditional Kolmogorov Complexity (Ewert) - July 2012 - video
http://www.youtube.com/watch?v=fm3mm3ofAYU

Conversations with William Dembski - The Thesis of Being as Communion (The Metaphysics of Information) - video
https://www.youtube.com/watch?v=cYAsaU9IvnI

bornagain77
June 30, 2014, 09:00 AM PDT
Responding to EA's comment (#8):
Well, which is it, the information or the channel capacity?
Strictly speaking, the metric is the amount of information. The channel capacity is a rate (max amount of information per unit time).
In which case we have then lost the value of the word “information” and we need to come up with a different word to describe what we originally used to think of as information.
No, we have not lost the value of the word "information." As Shannon has said, information can carry semantics. But, as defined by Shannon, the term "information" does not refer to the semantics.

I understand that you want "information" to actually refer to the semantics. I used to think that way, and I probably posted comments to that effect on usenet, maybe 20 years ago. I have since come to recognize that I was mistaken. I now see that Shannon information, which does not refer to the semantics, is the most appropriate meaning for "information." And, here, I am talking about what is appropriate for studying questions of human cognition. I'm not limiting myself to digital technology.

The problem with semantics (or meaning), is that it cannot be specified. If you want to insist that "information" is a reference to semantics, then you must give up on the "specified" part of CSI. It is the nature of ordinary language, that meanings are inherently subjective and unspecifiable.

Neil Rickert
June 30, 2014, 08:30 AM PDT
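To make the "rate" point in the comment above concrete: for a binary symmetric channel with crossover probability p, the classic capacity formula is C = 1 - H2(p) bits per channel use, and multiplying by the signalling rate gives information per unit time. A hedged sketch; the crossover probability and signalling rate below are illustrative only.

```python
import math

def binary_entropy(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

p = 0.1                    # illustrative 10% bit-flip probability
symbols_per_second = 1000  # illustrative signalling rate
print(f"{bsc_capacity(p):.3f} bits per channel use, "
      f"{bsc_capacity(p) * symbols_per_second:.0f} bits/s max reliable rate")
```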
GBDixon @7: Thanks for your thoughts. Are you suggesting that if we know, for example, that we are dealing with, say, a 4-bit digital code, it is impossible to calculate the Shannon metric without first knowing all of the possible messages that can be sent using that 4-bit digital code? I hope that is not what you meant by your first paragraph, because that is completely wrong.
Shannon did more than determine channel capacity. He gave a fundamental definition of what constitutes information. This applies to all information. That is why his work is considered groundbreaking.
No. He focused on one particular aspect of communication which is highly relevant to the transmission of information. That is why his work is considered groundbreaking. Again, Shannon acknowledged that the meaning and the semantics -- what we generally think of as information if you were to ask anyone on the street, if you were to look the word up in the dictionary, if you were to look at the etymology of the word -- is irrelevant to his calculation.
In Eric’s fruit truck example there are more than one receivers and more than one set of information. To the woman who calculates truck gas consumption, the information is the weight of the load and the length of the route. To the person who loads the truck, the information is how much space is left. To the person who unloads the truck it is how much more is left to unload before I can take my coffee break. To the customer, it is what is wanted.
Look, any analogy can be stretched too far. You are going off on tangents. The reason I worded the analogy the way I did is to focus on the carrying capacity, which is what the Shannon metric deals with. Don't get off on irrelevant tangents about the driver, the coffee breaks for the dock workers, etc. Yes, there is lots of "information" about a physical system that can be gleaned by someone observing the system. That is separate from the information contained in the transmission itself. (This question about information being contained in a physical system has been dealt with in detail previously on this site.) Focus on (i) the size of the truck, (ii) the sizes of the different packets that can be carried on the truck, and (iii) the contents of the packets. Those are the key.
We usually presume that the sender only sends the things the receiver (customer) wants and that something bad happens to the messages along the way. In biology, this is usually considered to be the DNA mutations that creep in and do nothing or cause problems.
Yes, noise is an interesting issue, but it is easily included in our example if you want (spoilage in transit, a box falling off the truck, someone stealing fruit from a box while the driver is parked at the coffee shop, etc.). Noise is an important point for transmission and one that Shannon spent a lot of time focusing on. However, I would note that most of the calculations of the Shannon metric that are done in the context of the design debate assume a noiseless channel. Indeed, we need to assume a noiseless channel initially to come up with a base calculation (as Shannon also did). Noise makes the question of transmission more challenging, but it is not typically relevant for the encoding aspect we are thinking about.

Eric Anderson
June 30, 2014, 08:14 AM PDT
Responding to UB's comment. Sorry, but you are confused here.
You recognize the “apple” because it’s specified; you know what those particular scribbles mean. Shannon is telling you that this process of selection at the receiver is irrelevant to the problem of engineering a communication channel that can carry any scribble – the one selected and all the others that are not, i.e. the noise. That is what the system has to be able to do in order to function.
The actual engineering that Shannon mentions mostly consists of specifications and of hardware that implements those specifications. This is not the ID kind of specification, as in "Oh, it looks specified, so it just must be." These are detailed engineering specifications that you can find in libraries and probably on the internet. A varying electrical signal can be converted to a sequence of bits in a gazillion different ways. But, somehow, the Internet works because we always manage to get the bits that were intended. It works because of all of those engineering specifications, and because the equipment all follows those specifications both in generating the signal and in interpreting it.

Neil Rickert
June 30, 2014, 08:08 AM PDT
The argument from information sure has come a long way on UD. Years ago on UD I remember many neo-Darwinists denied information was even in the cell. In fact, in the past the following comment has been used far more than once on UD:

Information Theory, Evolution, and the Origin of Life - Hubert P. Yockey, 2005
Excerpt: "Information, transcription, translation, code, redundancy, synonymous, messenger, editing, and proofreading are all appropriate terms in biology. They take their meaning from information theory (Shannon, 1948) and are not synonyms, metaphors, or analogies."
http://www.cambridge.org/catalogue/catalogue.asp?isbn=9780521802932&ss=exc

Then after a while, when some of the Darwinists realized that information actually was in the cell, and that it was not an 'illusion', their arguments 'evolved' to say that the information in the cell was merely Shannon information, and that, using Shannon's broad metric, information, the Darwinists held, can therefore increase with Darwinian evolution:

The Evolution-Lobby's Useless Definition of Biological Information - Feb. 2010
Excerpt: By wrongly implying that Shannon information is the only "sense used by information theorists," the NCSE avoids answering more difficult questions like how the information in biological systems becomes functional, or in its own words, "useful." Since biology is based upon functional information, Darwin-skeptics are interested in the far more important question of, Does neo-Darwinism explain how new functional biological information arises?
http://www.evolutionnews.org/2010/02/the_evolutionlobbys_useless_de.html

Three subsets of sequence complexity and their relevance to biopolymeric information - Abel, Trevors
Excerpt: Three qualitative kinds of sequence complexity exist: random (RSC), ordered (OSC), and functional (FSC). ... Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC). ... We repeat that a single incident of nontrivial algorithmic programming success achieved without selection for fitness at the decision-node programming level would falsify any of these null hypotheses. This renders each of these hypotheses scientifically testable. We offer the prediction that none of these four hypotheses will be falsified.
http://www.tbiomed.com/content/2/1/29

Programming of Life - Information - Shannon, Functional & Prescriptive - video
https://www.youtube.com/watch?v=h3s1BXfZ-3w

Programming of Life - Dr. Donald Johnson (the difference between Shannon Information and Prescriptive Information) - audio podcast
http://www.idthefuture.com/2010/11/programming_of_life.html

But even using the 'loose' definition of Shannon information, instead of the more precise definitions of functional information (i.e. CSI, prescriptive, etc.), has insurmountable problems for Darwinists.

The GS (genetic selection) Principle - David L. Abel - 2009
Excerpt: Stunningly, information has been shown not to increase in the coding regions of DNA with evolution. ... What the above papers show is that not even variation of the duplication produces new information, not even Shannon "information."
http://www.bioscience.org/fbs/getfile.php?FileName=/2009/v14/af/3426/3426.pdf

Moreover, using the Shannon metric for information (i.e. channel capacity), we find the first DNA code of life on earth had to be at least as complex as the current DNA code found in life:

Shannon Information - Channel Capacity - Perry Marshall - video
http://www.metacafe.com/watch/5457552/

"Because of Shannon channel capacity that previous (first) codon alphabet had to be at least as complex as the current codon alphabet (DNA code), otherwise transferring the information from the simpler alphabet into the current alphabet would have been mathematically impossible." - Donald E. Johnson - Bioinformatics: The Information in Life

"Biophysicist Hubert Yockey determined that natural selection would have to explore 1.40 x 10^70 different genetic codes to discover the optimal universal genetic code that is found in nature. The maximum amount of time available for it to originate is 6.3 x 10^15 seconds. Natural selection would have to evaluate roughly 10^55 codes per second to find the one that is optimal. Put simply, natural selection lacks the time necessary to find the optimal universal genetic code we find in nature." - Fazale Rana, The Cell's Design, 2008, page 177

The reason why a code is an all-or-nothing deal that cannot be evolved gradually is best summarized by Dawkins:

Venter vs. Dawkins on the Tree of Life - and Another Dawkins Whopper - March 2011
Excerpt: But first, let's look at the reason Dawkins gives for why the code must be universal: "The reason is interesting. Any mutation in the genetic code itself (as opposed to mutations in the genes that it encodes) would have an instantly catastrophic effect, not just in one place but throughout the whole organism. If any word in the 64-word dictionary changed its meaning, so that it came to specify a different amino acid, just about every protein in the body would instantaneously change, probably in many places along its length. Unlike an ordinary mutation...this would spell disaster." (2009, p. 409-10) OK. Keep Dawkins' claim of universality in mind, along with his argument for why the code must be universal, and then go here (linked site listing 23 variants of the genetic code). Simple counting question: does "one or two" equal 23? That's the number of known variant genetic codes compiled by the National Center for Biotechnology Information. By any measure, Dawkins is off by an order of magnitude, times a factor of two.
http://www.evolutionnews.org/2011/03/venter_vs_dawkins_on_the_tree_044681.html

And this problem of multiple codes, which can't be derived by gradual processes, has recently become much more acute for Darwinists:

A glimpse into nature's looking glass -- to find the genetic code is reassigned: Stop codon varies widely - May 22, 2014
Excerpt: While a few examples of organisms deviating from this canonical code had been serendipitously discovered before, these were widely thought of as very rare evolutionary oddities, absent from most places on Earth and representing a tiny fraction of species. Now, this paradigm has been challenged by the discovery of large numbers of exceptions from the canonical genetic code. ... Approximately 99% of all microbial species on Earth fall in this category, defying culture in the laboratory but profoundly influencing the most significant environmental processes from plant growth and health, to the carbon and other nutrient cycles on land and sea, and even climate processes. ... "We were surprised to find that an unprecedented number of bacteria in the wild possess these codon reassignments, from 'stop' to amino-acid encoding 'sense,' up to 10 percent of the time in some environments," said Rubin. Another observation the researchers made was that beyond bacteria, these reassignments were also happening in phage, viruses that attack bacterial cells. ... The punch line, Rubin said, is that the dogma is wrong. "Phage apparently don't really 'care' about the codon usage of the host."
http://www.sciencedaily.com/releases/2014/05/140522141422.htm

Supplemental notes:

"In the last ten years, at least 20 different natural information codes were discovered in life, each operating to arbitrary conventions (not determined by law or physicality). Examples include protein address codes [Ber08B], acetylation codes [Kni06], RNA codes [Fai07], metabolic codes [Bru07], cytoskeleton codes [Gim08], histone codes [Jen01], and alternative splicing codes [Bar10]." - Donald E. Johnson - Programming of Life - pg. 51 - 2010

"Not only are there many different codes in the sequences, but they overlap, so that the same letters in a sequence may take part simultaneously in several different messages." - Edward N. Trifonov - 2010

bornagain77
June 30, 2014, 07:47 AM PDT
Neil @2:
The metric is a way of measuring the amount of information (or the channel capacity).
Well, which is it, the information or the channel capacity? The only way you can say the metric is measuring "information" is to redefine information to mean the same thing as "channel capacity." In which case we have then lost the value of the word "information" and we need to come up with a different word to describe what we originally used to think of as information.

The whole term "Shannon information" causes more confusion than light. That is precisely why an analogy is helpful for people to start grasping what we are dealing with. Again, as I said, I'm not going to change decades of people referring to the Shannon metric as "Shannon information." That is too ingrained and there is too much inertia. But we need to be very clear when talking about so-called Shannon information that we are not talking about the key aspects that we typically think of when we use the word information. We are not talking about semantics, vocabulary, meaning, intent, purpose, informing. All those, as Shannon himself clearly stated (and as UB pointed out above), are irrelevant to the Shannon metric.
I am inclined to say that all Shannon information is specified.
Then you do not understand specification, as it is used in the intelligent design context. Specification involves meaning, purpose, function – precisely the kinds of things Shannon said were irrelevant to his channel capacity problem.

Eric Anderson
June 30, 2014, 07:42 AM PDT
The comments are correct: The set of valid messages among the total of all possible messages that can be received defines the information. Channel capacity has no meaning without this message list, and information entropy and the channel capacity cannot be calculated without knowing the number of valid messages among the number of possible messages.

When both a sender and receiver are involved, they decide together the set, or list, of valid messages: THEY decide what the information is. In SETI and molecular biology, there may be senders but their decisions regarding messages must be inferred, as we cannot establish an a priori message list. In SETI we look for message sequences that are unlikely to be caused by natural events. In biology there are DNA sequences that are clearly information because they code for functional proteins. Some other sequences may not be clearly information, and we can argue whether they are or not, so the exact information content of a gene may be unknown but it has a lower bound. The set of all possible DNA sequences constitutes the possible messages that can be received (by the molecular machines that interpret the sequences), and we can calculate an entropy bound as well.

Shannon did more than determine channel capacity. He gave a fundamental definition of what constitutes information. This applies to all information. That is why his work is considered groundbreaking.

In Eric's fruit truck example there is more than one receiver and more than one set of information. To the woman who calculates truck gas consumption, the information is the weight of the load and the length of the route. To the person who loads the truck, the information is how much space is left. To the person who unloads the truck it is how much more is left to unload before I can take my coffee break. To the customer, it is what is wanted. To the gas purchaser Mr. Shannon has maximized the truck's capacity, but by only delivering a small subset of what the customer wanted, Mr. Shannon's delivery is NOT what the truck is capable of, and he fails in his mission to maximize channel capacity for information as defined by the customer.

The truck is a special case, and therefore kind of a poor example, because it is capable of delivering, unchanged, a complete load of exactly what the customer wants without corruption. There is no 'noise' that corrupts the shipment (except maybe spoilage). We usually presume that the sender only sends the things the receiver (customer) wants and that something bad happens to the messages along the way. In biology, this is usually considered to be the DNA mutations that creep in and do nothing or cause problems.

GBDixon
June 30, 2014, 07:12 AM PDT
UB: Yes, the detection of a communicated message in the face of the interference of noise is a case of inference to design, to intelligent message rather than noise. We need to ponder the significance of the metric signal-to-noise power ratio, which implies that there are characteristic . . . empirically characteristic . . . differences between overwhelmingly typical messages and overwhelmingly typical noise.

This also appears in filter theory, where noise as a rule is high frequency "grass" on a CRO screen -- in the case of the classic Telequipment D52 (I think eventually bought out by Tektronix), that was very literal, as the screen is bright green. Signals tend to be much narrower band, centred on a carrier wave. And yes, I am familiar with frequency hopping and other spread spectrum techniques. They boil down to imposing a code that, if known, allows us to pull a signal almost like magic out of what would otherwise appear to be noise. Just think of the old electronics rule of thumb to avoid differentiators like the plague, as this is a noise-amplifying process. KF

kairosfocus
June 30, 2014, 02:41 AM PDT
EA et al, yes, Shannon info is really an info-carrying capacity metric. In particular, the entropy in a message is the average info capacity per symbol, which is linked to the info approach to entropy. (Entropy, on such a view, is a metric of average missing info to specify microstate given only a macro-level description of a thermodynamic system; which forces us to treat the microstate as effectively random, in trying to extract work, etc.) That is why functionally specific complex organisation and associated info is a needed further step. KF

kairosfocus
June 30, 2014, 02:33 AM PDT
Ahh. Forgot the two words that clarify the point: He is telling you that there is a message to be communicated from one point to another point, where it will be selected at that point [as well]. It will be selected — because it is specified.

Upright BiPed
June 30, 2014, 01:03 AM PDT
Neil, one hardly knows where to start with your comments in #2. Listen to what Claude Shannon is telling you:
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.
He is telling you that there is a message to be communicated from one point to another point, where it will be selected at that point. It will be selected -- because it is specified, Neil. Representations have to be related to their meaning at the receiver in order for the message to be communicated. It’s the distinction between me writing a meaningless scribble on a piece of paper, or, writing the word “apple”. You recognize the “apple” because it’s specified; you know what those particular scribbles mean. Shannon is telling you that this process of selection at the receiver is irrelevant to the problem of engineering a communication channel that can carry any scribble - the one selected and all the others that are not, i.e. the noise. That is what the system has to be able to do in order to function.
I am inclined to say that all Shannon information is specified. Noise signals are not usually considered to be Shannon information.
You need a complete overhaul.

Upright BiPed
June 29, 2014, 11:56 PM PDT
As I have also written before in comments on UD, I don’t begrudge anyone referring to the Shannon metric as “information.”
Just to be clear, the metric is not the information. The metric is a way of measuring the amount of information (or the channel capacity).
Such thinking demonstrates a lack of understanding of CSI – in particular the need for specification.
I am inclined to say that all Shannon information is specified. Noise signals are not usually considered to be Shannon information.
I want to now take this experience and turn it into a parable that relates to Shannon information.
Your parable is a poor illustration. Shannon was concerned with correct transmission of information over a noisy channel. Your parable ignores the concern with correctness. It is perhaps an analogy for a noisy channel, but it is misleading as an account of Shannon information.

Neil Rickert
June 29, 2014, 10:17 PM PDT
Nota bene: I drafted this post more than two months ago, meaning to work on it more before publishing. Unfortunately, work obligations and life generally have prevented me from spending further time on it. As a result, I am publishing it "as is" in the hope it might nevertheless serve as a point for discussion.

Eric Anderson
June 29, 2014, 08:13 PM PDT