Claude Shannon: the man who failed to transform our understanding of information
August 31, 2017 | Posted by News under Evolution, Information
Well, Columbia’s Rob Goodman thinks he did, at Aeon:
Shannon’s ‘mathematical theory’ sets out two big ideas. The first is that information is probabilistic. We should begin by grasping that information is a measure of the uncertainty we overcome, Shannon said – which we might also call surprise. What determines this uncertainty is not just the size of the symbol vocabulary, as Nyquist and Hartley thought. It’s also about the odds that any given symbol will be chosen. Take the example of a coin-toss, the simplest thing Shannon could come up with as a ‘source’ of information. A fair coin carries two choices with equal odds; we could say that such a coin, or any ‘device with two stable positions’, stores one binary digit of information. Or, using an abbreviation suggested by one of Shannon’s co-workers, we could say that it stores one bit.
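The fair-coin example above can be made concrete with a few lines of code. This is a minimal illustrative sketch (not from Shannon's paper or the Aeon piece): the self-information of an outcome with probability p is -log2(p) bits, so a fair coin's 50-50 outcome carries exactly one bit.

```python
import math

def self_information(p: float) -> float:
    """Bits of surprise in an outcome that occurs with probability p."""
    return -math.log2(p)

# A fair coin: each outcome has probability 0.5, so a flip resolves 1 bit.
print(self_information(0.5))   # 1.0
# A rarer outcome is more surprising, hence more informative.
print(self_information(0.25))  # 2.0
```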
But the crucial step came next. Shannon pointed out that most of our messages are not like fair coins. They are like weighted coins. A biased coin carries less than one bit of information, because the result of any flip is less surprising. Shannon illustrated the point with this graph. You see that the amount of information conveyed by our coin flip (on the y-axis) reaches its apex when the odds are 50-50, represented as 0.5 on the x-axis; but as the outcome grows more predictable in either direction depending on the size of the bias, the information carried by the coin steadily declines.
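The graph described above is the binary entropy function, H(p) = -p log2(p) - (1-p) log2(1-p). A short sketch (mine, not the article's) reproduces its shape numerically: the curve peaks at one bit when p = 0.5 and falls toward zero as the coin becomes more biased in either direction.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Trace the curve: symmetric about 0.5, maximal at the fair coin.
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: {binary_entropy(p):.3f} bits")
```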
This is where language enters the picture as a key conceptual tool. Language is a perfect illustration of this rich interplay between predictability and surprise. We communicate with one another by making ourselves predictable, within certain limits. Put another way, the difference between random nonsense and a recognisable language is the presence of rules that reduce surprise. More.
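The point about rules reducing surprise can be estimated directly from symbol frequencies. As a rough sketch (my own example, under the simplifying assumption that symbols are independent), a string governed by a strict rule has lower per-symbol entropy than one in which every symbol is equally likely:

```python
import math
from collections import Counter

def entropy_bits(text: str) -> float:
    """Per-symbol Shannon entropy of a string, from its letter frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A rule-bound string is predictable, so it carries fewer bits per symbol
# than a string in which eight symbols are all equally likely.
print(entropy_bits("abababababab"))  # 1.0 bit/symbol
print(entropy_bits("abcdefgh"))      # 3.0 bits/symbol
```

Real language sits between these extremes: its rules make some letters and words far likelier than others, which is exactly the redundancy the quoted passage describes.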
The unfortunate reality is that most science writers, though they work within an information paradigm even as they write, proceed as though the Big Fix (the universe as simply matter and energy) is still just around the corner, along with the multiverse and the success of string theory. And the sure-thing tale of human evolution.
See also: Data basic: An introduction to information theory
Introduction to Evolutionary Informatics
New Scientist: Human evolution “more baffling than we thought” (after all this time)