Uncommon Descent Serving The Intelligent Design Community

What is “information”?

Information, notoriously, is a concept with many senses. As it is central to the design inference, let us look (again) at defining it.

We can dispose of one sense right off: Shannon was not directly interested in information but in information-carrying capacity; that is why his metric peaks for a truly random signal, which has minimal redundancy. The bit measure commonly seen in ICT circles or in our PC memories etc. is actually this capacity measure: 1 kbit is 1,024 = 2^10 binary digits of storage or transmission capacity, one binary digit or bit being a unit of information storing one choice between a pair of alternatives such as yes/no, true/false, on/off, high/low or N-pole/S-pole. Obviously, the meaningful substance that is stored or communicated, or that may be implicit in a coherent, organised, functional entity, is the sense of information that is most often relevant.
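To make the capacity idea concrete, here is a minimal Python sketch (the function name is my own, purely illustrative): n binary digits can register one choice among 2^n distinct alternatives.

```python
# n binary digits (bits) of capacity register one choice among 2**n alternatives
def distinguishable_states(n_bits: int) -> int:
    return 2 ** n_bits

print(distinguishable_states(1))   # 2    -- one yes/no, on/off or N-pole/S-pole choice
print(distinguishable_states(10))  # 1024 -- hence 1 kbit = 2^10 binary digits
```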

That is, as F. R. Connor put it in his series of short telecommunications textbooks,

Information is not what is actually in a message but what could constitute a message. The word could implies a statistical definition in that it involves some selection of the various possible messages. The important quantity is not the actual information content of the message but rather its possible information content. [Signals, Edward Arnold, 1972, p. 79.]

So, we come to a version of the Shannon Communication system model:

A communication system

Elaborating slightly by expanding the encoder-decoder stages (and following the general framework of the ISO OSI model):

In this model, information-bearing messages flow from a source to a sink, by being: (1) encoded, (2) transmitted through a channel as a signal, (3) received, and (4) decoded. At each corresponding pair of stages (source/sink encoding/decoding, transmitting/receiving) there is in effect a mutually agreed standard, a so-called protocol. [For instance, HTTP — hypertext transfer protocol — is a major protocol for the Internet. This is why many web page addresses begin: “http://www . . .”]
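For instance, a minimal HTTP exchange shows a mutually agreed protocol in action: both ends must share the same conventions for the bytes to mean anything (an illustrative Python sketch; it needs network access, and example.com is the domain reserved for documentation examples):

```python
import socket

# A minimal raw-socket HTTP/1.0-style exchange. Both sides follow the shared
# protocol: a request line, headers, a blank line, then the response.
with socket.create_connection(("example.com", 80)) as s:
    s.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = s.recv(200)

# First line of the reply, e.g. 'HTTP/1.0 200 OK'
print(reply.decode("ascii", errors="replace").splitlines()[0])
```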

However, as the diagram hints, noise affects each stage of the process, so that under certain conditions detecting and distinguishing the signal from the noise becomes a challenge. Indeed, since noise is due to randomly fluctuating values of various physical quantities [due in turn to the random behaviour of particles at molecular levels], detecting a message and accepting it as legitimate rather than as noise that got lucky is a question of inference to design. In short, inescapably, the design inference issue is foundational to communication science and information theory.
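As a toy illustration of the noise problem (a Python sketch of my own, modelling a binary symmetric channel rather than anything from Connor): each transmitted bit is independently flipped with some small probability, and the receiver must still judge what was sent.

```python
import random

def noisy_channel(bits, flip_prob=0.05, seed=42):
    """Toy binary symmetric channel: each bit is independently flipped
    with probability flip_prob, modelling random physical noise."""
    rng = random.Random(seed)
    return [bit ^ 1 if rng.random() < flip_prob else bit for bit in bits]

sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(sent)
print("sent:    ", sent)
print("received:", received)
print("bit errors:", sum(s != r for s, r in zip(sent, received)))
```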

Going beyond this, the context of information technology, communication systems and computers provides a vital clarifying side-light on how complex, specified information functions in information-processing systems, and so also on what information is as contrasted with data and knowledge:

[In the context of computers, etc.] information is data — i.e. digital representations of raw events, facts, numbers and letters, values of variables, etc. — that have been put together in ways suitable for storing in special data structures [strings of characters, lists, tables, “trees” etc], and for processing and output in ways that are useful [i.e. functional]. . . . Information is distinguished from [a] data: raw events, signals, states etc represented digitally, and [b] knowledge: information that has been so verified that we can reasonably be warranted in believing it to be true. [GEM/TKI, UWI FD12A Sci Med and Tech in Society Tutorial Note 7a, Nov 2005.]

Going to Principia Cybernetica Web as archived, we find three related discussions:

INFORMATION

1) that which reduces uncertainty. (Claude Shannon); 2) that which changes us. (Gregory Bateson)


Literally that which forms within, but more adequately: the equivalent of or the capacity of something to perform organizational work, the difference between two forms of organization or between two states of uncertainty before and after a message has been received, but also the degree to which one variable of a system depends on or is constrained by (see constraint) another. E.g., the DNA carries genetic information inasmuch as it organizes or controls the orderly growth of a living organism. A message carries information inasmuch as it conveys something not already known. The answer to a question carries information to the extent it reduces the questioner’s uncertainty. A telephone line carries information only when the signals sent correlate with those received. Since information is linked to certain changes, differences or dependencies, it is desirable to refer to them and distinguish between information stored, information carried, information transmitted, information required, etc. Pure and unqualified information is an unwarranted abstraction. Information theory measures the quantities of all of these kinds of information in terms of bits. The larger the uncertainty removed by a message, the stronger the correlation between the input and output of a communication channel, and the more detailed particular instructions are, the more information is transmitted. (Krippendorff)


Information is the meaning of the representation of a fact (or of a message) for the receiver. (Hornung)

The point is that information itself is not a material artifact, though it is often embedded in such. It is abstract but nonetheless very real, very effective, very functional. So much so that it lies at the heart of cell-based life and of the wave of technology currently further transforming our world.

It is in the light of the above concerns, issues and concepts that, a few days ago, I added the following rough working definition to a current UD post. On doing so, I thought it worth headlining in its own right. So, let me now cite:

>>to facilitate discussion [as, a good general definition that does not “bake-in” information being a near-synonym to knowledge is hard to find . . . ] we may roughly identify information as

1: facets of reality [–> a way to speak of an abstract entity] that may capture and so frame — give meaningful FORM to

2: representations of elements of reality — such representations being items of data — that

3: by accumulation of such structured items . . .

4: [NB: which accumulation is in principle quantifiable, e.g. by defining a description language that chains successive y/n questions to specify a unique/particular description or statement, thence I = – log p in the Shannon case, etc],

5: meaningful complex messages may then be created, modulated, encoded, decoded, conveyed, stored, processed and otherwise made amenable to use by a system, entity or agent interacting with the wider world. E.g. consider the ASCII code:

The ASCII code takes seven y/n questions per character to specify text in English, yielding 7 bits per character of FSCO/I for such text

or, the genetic code (notice, the structural patterns set by the first two letters):

 

The genetic code uses three-letter codons to specify the sequence of amino acids (AAs) in proteins, as well as start/stop signals, using six bits per AA

or, mRNA . . . notice, U not T:

Genetic code (RNA form), courtesy Wiki

or, a cybernetic entity using informational signals to guide its actions:

The Derek Smith two-tier controller cybernetic model

. . . or, a von Neumann kinematic self-replicator:

(Of course, such representations may be more or less accurate or inaccurate, or even deceitful. Thus, knowledge requires information but has to address warrant as credibly truthful. Wisdom goes beyond knowledge to imply sound insight into fundamental aspects of reality that guide sound, prudent, ethical action.)>>
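The bit counts used in the examples above follow directly from counting yes/no alternatives, I = log2(number of alternatives); a quick numeric check (Python, my own illustration):

```python
from math import log2

print(log2(2 ** 7))   # 7.0 bits per character: 7 y/n questions, 128 ASCII alternatives
print(log2(4 ** 3))   # 6.0 bits per codon: 3 letters from a 4-letter alphabet, 64 codons

# So a 100-codon protein-coding stretch specifies about 600 bits,
# and a 143-character ASCII message about 1,001 bits of capacity.
print(100 * 6, 143 * 7)
```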

Food for thought. END

PS: It would be remiss of me not to tie in the informational thermodynamics link, with a spot of help from Wikipedia on that topic. Let me clip from my longstanding briefing note:

>>To quantify the above definition of what is perhaps best descriptively termed information-carrying capacity, but has long been simply termed information (in the “Shannon sense” – never mind his disclaimers . . .), let us consider a source that emits symbols from a vocabulary: s1, s2, s3, . . . sn, with probabilities p1, p2, p3, . . . pn. That is, in a “typical” long string of symbols, of size M [say this web page], the average number that are some sj, J, will be such that the ratio J/M –> pj, and in the limit attains equality. We term pj the a priori — before the fact — probability of symbol sj. Then, when a receiver detects sj, the question arises as to whether this was sent. [That is, the mixing in of noise means that received messages are prone to misidentification.] If, on average, sj will be detected correctly a fraction dj of the time, the a posteriori — after the fact — probability of sj is, by a similar calculation, dj. So, we now define the information content of symbol sj as, in effect, how much it surprises us on average when it shows up in our receiver:

I = log [dj/pj], in bits [if the log is base 2, log2] . . . Eqn 1

This immediately means that the question of receiving information arises AFTER an apparent symbol sj has been detected and decoded. That is, the issue of information inherently implies an inference to having received an intentional signal in the face of the possibility that noise could be present. Second, logs are used in the definition of I, as they give an additive property: for, the amount of information in independent signals, si + sj, using the above definition, is such that:

I total = Ii + Ij . . . Eqn 2

For example, assume that dj for the moment is 1, i.e. we have a noiseless channel so what is transmitted is just what is received. Then, the information in sj is:

I = log [1/pj] = – log pj . . . Eqn 3

This case illustrates the additive property as well, assuming that symbols si and sj are independent. That means that the probability of receiving both messages is the product of the probability of the individual messages (pi *pj); so:

Itot = log [1/(pi*pj)] = [-log pi] + [-log pj] = Ii + Ij . . . Eqn 4

So if there are two symbols, say 1 and 0, and each has probability 0.5, then for each, I is – log [1/2], on a base of 2, which is 1 bit. (If the symbols were not equiprobable, the less probable binary digit-state would convey more than, and the more probable, less than, one bit of information. Moving over to English text, we can easily see that E is as a rule far more probable than X, and that Q is most often followed by U. So, X conveys more information than E, and U conveys very little, though it is useful as redundancy, which gives us a chance to catch errors and fix them: if we see “wueen” it is most likely to have been “queen.”)
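For instance (a minimal Python sketch of Eqns 3 and 4; the letter probabilities below are rough illustrative figures, not measured values):

```python
from math import log2

def surprisal(p):
    """I = -log2(p), Eqn 3: information in bits for a symbol of a priori
    probability p, assuming a noiseless channel (dj = 1)."""
    return -log2(p)

print(surprisal(0.5))        # 1.0 bit   -- an equiprobable binary digit
print(surprisal(0.127))      # ~2.98 bits -- 'E', common, so little surprise
print(surprisal(0.0015))     # ~9.38 bits -- 'X', rare, so much more surprise
print(surprisal(0.5 * 0.5))  # 2.0 bits  -- additivity, Eqn 4: Ii + Ij
```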

Further to this, we may average the information per symbol in the communication system thus (giving it in terms of –H to make the additive relationships clearer):

– H = p1 log p1 + p2 log p2 + . . . + pn log pn

or, H = – SUM [pi log pi] . . . Eqn 5
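In code (a minimal sketch of Eqn 5 in Python):

```python
from math import log2

def entropy(probs):
    """Eqn 5: H = -SUM pi*log2(pi), average information per symbol, bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit/symbol: two equiprobable symbols
print(entropy([0.9, 0.1]))   # ~0.47: a biased source averages less per symbol
print(entropy([0.25] * 4))   # 2.0: four equiprobable symbols
```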

H, the average information per symbol transmitted [usually measured in bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: “it is often referred to as the entropy of the source.” [p. 81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf. also here):

At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann’s constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing.

But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).

Summarising Harry Robertson’s Statistical Thermophysics (Prentice-Hall International, 1993) — excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~6.022*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.)

For, as he astutely observes on pp. vii – viii:

. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if  I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .

And, in more detail (pp. 3 – 6, 7, 36; cf. Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent arXiv papers by Duncan and Semura here and here):

. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event]  y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . 

[deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati’s discussion of debates and the issue of open systems here . . . ]

H({pi}) = – C [SUM over i] pi*ln pi, [. . . “my” Eqn 6]

[where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp – beta*yi) = Z [Z being in effect the partition function across microstates, the “Holy Grail” of statistical thermodynamics]. . . .

[H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . 

Jaynes’ [summary rebuttal to a typical objection] is “. . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly ‘objective’ quantity . . . it is a function of [those variables] and does not depend on anybody’s personality. There is no reason why it cannot be measured in the laboratory.” . . . . [pp. 3 – 6, 7, 36; replacing Robertson’s use of S for Informational Entropy with the more standard H.]
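A numeric sketch of the formalism just quoted may help (Python, my own illustration with arbitrary energy units): probabilities are assigned via the Boltzmann factors and partition function Z, and the informational entropy H then behaves as expected, rising as probability spreads across more accessible states.

```python
from math import exp, log

def boltzmann_probs(energies, beta):
    """pi = exp(-beta*yi)/Z, with Z the partition function (sum over states)."""
    factors = [exp(-beta * y) for y in energies]
    Z = sum(factors)
    return [f / Z for f in factors]

def H(probs):
    """Informational entropy H = -SUM pi*ln(pi), in nats; multiply by k
    (Boltzmann's constant) to get thermodynamic entropy on this reading."""
    return -sum(p * log(p) for p in probs if p > 0)

levels = [0.0, 1.0, 2.0]        # illustrative energy levels, arbitrary units
for beta in (10.0, 1.0, 0.1):   # beta = 1/kT: large beta = cold, small = hot
    p = boltzmann_probs(levels, beta)
    print(f"beta={beta:>4}: H = {H(p):.3f} nats")
# Heating the system (smaller beta) spreads probability over more microstates,
# lengthening any complete state description -- i.e., entropy rises.
```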

As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life’s Origin [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then — again following Brillouin — identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously “plausible” primordial “soups.” In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale.

By many orders of magnitude, we don’t get to even one molecule each of the required polymers per planet, much less bringing them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics, so let us now return to that focus; in particular addressing information in its functional sense, as the third step in this preliminary analysis.>>

Further food for thought.

Comments
JM, I never denied that there is a conflict between elements of QM and RT, though at other points they are blended. After 100 years, it is not resolved. KF
kairosfocus — December 16, 2017, 06:30 AM PDT
KF@ 31 I'm aware of correspondence principle. I'm also hopeful that you are aware of correspondence limit... Problems like that are just indicators of the incompatibility of theories like general relativity and quantum mechanics that can't be consolidated... QM has never been proven wrong and one of these will have to be scrapped...I'm putting my money on QM...
J-Mac — December 14, 2017, 10:32 AM PDT
RVB8, Appreciated. I am told that across time the pain of loss fades and the positive memories provide a balm. I find that those last few minutes are a bitter-sweet mix of the two . . . and for nearly 3 months, they ran like a movie in my mind, imprinted like a burned-in image in the retina. That, I did not expect. KF
kairosfocus — December 14, 2017, 02:03 AM PDT
Kairos, very sorry for your loss. I remember keenly the loss of my own father. Although I did not have the consolation of Christ to fall back upon, I found family and friends sufficient. All the best, Rob.
rvb8 — December 13, 2017, 08:05 PM PDT
F/N: Belfast asked for a y/n answer, above. This shows how a 1-bit result exists in a context of a language, as in writing I had to use the word no, two ASCII characters, forming a word in a language -- itself a huge entity that is information rich. In turn the concept, no, is part of a world of experience, which we implicitly bring to the situation and which sets context for the significance of the answer above and beyond the context of a question in a life-situation. So, that bit is actually a switch, accessing much that is not visible in giving 0 vs 1. Of course even that is using ASCII characters that go back to a world of technology and history as well as info-systems praxis. KF
kairosfocus — December 13, 2017, 07:48 PM PDT
KF @31: Valid point. Interesting analogies for illustration. Thanks.
Dionisio — December 13, 2017, 03:54 PM PDT
KF @27: "Mind you, in parallel I have to help carry a major policy war that embraces local, regional and Commonwealth dimensions and actually has a few links to the Brexit question. So, I cannot commit to any timeline." I pray that you can focus on those difficult tasks ahead and can get them resolved properly and timely without stress.
Dionisio — December 13, 2017, 03:53 PM PDT
J-Mac, I am sure you are aware of the correspondence principle. As Quantum effects scale up, they converge to a classical limit, as is necessary to meet the fact that at that scale classical results tend to work quite well by and large -- of course, photo effect, UV catastrophe, freezing out of degrees of freedom tied to heat capacities etc and the like effects such as line and absorption spectra were clues pointing inwards to the quantum scale. As it turns out, the genetic code's functionality is well accounted for on classical information insofar as protein synthesis and regulatory circuits etc are concerned. Yes, the chemical reactions necessarily involve quantum processes, but that is as common as that the combustion reaction at the heart of a fire is a quantum process too. If mutational "hot spots" or whatever do reflect quantum effects, that does not change what has been addressed in the OP. Nothing you have said affects the basic understanding of what information is, indeed, qubits encoding information vectorially in Bloch spheres is another form of information expression due to configuration, and probabilities come up as expressions of uncertainties leading right to how information will reduce uncertainty; in this case linked to where we may have wave function collapse. The notion that you would wish to suggest, apparently to sweep the facts of alphabetically coded, thus linguistic, information in D/RNA -- what was presented in the OP -- off the table, is in fact clearly irrelevant to that well established fact. You have evidently fallen into a red herring distractive fallacy. KF
kairosfocus — December 13, 2017, 10:00 AM PDT
KF@ 13, At subatomic level everything is based on quantum mechanics; the quantum arrangement of 3 particles including the arrangement of nucleotides and codons in the genetic code...the classical information without subatomic arrangement of particles is dead... When you consider that processes like mitosis, hot-spots in mutations etc seem to be controlled by quantum processes like quantum coherence and quantum entanglement, classical info seems almost irrelevant or dead...
J-Mac — December 13, 2017, 06:09 AM PDT
F/N: Les from the walled up thread: >> 5, Les, December 13, 2017 at 7:49 am: Biological information definition – Francis Crick (1958) http://hudsonvalleyrnaclub.org/course/rna_lecture1_pata090209.pdf "By information I mean the specification of the amino acid sequence of the protein." "Information means here the precise determination of sequence, either of bases in the nucleic acid or of amino acid residues in the protein." >> KF
kairosfocus — December 13, 2017, 06:07 AM PDT
Folks, Part 3, Plato, too, has some'at to say to us in The Laws Bk X:
Athenian Stranger: [[The avant garde philosophers, teachers and artists c. 400 BC] say that the greatest and fairest things are the work of nature and of chance, the lesser of art [[ i.e. techne], which, receiving from nature the greater and primeval creations, moulds and fashions all those lesser works which are generally termed artificial . . . They say that fire and water, and earth and air [[i.e the classical "material" elements of the cosmos], all exist by nature and chance, and none of them by art, and that as to the bodies which come next in order-earth, and sun, and moon, and stars-they have been created by means of these absolutely inanimate existences. The elements are severally moved by chance and some inherent force according to certain affinities among them-of hot with cold, or of dry with moist, or of soft with hard, and according to all the other accidental admixtures of opposites which have been formed by necessity. After this fashion and in this manner the whole heaven has been created, and all that is in the heaven, as well as animals and all plants, and all the seasons come from these elements, not by the action of mind, as they say, or of any God, or from art, but as I was saying, by nature and chance only . . . . [[T]hese people would say that the Gods exist not by nature, but by art, and by the laws of states, which are different in different places, according to the agreement of those who make them; and that the honourable is one thing by nature and another thing by law, and that the principles of justice have no existence at all in nature, but that mankind are always disputing about them and altering them; and that the alterations which are made by art and by law have no basis in nature, but are of authority for the moment and at the time at which they are made.- [[Relativism, too, is not new; complete with its radical amorality rooted in a worldview that has no foundational IS that can ground OUGHT. (Cf. here for Locke's views and sources on a very different base for grounding liberty as opposed to license and resulting anarchistic "every man does what is right in his own eyes" chaos leading to tyranny.)] These, my friends, are the sayings of wise men, poets and prose writers, which find a way into the minds of youth. They are told by them that the highest right is might [[ Evolutionary materialism leads to the promotion of amorality], and in this way the young fall into impieties, under the idea that the Gods are not such as the law bids them imagine; and hence arise factions [[Evolutionary materialism-motivated amorality "naturally" leads to continual contentions and power struggles; cf. dramatisation here], these philosophers inviting them to lead a true life according to nature, that is, to live in real dominion over others [[such amoral factions, if they gain power, "naturally" tend towards ruthless tyranny; here, too, Plato hints at the career of Alcibiades], and not in legal subjection to them . . . . [[I]f impious discourses were not scattered, as I may say, throughout the world, there would have been no need for any vindication of the existence of the Gods-but seeing that they are spread far and wide, such arguments are needed; and who should come to the rescue of the greatest laws, when they are being undermined by bad men, but the legislator himself? . . . . Ath. 
Then, by Heaven, we have discovered the source of this vain opinion of all those physical investigators; and I would have you examine their arguments with the utmost care, for their impiety is a very serious matter; they not only make a bad and mistaken use of argument, but they lead away the minds of others: that is my opinion of them. Cle. You are right; but I should like to know how this happens. Ath. I fear that the argument may seem singular. Cle. Do not hesitate, Stranger; I see that you are afraid of such a discussion carrying you beyond the limits of legislation. But if there be no other way of showing our agreement in the belief that there are Gods, of whom the law is said now to approve, let us take this way, my good sir. Ath. Then I suppose that I must repeat the singular argument of those who manufacture the soul according to their own impious notions; they affirm that which is the first cause of the generation and destruction of all things, to be not first, but last, and that which is last to be first, and hence they have fallen into error about the true nature of the Gods. Cle. Still I do not understand you. Ath. Nearly all of them, my friends, seem to be ignorant of the nature and power of the soul [[ = psuche], especially in what relates to her origin: they do not know that she is among the first of things, and before all bodies, and is the chief author of their changes and transpositions. And if this is true, and if the soul is older than the body, must not the things which are of the soul's kindred be of necessity prior to those which appertain to the body? Cle. Certainly. Ath. Then thought and attention and mind and art and law will be prior to that which is hard and soft and heavy and light; and the great and primitive works and actions will be works of art; they will be the first, and after them will come nature and works of nature, which however is a wrong term for men to apply to them; these will follow, and will be under the government of art and mind. Cle. But why is the word "nature" wrong? Ath. Because those who use the term mean to say that nature is the first creative power; but if the soul turn out to be the primeval element, and not fire or air, then in the truest sense and beyond other things the soul may be said to exist by nature; and this would be true if you proved that the soul is older than the body, but not otherwise. [[ . . . .] Ath. . . . when one thing changes another, and that another, of such will there be any primary changing element? How can a thing which is moved by another ever be the beginning of change? Impossible. But when the self-moved changes other, and that again other, and thus thousands upon tens of thousands of bodies are set in motion, must not the beginning of all this motion be the change of the self-moving principle? . . . . self-motion being the origin of all motions, and the first which arises among things at rest as well as among things in motion, is the eldest and mightiest principle of change, and that which is changed by another and yet moves other is second. [[ . . . .] Ath. If we were to see this power existing in any earthy, watery, or fiery substance, simple or compound-how should we describe it? Cle. You mean to ask whether we should call such a self-moving power life? Ath. I do. Cle. Certainly we should. Ath. And when we see soul in anything, must we not do the same-must we not admit that this is life? [[ . . . . ] Cle. 
You mean to say that the essence which is defined as the self-moved is the same with that which has the name soul? Ath. Yes; and if this is true, do we still maintain that there is anything wanting in the proof that the soul is the first origin and moving power of all that is, or has become, or will be, and their contraries, when she has been clearly shown to be the source of change and motion in all things? Cle. Certainly not; the soul as being the source of motion, has been most satisfactorily shown to be the oldest of all things. Ath. And is not that motion which is produced in another, by reason of another, but never has any self-moving power at all, being in truth the change of an inanimate body, to be reckoned second, or by any lower number which you may prefer? Cle. Exactly. Ath. Then we are right, and speak the most perfect and absolute truth, when we say that the soul is prior to the body, and that the body is second and comes afterwards, and is born to obey the soul, which is the ruler? [[ . . . . ] Ath. If, my friend, we say that the whole path and movement of heaven, and of all that is therein, is by nature akin to the movement and revolution and calculation of mind, and proceeds by kindred laws, then, as is plain, we must say that the best soul takes care of the world and guides it along the good path. [[Plato here explicitly sets up an inference to design (by a good soul) from the intelligible order of the cosmos.]
KF
kairosfocus — December 12, 2017, 10:40 PM PDT
Belfast, no -- the one bit form. Amplifying: there has been a series of posts I have made recently (having sufficiently recovered to do so, never mind my last "triggering" with grieving was but a few minutes past due to impact of memories) on fundamental matters that seem necessary at this time. DV, in due course there will be more, including taking a look at quantum versions of information since these seem to trigger a sort of quantum rhetoric that needs to be answered. Not that such is actually a cogent response to the issues posed by say DNA and the complex, coherent functional organisation that pervades the world of life from the living cell up to the brains and CNS we have as computational substrates. Mind you, in parallel I have to help carry a major policy war that embraces local, regional and Commonwealth dimensions and actually has a few links to the Brexit question. So, I cannot commit to any timeline. Of course, at this time, CR may have a juster claim to be the grain of sand in the oyster that is UD. And Constructor Theory is also on the agenda for some time, DV. KF
kairosfocus — December 12, 2017, 10:36 PM PDT
Kf: rvb8 wrote "I have nothing to offer here, this post is well beyond me. However, I get the distinct feeling that that was its intention." I feel you should 'fess up. Did you write this post with rvb8 in mind? Binary answer if possible please.
Belfast — December 12, 2017, 10:05 PM PDT
RVB8, Part 2. First, a personal report. July last, on hearing of my dad having to go back to hospital the second time within a week, I got a flight booked and went straight from an airport in Kingston Ja to his hospital bedside in Mandeville. He held on by force of will until he could see me and resolve one final duty to my mom (his wife of eleven days short of sixty years as at the time of my visit . . . ) by signing a final legal document. That document was signed the next morning, after I had napped and come to watch across the night then went home to my parents' house with an aunt to freshen up and return to the hospital about 30 miles away. After that, within three hours, he was gone. At 1145 hrs local time, July 18th last, he turned to look at me, he turned to look at his caregiver -- my aunts had done overnight vigil and were then minutes out -- looked up to Someone we could not see, called Him "Lord" and surrendered his spirit to Him, then was gone in seconds thereafter. Fifteen minutes before, he preached his last, one-line sermon (he was a lay preacher and distinguished economist), at a time when every breath was precious and painful. Those who disobey the Word of God (thus designating the most precious heritage our Clan Lord was passing on . . . ) are in danger of Hell's fires. Not just the eschatological judgements, but more importantly in some respects, the tongue of falsity is set alight from Hell, sets the course of one's life afire, and so also sets fires of hellish chaos blazing through the world. This whole scene fits with the scripture, that the death of his saints is precious in the eyes of the Lord. I have moral certainty as to the fate of my dad: he went to be with his Lord, whom he loved and walked with for decades. The same Lord he led me to meet and manifested in the path of his life. I took the Thompson's Chain Reference Bible from his bedside (the one I knew from childhood) and brought it home with me. As I bought my own as a young College student, this one I handed over to my son -- fourth generation in succession. The Bible passed on from my Grandfather (bought on the occasion of the drowning death of his first son during a rescue of people from a spilled boat in the US where they were as war workers) I left in my parents' home. There are millions who will give similar testimony. For instance, my cousins in law here speak of a Father who was seeing the Celestial city at the time of his departure and who then was more than ready to go. But such will doubtless be dismissed as mass delusion. (Never mind the implications of claiming the human mind is that prone to delusion.) To give the deeper answer to your question, I turn to those same Scriptures, passed on to us by prophets, apostles, martyrs and confessors at appalling cost at the hands of those whose lives were ablaze with hellish fire:
55 AD, based on the official summary of the witness of the 500, c. 35 - 38 AD:
1 Cor 15:1 Now brothers and sisters, let me remind you [once again] of the good news [of salvation] which I preached to you, which you welcomed and accepted and on which you stand [by faith]. 2 By this faith you are saved [reborn from above—spiritually transformed, renewed, and set apart for His purpose], if you hold firmly to the word which I preached to you, unless you believed in vain [just superficially and without complete commitment]. 3 For I passed on to you as of first importance what I also received,
that Christ died for our sins according to [that which] the Scriptures [foretold], 4 and that He was buried, and that He was [bodily] raised on the third day according to [that which] the Scriptures [foretold], 5 and that He appeared to Cephas (Peter), then to the [a]Twelve. 6 After that He appeared to more than five hundred brothers and sisters at one time, the majority of whom are still alive, but some have fallen asleep [in death]. 7 Then He was seen by James, then by all the apostles, 8 and last of all, as to one [b]untimely (prematurely, traumatically) born, He appeared to me also . . .
11 So whether it was I or they, this is what we preach, and this is what you believed and trusted in and relied on with confidence. [AMP, pistis being a word that speaks of confident trust or soundly arrived at conviction and/or "faith" rooted in convincing evidence and its sound presentation, indeed it is the word for rhetorical proof.]
There, sir, is your answer, by the One who came for my dad, who is risen from the dead in exact fulfillment of the scriptures given almost a thousand years earlier (see esp. Isa 52 - 53) and who was witnessed by the 500, now backed by millions whose lives he has transformed around the whole world. Including that, apart from a miracle of guidance that answered my mom's prayer of surrender on that very day it was given, I would not be here today to type these words. You may mock or dismiss this answer, clinging to a self-falsifying ideology of atheistical Scientism (or its fellow travellers) but you cannot justly say that you have been given no answer. Here witnesseth, this 13th day of December, in the year of Our Lord 2017: GEM of TKI
kairosfocus — December 12, 2017, 10:02 PM PDT
We can all recognise information in everyday life, and distinguish different kinds of information and whether there is an increase or not. e.g. Hollandaise Sauce. We could look it up in a dictionary and get some information. We can read a recipe and get more information. We can taste it and get more information. We now know what it is, how to make it, and what it tastes like. We know that we have acquired and increased information; we can qualitatively say we have more information even if we can't quantitatively say how much. How would we quantify the information in the taste? Parameterise taste so that its structure and quantity can be characterised. Design a description language and then measure and report, doubtless some will be borrowed from Chemistry. The Scoville scale for peppers is a related case. KF
aarceng — December 12, 2017, 09:47 PM PDT
RVB8, in fact, the evolutionary materialistic scientism (and fellow travellers) that dominates big-S Science today cannot account for consciousness much less the responsible, rational freedom required to carry out the reasoning used in doing even science. That's part of why it is utterly and irretrievably self-referentially incoherent and thus self-falsifying. Here's J B S Haldane on the point, long since:
"It seems to me immensely unlikely that mind is a mere by-product of matter. For if my mental processes are determined wholly by the motions of atoms in my brain I have no reason to suppose that my beliefs are true. They may be sound chemically, but that does not make them sound logically. And hence I have no reason for supposing my brain to be composed of atoms. In order to escape from this necessity of sawing away the branch on which I am sitting, so to speak, I am compelled to believe that mind is not wholly conditioned by matter.” ["When I am dead," in Possible Worlds: And Other Essays [1927], Chatto and Windus: London, 1932, reprint, p.209. (NB: DI Fellow, Nancy Pearcey brings this right up to date (HT: ENV) in a current book, Finding Truth.)]
So, too, BA is quite right to say that science has no way of testing that you or any other human being has thoughts, as opposed to having a brain and CNS that form a computational substrate that carries out GIGO-driven, inherently mechanical computation through chains of coupled neurons. And BTW, science that rejects design as a credible possibility through imposing Lewontin-style a priori materialism dressed up in a lab coat has no coherent, cogent explanation for the functionally specific, complex organisation and associated -- yes -- information in that computational substrate either. So, there is no benchtop or field-observational science that accounts for mindedness required to do science. Thus also, the self and/or soul which expresses itself through our being self-moved, self-aware responsibly and rationally free agents. Of course, those who lock up knowledge to the sort of big-S Science we are discussing, reflect Scientism, which is a failed thesis that if it's not big-S Science, it's not knowledge or it's not credible, serious knowledge. This reflects the Lewontin declaration that hoi polloi must be brought to imagine that Science is "the only begetter of truth." Which is an epistemological claim and so refutes itself. Yet another manifestation of self-referential incoherence brought to us by evo mat scientism. Wrong department of knowledge, in short. The more reasonable answer starts with: dust to dust, ashes to ashes. Embodied information in the human body decays after death in accord with the laws of entropy. In fact, that is happening already in life, and once the telomeres are used up a clock is definitely running down. What of the soul and its expression in mindedness? Not a subject of scientific investigation, though foundational to there being the possibility of science. This does not mean there's no answer, just that the arbitrary datum line ruled by self-refuting but ideologically dominant evo mat scientism and its fellow travellers has no way to answer it. Locks out the only hopes of an answer in fact, and refuses to listen. As we see just above from you. So, start with answering the prior question, why are so many locking allowed knowledge up to evo mat scientism and/or its fellow travellers? KF
kairosfocus — December 12, 2017, 09:22 PM PDT
Rvb8, science has no way of "testing" your thoughts to begin with. I hope you will agree then that asserting "rvb8 has thoughts" is unacceptable.
Barry Arrington — December 12, 2017, 08:01 PM PDT
anthropic @20, and immediately the 'untestable' is touted. anthropic, science has no test for that. You may well be right, but we'll never know until death, as there is no way of knowing till then.
rvb8 — December 12, 2017, 08:00 PM PDT
rvb8 @19: Seems to me that information itself is supernatural, if by "natural" we mean fully explicable via unguided matter/energy.
anthropic — December 12, 2017, 06:45 PM PDT
Fine; and my second question? How is the information, (memories, genetic code, or other), stored, when the energy source, (metabolism) required to maintain the memory, ('information'), or genetic code, ('information') is stopped? Or, if you want; When you die, what happens to your personal, 'information'? NOTE: Any supernatural answers to this question are unacceptable, as science has no means of testing these answers. Darwin says it is passed on, with unintended natural alterations.
rvb8 — December 12, 2017, 06:06 PM PDT
KF @ 17. If you write a non-technical post, they will whine that UD has abandoned science. If you write a technical post, they will whine that UD is obfuscating.
“But to what shall I compare this generation? It is like children sitting in the market places, who call out to the other children, and say, ‘We played the flute for you, and you did not dance; we sang a dirge, and you did not mourn.’
Barry Arrington — December 12, 2017, 04:21 PM PDT
RVB8, FYI this is a fairly technical subject -- some relevant science and linked issues that are too often overlooked. It is also at the heart of the design issue. If you cannot handle at least this level, you are not in a position to seriously discuss the issues at stake. KF
kairosfocus — December 12, 2017, 02:34 PM PDT
I have nothing to offer here, this post is well beyond me. However, I get the distinct feeling that that was its intention. The 'information', offered here about information, could perhaps be made more readily accessible to the less well educated, such as myself? However, knowing Kairos, and his penchant for the indecipherable, perhaps a better communicator, for a description of 'information', is required? Also, one question; How is this 'information' stored in a human being, once the energy source, (metabolism), is stopped?
rvb8 — December 12, 2017, 02:19 PM PDT
GBD, yes, terms have a wide variety of uses that must be watched. KF PS: I was thinking about how we can make sense of a familiar language even against noise and drop-outs.
kairosfocus — December 12, 2017, 01:47 PM PDT
KF @ 11: "An ideal case. But in the real, noisy world redundancy has its use. Even, in English. KF" Yes, thanks. In fact, redundancy is essential to reach Shannon capacity. Once message redundancy is removed, forward error correction is added to the message. This redundancy is optimal and essential to determine which message was sent. The newest low density parity check codes require very little of this redundancy because of an algorithm ('turbo' processing) that looks at the analog value(s) of each received signal. This allows us to get very close to the theoretical Shannon limit. I get confused and even marvel at how the words 'information' and 'entropy' have been co-opted to so many different concepts... like the word 'evolution', we must carefully define the sense in which we are using these words or we end up talking past each other.
GBDixon — December 12, 2017, 12:36 PM PDT
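(To illustrate GBDixon's point that deliberately re-added, structured redundancy is what lets a receiver decide which message was sent: a toy (3,1) repetition code, a minimal Python sketch of my own; real LDPC codes are far more sophisticated.)

```python
def encode(bits):
    """(3,1) repetition code: deliberately adds structured redundancy."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote per 3-bit block: corrects any single flip per block."""
    return [1 if sum(coded[i:i+3]) >= 2 else 0 for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[4] ^= 1                  # channel noise flips one bit
print(decode(tx) == msg)    # True: the redundancy recovered the message
```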
J-Mac: The genetic code functions in effect classically: AGCT. Similarly, we can generally reduce functional organisation relevant to the design inference in terms of chains of Y/N q's specifying a description . . . try autocad. Playing around with superposition of states and resolution does not remove the fundamental importance of distinct identity in reasoning and communication. KF PS: You may find the weak argument correctives discussion of quantum mech vs quantum rhetoric useful: https://uncommondescent.com/faq/#LNC
kairosfocus — December 12, 2017, 09:42 AM PDT
KF, There is quite distinct difference of properties of classical and quantum information... Classical information is carried by systems with a definite state, and it can be replicated and measured without being altered... Quantum information is encoded as a property of quantum systems i.e. superposition and entanglement with no classical counterpart... As per no-cloning theorem, quantum information cannot be cloned but it can be altered as a result of an observation or measurement... Unlike classical information, quantum information can't be created or destroyed though there is still some doubt about both...
J-Mac — December 12, 2017, 09:33 AM PDT
GBD, yes Shannon knew much about information, signals, processing and transmission. His metric is about info carrying capacity. I spoke to the random case because of a specific common objection. Yes, an ideal code will squeeze out redundancy and reach the same maximum. An ideal case. But in the real, noisy world redundancy has its use. Even, in English. KF
kairosfocus — December 12, 2017, 09:24 AM PDT
Hi KF, Thank you for the interesting post. Once again, I feel obligated to stop lurking and come to Shannon's defense, who I feel understood information very well, and certainly more than those who often misinterpret him. Shannon's model, forms of which are given in your post, was simply a list of valid messages agreed upon together by the sender and receiver. Because the messages may contain redundancies (Shannon famously used the English language, where you can guess the next letter in a plain English text better than 50% of the time), there is an encoding process that ideally removes all redundancy. Shannon's famous entropy equation was simply a measure of how well the redundancy was removed. When you have maximum entropy you are done encoding, and each bit sent has maximum 'meaning' in the sense that the messages sent have maximum pithiness, or are the shortest length possible. As your article states, other fields see similarity with this measure and adopt the entropy term for their purposes, and this has confused the simple and elegant model Shannon used. As mentioned in your article the receiver's job is to look at the message received, which may be corrupted by noise, and determine what was sent. After this best guess the pithy message is decoded back into the original message, possibly plain English text. Do not confuse a message that has maximum entropy with random noise. It is just the opposite, even though to a receiver who does not have the decoder ring it cannot be distinguished from noise. This also shows clearly that for information to be such, a receiver to interpret it is necessary. DNA would be useless garbage without the cell machines that receive and interpret it.
GBDixon — December 12, 2017, 08:42 AM PDT
Here's an interesting report on a most sophisticated information system that we still don't comprehend well enough [to put it nicely]: https://consumer.healthday.com/disabilities-information-11/amputation-news-720/boy-s-double-hand-transplant-changed-his-brain-729122.html Control systems engineers and computer scientists would be in awe at the sight of such a fascinating system, thinking they are just dreaming. But someone out there would assure them that all that is the product of RV+NS+...+T+...?
Dionisio — December 12, 2017, 07:44 AM PDT