Information, of course, is notoriously a concept with many senses. As it is central to the design inference, let us look (again) at defining it.
We can dispose of one sense right off: Shannon was not directly interested in information but in information-carrying capacity; that is why his metric peaks for a truly random signal, which has minimal redundancy. We can also see that the bit measure commonly seen in ICT circles or in our PC memories etc. is actually this measure: 1 kbit is 1,024 = 2^10 binary digits of storage or transmission capacity, one binary digit or bit being a unit of information storing one choice between a pair of alternatives such as yes/no, true/false, on/off, high/low, N-pole/S-pole, etc. But obviously, the meaningful substance that is stored or communicated, or that may be implicit in a coherent, organised, functional entity, is the sense of information that is most often relevant.
That is, as F. R. Connor put it in his telecommunication series of short textbooks,
“Information is not what is actually in a message but what could constitute a message. The word could implies a statistical definition in that it involves some selection of the various possible messages. The important quantity is not the actual information content of the message but rather its possible information content.” [Signals, Edward Arnold, 1972, p. 79.]
So, we come to a version of the Shannon Communication system model:

Elaborating slightly by expanding the encoder-decoder framework (and following the general layered approach of the ISO OSI model):
In this model, information-bearing messages flow from a source to a sink by being: (1) encoded, (2) transmitted through a channel as a signal, (3) received, and (4) decoded. At each corresponding pair of stages (source/sink encoding/decoding, transmitting/receiving) there is in effect a mutually agreed standard, a so-called protocol. [For instance, HTTP — hypertext transfer protocol — is a major protocol for the Internet. This is why many web page addresses begin: “http://www . . .”]
However, as the diagram hints, at each stage noise affects the process, so that under certain conditions detecting and distinguishing the signal from the noise becomes a challenge. Indeed, since noise is due to randomly fluctuating values of various physical quantities [due in turn to the random behaviour of particles at molecular levels], detecting a message and accepting it as a legitimate message rather than noise that got lucky is a question of inference to design. In short, inescapably, the design inference issue is foundational to communication science and information theory.
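As an illustrative aside, here is a rough sketch in Python of the flow just described (the 7-bit ASCII packing and the simple bit-flip noise model are assumptions chosen for the example, not part of Shannon's own presentation):

```python
import random

def encode(message: str) -> list[int]:
    """Source/encoder: pack text into a bit stream (7-bit ASCII is the agreed 'protocol' here)."""
    return [int(b) for ch in message for b in format(ord(ch), "07b")]

def channel(bits: list[int], flip_prob: float = 0.01) -> list[int]:
    """Transmit/receive: each bit may be flipped by random noise."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits: list[int]) -> str:
    """Decoder/sink: regroup 7-bit blocks back into characters."""
    chars = []
    for i in range(0, len(bits) - 6, 7):
        chars.append(chr(int("".join(str(b) for b in bits[i:i + 7]), 2)))
    return "".join(chars)

sent = "HELLO"
received = decode(channel(encode(sent)))
print(sent, "->", received)  # with noise present, the two may occasionally differ
```

The point of the toy model is simply that sender and receiver must share the same packing convention, and that noise forces the receiver to judge whether what arrived is a legitimate message.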
Going beyond this, we can refer to the context of information technology, communication systems and computers, which provides a vital clarifying side-light, from another angle, on how complex, specified information functions in information processing systems, and so also on what information is as contrasted with data and knowledge:
[In the context of computers, etc.] information is data — i.e. digital representations of raw events, facts, numbers and letters, values of variables, etc. — that have been put together in ways suitable for storing in special data structures [strings of characters, lists, tables, “trees” etc], and for processing and output in ways that are useful [i.e. functional]. . . . Information is distinguished from [a] data: raw events, signals, states etc represented digitally, and [b] knowledge: information that has been so verified that we can reasonably be warranted in believing it to be true. [GEM/TKI, UWI FD12A Sci Med and Tech in Society Tutorial Note 7a, Nov 2005.]
Going to Principia Cybernetica Web as archived, we find three related discussions:
INFORMATION
1) that which reduces uncertainty. (Claude Shannon); 2) that which changes us. (Gregory Bateson)
Literally that which forms within, but more adequately: the equivalent of or the capacity of something to perform organizational work, the difference between two forms of organization or between two states of uncertainty before and after a message has been received, but also the degree to which one variable of a system depends on or is constrained by (see constraint) another. E.g., the dna carries genetic information inasmuch as it organizes or controls the orderly growth of a living organism. A message carries information inasmuch as it conveys something not already known. The answer to a question carries information to the extent it reduces the questioner’s uncertainty. A telephone line carries information only when the signals sent correlate with those received. Since information is linked to certain changes, differences or dependencies, it is desirable to refer to them and distinguish between information stored, information carried, information transmitted, information required, etc. Pure and unqualified information is an unwarranted abstraction. information theory measures the quantities of all of these kinds of information in terms of bits. The larger the uncertainty removed by a message, the stronger the correlation between the input and output of a communication channel, the more detailed particular instructions are the more information is transmitted. (Krippendorff)
Information is the meaning of the representation of a fact (or of a message) for the receiver. (Hornung)
The point is, that information itself is not an obvious material artifact, though it is often embedded in such. It is abstract but nonetheless very real, very effective, very functional. So much so that it lies at the heart of cell-based life and of the wave of technology currently further transforming our world.
It is in the light of the above concerns, issues and concepts that, a few days ago, I added the following rough working definition to a current UD post. On doing so, I thought: this is worth headlining in its own right. And so, let me now cite:
>>to facilitate discussion [as a good general definition that does not “bake in” information as a near-synonym of knowledge is hard to find . . . ] we may roughly identify information as
1: facets of reality [–> a way to speak of an abstract entity] that may capture and so frame — give meaningful FORM to
2: representations of elements of reality — such representations being items of data — that
3: by accumulation of such structured items . . .
4: [NB: which accumulation is in principle quantifiable, e.g. by defining a description language that chains successive y/n questions to specify a unique/particular description or statement, thence I = – log p in the Shannon case, etc],
5: meaningful complex messages may then be created, modulated, encoded, decoded, conveyed, stored, processed and otherwise made amenable to use by a system, entity or agent interacting with the wider world. E.g. consider the ASCII code:
The ASCII code takes seven y/n questions per character to specify text in English, yielding 7 bits per character of FSCO/I for such text
or, the genetic code (notice, the structural patterns set by the first two letters):
The genetic code uses three-letter codons to specify the sequence of AAs in proteins, as well as start/stop signals, at six bits per AA
or, mRNA . . . notice, U not T:
or, a cybernetic entity using informational signals to guide its actions:
. . . or, a von Neumann kinematic self-replicator:
(Of course, such representations may be more or less accurate or inaccurate, or even deceitful. Thus, knowledge requires information but has to address warrant as credibly truthful. Wisdom goes beyond knowledge to imply sound insight into fundamental aspects of reality that guide sound, prudent, ethical action.)>>
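As an illustrative aside on point 4 and the ASCII and codon figures above, here is a rough Python sketch (assuming, for simplicity, equiprobable symbols, which is where a straight log2 count of alternatives applies directly):

```python
import math

# Number of chained yes/no questions needed to single out one symbol
# from N equiprobable alternatives: log2(N) bits.
def bits_per_symbol(n_alternatives: int) -> float:
    return math.log2(n_alternatives)

print(bits_per_symbol(128))       # 7.0   -> one 7-bit ASCII character
print(bits_per_symbol(4 ** 3))    # 6.0   -> one 3-letter codon over a 4-letter alphabet
print(20 * bits_per_symbol(128))  # 140.0 -> storage capacity, in bits, of a 20-character ASCII string
```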
Food for thought. END
PS: It would be remiss of me not to tie in the informational thermodynamics link, with a spot of help from Wikipedia on that topic. Let me clip from my longstanding briefing note:
>>To quantify the above definition of what is perhaps best descriptively termed information-carrying capacity, but has long been simply termed information (in the “Shannon sense” – never mind his disclaimers . . .), let us consider a source that emits symbols from a vocabulary: s1, s2, s3, . . . sn, with probabilities p1, p2, p3, . . . pn. That is, in a “typical” long string of symbols, of size M [say this web page], the average number that are some sj, J, will be such that the ratio J/M –> pj, and in the limit attains equality. We term pj the a priori — before the fact — probability of symbol sj. Then, when a receiver detects sj, the question arises as to whether this was what was sent. [That is, the mixing in of noise means that received messages are prone to misidentification.] If, on average, sj will be detected correctly a fraction dj of the time, the a posteriori — after the fact — probability of sj is, by a similar calculation, dj. So, we now define the information content of symbol sj as, in effect, how much it surprises us on average when it shows up in our receiver:
I = log [dj/pj], in bits [if the log is base 2, log2] . . . Eqn 1
This immediately means that the question of receiving information arises AFTER an apparent symbol sj has been detected and decoded. That is, the issue of information inherently implies an inference to having received an intentional signal in the face of the possibility that noise could be present. Second, logs are used in the definition of I, as they give an additive property: for, the amount of information in independent signals, si + sj, using the above definition, is such that:
I total = Ii + Ij . . . Eqn 2
For example, assume that dj for the moment is 1, i.e. we have a noiseless channel so what is transmitted is just what is received. Then, the information in sj is:
I = log [1/pj] = – log pj . . . Eqn 3
This case illustrates the additive property as well, assuming that symbols si and sj are independent. That means that the probability of receiving both messages is the product of the probability of the individual messages (pi *pj); so:
Itot = log1/(pi *pj) = [-log pi] + [-log pj] = Ii + Ij . . . Eqn 4
So if there are two symbols, say 1 and 0, and each has probability 0.5, then for each, I is – log [1/2], on a base of 2, which is 1 bit. (If the symbols were not equiprobable, the less probable binary digit-state would convey more than, and the more probable, less than, one bit of information. Moving over to English text, we can easily see that E is as a rule far more probable than X, and that Q is most often followed by U. So, X conveys more information than E, and U conveys very little, though it is useful as redundancy, which gives us a chance to catch errors and fix them: if we see “wueen” it is most likely to have been “queen.”)
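[NB, an illustrative aside: a rough Python sketch of Eqn 3 in action; the frequencies used for E and X are only ballpark figures for typical English text:]

```python
import math

def surprisal_bits(p: float) -> float:
    """Information conveyed by a symbol of a priori probability p (noiseless case, Eqn 3)."""
    return -math.log2(p)

print(surprisal_bits(0.5))     # 1.0 bit    -- an equiprobable binary digit
print(surprisal_bits(0.127))   # ~2.98 bits -- 'E', roughly 12.7% of English letters
print(surprisal_bits(0.0015))  # ~9.38 bits -- 'X', roughly 0.15% of English letters
```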
Further to this, we may average the information per symbol in the communication system as follows (giving it in terms of -H to make the additive relationships clearer):
– H = p1 log p1 + p2 log p2 + . . . + pn log pn
or, H = – SUM [pi log pi] . . . Eqn 5
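[NB, a further illustrative aside: a minimal Python sketch of Eqn 5; the distributions used are made up for the example:]

```python
import math

def entropy_bits_per_symbol(probs: list[float]) -> float:
    """H = -SUM pi * log2(pi), the average information per symbol (Eqn 5)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits_per_symbol([0.5, 0.5]))  # 1.0 bit/symbol   -- fair binary source
print(entropy_bits_per_symbol([0.9, 0.1]))  # ~0.47 bits/symbol -- biased binary source
print(entropy_bits_per_symbol([0.25] * 4))  # 2.0 bits/symbol  -- four equiprobable symbols
```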
H, the average information per symbol transmitted [usually measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: “it is often referred to as the entropy of the source.” [p. 81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann’s constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing.
But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson’s Statistical Thermophysics (Prentice-Hall International, 1993) — excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.)
For, as he astutely observes on pp. vii – viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more detail (pp. 3 – 6, 7, 36, cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Semura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . .
[deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati’s discussion of debates and the issue of open systems here . . . ]
H({pi}) = – C [SUM over i] pi*ln pi, [. . . “my” Eqn 6]
[where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp – beta*yi) = Z [Z being in effect the partition function across microstates, the “Holy Grail” of statistical thermodynamics]. . . .
[H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . .
Jaynes’ [summary rebuttal to a typical objection] is “. . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly ‘objective’ quantity . . . it is a function of [those variables] and does not depend on anybody’s personality. There is no reason why it cannot be measured in the laboratory.” . . . . [pp. 3 – 6, 7, 36; replacing Robertson’s use of S for Informational Entropy with the more standard H.]
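[NB, an illustrative aside: a toy numerical sketch in Python of the correspondence Robertson describes, i.e. a Boltzmann distribution over a handful of energy levels and its informational entropy; the levels and temperature are arbitrary values chosen for the example:]

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probs(energies_J: list[float], T: float) -> list[float]:
    """pi = exp(-beta * ei) / Z, with beta = 1/(k_B * T) and Z the partition function."""
    beta = 1.0 / (k_B * T)
    weights = [math.exp(-beta * e) for e in energies_J]
    Z = sum(weights)
    return [w / Z for w in weights]

def info_entropy_nats(probs: list[float]) -> float:
    """H = -SUM pi * ln(pi); setting the constant C = k_B gives the thermodynamic form."""
    return -sum(p * math.log(p) for p in probs if p > 0)

levels = [0.0, 1.0e-21, 2.0e-21]    # arbitrary toy energy levels, in joules
p = boltzmann_probs(levels, T=300.0)
print(p)                            # occupation probabilities of the three levels
print(info_entropy_nats(p))         # dimensionless informational entropy H
print(k_B * info_entropy_nats(p))   # k_B * H: the corresponding thermodynamic-style entropy, J/K
```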
As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life’s Origin [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then — again following Brillouin — identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously “plausible” primordial “soups.” In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale.
By many orders of magnitude, we don’t get to even one molecule each of the required polymers per planet, much less bring them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics, so let us now return to that focus; in particular addressing information in its functional sense, as the third step in this preliminary analysis.>>
Further food for thought.
What is information?
We cannot have a fundamental theory of information based on Shannon because…
A – information in Shannon’s theory requires a specific physical task that is not possible in quantum systems.
B – Shannon never got around to defining what it means for something to be distinguishable.
So, not only does Shannon’s theory not scale, but it has a problem of circularity.
My take,
information is a unified concept broken down and spread out in time and space. These lines consist of distinct sentences, words and characters, but what unites them all is my unified thought concerning information. The reason the bits are information, and not noise, is precisely that my thought unites them.
Somehow, I can express a unified thought by breaking it down into bits sequenced in time and space, so it can exist in the material world.
Next, the reader needs to reunify (translate) the distinct bits back into unified thought.
This cycle of: “unified concept —> spread out in matter —> reunify —> unified concept” is irreducibly complex.
CR, did you actually read the OP? Did you not observe that I specifically set Shannon to one side? You are tilting at yet another strawman. Where, if you object to Shannon’s system model, I simply note that it is for cause a well-accepted conceptual framework readily abstracted from comms systems . . . which for decades have been designed around it, up to and including our layercake models used with, say, the Internet. The logic of digital t/comm forces such a frame, and modulated analogue t/comms forces a very similar one. Of course, the process logic represented is also a dead-on fit to functions in the living cell, including in protein synthesis. Logic of structure, functional operations and quantity in action.

And besides, a reasonably accurate description of what info is, is not a theory of info; it is a clarification of a concept that ordinary dictionaries don’t handle well.

As for distinguishability, that is the principle of distinct identity, A vs ~A. Where, we may see that the world may be observed as W = {A|~A}. From this the triple first principles of right reason — LOI, LNC, LEM — immediately arise as corollaries, as was illustrated in a diagram in my last OP, the OP this one builds on. Likewise, two-ness and the naturals follow, thence all the way to the surreals; note the stepwise process diagrammed there. From this various operations and structures are possible, yielding the study of the logic of structure and quantity, aka Mathematics.

But then, you seem to have a quarrel with first, foundational principles of reasoning. Little errors at the beginning have momentous consequences in the end. KF
Origenes, care to elaborate? KF
F/N: Let me cite a 55 AD source, that brings out distinct identity and its role in meaning, communication, intelligibility of messages, reasoning, learning and becoming educated thus edified. This stuff is not new:
Yes, Paul was there 2,000 years ago. We would all benefit from listening to what he said so long ago now. KF
Another great post. Thank you, KF. Long live UD!
KF,
as usual, a very insightful post on a highly important fundamental concept that unfortunately seems so misunderstood these days.
Well done! Thanks.
here’s an interesting report on a most sophisticated information system that we still don’t comprehend well enough [to put it nicely]:
https://consumer.healthday.com/disabilities-information-11/amputation-news-720/boy-s-double-hand-transplant-changed-his-brain-729122.html
control systems engineers and computer scientists would be in awe at the sight of such a fascinating system, thinking they are just dreaming.
but someone out there would assure them that all that is the product of RV+NS+…+T+…?
Hi KF,
Thank you for the interesting post. Once again, I feel obligated to stop lurking and come to the defense of Shannon, who I feel understood information very well, and certainly better than those who often misinterpret him.
Shannon’s model, forms of which are given in your post, was simply a list of valid messages agreed upon by both the sender and receiver. Because the messages may contain redundancies (Shannon famously used the English language, where you can guess the next letter in a plain English text better than 50% of the time), there is an encoding process that ideally removes all redundancy.
Shannon’s famous entropy equation was simply a measure of how well the redundancy was removed. When you have maximum entropy you are done encoding, and each bit sent has maximum ‘meaning’ in the sense that the messages sent have maximum pithiness, or are the shortest length possible. As your article states, other fields see similarity with this measure and adopt the entropy term for their purposes, and this has confused the simple and elegant model Shannon used.
As mentioned in your article the receiver’s job is to look at the message received, which may be corrupted by noise, and determine what was sent. After this best guess the pithy message is decoded back into the original message, possibly plain English text.
Do not confuse a message that has maximum entropy with random noise. It is just the opposite, even though to a receiver who does not have the decoder ring it cannot be distinguished from noise.
This also shows clearly that for information to be information, a receiver able to interpret it is necessary. DNA would be useless garbage without the cell machinery that receives and interprets it.
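A quick way to see the point about maximum entropy versus noise at the keyboard (a rough Python sketch; the sample text and settings are arbitrary):

```python
import collections
import math
import zlib

def bits_per_byte(data: bytes) -> float:
    """Empirical entropy of the byte-value distribution, in bits per byte."""
    counts = collections.Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = ("the quick brown fox jumps over the lazy dog " * 200).encode("ascii")
packed = zlib.compress(text, 9)         # squeeze out redundancy

print(bits_per_byte(text))              # low: plain English is highly redundant
print(bits_per_byte(packed))            # noticeably higher: the packed bytes look far more noise-like
print(zlib.decompress(packed) == text)  # True: yet fully recoverable, given the shared "decoder ring"
```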
GBD, yes Shannon knew much about information, signals, processing and transmission. His metric is about info carrying capacity. I spoke to the random case because of a specific common objection. Yes, an ideal code will squeeze out redundancy and reach the same maximum. An ideal case. But in the real, noisy world redundancy has its use. Even, in English. KF
KF,
There is a quite distinct difference between the properties of classical and quantum information…
Classical information is carried by systems with a definite state, and it can be replicated and measured without being altered…
Quantum information is encoded as a property of quantum systems i.e. superposition and entanglement with no classical counterpart…
As per the no-cloning theorem, quantum information cannot be cloned, but it can be altered as a result of an observation or measurement…
Unlike classical information, quantum information can’t be created or destroyed though there is still some doubt about both…
J-Mac: The genetic code functions in effect classically: AGCT. Similarly, functional organisation relevant to the design inference can generally be reduced to chains of Y/N questions specifying a description . . . try AutoCAD. Playing around with superposition of states and resolution does not remove the fundamental importance of distinct identity in reasoning and communication. KF
PS: You may find the weak argument correctives discussion of quantum mech vs quantum rhetoric, useful: https://uncommondescent.com/faq/#LNC
KF @ 11:
An ideal case. But in the real, noisy world redundancy has its use. Even, in English. KF
Yes, thanks. In fact, redundancy is essential to reach Shannon capacity. Once message redundancy is removed, forward error correction is added to the message. This redundancy is optimal and essential to determine which message was sent.
The newest low-density parity-check (LDPC) codes require very little of this redundancy because of an iterative (‘turbo’-style) algorithm that looks at the analog value(s) of each received signal. This allows us to get very close to the theoretical Shannon limit.
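For a toy illustration of deliberately added redundancy doing useful work (this is only a trivial repetition code, nothing like the LDPC/turbo machinery just described):

```python
import random

def fec_encode(bits: list[int]) -> list[int]:
    """Deliberately added redundancy: a rate-1/3 repetition code (send each bit three times)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits: list[int], flip_prob: float = 0.05) -> list[int]:
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def fec_decode(bits: list[int]) -> list[int]:
    """Majority vote over each received triple corrects most single-bit errors."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(1000)]
recovered = fec_decode(noisy_channel(fec_encode(message)))
residual_errors = sum(a != b for a, b in zip(message, recovered))
print(residual_errors)  # typically a handful, versus the ~50 errors expected with no coding at all
```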
I get confused and even marvel at how the words ‘information’ and ‘entropy’ have been co-opted for so many different concepts. Like the word ‘evolution’, we must carefully define the sense in which we are using these words or we end up talking past each other.
GBD, yes, terms have a wide variety of uses that must be watched. KF
PS: I was thinking about how we can make sense of a familiar language even against noise and drop-outs.
I have nothing to offer here, this post is well beyond me.
However, I get the distinct feeling that that was its intention.
The ‘information’, offered here about information, could perhaps be made more readily accessible to the less well educated, such as myself?
However, knowing Kairos, and his penchant for the indecipherable, perhaps a better communicator, for a description of ‘information’, is required?
Also, one question: how is this ‘information’ stored in a human being, once the energy source (metabolism) is stopped?
RVB8, FYI this is a fairly technical subject — some relevant science and linked issues that are too often overlooked. It is also at the heart of the design issue. If you cannot handle at least this level, you are not in a position to seriously discuss the issues at stake. KF
KF @ 17.
If you write a non-technical post, they will whine that UD has abandoned science. If you write a technical post, they will whine that UD is obfuscating.
Fine;
and my second question?
How is the information (memories, genetic code, or other) stored, when the energy source (metabolism) required to maintain the memory (‘information’) or genetic code (‘information’) is stopped?
Or, if you want; When you die, what happens to your personal, ‘information’?
NOTE: Any supernatural answers to this question are unacceptable, as science has no means of testing these answers.
Darwin says it is passed on, with unintended natural alterations.
rvb8 19
Seems to me that information itself is supernatural, if by “natural” we mean fully explicable via unguided matter/energy.
anthropic @20,
and immediately the ‘untestable’ is touted.
anthropic, science has no test for that.
You may well be right, but we’ll never know until death, as there is no way of knowing till then.
Rvb8, science has no way of “testing” your thoughts to begin with. I hope you will agree then that asserting “rvb8 has thoughts” is unacceptable.
RVB8,
in fact, the evolutionary materialistic scientism (and fellow travellers) that dominates big-S Science today cannot account for consciousness much less the responsible, rational freedom required to carry out the reasoning used in doing even science.
That’s part of why it is utterly and irretrievably self-referentially incoherent and thus self-falsifying.
Here’s J B S Haldane on the point, long since:
So, too, BA is quite right to say that science has no way of testing that you or any other human being has thoughts, as opposed to having a brain and CNS that form a computational substrate that carries out GIGO-driven, inherently mechanical computation through chains of coupled neurons.
And BTW, science that rejects design as a credible possibility through imposing Lewontin-style a priori materialism dressed up in a lab coat has no coherent, cogent explanation for the functionally specific, complex organisation and associated — yes — information in that computational substrate either.
So, there is no benchtop or field-observational science that accounts for mindedness required to do science. Thus also, the self and/or soul which expresses itself through our being self-moved, self-aware responsibly and rationally free agents.
Of course, those who lock up knowledge to the sort of big-S Science we are discussing, reflect Scientism, which is a failed thesis that if it’s not big-S Science, it’s not knowledge or it’s not credible, serious knowledge. This reflects the Lewontin declaration that hoi polloi must be brought to imagine that Science is “the only begetter of truth.” Which is an epistemological claim and so refutes itself. Yet another manifestation of self-referential incoherence brought to us by evo mat scientism.
Wrong department of knowledge, in short.
The more reasonable answer starts with: dust to dust, ashes to ashes. Embodied information in the human body decays after death in accord with the laws of entropy. In fact, that is happening already in life, and once the telomeres are used up a clock is definitely running down.
What of the soul and its expression in mindedness?
Not a subject of scientific investigation, though foundational to there being the possibility of science.
This does not mean there’s no answer, just that the arbitrary datum line ruled by self-refuting but ideologically dominant evo mat scientism and its fellow travellers has no way to answer it. Locks out the only hopes of an answer in fact, and refuses to listen.
As we see just above from you.
So, start with answering the prior question: why do so many lock up what is allowed as knowledge to evo mat scientism and/or its fellow travellers?
KF
We can all recognise information in everyday life, and distinguish different kinds of information and whether there is an increase or not. e.g. Hollandaise Sauce.
We could look it up in a dictionary and get some information. We can read a recipe and get more information. We can taste it and get more information. We now know what it is, how to make it, and what it tastes like. We know that we have acquired and increased information; we can qualitatively say we have more information even if we can’t quantitatively say how much.
How would we quantify the information in the taste?
Parameterise taste so that its structure and quantity can be characterised. Design a description language and then measure and report; doubtless some of it will be borrowed from Chemistry. The Scoville scale for peppers is a related case. KF
RVB8,
Part 2.
First, a personal report.
July last, on hearing of my dad having to go back to hospital the second time within a week, I got a flight booked and went straight from an airport in Kingston Ja to his hospital bedside in Mandeville.
He held on by force of will until he could see me and resolve one final duty to my mom (his wife of eleven days short of sixty years as at the time of my visit . . . ) by signing a final legal document. That document was signed the next morning, after I had napped and come to watch across the night then went home to my parents’ house with an aunt to freshen up and return to the hospital about 30 miles away.
After that, within three hours, he was gone.
At 1145 hrs local time, July 18th last, he turned to look at me, he turned to look at his caregiver — my aunts had done overnight vigil and were then minutes out — looked up to Someone we could not see, called Him “Lord” and surrendered his spirit to Him, then was gone in seconds thereafter. Fifteen minutes before, he preached his last, one-line sermon (he was a lay preacher and distinguished economist), at a time when every breath was precious and painful.
Those who disobey the Word of God (thus designating the most precious heritage our Clan Lord was passing on . . . ) are in danger of Hell’s fires. Not just the eschatological judgements, but more importantly in some respects, the tongue of falsity is set alight from Hell, sets the course of one’s life afire, and so also sets fires of hellish chaos blazing through the world.
This whole scene fits with the scripture, that the death of his saints is precious in the eyes of the Lord.
I have moral certainty as to the fate of my dad: he went to be with his Lord, whom he loved and walked with for decades. The same Lord he led me to meet and manifested in the path of his life.
I took the Thompson’s Chain Reference Bible from his bedside (the one I knew from childhood) and brought it home with me. As I bought my own as a young College student, this one I handed over to my son — fourth generation in succession. The Bible passed on from my Grandfather (bought on the occasion of the drowning death of his first son during a rescue of people from a spilled boat in the US where they were as war workers) I left in my parents’ home.
There are millions who will give similar testimony.
For instance, my cousins in law here speak of a Father who was seeing the Celestial city at the time of his departure and who then was more than ready to go.
But such will doubtless be dismissed as mass delusion. (Never mind the implications of claiming the human mind is that prone to delusion.)
To give the deeper answer to your question, I turn to those same Scriptures, passed on to us by prophets, apostles, martyrs and confessors at appalling cost at the hands of those whose lives were ablaze with hellish fire:
There, sir, is your answer, by the One who came for my dad, who is risen from the dead in exact fulfillment of the scriptures given almost a thousand years earlier (see esp. Isa 52 – 53) and who was witnessed by the 500, now backed by millions whose lives he has transformed around the whole world.
Including that, apart from a miracle of guidance that answered my mom’s prayer of surrender on that very day it was given, I would not be here today to type these words.
You may mock or dismiss this answer, clinging to a self-falsifying ideology of atheistical Scientism (or its fellow travellers) but you cannot justly say that you have been given no answer.
Here witnesseth, this 13th day of December, in the year of Our Lord 2017:
GEM of TKI
Kf:
rvb8 wrote: “I have nothing to offer here, this post is well beyond me.
However, I get the distinct feeling that that was its intention.”
I feel you should ‘fess up. Did you write this post with rvb8 in mind?
Binary answer if possible please.
Belfast, no — the one bit form. Amplifying: there has been a series of posts I have made recently (having sufficiently recovered to do so, never mind my last “triggering” with grieving was but a few minutes past due to impact of memories) on fundamental matters that seem necessary at this time. DV, in due course there will be more, including taking a look at quantum versions of information since these seem to trigger a sort of quantum rhetoric that needs to be answered. Not that such is actually a cogent response to the issues posed by say DNA and the complex, coherent functional organisation that pervades the world of life from the living cell up to the brains and CNS we have as computational substrates. Mind you, in parallel I have to help carry a major policy war that embraces local, regional and Commonwealth dimensions and actually has a few links to the Brexit question. So, I cannot commit to any timeline. Of course, at this time, CR may have a juster claim to be the grain of sand in the oyster that is UD. And Constructor Theory is also on the agenda for some time, DV. KF
Folks,
Part 3, Plato, too, has some’at to say to us in The Laws Bk X:
KF
F/N: Les from the walled up thread:
>>
Les, December 13, 2017 at 7:49 am:
Biological information definition – Francis Crick (1958)
http://hudsonvalleyrnaclub.org.....090209.pdf
“By information I mean the specification of the amino acid sequence of the protein.”
“Information means here the precise determination of sequence, either of bases in the nucleic acid or of amino acid residues in the protein.”
>>
KF
KF@ 13,
At the subatomic level everything is based on quantum mechanics; the quantum arrangement of 3 particles, including the arrangement of nucleotides and codons in the genetic code… classical information without the subatomic arrangement of particles is dead…
When you consider that processes like mitosis, hot-spots in mutations etc seem to be controlled by quantum processes like quantum coherence and quantum entanglement, classical info seems almost irrelevant or dead…
J-Mac, I am sure you are aware of the correspondence principle. As quantum effects scale up, they converge to a classical limit, as is necessary to meet the fact that at that scale classical results tend to work quite well by and large — of course, the photoelectric effect, the UV catastrophe, the freezing out of degrees of freedom tied to heat capacities, and the like effects such as line and absorption spectra were clues pointing inwards to the quantum scale.

As it turns out, the genetic code’s functionality is well accounted for on classical information insofar as protein synthesis and regulatory circuits etc are concerned. Yes, the chemical reactions necessarily involve quantum processes, but that is as common as the fact that the combustion reaction at the heart of a fire is a quantum process too. If mutational “hot spots” or whatever do reflect quantum effects, that does not change what has been addressed in the OP.

Nothing you have said affects the basic understanding of what information is. Indeed, qubits encoding information vectorially in Bloch spheres are another form of information expression due to configuration, and probabilities come up as expressions of uncertainties, leading right to how information will reduce uncertainty; in this case linked to where we may have wave function collapse. The notions you apparently wish to use to sweep the fact of alphabetic, coded, thus linguistic information in D/RNA — what was presented in the OP — off the table are in fact clearly irrelevant to that well established fact. You have evidently fallen into a red herring distractive fallacy. KF
KF @27:
“Mind you, in parallel I have to help carry a major policy war that embraces local, regional and Commonwealth dimensions and actually has a few links to the Brexit question. So, I cannot commit to any timeline.”
I pray that you can focus on those difficult tasks ahead and can get them resolved properly and timely without stress.
KF @31:
Valid point. Interesting analogies for illustration. Thanks.
F/N: Belfast asked for a y/n answer, above. This shows how a 1-bit result exists in a context of a language, as in writing I had to use the word no, two ASCII characters, forming a word in a language — itself a huge entity that is information rich. In turn the concept, no, is part of a world of experience, which we implicitly bring to the situation and which sets context for the significance of the answer above and beyond the context of a question in a life-situation. So, that bit is actually a switch, accessing much that is not visible in giving 0 vs 1. Of course even that is using ASCII characters that go back to a world of technology and history as well as info-systems praxis. KF
Kairos,
very sorry for your loss.
I remember keenly the loss of my own father. Although I did not have the consolation of Christ to fall back upon, I found family and friends sufficient.
All the best,
Rob.
RVB8, Appreciated. I am told that across time the pain of loss fades and the positive memories provide a balm. I find that those last few minutes are a bitter-sweet mix of the two . . . and for nearly 3 months, they ran like a movie in my mind, imprinted like a burned-in image in the retina. That, I did not expect. KF
KF@ 31
I’m aware of correspondence principle.
I’m also hopeful that you are aware of the correspondence limit…
Problems like that are just indicators of the incompatibility of theories like general relativity and quantum mechanics that can’t be consolidated…
QM has never been proven wrong, and one of these two will have to be scrapped… I’m putting my money on QM…
JM, I never denied that there is a conflict between elements of QM and RT, though at other points they are blended. After 100 years, it is not resolved. KF