Which we covered here, in “Applying information theory to the origin of life?”, where Adami said,
“The information-theoretic musings I have presented here should convince even the skeptics that, within an environment that produces monomers at relative ratios not too far from those found in a self-replicator, the probabilities can move very much in favour of spontaneous emergence of life.”
Dembski replies,
The probabilities can move very much in favour of spontaneous emergence of life provided you introduce a search that makes the probabilities high (as by Adami’s “simplifying assumptions”).
Yeah.
Sort of like Martha Stewart does everything better than me, except I never get to see her army of frazzled assistants. For all I know, maybe she doesn’t either.
By the way, here’s more on Dembski’s book, Being as Communion. I’ve read it; it’s a game-changer. Order now if you can stand the hassle of shipping from Britain. – O’Leary for News
See The Science Fictions series at your fingertips (origin of life) for a rundown on why no naturalistic origin-of-life theory works.
I wonder if he has adolescent or grown-up children who kid him that they still believe in Father Christmas – because they know he’s happier believing it.
OT: God and Science – Part 1 | Dr. Hugh Ross (Sept. 7, 2014)
https://www.youtube.com/watch?v=LcGX_6AiHKI&list=UUauB3xW5bf0STGotAxLj4vA
the God of the universe is actually the God of the scientific world as well.
Interesting article by Adami. I was hooked shortly into the article, where the author says, “…there is no remnant of the original set of molecules that began their fight against the second law of thermodynamics.” So here is yet another admission by a non-ID, non-YEC writer that life is a fight against the Second Law. Haven’t we had this conversation before? Wasn’t it Rob Sheldon or someone who explained that with ideal gases we understand entropy, but with non-gas systems we barely have the ability to describe the associated entropy mathematically?
I get that the formulas for thermodynamic entropy and information-theoretic entropy have the same form. But it makes no sense to me to equate thermodynamic entropy (measured in something like joules per kelvin per mole) with bits of information except under extremely limited circumstances.
So is this researcher off in the weeds here in speaking about thermodynamic entropy in the same breath as search probabilities involving linear molecule chains?
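For concreteness, the two standard textbook forms I have in mind (my own transcription, not anything quoted from Adami’s article) are:

```latex
% Gibbs (thermodynamic) entropy, in J/K:
S \;=\; -\,k_B \sum_i p_i \ln p_i
% Shannon (information) entropy, in bits:
H \;=\; -\sum_i p_i \log_2 p_i
% For the same distribution {p_i} the two differ only by a constant factor:
% S = (k_B \ln 2)\, H
```

Formally they differ only by the factor k_B ln 2, which is where the joules-per-kelvin-per-bit conversion hides; my question is whether that formal parallel licenses the physical identification.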
Reducing the origin of life to the origin of information does not actually reduce the problem. It does, however, allow materialists to do a lot of hand-waving and smuggling in of information, which is not obvious to those of us who are mathematically naive.
Just as Avida “proved” that evolution could happen in silico, so these guys will soon “prove” that the origin of life can happen in silico. I am thankful that we have Professor Dembski and other maths whizzes who can call them out.
What kind of idiotic babble is that? If the environment produces monomers in ratios similar to those found in a self-replicator, that moves the odds in favour of the origin of life? So if the environment produces amino acids (monomers) in the right ratios, you get a living cell? No. No you don’t. You get a bunch of amino acids is what you get.
EDTA:
I suggest you acquaint yourself with the informational school of thermodynamics, e.g. this admission against ideological interest at Wikipedia:
As Appendix A of my always linked briefing note underscores (click on my handle on this or any other comment I make here at UD), merely opening up a system is not enough to overcome the linked search-space challenge. Indeed, it is easy to show that Darwin’s warm little pond was energy-importing, and that an energy-importing system naturally INCREASES entropy. What generally creates shaft work and its product, organisation or at least order, is a mechanism that couples to the inflowing energy, converts some of it to that work [= ordered, forced motion], and exhausts waste heat.
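To put that in bare textbook terms (a minimal sketch, not a full derivation): raw heat import by itself raises a system’s entropy, and only a coupled conversion mechanism, paying its waste-heat tax, can turn part of the inflow into shaft work:

```latex
% Raw heat import at boundary temperature T: the system's entropy rises.
dS \;\ge\; \frac{\delta Q_{\mathrm{in}}}{T} \;>\; 0
% A coupling mechanism (a heat engine) working between a hot source T_H and a
% cold sink T_C can deliver shaft work W only within the Carnot bound, while
% the total entropy of system plus surroundings still does not decrease:
W \;\le\; Q_H\!\left(1 - \frac{T_C}{T_H}\right), \qquad
\Delta S_{\mathrm{total}} \;=\; \frac{Q_C}{T_C} \;-\; \frac{Q_H}{T_H} \;\ge\; 0
```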
Sometimes such heat engines or energy conversion devices appear spontaneously; a hurricane is one example.
But in every case where the device in question produces FSCO/I-rich organisation and the like, and where we observe the source directly, the cause is an intelligent one.
This has to do with the implied beyond-astronomical atomic resources search challenge discussed here, for instance.
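As a rough back-of-envelope illustration of the scale of that challenge (round-number estimates only, set out as a toy calculation of my own), compare the number of elementary “trials” the observable cosmos could plausibly have run with the size of a mere 500-bit configuration space:

```python
# Back-of-envelope comparison: available "trials" in the observable universe
# versus the size of a 500-bit configuration space (round-number estimates).
atoms          = 1e80   # rough count of atoms in the observable universe
ops_per_second = 1e14   # generous rate: ~one chemical-scale event per atom per 10 fs
seconds        = 4e17   # approximate age of the universe in seconds

total_trials = atoms * ops_per_second * seconds   # ~4e111 trials
space_500bit = 2.0 ** 500                         # ~3.3e150 configurations

print(f"trials available : {total_trials:.1e}")
print(f"500-bit space    : {space_500bit:.1e}")
print(f"fraction sampled : {total_trials / space_500bit:.1e}")   # ~1e-39
```

Even on such generous assumptions, the fraction of a 500-bit space that could be sampled is on the order of one part in 10^39, i.e. negligibly different from zero.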
After all the handwaving and dismissive talk (often dressed up in a lab coat) is done, there is in fact no empirically anchored, reasonable explanation of how a gated, encapsulated, metabolising automaton with a code-using, algorithmic, self-maintaining and self-replicating facility could and did arise. Instead we too often see an attempted redefinition of science that is ideologically loaded with evolutionary materialism as a censoring a priori.
If you doubt me, here is the US NSTA (National Science Teachers Assoc’n) Board in 2000, claiming to speak in the name of science and science education:
(And there is a lot more where that came from.)
Philip Johnson’s retort to Lewontin is still apt in reply to all such ideological impositions on scientific discussion of origins:
KF
KF, thanks for the link to your page. I’ll try to get through that, but there’s a lot there! I have, however, read published papers that claim to point out allegedly irreconcilable differences between Shannon and thermodynamic entropy. See an example at this link.
And feel free to set me straight here, since I’m not a physicist, but one question has always bugged me. If thermodynamic entropy is indeed a measure of ignorance about the molecular state of some system, then there should be some absolute number of bits which would suffice to describe the system completely, i.e., some amount of information that could pinpoint the particles to within, say, 100 nm regions. How do we find that number of bits? Things like Boltzmann’s constant don’t contain terms that would cancel with volume, moles, etc.
I glanced at your page, KF, and see that you approach this matter. I will try to get through it sometime. Thanks.
EDTA: If you want it full-bore, try Ch. 1 of Robertson’s Statistical Thermophysics, which I excerpt and summarise in my note, but beware: you need to recognise what a Partition Function is, etc.; this is not a simple subject. The bottom line is that there is a difference between what entropy is measuring (and degrees of molecular freedom lead to the same import) and how one transfers over into energy units linked to conventional scales. As a first look, notice that Boltzmann’s expression, S = k_B log W, is looking at W, the number of ways mass and energy can be arranged at micro levels compatible with macro conditions, i.e. already a metric of the degree of uncertainty. Take a log and, up to a multiplication by -1, you have an info metric [the underlying inference is equiprobability]; k_B converts to energy-linked terms. The Gibbs expression is already in terms of adjusted probabilities and is directly comparable. Yes, many still dispute this, but it is not hard to see that there is a serious point. Indeed, probabilities other than 0 and 1 are measures of degree of ignorance, or, on the dual, of information. KF
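A quick numerical illustration of that k_B conversion (a toy calculation of my own, using the standard molar entropy of liquid water, about 70 J/(K*mol), as a stand-in figure):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(S_joules_per_kelvin: float) -> float:
    """Convert a thermodynamic entropy (J/K) into an equivalent bit count,
    using S = k_B ln W  =>  log2 W = S / (k_B ln 2)."""
    return S_joules_per_kelvin / (k_B * math.log(2))

# Standard molar entropy of liquid water is roughly 70 J/(K*mol); read as
# "missing microstate information" about one mole, that figure corresponds to:
print(f"{entropy_to_bits(70.0):.2e} bits")   # ~7.3e24 bits per mole
```

That is, on the order of 10^25 bits of missing microstate information per mole, which is the sense in which the J/K scale and the bit scale are interconvertible.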
PS: Clip from sect A my note:
>> Summarising Harry Robertson’s Statistical Thermophysics (Prentice-Hall International, 1993) — excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.022*10^23 particles, i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.)
For, as he astutely observes on pp. vii – viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more detail (pp. 3 – 6, 7, 36; cf. Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent arXiv papers by Duncan and Semura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . .
[deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati’s discussion of debates and the issue of open systems here . . . ]
H({p_i}) = - C [SUM over i] p_i ln p_i, [. . . “my” Eqn 6]
[where [SUM over i] p_i = 1, and we can also define parameters alpha and beta such that: (1) p_i = exp[-(alpha + beta*y_i)]; (2) exp[alpha] = [SUM over i] exp(-beta*y_i) = Z [Z being in effect the partition function across microstates, the “Holy Grail” of statistical thermodynamics]. . . .
[H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . .
Jaynes’ [summary rebuttal to a typical objection] is “. . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly ‘objective’ quantity . . . it is a function of [those variables] and does not depend on anybody’s personality. There is no reason why it cannot be measured in the laboratory.” . . . . [pp. 3 – 6, 7, 36; replacing Robertson’s use of S for Informational Entropy with the more standard H.] >>
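For clarity, the key relations in that excerpt, transcribed into standard notation (my transcription, not Robertson’s own typography):

```latex
H(\{p_i\}) \;=\; -\,C \sum_i p_i \ln p_i, \qquad \sum_i p_i \;=\; 1,
\qquad p_i \;=\; e^{-(\alpha + \beta y_i)}, \qquad
e^{\alpha} \;=\; \sum_i e^{-\beta y_i} \;=\; Z
% With C = k_B, y_i an energy level and beta = 1/(k_B T), the information
% entropy H coincides with the thermodynamic entropy S.
```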
EDTA,
Statistical Thermodynamics With Applications to the Life Sciences
It introduces statistical thermodynamics (statistical mechanics) and partition functions (kf @8), and then shows how entropy can be given meaning as a special case of Shannon’s Measure of Information (SMI).
I think the answer to this comes from finding the distribution that maximizes the entropy, as sketched below.
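Roughly, as I understand the procedure (sketched here in standard notation, not quoted from the book): maximize the Shannon measure subject to normalization and a fixed average energy, and the Boltzmann distribution with its partition function falls out:

```latex
\max_{\{p_i\}} \;\left(-\sum_i p_i \ln p_i\right)
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i\,\epsilon_i = \langle E \rangle
% Solving with Lagrange multipliers yields
p_i \;=\; \frac{e^{-\beta \epsilon_i}}{Z}, \qquad
Z \;=\; \sum_i e^{-\beta \epsilon_i}
```

The multiplier beta is then identified with 1/(k_B T) by matching the result to ordinary thermodynamics.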
From the above book: