
Piotr (and KS, DNA_Jock, VS, Z et al) and “compensation” arguments vs the energy audit police . . .


under_arrest

It seems to be time to call in the energy audit police.

Let us explain, in light of an ongoing sharp exchange on “compensating” arguments in the illusion of organising energy thread. This morning Piotr, an objector (and, BTW, this is one case where expertise base is relevant: he is a linguist), at 288 dismissed Niwrad:

Stop using the term “2nd law” for something that is your private misconception. You’ve got it all backwards . . .

This demands correction, as Niwrad has done little more than appropriately point out that functionally specific complex organisation and associated information cannot cogently be explained away by appeals to irrelevant energy flows elsewhere. Organisation is not properly explained by spontaneous energy flows and hoped-for statistical miracles.

Not in a world where something like random Brownian motion (HT: Wiki) is a common, classic case of the spontaneous effects of energy and particle interactions at the micro level:

Brownian_motion_large

In effect, pollen grains in the fluid on a microscope slide act as giant molecules jostled through their interaction with the invisible molecules of the fluid in which they are embedded. Einstein's analysis of this motion, one of the famous 1905 Annalen der Physik papers, provided key empirical evidence for the atomic-molecular theory.

In 289, I responded to Piotr and drew the attention of KS, DNA_Jock and others. I think the response should be headlined, and of course the facilities of an OP will be taken advantage of to augment it:

__________________

>> Perhaps it has not dawned on you how saying “private misconception” dismissively, in front of someone who long since studied thermodynamics in light of the microstate underpinnings of macrostates rooted in the work of Gibbs and Boltzmann, comes across.

Ill informed, ill advised posturing.

FYI, it is the observational facts, reasoning and underlying first plausibles that decide a scientific issue, not opinions and a united ideological front.

In direct terms, FYFI, appeals to irrelevant energy flows and entropy changes of the sort commonly trotted out in “compensation” arguments are a fallacy.

FYYFI, 2LOT (which has multiple formulations, as it was arrived at from several directions . . . Clausius being most important in my view) is rooted in the statistics of systems based on large numbers of particles (typically we can start at say 10^12, and run to 10^19 – 10^26 atoms or molecules etc. for analyses that spring to mind), and in effect sums up that for isolated systems the spontaneous trend is towards the clusters of microstates that statistically dominate the possibilities under given macro-conditions.

[Clipping from ID Foundations no 2, on Counterflow, open systems, FSCO/I and self-moved agents in action:]

A heat engine partially converts heat into work

On peeking within conceptually (somewhat oddly, as strict isolation means no energy or mass movement cross-border, so we technically cannot look in from outside . . . we are effectively taking a God's-eye view . . . ), Clausius set up two subsystems A and B of differing temperature and pondered a heat flow d'q from the hotter to the colder. Taking ratios, since Ta > Tb, the entropy lost by A (d'q/Ta) is smaller than the entropy gained by B (d'q/Tb), so net entropy rises when we do the sums.

A direct implication is that raw energy importation tends to increase entropy. The micro view indicates this is because the number of ways micro level mass and energy can be arranged consistent with gross macro state has risen.

Thus, the point that I clipped in 169: importation of raw energy into a system leads to a trend of increased entropy. Where, as G N Lewis and others have highlighted, a useful reading of entropy is that it indicates the average missing information to specify the particular microstate consistent with a macroscopic, lab-level gross state. [Let me clip from Wikipedia, speaking against known interest, c. April 2011, on information entropy and its links to thermodynamics:]

At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann’s constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing.

But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . .   in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
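[A minimal numerical sketch in Python may help make the Lewis/Gibbs reading above concrete: for W equally probable microstates consistent with a given macrostate, the Boltzmann entropy is S = k ln W while the missing information is log2 W bits; note how tiny S is on the lab J/K scale even for a large number of bits:]

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def entropy_and_missing_info(W):
        """For W equally probable microstates consistent with a macrostate,
        return (thermodynamic entropy S = k*ln(W), missing information in bits)."""
        return k_B * math.log(W), math.log2(W)

    # Toy macrostate with W = 2^1000 accessible microstates (cf. the 1,000-coin example below)
    S, bits = entropy_and_missing_info(2**1000)
    print(S, bits)  # ~9.57e-21 J/K and 1000.0 bits: huge in bits, negligible in J/K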

Now, too, work can be understood as forced, ordered motion at macro or micro levels, generally measured via the dot product F*dx.

[Courtesy HyperPhysics:]

hyperphys_wrk_defn

Energy conversion devices such as heat engines couple energy inflows to structures that generate such forced, ordered work, commonly shaft work that turns a shaft and the loads connected to it. To operate, they exhaust degraded energy, often as waste heat to a heat sink.
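[For concreteness, a small sketch (illustrative round figures only, not from any particular engine): work as the dot product F*dx, and the energy bookkeeping of a heat engine that converts part of its heat inflow to shaft work while exhausting the balance as waste heat:]

    def work(force, displacement):
        """Mechanical work W = F . dx for a constant force over a straight-line displacement."""
        return sum(f * d for f, d in zip(force, displacement))

    print(work((10.0, 0.0, 0.0), (2.0, 0.0, 0.0)))  # 20.0 J: force along the motion
    print(work((0.0, 10.0, 0.0), (2.0, 0.0, 0.0)))  # 0.0 J: force perpendicular to the motion

    # Heat engine bookkeeping with assumed round numbers:
    Q_in = 1000.0   # J of heat drawn from the hot source (assumed figure)
    eta = 0.35      # assumed conversion efficiency
    W_shaft = eta * Q_in
    Q_waste = Q_in - W_shaft
    print(W_shaft, Q_waste)  # 350.0 J of ordered shaft work, 650.0 J exhausted to the heat sink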

[A backhoe is a useful example showing controlled, programmed, forced ordered motion: it derives energy from fuel through a heat engine and effects intelligently directed configuration, thus constructive work and FSCO/I-rich entities, exhausting waste heat etc. to the atmosphere at ambient temperature. The backhoe is of course itself in turn a capital example of FSCO/I and of its known source:]

backhoe

[An illustration of a von Neumann Kinematic self replicator [vNSR] will help bridge this to the world of self replicating automata, which includes the living cell:]

vNSR

The pivotal issue comes up here: the relevant energy conversion devices (especially in cell-based life, such as those driving ATP synthesis (ATP synthase), photosynthesis, or onward protein synthesis) are FSCO/I rich, composed of many interacting parts in specific arrangements that work together.

[ATP Synthase illustration:]

atpsynthase

[An outline on photosynthesis via Wiki and Somepics, will also help underscore the point:]

photosynthfrwk_wiki _somepics

[Finally, let us observe the protein synthesis process:]

Protein Synthesis (HT: Wiki Media)

[ . . . and the wider metabolic framework of the cell:]

cell_metabolism

At OOL, there are suggestions that such machinery spontaneously came about through diffusion, chemical kinetics, etc.

But the same statistics that underpin 2LOT, and have been integral to it for over 100 years, highlight that this amounts to expecting randomising forces or phenomena such as diffusion to carry out complex, specific patterns of constructive work. The relevant statistics and their upshot are massively against such. The non-functional, clumped-at-random possibilities vastly outnumber the functionally specific ones, and the scattered possibilities vastly outnumber the clumped ones.

[Notionally, we can look at the needle in haystack blind search challenge based on:]

csi_defn

[Where the search challenge can be represented:]

sol_coin_flipr

Hence the thought exercise I clipped at 242 above.

[U/D, Mar 18, 2015: I think a clip from 123 below in this thread, for the record, will help draw out the significance of the direct statistical underpinnings of the second law of thermodynamics [2LOT]. For it will help us to see how, though the relevant quantities are many orders of magnitude smaller than the energy flows and entropy numbers of typical heat-transfer exercises, the micro-level arrangements linked to FSCO/I at something like the level of the living cell are so deeply isolated in the field of molecular contingencies or configurations that the blind-search resources of the solar system or observed cosmos will be deeply challenged to find them under pre-biotic conditions, such as a Darwinian warm salty pond struck by lightning or the like typical scenario:

123: >> . . .  the root [of the second law of thermodynamics] turned out to be [revealed by] an exploration of the statistical behaviour of systems of particles where various distributions of mass and energy at micro level are possible; consistent with given macro-conditions such as pressure, temperature, etc. that define the macrostate of the system. The second law [then] turns out to be a consequence of the strong tendency of systems to drift towards statistically dominant clusters of microstates.

As a simple example, used by Yavorsky and Pinsky in their Physics (MIR, Moscow, USSR, 1974, Vol I, pp. 279 ff.), of approximately A-Level or first-year college standard and discussed in my always linked note as being particularly clear, which effectively models a diffusion situation, we may consider

. . . a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various different possible arrangements at random, and spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random, towards the more thermodynamically probable macrostates, i.e the ones with higher statistical weights. So “[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state.” [p. 284.] This is in effect the statistical form of the 2nd law of thermodynamics. Thus, too, the behaviour of the Clausius isolated system . . .  is readily understood: importing d’Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B, that the resulting rise in B’s entropy swamps the fall in A’s entropy. Moreover, given that FSCI-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe. (Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e W], we instead move to entropy, through

S = k ln W.

Part of how this is done can be seen by imagining a system in which there are W ways accessible, and imagining a partition into parts 1 and 2.

W = W1*W2,

as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa, but it is far more convenient to have an additive measure, i.e we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant and is in effect the universal gas constant, R, on a per molecule basis, i.e we divide R by the Avogadro Number, NA, to get: k = R/NA.

The two approaches to entropy, by Clausius, and Boltzmann, of course, correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate, that the classical observation that entropy naturally tends to increase, is readily apparent.)

[ . . . . ]

A closely parallel first example, by L K Nash, ponders the likely outcome of 1,000 coins tossed at random, per the binomial distribution. This turns out to be a sharply peaked bell curve centred on 50-50 H-T, as has been discussed. The dominant cluster will be just this, with the coins in no particular sequence.

But if instead we were to see all H, or all T, or alternating H and T, or the first 143 characters of this comment in ASCII code, we can be assured that among the 1.07*10^301 possibilities such highly specific, “simply describable” sets of outcomes are utterly, maximally unlikely to come about by blind chance or mechanical necessity, but are readily explained on design. That sort of pattern is a case of complex specified information, and in the case of the ASCII code, of functionally specific complex organisation and associated information, FSCO/I; particularly, digitally coded functionally specific information, dFSCI.

This example draws out the basis of the design inference on FSCO/I. For the observed cosmos of 10^80 atoms or so, each having a tray of 1,000 coins flipped and observed every 10^-14 s, will in a reasonable lifespan to date of 10^17 s look at 10^111 possibilities. That is an upper limit to the number of chemical-reaction-speed, atomic-scale events in the observed cosmos to date: a large number, but one utterly dwarfed by the 10^301 or so possibilities. Reducing the former to the size of a hypothetical straw, the cubical haystack it would be pulled from would reduce our observed cosmos to a small blob by comparison.

That is, any reasonably isolated and special, definable cluster of possible configs, will be maximally unlikely to be found by such a blind search. Far too much haystack, too few needles, no effective scale of search appreciably different from no search.

On the scope of events we can observe, then, we can only reasonably expect to see cases from the overwhelming bulk.

This, with further development, is the core statistical underpinning of 2LOT.

And, as Prof Sewell pointed out, the statistical challenge does not go away when you open up a system to generic, non-functionally-specific mass or energy inflows etc. For opened-up systems of appreciable size . . . and a system whose state can be specified by 1,000 bits of info is small indeed (yes, a coin is a 1-bit info-storing register) . . . the statistically miraculous will still be beyond plausibility unless something in particular is happening that makes it much more plausible. Something like organised, forced motion that sets up special configs. >>]
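[For the record, the figures in the clip are easy to check with a few lines of Python, using only the quantities already stated above: the two-row ball model, k = R/N_A, the 2^1000 coin configurations, and 10^80 atoms observing at 10^14 flips/s for 10^17 s:]

    from math import comb, log10

    # Statistical weights of the two-row ball model (10 white + 10 black, rows of 10):
    for k in (10, 6, 5):
        print(f"{k}-{10 - k} split: W = {comb(10, k) * comb(10, 10 - k)}")
    # 10-0 split: W = 1;  6-4 split: W = 44100;  5-5 split: W = 63504 (dominant)

    # Boltzmann's constant as the gas constant per molecule, k = R / N_A:
    R, N_A = 8.314462618, 6.02214076e23
    print(R / N_A)  # ~1.3807e-23 J/K

    # 1,000-coin configuration space vs. blind-search resources:
    print(log10(2**1000))             # ~301, i.e. ~1.07*10^301 configurations
    print(log10(1e80 * 1e14 * 1e17))  # 111, an upper bound on atomic-scale observations to date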

Consequently, while something like a molecular nanotech lab perhaps some generations beyond Venter et al could plausibly — or at least conceivably, as a thought exercise that extrapolates what routinely goes on in genetic engineering — use atomic and molecular manipulation to construct carefully programmed molecules and manipulate them to form a metabolising, encapsulated, gated automaton with an integral code-using von Neumann kinematic self-replicator, that would be very different from hoping for the statistical-miracle chain of typical suggested OOL scenarios. These typically hope for much the same to spontaneously arise from something like a hoped-for self-replicating molecule that somehow evolves into such an automaton.

For instance, this is how Wikipedia summed up such a few years back:

There is no truly “standard model” of the origin of life. Most currently accepted models draw at least some elements from the framework laid out by the Oparin-Haldane hypothesis. Under that umbrella, however, are a wide array of disparate discoveries and conjectures such as the following, listed in a rough order of postulated emergence:

  1. Some theorists suggest that the atmosphere of the early Earth may have been chemically reducing in nature [[NB: a fairly controversial claim, as others argue that the geological evidence points to an oxidising or neutral composition, which is much less friendly to Miller-Urey-type syntheses], composed primarily of methane (CH4), ammonia (NH3), water (H2O), hydrogen sulfide (H2S), carbon dioxide (CO2) or carbon monoxide (CO), and phosphate (PO4^3-), with molecular oxygen (O2) and ozone (O3) either rare or absent.
  2. In such a reducing atmosphere [[notice the critical dependence on a debatable assumption], electrical activity can catalyze the creation of certain basic small molecules (monomers) of life, such as amino acids. [[Mostly, the very simplest ones, which had to be rapidly trapped out lest they be destroyed by the same process that created them] This was demonstrated in the Miller–Urey experiment by Stanley L. Miller and Harold C. Urey in 1953.
  3. Phospholipids (of an appropriate length) can spontaneously form lipid bilayers, a basic component of the cell membrane.
  4. A fundamental question is about the nature of the first self-replicating molecule. Since replication is accomplished in modern cells through the cooperative action of proteins and nucleic acids, the major schools of thought about how the process originated can be broadly classified as “proteins first” and “nucleic acids first”.
  5. The principal thrust of the “nucleic acids first” argument is as follows:
    1. The polymerization of nucleotides into random RNA molecules might have resulted in self-replicating ribozymes (RNA world hypothesis) [[Does not account for the highly specific nature of observed self-replicating chains, nor the problem of hydrolysis by which ever-present water molecules could relatively easily break chains]
    2. Selection pressures for catalytic efficiency and diversity might have resulted in ribozymes which catalyse peptidyl transfer (hence formation of small proteins), since oligopeptides complex with RNA to form better catalysts. The first ribosome might have been created by such a process, resulting in more prevalent protein synthesis.
    3. Synthesized proteins might then outcompete ribozymes in catalytic ability, and therefore become the dominant biopolymer, relegating nucleic acids to their modern use, predominantly as a carrier of genomic information. [[Does not account for the origin of codes, the information in the codes, the algorithms to put it to use, or the co-ordinated machines to physically execute the algorithms.]

As of 2010, no one has yet synthesized a “protocell” using basic components which would have the necessary properties of life (the so-called “bottom-up-approach”). Without such a proof-of-principle, explanations have tended to be short on specifics. [[Acc.: Aug 5, 2010, coloured emphases and parentheses added.]

Such requires information rich macromolecules to spontaneously arise from molecular noise and blind chemical kinetics of equilibrium (which, as Thaxton et al showed in the 1980’s . . . cf. here and here, are extremely adverse), as well as codes — language — and complex algorithms we are only now beginning to understand.

There is no significant observed evidence that such cumulative constructive work is feasible by chance on the gamut of our solar system or the observed cosmos, across the commonly discussed conventional timeline of some 10^17 s since the big bang event, in light of the very strong statistical tendencies and expected outcomes just outlined.

No wonder, then, that some years ago, leading OOL researchers Robert Shapiro and Leslie Orgel had the following exchange of mutual ruin on metabolism- first vs genes-first/RNA world OOL speculative models:

[Shapiro:] RNA’s building blocks, nucleotides, contain a sugar, a phosphate and one of four nitrogen-containing bases as sub-subunits. Thus, each RNA nucleotide contains 9 or 10 carbon atoms, numerous nitrogen and oxygen atoms and the phosphate group, all connected in a precise three-dimensional pattern . . . . [[S]ome writers have presumed that all of life’s building blocks could be formed with ease in Miller-type experiments and were present in meteorites and other extraterrestrial bodies. This is not the case. A careful examination of the results of the analysis of several meteorites led the scientists who conducted the work to a different conclusion: inanimate nature has a bias toward the formation of molecules made of fewer rather than greater numbers of carbon atoms, and thus shows no partiality in favor of creating the building blocks of our kind of life . . . . To rescue the RNA-first concept from this otherwise lethal defect, its advocates have created a discipline called prebiotic synthesis. They have attempted to show that RNA and its components can be prepared in their laboratories in a sequence of carefully controlled reactions, normally carried out in water at temperatures observed on Earth . . . . Unfortunately, neither chemists nor laboratories were present on the early Earth to produce RNA . . .

[Orgel:] If complex cycles analogous to metabolic cycles could have operated on the primitive Earth, before the appearance of enzymes or other informational polymers, many of the obstacles to the construction of a plausible scenario for the origin of life would disappear . . . .It must be recognized that assessment of the feasibility of any particular proposed prebiotic cycle must depend on arguments about chemical plausibility, rather than on a decision about logical possibility . . . few would believe that any assembly of minerals on the primitive Earth is likely to have promoted these syntheses in significant yield . . . .  Why should one believe that an ensemble of minerals that are capable of catalyzing each of the many steps of [[for instance] the reverse citric acid cycle was present anywhere on the primitive Earth [[8], or that the cycle mysteriously organized itself topographically on a metal sulfide surface [[6]? . . .  Theories of the origin of life based on metabolic cycles cannot be justified by the inadequacy of competing theories: they must stand on their own . . . .  The prebiotic syntheses that have been investigated experimentally almost always lead to the formation of complex mixtures. Proposed polymer replication schemes are unlikely to succeed except with reasonably pure input monomers. No solution of the origin-of-life problem will be possible until the gap between the two kinds of chemistry is closed. Simplification of product mixtures through the self-organization of organic reaction sequences, whether cyclic or not, would help enormously, as would the discovery of very simple replicating polymers. However, solutions offered by supporters of geneticist or metabolist scenarios that are dependent on “if pigs could fly” hypothetical chemistry are unlikely to help.  [Emphases added.] ]

xenon_ibm

The only empirically, observationally warranted adequate cause of such FSCO/I at macro or micro levels (recall that classic pic of atoms arranged to spell IBM?) is intelligently directed configuration. Which of course will use energy converting devices to carry out constructive work in a technology cascade. It takes a lot of background work to carry out the work in hand just now, as a rule.

Such is not a violation of 2LOT, as e.g. Szilard's analysis of Maxwell's Demon shows. There is a relevant heat or energy flow and degradation process that compensates for the reduction in freedom of possibilities implied in constructing an FSCO/I-rich entity.
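[For reference, here is a rough sketch of the generic entropy bookkeeping such compensation appeals rest on; the 290 K surface and 3 K sky figures are typical values, and the ~5,800 K solar photosphere temperature is an added assumption for illustration:]

    # Rough entropy bookkeeping per joule of solar heat (illustrative figures only)
    T_sun, T_surface, T_sky = 5800.0, 290.0, 3.0
    Q = 1.0  # joule

    dS_emit = -Q / T_sun        # entropy given up by the hot emitter
    dS_absorb = +Q / T_surface  # entropy gained on absorption at ~290 K
    print(dS_emit + dS_absorb)  # ~ +0.0033 J/K: net entropy rises on transfer

    # Re-radiating that joule from ~290 K toward the ~3 K sky exports far more entropy,
    # which is the generic margin that "compensation" arguments appeal to:
    print(-Q / T_surface + Q / T_sky)  # ~ +0.33 J/K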

But, RELEVANT is a key word; the compensating flow needs to credibly be connected to the constructive wiring diagram assembly work in hand to create an FSCO/I rich entity. It cannot just be free floating out there in a cloud cuckooland dream of getting forces of dissipation and disarrangement such as Brownian motion and diffusion to do a large body of constructive work.

That is the red herring-strawman fallacy involved in typical “compensation” arguments. There ain’t no “paper trail” that connects the claimed “compensation” to the energy transactions involved in the detailed construction work required to create FSCO/I.

[Or, let me clip the apt but often derided remarks of Mathematics Professor Granville Sewell, an expert on the (highly relevant!) subject of differential equations, from http://www.math.utep.edu/Faculty/sewell/articles/appendixd.pdf:

. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur.

The discovery that life on Earth developed through evolutionary “steps,” coupled with the observation that mutations and natural selection — like other natural forces — can cause (minor) change, is widely accepted in the scientific world as proof that natural selection — alone among all natural forces — can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic. In a recent Mathematical Intelligencer article [“A Mathematician’s View of Evolution,” The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way.1 . . . .

What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in “Can ANYTHING Happen in an Open System?”, “order can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door…. If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth’s atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here.” Evolution is a movie running backward, that is what makes it special.

THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn’t, that atoms would rearrange themselves into spaceships and computers and TV sets . . . [NB: Emphases added. I have also substituted in isolated system terminology as GS uses a different terminology. Cf as well his other remarks here and here.]]
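[A toy simulation (an illustration, not from Sewell) may help make the probabilistic character of diffusion vivid: particles started in one half of a bar and allowed to hop at random drift toward the near-uniform, statistically dominant profile:]

    import random

    random.seed(0)
    cells = 10
    # 1,000 particles all start in the left half of a 10-cell bar
    positions = [random.randrange(cells // 2) for _ in range(1000)]

    def profile(pos):
        counts = [0] * cells
        for p in pos:
            counts[p] += 1
        return counts

    print("start:", profile(positions))
    for _ in range(200000):  # many random single-particle hops, with reflecting walls
        i = random.randrange(len(positions))
        positions[i] = min(cells - 1, max(0, positions[i] + random.choice((-1, 1))))
    print("later:", profile(positions))  # counts drift toward roughly 100 per cell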

Call in the energy auditors!

Arrest that energy embezzler!

In short simple terms, with all due respect, you simply don’t know what you are talking about and yet traipse in to announce that others who do have a clue, misunderstand.

That does not compute, as the fictional Mr Spock was so fond of saying.

Such does not exactly commend evolutionary materialist ideology as the thoughtful man’s view of the world. But then, long since, that view has been known to be self-referentially incoherent. E.g. per Haldane’s subtle retort:

“It seems to me immensely unlikely that mind is a mere by-product of matter. For if my mental processes are determined wholly by the motions of atoms in my brain I have no reason to suppose that my beliefs are true. They may be sound chemically, but that does not make them sound logically. And hence I have no reason for supposing my brain to be composed of atoms. In order to escape from this necessity of sawing away the branch on which I am sitting, so to speak, I am compelled to believe that mind is not wholly conditioned by matter.” [“When I am dead,” in Possible Worlds: And Other Essays [1927], Chatto and Windus: London, 1932, reprint, p.209.]

I suggest to you that you would be well advised “tae think again.”>>

__________________

Time tae think again objectors. END

PS: For the benefit of Z at 17 below, here's an exploded view of a garden pond fountain pump:

pond_fountain pump

. . . and yes, just the drive motor alone would be FSCO/I.

137 Replies to “Piotr (and KS, DNA_Jock, VS, Z et al) and “compensation” arguments vs the energy audit police . . .”

  1. 1
    kairosfocus says:

    Call in the energy audit police!

  2. 2
    Zachriel says:

    kairosfocus: Call in the energy audit police!

    Which has lower entropy; a human brain, or a like mass of quartz?

  3. 3
    Hangonasec says:

    Given that the prevailing mood is that ‘materialistic’ OoL states have an insurmountable thermodynamic barrier to overcome, I would invite people coming at it from that angle to consider, in simplistic terms, the issues involved in construction of a functioning replicator. You aren’t just bolting together the components of a jet, wiring it up, filling the tank and then pressing the starter. You are micro-manipulating atoms in 3D to create a complex interacting, self-sustaining and self-replicating system, based mainly upon the exchange and sharing of electrons and coupling to proton gradients. ie, your ‘fuel’ is subatomic, and your components highly reactive and ordered on the individual molecular scale. Thermodynamics is not your friend. You don’t want anything to flow until you are good and ready to start the energy cascade. I’m not sure the engineers and programmers fully grasp the nature of this problem. Thermodynamics is a problem for any ‘naturalistic’ scenario. Of course, one can always imbue one’s Constructor with special powers, but the end result must work naturalistically.

  4. 4
    kairosfocus says:

    PS: I decided to add a clip by GS.

  5. 5
    kairosfocus says:

    Z,

    looks like I forgot to add you to the list!

    Your remark is irrelevant.

    H:

    At last, someone who is beginning to see what I am getting at.

    Yes, this is a gargantuan nanotech engineering task and the notion that forces of diffusion etc can cobble together such a vNSR is highly dubious, as well as the notion that irrelevant energy flows elsewhere can “compensate.”

    That said, I am highly confident that across this century the nanotech breakthroughs to do this sort of thing will be well in hand.

    Venter et al are first straws in the wind.

    Not to mention that IBM pic from 1989 . . . it hit me like a thunderbolt when I first saw it.

    vNSR based nanotech and clanking scale manufacturing looks like Industrial Revo 3.0 coming up.

    KF

  6. 6
    Hangonasec says:

    In short simple terms, with all due respect, […]

    That would be none, I’m guessing!

  7. 7
    kairosfocus says:

    Z, pardon the omission, now duly corrected. KF

  8. 8
    Hangonasec says:

    KF @5 Note then that simply saying ‘design’, or endlessly analogising jets and coins, does not address the issue. Your Designer must implement the plan, against such thermodynamic difficulties.

    The wider point would be, that all scenarios must take thermodynamics into account. I do not propose temporary suspension of the laws of physics, or a quantum-style ‘borrowing’ of probability from one quarter to pay for improbability elsewhere.

  9. 9
    kairosfocus says:

    H, actually, respect is always due to a fellow Son of Adam or Daughter of Eve (and yes, I am a Narnian). Besides, P is a prof of Linguistics; he is no dummy living in a basement and trolling those unwashed heathen IDiots. Just, he has overstepped his limits and has placed unwarranted confidence in a certain school of thought. Hence my allusion to Flower of Scotland, which BTW has a line in that verse that alludes to something literally written into my name. KF

  10. 10
    niwrad says:

    Great work! Thanks kairosfocus.

  11. 11
    kairosfocus says:

    H, I have very senior policy people on the line, gotta get to breakfast and onwards church. But I could not but notice your last. YES — there is a huge engineering challenge involved. Done waaaaay above the paygrade of any currently living engineer or scientist. First, we gotta crawl until we can begin to dimly grasp what was done. But, within 100 years is my bet. The world changes and solar system colonisation is the first big prize. KF

  12. 12
    kairosfocus says:

    Niw, welcome, and I really gotta go now! KF

  13. 13
    Hangonasec says:

    YES — there is a huge engineering challenge involved. Done waaaaay above the paygrade of any currently living engineer or scientist. First, we gotta crawl until we can begin to dimly grasp what was done. But, within 100 years is my bet.

    And therefore, at this point, naturalistic and non-naturalistic hypotheses are neck-and-neck (assuming anyone is actually working on the latter). One could just as readily wager that OoL chemists are 100 years away from a breakthrough. We are hardly likely to collect either way.

  14. 14
    kairosfocus says:

    H, got a moment; the pivotal and primary issue is natural factors of blind chance and necessity vs ART, i.e. design. FSCO/I is a reliable index of design and however obviously difficult the challenge, we have good reason to see that blind spontaneity is not a credible causal factor for what is needed. Technologies are routinely quite difficult and cumulative, it is not for nothing that rocket science is proverbial. KF

  15. 15
    Piotr says:

    #9 KF,

    Since you are starting this credentials game and you think it matters: I am a linguist but I’m not mathematically and physically illiterate. I also have a M.Eng. degree in electronic engineering from Warsaw Polytechnic. Among the courses I took there were dynamical systems theory, control theory and information theory (not to mention some fairly advanced maths and physics). It doesn’t make me an expert, but I’m not a total layman either. My opinions are well considered and don’t result from believing the wrong people and betting on the wrong horse.

    Given the expertise of some of the most eminent IDers here, I can add that people who live in glass houses should be careful with stones. I should also point out that I use my real name. I understand and respect other people’s right to Internet anonymity as long as they don’t try to take advantage of being able to know more about me than I can learn about them. I consider myself an informed amateur. I don’t pose as an authority. If I wanted to, I wouldn’t be as open as I am about my real credentials. This is just a blog where everyone can participate in the discussion, whatever (if any) their qualifications are.

    Care to answer Zachriel’s question @#2, KF?

  16. 16
    niwrad says:

    Piotr #15

    I also have a M.Eng. degree in electronic engineering from Warsaw Polytechnic. Among the courses I took there were dynamical systems theory, control theory and information theory (not to mention some fairly advanced maths and physics).

    Very good. So you should have the background to recognize organization where it is, and what can create it. Unfortunately you seem to see organization where there are simple patterns generated by nature. Moreover, being an evolutionist, you believe even in “self-organization”, i.e. engineering by chance and necessity, something impossible in principle. Don’t you think there is a bit of incoherence between your level of education in science and your evolutionist “faith”? Just for curiosity, I don’t intend to be offensive.

  17. 17
    Zachriel says:

    niwrad: Moreover, being an evolutionist, you believe even in “self-organization”, i.e. engineering by chance and necessity, something impossible in principle.

    Claiming it’s impossible in principle is just a way of saying it’s not open to argument.

    Is a water-pump an example of organization?

  18. 18
    kairosfocus says:

    Piotr:

    Good, so you should be able to address statistical thermodynamics as foundation to classical. Here is a good start point for a modern understanding of the informational approach.

    All I will add to what has already been noted is that I find the perspective that entropy is a metric index of avg missing info to specify microstate given macrostate (often phrased in terms of degrees of freedom, i.e. uncertainty regarding state) a key insight.

    KF

    PS: I should add as you are a Pole, I have always had a great respect for Poland as a great nation consistently punching above its weight class in a bad neighbourhood; a nation that saved Western civilisation in 1683 under Jan III Sobieski and arguably again by handing the key to the Enigma machine to the Allies. And then there’s John Paul II, the Great, in a class by himself. Blonie fields, June 1979, where the tide of history turned. (And I speak thusly as a convinced Protestant.)

  19. 19
    kairosfocus says:

    Z and Niw,

    The degree of FSCO/I involved in the pump is a material consideration. Planet-scale Coriolis forces and convection can generate hurricanes.

    But something like a four-chamber heart or a force pump or a rotodynamic pump or even the classic bicycle pump or a motorised fan . . . or of course a high bypass fan jet . . . is an utterly different story. For that matter, a rotating alternator . . . an electron pump.

    FSCO/I is an index of what we can hit by blind chance and mechanical necessity leading to blind walks across config spaces, vs what would swamp the search resources of the observed cosmos to find islands of function. Hill climbing within islands with reasonably well behaved fitness indicia connected to variable configuration metrics is again a different story.

    The material challenge is to get to shores of islands of function.

    And, I repeat, irrelevant energy flows have little to do with the hopes being expressed that overwhelmingly dissipative forces such as diffusion etc, would do significant quantities of constructive work.

    KF

  20. 20
    Seversky says:

    I’m also curious to know, because I’m still trying to grasp the concept of entropy, which does have the lower entropy: a human brain or a like mass of quartz?

  21. 21
    kairosfocus says:

    H, are you aware that my little thought exercise of making the atoms of the Sol system or the observed cosmos into fast Chem rxn time observers and giving them 500 or 1,000 coins each — each coin a stand-in for a 1-bit register — flipping and observing at 100 THz, is a way of making vivid the action limits of available atomic resources, and the upper limits on observers made out of atomic matter? (Think I don’t know the absolute majority of the atoms around are H and He?) So pardon a thought exercise in the guise of a parlour game; it has a serious purpose behind the playful face that a smart 6-year-old could deal with. And do forgive me the indulgence of a coin with a relative on it. KF

  22. 22
    kairosfocus says:

    Sev, again, the relevant point for our purposes is that just one cell in that brain has in it more than enough FSCO/I to point to design as the inductively backed credible cause. As to the astonishing neural network architecture of 10^11 neurons (and what of glial cells?) with 10^14 connexions, and how it grows in the embryo then further organises in that reflexive process we call education, that takes us through the roof. The FSCO/I challenge to the compensation claim comes long before we get to such, oh, what, 3.5 – 3.8 or is it argued 4.2 BYA, in a warm pond or whatever environment you think feasible. KF

    PS: The best succinct explanation of entropy I have seen is from G N Lewis and is clipped in the OP: avg metric of missing info to specify microstate given the lab scale parameters that specify directly observable macrostate.

  23. 23
    niwrad says:

    Zachriel #17

    Is a water-pump an example of organization?

    Indeed, this weekend I have one of my garden’s water pumps disassembled on the table of my lab. They often break down (2nd law…) and I try to repair them, as I do with almost all the tools/devices of my house. Its problem is electrical leakage (in other words, water gets into the motor housing, a classic problem with electric water pumps that cost a few dollars).

    Yes, those pumps are examples of a first organizational level, a hierarchy of functions and of course the control/power paradigm. Their controls can be more or less sophisticated depending on the cost; however, even the cheapest ones must have controls (on/off depending on the water depth, over-heating…).

  24. 24
    kairosfocus says:

    Z, I have added a Garden pond pump exploded view as a PS. Yup, FSCO/I. KF

  25. 25
    keith s says:

    KF,

    You’re still running away from my question:

    kairosfocus,

    You are avoiding the question:

    I’m still waiting for the resident thermodynamic geniuses to explain how photosynthesis is possible if the compensation argument is invalid.

    We both know the answer: the compensation argument is correct. The local entropy reduction due to photosynthesis is compensated for by an increase of entropy in the surroundings — just as the second law requires.

  26. 26
    niwrad says:

    kairosfocus

    Good. I like exploded views of apparatuses because they give a synthetic idea of their composition and help their repair. Thanks.

  27. 27
    keith s says:

    And:

    KF #187,

    As I already explained, if you deny the compensation argument, you are denying the second law itself.

    Not a smart move.

  28. 28
    niwrad says:

    keith

    The compensation working in photosynthesis has nothing to do with the compensation-argument-supporting-naturalistic-OOL. The former is a thermal entropy issue (similar to the one involved in a refrigerator, or any thermal machine). The latter doesn’t work because — in two words — there is no such thing as import/export of probability (see G. Sewell…)

  29. 29
    kairosfocus says:

    KS, with all due respect, I am calling the energy audit police on you. What is the energy transactions paper trail connexion between the creation of FSCO/I rich nanomachines of life in the cell (including those for photosynthesis) and the constructive work to be done by the process of creating such, to say rise in entropy of the sun as it [Piotr aptly pointed out an error of phrasing, though one that does not change the key point, following: better, >>rise in entropy of the overall cosmos as the sun>>] irradiates the warm little pond or as lightning hits it or whatever scenarios you may prefer. Absent that paper trail that accounts for the required organising work, all you are implying is that by astonishing luck molecular noise, diffusion and other forces overwhelmingly of degradation performed astonishing nanomolecular construction and organising feats. A basic config space challenge vs available blind search atomic and temporal resources — on statistical analysis directly linked to that which gives the foundation for 2LOT, indicates this is maximally implausible to the point where it can be safely rounded to no credible chance. In the teeth of a trillion case inductive evidence base on what causes FSCO/I, multiplied by the blind needle in haystack search challenge, you are insisting on statistical miracles and refusing to recognise the micro underpinnings of the macro view that yields 2LOT as a summary statement. The reason is, a priori commitment to evolutionary materialism from Hydrogen to humans, or to one of its fellow travellers. And the notion that irrelevant energy fluxes “compensate” for the processes, fails. Indeed in Clausius’ own analysis, the relevant increment of heat moves to B and strongly tends to increase its disorder. Think, car dash board burned and cracked by the Sun, or the like. Time to think again, insistently repeating long since cogently corrected errors and blind spots as I have again outlined will not magically transmute them into a sound analysis. KF

  30. 30
    Zachriel says:

    Zachriel: Is a water-pump an example of organization?

    niwrad: Indeed

    The monsoon is a water pump, somewhat more powerful than your garden pump. Very useful too.

  31. 31
    niwrad says:

    Zachriel

    The monsoon is a water pump.

    Provide an exploded view so we can see its functional hierarchy and its controls/power apparatuses. The monsoon is a wind.

  32. 32
    Piotr says:

    …rise in entropy of the sun as it irradiates the warm little pond…

    Do me a personal favour, KF. I have already pointed it out to Niwrad, and I can see that other people have done so before me. It’s really a very basic thing that you need to grasp before you reach for a statistical thermophysics textbook: the Sun lowers its entropy by emitting radiation. Don’t say again that its entropy rises.

    From the Sun’s point of view the surrounding space is one vast energy sink with a mean temperature of about 3K. A tiny portion of the energy emitted by the Sun reaches the Earth (whose mean surface temperature is about 290K). This means that from a terrestrial perspective the Sun is an energy source and the surrounding space still an energy sink. The photons from the Sun arrive from one direction, and there are lots of other directions in which waste heat can be emitted. If the Sun warms a pond during the day, the pond can cool during the night. This is what results in energy flows. Some of that energy can be diverted to power little turbines sustaining local non-equilibrium systems whose entropy can be lowered without violating LOT2.

    If every part of the sky radiated like the Sun, the arriving energy would have no sink to go to, and entropy could not be exported. The radiation would go on heating the Earth until thermal equilibrium were reached. If there were no Sun, there would be a cold sink surrounding the Earth, but no low-entropy source of energy (ignoring some chemical sources). Complex life would not be possible, and the Earth would slowly cool until it reached thermal equilibrium with the background radiation.

  33. 33
    Box says:

    Zachriel: The monsoon is a water pump, somewhat more powerful than your garden pump.

    Yes and of course the sun is a “heating system” somewhat more powerful than your central heating and the north pole is a “cooling system” somewhat more powerful than your refrigerator.

    The arguments get better and better.

  34. 34
    Zachriel says:

    niwrad: Provide an exploded view so we can see its functional hierarchy and its controls/power apparatuses. The monsoon is a wind.

    Where do you want to start? With the fusion reactor which powers the pump perhaps? The entire two-cycle system also includes the rotation of the Earth, the tilt of its axis, the configuration of the land and sea, the evaporation of water, the role of the adiabatic lapse rate, dew point, cold sink of space, etc.

    In any case, as you said, a water pump is an example of organization, and the monsoons act as a water pump. As the monsoon rains occur naturally, we can conclude that organization can occur naturally. ETA: Pretty ingenious design, actually. Lots of moving parts, but elegant and self-regulating.

  35. 35
    keith s says:

    niwrad,

    The compensation working in photosynthesis has nothing to do with the compensation-argument-supporting-naturalistic-OOL.

    Sure it does. In both cases, local entropy decreases are permitted by the second law provided that there is a compensatory export of entropy to the surroundings.

    The latter doesn’t work because — in two words — there is not such thing as import/export of probability (see G. Sewell…)

    You seem to be referring to some made-up physical law involving the “import/export of probability”. Could you state that “law” so that we can evaluate it?

  36. 36
    Piotr says:

    Niwrad,

    Water pumps, fishing reels — anything else designed, manufactured, assembled and used by humans — these things are also part of the physical universe. They cannot self-assemble from minerals (and nobody claims they can), but they do not appear by magic either, and no laws of physics (including LOT2) are violated in their production.

    Of course it takes billions of years to produce a human-made water pump, since by definition such pumps are made by humans (using their intelligence), so you need humans first. Humans are products of biological evolution, so before you can have anything like humans, life has to be possible. The origin of life requires some prebiotic chemistry (not yet reconstructed), etc. It’s a stepwise process with a huge number of steps, so you can’t even speak meaningfully of the “probability” of a water pump. There are no short cuts in this process, which is why we don’t find fossil remains of pumps similar to yours in, say, Devonian rocks.

    What, by the way, is the entropy of a water pump compared to the entropy of its parts scattered on your table? Is it higher, lower or the same, and how do you know?

  37. 37
    keith s says:

    KF,

    KS, with all due respect, I am calling the energy audit police on you.

    Why? Nothing about the compensation argument violates any of the laws of physics, including the first law of thermodynamics.

  38. 38
    niwrad says:

    Zachriel

    You are enlarging the scenario from the monsoon almost to the entire universe! If you say that the cosmos is organized and designed I agree. Also its laws were designed and act as secunda causa. These laws cause, among many other things, the monsoons. What they cannot cause — taken alone — is life. Life needs additional injection of organization.

  39. 39
    Zachriel says:

    niwrad: You are enlarging the scenario from the monsoon almost to the entire universe! If you say that the cosmos is organized and designed I agree.

    We’re just considering the monsoons at this time, a two-cycle system with multiple parts that pump water.

    It seems you now agree that organization can occur due to simple natural forces.

  40. 40
    Piotr says:

    And the Earth’s magnetic field is generated by a rather complex, highly organised natural dynamo, powered by the flow of heat from the inner core to the core-mantle boundary and regulated by the rotation of the Earth.

  41. 41
    Hangonasec says:

    niwrad @28 –

    keith

    The compensation working in photosynthesis has nothing to do with the compensation-argument-supporting-naturalistic-OOL. The former is a thermal entropy issue (similar to the one involved in a refrigerator, or any thermal machine).

    Dead wrong, I am afraid. Photosynthesis does not work by ‘thermal’ entropy. The light operates due to photons affecting the electronegativity of chemical intermediates. This is different from giving them increased thermal energy.

    People on both sides should in general avoid making too much of the relation with the sun IMO. It is clearly the primary source of energetic electrons in modern life, but there is a much simpler source still widely exploited: ‘inorganic’ electron donors, in chemotrophs. This is much more likely to be a primitive energy source than light.

  42. 42
    Piotr says:

    …there is a much simpler source still widely exploited: ‘inorganic’ electron donors, in chemotrophs. This is much more likely to be a primitive energy source than light.

    I completely agree. Life tapped into solar energy on a grand scale with the appearance of phototrophs, but it surely relied on chemical energy sources in its earliest history. I would claim that the Sun was still importantly involved in driving the chemical cycles of the young Earth, but its role in the OOL was indirect.

  43. 43
    niwrad says:

    Zachriel

    I don’t agree at all. To compare monsoons, or any weather phenomenon, to the organization of life is nonsense. We have already discussed this topic many times. We will never agree. I have no intention to run into an infinite loop.

  44. 44
    Box says:

    Piotr: And the Earth’s magnetic field is generated by a rather complex, highly organised natural dynamo, powered by the flow of heat from the inner core to the core-mantle boundary and regulated by the rotation of the Earth.

    Yes and maybe the Himalayan peaks can serve as DNA-code! Throw in some monsoons and hurricanes and one just might have a primitive replicator.

  45. 45
    Hangonasec says:

    Piotr

    I would claim that the Sun was still importantly involved in driving the chemical cycles of the young Earth […]

    Some of them, sure. But the prime source of the energy exploited by chemotrophs is arguably a previous sun! 🙂

  46. 46
    Piotr says:

    #44 Box,

    Do you mean that the outer core does not function as a dynamo?

  47. 47
    Box says:

    Piotr,

    I mean that the relevance to OOL is somewhat blurry.

  48. 48
    Zachriel says:

    niwrad: To compare monsoons, or any weather phenomenon, to the organization of life is nonsense.

    We didn’t. We noted that, per your own evaluation, spontaneous organization can occur in nature.

  49. 49
    fifthmonarchyman says:

    niwrad says,

    To compare monsoons, or any weather phenomenon, to the organization of life is nonsense.

    I say,

    This is a classic case of logical bait and switch.

    Zac wants you to concede his entire argument when he hasn’t even demonstrated his premises.

    Zac’s syllogism looks like this.

    premise 1) Monsoons clearly arise “naturally” (i.e. without intelligent design)

    premise 2) monsoons are water pumps

    conclusion:===>water pumps can arise “naturally”

    The problem begins with his first premise.

    We don’t know if monsoons can arise without intelligent design, and the Christian will deny that premise outright (Job 37:3-6).

    All we really know is that monsoons while very complex are simpler than biological pumps.

    Intelligent design does not rule out the design of simple things. It instead infers design from a certain kind of complexity.

    A much better syllogism would read like this

    premise 1) Because of their complexity, monsoons arise relatively infrequently in the universe.

    premise 2) Pumps in biology are incalculably more complex than monsoons.

    conclusion:===> In our universe a biological pump is incredibly less probable than a monsoon.

    peace

  50. 50
    Piotr says:

    #47 Box,

    But isn’t it an example of a self-organised “machine” functioning like a man-made generator, only much bigger and working efficiently for billions of years?

  51. 51
    kairosfocus says:

    Piotr 32,

    In 29, I do think I wrote a phrase that was poorly put.

    Rise in entropy of the cosmos as a whole, yes. The sun, as a radiator, will from the act of radiating tend to lose entropy. Deeper in the sun, breakdown processes are of course in train that point to its own end as nuclear fuels get used up.

    The main point, though, is that as the pond imports radiation its entropy tends to rise. Export of waste energy elsewhere, typically to heat sinks, leads to the overall balancing.

    The main issue is, again, there is FSCO/I to be accounted for.

    KF

  52. 52
    DNA_Jock says:

    Kf,
    I am positively thrilled that you called the energy audit police, and that you cite Lewis and Landauer to support your point that entropy represents a lack of information.

    The energy audit police have a question:

    How much water would you need to melt to account for the information content of the human genome?
    It's a simple enough calculation; why are you so reticent?
    Are you "taking the fifth"?
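
    For onlookers, here is a minimal sketch of one way such an audit is often set up; the genome size, the 2-bits-per-base-pair figure and the use of Landauer's bound are illustrative assumptions, not a claim about how anyone in this thread would run the numbers.

    import math

    k_B = 1.380649e-23                   # Boltzmann constant, J/K
    bits = 3.2e9 * 2                     # assumed: ~3.2e9 base pairs at 2 bits each
    dS_info = bits * k_B * math.log(2)   # Landauer bound: k*ln(2) of entropy per bit, J/K

    L_fusion = 3.34e5                    # latent heat of fusion of ice, J/kg
    T_melt = 273.15                      # melting point of ice, K
    dS_per_kg = L_fusion / T_melt        # entropy gained per kg of ice melted, J/(K*kg)

    mass_kg = dS_info / dS_per_kg
    print(f"Entropy equivalent of the assumed genome info: {dS_info:.2e} J/K")
    print(f"Ice whose melting yields that much entropy:    {mass_kg:.2e} kg "
          f"(~{mass_kg * 1e18:.0f} femtograms)")

    On those assumptions the answer comes out in the tens of femtograms of ice, which is the scale comparison the question is driving at.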

  53. 53
    kairosfocus says:

    F/N: Convection currents in the atmosphere or in the Earth's interior, vortices and the like — similar to crystals — are not cases of the interactive function of assembled parts creating FSCO/I. This goes back to Orgel, Wicken and co and the distinction between order and organisation. A distinction that many would wish blurred. KF

  54. 54
    kairosfocus says:

    DNA_Jock:

    Again, a red herring tangent.

    Melting of ice is not relevant to the audit trail on energy flow that leads to formation of a brain and CNS in the womb. This is an FSCO/I-rich process and result . . . possibly the highest commonly seen case. However, as beyond 500 – 1,000 bits the material point is already on the table, there is no need for overkill. Put it this way: the directly observed evidence that leads to the empirically grounded conclusion that neurons or even cells, much less brains, can and do form spontaneously through blind chance and mechanical necessity is ________ ?
    (No just-so stories loaded with a priori materialism or without direct observation sufficient to establish vera causa admitted.)

    The side track reflects a studious refusal to address the statistical underpinnings of 2LOT, which for 100+ years have been integral to it. To hope for forces of diffusion, or similar generally disorganising forces, to carry out a considerable body of constructive work is to believe in the sort of statistical miracles that that analysis rules utterly implausible once we are at any reasonable size of system.

    That some objectors keep pressing the point over and over as though it made a serious point, shows the depth of the conceptual gap at work.

    I simply point out the note from G N Lewis: the entropy of an entity is the average missing info needed to specify its microstate given its observable macrostate. Where complex, interactively functional combinations are tightly confining.

    KF

  55. 55
    Piotr says:

    #53 KF,

    Wait a moment, convection currents in the liquid outer core are not the dynamo. The whole thing is much more complex. The kinetic energy of the Earth's rotation organises the currents (via the Coriolis effect) into a system of parallel Taylor columns (rather than random turbulence). There are feedback loops making the system self-sustaining and metastable. The magnetic poles may get reversed every now and then, but the geodynamo quickly re-aligns its columns with the planetary axis, restoring a stable polarity.

  56. 56
    DNA_Jock says:

    No kf,

    My question is absolutely central to the “statistical underpinnings of 2LOT”, as you put it.

    I see that you are “taking the fifth”.

    Wise move.

  57. 57
    kairosfocus says:

    DNA_Jock, I think the astute reader can see well enough what you have been distracting attention from. KF

  58. 58
    kairosfocus says:

    Piotr, that is order driven by mechanical necessity, not aperiodic, wiring-diagram organisation . . . thus, it is highly probable under similar circumstances. It is the latter pattern (FSCO/I) that has both high contingency and, at the same time and place, functional specificity of relevant particular aspects. Of course, to set up the dynamics you highlight, there seem to be some fairly special characteristics of our home world, going with many other ways in which it is at least A highly privileged planet, in a deeply fine-tuned observed cosmos. These, arguably, exhibit FSCO/I and point to design of our home world and the cosmos it sits in, in ways that ground C-chemistry, aqueous-medium, terrestrial-planet, cell-based life in a circumstellar habitable zone in a spiral galaxy habitable zone that also invites exploration and discovery. KF

  59. 59
    niwrad says:

    How sweet.
    Now evolutionists are describing to us a nice scenario a la "privileged planet". We have an energy engine, the Sun. We have a useful water pump, the monsoon. We have a dynamo, the Earth's magnetic field…
    In this tuned Eden, at a certain point, a wonderful process happened: the spontaneous arising of millions of highly complex and different organisms, molecules-to-man evolution.

    All this explosion of organization, lasting millions of years, obviously without the need of a bit of design, and also counter to a fundamental law prescribing a trend to disorder!

  60. 60
    keith s says:

    kairosfocus,

    It’s amusing to see you dodging all these questions, including mine:

    kairosfocus,

    You are avoiding the question:

    I’m still waiting for the resident thermodynamic geniuses to explain how photosynthesis is possible if the compensation argument is invalid.

    We both know the answer: the compensation argument is correct. The local entropy reduction due to photosynthesis is compensated for by an increase of entropy in the surroundings — just as the second law requires.

  61. 61
    keith s says:

    niwrad,

    What we’ve been telling you throughout these two threads, and in all of the prior threads on this topic, is that:

    1. Evolution and OOL don’t violate the second law.
    2. Evolution and OOL don’t require that the second law be “held at bay”, to use Eric’s inopportune phrase.
    3. Evolution and OOL don’t require the second law to be “overcome”, as Box put it.
    4. The compensation argument is valid for showing that a phenomenon does not violate the second law.

    You obviously think that OOL is improbable under naturalistic scenarios, and you’re free to make that argument if you can. Our point is that you cannot make that argument on the basis of the second law, as we’ve explained over and over.

    You would love it if evolution and/or OOL violated the second law, but they don’t. The scientific community understands this. It’s time for you to catch up.

  62. 62
    kairosfocus says:

    KS, that's a turnabout projection. I note across several threads that the circle of objectors has yet to acknowledge that the classic 2LOT statements for 100+ years have been inextricably linked to microstate statistical thermodynamics foundations, or that it is a responsible view to understand entropy in terms of missing info to specify microstate given a macrostate. Then, there is the point that FSCO/I is real and relevant AND is, on observation, produced reliably (trillions of cases) by intelligently directed configuration, which obtains at micro and not just macro levels. On these, I again point out that the persistent attempts to draw forth a world of cell based life from lucky molecular noise and similar forces of disorder become quite revealing. I point out — adding — that 2LOT is revealed as a strong statistical generalisation on what is so statistically implausible that it is a practical unobservable, and that the hoped-for lucky noise scenarios uniformly boil down to putting hope in statistical miracles. KF

  63. 63
    keith s says:

    kairosfocus,

    You’re still avoiding my question.

    Everyone knows why.

  64. 64
    niwrad says:

    keith

    You would love it if evolution and/or OOL violated the second law, but they don’t. The scientific community understands this. It’s time for you to catch up.

    Can I laugh? The "scientific community" is what has kept alive for 150 years the biggest unscientific lie in the history of mankind, evolutionism. Do you know how much I trust the "scientific community"…

  65. 65
    niwrad says:

    keith

    P.S. In the next article I will explain why your compensation-argument-that-allows-spontaneous-organization doesn’t work, if I find the time to write it…

  66. 66
    Piotr says:

    KF

    This goes back to Orgel, Wicken and co and the distinction between order and organisation. A distinction that many would wish blurred.

    “Evolution is cleverer than you are.” — Leslie Orgel

  67. 67
    Piotr says:

    #65

    Thanks for the laugh.

    “Gee, Kairosfocus. What are we going to do tonight?”
    “The same thing we do every night, Niwrad. Try to take over the world.”

    May I/OCSF be with you, Niwrad!

    PIOTR, That is an unwarranted personality and nonsense. It should be beneath you. KF

  68. 68
    Hangonasec says:

    KF@62

    I note across several threads that the circle of objectors has yet to acknowledge that the classic 2LOT statements for 100+ years have been inextricably linked to microstate statistical thermodynamics foundations, or that it is a responsible view to understand entropy in terms of missing info to specify microstate given a macrostate.

    Sure, microstate/macrostate treatments are probabilistic. However, no-one on this side is claiming that OoL chemistry must violate that probabilistic issue. Quite the opposite. It is nonetheless an error to conflate statistical thermodynamics with the probabilistic issues involved in achieving a particular (and so far unknown) molecular configuration. No-one is asking for a special-dispensation exemption from actual thermodynamics. When a series of chemical bonds is formed extending a polymer, that is associated with a net increase in entropy regardless which actual residues become incorporated. The connection between chemical and informational thermodynamics is a mathematical one.

    The probability issue in chemistry relates to the probability of energy shed from a system and widely distributed returning to it, NOT the probability of a particular polymeric configuration arising in a particular physicochemical environment.

    Non-enzymatic RNA polymerisation proceeds to a certain short chain length and then stops due to cyclisation. Free RNA strands tend to be subject to chain breakage due to flexion and attack by the 2’ oxygen. Hybridisation of short oligonucleotides increases their persistence due to reduced flexion. Those are thermodynamic issues relevant to the OoL – in all those cases, free energy is shed. In only one, however, is there an increase in ‘disorder’. The thermodynamic constraint is exactly the same regardless what the polymeric product is. You are dressing Hoyle’s argument up in fancy physical garb. The energy audit police have passed this one on to the conflation department.

  69. 69
    kairosfocus says:

    H, you have at least formally acknowledged the link. Now work through its implications. Ponder the creation of a code and communication system — D/RNA — out of lucky noise, and of linked algorithms, as well as specific instructions for the thousands of proteins (averaging c. 250 AA even at the low end) that are deeply isolated in AA sequence space, solving chirality along the way. That, I put to you, is a belief in statistical miracles from nothing more than lucky noise. KF

  70. 70
    kairosfocus says:

    Piotr, OOL is before "evolution" can begin. You have to appeal to forces of diffusion, chemical kinetics and the like in a warm pond or whatever prebiotic setting is the choice of the day. In the teeth of the mutual ruin as argued out by Shapiro . . . and ORGEL. KF

  71. 71
    kairosfocus says:

    KS,

    Yes, I refuse to be drawn away on irrelevancies, having answered the material issue: melting ice has little relevance to formation of the body, including the brain, in utero — an astonishing technical performance in itself.

    All I need for relevant purposes is to credibly have more than 125 bytes of information in an object or process that exhibits FSCO/I, and the induction on trillions of cases warrants the conclusion of design:

    cells are designed,

    embryological development is designed,

    neurons are designed,

    the human brain is designed,

    human embryological development is designed.

    To overturn such, "all" you need is to produce a credible case of FSCO/I arising by clearly observed blind chance and mechanical necessity — but of course you have not, as you cannot.

    Which, given the blind needle in haystack walk across ultra-astronomical config spaces, is utterly unsurprising. On a pattern of reasoning inextricably tied to the foundations of 2LOT . . . against which you have only posed irrelevancies and an inadvertent demonstration of belief in convenient and abundant statistical miracles without any significant observational evidence that such are observable.

    Essentially, all your arguments show a strong pattern of distraction from evidence.

    You should consider again 62 in the context of 69.

    KF

  72. 72
    Zachriel says:

    fifthmonarchyman: Zac’s syllogism looks like this.

    premise 1) Monsoons clearly arise "naturally" (i.e. without intelligent design)

    premise 2) Monsoons are water pumps

    conclusion: ===> water pumps can arise "naturally"

    No. This is the syllogism.

    n: Organization can’t arise naturally.
    n: Water pumps are an example of organization.
    z: Monsoons are natural water pumps, an example of organization, therefore #1 is false.

    fifthmonarchyman: We don't know if monsoons can arise without intelligent design, and the Christian will deny that premise outright (Job 37:3-6).

    If monsoon rains are not natural, then the term has no meaning.

    kairosfocus: melting ice has little relevance to formation of the body, including the brain, in utero

    That's not correct. The formation of the body, including the brain, in utero, has to occur in conformity with the 2nd law of thermodynamics. The dissipation of energy is required, which has a standard formulation.

    DNA_Jock: How much water would you need to melt to account for the information content of the human genome? It's a simple enough calculation; why are you so reticent?

    We’d be interested in kairosfocus’s calculation.

  73. 73
    Hangonasec says:

    KF @69

    H, you have at least formally acknowledged the link. Now work through its implications.

    The link is mathematical, just as there is a mathematical link between biological fitness and compound interest, which does not make them related in any other way.

    Whatever ‘2nd Law’ you are talking about, it is not the Second Law of Thermodynamics. I dream of the day when someone here acknowledges that fact.

  74. 74
    kairosfocus says:

    H, you revert. Again, I simply point out that for over 100 years now, the 2nd law has been inextricably understood to be a direct consequence of the microstate, statistical underpinnings of macro-state conditions. The informational perspective is more recent and is debated but clearly has significant force as shown and outlined. Time to face that history of ideas and where it points. KF

  75. 75
    kairosfocus says:

    Z, Yes, body and brain formation do conform, and plainly manifest the tightly regulated embryonic programs and workings of the FSCO/I-rich nanomachines of life in that automaton we call the living cell. The core relevant issue is the origin of those FSCO/I-rich programs and executing nanomachinery. Formation from lucky molecular noise by statistical miracle after miracle is not a credible explanation . . . and 100 years after Gibbs and Boltzmann etc etc, this issue of the statistical behaviour of microparticles and the strong trend of motion towards clusters of microstates of greater statistical weight is the inextricable underpinning of 2LOT, to the point that it can properly be regarded as a part of it. Though it is ideologically imposed and enforced. It is time for fresh thinking informed by the well warranted conclusion that FSCO/I is a highly reliable sign of design — intelligently directed configuration — as cause. KF

  76. 76
    Piotr says:

    PIOTR, That is an unwarranted personality and nonsense. It should be beneath you. KF

    It works both ways. Please tell Niwrad that calling the general consensus of scientists “the biggest unscientific lie” going on for 150 years is ridiculous as well as libellous.

  77. 77
    niwrad says:

    Piotr

    Libellous? It was a personal opinion. Can I have personal opinions? Is there freedom of speech here or not?

  78. 78
    Hangonasec says:

    H, you revert. Again, I simply point out that for over 100 years now, the 2nd law has been inextricably understood to be a direct consequence of the microstate, statistical underpinnings of macro-state conditions. The informational perspective is more recent and is debated but clearly has significant force as shown and outlined. Time to face that history of ideas and where it points. KF

    Time to stop confusing two concepts with different implications for physical systems. Life and evolution DO NOT violate the 2nd Law of Thermodynamics. Simple question: do you agree? A one-word answer will be perfectly sufficient.

  79. 79
    velikovskys says:

    niwrad:

    Is there freedom of speech here or not?

    Not here

  80. 80
    DNA_Jock says:

    Freedom of speech? No, not really, while kf feels justified in defacing others’ posts.

    Kf did note

    Again, I simply point out that for over 100 years now, the 2nd law has been inextricably understood to be a direct consequence of the microstate, statistical underpinnings of macro-state conditions. The informational perspective is more recent and is debated but clearly has significant force as shown and outlined.

    Soooo let’s explore the informational perspective: How much water would you need to melt to account for the information content of the human genome?

    Every time you bring up the informational perspective, I will point out your refusal to do a simple calculation. As I noted on the previous thread,

    You will need to be explicit about how you measure "configurational work" and how you quantify "FSCO/I rich functional states" for a biological system. When you provide examples, please avoid the mistake of assuming that the elements are independent of each other.
    I note with some amusement that this is something that you have never done, despite repeated requests. I also look forward to someone, anyone, demonstrating how to quantify 'organization'.

    it’s a FIASCO.

  81. 81
    niwrad says:

    DNA_Jock

    I also look forward to someone, anyone, demonstrating how to quantify ‘organization’.

    It is like trying to quantify quality. To perfectly quantify quality is impossible by definition.
    See:

    http://www.uncommondescent.com.....nt-design/

    Could you quantify ideas? You can have some very defective measures of organization, e.g. how many sub-blocks a system has, how many functions, how many controls, how many codes are involved, how many communication channels, and so on…

    But to reduce organization to a single number is utopian.

  82. 82
    keith s says:

    KF:

    Yes, I refuse to be drawn away on irrelevancies,

    Compensation arguments are the topic of this thread, which you started.

    You are afraid to answer a simple question on your own chosen topic.

    Everyone knows why.

  83. 83
    kairosfocus says:

    DNA_Jock, Piotr et al:

    You have a choice.

    By keeping a civil tongue in your heads and keeping reasonably to the substantial matter, you can have a serious discussion.

    Choose instead to descend into trollish, schoolyard-taunt-level behaviour and I will first warn, then act to preserve reasonable order. And DNA, this is, IIRC, the second time you have done such in the past little while.

    This is the only further warning you will get from me.

    One of the things you are showing is just how evolutionary materialist scientism undermines values, a sense of community and respect. Exactly what you love to protest when it is pointed out at philosophical level.

    Niw,

    If there is a serious complaint for cause, please tone down the voltage. Tolerating broken windows opens the door to escalating disorder.

    KF

    PS: An editorial warning on misbehaviour, FYI, is not "defacing." It is taking my moderation responsibilities seriously.

  84. 84
    kairosfocus says:

    H, repeating a mantra does not make it so. For over 100 years, 2LOT has been inextricably tied to its statistical underpinnings. That is a fact. It may be inconvenient for your view of things, dependent as it is on statistical miracles, but it is a clear difficulty and obviously one that you want to wish away. KF

  85. 85
    Eric Anderson says:

    kairosfocus:

    Great post on problems with the “compensation” line of argumentation.

    This is also important for this topic:

    But the same statistics underpinning 2LOT, and integral to it for over 100 years, highlight that such amounts to expecting randomising forces or phenomena such as diffusion to carry out complex, specific patterns of constructive work. The relevant statistics and their upshot are massively against such. The non-functional, clumped-at-random possibilities vastly outnumber the functionally specific ones, much less the scattered ones.

    This probability issue is key. It was hinted at by keith s in niwrad’s thread, and is one potential area of agreement (not the conclusion, but the issue) if we can get past some of the other red herrings brought up by supporters of materialistic abiogenesis. I’ve got a post almost done that hopefully I can get up later today with a brief summary of some of the myths . . .

  86. 86
    keith s says:

    Eric,

    Perhaps you can answer the question that KF is avoiding.

    How is photosynthesis possible if the compensation argument is invalid?

  87. 87
    DNA_Jock says:

    So kf,

    am I to take your post #83 as a threat to ban me from this thread?

    As a reminder, the title of this thread is:

    Piotr (and KS, DNA_Jock, VS, Z et al) and “compensation” arguments vs the energy audit police . . .
    [emphasis added]

    In the OP you state

    Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).

    and go on to state

    Such is not a violation of 2LOT, as e.g. Szilard’s analysis of Maxwell’s Demon shows. There is a relevant heat or energy flow and degradation process that compensates the reduction in freedom of possibilities implied in constructing an FSCO/I rich entity.

    All I am doing is pointing out your continued refusal to actually perform a calculation that illustrates the trade-off between “informational” and “thermal” entropy that would be an essential part of any “compensation argument” that included “information”.

    It’s certainly not “off-topic”, and I fail to see the incivility of repeating a rebuttal if you keep repeating the original, fallacious argument, as in

    Again, I simply point out that for over 100 years now, the 2nd law has been inextricably understood to be a direct consequence of the microstate, statistical underpinnings of macro-state conditions. The informational perspective is more recent and is debated but clearly has significant force as shown and outlined. Time to face that history of ideas and where it points.

    But some people are more sensitive, I guess.

  88. 88
    Piotr says:

    KF,

    Since you respect the founding fathers of thermodynamics so much and love to take their names in vain, here’s a quote (one of many such) from Ludwig Boltzmann, a great fan of Darwin, biological evolution, and the theory of natural selection (amazingly for a physicist, he believed the 19th c. would be remembered as the century of Darwin):

    The general struggle for existence of living beings is therefore not a fight for the elements — the elements of all organisms are available in abundance in air, water, and soil — nor for energy, which is plentiful in the form of heat, unfortunately untransformably, in every body. Rather it is a struggle for entropy that becomes available through the flow of energy from the hot Sun to the cold Earth. To make the fullest use of this energy, the plants spread out the immeasurable areas of their leaves and harness the Sun’s energy by a process that is still unexplored*], before it sinks down to the temperature level of the Earth, to drive chemical syntheses of which one has no inkling as yet in our laboratories**]. The products of this chemical kitchen are the object of the struggle in the animal world.

    Footnotes:

    *] Now no longer unexplored.
    **] We know a thing or two about those processes today. Boltzmann was right — and he wrote this passage with incredible insight long before molecular biology was born, and even before atomic theory became universally accepted.

    If you don’t believe us, perhaps you can believe the guy who invented statistical mechanics.

  89. 89
    keith s says:

    If you deny the compensation argument, you deny the second law.

    You’ve gotta love the irony.

    The IDers here set out to show that OOL and/or evolution run afoul of the second law.

    Instead, they have run afoul of the second law by unknowingly denying it.

    If you deny the compensation argument, you deny the second law.

  90. 90
    Box says:

    Keith, can entropic force be overcome?

  91. 91
    Piotr says:

    Entropy is not a force, Box.

  92. 92
    Hangonasec says:

    KF @84

    H, repeating a mantra does not make it so.

    And so, ironically, in answer you repeat a mantra:

    For over 100 years, 2LOT has been inextricably tied to its statistical underpinnings. That is a fact. It may be inconvenient for your view of things, dependent as it is on statistical miracles, but it is a clear difficulty and obviously one that you want to wish away. KF

    I have nowhere denied the statistical underpinnings of the 2LOT, and they are not remotely inconvenient for energy flows through living systems. What a preposterous notion that they should be! Thermodynamics is thoroughly covered in standard undergrad biochemical texts. This is much more relevant, I would suggest, than naive notions governing ideal gases and diffusion processes – the Maxwell-Boltzmann treatment of particle speeds, for example. Yet interestingly, diffusion equations make their way into population genetics! Suddenly, perhaps, the curious similarity of 2 mathematical treatments becomes an inconvenience.

    The details of the OoL are currently mysterious. It may or may not require probabilistic resources beyond those available. But what is unacceptable, scientifically, is the confusion of the informatic argument with the energetic one. The 2nd Law of Thermodynamics cannot be violated by living systems, even primitive ones. Nor by intelligence, per se. That's why it's a Law, and an appropriate topic for a mantra. Boltzmann himself, a great admirer of Darwin, recognised this, and no physicist of repute has pursued the counter-argument. That you persist in this line, despite numerous pro-ID counsels against using this argument, is, as someone once said, telling.

  93. 93
    keith s says:

    Piotr,

    Entropy is not a force, Box.

    There is a fictional force known as the entropic force, but it’s just a label for the tendency toward increased entropy.

  94. 94
    Box says:

    Piotr: Entropy is not a force

    Piotr and Keith, allow me to google that for you.

  95. 95
    keith s says:

    Box:

    Keith, can entropic force be overcome?

    No. The entropy of the universe is always increasing.

    You can create local decreases of entropy by exporting entropy to the surroundings, but this just displaces the effects of the entropic force. It doesn’t overcome them.

    It’s the compensation argument, which as KF and Eric know but will not admit, is perfectly valid.

    To overcome the entropic force would be to cause the universe’s entropy to decrease, and that is forbidden by the second law.

  96. 96
    keith s says:

    Box,

    Piotr and Keith, allow me to google that for you.

    What’s your point? I confirmed the existence of the entropic force, but explained that it is not an actual force, but merely a label for the tendency toward increased entropy.

  97. 97
    keith s says:

    Which you would know if you had bothered to look at the results that Google returned. Right at the top of the page:

    This fake-force is called an entropic force. In physics, an entropic force acting in a system is a phenomenological force resulting from the entire system’s statistical tendency to increase its entropy, rather than from a particular underlying microscopic force.

    Entropic force – Wikipedia, the free encyclopedia
    en.wikipedia.org/wiki/Entropic_force

  98. 98
    Piotr says:

    Keith S,

    Oh, yes, but it’s easy to overcome. It’s like asking if it’s possible to overcome friction.

  99. 99
    Piotr says:

    Keith S: No. The entropy of the universe is always increasing.

    I may be completely wrong about it, but my impression is that it’s quite customary to refer to “an” entropic force (a local force-like entropy effect, which can be overcome without much difficulty) rather than to “the” entropic force (what would it [seem to] act on?).

  100. 100
    Box says:

    So the tendency toward increased entropy is not a force nor is it caused by something that rightfully can be termed a force?

  101. 101
    keith s says:

    Piotr,

    Oh, yes, but it’s easy to overcome. It’s like asking if it’s possible to overcome friction.

    Friction is a genuine force, so like gravity, it can be overcome by a sufficient force in the opposite direction.

    The tendency toward increased entropy cannot be overcome. When you achieve a local decrease in entropy, you haven’t overcome that tendency — you’ve merely displaced its effects to the surroundings.

  102. 102
    keith s says:

    Box,

    So the tendency toward increased entropy is not a force…

    Correct.

    …nor is it caused by something that rightfully can be termed a force?

    It’s a function of all the forces in nature, operating normally — not a single force.

  103. 103
    keith s says:

    Piotr,

    I may be completely wrong about it, but my impression is that it’s quite customary to refer to “an” entropic force (a local force-like entropy effect, which can be overcome without much difficulty) rather than to “the” entropic force (what would it [seem to] act on?).

    If you define it in those terms, then the local entropic force can be overcome — but only by increasing the entropic force by an equal or larger amount elsewhere.

    To me, it therefore makes sense to say that the entropic force cannot be overcome — which is a direct consequence of the second law.

  104. 104
    Piotr says:

    #101

    Yeah, but we are still speaking of “the” entropic force, which would be merely a synonym for the arrow of entropy. Look, however, at the Wikipedia examples of various “entropic forces” (in the plural): Brownian motion, polymer elasticity, hydrophobic force… Each of them can of course be “overcome”, which in these cases is synonymous with decreasing entropy locally. It goes without saying that there is a cost to pay in the form of a compensatory entropy rise in the surroundings.

  105. 105
    keith s says:

    Piotr,

    But note the asymmetry: a force like gravity can be overcome locally without requiring an increase in gravity elsewhere. The same is not true of the entropic force, local or otherwise.

    To put it another way, the local entropic force really isn’t local at all. There are local manifestations of the global entropic force, but the global entropic force cannot be overcome.

    After all, local systems don’t all tend toward greater entropy — some go in the opposite direction — but the universe as a whole is always subject to the entropic force.

  106. 106
    Piotr says:

    I see what you mean, but I'm not convinced that "the global entropic force" is a very useful concept. What would it appear to act on? The entire Universe? Can anything so abstract really look like an actual physical force? Local "entropic forces" are not really local — no dispute about that — but their observable manifestations look force-like locally.

  107. 107
    DNA_Jock says:

    I think it is hopelessly misleading to think of an “entropic force”, just as it is misleading to equate entropy with “disorder”.
    There is a statistical tendency for macrostates to move towards states that have more numerous microstates associated with them. Now, as soon as you start talking about systems that have more than say 100 atoms and temperatures above a couple of kelvins, then the “tendency” is so strong that you are never going to see a reversal. Humans then impute agency (hey, we are wired to, but that’s a whole other conversation…) and start talking about a “force” that “causes” this behavior.
    Not really. It just is.

  108. 108
    Box says:

    DNA-Jock #107: There is a statistical tendency for macrostates to move towards states that have more numerous microstates associated with them.

    Should this statistical tendency be termed a “force” or a “composite force”, consisting, as Piotr points out, of several specific entropic forces?
    Saying "it just is" doesn't seem like an option to me.

  109. 109
    kairosfocus says:

    H,

    there you go again.

    Let me simply clip to you the final Shapiro-Orgel exchange of mutual ruin on RNA and metab first scenarios, to get a slice of my point across to the onlooker:

    http://iose-gen.blogspot.com/2.....e.html#ool

    [[Shapiro:] RNA's building blocks, nucleotides, contain a sugar, a phosphate and one of four nitrogen-containing bases as sub-subunits. Thus, each RNA nucleotide contains 9 or 10 carbon atoms, numerous nitrogen and oxygen atoms and the phosphate group, all connected in a precise three-dimensional pattern . . . . [[S]ome writers have presumed that all of life's building blocks could be formed with ease in Miller-type experiments and were present in meteorites and other extraterrestrial bodies. This is not the case.

    A careful examination of the results of the analysis of several meteorites led the scientists who conducted the work to a different conclusion: inanimate nature has a bias toward the formation of molecules made of fewer rather than greater numbers of carbon atoms, and thus shows no partiality in favor of creating the building blocks of our kind of life . . . .

    To rescue the RNA-first concept from this otherwise lethal defect, its advocates have created a discipline called prebiotic synthesis. They have attempted to show that RNA and its components can be prepared in their laboratories in a sequence of carefully controlled reactions, normally carried out in water at temperatures observed on Earth . . . .

    Unfortunately, neither chemists nor laboratories were present on the early Earth to produce RNA . . .

    [[Orgel:] If complex cycles analogous to metabolic cycles could have operated on the primitive Earth, before the appearance of enzymes or other informational polymers, many of the obstacles to the construction of a plausible scenario for the origin of life would disappear . . . .

    It must be recognized that assessment of the feasibility of any particular proposed prebiotic cycle must depend on arguments about chemical plausibility, rather than on a decision about logical possibility . . . few would believe that any assembly of minerals on the primitive Earth is likely to have promoted these syntheses in significant yield . . . . Why should one believe that an ensemble of minerals that are capable of catalyzing each of the many steps of [[for instance] the reverse citric acid cycle was present anywhere on the primitive Earth [[8], or that the cycle mysteriously organized itself topographically on a metal sulfide surface [[6]? . . . Theories of the origin of life based on metabolic cycles cannot be justified by the inadequacy of competing theories: they must stand on their own . . . .

    The prebiotic syntheses that have been investigated experimentally almost always lead to the formation of complex mixtures. Proposed polymer replication schemes are unlikely to succeed except with reasonably pure input monomers. No solution of the origin-of-life problem will be possible until the gap between the two kinds of chemistry is closed. Simplification of product mixtures through the self-organization of organic reaction sequences, whether cyclic or not, would help enormously, as would the discovery of very simple replicating polymers. However, solutions offered by supporters of geneticist or metabolist scenarios that are dependent on “if pigs could fly” hypothetical chemistry are unlikely to help.

    Guess why these trends?

    KF

  110. 110
    DNA_Jock says:

    Box:

    Should this statistical tendency be termed a “force” or a “composite force”, consisting, as Piotr points out, of several specific entropic forces?

    No, it should not.
    And I think you are mis-characterizing what Piotr said: he was referring to the various fake entropic forces listed in Wikipedia. Remember that he also wrote "Entropy is not a force, Box." But I will let him address that issue.

    Saying “it just is” doesn’t seem like an option to me

    I will try an analogy. Suppose I draw numbers at random from a distribution. As I draw more and more numbers, the mean of my sample will tend to approach the mean of the underlying distribution.

    It really does not make sense to talk about the Force of Large Numbers and ask whether this is a “force” or a “composite force” that is driving the sample mean towards the population mean. There really is no “force”, it just is.

    BTW, the “analogy” between 2LoT and LLN is more than just an analogy…
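
    A minimal sketch of that sampling picture, with an arbitrary uniform distribution and arbitrary sample sizes chosen purely for illustration:

    import random

    random.seed(1)
    # Uniform(0, 1) draws have population mean 0.5; the running sample mean drifts
    # toward it with no "force" involved, just the statistics of sampling.
    for n in (10, 100, 1_000, 10_000, 100_000):
        draws = [random.random() for _ in range(n)]
        sample_mean = sum(draws) / n
        print(f"n = {n:>6}: sample mean = {sample_mean:.4f}")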

  111. 111
    Piotr says:

    Just to make clear what I think: the term “entropic force(s)” is confusing and I’d be happier without using it. If smoke from something that has got burnt on a stove fills the kitchen, the distribution of smoke particles becomes approximately uniform, so their collective centre of gravity migrates from the general vicinity of the stove to the middle of the smoked-up space. It’s the statistical effect of countless interactions (of various types) at the microscopic level, not a directed force attracting individual particles towards the middle of the kitchen (such a force would in fact locally reduce the entropy of the system instead of maximising it, and some sort of thermodynamic compensation would be inevitable).

  112. 112
    keith s says:

    Piotr,

    I see what you mean, but I’m not convinced that “the global entropic force” is a very useful concept.

    I’m not either! I’m just arguing that if one insists on speaking of an “entropic force” in the first place, as Box does, then it makes more sense to treat it as a global force than as a collection of local forces, for reasons given above. And as a global force, it can never be overcome, because that would violate the second law.

    What would it appear to act on? the entire Universe?

    Yes, or any isolated system within the universe. Just like the second law.

    Can anything so abstract really look like an actual physical force?

    I don’t think so. That’s why I’m not a fan of the term “entropic force”.

    Local “entropic forces” are not really local — no dispute about that — but their observable manifestations look force-like locally.

    That would be true only if you defined the “entropic force” as the sum of all the other forces acting on each particle in the system. Then each particle would move according to the entropic force.

    I think it makes more sense to refer to it as a tendency, especially in conversation with someone like Box who is new to these concepts.

  113. 113
    keith s says:

    Box,

    Should this statistical tendency be termed a “force” or a “composite force”, consisting, as Piotr points out, of several specific entropic forces?

    I think you’ve misunderstood Piotr.

    In any case, thinking of the entropic force as an actual force or “composite force” is going to lead you down the garden path, so I recommend against it.

    The second law is fundamentally a statistical law, which means that it can be broken, given enough time and opportunity (are you familiar with Boltzmann Brains?). It’s just vanishingly unlikely — so unlikely that we can neglect the possibility of ever observing a violation, directly or indirectly, except on the smallest possible scales.
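
    A minimal sketch of how quickly "vanishingly unlikely" sets in; the particle counts are arbitrary illustrative choices:

    import math

    # Probability that N independent gas molecules all happen to sit in the left half
    # of a container at the same instant: (1/2)**N.
    for N in (10, 100, 1_000):
        log10_p = N * math.log10(0.5)
        print(f"N = {N:>5}: P(all in one half) = 10^{log10_p:.1f}")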

  114. 114
    Box says:

    Keith,

    It’s all fine by me. I gather that the second law – as a statistical law – cannot be overcome under materialism. However there is a spiritual realm which organizes matter – thereby overcoming the 2nd law. I hold that this is just what we see around us; as Granville Sewell and others pointed out many times.

    Your insistence that the second law cannot be overcome is simply founded in your assumption of materialism.

  115. 115
    keith s says:

    Box,

    I gather that the second law – as a statistical law – cannot be overcome under materialism. However there is a spiritual realm which organizes matter – thereby overcoming the 2nd law.

    The organization of matter doesn’t violate the second law, so there is no need to invoke a magical “spiritual realm” with the power to overcome the second law.

    I hold that this is just what we see around us; as Granville Sewell and others pointed out many times.

    No. The second law is not violated by what we see around us.

  116. 116
    keith s says:

    Off-topic: Christians and other Abrahamic theists, I have a question for you.

  117. 117
    Box says:

    Keith,
    Well that is the topic under debate, isn’t it? One thing is for sure: materialism faces some tremendous hurdles explaining the kind of organization we see in life.
    Your repeated expressions of blind faith are getting tedious.

  118. 118
    keith s says:

    Box,

    The second law is rigorously defined and has been expressed in mathematical form.

    If you think that we are surrounded by violations of the second law, then show us. Plug in the numbers.

    Your hand-waving is getting tedious.

  119. 119
    DNA_Jock says:

    Not just tedious, but a death-knell for ID.

    It is our universal experience that intelligence cannot produce violations of 2LoT. Box is invoking the immaterial/supernatural to account for the ‘violations’ of 2LoT that he erroneously believes life exemplifies.

    That’s not “an intelligence did it”, it’s “Goddidit”.

    ASSF!

    Best own goals evah!

  120. 120
    Box says:

    DNA_Jock and Keith,

    To be clear: to overcome the 2nd law – by means of an organizational power (e.g. intelligence) – is equal to ‘violating’ the second law?

  121. 121
    DNA_Jock says:

    Box,

    Either the 2LoT is violated, or it is not. (Hint: it is not)
    Your idea that there is some need for life to “overcome” the 2LoT is utterly wrong.
    And even if you were right, it is our universal experience that intelligence cannot “overcome” nor violate 2LoT. Show me a way that intelligence can “overcome” the 2LoT, and I’ll show you a perpetual motion machine. There is NOTHING about the intelligent design and manufacture of computers, TV sets, fishing reels etc. etc. that involves ‘overcoming’ 2LoT.
    You don’t understand 2LoT.

  122. 122
    Silver Asiatic says:

    KeithS @ 116

    “Know” in one case means “experience”. In another case, it means intellectually understand.

    They didn’t “know” in one case. But they “knew” in the other case.

  123. 123
    kairosfocus says:

    KS,

    red herring, led away to a strawman distortion.

    The key point is that 2LOT is based on the statistical behaviour of microparticles, which inter alia brings out why heat naturally goes from warmer to cooler bodies. The key mathematical formulation by Clausius was based on the empirical fact of this direction of flow and a quantity that would net increase when heat d'q flowed A –> B because of that; where we saw ds ≥ d'q/T. As A is hotter than B, the loss -d'q/Ta will be of smaller magnitude than the gain +d'q/Tb, yielding a net gain in ds, the famous rise in entropy.

    Of course, in an ideal context of an isolated system.
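
    A minimal numeric sketch of that two-reservoir bookkeeping, with the temperatures and heat quantity below chosen purely for illustration:

    T_a = 400.0   # K, hotter body A (illustrative value)
    T_b = 300.0   # K, cooler body B (illustrative value)
    dq = 100.0    # J of heat passing spontaneously from A to B

    dS_a = -dq / T_a          # entropy lost by A
    dS_b = +dq / T_b          # entropy gained by B
    dS_net = dS_a + dS_b      # net change for the isolated pair

    # Because T_a > T_b, dq/T_a is smaller in magnitude than dq/T_b, so the net change is positive.
    print(f"dS_A = {dS_a:+.4f} J/K, dS_B = {dS_b:+.4f} J/K, net = {dS_net:+.4f} J/K")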

    Where, the analysis implicitly treated the observed cosmos as effectively an isolated system. Sears and Salinger aptly but subtly mention in their classic text that this is a matter open to worldview-level considerations.

    As the roots of that and other similar behaviour were explored, the root turned out to be an exploration of the statistical behaviour of systems of particles where various distributions of mass and energy at micro level are possible; consistent with given macro-conditions such as pressure, temperature, etc. that define the macrostate of the system. The second law turns out to be a consequence of the strong tendency of systems to drift towards statistically dominant clusters of microstates.

    As a simple example, used by Yavorsky and Pinsky in their Physics (MIR, Moscow, USSR, 1974, Vol. I, pp. 279 ff.), a text of approximately A-Level or first-college standard, and discussed in my always-linked note as being particularly clear, which effectively models a diffusion situation, we may consider

    . . . a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various different possible arrangements at random, and spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random, towards the more thermodynamically probable macrostates, i.e. the ones with higher statistical weights. So "[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state." [p. 284.] This is in effect the statistical form of the 2nd law of thermodynamics.

    Thus, too, the behaviour of the Clausius isolated system above is readily understood: importing d'Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B, that the resulting rise in B's entropy swamps the fall in A's entropy. Moreover, given that FSCI-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe.

    (Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e. W], we instead move to entropy, through s = k ln W. Part of how this is done can be seen by imagining a system in which there are W ways accessible, and imagining a partition into parts 1 and 2. W = W1*W2, as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa, but it is far more convenient to have an additive measure, i.e. we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant and is in effect the universal gas constant, R, on a per-molecule basis, i.e. we divide R by the Avogadro Number, NA, to get: k = R/NA. The two approaches to entropy, by Clausius and Boltzmann, of course correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate that the classical observation that entropy naturally tends to increase is readily apparent.)
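
    The statistical weights quoted in that passage are easy to reproduce; the short sketch below simply counts the arrangements for each split of the white balls between the two rows and converts each weight into an entropy via s = k ln W:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K

    # 10 white and 10 black balls distributed over two rows of 10 positions each.
    for whites_on_top in range(11):
        # choose which top-row slots hold white balls, and which bottom-row slots do
        W = math.comb(10, whites_on_top) * math.comb(10, 10 - whites_on_top)
        S = k_B * math.log(W)          # Boltzmann entropy s = k ln W for this macrostate
        print(f"{whites_on_top:>2} white on top: W = {W:>6}, s = {S:.3e} J/K")

    The 5-5 split comes out at 63,504 arrangements and the 6-4 splits at 44,100 each, matching the figures above.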

    [BTW, as a real-world note: my son was just on national radio giving part of the family response to the award of National Hero to my late father-in-law, his grandfather. His cousin gave the other half.]

    A closely parallel first example, by L K Nash, ponders the likely outcome of 1,000 coins tossed at random, per the binomial distribution. This turns out to be a sharply peaked bell curve centred on 50-50 H-T, as has been discussed. The dominant cluster will be just this, with the coins in no particular sequence.

    But if instead we were to see all H, or all T, or alternating H and T, or the first 143 characters of this comment in ASCII code, we can be assured that, of the 1.07*10^301 possibilities, such highly specific, "simply describable" sets of outcomes are maximally unlikely to come about by blind chance or mechanical necessity, but are readily explained on design. That sort of pattern is a case of complex specified information, and in the case of the ASCII code, of functionally specific complex organisation and associated information, FSCO/I; particularly, digitally coded functionally specific information, dFSCI.

    This example draws out the basis of the design inference on FSCO/I, as the observed cosmos of 10^80 atoms or so, each having a tray of 1,000 coins flipped and observed every 10^-14 s, will in a reasonable lifespan to date of 10^17 s look at 10^111 possibilities. That is an upper limit on the number of chemical-reaction-speed, atomic-scale events in the observed cosmos to date. A large number, but one utterly dwarfed by the 10^301 or so possibilities. Reducing the former to the size of a hypothetical straw, the cubical haystack it would be pulled from would reduce our observed cosmos to a small blob by comparison.
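
    The orders of magnitude in the last two paragraphs can be checked directly; the sketch below simply takes the quoted figures (10^80 atoms, one observation per 10^-14 s, a 10^17 s timespan) as given:

    import math

    configs = 2 ** 1000                 # ~1.07e301 possible H/T sequences for 1,000 coins
    peak = math.comb(1000, 500)         # microstates in the 500-H, 500-T macrostate

    atoms = 1e80                        # quoted atom count of the observed cosmos
    rate = 1e14                         # one observation per atom per 10^-14 s
    age = 1e17                          # quoted timespan, in seconds
    events = atoms * rate * age         # upper bound on atomic-scale events, ~1e111

    print(f"Total configurations:   ~{configs:.2e}")
    print(f"500H-500T microstates:  ~{peak:.2e}")
    print(f"Event upper bound:      ~{events:.0e}")
    print(f"Fraction examinable:    ~{events / configs:.1e}")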

    That is, any reasonably isolated and special, definable cluster of possible configs, will be maximally unlikely to be found by such a blind search. Far too much haystack, too few needles, no effective scale of search appreciably different from no search.

    On the scope of events we can observe, then, we can only reasonably expect to see cases from the overwhelming bulk.

    This, with further development, is the core statistical underpinning of 2LOT.

    And, as Prof. Sewell pointed out, the statistical challenge does not go away when you open a system of appreciable size up to generic, non-functionally-specific mass or energy inflows . . . and a system whose state can be specified by 1,000 bits of info is small indeed (yes, a coin is a 1-bit info-storing register) . . . the statistically miraculous will still be beyond plausibility unless something in particular is happening that makes it much more plausible. Something like organised, forced motion that sets up special configs.

    In short, we cannot properly expect

    a: molecular noise or general statistical and chemical behaviour in a Darwinian warm, salty, lightning-struck pond or other typical proposed pre-life setting to

    b: spontaneously and cumulatively do the massive quantity of functionally specific and complex configuration work — forced, ordered motion at micro or macro levels — that

    c: is required to get us anywhere serious along the road to self-replicating, metabolising, coded-info-using, cell-based life.

    d: No more than we can reasonably expect to compose this post by flipping coins.

    In short, the proposed OOL frameworks expect forces overwhelmingly of diffusion, disorganisation etc to spontaneously carry out a cascade of statistical miracles and create FSCO/I.

    Where 2LOT, in light of the statistical underpinnings, in effect says such statistical miracles are unobservable, due to the relative weights of clusters of microstates consistent with given macrostates. Or, as G N Lewis et al. would put it, the entropy of a system is effectively the average missing info needed to specify its microstate (the range of micro-level freedom of distribution of micro-level mass and energy) given the macrostate conditions. The spontaneous direction of change . . . say we start with the coins in a specific ordered arrangement, 500 H followed by 500 T . . . will be away from low-uncertainty microstate clusters towards high-uncertainty ones; the famous time's-arrow description of entropy. In the long run, spontaneous changes will settle the states in dominant clusters, an equilibrium.

    Consistent with this, we can constrain systems to be far from that equilibrium by imposing a pattern of forced, ordered motion.

    But, we will pay a price.

    Work comes from converting energy sources into useful ordered motion, often by way of shaft work as seen in the OP. High quality energy is extracted, flows through a working system, which may execute a programmed series of motions that create desired configs, then degraded waste energy must be rejected, often as heat.

    The net entropy rise for such a process will exceed the reduction of entropy occasioned by the constructive work as described. That comes from Szilard's analysis of Maxwell's Demon, a thought-experiment intelligence carrying out such organising work.
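
    A minimal sketch of that Szilard/Landauer bookkeeping, with the particle count and temperature chosen purely for illustration:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    N = 1e20             # assumed number of molecules the demon sorts into one half of a box
    T = 300.0            # K, assumed temperature of the surroundings

    dS_system = -N * k_B * math.log(2)    # entropy drop: each molecule's accessible volume is halved
    Q_min = N * k_B * T * math.log(2)     # Landauer bound: minimum heat rejected for handling N bits
    dS_surroundings = Q_min / T           # entropy exported to the surroundings

    print(f"System entropy change:       {dS_system:+.3e} J/K")
    print(f"Minimum heat rejected:       {Q_min:.3e} J")
    print(f"Surroundings entropy change: {dS_surroundings:+.3e} J/K (so the net change is >= 0)")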

    The problem with the spontaneous OOL stories we are being told is that they want to get organisation for free, through statistical miracles. That is, Darwin's warm pond — which is NOT coupled to an effective and informed energy-work device — is in effect a perpetual motion device of the second kind. This is covered up by suggesting that somehow energy and materials flux through the pond (or like environment) can be used to compensate. That won't work for a conventional perpetuum mobile of the second kind, and it won't work for this.

    Hence my cry to call in the energy auditors to assess relevant energy-work flows.

    Absent such, we do not have a credible explanation.

    Something that could do it would be an advanced molecular nanotech lab, well beyond Venter et al. Maybe with cryogenic assemblers using atom-force manipulators that go well beyond what IBM did with the famous 35 Xe atoms in the pic shown in the OP. That should help you see why that hit me like a thunderbolt.

    In short, the energy auditor’s view allows us to see how relevant the statistical understanding of 2LOT is, to evaluating the credibility of the spontaneous OOL scenarios proposed and the required FSCO/I.

    When it comes to body plan origin, much the same obtains for the FSCO/I in novel proteins and regulatory networks, noting the deep isolation of complex proteins in AA sequence space. As an index, a first cell may have had a genome of 100 – 1,000 kbases; body plans plausibly require 10 – 100+ million bases.

    Nor does the emphasised “natural selection” adequately account for the source of organisation and information. This is because the “selection” is actually based on culling out unsuccessful varieties and sub populations. Information SUBTRACTION not addition. The remaining factor has to be the info source: chance variation of various types. Which then faces blind needle in haystack searches that dwarf the 1,000 coin case.

    The only thing that can overwhelm such is actual empirical observation of causal factors adequate to the effects, factors tracing to blind chance and mechanical necessity.

    Such are not forthcoming, and statistical miracles based on lucky molecular noise are not a credible source.

    FSCO/I is readily produced by intelligently directed configuration.

    The world of life — with aid of the inextricable statistical underpinnings of 2LOT — points strongly to design as material cause.

    The ghost of Wallace is laughing.

    KF

  124. 124
    keith s says:

    KF:

    KS,

    red herring, led away to a strawman distortion.

    No, it’s a valid question regarding the topic of this thread that you are afraid to answer.

    For obvious reasons.

    ETA: Here’s the question again:

    How is photosynthesis possible if the compensation argument is invalid?

  125.
    keith s says:

    Silver Asiatic #122,

    Could you post your reply over at TSZ? I don’t want to derail this thread.

  126.
    kairosfocus says:

    KS, I have taken time I should not have spared to address a raft of serious misconceptions and strawman objections that you and other objectors have raised again and again, resisting and dismissing correction. Do you think that I am now in a mood to go off on a red herring chase that leads predictably to more strawman caricatures soaked in ad hominems to be set alight to further poison, cloud and polarise the atmosphere? Do you think that I was born yesterday and do not understand that you have already been hinting at personal attacks, in the teeth of reasonable responses already given on the material question, the 500 – 1,000 bit threshold? As of now, you come across as an indoctrinated ideologue who is not serious about substantial matters, only intent on the rhetoric of polarisation and distraction. I challenge you to show us some semblance of a sign that you have seriously, fairly and with understanding interacted with our concerns, and that you understand why I in particular have pointed to the statistical underpinnings of 2LOT and then raised the issue of statistical miracles, saying that Prof. Sewell has a serious point. KF

  127.
    keith s says:

    kairosfocus,

    The onlookers are aware that

    a) my question is on-topic, and

    b) you are deathly afraid of answering it.

    If you have no confidence in your position, why should they? What a pitiful performance.

  128.
    sparc says:

    FSCO/I is readily produced by intelligently directed configuration.

    Give me a break. You have been advertising FSCO/I since the days when DaveScot was the blogczar here. Why then has FSCO/I not been adopted or even mentioned by William Dembski? Why does he never show up to help you when you feel attacked (BTW, criticising your ideas is not the same as burning you at the stake)?

  129.
    niwrad says:

    Box

    To be clear: is overcoming the 2nd law – by means of an organizational power (e.g. intelligence) – the same as 'violating' the second law?

    Obviously NO, otherwise technology/engineering would be impossible.

  130.
    Piotr says:

    Intelligence doesn't overcome/violate the 2nd LOT. The emergence of structure, order, complexity, organisation — call it what you will — can only happen in a way that satisfies the 2nd law. In macroscopic systems, entropy can be generated and it can be exported (by dispersing waste energy), creating a local "deficit", but the total entropy of the system plus its environment can't be decreased. Intelligence can't destroy it either.

  131.
    kairosfocus says:

    Sparc:

    FSCO/I is a description rooted directly in Orgel and Wicken. If you have a problem with understanding its reality and relevance in light of strawman caricatures, I suggest a glance here:

    http://www.uncommondescent.com.....-relevant/

    Until you are willing to acknowledge that there is a relevant reality to be addressed, you are operating in denial.

    As for the objection on "adopted," WmAD has focussed on an abstract superset, specified complexity, but quite explicitly defines it in NFL, pp. 144 and 148, as follows:

    p. 148: “The great myth of contemporary evolutionary biology is that the information needed to explain complex biological structures can be purchased without intelligence. My aim throughout this book is to dispel that myth . . . . Eigen and his colleagues must have something else in mind besides information simpliciter when they describe the origin of information as the central problem of biology.

    I submit that what they have in mind is specified complexity [[cf. here below], or what equivalently we have been calling in this Chapter Complex Specified information or CSI . . . .

    Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. . . . In virtue of their function [[a living organism's subsystems] embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the sense required by the complexity-specificity criterion . . . the specification can be cashed out in any number of ways [through observing the requisites of functional organisation within the cell, or in organs and tissues or at the level of the organism as a whole, WmAD citing Wouters, Behe and Dawkins as he hits the end of 148 and goes on to 149. Wouters: "globally in terms of the viability of whole organisms," Behe: "minimal function of biochemical systems," Dawkins writes: "Complicated things have some quality, specifiable in advance, that is highly unlikely to have been acquired by random chance alone. In the case of living things, the quality that is specified in advance is . . . the ability to propagate genes in reproduction." On p. 149, he cites Orgel's famous remark from 1973 and highlights Paul Davies in The Fifth Miracle: "Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity."] . . ."

    p. 144: [[Specified complexity can be defined:] ". . . since a universal probability bound of 1 [[chance] in 10^150 corresponds to a universal complexity bound of 500 bits of information, [[the cluster] (T, E) constitutes CSI because T [[ effectively the target hot zone in the field of possibilities] subsumes E [[ effectively the observed event from that field], T is detachable from E, and T measures at least 500 bits of information . . . " [If you doubt me, do what I just did and crack open your copy and simply read it for yourself. Or else, maybe, Google Books will be kind.]

    Where, in replying to Falk’s critical review of Signature in the Cell, Meyer remarks in relevant part:

    The central argument of my book is that intelligent design—the activity of a conscious and rational deliberative agent—best explains the origin of the information necessary to produce the first living cell. I argue this because of two things that we know from our uniform and repeated experience, which following Charles Darwin I take to be the basis of all scientific reasoning about the past. First, intelligent agents have demonstrated the capacity to produce large amounts of functionally specified information (especially in a digital form). Second, no undirected chemical process has demonstrated this power. Hence, intelligent design provides the best—most causally adequate—explanation for the origin of the information necessary to produce the first life from simpler non-living chemicals. In other words, intelligent design is the only explanation that cites a cause known to have the capacity to produce the key effect in question . . . . In order to [[scientifically refute this inductive conclusion] Falk would need to show that some undirected material cause has [[empirically] demonstrated the power to produce functional biological information apart from the guidance or activity of a designing mind. Neither Falk, nor anyone working in origin-of-life biology, has succeeded in doing this . . . .

    In short, FSCO/I is a concept that was long since recognised and acknowledged.

    It is high time that this particular strawman was taken down from the gibbet and given decent burial.

    Piotr:

    With a slight augmentation, I agree 100%:

    The emergence of structure, order, complexity, organisation [beyond a reasonable blind search threshold, typ. 500 – 1,000 bits] — call it what you will — can only happen in a way that satisfies the 2nd law.

    Yes, and as my discussion last evening at 123 outlined, that is inextricably connected to the microscopic statistical behaviour of systems of micro-particles such as atoms, ions, molecules . . . and even embedded small pollen grains . . . etc. (Analysis of Brownian motion, pointing to the reality of the atomic-molecular picture, contributed significantly to Einstein's Nobel Prize.)

    When we look at that statistical picture, contrasting what we can reasonably expect given the atomic and temporal resources of the solar system or the observed cosmos, we can extend L. K. Nash's example of a string of 1,000 coins. Start them at, say, 500 H in a row followed by 500 T, to bring in the idea of moving from order to what we can reasonably call disorder. Cumulative random changes then strongly tend to carry the string towards the bulk cluster of near 50-50 H/T configurations, with the coins in no particular order, and to keep it in that neighbourhood thereafter.

    Where, even in the near 50-50 zone, we can independently or detachably identify certain special, specific, simply describable "target zone" patterns in ways that do not boil down to quoting them, such as the first 143 ASCII characters of this post, or just any 143 characters of recognisable, coherent English. Given the gap between blind search resources and the scope of the space of possibilities, these zones (islands of function) are not plausibly hit upon by any process driven by blind chance and/or mechanical necessity. Such a statistical miracle would be strictly logically and physically possible, but it is so remote that it is observationally implausible in the extreme. If we were to chance on such a string, we would to moral certainty conclude at once that it had been set to that configuration by intelligently directed configuration. That, in a nutshell, is why FSCO/I is an empirically and analytically strong sign of design as cause.
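    As a quick numerical check of the scale of that gap (my own arithmetic, in Python, using only the 143-character ASCII example and the 500-bit universal bound already cited):

        import math

        ALPHABET = 128                             # 7-bit ASCII symbols
        LENGTH = 143                               # illustrative target-string length

        bits = LENGTH * math.log2(ALPHABET)        # information to specify one such string
        decades = LENGTH * math.log10(ALPHABET)    # size of the space as a power of ten

        print(f"143-character space: ~10^{decades:.0f} possibilities (~{bits:.0f} bits)")
        print(f"universal bound:     10^150, i.e. ~{150 * math.log2(10):.0f} bits")

    Even this short target takes roughly twice the 500-bit bound to specify; its space of possibilities is on the order of the square of the 10^150 figure.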

    Of course, since a functionally specific nodes-and-arcs network is amenable to description via a structured sequence of yes/no questions (cf. AutoCAD file representations and the like), carrying out the discussion in terms of bit strings of sufficient length and complexity (Kolmogorov complexity fits in here) is without loss of generality; the toy sketch just below illustrates the reduction.
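    Here is that toy sketch (the four part names and the arcs are hypothetical, chosen purely for illustration):

        nodes = ["pump", "valve", "tank", "controller"]   # hypothetical parts list
        arcs = {("controller", "pump"), ("pump", "valve"), ("valve", "tank")}

        answers = []
        for src in nodes:
            for dst in nodes:
                if src != dst:
                    # structured yes/no question: "is there an arc from src to dst?"
                    answers.append(1 if (src, dst) in arcs else 0)

        print("".join(map(str, answers)))                 # the diagram as a bit string
        print(len(answers), "yes/no answers pin down this 4-node wiring diagram")

    Larger, functionally constrained diagrams simply yield longer strings of answers, which is the sense in which the bit-string treatment loses no generality.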

    In the molecular world, the issue is that requisites of biofunction point to astonishing FSCO/I such as in the protein synthesis subsystem and associated code bearing strings in D/RNA.

    The notion, or implication, that molecular noise and uncontrolled chemical kinetics can plausibly carry out the massive quantity of constructive work needed to create such is akin to starting with a Nash string of coins in the bulk-cluster configurations and then expecting the coins, by cumulative random changes, to chance upon a functionally specific coded description. Where the relevant string will be a lot longer than 1,000 bits.

    This is expecting noise to carry out large quantities of constructive work blindly and for free, provided only that there is a compensating energy and/or mass flow somewhere else.

    The implications of the statistics described (recall, underpinning to 2LOT) are that this is unobservable.

    For FSCO/I to emerge in accordance with what we do know and readily observe, we need to generate considerable quantities of shaft work, then direct such through a credible, relevant, information-rich constructive process connected to the sort of structured y/n bit string description or Wicken wiring diagram nodes-and-arcs pattern that I have so often discussed. To be plausible as a cause of such FSCO/I in light of the statistics, this counterflow constructive work requires energy converters and capable connected construction devices guided by a programme of action steps that carry out the configuring work.

    Relevant energy and mass flows, with a paper trail connecting them to the construction work, in short. Hence my speaking of energy audit police.

    Lucky noise is not a plausible explanation of FSCO/I, and energy/mass flows that lack a relevant paper trail credibly connecting them to the required construction work are not a reasonable candidate either.

    As I have repeatedly pointed out, Venter et al are taking first steps down that road. The 35-atom IBM picture from 1989 is also relevant. And within a century I think we will get to an Industrial Revolution 3.0, with nano-assemblers and clanking-scale self-replicating constructors, intelligently programmed, that can support industrial transformation and solar-system colonisation.

    But to get there we will need to get out of the bad mental habit of imagining that lucky molecular noise and/or irrelevant energy and mass flows will do complex constructive work for free by happy chance working statistical miracles.

    And in terms of the statistical underpinnings, outlined above, that 2LOT boils down to, such statistical miracles are unobservable.

    KF

  132.
    Piotr says:

    KF:

    [beyond a reasonable blind search threshold, typ. 500 – 1,000 bits]

    Why? Who says a structure has to be established in one fell swoop, tornado-in-the-junkyard style? Natural processes don’t start from scratch but build on what already exists.

    Let's imagine a constant-size population of replicators in which every new generation consists of individuals 10 times "more specified" (10 times less probable) than members of the parent generation. Let's further assume that the generation cycle is one year. The resampling of the population reduces its entropy from generation to generation (by eliminating most of the microstates). Is the actual flux of entropy on Earth sufficient to allow for such a decrease? If it is, it means that super-hyper-über-astronomically improbable states (dwarfing your threshold by any order of magnitude you know how to express) can be reached by stepwise evolution, and the probability/specificity argument is bogus. So can you offer an estimate?

  133.
    kairosfocus says:

    Piotr, there is a reasonable threshold for function of the relevant kind, which includes metabolism [to process energy and materials], encapsulation [to protect the internal environment], gating of same [for energy and materials to move in and out], and an integrated von Neumann self-replicating (vNSR) facility involving codes, algorithms, processing etc. Without such, we are not addressing relevant cell-based life. 125 bytes of information for that much required specific function is a very generous threshold. Credible genome sizes alone run 100 – 1,000 kbases. But already at 1,000 bits, the blind search capacity of the observed cosmos is swamped, as I showed by discussing the statistical underpinnings of 2LOT. If you object, kindly provide observed (not "imagine[d]") cases of a different architecture of life that arises spontaneously in realistic abiotic environments and meets the above requisites; where such also shows the onward transformation to the cell architecture we know. There are paper speculations and a whole world of highly designed and yet far-short-of-fit-for-purpose lab exercises, but so far nothing that fills the bill. FSCO/I comes in islands of function in large config spaces, driven by the need for specifically organised functional interaction. KF
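    A back-of-envelope sketch of that swamping claim follows; the round figures for atoms, cosmic age and event rate are order-of-magnitude assumptions commonly used for illustration, not measured values from this thread:

        import math

        atoms   = 1e80    # assumed round figure: atoms in the observed cosmos
        seconds = 1e17    # assumed round figure: age of the cosmos in seconds
        rate    = 1e14    # generous assumption: fast chemical-scale events per atom per second

        events = atoms * seconds * rate
        print(f"generous blind-search event count: ~10^{math.log10(events):.1f}")
        print(f"500-bit configuration space:       ~10^{500 * math.log10(2):.1f}")
        print(f"1,000-bit configuration space:     ~10^{1000 * math.log10(2):.1f}")

    On those admittedly rough numbers, the 1,000-bit configuration space exceeds the generous event count by a factor of roughly 10^190.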

  134.
    Piotr says:

    #133 KF,

    No OOL scenario known to me assumes that cells and all their components were magically poofed into existence, assembling themselves from simple compounds. That would be creation, not prebiotic (proto)evolution. Low probability is a pseudoproblem resulting from the dubious assumption that there is a unique target (how do you know that there is only one kind of life possible?) and estimating its probability after the fact. It might be an insuperable problem for a “designer” who first draws a plan and then tries to manipulate all those wiggly, slippery and unstable little molecules into the right places, states and patterns. It doesn’t worry me, as I’m not a creationist and don’t have to explain the probability of creation. But let’s assume, for the sake of the argument, that Yahweh, Manitou or some other mystery designer did poof some prokaryotes into existence, with all the cellular stuff in the right places. Can you answer my question now? Is the flux of entropy available to life on Earth sufficient to support the decrease of entropy (and the increase of “specificity”) such as described in #130?

  135.
    kairosfocus says:

    Piotr, even just the creation of one functional protein of typical length 300 AAs, or of the D/RNA with the code for it — never mind the needed execution machinery — is beyond the 500 – 1,000 bit threshold. Besides, I particularly spoke to needing empirical — not imagined — demonstration of the origin of other suggested architectures forming in realistic environments by blind chance and necessity, then onward migration to the cell architecture we see all around us. Your caricaturing of this careful distinction as "magically poofed" is a clear candidate to be a strawman fallacy. I trust the similar distortions regarding my pointing out that for over 100 years 2LOT has been inextricably linked to statistical foundations have been accepted as a misrepresentation of my concerns. Merely abandoning a line of attack is not good enough, given the sort of ad hominem inferences and suggestions above and elsewhere. KF

  136.
    DNA_Jock says:

    Yet every time you do one of these FIASCO calculations [–> schoolyard taunt by twisting names to try to create a demeaning insult. In reality Functionally Specific, Complex Organisation and associated Information is real, is relevant and is understood to be so by the informed. Thanks for letting us see the fundamental irrationality in your behaviour. KF], you assume that the identity of each amino acid in the protein is independent of the identity of the other AAs [–> in reality, chaining chemistry would allow any given AA to follow any other, and recall you are here starting in Darwin's pond or the like . . . which raises further issues of interfering cross-reactions, chirality and breakdown by forces in the environment. Also, protein fold domains are known to be deeply isolated in AA sequence space, so much so that chance-based random walks will face a major search-resource challenge to bridge the gaps, if you move on to existing life forms.]. You might (on one of your good days) admit that this is inaccurate, but you always assert that the error is "immaterial". [–> this remark is basically just made up out of whole cloth. If you mean that in protein families, per observation across the range of observed forms, there will be zones of high variability for a given protein and zones of very low variability, giving rise to the sort of Durston calculation, this has nothing to do with Darwin's pond, the context for this discussion. In that latter case of existing life forms, the bottom line is that if you reduce the specificity of protein AAs to something as loose as hydrophobic/hydrophilic, which is linked to folding but is over-generous, then you can use a very generous one bit per AA estimate of information content and, looking at hundreds of proteins of on average 100 AAs (again over-generous), see that for a life form we would be looking at 10,000+ bits of information. Still well beyond the threshold that swamps the atomic resources of the observed cosmos.]

    When asked to justify this latter assertion, you are reduced to "because the numbers are really big". Any engineer who makes an approximation needs to demonstrate that the approximation is fit-for-purpose, not merely assert it. [–> again, a dismissal of a major search-resources challenge to finding FSCO/I by way of attacking the man. If you are so confident that FSCO/I can readily be produced by blind chance and mechanical necessity, just show it empirically. That would cause the whole design theory movement to collapse. KF]

    A calculation that you have yet to perform.

    Also in the set of calculations that you have yet to perform: any comparison of the "informational" and "thermal" contributions to entropy. How can we discuss the limits of "compensation" unless you are willing to do this calculation? [–> again, a red herring led away to a strawman. Compensation needs to be energetically and work-wise relevant to the formation of the structures of life. The blanket assertion of irrelevant compensation flows is little more than a distraction from the statistical miracles that are implied to be producing FSCO/I in cell-based life in the OOL situation. Further to this, the underlying problem is that there is a dichotomising of what entropy is that misses the core point. As pointed out, there is a whole school of thought out there that has brought forth good reason to see that entropy can be understood in terms of the average missing information to specify the microstate given the macrostate; e.g. the reduced Gibbs entropy is equal to the minimum number of structured yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate; which can be converted from an info-probability type metric to an energy value with the appropriate conversion factor. Jaynes' point is apt: ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." Robertson is also apt: ". . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . . " Echoes of the Maxwell demon are there.]

    Would any of the other IDists care to help you out? I think Sal Cordova is numerate enough… [–> The above is sufficient to highlight the conceptual gap involved, which will not recognise that, though the entropy values directly tied to creating such configurations would patently be very small relative to those connected to things like the latent heat of fusion of ice, they are nonetheless real and pose significant informational and statistical implausibility barriers to the hoped-for creation of FSCO/I relevant to the creation of life forms, or key steps towards that, through molecular noise, diffusion and/or the like. Just 1,000 bits of FSCO/I is plausibly an insurmountable value. To use the Nash coins and infer a space of possibilities, the s = k ln W value for the coins, taking the space as a whole, would be 9.28*10^-21 J/K, very low in comparison with, say, the entropy change on fusion of a gram of water, where we would require 333.5 J at 273.16 K, yielding a "simple" dq/T value of dS = 1.22 J/K. But that does NOT mean the significance of that sort of scale can be dismissed, because there is an informational requisite and a search challenge that apply. Nor can the implications of the micro-level statistical issues be dodged by appeal to such numbers. For, melting a gram of ice next to the coins (or the equivalent, say an RNA sequence) spelling out the 1st 143 characters of this post will be utterly irrelevant. That is part of why I refuse the red herring chase; the side issue is beside the material point. KF]
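    For readers who wish to re-run that comparison, a short check (my own, using the current SI value of Boltzmann's constant, on which k ln(2^1000) comes out nearer to 9.6*10^-21 J/K; the orders of magnitude, which carry the point, are unaffected):

        import math

        k_B = 1.380649e-23                       # Boltzmann's constant, J/K
        s_coins = k_B * 1000 * math.log(2)       # k ln W for W = 2^1000 coin microstates
        dS_ice = 333.5 / 273.16                  # dq/T for melting 1 g of ice, J/K

        print(f"1,000-coin space:   k ln(2^1000) ~ {s_coins:.2e} J/K")
        print(f"melting 1 g of ice: 333.5/273.16 ~ {dS_ice:.2f} J/K")
        print(f"ratio (ice/coins):  ~{dS_ice / s_coins:.1e}")

    The thermal entropy change of the melting ice is some twenty orders of magnitude larger than the whole coin-space figure, yet, as argued in the bracketed note above, it is irrelevant to hitting a functionally specific configuration of the coins.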

  137.
    kairosfocus says:

    DNA_Jock, I warned. You insist on descending to schoolyard taunts, revealing much about your attitude. Commenting is a privilege, not a right, and it is conditional on reasonable, civil behaviour. KF

Comments are closed.