Uncommon Descent: Serving the Intelligent Design Community

Should ID supporters argue in terms of thermodynamics or information or [“basic . . . “] probability?


In the still-active discussion thread on the failure of compensation arguments, long-term maverick ID supporter (and, I think, still YEC-sympathetic) SalC comments:

SalC, 570: . . . I’ve argued against using information-theory-type arguments in defense of ID; it adds way too much confusion. Basic probability will do the job, and basic probability is clear and unassailable.

The multiplicities of interest to ID proponents don’t vary with temperature, whereas the multiplicities from a thermodynamic perspective change with temperature. I find that very problematic for invoking 2LOT in defense of ID.

Algorithmically controlled metabolisms (such as realized in life) are low multiplicity constructs as a matter of principle. They are high in information content. But why add more jargon and terminology?

Most people understand “complex, computer-like machines such as living creatures are far from the expected outcome of random processes”. This is a subtle assertion of LLN [–> the Law of Large Numbers in statistics]. This is a different way of posing the Humpty Dumpty problem.

There are an infinite number of ways to make lock-and-key or login/password systems, but just because there are an infinite number of ways to do this does not make them highly probable from random processes . . . . Why invoke 2LOT? Why invoke fancy terms that add confusion? One deliberately uses unclear arguments when one wishes to obfuscate, but why be unclear when the evidence is on one’s side? . . . .

LLN and variations of Jonathan Wells’ Humpty Dumpty argument have, I think, proven themselves in the arena of debate. Maybe even more tersely, extrapolating Pasteur: “life does not come from non-life.”

In response: while SalC has a point where a basic, non-controversial exchange is in hand, we usually do not have that luxury and must address the technical issues to some significant extent. When we do so, all of the above factors come out at once, as differing facets of the same issue.

Facets that inextricably interact as parts of a whole.

Analogous to, say, the hearts-and-arrows effect in suitably, very specifically cut and polished diamonds:

[Image: hearts-and-arrows diamond viewer]

In short, once we begin to probe beneath the surface, design thought shows itself to be . . . irreducibly complex.

Accordingly, I responded as follows at 592, which I think I should headline and augment:

_______________

KF, 592: >>Pardon, but I have a different take: on years of observing, any serious design argument will be twisted into pretzels, strawmannised, confused, clouded and generally distorted and dismissed by the sort of ruthlessly determined and too often amoral or outright nihilistic, truth- and fairness-disregarding objectors we frequently face.

This is because, too many such are “any means necessary”/”ends justify means” committed ideologues full of agit-prop talking points and agendas.

That’s exactly how the trained, indoctrinated Marxist agitators of my youth operated. Benumbed in conscience, insensitive to truth, fundamentally rage-blinded [even when charming], secure in their notion that they were the vanguard of the future/progress, and that they were championing pure victims of the oppressor classes who deserved anything they got.

(Just to illustrate the attitude, I remember one who falsely accused me of theft of an item of equipment kept in my lab. Once I understood the situation, I promptly had it signed over to the Student Union, then went to her office and confronted her with the sign-off. “How can you be so thin-skinned?” was her only response; taking full advantage of the rule that men must restrain themselves in dealing with women, however outrageous the latter, and of course seeking to wound further. Ironically, this champion of the working classes was from a much higher class-origin than I was . . . actually, unsurprisingly. To see the parallels, notice how often not only objectors who come here but the major materialist agit-prop organisations insinuate — without good grounds — calculated dishonesty on our part, and incompetence so utter that we should not have been able to complete a degree.)

I suggest, first, that the pivot of design discussions on the world of life is functionally specific, complex interactive Wicken wiring-diagram organisation of parts that achieves a whole performance based on particular arrangement and coupling, and associated information. Information that is sometimes explicit (R/DNA codes), or that sometimes may be drawn out by using structured Y/N questions that describe the wiring pattern needed to achieve function.

FSCO/I, for short.

{Aug. 1:} Back to reels, to show the basic “please face and acknowledge facts” reality of FSCO/I; here, the Penn International trolling reel in exploded view:

[Image: Penn International 50 reel, exploded view]

. . . and a video showing the implications of this “wiring diagram” for how it is put together in the factory:

[youtube TTqzSHZKQ1k]

. . . just remember: the arm-hand system is a complex, multi-axis cybernetic manipulator-arm:

[Image: labelled model of the arm as a multi-axis manipulator]

This concept is not new; it goes back to Orgel, 1973:

. . . In brief, living organisms [–> functional context] are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . .

[HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [–> this is of course equivalent to the string of yes/no questions required to specify the relevant “wiring diagram” for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002 . . . ] One can see intuitively that many instructions are needed to specify a complex structure. [–> so if the q’s to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [–> do once and repeat over and over in a loop . . . ] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196.]

. . . as well as Wicken, 1979:

‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and common- place initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [[“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65. (Emphases and notes added. Nb: “originally” is added to highlight that for self-replicating systems, the blue print can be built-in.)]

. . . and is pretty directly stated by Dembski in NFL:

p. 148:“The great myth of contemporary evolutionary biology is that the information needed to explain complex biological structures can be purchased without intelligence. My aim throughout this book is to dispel that myth . . . . Eigen and his colleagues must have something else in mind besides information simpliciter when they describe the origin of information as the central problem of biology.

I submit that what they have in mind is specified complexity, or what equivalently we have been calling in this Chapter Complex Specified information or CSI . . . .

Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. . . . In virtue of their function [[a living organism’s subsystems] embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the sense required by the complexity-specificity criterion . . . the specification can be cashed out in any number of ways [[through observing the requisites of functional organisation within the cell, or in organs and tissues or at the level of the organism as a whole. Dembski cites:

Wouters, p. 148: “globally in terms of the viability of whole organisms,”

Behe, p. 148: “minimal function of biochemical systems,”

Dawkins, pp. 148 – 9: “Complicated things have some quality, specifiable in advance, that is highly unlikely to have been acquired by random chance alone. In the case of living things, the quality that is specified in advance is . . . the ability to propagate genes in reproduction.”

On p. 149, he roughly cites Orgel’s famous remark from 1973, which, exactly cited, reads:

In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . .

And, p. 149, he highlights Paul Davies in The Fifth Miracle: “Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity.”] . . .”

p. 144: [[Specified complexity can be more formally defined:] “. . . since a universal probability bound of 1 [[chance] in 10^150 corresponds to a universal complexity bound of 500 bits of information, [[the cluster] (T, E) constitutes CSI because T [[effectively the target hot zone in the field of possibilities] subsumes E [[effectively the observed event from that field], T is detachable from E, and T measures at least 500 bits of information . . . ”
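(As a quick cross-check of that arithmetic: 1 chance in 10^150 corresponds to 150 × log2(10) ≈ 498.3 bits, which rounds up to the 500-bit bound. A minimal, illustrative Python sketch, assuming nothing beyond the figures just quoted:)

```python
import math

# Dembski's universal probability bound: 1 chance in 10^150.
# In bits: -log2(10^-150) = 150 * log2(10) ~ 498.3, which is
# rounded up to the 500-bit universal complexity bound.
bits = 150 * math.log2(10)
print(round(bits, 1))  # 498.3
```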

What happens at the relevant cellular level is that this comes down to highly endothermic, C-chemistry, aqueous-medium macromolecules in complexes that are organised to achieve the highly integrated and specific interlocking functions required for metabolising, self-replicating cells to work.

[Image: self-replicating system model, after Mignea]

This implicates huge quantities of information manifest in the highly specific functional organisation, observable at a much coarser resolution than the nm range of basic molecular interactions. That is, we see tightly constrained clusters of micro-level arrangements — states — consistent with function, as opposed to the much larger numbers of possible but overwhelmingly non-functional ways the same atoms and monomer components could be chemically and/or physically clumped “at random.” In turn, that is far fewer ways than those in which the same components could be scattered across a Darwin’s pond or the like.

{Aug. 2:} For illustration, let us consider the protein synthesis process at a gross level:

[Image: protein synthesis overview]

. . . spotlighting and comparing the ribosome in action as a coded tape Numerically Controlled machine:

[Image: FSCO/I facts infographic]

. . . then at a little more zoomed in level:

[Image: Protein synthesis (HT: Wiki Media)]

. . . then in the wider context of cellular metabolism [protein synthesis is the little bit with two call-outs in the top left of the infographic]:

[Image: cell metabolism infographic]

Thus, starting from the “typical” diffused condition, we readily see how work must be done to clump components at random, and further work to configure them in functionally specific ways.

With implications for this component of entropy change.

As well as for the direction of the clumping and assembly process to get the right parts together, organised in the right cluster of ways that are consistent with function.

Thus, there are implications of prescriptive information that specifies the relevant wiring diagram. (Think, AutoCAD etc as a comparison.)

Pulling back, we can see that to achieve such, the reasonable — and empirically warranted — expectation is

a: to find energy, mass and information sources and flows associated with

b: energy converters that provide shaft work or controlled flows [I use a heat engine here but energy converters are more general than that], linked to

[Image: a heat engine partially converts heat into work]

c: constructors that carry out the particular work, under control of

d: relevant prescriptive information that explicitly or implicitly regulates assembly to match the wiring diagram requisites of function,

[Image: a von Neumann kinematic self-replicator]

. . . [u/d Apr 13] or, comparing and contrasting a Maxwell’s Demon model that imposes organisation by choice with use of mechanisms, courtesy Abel:

[Image: Maxwell’s Demon vs. spontaneous far-from-equilibrium processes, after Abel]

. . . also with

e: exhaust or other dissipation of degraded energy [typically, but not only, as heat . . . ] and discarding of wastes. (Which last gives relevant compensation where dS of the cosmos rises. Here, we may note SalC’s own recent cite of that law from Clausius, at 570 in the previous thread, which shows what “relevant” implies: heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time.)

{Added, April 19, 2015: Clausius’ statement:}

[Image: Clausius’ 1854 statement of the second law]

By contrast with such, there seems to be a strong belief that irrelevant mass and/or energy flows, without coupled converters, constructors and prescriptive organising information, can through phenomena such as diffusion and fluctuations somehow credibly hit on a replicating entity that then ratchets up into a fully encapsulated, gated, metabolising, algorithmic-code-using, self-replicating cell.

Such is thermodynamically — yes, thermodynamically — as well as informationally and probabilistically [in the loose sense] utterly implausible. And the sort of implied genes-first/RNA-world, or alternatively metabolism-first, scenarios that have been suggested are without foundation in any empirically observed adequate cause tracing only to blind chance and mechanical necessity.

{U/D, Apr 13:} Abel 2012 makes much the same point in his book chapter, MOVING ‘FAR FROM EQUILIBRIUM’ IN A PREBIOTIC ENVIRONMENT: The role of Maxwell’s Demon in life origin:

Mere heterogeneity and/or order do not even begin to satisfy the necessary and sufficient conditions for life. Self-ordering tendencies provide no mechanism for self-organization, let alone abiogenesis. All sorts of physical astronomical “clumping,” weak-bonded molecular alignments, phase changes, and outright chemical reactions occur spontaneously in nature that have nothing to do with life. Life is organization-based, not order-based. As we shall see below in Section 6, order is poisonous to organization.

Stochastic ensembles of nucleotides and amino acids can polymerize naturalistically (with great difficulty). But functional sequencing of those monomers cannot be determined by any fixed physicodynamic law. It is well-known that only one 150-mer polyamino acid string out of 10^74 stochastic ensembles folds into a tertiary structure with any hint of protein function (Axe, 2004). This takes into full consideration the much publicized substitutability of amino acids without loss of function within a typical protein family membership. The odds are still only one functional protein out of 10^74 stochastic ensembles. And 150 residues are of minimal length to qualify for protein status. Worse yet, spontaneously condensed Levo-only peptides with peptide-only bonds between only biologically useful amino acids in a prebiotic environment would rarely exceed a dozen mers in length. Without polycodon prescription and sophisticated ribosome machinery, not even polypeptides form that would contribute much to “useful biological work.” . . . .

There are other reasons why merely “moving far from equilibrium” is not the key to life as seems so universally supposed. Disequilibrium stemming from mere physicodynamic constraints and self-ordering phenomena would actually be poisonous to life-origin (Abel, 2009b). The price of such constrained and self-ordering tendencies in nature is the severe reduction of Shannon informational uncertainty in any physical medium (Abel, 2008b, 2010a). Self-ordering processes preclude information generation because they force conformity and reduce freedom of selection. If information needs anything, it is the uncertainty made possible by freedom from determinism at true decision nodes and logic gates. Configurable switch-settings must be physicodynamically inert (Rocha, 2001; Rocha & Hordijk, 2005) for genetic programming and evolution of the symbol system to take place (Pattee, 1995a, 1995b). This is the main reason that Maxwell’s Demon model must use ideal gas molecules. It is the only way to maintain high uncertainty and freedom from low informational physicochemical determinism. Only then is the control and regulation so desperately needed for organization and life-origin possible. The higher the combinatorial possibilities and epistemological uncertainty of any physical medium, the greater is the information recordation potential of that matrix.

Constraints and law-like behavior only reduce uncertainty (bit content) of any physical matrix. Any self-ordering tendency precludes the freedom from law needed to program logic gates and configurable switch settings. The regulation of life requires not only true decision nodes, but wise choices at each decision node. This is exactly what Maxwell’s Demon does. No yet-to-be discovered physicodynamic law will ever be able to replace the Demon’s wise choices, or explain the exquisite linear digital PI programming and organization of life (Abel, 2009a; Abel & Trevors, 2007). Organization requires choice contingency rather than chance contingency or law (Abel, 2008b, 2009b, 2010a). This conclusion comes via deductive logical necessity and clear-cut category differences, not just from best-thus-far empiricism or induction/abduction.

In short, the three perspectives converge. Thermodynamically, the implausibility of finding information-rich FSCO/I in islands of function in vast config spaces . . .

[Image: CSI definition infographic]

. . . — where we can picture the search by using coins as a stand-in for one-bit registers —

[Image: coin-flipping search illustration]

. . . links directly to the overwhelmingly likely outcome of spontaneous processes. Such is of course a probabilistically linked outcome. And information is often quantified on the same probability thinking.

Taking a step back to App A of my always-linked note, following Thaxton, Bradley and Olsen in TMLO, 1984, and amplifying a bit:

. . . Going forward to the discussion in Ch 8, in light of the definition dG = dH – TdS, we may then split up the TdS term into contributing components, thusly:

First, dG = [dE + PdV] – TdS . . . [Eqn A.9, cf def’ns for G, H above]

But, [1] since pressure-volume work [–> the PdV term] may be seen as negligible in the context we have in mind, and [2] since we may look at dE as shifts in bonding energy [which will be more or less the same in DNA or polypeptide/protein chains of the same length regardless of the sequence of the monomers], we may focus on the TdS term. This brings us back to the clumping then configuring sequence of changes in entropy in the Micro-Jets example above:

dG = dH – T[dSclump + dSconfig] . . . [Eqn A.10, cf. TBO 8.5]

Of course, we have already addressed the reduction in entropy on clumping and the further reduction in entropy on configuration, through the thought expt. etc., above. In the DNA or protein formation case, more or less the same thing happens. Using Brillouin’s negentropy formulation of information, we may see that the dSconfig is the negative of the information content of the molecule.

A bit of back-tracking will help:

S = k ln W . . . Eqn A.3

{U/D Apr 19: Boltzmann’s tombstone}

[Image: Boltzmann’s entropy formula, S = k log W, as engraved on his tombstone]

Now, W may be seen as a composite of the ways energy as well as mass may be arranged at micro-level. That is, we are marking a distinction between the entropy component due to ways energy [here usually, thermal energy] may be arranged, and that due to the ways mass may be configured across the relevant volume. The configurational component arises from in effect the same considerations as lead us to see a rise in entropy on having a body of gas at first confined to part of an apparatus, then allowing it to freely expand into the full volume:

Free expansion:

|| * * * * * * * * | . . . . .  ||

Then:

|| * * * * * * * * ||

Or, as Prof. Gary L. Bertrand of the University of Missouri-Rolla summarises:

The freedom within a part of the universe may take two major forms: the freedom of the mass and the freedom of the energy. The amount of freedom is related to the number of different ways the mass or the energy in that part of the universe may be arranged while not gaining or losing any mass or energy. We will concentrate on a specific part of the universe, perhaps within a closed container. If the mass within the container is distributed into a lot of tiny little balls (atoms) flying blindly about, running into each other and anything else (like walls) that may be in their way, there is a huge number of different ways the atoms could be arranged at any one time. Each atom could at different times occupy any place within the container that was not already occupied by another atom, but on average the atoms will be uniformly distributed throughout the container. If we can mathematically estimate the number of different ways the atoms may be arranged, we can quantify the freedom of the mass. If somehow we increase the size of the container, each atom can move around in a greater amount of space, and the number of ways the mass may be arranged will increase . . . .

The thermodynamic term for quantifying freedom is entropy, and it is given the symbol S. Like freedom, the entropy of a system increases with the temperature and with volume . . . the entropy of a system increases as the concentrations of the components decrease. The part of entropy which is determined by energetic freedom is called thermal entropy, and the part that is determined by concentration is called configurational entropy.

In short, degree of confinement in space constrains the degree of disorder/”freedom” that masses may have. And, of course, confinement to particular portions of a linear polymer is no less a case of volumetric confinement (relative to being free to take up any location at random along the chain of monomers) than is confinement of gas molecules to one part of an apparatus. And, degree of such confinement may appropriately be termed, degree of “concentration.”

Diffusion is a similar case: infusing a drop of dye into a glass of water — the particles spread out across the volume and we see an increase of entropy there. (The micro-jets case of course is effectively diffusion in reverse, so we see the reduction in entropy on clumping and then also the further reduction in entropy on configuring to form a flyable microjet.)

So, we are justified in reworking the Boltzmann expression to separate clumping/thermal and configurational components:

S = k ln (Wclump*Wconfig)

= k ln (Wth*Wc) . . . [Eqn A.11, cf. TBO 8.2a]

or, S = k ln Wth + k ln Wc = Sth + Sc . . . [Eqn A.11.1]

We now focus on the configurational component, the clumping/thermal one being in effect the same for at-random or specifically configured DNA or polypeptide macromolecules of the same length and proportions of the relevant monomers, as it is essentially energy of the bonds in the chain, which are the same in number and type for the two cases. Also, introducing Brillouin’s negentropy formulation of Information, with the configured macromolecule [m] and the random molecule [r], we see the increment in information on going from the random to the functionally specified macromolecule:

IB = -[Scm – Scr] . . . [Eqn A.12, cf. TBO 8.3a]

Or, IB = Scr – Scm = k ln Wcr – k ln Wcm

= k ln (Wcr/Wcm) . . . [Eqn A.12.1]

Where also, for N objects in a linear chain, n1 of one kind, n2 of another, and so on to ni, we may see that the number of ways to arrange them (we need not complicate the matter by talking of Fermi-Dirac statistics, as TBO do!) is:

W = N!/[n1!n2! . . . ni!] . . . [Eqn A13, cf TBO 8.7]

So, we may look at a 100-monomer protein with, on average, 5 each of the 20 types of amino acid monomers along the chain, using log manipulations — take logs to base 10, do the sums in log form, then take the antilogs — to handle numbers over 10^100 on a calculator:

Wcr = 100!/[(5!)^20] = 2.43*10^116

For the sake of initial argument, we consider a unique polymer chain, so that each monomer is confined to a specified location, i.e. Wcm = 1, and Scm = 0. This yields — through basic equilibrium chemical-reaction thermodynamics (follow the onward argument in TBO Ch 8) and the Brillouin information measure, which contributes to estimating the relevant Gibbs free energies (and with some empirical results on energies of formation etc.) — an expected protein concentration of ~10^-338 molar, i.e. far, far less than one molecule per planet. (There may be about 10^80 atoms in the observed universe, with carbon a rather small fraction thereof; and 1 mole of atoms is ~6.02*10^23 atoms.) Recall, known life forms routinely use dozens to hundreds of such information-rich macromolecules, in close proximity, in an integrated self-replicating information system on a scale of about 10^-6 m.
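(The combinatorics just used are easy to verify directly; the following minimal, illustrative Python sketch computes Wcr per Eqn A13 and the corresponding Brillouin information in bits for the Wcm = 1 case. It checks only the counting step, not the onward free-energy argument:)

```python
import math

# Eqn A13: W = N!/(n1! n2! ... ni!), for a 100-monomer chain with
# 5 residues of each of the 20 amino acid types.
W_cr = math.factorial(100) // math.factorial(5) ** 20
print(f"W_cr ~ {W_cr:.2e}")  # ~2.43e116 distinguishable sequences

# Eqn A.12.1 with W_cm = 1: I_B = k ln(W_cr/W_cm); in bits, log2(W_cr).
print(f"I_B ~ {math.log2(W_cr):.1f} bits")  # ~386.6 bits
```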

Of course, if one comes at the point from any of these directions, the objections and selectively hyperskeptical demands will be rolled out, firing off salvo after salvo. Selective, because the blind-chance, needle-in-haystack models that cannot pass vera causa as a test are simply not subjected to such scrutiny and scathing dismissiveness by the same objectors. When seriously pressed, the most they are usually prepared to concede is that perhaps we don’t yet know enough; but rest assured, “Science” will triumph, so don’t you dare put up “god of the gaps” notions.

To see what I mean, notice [HT: BA77 et al.] the bottom line of a recent article on OOL conundrums:

. . . So the debate rages on. Over the past few decades scientists have edged closer to understanding the origin of life, but there is still some way to go, which is probably why when Robyn Williams asked Lane, ‘What was there in the beginning, do you think?’, the scientist replied wryly: ‘Ah, “think”. Yes, we have no idea, is the bottom line.’

But in fact, adequate cause for FSCO/I is not hard to find: intelligently directed configuration meeting requisites a – e just above. Design.

There are trillions of cases in point.

And that is why I demand that — whatever flaws, elaborations, adjustments etc we may find or want to make — we need to listen carefully and fairly to Granville Sewell’s core point:

[Image: “The Emperor has no clothes” illustration]
You are under arrest, for bringing the Emperor into disrepute . . .

. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur.

The discovery that life on Earth developed through evolutionary “steps,” coupled with the observation that mutations and natural selection — like other natural forces — can cause (minor) change, is widely accepted in the scientific world as proof that natural selection — alone among all natural forces — can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic. In a recent Mathematical Intelligencer article [“A Mathematician’s View of Evolution,” The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way.1 . . . .

What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in “Can ANYTHING Happen in an Open System?”, “order can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door…. If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth’s atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here.” Evolution is a movie running backward, that is what makes it special.

THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn’t, that atoms would rearrange themselves into spaceships and computers and TV sets . . . [NB: Emphases added. I have also substituted in isolated system terminology as GS uses a different terminology.]

Surely, there is room to listen, and to address concerns on the merits. >>

_______________

I think we need to appreciate that the design inference applies to all three of thermodynamics, information and probability, and that we will find determined objectors who will attack all three in a selectively hyperskeptical manner.  We therefore need to give adequate reasons for what we hold, for the reasonable onlooker. END

PS: As it seems unfortunately necessary, I here excerpt the Wikipedia “simple” summary derivation of 2LOT from statistical mechanics considerations as at April 13, 2015 . . . a case of technical admission against general interest, giving the case where distributions are not necessarily equiprobable. This shows the basis of the point that for over 100 years now, 2LOT has been inextricably rooted in statistical-molecular considerations (where, it is those considerations that lead onwards to the issue that FSCO/I, which naturally comes in deeply isolated islands of function in large config spaces, will be maximally implausible to discover through blind, needle in haystack search on chance and mechanical necessity):
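(For reference, the non-equiprobable case rests on the Gibbs form S = -k Σ pi ln pi, which reduces to S = k ln W when all W microstates are equiprobable. A minimal, illustrative Python sketch of that reduction, with an assumed toy distribution:)

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    # S = -k * sum(p_i * ln p_i), the general (non-equiprobable) form
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 4
print(gibbs_entropy([1 / W] * W))  # equiprobable case ...
print(k * math.log(W))             # ... equals k ln W
# A sharply peaked ("more ordered") distribution carries lower entropy:
print(gibbs_entropy([0.97, 0.01, 0.01, 0.01]))
```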

[Image: Wikipedia’s summary derivation of 2LOT from statistical mechanics]

With this in hand, I again cite a favourite basic college-level physics text, as summarised in my online note, App I:

Yavorski and Pinski, in the textbook Physics, Vol I [MIR, USSR, 1974, pp. 279 ff.], summarise the key implication of the macro-state and micro-state view well: as we consider a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various possible arrangements at random, spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random towards the more thermodynamically probable macrostates, i.e. the ones with higher statistical weights. So “[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state.” [p. 284.] This is in effect the statistical form of the 2nd law of thermodynamics.

Thus, too, the behaviour of the Clausius isolated system above [with interacting sub-systems A and B, where d’Q is transferred from A to B due to a temperature difference] is readily understood: importing d’Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B that the resulting rise in B’s entropy swamps the fall in A’s entropy. Moreover, given that [FSCO/I]-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe.

(Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e. W], we instead move to entropy, through S = k ln W. Part of how this is done can be seen by imagining a system in which there are W accessible ways, and imagining a partition into parts 1 and 2. W = W1*W2, as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa; but it is far more convenient to have an additive measure, i.e. we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant, in effect the universal gas constant, R, on a per-molecule basis: we divide R by the Avogadro number, NA, to get k = R/NA. The two approaches to entropy, by Clausius and by Boltzmann, of course correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate that the classical observation that entropy naturally tends to increase is readily apparent.)
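(The 63,504 and 44,100 statistical weights, and the k = R/NA relation, can be checked in a few lines; again, a minimal, illustrative Python sketch:)

```python
from math import comb

# Ten white and ten black balls in two rows of ten slots each. The
# weight of a macrostate with w whites in the top row: choose which w
# top slots hold whites and which 10-w bottom slots hold the rest.
for w in (10, 6, 5):
    print(w, comb(10, w) * comb(10, 10 - w))
# w=10 -> 1 way; w=6 (or w=4) -> 210*210 = 44,100; w=5 -> 252*252 = 63,504

# Boltzmann's constant as the per-molecule gas constant, k = R/N_A:
R, N_A = 8.314, 6.022e23
print(R / N_A)  # ~1.381e-23 J/K
```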

This underlying context is easily understood and leads logically to 2LOT as an overwhelmingly likely consequence. Beyond a reasonable scale, fluctuations beyond a very narrow range are statistical miracles that we have no right to expect to observe.

And, that then refocusses the issue of connected, concurrent energy flows to provide compensation for local entropy reductions.


Comments
Salvador: If you came across a table on which was set 500 fair coins and 100% displayed the “heads” side of the coin, how would you, using 2LOT, test “chance” as a hypothesis to explain this particular configuration of coins?

Should I just take it on faith that they are 500 “fair” coins? You’re so invested in your nonsensical objections to ID that you can’t even see the idiocy of your position. How shall we test the proposition that each of these 500 coins is a “fair” coin?
Mung
April 14, 2015, 07:04 PM PDT
Mung: Chance is not a cause.
Salvador: Mung minces words.
Not really. Mung is just better read. Not A Chance.
Mung
April 14, 2015, 06:58 PM PDT
F/N 2: In case someone thinks the 500- or 1,000-coin example is irrelevant to Stat Mech, or not credible, simply put it in the context of a paramagnetic substance with atoms -- or, for argument, tiny magnets -- aligned or anti-parallel to an imposed WEAK magnetic field; cf. Mandl, Sec 2.2, pp. 35 ff. That is, the weak field gives an axis of orientation, and the system state is defined on alignment; weak being so relative to typical thermal energy lumps of order kT, so thermal agitation can readily flip the atomic alignments . . . think of tiny loops of current here that can be ccw up or ccw down [= N up or N down]. The same binomial distribution will at once obtain, with the extrema showing total alignment [N up one way, S up the other] and the middle zone the typical no-apparent-net-magnetic-moment case. The analysis suggests that with some thermal agitation that can cause flipping, the natural thermodynamic equilibrium state -- where it strongly tends to settle once there is reasonable freedom for atoms to flip [see, too, how the idea of equiprobable individual microstates so naturally fits in with a dominant equilibrium cluster . . . ] -- will be in the bulk cluster near 50-50, with fluctuations probably within 0.5 - 1 SD of the 50-50 mean, where sd = sqrt(n*p*q), with p = q = 0.5; so for 500 the fluctuation is ~ 5.6 - 11, and for 1,000 ~ 7.9 - 16; notice how the percentage falls: 2.2 --> 1.6. For a more typical 10^20 atoms, fluctuations are of order 10^10 atoms, a vanishingly small percentage; with N and S up also overwhelmingly in no particular informed order, of course. And of course, N/S up is directly applicable to information storage too, though for practical cases we are speaking of domains, not single atoms. KF
kairosfocus
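(The fluctuation scales quoted here follow from sd = sqrt(n·p·q); a minimal, illustrative Python check:)

```python
from math import sqrt

# Binomial SD for n two-state units with p = q = 0.5: the absolute
# scale grows as sqrt(n), but shrinks as a percentage of n.
for n in (500, 1000, 10**20):
    sd = sqrt(n * 0.5 * 0.5)
    print(f"n = {n:.0e}: sd ~ {sd:.3g} ({100 * sd / n:.2g}% of n)")
# n=500 -> sd ~ 11.2 (2.2%); n=1000 -> sd ~ 15.8 (1.6%); n=1e20 -> ~5e9
```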
April 14, 2015, 06:14 AM PDT
F/N: On demand, it has been clearly shown that 2LOT is rooted in statistical thermodynamics, that this connects to the atomic-molecular basis of matter, that this points to information and probability issues, and that the origin of FSCO/I by blind, needle-in-haystack chance and mechanical necessity is maximally implausible as a direct consequence. Let me cite Mandl in Statistical Physics, from the Manchester series, my actual first text in Stat Mech, p. 32:
It was the superb achievement of Boltzmann, in the 1870's, to relate entropy, which is a macroscopic concept, to the molecular properties of a system. The basic idea is that the macroscopic specification of a system is very imperfect. A system in a given macroscopic state can still be in any one of an enormously large number of microscopic states: the coarse macroscopic description cannot distinguish between these. The microscopic state of a system changes all the time; for example, in a gas it changes due to collisions between molecules. But the number of microscopic states which correspond to macroscopic equilibrium is overwhelmingly large compared with all other microscopic states. Hence the probability of appreciable deviations from equilibrium occurring is utterly negligible. [Contrasting gas molecules filling a container spontaneously rushing to one half, vs free expansion to fill it if a partition were suddenly removed] it is utterly improbable that the gas, starting from a state of uniform density . . . should spontaneously change to a state where all the molecules are in one half (A) of the enclosure . . . Of course one can especially prepare the system to be in this state [by using a partition that forces the molecules to be in A and B to be empty] . . . On removing the partition (if we imagine that this can be done sufficiently quickly) . . . the gas will very rapidly expand to fill the whole available space . . . Thereafter fluctuations will be very small [for 10^20 molecules, typ. up to ~ 10^10]. One would have to wait for times enormously long compared with the age of the universe [~10^17 s is in view] for a fluctuation to occur which is large on the macroscopic scale and then it would last only a very small fraction of a second. Thus one may safely ignore such large fluctuations altogether [i.e., they are practically unobservable though abstractly possible]. [Mandl, F, Statistical Physics, (Chichester: John Wiley, 1971), p. 32.]
Remember, that's just for simple spontaneous occurrence of order not shaped by imposed constraints. The same manifestly obtains, with even more force, for spontaneous functionally specific complex interactive organisation and associated information, FSCO/I for short. Nor will it do to dismiss my point with an epithet like: Hoyle's Fallacy. (Not least, as a Nobel-equivalent-prize-holding astrophysicist, Sir Fred Hoyle had to be expert in thermodynamics and linked subjects.) Long before we come to a jumbo jet formed by a tornado hitting a junkyard, we have the forming of just one instrument on its instrument panel. Or, the forming of a Penn International 50 lb class reel. And going back to micro scale, forming a functional 300-AA protein -- a typical size -- or a D/RNA string that codes for it would also tax the atomic-temporal resources of the observed cosmos. Such things can patently be formed without doing violence to 2LOT, but the only empirically and analytically warranted way is for there to be energy, mass and information flows that are relevant by virtue of being closely coupled to energy converters that drive constructors that use prescriptive information to create the FSCO/I-rich entities. In so doing, they will degrade and dissipate energy, and usually will exhaust waste matter also. Jumbo jets and fishing reels have factories, and proteins have ribosomes etc. Mother Nature is trying to tell us something -- FSCO/I is a strong sign of design as adequate cause, but are we listening? KF

PS: That brings us back to Abel's point highlighted yesterday:
[Abel 2012 excerpt, as quoted at length in the OP above.]
. . . and to Sewell's longstanding point:
[Sewell excerpt, as quoted at length in the OP above.]
. . . but, are we listening?
kairosfocus
April 14, 2015, 03:52 AM PDT
BA77: Thanks again. Abel 2012 is good, in MOVING ‘FAR FROM EQUILIBRIUM’ IN A PREBIOTIC ENVIRONMENT: The role of Maxwell’s Demon in life origin:
[Abel 2012 excerpt, as quoted at length in the OP above.]
kairosfocus
April 13, 2015, 08:15 AM PDT
Here is a semi-related point of interest on the second law and consciousness:
Quantum Zeno effect “It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics. He then obtained a masters in theoretical mathematics from the University of Maryland. After graduating from law school, magna cum laude, he became a prominent attorney. Quantum Zeno Effect The quantum Zeno effect is,, an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect
The reason why I am very impressed with the Quantum Zeno effect as to establishing consciousness’s primacy in quantum mechanics is, for one thing, that Entropy is, by a wide margin, the most finely tuned of initial conditions of the Big Bang:
The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose Excerpt: “The time-asymmetry is fundamentally connected to with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the “source” of the Second Law (Entropy).” How special was the big bang? – Roger Penrose Excerpt: This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123. (from the Emperor’s New Mind, Penrose, pp 339-345 – 1989)
For another thing, it is interesting to note just how foundational entropy is in its explanatory power for actions within the space-time of the universe:
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
In fact, entropy is also the primary reason why our physical, temporal, bodies grow old and die,,,
Aging Process – 85 years in 40 seconds – video http://www.youtube.com/watch?v=A91Fwf_sMhk *3 new mutations every time a cell divides in your body * Average cell of 15 year old has up to 6000 mutations *Average cell of 60 year old has 40,000 mutations Reproductive cells are ‘designed’ so that, early on in development, they are ‘set aside’ and thus they do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations still we find that,,, *60-175 mutations are passed on to each new generation. Per John Sanford Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
And yet, to repeat:
Quantum Zeno effect: “It has been experimentally confirmed … that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” – Douglas Ell, Counting to God, pg. 189, 2014.
Quantum Zeno effect – Excerpt: the quantum Zeno effect is that an unstable particle, if observed continuously, will never decay. (per Wiki)
This is just fascinating! Why in blue blazes should conscious observation put a freeze on entropic decay, unless consciousness was/is more foundational to reality than the 1-in-10^10^123 initial entropy is?
bornagain77
April 13, 2015 at 07:00 AM PDT
Germanicus, you are (purposely?) missing the bigger point. I am well aware that the 2nd law was not violated in the overall sense and that the energy was paid for elsewhere in the experiment. I did not think for a moment that it was not. Again, I am sorry for whatever part I played in your confusion.

My main point is, and always has been, that physically real information was imparted to a system by intelligence to 'locally' violate the second law of the isolated system being worked on. As I pointed out before, the materialistic presupposition is that information is 'emergent' from a matter-energy basis and is thus not 'physically real'. In fact, also as I pointed out before, I have had more than one debate with atheists on UD who claimed the information in the cell was merely illusory, i.e. merely a 'metaphor'. To have an experimental demonstration that information has a 'thermodynamic content', and is thus physically real, is a direct falsification of that materialistic presupposition.

Moreover, many Darwinists continually gripe that IDists have no 'mechanism' to appeal to in explaining the origination of information in cells. Yet here we have a direct demonstration that intelligence can 'purposely' impart physically real information into an isolated system whilst paying the second law's toll elsewhere, outside that isolated system: i.e. Maxwell's demon!

Moreover, I also previously highlighted that non-local quantum information falsifies neo-Darwinism even more directly than this experiment does. This might not seem like that big a deal to you, but for me personally the finding of 'non-local' quantum entanglement/information in molecular biology is a direct empirical falsification of the foundational neo-Darwinian claim that information can 'emerge' from a material basis, and is thus of no small importance as far as empirical science itself is concerned.
bornagain77
April 13, 2015 at 06:22 AM PDT
Germanicus & BA77, you seem to be discussing a case of relevant compensation. KF
kairosfocus
April 13, 2015 at 04:42 AM PDT
F/N: I have added to the OP an excerpted derivation of 2LOT from statistical-molecular considerations, using the non-equiprobable distribution case for S tracing to Gibbs. The original work making this general connexion is over 100 years old. I trust this will allow us to focus on the real issue, that "compensation" must be relevant. KF

PS: I also added my comment and cite from Yavorski-Pinski, on the black-white balls diffusion model.
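[Editor's sketch, not from the OP or Gibbs: the Gibbs case just mentioned is easy to check numerically. The Gibbs entropy S = -k Σ p_i ln p_i over a non-equiprobable microstate distribution reduces to the Boltzmann form S = k ln W when all W microstates are equiprobable, and falls below it otherwise. The microstate count W = 1000 is a toy value assumed purely for illustration.]

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    # Gibbs entropy -k * sum(p ln p) of a discrete microstate distribution, J/K
    return -k_B * sum(p * math.log(p) for p in probs if p > 0.0)

W = 1000                                    # toy number of accessible microstates
uniform = [1.0 / W] * W                     # equiprobable case
skewed = [0.5] + [0.5 / (W - 1)] * (W - 1)  # one dominant microstate

print(gibbs_entropy(uniform))  # equals k ln W (Boltzmann form)
print(k_B * math.log(W))       # k ln W, for comparison
print(gibbs_entropy(skewed))   # strictly less than k ln W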
kairosfocus
April 13, 2015 at 04:35 AM PDT
bornagain77 at #43, to be clear: SLOT in that experiment is not violated at all, either "locally" or in the overall sense. Indeed, SLOT is all about the compensation argument.
Germanicus
April 13, 2015 at 04:16 AM PDT
Germanicus, I certainly did not mean to imply, in any way, shape, or fashion, that the second law was violated in the overall sense. That is precisely why I put 'locally' in scare quotes. Even your quote makes clear that the second law was only 'locally' violated, in that the energy was consumed elsewhere in the experiment:
"energy was consumed in the experiment by the apparatus used, and by the experimenters themselves, who did work in monitoring the particle and adjusting the voltage, … ”
The main point I was trying to make clear is that information was shown to have a 'thermodynamic content'. Seeing as Darwinists/Materialists have denied the physical reality of information through the years, that IS NOT a minor development!
"In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a “spiral-staircase-like” potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information." and,,, “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski."
Once again, showing that information has a thermodynamic content IS NOT a minor consideration, seeing as Darwinists/materialists deny the physical reality of information and consider it 'emergent' from a material basis. I'm sorry that what I wrote was not clearer as to that important 'information is real' point I was making. I repeated that point several times in my post, and even gave evidence for non-local, beyond space and time, 'quantum information' in the cell! Which is another point that is certainly far from a minor consideration, since it directly falsifies materialistic claims that information can be reduced to a material basis. Nonetheless, I will add a more nuanced caveat than the 'locally' in scare quotes that I used, so as to avoid any confusion in the future. Supplemental notes: the experiment also concretely verified that they are not just whistling Dixie in the following:
"Is there a real connection between entropy in physics and the entropy of information? ....The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental..." Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin] Biophysics – Information theory. Relation between information and entropy: - Setlow-Pollard, Ed. Addison Wesley Excerpt: Linschitz gave the figure 9.3 x 10^12 cal/deg or 9.3 x 10^12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k In 2), we find that the information content is 4 x 10^12 bits. Morowitz' deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures. https://docs.google.com/document/d/18hO1bteXTPOqQtd2H12PI5wFFoTjwg8uBAU5N0nEQIE/edit “a one-celled bacterium, e. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information in science jargon, this would be the same as 10^12 bits of information. In comparison, the total writings from classical Greek Civilization is only 10^9 bits, and the largest libraries in the world – The British Museum, Oxford Bodleian Library, New York Public Library, Harvard Widenier Library, and the Moscow Lenin Library – have about 10 million volumes or 10^12 bits.” – R. C. Wysong 'The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopedia Britannica." Carl Sagan, "Life" in Encyclopedia Britannica: Macropaedia (1974 ed.), pp. 893-894
Also of note:
“It is CSI that enables Maxwell’s demon to outsmart a thermodynamic system tending toward thermal equilibrium.” – William Dembski, Intelligent Design, pg. 159
MOVING ‘FAR FROM EQUILIBRIUM’ IN A PREBIOTIC ENVIRONMENT: The role of Maxwell’s Demon in life origin – DAVID L. ABEL. Abstract: Can we falsify the following null hypothesis? “A kinetic energy potential cannot be generated by Maxwell’s Demon from an ideal gas equilibrium without purposeful choices of when to open and close the partition’s trap door.” If we can falsify this null hypothesis with an observable naturalistic mechanism, we have moved a long way towards modeling the spontaneous molecular evolution of life. Falsification is essential to discount teleology. But life requires a particular version of “far from equilibrium” that explains formal organization, not just physicodynamic self-ordering as seen in Prigogine’s dissipative structures. Life is controlled and regulated, not just constrained. Life follows arbitrary rules of behavior, not just invariant physical laws. To explain life’s origin and regulation naturalistically, we must first explain the more fundamental question, “How can hotter, faster moving, ideal gas molecules be dichotomized from cooler, slower moving, ideal gas molecules without the Demon’s choice contingency operating the trap door?” https://www.academia.edu/9963341/MOVING_FAR_FROM_EQUILIBRIUM_IN_A_PREBIOTIC_ENVIRONMENT_The_role_of_Maxwell_s_Demon_in_life_origin
bornagain77
April 13, 2015 at 02:56 AM PDT
SalC: I have shown you the general framework that grounds 2LOT in stat mech, a framework that is generally uncontroversial as a first reasonable approach. It is therefore time for you to acknowledge that, rather than cover it with a Wilson Arte of Rhetorique blanket of silence and then pivot to the next objection.

The further objection also fails, because the first relevant context is Darwin's pond and the like. Indeed, thermodynamics- and chemical-kinetics-wise, there is a big problem in getting to the relevant monomers for life, and in sustaining any significant concentration of them. That goes all the way back to Thaxton et al and likely beyond. For OOBP, we have encapsulation, and we have a situation where, to cross the sea of non-function, there can be no favourable selection pressure, as there is no fitness to have a rising slope on . . . indeed, that is a key part of the OOBP problem. Remember, 10 - 100+ mn base pairs to account for. Dozens of times over. Where it is known, from the chaining chemistry of both proteins and D/RNA, that there is no significant chemical sequencing bias in the chains. And if there were, it would reduce the info-carrying capacity of D/RNA and the functional chaining of proteins.

To see why, consider H2O, which is polar. Once we drain away thermal agitation sufficiently for the polarisation to have an effect, we form definite crystalline patterns in a definite lattice, imposed by the bias of the geometry and the polarisation. This brings out the point that the structuring and ordering come from a flow coupled to something that gives a pattern; here, one locked into the molecular structure and relatively inflexible. Constructors controlled by prescriptive information are able to impart organisation, not merely order; but they obviously require things to be sufficiently under control thermally -- overheat an organism and it cooks, degrading the complex molecules. Even a high fever is dangerous. KF

PS: I also must remind you that when someone like L K Nash uses coin flipping as a useful first model to capture the essential point of stat mech [i.e. a 2-state base element], that should give you pause before rushing off to dismissive language about equivocations. The logic involved is also quite general: it is a logic of states and statistics, with onward links into the informational perspective that you patently are unwilling to address. I strongly suggest going to a library and pulling Nash. If it has Harry Robertson's Statistical Thermophysics [Prentice], try that too.
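[Editor's back-of-envelope sketch, not KF's: to put a rough number on the sequence-space point above, a chain of n base pairs with no sequencing bias has 4^n possible sequences; the 10 mn and 100 mn figures are taken from the comment.]

import math

# With 4 unbiased bases per position, an n-base-pair chain has 4^n sequences.
for n in (10_000_000, 100_000_000):   # 10 mn and 100 mn base pairs
    log10_states = n * math.log10(4)  # log10(4^n), to avoid a huge integer
    print(f"n = {n:>11,}: about 10^{log10_states:,.0f} possible sequences")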
kairosfocus
April 13, 2015 at 01:32 AM PDT
bornagain77 at #37, usually I have no time to check your lists of references (too long, mostly off topic, etc.), but your announcement of an experiment demonstrating that information can violate the SLOT (even if only "locally", as you cautiously added) was so stunning that I looked at the paper in more detail. So, I was really disappointed to read, in the continuation of the article, the part that you omitted: "The experiment did not violate the second law of thermodynamics because energy was consumed in the experiment by the apparatus used, and by the experimenters themselves, who did work in monitoring the particle and adjusting the voltage, … " So, no breaking news; the old SLOT was not violated.
Germanicus
April 13, 2015 at 12:40 AM PDT
KF and Niwrad, old comrades, sorry we disagree. Our differences on the question of 2LOT are obviously irreconcilable. Apologies for any harsh words. Peace be with you, and God bless. Sal
scordova
April 12, 2015 at 05:50 PM PDT
Besides providing direct empirical falsification of neo-Darwinian claims as to information being 'emergent' from a material basis, the implication of finding 'non-local', beyond space and time, and ‘conserved’ quantum information in molecular biology on a massive scale is fairly, and pleasantly, obvious:
Does Quantum Biology Support A Quantum Soul? – Stuart Hameroff - video (notes in description) http://vimeo.com/29895068 Quantum Entangled Consciousness - Life After Death - Stuart Hameroff - video https://vimeo.com/39982578
Verse and Music:
John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things were made through Him, and without Him nothing was made that was made. In Him was life, and the life was the light of men. The Word Is Alive - Casting Crowns https://www.youtube.com/watch?v=8rRF1FQed7c
bornagain77
April 12, 2015 at 04:45 PM PDT
In other words, to give a coherent explanation for an effect that is shown to be completely independent of any time and space constraints, one is forced to appeal to a cause that is itself not limited to time and space! Put more simply, you cannot explain an effect by a cause that has been falsified by the very same effect you are seeking to explain! Improbability arguments over various ‘special’ configurations of material particles, which have been a staple of the arguments against neo-Darwinism, simply do not apply, since the cause is not within the material particles in the first place! Of related interest, classical information is found to be a subset of ‘non-local’ (i.e. beyond space and time) quantum entanglement/information by the following method:
Quantum knowledge cools computers: New understanding of entropy – June 2011 Excerpt: No heat, even a cooling effect; In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.” http://www.sciencedaily.com/releases/2011/06/110601134300.htm Vlatko Vedral - Entanglement and its relationship to thermodynamics - QuICC Lecture 1 https://www.youtube.com/watch?v=sBBxIa2CK6o Vlatko Vedral - Entanglement and its relationship to thermodynamics - QuICC Lecture 2 https://www.youtube.com/watch?v=wNpD5tjs0Cs Vlatko Vedral - Entanglement and its relationship to thermodynamics - QuICC Lecture 3 https://www.youtube.com/watch?v=t5PCYhlXLHA
And here is the evidence that quantum information is in fact ‘conserved’:
Quantum no-hiding theorem experimentally confirmed for first time Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed. This concept stems from two fundamental theorems of quantum mechanics: the no-cloning theorem and the no-deleting theorem. A third and related theorem, called the no-hiding theorem, addresses information loss in the quantum world. According to the no-hiding theorem, if information is missing from one system (which may happen when the system interacts with the environment), then the information is simply residing somewhere else in the Universe; in other words, the missing information cannot be hidden in the correlations between a system and its environment. http://www.physorg.com/news/2011-03-quantum-no-hiding-theorem-experimentally.html Quantum no-deleting theorem Excerpt: A stronger version of the no-cloning theorem and the no-deleting theorem provide permanence to quantum information. To create a copy one must import the information from some part of the universe and to delete a state one needs to export it to another part of the universe where it will continue to exist. http://en.wikipedia.org/wiki/Quantum_no-deleting_theorem#Consequence Black holes don't erase information, scientists say - April 2, 2015 Excerpt: The "information loss paradox" in black holes—a problem that has plagued physics for nearly 40 years—may not exist.,,, This is an important discovery, Stojkovic says, because even physicists who believed information was not lost in black holes have struggled to show, mathematically, how this happens. His new paper presents explicit calculations demonstrating how information is preserved, he says. The research marks a significant step toward solving the "information loss paradox," a problem that has plagued physics for almost 40 years, since Stephen Hawking first proposed that black holes could radiate energy and evaporate over time. This posed a huge problem for the field of physics because it meant that information inside a black hole could be permanently lost when the black hole disappeared—a violation of quantum mechanics, which states that information must be conserved. http://phys.org/news/2015-04-black-holes-dont-erase-scientists.html+/ Information Conservation and the Unitarity of Quantum Mechanics Excerpt: “In more technical terms, information conservation is related to the unitarity of quantum mechanics. In this article, I will explain what unitarity is and how it’s related to information conservation.” http://youngsubyoon.com/QMunitarity.htm
bornagain77
April 12, 2015 at 04:45 PM PDT
As to Sal's contention:
"I’ve argued against using information theory type arguments in defense of ID, it adds way too much confusion. Basic probability will do the job, and basic probability is clear and unassailable."
Whilst I would agree with the overall sentiment of that statement, I must point out that information has now been physically measured and shown to have a 'thermodynamic content'. While neo-Darwinian evolution has no evidence that material processes can generate functional, i.e. prescriptive, information, it has now been shown that information introduced by the knowledge of an intelligence can 'locally' violate the second law and generate potential energy:
Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy – November 2010. Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the "Maxwell demon" thought experiment devised in 1867. … In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them. … Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html
Demonic device converts information to energy – 2010. Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform
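[Editor's sketch, for scale: the theoretical bound being tested here is the Szilard/Landauer figure of kT ln 2 of work per bit. A minimal Python calculation; the 300 K room-temperature value is an assumption of mine, not the paper's.]

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Szilard/Landauer bound: maximum work extractable per bit of information
w_per_bit = k_B * T * math.log(2)
print(f"{w_per_bit:.3e} J per bit")  # ~2.87e-21 J at 300 K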
The importance of this experiment cannot be overstated. Materialists hold that information is not physically real but is merely 'emergent' from a material basis. In fact, in the past I have had, and many others on UD have had, debates with materialists defending the fact that the information in the cell is not simply a metaphor but is in fact real. More than once I have used the following reference to refute the 'information is just a metaphor' claim of materialists:
Information Theory, Evolution, and the Origin of Life - Hubert P. Yockey, 2005 Excerpt: “Information, transcription, translation, code, redundancy, synonymous, messenger, editing, and proofreading are all appropriate terms in biology. They take their meaning from information theory (Shannon, 1948) and are not synonyms, metaphors, or analogies.” http://www.cambridge.org/catalogue/catalogue.asp?isbn=9780521802932&ss=exc
Thus the fact that information is now found to have a 'thermodynamic content', and to be physically real, is of no small importance. Moreover, it directly supports Andy C. McIntosh's claim about information and the thermodynamics of the cell. Specifically, McIntosh, professor of thermodynamics and combustion theory at the University of Leeds, holds that non-material information is what constrains the cell to be in such an extremely high thermodynamic non-equilibrium state. Moreover, Dr. McIntosh holds that regarding information as independent of energy and matter, instead of emergent from energy and matter, 'resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions'.
Information and Thermodynamics in Living Systems - Andy C. McIntosh - 2013 Excerpt: ,,, information is in fact non-material and that the coded information systems (such as, but not restricted to the coding of DNA in all living systems) is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which despite the efforts from alternative paradigms has not given a satisfactory explanation of the way information in systems operates.,,, http://www.worldscientific.com/doi/abs/10.1142/9789814508728_0008
Here is a fairly recent video by Dr. Giem that gets the main points of Dr. McIntosh’s paper across very well for the lay person:
Biological Information – Information and Thermodynamics in Living Systems 11-22-2014 by Paul Giem (A. McIntosh) – video https://www.youtube.com/watch?v=IR_r6mFdwQM
Of supplemental note: on top of classical information, ‘quantum information’ is now found in the cell. First, it is important to learn that ‘non-local’, beyond space and time, quantum entanglement (A. Aspect, A. Zeilinger, etc.) can be used as a ‘quantum information channel’:
Quantum Entanglement and Information Quantum entanglement is a physical resource, like energy, associated with the peculiar nonclassical correlations that are possible between separated quantum systems. Entanglement can be measured, transformed, and purified. A pair of quantum systems in an entangled state can be used as a quantum information channel to perform computational and cryptographic tasks that are impossible for classical systems. The general study of the information-processing capabilities of quantum systems is the subject of quantum information theory. http://plato.stanford.edu/entries/qt-entangle/
And this 'quantum entanglement/information' is now found in the cell on a massive scale in every DNA and protein molecule. Moreover this 'quantum entanglement/information' is implicated in performing quantum computation in the cell so as to provide resolutions to the unsolved problems of protein folding and DNA repair (notes and references given upon request).
Quantum Information/Entanglement In DNA – short video: https://vimeo.com/92405752
Classical and Quantum Information Channels in Protein Chain – Dj. Koruga, A. Tomić, Z. Ratkaj, L. Matija – 2006. Abstract: Investigation of the properties of peptide plane in protein chain from both classical and quantum approach is presented. We calculated interatomic force constants for peptide plane and hydrogen bonds between peptide planes in protein chain. On the basis of force constants, displacements of each atom in peptide plane, and time of action we found that the value of the peptide plane action is close to the Planck constant. This indicates that peptide plane from the energy viewpoint possesses synergetic classical/quantum properties. Consideration of peptide planes in protein chain from information viewpoint also shows that protein chain possesses classical and quantum properties. So, it appears that protein chain behaves as a triple dual system: (1) structural - amino acids and peptide planes, (2) energy - classical and quantum state, and (3) information - classical and quantum coding. Based on experimental facts of protein chain, we proposed from the structure-energy-information viewpoint its synergetic code system. http://www.scientific.net/MSF.518.491
That ‘non-local’ quantum entanglement, which conclusively demonstrates that ‘information’ in its pure ‘quantum form’ is completely transcendent of any time and space constraints (Bell, Aspect, Leggett, Zeilinger, etc.), should be found in molecular biology on such a massive scale, in every DNA and protein molecule, is a direct empirical falsification of Darwinian claims. For how can the ‘non-local’ quantum entanglement ‘effect’ in biology possibly be explained by a material (matter/energy) cause, when the quantum entanglement effect has falsified material particles as its own causation in the first place? Appealing to the probability of various ‘random’ configurations of material particles, as Darwinism does, simply will not help, since a timeless/spaceless cause must be supplied, which is beyond the capacity of the material particles themselves to supply!
Looking beyond space and time to cope with quantum theory – 29 October 2012 Excerpt: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,” http://www.quantumlah.org/highlight/121029_hidden_influences.php
bornagain77
April 12, 2015 at 04:44 PM PDT
groovamos: Maybe you can precisely delineate for us your categorization scheme to be able to define ‘science’ in your view, please.
Science is observation, the formulation of falsifiable hypotheses, and hypothesis testing via experiments. I think many of the claims of history are true, like George Washington being the first president. But such truth claims are not the soul of science, even though they are true claims. The one part of ID that is scientific and testable is the rejection of chance and law as an explanation. That's good enough for me, and that is scientific. The inference to an intelligent designer, the "I" in ID, however, is formally only an inference. ID strictly speaking, absent the Designer himself, seems outside of experimental science. But that is my opinion, not that of other IDists. For the record, I believe ID is true, but I'm not eager to call it science. Showing that chance and law cannot make certain designs, like algorithmic metabolisms: that is science.
scordova
April 12, 2015 at 03:06 PM PDT
in statistical mechanics, the Second Law is not a postulate, rather it is a consequence of the fundamental postulate, also known as the equal prior probability postulate
Ah yes, statistical mechanics is founded on the "equal prior probability postulate". Since the 2nd law can proceed from the equal prior probability postulate, does that imply that the 2nd law is then applicable to the equal prior probability of each of the 2^500 possible configurations of 500 fair coins? No! To say so would be as backward and invalid as saying the 2nd law of thermodynamics proves Rolle's Theorem because the 2nd law utilizes calculus. Equal prior probability applies to a system of 500 fair coins because the coins are fair, not because of the 2nd law of thermodynamics! The statistics of the 2^500 possible heads/tails microstates proceed from the coins being fair, not from the 2nd law of thermodynamics.
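[Editor's sketch, not SalC's: the coin arithmetic here is easy to make concrete. The Python snippet below invokes only fairness and the law of large numbers; no thermodynamics appears anywhere.]

import math
import random

n = 500
# Each of the 2^500 equiprobable microstates has probability 2^-500,
# so the all-heads macrostate is roughly 1 chance in 10^151.
print(f"P(all {n} heads) = 2^-{n} ~ 10^{-n * math.log10(2):.0f}")

# LLN: the heads fraction converges to 1/2 as the number of flips grows.
random.seed(1)  # fixed seed so the sketch is reproducible
for flips in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    print(f"{flips:>9} flips: heads fraction = {heads / flips:.4f}")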
scordova
April 12, 2015 at 02:43 PM PDT
SalC: I still do not see a substantial response to some fairly specific information already posted in 26 above, that shows exactly how stat Mech undergirds 2LOT in ways that then apply directly to the issue of origin of FSCO/I. You are continuing to play at tag and dismiss, inappropriately. Is that an admission, by implication? Please, do better. KF
The equal prior probability postulate in the formulation of the 2nd law applies to thermodynamic microstates, not to microstates related to FSCO/I states. So in what textbook is FSCO/I a thermodynamic state? I pointed out that you can't equivocate between thermodynamic states and the states of interest to IDists. 500 fair coins at 100% heads is a state of hypothetical interest to IDists. The heads/tails microstates obey the equal probability postulate, but heads/tails microstates aren't thermodynamic microstates. Just because heads/tails microstates obey the equal probability postulate doesn't mean you can apply the 2nd law of thermodynamics to them! And again, where is the statement of 2nd_law_SM?
Please, do better. KF
Quit pretending your equivocations are actual arguments. Heads/tails configurations aren't thermodynamic microstates, and neither, in general, are more complicated design states.
scordova
April 12, 2015 at 02:08 PM PDT
Kairosfocus, First of all, thank you for a well-researched, well-thought-out, and wonderfully articulated article! And thank you for taking the time and effort to put it together. Here are some thoughts inspired by your article and comments.

* Measuring information and design quantitatively – This is very hard to do, perhaps impossible; I've never been comfortable with the idea (see the compression sketch after this comment). For example, a large computer program does not necessarily require more information or more design than a small one. In fact, a small one that efficiently meets requirements can be extremely challenging to design. Requirements for human manufacturing are often evident in design for manufacturability, design for low cost, design for common parts, design for low tolerances, design for ruggedness, design for maintainability (or not), and so on. This might not be self-evident. For example, which is a “better” design, a deciduous leaf or an evergreen leaf? Which one is more expensive for an organism to manufacture?

* Complexity and probability – What's the chance of the design-free origin of a Boeing 777 or a city? The complex chemical cycles, codes, and physical operations within a cell are billions and billions and billions of times more complex than anything humans have ever created. A mere 13.7 billion years is not remotely enough time to account for life anywhere in the universe. Irreducible complexity challenges the notion that incredibly complex, self-assembling, self-replicating, and self-repairing systems can derive from random collections of parts by tiny steps, each of which results in a configuration more persistent than its parts being separate.

* Existence – The instantaneous and simultaneous origin of time, mass-energy, and spatial dimensions cannot be explained by pre-existing physical laws; probability doesn't exist before time; and the instantaneous ex nihilo existence of the universe is a miracle, not science. There's no other option outside of wishful thinking.

* On God I – The same people who choke on the question of what created God easily swallow the existence of the multiverse.

* On God II – The same people who rage against God for allowing “evil” will then themselves perform “evil” things when they feel like it, without remorse and without rage.

* On God III – The people who criticize and then deny the existence of God first project a ridiculously simplistic anthropomorphic caricature of Him as a straw man, while denying the ridiculously obvious possibility that they might not be able to comprehend God. Thus, the denial of God is not an intellectual objection, but rather an objection of the will.

Thanks again for your work! -Q
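[Editor's toy illustration of Querius's first point, with my own made-up data: compressed size is one crude stand-in for descriptive complexity, and it shows a large repetitive "program" carrying less of this proxy-information than a much smaller dense one.]

import zlib

long_but_simple = b"print('hello')\n" * 200  # 3000 bytes, little content
short_but_dense = bytes(range(256)) * 2      # 512 bytes, hard to compress

for name, data in (("long/simple", long_but_simple),
                   ("short/dense", short_but_dense)):
    packed = zlib.compress(data, 9)
    print(f"{name}: raw = {len(data):4d} bytes, compressed = {len(packed):4d} bytes")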
Querius
April 12, 2015 at 01:41 PM PDT
SalC: I still do not see a substantial response to some fairly specific information already posted in 26 above, that shows exactly how stat Mech undergirds 2LOT in ways that then apply directly to the issue of origin of FSCO/I. You are continuing to play at tag and dismiss, inappropriately. Is that an admission, by implication? Please, do better. KF
kairosfocus
April 12, 2015 at 01:26 PM PDT
statistical thermodynamics underpinnings of 2LOT
So statistical mechanics underpins 2LOT, and statistical mechanics encompasses, and is underpinned by:
1. the fundamental theorem of calculus
2. Newton's 2nd law
3. Hamilton's principle
4. the Liouville theorem
5. the Law of Large Numbers
6. quantum mechanics (including Schrödinger's equation)
7. classical mechanics
8. general relativity (the statistical mechanics of, say, the Bekenstein bound)
9. electromagnetic theory
etc., etc.
Does that justify saying 2nd_law_SM should be used to show the trajectory of rocket ships, since according to 2nd_law_SM it would "englobe" all sorts of existing bodies of physics and mathematics, including Newton's 2nd law (Newton's 2nd law of motion being necessary for the Gibbs-type derivation of statistical mechanical entropy through the Liouville theorem)? Of course not. Would we use 2nd_law_SM (which supposedly englobes general relativity, since the most general 2nd law involves the Bekenstein bound, which involves general relativity) to prove gravitational lensing? Of course not! So it is just as inappropriate to say, just because statistical mechanics uses the Law of Large Numbers (LLN), that somehow 2nd_law_SM proves situations where LLN would be used to analyze 500 fair coins at 100% heads.
it seems clear that you have neither read nor acknowledged the mere existence of the point I just made to extend previous remarks about the statistical thermodynamics underpinnings of 2LOT;
Wrong. It is clear you don't acknowledge that, just because 2LOT can be approximately derived in part from Newtonian/classical mechanics, as Gibbs demonstrated, you can't use 2LOT as the basis of classical mechanics. In like manner, you can't use 2LOT to justify statistical mechanics, nor the Law of Large Numbers, nor 500 fair coins being 100% heads. The only place such invalid backward inferences are allowable is in Niwrad_KF_2nd_law_of_statistical_mechanics. It's not valid physics, nor logic.
I think you owe us a fairly serious explanation for your behaviour.
I don't want IDists to think Niwrad_KF_2nd_law_of_statistical_mechanics is a correct representation of the 2nd law of thermodynamics. Btw, you've yet to state the 2nd_law_SM in any clear, concise, understandable and usable way. Here, I'll give you a paraphrase from what I've been reading:
The 2nd_law_SM states that: all the knowledge that statistical mechanics is founded on is valid. That foundational knowledge upon which statistical mechanics is built can be used to demonstrate whatever can be demonstrated by said foundational knowledge. That said knowledge englobes, but is not limited to, all the collected knowledge of:
1. mathematics
2. physics
3. chemistry
4. cybernetics
and pretty much anything else that can be used by the discipline of statistical mechanics.
That's about all you've really said, which is rather vacuous as a law. If you don't like that paraphrase, provide for the reader a workable definition of 2nd_law_SM.
scordova
April 12, 2015 at 12:08 PM PDT
SalC: it seems clear that you have neither read nor acknowledged the mere existence of the point I just made to extend previous remarks about the statistical thermodynamics underpinnings of 2LOT; which includes a relevant citation and onward link to a source not noted for sympathy to ID but which on such technical matters has to acknowledge certain facts. You then proceeded to reiterate an ad hominem abusive name assignment. This is not good enough, and I think you owe us a fairly serious explanation for your behaviour. KF
kairosfocus
April 12, 2015 at 11:30 AM PDT
scordova: I don’t think ID is science, even though I believe it is true....

So a project to systematically overturn an entrenched view held by scientists, who consider that view scientific, cannot itself be scientific if successful? How is the man in the 19th century (who was not what I would consider a scientist) who created a 'scientific' viewpoint akin to a substitute religion going to be refuted by anything other than what one could call a science? A project which is subjected to a would-be scientific take-down in the journal Science cannot have any scientific basis? Maybe you can precisely delineate for us your categorization scheme for defining 'science' in your view, please. Maybe even look at it this way: I, as a holder of an advanced degree in a STEM discipline, am constantly evaluating my debate strategy with the Darwinian opposition, and my evaluations are constantly informed by my training, in order to take these guys and their purported story down. So if you are similarly trained and are not doing the same, please explain. Remember, the opposition is not letting up in their 'scientific' stance to debate us, so what good does it do to surrender the 'scientific' moniker to people who can't seem to ignore us?
groovamos
April 12, 2015 at 10:48 AM PDT
Pardon, but this begins to look rather unnecessary
Pardon, you give a 4,458-word OP and yet are unwilling to state the actual version of the 2nd law you are working from. Please state the 2nd law you are working from, and then make a deduction based on that law that a random process will not result in 500 fair coins at 100% heads. The problem is that the 2nd law is restricted to only certain kinds of microstates and macrostates, whereas the LLN is not.
So, to try to set up a strawman and attach Niw’s and my name to it is a case of implicit ad hominem abusive by strawman caricature.
Neither you nor Niw has given an accepted definition of the 2nd law; it is inappropriate to call it 2nd_law_SM as if it were an actually accepted law among physicists, chemists and engineers. In actuality it is Niwrad_KFs_2nd_law_of_statistical_mechanics. It is not the 2nd law of thermodynamics. To insist 2nd_law_SM is the second law of thermodynamics is an equivocation. State the 2nd_law_SM, and then cite where it is accepted as a definition in practice. If it isn't a real law in the literature, it would be best to stop pretending it is. Frankly, I wouldn't want the next generation of IDists being exposed to and taught equivocations rather than the real thing, and the real thing is the KELVIN STATEMENT OF THE 2nd LAW OF THERMODYNAMICS:
It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects
scordova
April 12, 2015 at 10:45 AM PDT
kairosfocus, well, yes, there is no law of physics with our name :( But since, according to scordova, all physics textbooks say that systems go toward improbability, maybe you and I together could at least send a mail asking for a correction (an errata corrige), no? :) ;)
niwrad
April 12, 2015 at 10:36 AM PDT
SalC: Pardon, but this begins to look rather unnecessary, given what has been discussed above, in the OP, and elsewhere:
where then will I find 2nd_law_SM in textbook or even journaled physics?
Niw's material point is that the 2nd law has a statistical foundation, one established for over a hundred years: namely, the spontaneous direction of change in a system free to evolve is towards the dominant clusters of microstates, in which the distribution of mass and energy at micro levels moves towards higher numbers of possibilities. That point is valid and should be acknowledged first and foremost. Thereafter, any infelicities of phrasing (for one for whom English is a distinct 2nd or more language) can be addressed. Nor am I doing anything so pretentious as proposing to put forward a new law of science; I have been repeatedly careful to point to 100+ years of foundational stat mech that elucidates the roots of 2LOT and brings forth the statistical underpinnings. Which, I distinctly recall, was taught to me long since, when I studied the subject. Such is also, as well you know or should know, a commonplace in thermodynamics education. It goes so far back that I would be surprised to see it in journals of any currency or recency. It will instead be in the classics by Gibbs, Boltzmann and co., and it will be part of the general discussion; indeed, it is a consequence of core principles of stat mech, but it is of great empirical significance. BTW, likewise the first law of motion: strictly, it is a trivial consequence of the second (once we set F = 0, a must be 0), but it is of great conceptual and real-world significance. A similar pattern obtains with the stat mech underpinnings and the 2nd law; which, recall, was first effectively identified empirically, by several different workers. For example, here is Wiki on the 2nd law, in a subsection with an interesting title:
Derivation from statistical mechanics. Further information: H-theorem. Due to Loschmidt's paradox, derivations of the Second Law have to make an assumption regarding the past, namely that the system is uncorrelated at some time in the past; this allows for simple probabilistic treatment. This assumption is usually thought of as a boundary condition, and thus the second Law is ultimately a consequence of the initial conditions somewhere in the past, probably at the beginning of the universe (the Big Bang), though other scenarios have also been suggested. [54][55][56] Given these assumptions, in statistical mechanics, the Second Law is not a postulate, rather it is a consequence of the fundamental postulate, also known as the equal prior probability postulate, so long as one is clear that simple probability arguments are applied only to the future, while for the past there are auxiliary sources of information which tell us that it was low entropy. [citation needed] The first part of the second law, which states that the entropy of a thermally isolated system can only increase, is a trivial consequence of the equal prior probability postulate, if we restrict the notion of the entropy to systems in thermal equilibrium . . . . Suppose we have an isolated system whose macroscopic state is specified by a number of variables. These macroscopic variables can, e.g., refer to the total volume, the positions of pistons in the system, etc. Then W will depend on the values of these variables. If a variable is not fixed (e.g. we do not clamp a piston in a certain position), then because all the accessible states are equally likely in equilibrium, the free variable in equilibrium will be such that W is maximized, as that is the most probable situation in equilibrium. If the variable was initially fixed to some value, then upon release, and when the new equilibrium has been reached, the fact that the variable will adjust itself so that W is maximized implies that the entropy will have increased or stayed the same (if the value at which the variable was fixed happened to be the equilibrium value). Suppose we start from an equilibrium situation and we suddenly remove a constraint on a variable. Then right after we do this, there are a number W of accessible microstates, but equilibrium has not yet been reached, so the actual probabilities of the system being in some accessible state are not yet equal to the prior probability of 1/W. We have already seen that in the final equilibrium state, the entropy will have increased or have stayed the same relative to the previous equilibrium state . . . . The second part of the Second Law states that the entropy change of a system undergoing a reversible process is given by dS = δQ/T, where the temperature is defined by 1/(kT) = β = d ln W(E)/dE [ . . . after derivation (follow the link) . . . dS will be as expected]
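[Editor's note: the chain of relations the quoted passage gestures at can be restated compactly (my paraphrase, not Wikipedia's wording; stated for a reversible change at fixed external parameters, where dE = δQ):]

\[
S = k \ln W(E), \qquad \frac{1}{kT} = \beta = \frac{d \ln W(E)}{dE}
\quad\Longrightarrow\quad
\frac{dS}{dE} = \frac{1}{T}, \qquad dS = \frac{\delta Q}{T}.
\]

This recovers the Clausius form from the purely statistical definition of temperature.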
Other sources will run along much the same lines. 2LOT is underpinned by statistical thermodynamics. So, to try to set up a strawman and attach Niw's and my name to it is a case of implicit ad hominem abusive by strawman caricature. Kindly, stop it. As for textbook summaries, a simple one is right there in my linked note, as is cited at 5 above:
Yavorski and Pinski, in the textbook Physics, Vol I [MIR, USSR, 1974, pp. 279 ff.], summarise the key implication of the macro-state and micro-state view well: as we consider a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various different possible arrangements at random, and spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random towards the more thermodynamically probable macrostates, i.e. the ones with higher statistical weights. So “[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state.” [p. 284.] This is in effect the statistical form of the 2nd law of thermodynamics. Thus, too, the behaviour of the Clausius isolated system above [with interacting sub-systems A and B that transfer d'Q to B due to temperature difference] is readily understood: importing d’Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B that the resulting rise in B’s entropy swamps the fall in A’s entropy. Moreover, given that [FSCO/I]-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe. (Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e. W], we instead move to entropy, through S = k ln W. Part of how this is done can be seen by imagining a system in which there are W ways accessible, and imagining a partition into parts 1 and 2. W = W1*W2, as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa; but it is far more convenient to have an additive measure, i.e. we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant, and is in effect the universal gas constant, R, on a per-molecule basis, i.e. we divide R by the Avogadro number, NA, to get k = R/NA. The two approaches to entropy, by Clausius and Boltzmann, of course correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate that the classical observation that entropy naturally tends to increase is readily apparent.)
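[Editor's check, not from the textbook: the Yavorski-Pinski statistical weights are easy to verify. With w white balls in the top row of 10, the bottom row holds the remaining 10 - w whites, giving C(10, w) * C(10, 10 - w) distinguishable arrangements.]

from math import comb

def weight(w):
    # arrangements with w white balls in the top row (10 slots per row,
    # 10 white and 10 black balls in total)
    return comb(10, w) * comb(10, 10 - w)

print(weight(10))  # 1     : all ten whites in the top row
print(weight(5))   # 63504 : the evenly mixed 5-5 macrostate
print(weight(6))   # 44100 : one of the two 6-4 macrostates
print(sum(weight(w) for w in range(11)))  # 184756 = C(20, 10) total states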
The basic point should be plain: 2LOT is closely tied to the statistical analysis, and that same analysis is what highlights the problems with the notion of trying to resort to irrelevant energy and mass flows as though they provide compensation rendering 2LOT a non-problem. KF
kairosfocus
April 12, 2015 at 09:31 AM PDT
Here is where Sewell (and all other 2nd_law_SM IDers) disagree with you: the conception of statistical mechanics (SM). SM, as its name says, englobes as a tool statistics (and of course probability theory). So in a sense SM englobes also your LLN argument, but not viceversa.
So where then will I find 2nd_law_SM in a textbook, or even in journaled physics? I provided at least 5 acceptable versions of the 2nd law that can easily be found in university or professional literature; none of them says what your 2nd_law_SM says. What you have given is a vague description of statistical mechanics; it IS NOT the 2nd law of thermodynamics. For the reader's benefit, thermodynamics is only one branch of statistical mechanics. Here is the wiki description of statistical mechanics. Let the reader note, there is no "law of statistical mechanics":
Statistical mechanics is a branch of theoretical physics and chemistry (and mathematical physics) that studies, using probability theory, the average behaviour of a mechanical system where the state of the system is uncertain .... common use of statistical mechanics is in explaining the thermodynamic behaviour of large systems. Microscopic mechanical laws do not contain concepts such as temperature, heat, or entropy, however, statistical mechanics shows how these concepts arise from the natural uncertainty that arises about the state of a system when that system is prepared in practice. The benefit of using statistical mechanics is that it provides exact methods to connect thermodynamic quantities (such as heat capacity) to microscopic behaviour, whereas in classical thermodynamics the only available option would be to just measure and tabulate such quantities for various materials. Statistical mechanics also makes it possible to extend the laws of thermodynamics to cases which are not considered in classical thermodynamics, for example microscopic systems and other mechanical systems with few degrees of freedom.[1] This branch of statistical mechanics which treats and extends classical thermodynamics is known as statistical thermodynamics or equilibrium statistical mechanics. Statistical mechanics also finds use outside equilibrium. An important subbranch known as non-equilibrium statistical mechanics deals with the issue of microscopically modelling the speed of irreversible processes that are driven by imbalances. Examples of such processes include chemical reactions, or flows of particles and heat. Unlike with equilibrium, there is no exact formalism that applies to non-equilibrium statistical mechanics in general and so this branch of statistical mechanics remains an active area of theoretical research.
2nd_law_SM is nowhere to be found. I don't think it is appropriate to represent 2nd_law_SM as some sort of accepted version of the 2nd law of thermodynamics. It is not; it's not even thermodynamics. "2nd_law_SM" deals with systems not describable by thermodynamic state variables such as temperature, volume, number of particles, and pressure. It is therefore not a thermodynamic law. If you say "a random process will lead to the most probable state", that is another restatement of LLN; it is not the 2nd law of thermodynamics. The reader can try googling 2nd_law_SM or "laws of statistical mechanics". You won't find many matches, because it is an idiosyncratic construction; it is not the 2nd law of thermodynamics, which is simply stated by CLAUSIUS:
Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time.
or KELVIN:
It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects
I don't think it is too much to ask for a statement of 2nd_law_SM, and of where it's been codified by professionals in the field, like, say, Dr. Lambert. If there is no such thing, it is only appropriate to say it is a construction unique to UD; and further, it is inappropriate to try to represent it as some sort of accepted version of the 2nd law of thermodynamics. Call it Niwrad_KF_law_of_thermodynamics; that would be more accurate. Don't call it the 2nd law of statistical mechanics, because there is no such thing. There might be derivations of the 2nd law of thermodynamics from statistical mechanics, but that is not the same as Niwrad_KF_law_of_thermodynamics or Niwrad_KF_2nd_law_of_statistical_mechanics.
scordova
April 12, 2015 at 08:07 AM PDT
Niw, I agree. KF
kairosfocus
April 12, 2015 at 03:23 AM PDT