But is this fair to Feynman?

From Simon Oxenham at BigThink:

How to Use the Feynman Technique to Identify Pseudoscience

Last week a new study made headlines worldwide by bluntly demonstrating the human capacity to be misled by “pseudo-profound bullshit” from the likes of Deepak Chopra, infamous for making profound-sounding yet entirely meaningless statements by abusing scientific language.

The researchers correlate believing pseudo-profundities with all kinds of things Clever People Aren’t Supposed to Like, and one suspects the paper wouldn’t survive replication. So why is this a job for Feynman?

Richard Feynman (1918-1988)

This is all well and good, but how are we supposed to know that we are being misled when we read a quote about quantum theory from someone like Chopra, if we don’t know the first thing about quantum mechanics?

Actually, one can often detect BS without knowing much about the topic at hand, because it often sounds deep but doesn’t reflect common sense. Anyway, from Feynman,

I finally figured out a way to test whether you have taught an idea or you have only taught a definition. Test it this way: You say, ‘Without using the new word which you have just learned, try to rephrase what you have just learned in your own language. Without using the word “energy,” tell me what you know now about the dog’s motion.’ You cannot. So you learned nothing about science. That may be all right. You may not want to learn something about science right away. You have to learn definitions. But for the very first lesson, is that not possibly destructive? More.

It won’t work because many people who read pop science literature do so for the same reason others listen to Deepak Chopra: They want to be reassured, against their better judgement or the evidence, that there are billions of habitable planets out there, that chimpanzees are entering the Stone Age, that everything is a cosmic accident, or whatever the current schtick is.

And Feynman won’t help them, nor will a bucket of ice water. And it’s not fair to drag ol’ Feynman into it just because he said some true things like,

The first principle is that you must not fool yourself, and you are the easiest person to fool.

Give the guy a break.

That said, Feynman (1918–1988) may have, through no fault of his (long-deceased) own, played a role in getting a science journalist dumped recently on suspicious grounds. See “Scientific American may be owned by Nature, but it is run by Twitter.”

Follow UD News at Twitter!

Hat tip: Stephanie West Allen at Brains on Purpose

Comments
ES, where also, the functional or meaningful clusters are zones in configuration spaces which are, as wholes, overwhelmingly non-functional. This is where the search across states issue comes in. KF kairosfocus
ES, needle in haystack blind search challenge. KF kairosfocus
KF, I agree, of course. I just think that it needs to be clarified that everything you mention is semantic/semiotic considerations, not a syntactic Shannon information/thermodynamic discourse. Shannon information does not capture the function or meaning of a message. EugeneS
At what scope of needle in haystack search challenge on what degree of limited resources does appeal to >>the magic words ‘fluctuation’ or ‘luck’>> become in effect a materialist miracle of the gaps? kairosfocus
Cf: https://uncommondesc.wpengine.com/popular-culture/id-and-the-overton-window-batna-march-of-folly-issue/ kairosfocus
Dr Selensky, Appreciated. I agree, Shannon info is a measure of capacity. And the nearer to a flat random distribution of glyphs, the higher the nominal capacity, due to squeezing out redundancies. That said, in a context of observed functionality based on specificity of configuration, the metric measures that functional information, not just raw capacity. And, functional specificity dependent on configuration in a nodes-and-arcs mesh yields a way to quantify complexity. Also, wiring diagram or description based configuration constrains working vs non-working configs. Thus, the islands of function in seas of non-function effect and the needle in haystack blind search challenge. Where, the first issue is to get from Darwin's pond to first working life with networks, communication systems, control systems, and the key von Neumann code using kinematic self-replication integrated with a functioning metabolic entity for mass and energy flows with proper waste disposal. All of which must be embryologically and ecologically feasible. Yes, irreducibly complex, functionally specific organisation and ever-growing complexity are all involved. As are issues on configuration spaces (direct -- wiring diagrams) and by way of descriptions to get complexity indices. That does bring to bear statistical issues, most evidently in the case of the origin based on chem and phys in that Darwin pond or the like environment. I find the notion of a vast, interconnected continent of functional forms dubious, though that seems to be implicit in the tree of life icon. The protein fold domain patterns by contrast immediately put that under question with a large number of isolated clusters in AA sequence space. Similarly, the idea of in effect a golden search is problematic. A search can be seen as a subset of a config space, so the set of possible searches would be the power set, cardinality 2^C for a space of size C. Thus, exponentially harder and harder in succession. Search for search is a real challenge. So, I come back to the focal issue, FSCO/I is an empirically reliable strong sign of design as cause. Including the case of comms systems and linked control systems. With codes involved. That is what needs to be faced. KF kairosfocus
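[As a quick numerical illustration of the power-set point just made (an editorial Python sketch; the chain lengths are arbitrary examples, not figures from the comment):]

```python
# Growth of a config space (2^n for n two-state elements) and of the
# corresponding set of possible searches, taken here, as in the comment
# above, to be the power set of the space: size 2^C for |space| = C.
import math

for n in [10, 100, 500, 1000]:
    C = 2 ** n                            # size of the config space
    # log10(2^C) = C * log10(2), computed without materialising 2^C:
    log10_power_set = C * math.log10(2)
    print(f"n = {n:>4}: |space| ~ 10^{n * math.log10(2):.0f}, "
          f"|searches| ~ 10^{log10_power_set:.3g}")
```

[For 1,000 bits the space is already ~10^301, while the set of possible searches over it is ~10^(3.2*10^300), which is the "exponentially harder and harder in succession" step referred to above.]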
KF, "The same that grounds the 2nd law and explains it." Here is my understanding of the issue. I do not claim to be an expert in this area so may be easily wrong. I think that 2nd law is fundamentally statistical. I believe that the argument from the second law is not as strong as the argument from semantics/semiotics (and, consequently, FCSI). 'Needle in the haystack' is just an illustration. Critics may argue that a natural physical process (allegedly yet unknown) that produced such and such structures may not have been ergodic (like evolution is not ergodic). Therefore it did not need to traverse the entire space of possibilities. To any argument based on probabilities they can always respond with the magic words 'fluctuation' or 'luck'. That is an unfortunate loophole for them :) But design detection need not absolutely depend on it. In my estimation, the way to shut them up (well, at least those of them who are intellectually honest) is consider that this fluctuation is indeed intelligently designed because (a) it is functional, (b) absolutely relies on irreducible complexity and (c) uses code (to the degree that the system that translates code is itself coded with the same code). Shannon information does not reflect any of that. It does not reflect utility. There may be more Shannon information in gibberish text than in a meaningful message. With all due respect to statistical considerations, the iron argument in favor of ID [again, in my personal estimation] is the irreducibility of the complex system {code+processor} that yields pragmatic (non-physical) utility/function. EugeneS
JC, on long track record and the thread above, I am not so sure. In any case, the distinction needs to be clearly put. KF kairosfocus
KF: ". Mung was being satirical, as per usual. KF" As, I believe, was Zachriel. Jonas Crump
Seversky, No one of consequence takes such a view. As in, compare Clausius' key example used to give the 2nd law in quantifiable form. Then ask, what is happening at micro level to the ways that we can arrange mass and energy, then ask how we get to particular arrangements that are FSCO/I rich in light of search space challenges, in highly contingent circumstances, as distinct from freezing and linked crystallisation per mechanical necessity. Mung was being satirical, as per usual. KF kairosfocus
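[For concreteness, Clausius' two-body example can be put in a few lines (a minimal sketch; the temperatures and heat quantity are assumed figures for illustration, not from the comment):]

```python
# Heat dQ flows from hot body A to cold body B; B gains more entropy
# than A loses, so total entropy rises: the 2nd law in Clausius'
# quantified form.
dQ = 100.0                     # J transferred (assumed)
T_hot, T_cold = 400.0, 300.0   # K (assumed)

dS_A = -dQ / T_hot             # entropy change of A
dS_B = +dQ / T_cold            # entropy change of B
print(f"dS_A = {dS_A:+.3f} J/K, dS_B = {dS_B:+.3f} J/K, "
      f"total = {dS_A + dS_B:+.3f} J/K")   # total = +0.083 J/K > 0
```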
Mung: The obvious conclusion is that to freeze water requires design. If you assume that entropy can never decrease locally absent design, then that would be the conclusion. Zachriel
Mung, at cosmological level, the cluster of properties of water is a strong indicator of cosmological design. But that is an aside. Water forms crystals by mechanical necessity, which is distinct from the high contingency involved in FSCO/I. KF kairosfocus
The obvious conclusion is that to freeze water requires design. Mung
GD This epitomises your problem:
You could accept that heat flow is sufficient to produce relevant types of information. If you go this way, your overall argument is, as far as I can see, dead.
To suggest such as an option at all means that you have patently overlooked the fact that we have a sol system and/or a wider cosmos of finite and limited resources: 10^57 or 10^80 atoms, 10^17 s, 10^12 - 14 atomic-level interactions per s. As a direct result, for just, say, the atomic paramagnetic substance or its coins equivalent (1000 2-state elements, flipped at 10^12 - 14 times/s, with one such set per atom as observer), in that duration the atomic resources of the observed cosmos would be able to search something like one straw to a haystack that dwarfs the whole observed cosmos, translating the 1.07*10^301 possibilities into haystack blind needle search terms. Jaynes' informational thermodynamics view is important in the context of insight into the nature of entropy as a phenomenon connected to the macro-micro distinction. When we ask what further info is typically required to specify the microstate given the macrostate, we are pointing to the degrees of freedom available to a system under given PVT etc conditions, and to the degree of uncertainty and search implied in trying to define a particular microstate. Trying to turn this about into a pushing of the notion that random noise of molecular agitation and/or heat flows linked thereto is sufficient to create information shows fundamental failure to understand what is involved in blind needle-in-haystack search under relevant conditions. Yes, in watching something freeze into a crystal (such as water), you do gain a fair degree of information about its condition, as it is transformed to a crystal with rather rigid structural restrictions, up to the usual issues on dislocations, inclusions, etc. That has everything to do with a transformation of mechanical necessity, and nothing to do with the constructive work required to produce aperiodic, functionally specific, contingent and complex key-lock fitting functional components. Where, I repeat, just one 300-monomer protein requires 300 * 4.32 bits/AA = 1296 bits as a baseline, though some redundancies etc are neglected in that first estimate. (That's where Durston's work on functional vs ground state etc comes in.) Let me again clip in fuller and augmented form that key couple of paragraphs from Wikipedia c April 2011, which here makes surprising concessions -- knowing its ideological tendency:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
The very fact that you thought to put such an option on the table, that heat transfer could create adequately functional information of sufficient degree, simply indicates to me that you have not adequately addressed the issue of blind search, multiplied by the common failure to differentiate between information holding or transmitting capacity -- what the Shannon metric measures -- and actual concrete functionally specific working configuration based information such as is commonly used in text, computation etc, and as is seen in R/DNA and proteins. The blind, needle in haystack search is inevitable on evolutionary materialist, blind chance and necessity based views. And, those who adhere to or go along with it typically overlook the deeply contextual functional specificity of working information and organisation of components per wiring diagram, node and arc networks. Where, functional specificity locked to arrangement of acceptable parts in the right orientation and arrangement, with appropriate coupling, deeply constrains acceptable configurations. The only actually (and routinely) observed source of such FSCO/I is not blind search in config spaces. It is intelligently directed, purposeful configuration, as you experienced in composing your post of objections. To achieve that, energy has to first be converted into orderly shaft or flow work through some conversion mechanisms or other, then, in accord with a purpose and/or plan, put to work in an appropriate step by step sequence of constructive operations, until the resultant functionally organised entity comes into being and is fine-tuned to specific acceptable form. This is the only empirically and reliably confirmed cause of FSCO/I, and it is backed up by the import of the needle in haystack search challenge as outlined above. That is the context in which complex specified information, and particularly functionally specific complex organisation and associated information, is a reliable sign of design as cause. To overturn that inference, simply show that blind chance and mechanical necessity actually produce such, beyond 500 - 1,000 bits. Where per a basic estimate, a unicellular life form ab initio needs 100 - 1,000 kbases or thereabouts, and a multicellular body plan needs 10 - 100+ mn or thereabouts on similar basic estimates and observations. At, as an initial number, 2 bits capacity per base. Those are orders of magnitude beyond the threshold composed using the toy example of a string of coins or a simple paramagnetic substance. Where, every additional bit doubles the scope of a config space. Your multi-lemma unfortunately set up a strawman target, which it duly skewered. But that is nothing to do with the actual issue on the table, to focus the importance of the micro-state, config space framework for analysis and to understand the search challenge implied in appeals to blind chance and mechanical necessity as performing relevant constructive work on the scope of the implied information required. Where of course, info storing or moving capacity is not quite the same as actual functionally specific complex organisation and/or associated information. KF kairosfocus
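[The haystack arithmetic above is easy to check (a sketch using the comment's own stated figures: 10^80 atoms, 10^17 s, and the generous upper rate of 10^14 observations per second per atom):]

```python
# Compare the number of atomic-scale "observations" available to the
# cosmos against the 2^1000 configs of 1,000 two-state elements.
from math import log10

atoms, seconds, rate = 10**80, 10**17, 10**14
samples = atoms * seconds * rate        # ~10^111 possible observations
space = 2**1000                         # ~1.07*10^301 configurations

print(f"space    ~ 10^{log10(space):.0f}")                    # ~10^301
print(f"samples  ~ 10^{log10(samples):.0f}")                  # ~10^111
print(f"fraction ~ 10^{log10(samples) - log10(space):.0f}")   # ~10^-190
```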
Kairosfocus, let me turn one of your questions back at you. Do you understand the statement:
In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer.
I don't think you do, because if you did you'd realize that to the extent it's relevant, it severely undercuts your argument. You disagree? Then consider a simple example: compare the entropy of liquid water vs that of ice, and compare their information content (using Jaynes' equivalence). The heat of fusion of water is 333.55 Joules/gram (79.72 calories/gram), and since it melts at 273.15 Kelvin (= 0 Celsius = 32 Fahrenheit), the entropy difference between a gram of ice and a gram of water (both at 273.15 K) is S_water - S_ice = 1 g * 333.55 J/g / 273.15 K = 1.221 J/K. Jaynes' equivalence gives a conversion ratio between thermodynamic and Shannon entropies of 1 bit = k_B * ln(2) = 9.570e-24 J/K (where k_B is the Boltzmann constant). With that conversion, the difference in Shannon entropy of the ice vs the water is 1.221 J/K / 9.570e-24 J/K = 1.276e23 bits. (If you didn't follow that, here's a more detailed derivation: applying Boltzmann's equation S = k_B * ln(w) gives us S_water - S_ice = k_B * (ln(w_water) - ln(w_ice)) = k_B * ln(w_water/w_ice), so w_water/w_ice = e ^ ((S_water - S_ice) / k_B) = e ^ (1.221 J/K / 1.381e-23 J/K) = e ^ 8.844e22 = 2 ^ 1.276e23. Since the water has 2 ^ 1.276e23 times as many possible microscopically distinct states, it would take an additional 1.276e23 bits of information to specify exactly which state it actually was in. The two routes agree up to rounding.) What this means is that if you watch a gram of water freeze, you gain 1.276e23 bits (roughly 16 zettabytes) of information about its precise physical state. I say "gain", but you haven't really "learned" anything in the usual sense. It's a bit like watching someone sort a shuffled deck of cards: afterward, you know a lot more about the order the cards are in, even though you didn't strictly "learn" anything. And that's for a single gram of water. Watching a kilogram (2.2 pounds) freeze would gain you 1.276e26 bits. Watching a lake freeze... would be a lot of information. There is an important difference between freezing water vs sorting cards, though: I assumed there was an intelligent agent (a human) sorting the cards, but water freezes simply due to heat flowing out of it. Actually, even the observer (i.e. "you" in my description above) isn't necessary, since the change is in the amount of information in the system's macroscopic state, and that is a property of the system itself, and doesn't depend on whether anyone's watching it. The point of all of this? If you accept Jaynes' view, then ridiculously huge gains of information can happen due to heat flow, with no intelligent agent, CSI, FSCI/O, or any such thing involved. As I see it, you have several possible avenues for responding to this challenge: - You could show that there's something seriously wrong with my analysis. Note that you'd need to find an actual problem with it, not just reject it because it doesn't match your views. Good luck with this. - You could reject Jaynes' view on the connection between thermodynamic and Shannon entropies; but since the equivalence works mathematically, and that seems to be a critical part of your argument... - You could accept the result, but claim that this type of information doesn't have any connection to the types of information (CSI, FSCI/O, etc) that you're concerned with. This is more-or-less the view that Zachriel has been pushing you toward, and the only one that makes sense to me. But again, that pretty much wrecks the link to thermodynamics that you've been pushing.
- You could accept that heat flow is sufficient to produce relevant types of information. If you go this way, your overall argument is, as far as I can see, dead. So what'll it be? Gordon Davisson
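[The arithmetic in the comment above is easy to reproduce (a minimal Python sketch; the heat of fusion and Boltzmann constant are standard handbook values, not figures unique to the comment):]

```python
# Entropy difference between a gram of water and a gram of ice at the
# melting point, converted to bits via Jaynes' equivalence.
import math

L_f = 333.55         # latent heat of fusion of water, J/g
T   = 273.15         # melting point, K
k_B = 1.380649e-23   # Boltzmann constant, J/K

dS   = L_f / T                      # ~1.221 J/K per gram
bits = dS / (k_B * math.log(2))     # 1 bit = k_B * ln(2) J/K

print(f"dS per gram = {dS:.4f} J/K = {bits:.3e} bits")  # ~1.276e23 bits
```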
Mung: Why do we value diamonds over brains? Do we now? Zachriel
How many possible arrangements of matter are there that form neither diamonds nor brains? Why do we value diamonds over brains? Mung
Z, it is clear that you are not attending to the substantial matter on the table. The record is there. And BTW, the issue of config spaces and linked states is prior to both eq and non-eq th-d. The basic difficulty is you are trying to get to a blind-search-infeasible result without an adequate means. KF kairosfocus
kairosfocus: enough has already been said Followed by more than 2700 words. If you can't answer simple questions, we'll just assume you can't, and leave you to your self-indulgence. Mung: You agree with me, but I am the one who “gets it.” We are in agreement! Entropy has to do with the number of distinguishable microstates. There are empirical methods for estimating standard entropy. Mung: Who freaking cares? Kairosfocus has expressed extended opinions about the relationship of "FSCO/I" and thermodynamic entropy. We wish to explore this question by asking him a few questions. Mung: Are you using equilibrium thermodynamics or non-equilibrium thermodynamics? Kairosfocus is making reference to equilibrium thermodynamics. To avoid the problem of ambiguity in measuring entropy for a system not in equilibrium, we suggested comparing an unused microprocessor sitting on a shelf and a like mass of diamond. Zachriel
PS: Just for reference, here, again, are Orgel and Wicken:
ORGEL: . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [--> this is of course equivalent to the string of yes/no questions required to specify the relevant "wiring diagram" for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002] One can see intuitively that many instructions are needed to specify a complex structure. [--> so if the q's to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [--> do once and repeat over and over in a loop . . . which can be implicit in the molecular spatial architecture] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196. Of course, that immediately highlights OOL, where the required self-replicating entity is part of what has to be explained, a notorious conundrum for advocates of evolutionary materialism; one that has led to mutual ruin documented by Shapiro and Orgel between metabolism first and genes first schools of thought. Behe would go on to point out that irreducibly complex structures are not credibly formed by incremental evolutionary processes and Menuge et al would bring up serious issues for the suggested exaptation alternative, cf. his five challenges. Finally, Dembski highlights that CSI comes in deeply isolated islands T in much larger configuration spaces W, for biological systems functional islands. That puts up serious questions for origin of dozens of body plans reasonably requiring some 10 - 100+ mn bases of fresh genetic information to account for cell types, tissues, organs and multiple coherently integrated systems. Wicken's remarks a few years later, as already cited, now take on fuller force in light of the further points from Orgel at pp. 190 and 196 . . . ] WICKEN: ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [i.e. “simple” force laws acting on objects starting from arbitrary and commonplace initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information.
It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65.]
kairosfocus
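[Orgel's "minimum number of instructions" idea can be illustrated crudely with a compressor (an editorial sketch; compressed size is only a rough stand-in for description length, and the strings are arbitrary examples):]

```python
# A periodic string compresses to almost nothing, random bytes barely
# compress at all, and English text lands in between, mirroring the
# order / organization / randomness distinction quoted above.
import os, zlib

ordered = b"AB" * 500            # crystal-like repetition
noise   = os.urandom(1000)       # random "mixture"
text    = (b"Living organisms are distinguished by their specified "
           b"complexity. Crystals fail to qualify as living because "
           b"they lack complexity; random mixtures of polymers fail "
           b"because they lack specificity.")

for name, s in [("ordered", ordered), ("random", noise), ("text", text)]:
    print(f"{name:8s}: {len(s):4d} bytes -> {len(zlib.compress(s, 9)):4d}")
```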
F/N: As a for record, I note on points in summary of things long since highlighted but ignored: >>Kairosfocus's position (we won't call it an argument)>> a --> Genetic fallacy, in teeth of evidence that the issues are far more than what some idiosyncratic bloggist says. In effect tantamount to a confession of intent not to engage in rational discussion but an exercise of distraction, distortion, denigration and dismissal >> is that the arrangements of matter that make up a biological structure are minuscule compared to possible arrangements, so are unlikely to occur by chance.>> b --> Neatly left out [and as highlighted since Orgel and Wicken], high contingency, aperiodic functionally specific organisation which, by contrast with many other accessible configs, will be deeply isolated. c --> There is a search and there is not a determination per mechanical necessity, which prevails in crystalline order, establishing unit cells and the repetition of such to give the macro-structure of the crystal. >> However, the arrangements of microstates that make up a diamond are even more minuscule compared to their possible arrangements.>> d --> Way up in this thread (and repeatedly in citation from Orgel and Wicken), it was pointed out that the crystalline, periodic structure is driven by the mechanical necessity of the circumstances of crystal formation. Diamonds, in fact, form under high pressure and temperature conditions and under our more common circumstances are metastable. e --> So, perhaps Z wishes to suggest that similar forces of mechanical necessity force the formation of say protein or DNA chains. The problem is, the chaining chemistry is in fact standard as bonding from the diverse monomer components, forming a backbone where any possible monomer may follow any other. Whether peptide or sugar-phosphate. As Z knows or should know . . . R/DNA chains (sugar-phosphate) and protein chains (peptide bond) are highly contingent. That is how R/DNA can carry a code, and it is how diverse proteins are based on the particular specific sequence, folding and function post folding. f --> Further to this, the key chemistry for function is on the side-chains, in effect at right angles to the chains. That is how on prong height, 3-letter codon sequences in an mRNA chain will accept relevant anticodon tRNAs, which in turn use a standard CCA tip to catch the particular AA from a "loading enzyme" . . . which is what imposes the code. This leads to loading of particular AAs in accord with a code. g --> Which code is not imposed by any mechanical necessity. Indeed, researchers have extended it. >> All this means in terms of thermodynamics is that entropy has to be exported at the expense of usable energy.>> h --> This is little more than anything can happen in an open system. i --> The issue is, that under relevant circumstances, just as with the coin chain or the paramagnetic substance binary models above, there is high contingency for D/RNA chains, and for AA chains. In effect, 4 possibilities per base and 20 possibilities per AA, respectively. This gives rise to a configuration space that expands at the exponential order of 4^n or 20^n respectively, n being chain length. j --> Under relevant circumstances for diamond formation, C-atoms are effectively equivalent, and there is no effective contingency; it is just the matter of forming the tetrahedral covalent bond structure. k --> It is in that context that there is for these molecules a very large space of possibilities that has in it islands of function.
For proteins in AA sequence space, due to folding, fitting with relevant other components etc to function. l --> For D/mRNA, the encoding of the functionally specific information from start/load methionine to extending to stop, with three different stop codes. I leave off the editing of mRNA before sending it to the ribosome, but that becomes important too. m --> In other cases, both biological and otherwise [e.g. the Abu 6500 3c reel], there are many relevant alternative arrangements of parts, only a very small fraction of which will be functionally relevant due to need to be correctly arranged and coupled per a wiring diagram as Wicken describes. n --> BTW, anyone notice how my repeated reference to Wicken -- functionally specific info rich organisation, wiring diagram arrangement -- and Orgel -- complex specified organisation with distinction from both random arrangements and crystalline order -- keeps on being studiously ignored in the rush to personalise and polarise, then dismiss? >> There's nothing about biological organisms, about human activity, or about evolution, that is contrary to the laws of thermodynamics.>> o --> The issue is not the "laws" -- which there is no dispute do apply -- but the underlying dynamics of config or phase spaces that give rise to the patterns of consistent behaviour/phenomena that may be summarised. p --> In short, the issue is being strawmannised. q --> The evidence and analysis per Clausius is, energy importation tends to increase entropy by increasing the number of ways that energy and mass at micro level may be arranged consistent with the macro conditions. On the informational view, the entropy metric can be seen as the further info required to specify microstate given macro conditions consistent with a set of possibilities. r --> The further evidence is, per massive observation [and that without exception], that functionally specific complex organisation is effected by direct intelligent configuration, or else by programs and/or wiring diagrams that are information rich and functionally specific. s --> Further to this, inflowing energy is typically converted into shaft or flow work through an energy converter, and this is then used in a controlled sequence of steps, to configure per the diverse possibilities as limited by the agent involved or the program and wiring diagram etc involved. t --> Waste energy is then ejected in some way, dissipating in a low temperature reservoir. In some cases, e.g. sweating, that is by way of evaporation, taking advantage of the work required to break up the bonds in water as a condensed polar molecular substance with high heat capacity. u --> So, the issue comes back to ordered vs random vs functionally organised sequence complexity, where discussion on strings is WLOG as strings can in principle be used to create the relevant wiring diagrams and programs. That is, there is a preservation of information involved -- which raises the onward question, per the config space issues as outlined, of the origin of such information that specifies arrangement per function. v --> Consistently, when we observe (on a trillion member base) it is intelligent direction that accounts for the origin. w --> While, the needle in haystack challenge facing blind mechanical necessity and/or blind chance is that the resources of sol system or the observed cosmos are soon exhausted while being unable to come up with a credible search, given that config spaces exponentiate per S^n for S-state per position chains of length n.
Hence, needle in haystack blind search challenge _____________ Now, nothing in the above outline is new, though there is expansion on some points for this thread -- little did I suspect that I would need to state the nature of AA and D/mRNA chains again and why such strings are made up from highly contingent chains capable of 4.32 or 2 bits per member of info carrying capacity on the direct estimation of the set of possibilities . . . as Shannon himself used in his original paper long since. It is amazing to see how, after years and years, objectors still seem to be unable to simply accurately describe the issue as we see it. Don't have to agree, just give a fair summary then provide good and cogent argument as to your own view. It is fair comment that there is no observationally confirmed means to create FSCO/I rich systems per blind chance and/or mechanical necessity only, but there is a trillion member observational basis that shows this to be accounted for by intelligently directed configuration. If you doubt, think about how the objecting comment you compose comes about, as a case of a string made up from 128-state elements in ASCII code, in turn tracing to 7-bit core elements, with augmentation. KF kairosfocus
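[The per-symbol capacity figures in the comment above follow directly from Shannon's log measure (a minimal sketch; the 300-monomer chain is the baseline example used above):]

```python
# Information-carrying capacity per symbol is log2(alphabet size);
# chains then scale linearly in length, so spaces grow as 4^n or 20^n.
import math

print(f"D/RNA base: log2(4)  = {math.log2(4):.2f} bits")    # 2.00
print(f"amino acid: log2(20) = {math.log2(20):.2f} bits")   # ~4.32

n = 300
print(f"300-AA chain: {n * math.log2(20):.1f} bits "        # ~1296.6
      f"(space of 20^300 ~ 10^{300 * math.log10(20):.0f} sequences)")
```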
Zachriel: Now you’re getting it! You agree with me, but I am the one who "gets it." Got it. Zachriel: Which has more available microstates, a brain or a like mass of diamonds? Who freaking cares? Are you using equilibrium thermodynamics or non-equilibrium thermodynamics? Does the statistical mechanical basis of thermodynamics change based on your whimsy? Mung
F/N: Sewell has aptly put the issue, never mind some odd points on terminology:
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur. The discovery that life on Earth developed through evolutionary "steps," coupled with the observation that mutations and natural selection -- like other natural forces -- can cause (minor) change, is widely accepted in the scientific world as proof that natural selection -- alone among all natural forces -- can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic. In a recent Mathematical Intelligencer article ["A Mathematician's View of Evolution," The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way.1 . . . . What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in "Can ANYTHING Happen in an Open System?", "order can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door.... If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth's atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here." Evolution is a movie running backward, that is what makes it special. THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn't, that atoms would rearrange themselves into spaceships and computers and TV sets . . .
KF kairosfocus
Z, enough has already been said, and when answers have been pointed out you have consistently ignored them, now pretending they were not given. The last time around was a few posts ago at 256; I am not going back to that loop ad nauseam. It is not to be overlooked that all these side tracks do not affect the force of the core needle in haystack in Darwin's pond issue but sure make ways to not discuss it. There is enough for those who are willing and nothing will ever be enough for the unwilling. KF PS: Memory was wrong, T & A was 2005: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1208958/pdf/1742-4682-2-29.pdf cf fig 4 esp for OSC, RSC, FSC, which directly relates to the relevant component of entropy. As in, with a repetitive unit block [esp. a single atom] there is little room for variability. Random patterns will give the highest Shannon info-carrying capacity values; in functional patterns, redundancies will reduce capacity per symbol. The link to further info to describe microstate should be plain from the resistance of a random string to compression. And discussion on strings is WLOG. PPS: It was long since pointed out but studiously ignored that, due to the needle in haystack challenge, FSCO/I-configuring work is consistently directed by direct intelligence or by programme; as is universally observed. Importing raw energy tends to make entropy rise. Open systems do not magically make probabilistic search challenge issues vanish. kairosfocus
Mung: Now you’re getting it! How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form brains? Now you're getting it! Which has more available microstates, a brain or a like mass of diamonds? Mung: So what? Ask kairosfocus. It's his position. Kairosfocus's position (we won't call it an argument) is that the arrangements of matter that make up a biological structure are minuscule compared to possible arrangements, so are unlikely to occur by chance. However, the arrangements of microstates that make up a diamond are even more minuscule compared to their possible arrangements. All this means in terms of thermodynamics is that entropy has to be exported at the expense of usable energy. There's nothing about biological organisms, about human activity, or about evolution, that is contrary to the laws of thermodynamics. Zachriel
Zachriel: How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form diamonds? Now you're getting it! How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form brains? Now, after having done the appropriate calculations, you have an entropy value for each. So what? Mung
kairosfocus: it is now clear that you are not attending to what I have said We have attempted to address your points, but you have made no effort whatsoever. kairosfocus: The same sort of reasoning that explains why per relative statistical weight, the bulk clusters utterly dominate thermodynamic outcomes, grounding the 2nd law. How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form diamonds? Zachriel
Z, it is now clear that you are not attending to what I have said, much less what Orgel and Wicken pointed out 40 years ago. (I will again point you to L K Nash's string of coins example and Mandl's paramagnetic substance of say 1,000 atoms with B-field alignment parallel/antiparallel, and the obvious resulting binomial distribution of possibilities. As this is directly parallel to a string of bits, it will readily be seen that it will have clusters of meaningful strings in its config space -- every possible 1,000 bit combination -- but the bulk of possibilities will be near 50-50 in no particular order. This toy example has much to teach; see the sketch after this comment.) I pointed to the underlying statistical view and analysis on config spaces (or more broadly phase spaces), which sets up how a commonly observed phenomenon, functionally specific, complex organisation, fits in as giving a case of identifiable and predictably small clusters relative to the space as a whole. That sets up the sort of needle in haystack circumstance that leads to there being a maximal unlikeliness of landing in such zones by blind chance and mechanical necessity. The same sort of reasoning that explains why, per relative statistical weight, the bulk clusters utterly dominate thermodynamic outcomes, grounding the 2nd law. Recall, for 1,000 bits worth of configs, the number of possible observations for 10^80 atoms, 10^17 s and 10^12 - 14 states/s per atom would only be able to sample as one straw to a haystack that would dwarf the observed cosmos. As to your ad nauseam demands for answers to questions where you have already ignored cogent answers to essentially the same question, your behaviour does not lead me to more than say what has already been pointed out: highly ordered states are low information, random states have highest info carrying capacity, organised states will be intermediate. And, the Jaynes point on the entropy being an index of additional information to specify microstate on being given the macro observable state obtains. This has no relevance to the already well established point that a blind needle in haystack search will be maximally unlikely to find FSCO/I clusters by chance and necessity. The point that it seems you cannot bring yourself to face. KF kairosfocus
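[The binomial dominance claim for the 1,000-coin toy model can be made exact (a minimal sketch; the 450-550 band and the +/-100 tail cutoff are arbitrary illustrations):]

```python
# Fraction of the 2^1000 coin sequences in the near-50/50 bulk versus
# the far tails, computed exactly with big-integer binomials.
from math import comb

N = 1000
total = 2 ** N

band = sum(comb(N, k) for k in range(450, 551))       # 450..550 heads
tail = 2 * sum(comb(N, k) for k in range(0, 401))     # <=400 or >=600

print(f"P(450 <= heads <= 550)  ~ {band / total:.4f}")   # ~0.999
print(f"P(|heads - 500| >= 100) ~ {tail / total:.2e}")   # ~3e-10
```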
Mung: A world without thermodynamics, that's your position? Not at all. The second law of thermodynamics says that overall thermodynamic entropy must increase. It says that if a snowflake forms from water vapor, which is a decrease in entropy, the entropy must be exported to the surroundings. When an organism converts food into protein, which is a decrease in entropy, the entropy must be exported to the surroundings. When a human refines metals to make a machine, which is a decrease in entropy, the entropy must be exported to the surroundings. kairosfocus: for over 100 years since Boltzmann, Maxwell and Gibbs et al, the underlying statistical analysis has been the core of thermodynamics, and particularly of the 2nd law. You didn't answer. If you are claiming that FSCO/I is part of thermodynamics, then please tell us which has lower thermodynamic entropy: an unused microprocessor on a shelf, or a diamond of like mass. If you can't, then it's clear that you can't substantiate your claims. Zachriel
Dr Selensky, pardon but no one is arguing that thermodynamics or its linked microstates picture and info holding capacity (i.e. Shannon info) directly points to design. Just, that analysis on configurational possibilities -- microstates -- is foundational and valid in thermo-d; with implications once we see the impact of the vast number of possibilities. The same that grounds the 2nd law and explains it. Beyond, it is being stated that there is as a matter of fact a valid approach from Jaynes et al, which identifies entropy as an index of the further info left to specify microstate once state has been defined at macro level; in effect comparable to accessible possibilities for distribution of mass and energy at micro level or to degrees of freedom compatible with a given macro level state. In that context, functionally organised clusters of states at micro level will come in islands of function deeply isolated in the config space of possibilities. So, blind chance and necessity transitions from arbitrary initial conditions will be maximally unlikely to find them due to the overwhelming statistical weight of dominant but non-functionally organised clusters and limited sol system or cosmos scale resources, the needle in haystack challenge. In typical stats terms, if you take a few samples you do not have a reasonable expectation to see the extreme far tail. And function is itself an observable. Under such circumstances the best causal explanation for complex, functionally organised states is design. That is an induction. KF kairosfocus
Zachriel's position re thermodynamics and ID is a rare case where I agree with him. Thermodynamics/Shannon information aspect does not yield the distinction between design and non-design. It is the semantics/semiotics level that does. EugeneS
Z, for over 100 years since Boltzmann, Maxwell and Gibbs et al, the underlying statistical analysis has been the core of thermodynamics, and particularly of the 2nd law. Your response is inadvertently revealing, especially after this has been pointed out to you over and over again. KF kairosfocus
Zachriel: So it’s not actually thermodynamics, but analogous to thermodynamics. You just don't get it. A world without thermodynamics, that's your position? No Zeroth Law. No First Law. No Second Law. Or perhaps isolated areas of the universe where they just don't hold. Is that your position? Mung
Ken M:
She enables and encourages open and fair discussion
No, she doesn't- not even close. Anyone who says that genetic algorithms demonstrate Darwinian evolution is either lost, deluded or dishonest. And Elizabeth and the TSZ ilk do just that. The point is GAs are search heuristics and Darwinian evolution isn't a search. Also GAs are actively searching for a solution to a specific problem and NS is passive and whatever is good enough gets to survive. GAs are the opposite of Darwinian evolution. IOW Elizabeth and the TSZ are about as dishonest and deceptive as it gets. Virgil Cain
kairosfocus: the foundational reasons for that lie in the same analysis of configurational possibilities that underlies statistical analysis at the heart of thermodynamics. So it's not actually thermodynamics, but analogous to thermodynamics. Is that your position? Zachriel
PPS: Nowadays, the cry "censorship" is commonly abused to try to intimidate those who say just as there is no right to cry fire in a crowded theatre with no good reason, there is no right to allow your dog to drag garbage unto your neighbour's front lawn. Censorship properly obtains when there is a power to restrict public communication and suppress civil dissent. In a day when 15 minutes of work to register or set up a blog -- and any number of fever swamp soap box sites are there -- there is no true censorship. But, clearly, there is a proper right to remove the garbage deposited on the front lawn, to build protective fences against further dragging, and to address the irresponsible neighbour. And, abusive commentary, cyberstalking, cyberbullying and the like are widely and correctly understood to be garbage dragged unto the front lawn. Here at UD, there have been significant objectors who, for years have made their case without dragging garbage on the front lawn. K-M, your schoolyard bully level behaviour and record on display above makes it clear that you are not one of such. If you keep on on this line, you will beyond reasonable doubt be shown the exit by UD's moderators, for cause. That should be a word to the wise. kairosfocus
PPS: Locke's alternative:
[2nd Treatise on Civil Gov't, Ch 2 sec. 5:] . . . if I cannot but wish to receive good, even as much at every man's hands, as any man can wish unto his own soul, how should I look to have any part of my desire herein satisfied, unless myself be careful to satisfy the like desire which is undoubtedly in other men . . . my desire, therefore, to be loved of my equals in Nature, as much as possible may be, imposeth upon me a natural duty of bearing to themward fully the like affection. From which relation of equality between ourselves and them that are as ourselves, what several rules and canons natural reason hath drawn for direction of life no man is ignorant . . . [This directly echoes St. Paul in Rom 2: "14 For when Gentiles, who do not have the law, by nature do what the law requires, they are a law to themselves, even though they do not have the law. 15 They show that the work of the law is written on their hearts, while their conscience also bears witness, and their conflicting thoughts accuse or even excuse them . . . " and 13: "9 For the commandments, “You shall not commit adultery, You shall not murder, You shall not steal, You shall not covet,” and any other commandment, are summed up in this word: “You shall love your neighbor as yourself.” 10 Love does no wrong to a neighbor; therefore love is the fulfilling of the law . . . " Hooker then continues, citing Aristotle in The Nicomachean Ethics, Bk 8:] as namely, That because we would take no harm, we must therefore do none; That since we would not be in any thing extremely dealt with, we must ourselves avoid all extremity in our dealings; That from all violence and wrong we are utterly to abstain, with such-like . . . ] [Eccl. Polity ,preface, Bk I, "ch." 8, p.80, cf. here. Emphasis added.] [Augmented citation, Locke, Second Treatise on Civil Government, Ch 2 Sect. 5. ]
kairosfocus
PS: Plato's warning in The Laws, Bk X:
Ath. . . .[The avant garde philosophers and poets, c. 360 BC] say that fire and water, and earth and air [i.e the classical "material" elements of the cosmos], all exist by nature and chance, and none of them by art . . . [such that] all that is in the heaven, as well as animals and all plants, and all the seasons come from these elements, not by the action of mind, as they say, or of any God, or from art, but as I was saying, by nature and chance only [ --> that is, evolutionary materialism is ancient and would trace all things to blind chance and mechanical necessity] . . . . [Thus, they hold] that the principles of justice have no existence at all in nature, but that mankind are always disputing about them and altering them; and that the alterations which are made by art and by law have no basis in nature, but are of authority for the moment and at the time at which they are made.-
[ --> Relativism, too, is not new; complete with its radical amorality rooted in a worldview that has no foundational IS that can ground OUGHT, leading to an effectively arbitrary foundation only for morality, ethics and law: accident of personal preference, the ebbs and flows of power politics, accidents of history and and the shifting sands of manipulated community opinion driven by "winds and waves of doctrine and the cunning craftiness of men in their deceitful scheming . . . " cf a video on Plato's parable of the cave; from the perspective of pondering who set up the manipulative shadow-shows, why.]
These, my friends, are the sayings of wise men, poets and prose writers, which find a way into the minds of youth. They are told by them that the highest right is might,
[ --> Evolutionary materialism -- having no IS that can properly ground OUGHT -- leads to the promotion of amorality on which the only basis for "OUGHT" is seen to be might (and manipulation: might in "spin") . . . ]
and in this way the young fall into impieties, under the idea that the Gods are not such as the law bids them imagine; and hence arise factions [ --> Evolutionary materialism-motivated amorality "naturally" leads to continual contentions and power struggles influenced by that amorality at the hands of ruthless power hungry nihilistic agendas], these philosophers inviting them to lead a true life according to nature, that is,to live in real dominion over others [ --> such amoral and/or nihilistic factions, if they gain power, "naturally" tend towards ruthless abuse and arbitrariness . . . they have not learned the habits nor accepted the principles of mutual respect, justice, fairness and keeping the civil peace of justice, so they will want to deceive, manipulate and crush -- as the consistent history of radical revolutions over the past 250 years so plainly shows again and again], and not in legal subjection to them.
kairosfocus
K-M: I will write just once, for record. You just proved what you are, a trollish agitator unable or unwilling to acknowledge a major and longstanding deep-seated problem of abusive behaviour by objectors to design theory; backed by enabling behaviour of the more genteel . . . some of whom come across as unwilling to confront the abusers because their feral fury will predictably turn on them if they do so. Let him who chooses to ride a tiger beware the consequences. The abuse I point to includes cyberstalking and on the ground stalking, as well as the now increasingly common alinskyite/ cultural marxist agit-prop tactics and linked nihilist, amoral factionalism and bigotry -- warned against 2350 years ago by Plato as consequences of the rise of evolutionary materialist ideologies -- that are wreaking havoc all over our civilisation. You need to be told that you are a part of the problem, not the solution . . . though it will be very hard for those caught up in such marches of folly to see beyond the shadow-shows of the Plato's cave world they inhabit. Our civilisation is sick unto death. (e.g. the warping of law and policy that has led to the mass slaughter of hundreds of millions in the womb under false colours of law is a chief symptom and secondary cause as mass blood guilt is utterly corrupting . . . ) And, those who would bring it down are so historically ignorant, have been so warped on what they imagine is history that they do not understand the consequences of the alternatives they propose -- as the ghosts of the over 100 million victims of the statist tyrannies of the past 100 years try to warn us. The lessons of sound history were bought with blood and tears. Those who dismiss, neglect or distort them doom themselves to pay much the same coin over and over again. March of folly. KF kairosfocus
Mung: "kf, sadly, Elizabeth doesn’t just enable, but encourages." I agree. She enables and encourages open and fair discussion without editing/deleting dissenting comments or banning commenters. I am sure that she appreciates that you would bring this message here. Maybe a little of it will rub off. Ken_M
Z, the foundational reasons for that lie in the same analysis of configurational possibilities that underlies statistical analysis at the heart of thermodynamics. It is also to be noted that relevant functional clusters are observably distinct from those that do not work, thus providing a proper basis for clustering. KF kairosfocus
kf, sadly, Elizabeth doesn't just enable, but encourages. Mung
kairosfocus: There is good reason to distinguish periodic low info order, information rich functionally specific organisation and randomness. Perhaps there is, but it has little or nothing to do with thermodynamics. Zachriel
PS: Configs in a space are microstates from a field of possibilities -- I long since pointed you to L K Nash's discussion of a string of coins and to the physical example in Mandl of a paramagnetic substance with N up/down to show its physical relevance. Of course, these were studiously ignored; the conclusion of dismissal seems to have been in hand all along: just find a rhetorically handy talking point and trot it out. In that context, functionally specific organisation constrains what works to deeply isolated islands in the space of possible configs. That is evident from, say, the FSCO/I in the text of your objections, or from what it takes to make a functional reel. Just so, go back to the tree of life at its root, with reproduction to be explained not used as a magic wand, in Darwin's pond or the like. The FSCO/I there needs to be accounted for on physics and chemistry, reflecting exactly the thermodynamics issues repeatedly highlighted. kairosfocus
Z, you need to refresh your thinking on the information and entropy connexion all the way back to the corrective in 76 above, not to mention Orgel and Wicken. There is good reason to distinguish periodic low info order, information rich functionally specific organisation and randomness. That has been repeatedly pointed out with reasons, and it has repeatedly been studiously ignored. And your attempt to personalise to me in the teeth of repeated highlighting of what Orgel and Wicken quite cogently pointed out from the 1970's, is also quite revealing. That's an Alinsky tactic, not discussion in good faith on the merits. Indeed, just now you attribute to me what is a citation. KF kairosfocus
kairosfocus: It is quite clear that it serves no purpose other than to conflate crystalline order and aperiodic, wiring diagram functional organisation, just like the previous distractor on snowflakes. Actually, you are the one who is conflating thermodynamic entropy with functional organization, as was clear when you brought up gases rushing to one side of the room, or when you attempted to discuss "predominant cluster of microstates". The number of possible orderings of, say, the circuits on a microchip is minuscule when compared to the available microstates of the atoms that make up the microchip. kairosfocus: thermodynamics should be seen as an application of Shannon’s information theory To available microstates, not your personal notions of organization. Zachriel
PS: I clip 231: >> i/l/o Boltzmann’s summary eqn that captures an essential point through a particular case of interest, S = k log W:
The question Z is trying to distract with reduces, in effect, to the point that a perfectly ordered repetitive pattern has near zero information. A flat random distribution has Shannon info capacity at max, and meaningful, aperiodic, functional strings that are constrained to be specific by function have a higher Shannon info capacity metric than a periodic crystal-like string but a lower one than random noise. And I answer in this dual form to bring out the issue that the three types of sequence are fundamentally different. Discussion on strings is WLOG. Also, Shannon info capacity is distinct from functionally specific info strings.
as well as:
in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer.
And:
in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.
Where, as entropy is a state function, it is in order to draw out key components.>> kairosfocus
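The Lewis "yes/no questions" reading of reduced Gibbs entropy quoted above is easy to make concrete with L K Nash's coin-string example, already cited in this thread. A minimal Python sketch, illustrative only: the macrostate is "k heads out of N coins", and the microstates consistent with it number C(N, k).

```python
# Minimal sketch: "minimum number of yes/no questions needed to pin down
# the microstate, given the macrostate", for a string of N coins whose
# macrostate is its number of heads k. Consistent microstates: C(N, k).
from math import comb, log2

N = 1000
for k in (0, 100, 500):
    W = comb(N, k)                      # microstates under this macrostate
    print(f"{k:4d} heads of {N}: ~{log2(W):.1f} yes/no questions")
# Rises from 0 (k = 0 fixes every coin) to ~995 near k = N/2, where the
# count of consistent microstates is largest.
```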
Z, FYI you are not the only relevant objector here or elsewhere (as Mung has taken time to clip . . . ). There is a longstanding and widespread problem all the way up to stalking on the ground not just in cyberspace (and yes, that is outright criminal activity and has been brought to police attention), and it needs to be addressed. On the distractor you have kept on pushing, it is strictly irrelevant to the central matter, and has in fact been answered in its dual, info carrying capacity form, cf 200 on above. It is quite clear that it serves no purpose other than to conflate crystalline order and aperiodic, wiring diagram functional organisation, just like the previous distractor on snowflakes. The pivot is, config spaces, islands of function, needle in haystack challenge, as a result of which FSCO/I is properly regarded per abductive inference to best explanation as a reliable empirically grounded sign of design. KF kairosfocus
kairosfocus: there is never an excuse for hate or want of broughtupcy It's not hateful or impolite to ask you a question about thermodynamics, especially when you are making claims about thermodynamics. In our experience, people who are into science, usually like to talk about science. If you ask a physicist why the Earth's surface is warmer than expected of a blackbody, most would be happy to provide an answer. In any case, diamonds have very low thermodynamic entropy. As a rule, substances made up of complex molecules have a higher entropy. Zachriel
Mung, there is never an excuse for hate or want of broughtupcy; but we are likely seeing the recycling of one of the all too familiar obsessed characters who hang around in UD's penumbra of attack sites. That said, let us never underestimate that it is easier to distract, distort, denigrate rather than address the issue, as the above thread shows. Which is the clue we need to attend to: if there were a simple slam dunk answer to the force of the design inference, it would have been triumphantly trotted out long since. KF kairosfocus
Boy kf, they sure do love to hate you. How do you do it? :) Mung
Ken_M:
Hi Mung. I have a simple question. How come anyone who questions ID is labeled a troll? I have made a total of three comments (four including this one) and I am a troll?
Well, let's see now... Ken_M:
Are you too cowardly to answer it, or simply incapable of doing so? I can answer it for you, if you want. Or are you terrified of the idea of your followers finding out that you don’t know all the answers?
That's a funny way to question ID. Looks more like trolling, imo. Ken_M:
If you don’t know the answer, man up and admit it. There is no shame. To be honest (which is a personal characteristic you have failed to show) I don’t know the answer. I really want to hear the answer. Do you?
Another funny way to question ID. Looks more like trolling. Ken_M:
How come you always bring up this nonsense whenever your opinions are questioned? This is identifying BS when it is observed. Or you can finally admit that you didn’t know what you were talking about.
Another funny way to question ID. Looks more like trolling. Ken_M: I can answer it for you, if you want. Ken_M: I don’t know the answer. I really want to hear the answer. Troll. Mung
K_M: Why is it that so many objectors to design theory routinely -- even, habitually -- resort to red herrings, led away to strawmen caricatures soaked in ad hominems and set alight that cloud, confuse, poison and polarise the atmosphere? Why do such so often resort to "you hit back first" when that fact is pointed out, and identified as a fallacy of distraction, distortion, denigration and polarisation? Or to repeated doubling down in the face of due correction? Other than, the extraordinarily intense, emotionally over-wrought hostility and bigotry that those who object to their ideology "must be" ignorant, stupid, insane or wicked? (And, to date, Dawkins has never taken those words back. Indeed, he has multiplied them over and over again.) If your ilk actually had the goods, simply showing the substance would be unanswerable and that would be clear. It is no coincidence, then, to see your failure to state, summarise or link those goods in response to direct challenge to bring them forth. The persistent lack of those goods speaks volumes on the true balance on merits, as does unresponsiveness when relevant facts are pointed out. FYI, there is no right to be abusive as -- per fair comment in the teeth of uncivil behaviour -- you have consistently been since surfacing overnight. Trolls is as trolls does, in short -- and this attaches to the behaviour of some (too many) objectors to ID who seem to assume they have a right to be ill-behaved. And FYFI, the thread has developed on the informational import of statistical thermodynamics in response to an out of context misleading quotation and an attempt or two to conflate crystallographic order -- snowflakes, diamonds -- with functionally specific complex organisation and associated information, spelling out a descriptive phrase that you attempted to twist into a schoolyard bully style taunt. Game over. KF PS: Your false assertion of victory in the face of an answer that uses the informational connexion to draw out configurational components of entropy, speaks volumes. I repeat, i/l/o Boltzmann's summary eqn that captures an essential point through a particular case of interest, S = k log W:
The question Z is trying to distract with reduces, in effect, to the point that a perfectly ordered repetitive pattern has near zero information. A flat random distribution has Shannon info capacity at max, and meaningful, aperiodic, functional strings that are constrained to be specific by function have a higher Shannon info capacity metric than a periodic crystal-like string but a lower one than random noise. And I answer in this dual form to bring out the issue that the three types of sequence are fundamentally different. Discussion on strings is WLOG. Also, Shannon info capacity is distinct from functionally specific info strings.
as well as:
in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer.
And:
in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.
Where, as entropy is a state function, it is in order to draw out key components. In short, you just proved to one and all that the merits were never the issue. kairosfocus
Hi Mung. I have a simple question. How come anyone who questions ID is labeled a troll? I have made a total of three comments (four including this one) and I am a troll? I don't think that is a large enough sample to draw that conclusion. But you are the scientist.... Ken_M
Troll: Now who is distracting from the issue? LoL!!! Mung
KF: "this is now schoolyard level name twisting, a case of doubling down to carry on with atmosphere poisoning. Backed up by brazen false assertion.... Blah, blah, blah. How come you always bring up this nonsense whenever your opinions are questioned? You were the one who brought up the statistical aspect of thermodynamics as proof of ID and then was not capable of answering the simplest question about it. This is not distraction. This is identifying BS when it is observed. But you can prove us wrong by simply answering Z's question: "which has lower entropy, a brain or a like mass of diamonds?" Or you can finally admit that you didn't know what you were talking about. Have a nice evening. Ken_M
K-M, this is now schoolyard level name twisting, a case of doubling down to carry on with atmosphere poisoning. Backed up by brazen false assertion. Doubling down that inadvertently testifies that you have no answer to the main issue and are desperate to distract attention and poison the atmosphere. In fact, the reality of functionally specific complex organisation and associated information is manifest from the very text in your comments to the circuits that implement the PC you likely used to compose, to the software, to the Internet that put the comments up on this blog. The petroleum refinery that made the plastic, the process-flow systems and reaction set, my favourite ABU 6500 C3 fishing reel, and of course the coded strings in D/RNA, the metabolic reaction set in the living cell, proteins, functional organisation of the cell and its organelles such as ribosomes, organs, tissues and networks in body plans are further patent cases in point. Orgel and Wicken clearly identified it in the 70's and in the 80's Thaxton et al built on that. It is the biologically relevant functionally specific subset of CSI. From the 90's Dembski, Meyer and Durston et al have spoken to it. It is readily quantifiable and complexity is easily shown by comparing a 500 - 1000 bit threshold that specifies a level where blind needle in haystack search on the gamut of the sol system or observed cosmos, loses all credibility as a feasible means by which blind chance and/or mechanical necessity could credibly account for functionally specific organisation. It is readily recognised and quantified much as AutoCAD quantifies the self-same FSCO/I in engineering drawings -- Wicken's wiring diagram functionality based on node-arc networks that are functionally specific, info rich and complex. There are trillions of cases in point where we pretty much know the consistent, reliable cause: intelligently, purposefully directed configuration. The very resort to tactics of obfuscation, sneering, schoolyard bullyboyism, dismissal, distraction, atmosphere poisoning etc tells us loud and clear that the former campaigns to find counter examples (including canals on mars or the like) have so demonstrably come up dry that other, more sinister tactics are being resorted to. Per inference to best, empirically grounded, needle in haystack blind search challenge backed explanation, FSCO/I is a reliable sign of design as cause. That points to design as best explanation for OOL, and of body plans up to our own. Which is just what the evolutionary materialist scientism dominated establishment does not want anyone to consider. And ever so many of the indoctrinated are all too eager to enable that imposition. Game over. KF PS: Islands of function deeply isolated in very large config spaces and the information-stat mech picture. Including that all this started when someone artfully clipped a cite from the Wiki info thermod article that by suppressing the following paragraph twisted what has been publicly admitted against known interest for four years to my certain knowledge. PPS: Onlookers, cf 76 on above, responding to Aleta's misleading truncation: https://uncommondesc.wpengine.com/intelligent-design/but-is-this-fair-to-feynman/#comment-593521 kairosfocus
KF: "K-M: In fact, I was repeatedly pointing out the focal issue being distracted from; even noting that the matters are quite distinct so we can address the pivotal issue of FSCO/I and its cause..." Now who is distracting from the issue? FIASCO was never the central issue. You brought it up to distract from the original issue. Would you like me to remind you about it? Ken_M
K-M: In fact, I was repeatedly pointing out the focal issue being distracted from; even noting that the matters are quite distinct so we can address the pivotal issue of FSCO/I and its cause -- which it is obvious that you too are dodging -- even if we do not know the answer to a question about the entropy of a crystal of mass equal to that of a functionally organised entity. This was of course studiously ignored. You will further see that I just added a PS above that addresses the question in its dual, info form -- and that has been noted by Abel et al since c 2009. Orgel and Wicken point to it 40 years past. It turns out in short that the answer is of long standing, even in the thread, from 40 years ago. The issues of configuration and functionally specific organisation need to be faced, not obfuscated. Nor does now trying to question my honesty add anything positive. In short we see the all too common trifecta fallacy of distract, distort, denigrate in action. KF kairosfocus
KF: "K-M: That is a pile on tactic. Why? Because you can't respond to it? Z has been asking a question (repeatedly, and repeatedly ignored) that you have refused to respond to. And it is relevant to the subject at hand. If you don't know the answer, man up and admit it. There is no shame. To be honest (which is a personal characteristic you have failed to show) I don't know the answer. I really want to hear the answer. Do you? Ken_M
K-M: That is a pile on tactic. I showed that the question being asked is irrelevant to the pivotal issue and so is a red herring led out to a strawman. Piling on by pouring ad hominems in and igniting to cloud, confuse and poison and polarise the atmosphere for discussion by making unwarranted accusations of cowardice simply tells us you are an enemy of the material truth or an enabler of such. The pivotal issue is config spaces and FSCO/I as definable clusters in such, how can we credibly get to same? Blind chance and necessity are not credible on needle in haystack, and FSCO/I routinely comes about by design -- trillions of cases in point. Indeed, it is a reliable sign of design. Now, look at cases where we do not have direct observation of causal process. It's not an orderly repetitious crystal like diamonds or salt. It's not a random tar or cluster of mineral particles, it is a wiring diagram organised, complex, functionally specific system. Now, what is the most reasonable causal explanation, why? (And BTW that is part of why the question is misdirected.) KF PS: The question Z is trying to distract with reduces, in effect, to the point that a perfectly ordered repetitive pattern has near zero information. A flat random distribution has Shannon info capacity at max, and meaningful, aperiodic, functional strings that are constrained to be specific by function have a higher Shannon info capacity metric than a periodic crystal-like string but a lower one than random noise. And I answer in this dual form to bring out the issue that the three types of sequence are fundamentally different. Discussion on strings is WLOG. Also, Shannon info capacity is distinct from functionally specific info strings. kairosfocus
KF: What is being studiously avoided and diverted from..." IS Z's QUESTION. Are you too cowardly to answer it, or simply incapable of doing so? I can answer it for you, if you want. Or are you terrified of the idea of your followers finding out that you don't know all the answers? Ken_M
kairosfocus: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. So do that with an unused microprocessor sitting on a shelf and a diamond of equivalent mass. Which has lower entropy? Zachriel
F/N: What is being studiously avoided and diverted from, from 200 above, as was addressed to Z: >>5: What part of the following clip from the Wiki article [on info thermo-d and its bridge to classic thermodynamics considerations] do you not understand? in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) 6: What part of this clip from Orgel, 1973, is it that you do not understand? . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [–> this is of course equivalent to the string of yes/no questions required to specify the relevant “wiring diagram” for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002.] One can see intuitively that many instructions are needed to specify a complex structure. [–> so if the q’s to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [–> do once and repeat over and over in a loop . . . ] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196. Of course, that immediately highlights OOL, where the required self-replicating entity is part of what has to be explained (cf. Paley here), a notorious conundrum for advocates of evolutionary materialism; one, that has led to mutual ruin documented by Shapiro and Orgel between metabolism first and genes first schools of thought, cf here. 
Behe would go on to point out that irreducibly complex structures are not credibly formed by incremental evolutionary processes and Menuge et al would bring up serious issues for the suggested exaptation alternative, cf. his challenges C1 – 5 in the just linked. Finally, Dembski highlights that CSI comes in deeply isolated islands T in much larger configuration spaces W, for biological systems functional islands. That puts up serious questions for origin of dozens of body plans reasonably requiring some 10 – 100+ mn bases of fresh genetic information to account for cell types, tissues, organs and multiple coherently integrated systems. Wicken’s remarks a few years later as already were cited now take on fuller force in light of the further points from Orgel at pp. 190 and 196 . . . ] 7: Likewise, what part of this from Wicken, do you not understand? ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and common- place initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [[“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65.] 8: What part of, these are not design advocates but are frankly facing the issues at stake from OOL up through OO body Plans to OO the human one, do you not understand? 9: What part of, 10^80 atoms of the observed cosmos, acting as observers every 10^-14 s, for 10^17 s would be unable to blindly sample a fraction of the space for 1,000 coins or bits, larger than one straw to a haystack that dwarfs the said observed cosmos, do you not understand? 10: What part of, the search of a space would be a subset so that the set of possible searches is the power set of cardinality 2^C, C being that of the original set, rendering the search for a golden search much harder than the direct search, do you not understand? 11: What part of, on trillions of observed cases and the linked needle in haystack, blind search challenge, the only reliable explanation of FSCO/I is design, do you not understand? 12: Which part of, FSCO/I is thus an inductively reliable sign of design do you not understand? 13: Which part of the search of config spaces and blind transitions among such is at the heart of stat mech thinking, do you not understand?>> kairosfocus
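The arithmetic behind point 9 can be checked in a few lines. A minimal sketch, assuming only the figures quoted above (10^80 atoms, 10^14 inspections per second, 10^17 seconds, a 1,000-bit space):

```python
# Sketch of the needle-in-haystack arithmetic of point 9: what fraction of
# a 1,000-coin/bit configuration space could the quoted resources sample?
from math import log10

samples = 10**80 * 10**14 * 10**17   # atoms x tries per second x seconds
space = 2**1000                      # ~1.07e301 configurations
print(f"samples  ~ 10^{log10(samples):.0f}")                 # ~10^111
print(f"space    ~ 10^{log10(space):.0f}")                   # ~10^301
print(f"fraction ~ 10^{log10(samples) - log10(space):.0f}")  # ~10^-190
```

On those figures the sampled fraction is about one part in 10^190, which is the force of the one-straw-to-a-haystack comparison.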
kairosfocus: what part of being a commonly observed phenomenon, which per a like consequence of the underlying statistics that pose a needle in haystack challenge becomes maximally unlikely to be observed on blind chance and/or mechanical necessity We both agree it takes energy to rearrange matter into brains or diamonds. Mung: Is my brain at or near equilibrium? Your brain in particular? {Just kidding!} The typical brain is not at equilibrium, which brings us a bit closer to the point. There's no unambiguous measure of entropy for non-equilibrium systems, though it can be defined for the whole macroscopic system. Let's try a different example. Which has lower entropy, an unused microprocessor sitting on a shelf or a diamond of like mass? Zachriel
Mung: Is equilibrium thermodynamics the same as non-equilibrium thermodynamics? Zachriel: No. The difference is time. Is my brain at or near equilibrium? Mung
Z, a third time -- what part of being a commonly observed phenomenon, which per a like consequence of the underlying statistics that pose a needle in haystack challenge becomes maximally unlikely to be observed on blind chance and/or mechanical necessity (as are macro-fluctuations that would violate the second law . . . ), is it that is so hard for objectors of your ilk to acknowledge? Or is it, that FSCO/I’s origin is readily observed: not blind chance and mechanical necessity on the gamut of the observed cosmos — maximally implausible — but intelligently and purposefully directed configuration . . . so that we have an epistemic right to infer on best explanation that it is an empirically reliable sign of design as cause. KF PS: I am discussing the statistics of configuration spaces, which is foundational to how we get to the eqns. kairosfocus
Mung: Is equilibrium thermodynamics the same as non-equilibrium thermodynamics? No. The difference is time. Notably, kairosfocus's comments concern equilibrium thermodynamics. Zachriel
...if you are trying to change its PoV as nothing can do that. Are you saying its brain has hardened into diamond? Mung
Is equilibrium thermodynamics the same as non-equilibrium thermodynamics? Mung
kairosfocus- If you need the practice then continue. I am just saying attempting a discussion with Zachriel is useless if you are trying to change its PoV as nothing can do that. Virgil Cain
kairosfocus: Here I am trying to draw his attention to the material issue that he would duck by pretending that a diamond crystal is comparable to the functionally specific organisation of a brain. Never said or thought such a thing. kairosfocus: the issue is to account for the FSCO/I rich heat engines and/or energy conversion devices that carry out key metabolic processes in the cell. FSCO/I is not found in thermodynamic equations. From the viewpoint of thermodynamics, energy is required to rearrange atoms into brains or diamonds. The question then is which has lower entropy, a brain or a like mass of diamonds. So, do you give up? Zachriel
Z, what part of being a commonly observed phenomenon, which per a like consequence of the underlying statistics that pose a needle in haystack challenge becomes maximally unlikely to be observed on blind chance and/or mechanical necessity (as are macro-fluctuations that would violate the second law . . . ), is it that is so hard for objectors of your ilk to acknowledge? Or is it, that FSCO/I’s origin is readily observed: not blind chance and mechanical necessity on the gamut of the observed cosmos — maximally implausible — but intelligently and purposefully directed configuration . . . so that we have an epistemic right to infer on best explanation that it is an empirically reliable sign of design as cause. KF PS: VC, you have a point. Here I am trying to draw his attention to the material issue that he would duck by pretending that a diamond crystal is comparable to the functionally specific organisation of a brain. As in handy side-track. Pretend that I have no answer to the entropy metric for a brain vs a crystal of similar mass, but Z et al do -- crystals are ordered and relatively low entropy; tissues are functionally organised and vary a fair amount but meet fairly hard limits as to what is workable. Does this change the stat underpinnings and needle in haystack implications coming from stat mech that ground FSCO/I as a reliable sign of design -- no. So, it is irrelevant but trotting out a distraction is all they got. When all you've got is a hammer, everything must be a nail. That is how desperate the objectors now seem to be. kairosfocus
We recommend that everyone ignore Zachriel. Trying to reason with it is useless and causes distress. Virgil Cain
kairosfocus, Sorry you are unable to answer the question. Zachriel
Z, what part of being a commonly observed phenomenon, which per a like consequence of the underlying statistics that pose a needle in haystack challenge becomes maximally unlikely to be observed on blind chance and/or mechanical necessity (as are macro-fluctuations that would violate the second law . . . ), is it that is so hard for objectors of your ilk to acknowledge? Or is it, that FSCO/I's origin is readily observed: not blind chance and mechanical necessity on the gamut of the observed cosmos -- maximally implausible -- but intelligently and purposefully directed configuration . . . so that we have an epistemic right to infer on best explanation that it is an empirically reliable sign of design as cause. KF kairosfocus
kairosfocus: The upshot of that relevant issue is that on stat considerations, FSCO/I is not plausibly the product of blind chance and necessity FSCO/I is not in the laws of thermodynamics. Which has lower thermodynamic entropy, a brain or a like mass of diamond? Zachriel
Z, when I again and again took time to point out the way stat thermo-d relates . . . cf 200 above most recently, you repeatedly ignored and reverted to distractions, irrelevancies (including those warned against by Orgel and Wicken, of 40 years standing: the mechanically necessary ordering of crystals has nothing to do with wiring diagram, functionally specific info-rich organisation of significant complexity . . . ) -- and dismissal. That tells us all we need to know. The upshot of that relevant issue is that on stat considerations, FSCO/I is not plausibly the product of blind chance and necessity, for reasons connected to needle in haystack search. As in, relative statistical weight of clusters of microstates in a config or phase space, relative to accessible resources to take up configs. Inductive evidence on a trillion member base shows FSCO/I a reliable sign of design as cause. From OOL to OO body plans to OO our own, we have huge FSCO/I to address, strongly pointing to design. That has been pointed out repeatedly for years and has met with rhetorical stunts such as you are attempting, revealing again what is really likely to be going on: dismissive conclusion is in hand, just find a talking point to trot it out and expect widespread indoctrination in evolutionary materialism to carry the rhetorical day. At this stage, I am in effect writing for record. KF kairosfocus
kairosfocus: you have put up more dismissive and distractive rhetoric relative to the substantial matter. Not at all. You said it was a matter of statistical thermodynamics. Which has lower thermodynamic entropy, a brain or a like mass of diamond? Zachriel
Z, you have put up more dismissive and distractive rhetoric relative to the substantial matter. Notice what Orgel and Wicken cautioned. KF kairosfocus
kairosfocus: 1 ... 18 Eighteen bullet points and nearly 1500 words, yet did you answer? Which has lower thermodynamic entropy, a brain or a like mass of diamond? We even gave you a hint @194. Zachriel
Mung, For excellent reason. KF kairosfocus
I understand that I do not have the kind of blind faith required to believe the materialist origins story. Mung
Z, 1: What part of a string of N-elements * -* -*- . . . * in S states defines a config space of scope S^N do you not understand, e.g. 2^500 = 3.27*10^150, 2^1000 = 1.07*10^301? (And if you don't, after so many years of ID objecting, what does that say of your collective and the circle of objector sites?) 2: Likewise, what part of anything reducible to a nodes-arcs wiring diagram that defines specific functional organisation (e.g. an exploded view assembly diagram for an ABU 6500 reel) can readily be reduced to a description language chaining Y/N elements such as AutoCAD etc do, do you not understand? 3: What part of, this means description on specific strings gives an index of complexity for FSCO/I and is WLOG, do you not understand? 4: What part of this gives a representation of a config space do you not understand, and if you don't what part of get thee a copy of LK Nash and his discussion of H/T coins as first example for stat mech do you not understand? (And in case you imagine this is unrepresentative consider a paramagnetic substance of 500 or 1,000 atoms in an array per Mandl, as an example. Start 1,000 such in a random state and allow to evolve at random 10^14 times/s for 10^17 s, what are the odds that we would plausibly come up with a code for say a functional protein of 232 AAs or lines in a program that must function or ascii code for a sentence in English of 143 characters etc? ) 5: What part of the following clip from the Wiki article do you not understand?
in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>])
6: What part of this clip from Orgel, 1973, is it that you do not understand?
. . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [--> this is of course equivalent to the string of yes/no questions required to specify the relevant "wiring diagram" for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002.] One can see intuitively that many instructions are needed to specify a complex structure. [--> so if the q's to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [--> do once and repeat over and over in a loop . . . ] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196. Of course, that immediately highlights OOL, where the required self-replicating entity is part of what has to be explained (cf. Paley here), a notorious conundrum for advocates of evolutionary materialism; one, that has led to mutual ruin documented by Shapiro and Orgel between metabolism first and genes first schools of thought, cf here. Behe would go on to point out that irreducibly complex structures are not credibly formed by incremental evolutionary processes and Menuge et al would bring up serious issues for the suggested exaptation alternative, cf. his challenges C1 - 5 in the just linked. Finally, Dembski highlights that CSI comes in deeply isolated islands T in much larger configuration spaces W, for biological systems functional islands. That puts up serious questions for origin of dozens of body plans reasonably requiring some 10 - 100+ mn bases of fresh genetic information to account for cell types, tissues, organs and multiple coherently integrated systems. Wicken's remarks a few years later as already were cited now take on fuller force in light of the further points from Orgel at pp. 190 and 196 . . . ]
7: Likewise, what part of this from Wicken, do you not understand?
‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and common- place initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [[“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65.]
8: What part of, these are not design advocates but are frankly facing the issues at stake from OOL up through OO body Plans to OO the human one, do you not understand? 9: What part of, 10^80 atoms of the observed cosmos, acting as observers every 10^-14 s, for 10^17 s would be unable to blindly sample a fraction of the space for 1,000 coins or bits, larger than one straw to a haystack that dwarfs the said observed cosmos, do you not understand? 10: What part of, the search of a space would be a subset so that the set of possible searches is the power set of cardinality 2^C, C being that of the original set, rendering the search for a golden search much harder than the direct search, do you not understand? 11: What part of, on trillions of observed cases and the linked needle in haystack, blind search challenge, the only reliable explanation of FSCO/I is design, do you not understand? 12: Which part of, FSCO/I is thus an inductively reliable sign of design do you not understand? 13: Which part of the search of config spaces and blind transitions among such is at the heart of stat mech thinking, do you not understand? 14: Which part of, it is now a commonplace that there are thousands of protein fold clusters in AA sequence space, where a great many are of but few members and are deeply isolated, raising search space challenges, do you not understand? 15: Which part of, fold and fit of coded, assembled AA sequence strings, are key parts of typical protein function do you not understand? (where 20^300 for a typical protein gives a config space of 2.04*10^390 ) 16: Which part of, on the above no effective blind search of the space required to form proteins to make a first living cell by sudden or gradual means on the gamut of the observed cosmos is credible, do you not understand? 17: Which part of, there is no sound conflation of crystal formation and relevant protein assembly based on coded strings, do you not understand? (Or for that matter, formation in a Darwin's pond etc.) 18: Which part of, the search problem is compounded hopelessly when we see that new body plans require 10 - 100+mn bases and embryologically and ecologically feasible system composition, do you not understand? Including, our own. KF kairosfocus
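Point 10 above, the "search for a search" claim, can be sketched the same way: if a search is identified with the subset of configurations it may sample, a space of cardinality C admits 2^C candidate searches. A toy illustration (numbers assumed, per the 500-bit threshold used above):

```python
# Sketch of point 10: subsets of a config space of cardinality C number
# 2^C, so a blind search for a well-matched ("golden") search confronts a
# vastly larger space than the direct search does.
from math import log10

C = 2**500                                   # configs in a 500-bit space
print(f"direct space:   ~10^{log10(C):.0f} configs")
print(f"searches on it: 2^(10^{log10(C):.0f}) subsets")  # 2^C, hopelessly larger
```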
Zachriel: So you can’t. That’s fine. We recommend readers ignore your comments about statistical thermodynamics then. Mung: That bit of [il]logic hardly follows. Zachriel: Kairosfocus claims that statistical thermodynamics supports his claims concerning biological structures. Even if this is true, and even if statistical thermodynamics does not support his claims concerning biological structures, it does not follow that readers ought to ignore his comments about statistical thermodynamics. Mung
Regular strings have low complexity and high specificity. Random strings have high complexity and zero specificity. Neither type of string is good at describing functional structures. Snowflakes, dunes, convection patterns and such like are regular non-functional structures. Function is a telic phenomenon that cannot be adequately described merely in terms of thermodynamics. On the axis from order to chaos, complex function (e.g. biological function) is closer to chaos. But that is only a projection. http://www.tbiomed.com/content/4/1/47 EugeneS
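EugeneS's three-way split can be given a numeric face. A toy Python sketch (the sample strings are invented for illustration): empirical per-symbol Shannon entropy separates the three types on the capacity axis while, as he says, capturing nothing about function.

```python
# Toy sketch: empirical per-symbol Shannon entropy H = -sum p*log2(p) for
# periodic, functional (English-like) and random strings. Expected order:
# periodic lowest, random highest, functional in between; none of these
# numbers says anything about whether a string does a job.
import random
from collections import Counter
from math import log2

def H(s):
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

periodic = "ab" * 500
functional = ("this sentence is specific and functional " * 25)[:1000]
alphabet = "abcdefghijklmnopqrstuvwxyz "
rnd = "".join(random.choice(alphabet) for _ in range(1000))

for name, s in [("periodic", periodic), ("functional", functional), ("random", rnd)]:
    print(f"{name:>10}: H ~ {H(s):.2f} bits/symbol")
```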
kairosfocus: protein fold domains are on the whole deeply isolated in AA sequence space. So if we were to take, say, eighty random amino acids, we would never expect them to form a fold? Zachriel
kairosfocus: And once we are past the 500 – 1,000 bit threshold, the only observed and needle in haystack plausible cause is, design. How many bits in a snowflake? You said it was a matter of statistical thermodynamics, so you should be able to answer. Which has lower thermodynamic entropy, a brain or a like mass of diamond? Zachriel
Z, Let's go back to what Wiki was forced to admit on record, which you and others of your ilk keep ignoring or twisting:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
As has been highlighted, entropy is a state not a path function so its components can be partitioned. In the case of FSCO/I, that is a relevant component, one that is easily identifiable by it works or not, based on specificity of configuration. So, the cluster of possible states to remain in such a state or to get to it, is very small relative to the set of possibilities. And this extends to not only forming and organising original live molecules and cells, but to getting the DNA and proteins etc for new body plan components. A string of n components with 20 states per element has 20^n possibilities, and one with 4 states, 4^n. Those soon enough exceed 500 - 1,000 bits at 4.32 and 2 bits per element respectively. A small part of the problem, but already we are beyond the threshold for just one typical 300 or so AA protein. We need hundreds and hundreds to get to life forms and to get onwards to body plans. Where, protein fold domains are on the whole deeply isolated in AA sequence space. FSCO/I is readily observed [works/fails], and comes in deeply isolated islands of function in the config spaces of all possible arrangements because of the need to assemble and couple the right bits and pieces in the right order and orientation, and the same for complementary bits and pieces, in a key-lock wiring diagram organised system -- not a simple ordered repetitive crystal, as Orgel and Wicken long ago pointed out, a point which is being studiously ignored. Hence the issue of getting to that from Darwin's pond etc, with only physics and chem available to blindly do the work. And once we are past the 500 - 1,000 bit threshold, the only observed and needle in haystack plausible cause is, design. That is what you are ducking. It would be funny, if in the end it were not so sad. KF kairosfocus
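The bits-per-element figures in the comment above check out directly; a minimal sketch using the stated alphabet sizes and thresholds:

```python
# Sketch: elements needed to pass the 500- and 1,000-bit thresholds for
# 4-state DNA bases (log2 4 = 2 bits each) and 20-state amino acids
# (log2 20 ~ 4.32 bits each), plus the 300-AA sequence-space figure.
from math import ceil, log2, log10

for states, name in [(4, "DNA bases"), (20, "amino acids")]:
    bits = log2(states)
    for threshold in (500, 1000):
        print(f"{name}: {ceil(threshold / bits)} elements for {threshold} bits")

print(f"20^300 ~ 10^{300 * log10(20):.1f}")   # ~10^390.3, i.e. ~2.04e390
```

The 1,000-bit case for amino acids comes out at 232 elements, matching the 232-AA protein example cited elsewhere in the thread.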
Mung: That bit of [il]logic hardly follows. Kairosfocus claims that statistical thermodynamics supports his claims concerning biological structures. The standard molar entropy of various substances can be measured experimentally. Certainly, he should be able to give at least a qualitative answer, but he can't. Keeping in mind that the brain is about 3/4 water, standard molar entropies (in J/(mol·K)): Diamond: 2.4; Liquid water: 70. More complex molecules usually have a higher entropy. Does that help narrow down the answer? http://tinyurl.com/StandardMolarEntropies Zachriel
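Taking the quoted standard molar entropies at face value, the mass-for-mass comparison the thread keeps circling is a short calculation. A back-of-envelope sketch, with the loud assumption that brain tissue is treated as plain liquid water (the non-water quarter is ignored):

```python
# Back-of-envelope per-kilogram entropies from the molar values quoted
# above: diamond ~2.4 J/(mol K) at 12 g/mol, water ~70 J/(mol K) at
# 18 g/mol. Assumption: brain tissue approximated as liquid water.
s_diamond = 2.4 / 12.0 * 1000   # ~200 J/(K kg)
s_water = 70.0 / 18.0 * 1000    # ~3900 J/(K kg)
print(f"diamond: ~{s_diamond:.0f} J/(K kg)")
print(f"water:   ~{s_water:.0f} J/(K kg)")
```

On that crude estimate a like mass of diamond has roughly one twentieth the thermodynamic entropy of watery tissue, which is the qualitative answer being pressed for.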
Z, In one word: nonsense. You refuse to examine the critical point -- OOL, and the implications of the challenge to build FSCO/I with blind chance and mechanical necessity. That is, you keep on ignoring the point that there is a threshold that is readily shown, once we have 500 - 1,000 bits of FSCO/I we are at a point where the atomic and temporal resources of the sol system then the observed cosmos will be utterly incapable of finding isolated islands of function. Where requisites of specific function in a highly contingent context will confine us to clusters of working configs in the space of possibilities. In short, evolutionary materialist scientism is unable to get us from a Darwin pond etc to viable FSCO/I rich cell based life. But, the FSCO/I is an empirically confirmed, needle in haystack plausible, reliable signature of design. This needle in haystack problem compounds for origin of body plans and similarly complex entities. Where, the stat thermo-d speaks straight to config and phase spaces of v large size and highlights why spontaneous equilibrium comes about because of overwhelming relative statistical weight of clusters of microstates. So, we have a pretty good reason to see why systems, isolated, closed or open tend to move to such clusters unless otherwise constrained. But, commonly we see wiring diagram based assembly of such FSCO/I rich configs. By intelligence acting through design. This, you patently refuse to face: there is good reason to hold FSCO/I (that is a descriptive abbreviation . . . ) a signature of design. Above you had to admit that you cannot get to FSCO/I by blind chance and mechanical necessity in a Darwin pond etc. Again, the only empirically confirmed solution on how to get to FSCO/I is design. But no, you want to dismiss so you go off after a tangent that conflates a crystal with organised, functionally specific complexity. All in the obvious context, conclusion to reject design in hand, let's just find a rhetorically convenient excuse to make a dismissive talking point. Pop goes the weasel, you just did that. Here is what the rhetorical stunt you just pulled looks like: >>Oh, you cannot give an exact number on an irrelevancy -- though it is patent that the FSCO/I threshold has been vastly surpassed over and over again to get to the point of a functional brain, so let's ignore the substantial point and pretend all is well with the evo mat mythology.>> But, all you have really shown is that you are locked into a failed paradigm in the teeth of an adequate response on the substantial point. KF kairosfocus
Zachriel: So you can’t. That’s fine. We recommend readers ignore your comments about statistical thermodynamics then. LoL. That bit of [il]logic hardly follows. Mung
kairosfocus: There is no need to try to estimate values for diamond crystal vs brain tissue, when we already know that the FSCO/I in the cells of the brain is already well beyond a relevant threshold. So you can't. That's fine. We recommend readers ignore your comments about statistical thermodynamics then. Zachriel
Z, still side tracking I see. There is no need to try to estimate values for diamond crystal vs brain tissue, when we already know that the FSCO/I in the cells of the brain is already well beyond a relevant threshold. And, the statistical thermodynamics considerations on config spaces and islands of function are already more than adequate, once we recognise that a diamond crystal structure is mechanical necessity leading to a metastable structure which contrasts with the non periodic, wiring diagram based highly informational assembly involved in molecules, cells, tissues and neural networks. We cannot quantify the whole [an overwhelmingly complex calc] but there is far more than enough in hand to see that we are far, far, far beyond the 500 - 1,000 bits of functionally specific complex organisation and associated information where blind chance and mechanical necessity are hopelessly implausible and the only observationally warranted explanation is design. Remember, cellular functionality is a config constraining macro observable, locking us to islands of function in config spaces that go far beyond astronomical blind needle in haystack search challenge. The threshold calc is good enough for reasonable purposes, to strain at a gnat while swallowing a camel is pointless -- apart from playing at the useless rhetoric of selective hyperskepticism -- once such is in hand. Finally, the proper place to begin is the root of the tree of life, OOL, which is decisive, it puts design at the table as of right as best empirically warranted explanation from the outset. That is what is being studiously ducked, and it is telling. KF kairosfocus
kairosfocus: The diamond or graphite crystal is ordered per the mechanical necessity of crystals and relevant PVT etc circumstances. You said it was a matter of statistical thermodynamics, so you should be able to answer. Which has lower thermodynamic entropy, a brain or a like mass of diamond? Mung: Which is more ordered, a brain or a like mass of diamond? Why don't you venture an answer. Zachriel
kairosfocus: Z, not so. Not so, what? That intelligent action is not an exception to the laws of thermodynamics? kairosfocus: You seem to consistently forget 5 minutes after yet another reminder, the issue that thermodynamics has a statistical underpinning Sure it does. That means rearranging matter into certain configurations takes work. Zachriel
Mung, Happy New Year to you and all others here at UD. I suggest the best answer is in Orgel and Wicken. Brains are organised in a clearly goal and function oriented pattern in accord with implied wiring diagrams and associated information. The significance of which becomes evident from the sinister sniper's practice of the head-shot. The diamond or graphite crystal is ordered per the mechanical necessity of crystals and relevant PVT etc circumstances. Those worthies also contrast randomness as with tars and masses of small crystals in certain rocks. They warn that conflation of the first two -- an obvious agenda of Z -- has no credible future. And, given the state-based rather than path based functional nature of entropy, it makes sense to partition its components, indeed in classical thermodynamics, measurements are relative to changes from reference states. Thus, a conceptually oriented analysis that unifies information and organisation on the one hand with entropy, order and the second law on the other is an eminently reasonable scientific exercise. So, it is unreasonable to try to insist on conflating that which is observationally and conceptually distinguishable within the ambit of a unifying analysis. Remember, Shannon's analysis unexpectedly pointed to a strong parallel c 1948. By 1957 on, Jaynes had hit on an analysis that was going to be very fruitful, more so than simple negentropy . . . though this last is not exactly without merits. Z's ideological lockout tactic is 40 - 50 years behind the times, once Jaynes weighed in with the information bridge and once we got to the Orgel-Wicken distinction. Actually, Lewis from 80 or so years ago had powerful points, as Wiki has been forced to concede. KF kairosfocus
Zachriel, you forgot to answer: Which is more ordered, a brain or a like mass of diamond? Mung
Z, not so. You seem to consistently forget 5 minutes after yet another reminder, the issue that thermodynamics has a statistical underpinning; which bridges to info theory considerations -- as even Wiki has had to concede. The degree of FSCO/I involved in cell-based life makes its spontaneous origin maximally implausible on a Darwin's pond or the like scenario. KF kairosfocus
Zach and Velikovsky came to the conclusion on another blog that rain is a bathroom shower. Perhaps then for Zach the sun is a heat lamp, a creek is a toilet (sorry, fishies), birds in the forest are a surround-sound music system, and a raspberry bush a food replicator. Then a monsoon must be a lawn sprinkler :-D Eugen
kairosfocus: Where is that FSCO/I coming from before there is a von Neumann kinematic self-replicating cell? No one knows at this point, but there is no intrinsic argument from thermodynamics that precludes such a possibility. You forgot to answer: Which has lower thermodynamic entropy, a brain or a like mass of diamond? Mung: please note that intelligent configuration of a heat engine to do mechanical work does not violate the laws of thermodynamics. That's right. Neither the building nor the use of a heat engine violates the laws of thermodynamics. Intelligent action is not an exception to the laws of thermodynamics. Zachriel
Zachriel: Which has lower thermodynamic entropy, a brain or a like mass of diamond? Which is more ordered, a brain or a like mass of diamond? Mung
Zachriel, please note that intelligent configuration of a heat engine to do mechanical work does not violate the laws of thermodynamics. Mung
Z, more and more misdirections. Start with ATP synthase, which is pivotal to life, as ever so much of life chemistry is endothermic. Address the FSCO/I-rich units in metabolism, and the FSCO/I in the process-flow network itself. Do not overlook coded information and how it works in the protein-assembling ribosome. Start with the Darwin pond or the like. Where is that FSCO/I coming from before there is a von Neumann kinematic self-replicating cell? Indeed, that too is to be explained. On what actual empirical observation does it become plausible to infer or hold that this FSCO/I comes about by blind chance and mechanical necessity without any intelligent configuration? KF PS: You are also trying to tag to me a statement that comes from Wikipedia speaking against general interest. Besides, it is correct: adding heat to a system increases the number of ways mass and energy may be arranged at the microscopic level. In a simple but classic case, this is the Boltzmann expression that is on his gravestone: s = k log W. If you object to it, kindly explain how Boltzmann and Gibbs et al went wrong in founding statistical thermodynamics. kairosfocus
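For concreteness, a minimal numerical sketch of that gravestone expression (the microstate counts below are made-up, purely illustrative figures):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# S = k log W: entropy rises as the number of accessible microstates W
# rises -- e.g., when heat is added to a system.
W_before = 1e20   # illustrative microstate count, before heating
W_after = 1e25    # more microstates accessible after heating (illustrative)

S_before = k_B * math.log(W_before)
S_after = k_B * math.log(W_after)
print(f"S_before = {S_before:.3e} J/K")
print(f"S_after  = {S_after:.3e} J/K")
print(f"dS       = {S_after - S_before:.3e} J/K (positive)")
```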
kairosfocus: adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. Which has lower thermodynamic entropy, a brain or a like mass of diamond? Zachriel
Mung: In thermodynamics, a heat engine is a system that converts heat or thermal energy to mechanical energy, which can then be used to do mechanical work. That's right! We just wanted to make clear that thermal energy is converted into mechanical energy in nature too. Zachriel
Word games, Zachriel.
In thermodynamics, a heat engine is a system that converts heat or thermal energy to mechanical energy, which can then be used to do mechanical work.
Heat Engine Mung
kairosfocus: a word game Not at all. You brought up heat engines, which don't require design, but do require a source of energy. kairosfocus: the issue is to account for the FSCO/I rich heat engines and/or energy conversion devices that carry out key metabolic processes in the cell. From the viewpoint of thermodynamics, energy is required. And complex structures are often less thermodynamically ordered than natural structures, such as crystals. Other than that, thermodynamics doesn't have much to say. Zachriel
Z, again, a word game. That was not the issue; the issue is to account for the FSCO/I rich heat engines and/or energy conversion devices that carry out key metabolic processes in the cell. Try ATP synthase, the rotary turret enzyme that makes ATP, the energy battery molecule used in so many life activities. KF kairosfocus
kairosfocus: Your objection is misdirected We agree that some heat engines are "carefully organised", while others occur naturally. Zachriel
Z, did you see where I both pointed to tropical cyclones as convective-coriolis systems and, in what you clipped, spoke of how "heat engines commonly are carefully organised and FSCO/I rich"? (As in, as opposed to universally.) Your objection is misdirected, but of course it could lead an inattentive reader to think I did not make a careful distinction. But, I did. And that highlights the real -- and unanswered -- issue: there are a LOT of heat engines and energy conversion devices that are FSCO/I rich, ranging from steam engines and auto engines to the molecular nanotech of the cell. That requires serious explanation, and the pattern remains that once we directly know causal origin, FSCO/I is a reliable sign of intelligence, backed up by the needle in haystack challenge. For that, so far it is crickets chirping. KF PS: Much the same problem arises with the diversion from the issue of origin of the cell, including its self-replicating facility. Obviously, cells replicate in a chain and per conventional timeline have for 3.5 bn y. But that says zip about the origin of those FSCO/I based facilities in Darwin's pond or the like. With the needle-in-haystack challenge looking right at you. Which is what was on the table, and it is what you again sought to divert attention from. Such tactics speak volumes. PPS: Especially coming hard on the heels of having to admit no account for origin of the relevant FSCO/I, as Mung reminds. kairosfocus
kairosfocus: Thus, when we look at such in the living cell, the question arises: can this be accounted for per molecular noise in Darwin’s pond etc? Zachriel: No Amen! Mung
Alicia Cartelli:
I’m making it up as I go along. Just like you!
We should be in for some fun times then! :D Mung
kairosfocus: But notice, heat engines commonly are carefully organised and FSCO/I rich. Such as the monsoons. kairosfocus: Thus, when we look at such in the living cell, the question arises: can this be accounted for per molecular noise in Darwin’s pond etc? No. Extant cells are the product of billions of years of evolution, not "molecular noise". Zachriel
Mung (Attn AC et al): Before we get to a self-replicating life form, we have to go through the chemistry and physics of a Darwin pond or the like prebiotic environment. And the origin of the molecular nanotech of life and associated information has to be cogently explained, with thermodynamics issues, including statistical and information ones, front and centre. If you do not get from chem to biochem, the game is over at the beginning, and thermodynamics is central. Thing is, this will give some key lessons in the difficulty of finding deeply isolated islands of functional organisation in beyond-astronomical config spaces by way of blind chance and necessity needle-in-haystack searches. For just 1,000 bits' worth, the search capability of the observed cosmos -- 10^80 or so atoms, for 10^17 s, at 10^12 - 10^14 tries per second each as observers -- will be as one straw to a haystack that dwarfs the observed cosmos, as we here deal with 1.07*10^301 possibilities. That is what is being ducked. Then, when we get to the origin of body plans, thus integrated networks of organs, tissues, cell types and requisite proteins and D/RNA, we are looking at the same challenge to be solved in a sol system of some 10^57 atoms. Where 1,000 bits is about as complex as a typical protein. Where also, protein fold domains are going to be deeply isolated in AA sequence space. I am not even going to bother to discuss how we get to codes and to embryologically and ecologically viable body plans. Thermodynamics and energy flows obviously apply, and there is a tendency to glibly suggest that open systems with energy flows remove that as an obstacle. Not so fast. First, consider Clausius' isolated system with two subsystems . . . I here clip App 1 of my always linked:
a] Clausius is the founder of the 2nd law, and the first standard example of an isolated system -- one that allows neither energy nor matter to flow in or out -- is instructive, given the "closed" subsystems [i.e. allowing energy to pass in or out] in it. Pardon the substitute for a real diagram, for now:

Isol System: | | (A, at T_hot) --> d'Q, heat --> (B, at T_cold) | |

b] Now, we introduce entropy change dS >/= d'Q/T . . . "Eqn" A.1

c] So, dSa >/= -d'Q/Th, and dSb >/= +d'Q/Tc, where Th > Tc

d] That is, for the system, dStot >/= dSa + dSb >/= 0, as Th > Tc . . . "Eqn" A.2

e] But, observe: the subsystems A and B are open to energy inflows and outflows, and the entropy of B RISES DUE TO THE IMPORTATION OF RAW ENERGY.

f] The key point is that when raw energy enters a body, it tends to make its entropy rise. This can be envisioned on a simple model of a gas-filled box with piston-ends at the left and the right:

=================================
||::::::::::::::::::::::::::::::::::::::::::||
||::::::::::::::::::::::::::::::::::::::::::||===
||::::::::::::::::::::::::::::::::::::::::::||
=================================

1: Consider a box as above, filled with tiny perfectly hard marbles [so collisions will be elastic], scattered similar to a raisin-filled Christmas pudding (pardon how the textual elements give the impression of a regular grid; think of them as scattered more or less haphazardly, as would happen in a cake).

2: Now, let the marbles all be at rest to begin with.

3: Then, imagine that a layer of them up against the leftmost wall were given a sudden, quite, quite hard push to the right [the left and right ends are pistons].

4: Simply on Newtonian physics, the moving balls would begin to collide with the marbles to their right, and in this model perfectly elastically. So, as they hit, the other marbles would be set in motion in succession. A wave of motion would begin, rippling from left to right.

5: As the glancing angles on collision will vary at random, the marbles hit and the original marbles would soon begin to bounce in all sorts of directions. Then, they would also deflect off the walls, bouncing back into the body of the box and other marbles, causing the motion to continue indefinitely.

6: Soon, the marbles will be continually moving in all sorts of directions, with varying speeds, forming what is called the Maxwell-Boltzmann distribution, a bell-shaped curve.

7: And, this pattern would emerge independent of the specific initial arrangement or how we impart motion to it, i.e. this is an attractor in the phase space: once the marbles are set in motion somehow, and move around and interact, they will soon enough settle into the M-B pattern. E.g. the same would happen if a small charge of explosive were set off in the middle of the box, pushing out the balls there into the rest, and so on. And once the M-B pattern sets in, it will strongly tend to continue. (That is, the process is ergodic.) . . .
Heat engines partly get around this by being carefully organised to couple particular forms of energy to create shaft work, exhausting waste heat to a sink at lower temperature than the heat source; that way, the entropy rise in the sink compensates. But notice, heat engines commonly are carefully organised and FSCO/I rich. (We are not here looking at, say, a convection-coriolis entity such as a tropical cyclone.) Thus, when we look at such in the living cell, the question arises: can this be accounted for per molecular noise in Darwin's pond etc? That takes us to the statistical, needle-in-haystack challenge, and there is no cogent way to plausibly claim, and back up with good observations, that such could reasonably spontaneously form. So, we are looking at questionable ideological imposition of a priori materialistic scientism, meant to lock out the obvious inference: FSCO/I, per massive observational base, is a reliable sign of design as cause. Backed up by the needle-in-haystack challenge. So we see the ways in which thermodynamics principles and considerations are relevant but unwelcome. Thus the promotion of dismissive talking points about open systems. KF PS: And to get from heat engines and shaft or flow work to building FSCO/I rich entities, we are looking at wiring diagrams, assembly steps and availability of the right components in the right order, place and time. All of which is hugely information rich. It's FSCO/I all the way down. kairosfocus
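For concreteness, a minimal Python sketch of the needle-in-haystack arithmetic cited in the comment above (the resource figures are simply the ones quoted there, taken at their most generous):

```python
# Back-of-envelope check of the 1,000-bit configuration-space figures
# quoted above (observed-cosmos resources as stated in the comment).
ATOMS = 10**80       # atoms in the observed cosmos
SECONDS = 10**17     # time available, in seconds
RATE = 10**14        # tries per atom per second (upper figure quoted)

configs = 2**1000                 # ~1.07*10^301 possibilities
trials = ATOMS * SECONDS * RATE   # 10^111 maximal trials

print(f"configs: {configs:.3e}")                    # ~1.072e+301
print(f"trials:  {trials:.3e}")                     # ~1.000e+111
print(f"fraction sampled: {trials / configs:.1e}")  # ~9e-191
```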
Mung, you had me breaking out in laughter as I contemplated apostle Jaynes and evangelist Robertson eyeing a suspiciously large cauldron as they preach to the unwashed heathen in a Science faculty seminar room. KF PS: Even Wiki may be getting the Info view of thermo-d religion:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Soon, sinners will be called down the sawdust trail to the old-fashioned mourner's bench. Info Thermo-D Revival looks to be breaking out! (And I see HSR got to you quickly.) kairosfocus
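To put rough numbers on the information-entropy bridge described in the quote above, here is a minimal sketch using standard constants (the 300 K figure is just an assumed room temperature):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# One bit of Shannon information corresponds to k_B*ln(2) of thermodynamic
# entropy; Landauer's bound is the minimum heat dissipated to erase it.
S_per_bit = k_B * math.log(2)
E_erase = k_B * T * math.log(2)

print(f"entropy per bit:        {S_per_bit:.3e} J/K")  # ~9.57e-24 J/K
print(f"Landauer limit @ 300 K: {E_erase:.3e} J")      # ~2.87e-21 J
```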
Nope. I'm making it up as I go along. Just like you! Alicia Cartelli
Well you sure had me fooled. I figured if you were talking biology that you would know what you were talking about. Silly me. :) Mung
We're talking about biological systems here mungy, but feel free to teach me about entropy and steam engines. Alicia Cartelli
Thank God entropy doesn't apply to steam engines. Nor does Gibbs free energy, for that matter. LoL. Mung
No, yearning, it's not speculation. It is a fact that the various stages in the origin of life involved matter moving from a less ordered state to a more ordered state, so by definition, entropy is involved. Specifically, for life to arise, entropy had to be counteracted. And the basic equation you can think on is known; try googling "Gibbs free energy." And entropy does not "apply across the board back to the molecular level." It really ONLY applies at the molecular level; once you go above that, it is not really useful. You love to use the phrases "speculation" and "a priori assertion" but what I am telling you involves neither. What I am telling you are the facts of the world we live in. Sorry to burst your bubble. Mungy, you just don't know when to quit. Maybe there's a program out there that can help you. Alicia Cartelli
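For reference, the equation being pointed to is dG = dH - T*dS; a minimal sketch with illustrative, made-up values (not a specific reaction):

```python
# Gibbs free energy: dG = dH - T*dS; a process is thermodynamically
# favorable (spontaneous) when dG < 0.
def gibbs(dH, T, dS):
    """dH in J/mol, T in K, dS in J/(mol*K); returns dG in J/mol."""
    return dH - T * dS

# Illustrative, made-up numbers only:
dH = -40_000.0  # exothermic step, J/mol
dS = -50.0      # an ordering step: entropy of the system decreases
for T in (280.0, 310.0, 1000.0):
    dG = gibbs(dH, T, dS)
    print(f"T = {T:6.1f} K, dG = {dG:9.1f} J/mol, favorable: {dG < 0}")
```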
If we have enough molecules, then entropy doesn't apply. :) Mung
Alicia Cartelli @136: Thanks for your reply. [My comments embedded thus] "Entropy is an important part of the equation for the origin of life, yearning. [(1) Speculation. (2) I don't believe there is anything as yet known about this 'equation'; thus, again, speculation.] The first living things had to develop metabolic systems that counteracted entropy and life must constantly continue to do this. [Again, speculation and a priori assertion of evolution as fact.] Once you leave the molecular level and start talking about entire cells/tissues/organs, entropy really isn't part of the conversation." [Agree entirely, but I would suggest this sentence applies across the board back to the molecular level; again, speculation and a priori assertion of evolution at the molecular level as fact.] ayearningforpublius
Some of the commentaries contain sermons intended to convert the heathen to the truth of the information-theoretic approach to statistical thermophysics. - Harry S. Robertson
heh. thanks kf. Mung
Z, squid ink clouds. Onlookers cf above to see what Z is so at pains not to see. KF kairosfocus
Saying that computers or snowflakes are not possible under the laws of thermodynamics is just silly,
Computers and snowflakes are only possible in an Intelligently Designed world. Thermodynamics only makes sense in an Intelligently Designed world. Saying that computers or snowflakes are possible under a materialistic framework is just silly. Virgil Cain
Mung: Now I'm going to wait for computers to fall from the sky. We all know that it's not against the laws of thermodynamics for it to happen. As long as work is done, there's nothing in thermodynamics to prevent it. https://cmchone.files.wordpress.com/2010/03/26181-clipart-illustration-of-four-laptop-computers-falling-in-a-cloudy-blue-sky.jpg kairosfocus: It grounds 2LOT and, on much the same statistical basis, the needle in haystack challenge that FSCO/I poses. We both agree that computers and snowflakes can only occur if thermodynamic work is done. Mung: If we just wait long enough he'll spontaneously change. The laws of thermodynamics don't prohibit it. Your statement entails the misunderstanding. Thermodynamics doesn't preclude computers or snowflakes, as long as thermodynamic work is involved. But it doesn't require it either. That depends on the specifics of the processes involved. Zachriel
KF: It is interesting that you have consistently refused to address that statistical approach. If we just wait long enough he'll spontaneously change. The laws of thermodynamics don't prohibit it. :) Mung
Z, the issue is the molecular statistics and macro-identifiable distinct clusters of microstates. This is the more fundamental statistical thermodynamics analysis, on which e.g. entropy is tied to the number of ways mass and energy at micro levels may be arranged consistent with macro-definable states, indeed [up to relevant constants of proportionality] as the measure of info in bits or the like left to define the microstate once the macrostate is given. It grounds 2LOT and, on much the same statistical basis, the needle in haystack challenge that FSCO/I poses. It is interesting that you have consistently refused to address that statistical approach. KF PS: Nor is it reasonable to rhetorically wave the issue away with 'oh, work is done in both cases, so there.' Anytime a force moves its point of application along its line of action, work is done. Ice crystals form through the geometry of the polar molecule and require nothing beyond mechanical necessity . . . the analogy shows its flaws again. A typical 300 AA protein or a PC is radically different: highly contingent and, per observation, requiring step by step programmed, information-rich constructive work. For both of these, the needle in haystack challenge is such that blind chance and mechanical necessity on the gamut of the observed cosmos are maximally implausible explanations relative to intelligently directed configuration. Absent arbitrary imposition of the ideology of evolutionary materialist scientism. kairosfocus
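For a rough numerical illustration of that contingency (assuming the standard 20-letter amino acid alphabet and the 300 AA length mentioned above, with no allowance made for functional redundancy):

```python
import math

ALPHABET = 20  # standard amino acids
LENGTH = 300   # residues, per the "typical 300 AA protein" above

configs = ALPHABET ** LENGTH              # raw sequence space
bits = LENGTH * math.log2(ALPHABET)       # information capacity

digits = len(str(configs)) - 1
print(f"sequence space: ~10^{digits}")           # ~10^390 sequences
print(f"information capacity: {bits:.0f} bits")  # ~1297 bits
```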
Computers and snowflakes are both implausible without thermodynamic work being done. Now I'm going to wait for computers to fall from the sky. We all know that it's not against the laws of thermodynamics for it to happen. Mung
Mung: There’s nothing about “feeding people” in the laws of thermodynamics. Well, technically humans are chemical engines, but the thermodynamics are comparable. Fuel is converted to create mechanical energy. That's why you put gas in your car. Mung: There’s nothing about “building computers” in the laws of thermodynamics. No. But that brings up the salient point. Saying that thermodynamics explains the formation of computers or snowflakes is not much of an explanation. While both processes must be consistent with thermodynamics, we have to point to other features of the system to provide a complete explanation. Saying that computers or snowflakes are not possible under the laws of thermodynamics is just silly, except to say that computers and snowflakes are both implausible without thermodynamic work being done. Zachriel
Zachriel: From a thermodynamic perspective, people who build computers are just heat machines. You feed them fuel, they move mechanically, and when you stop feeding them, they wind down. There’s nothing about “feeding people” in the laws of thermodynamics. Mung
Zachriel: There’s nothing about “specified complexity” in the laws of thermodynamics. So? Zachriel: From a thermodynamic perspective, people who build computers are just heat machines. You feed them fuel, they move mechanically, and when you stop feeding them, they wind down. There's nothing about "building computers" in the laws of thermodynamics. Mung
kairosfocus: For ice, that work is a built in mechanical ordering. The primary work of snowflake formation is evaporating water, lifting it into the atmosphere, then cooling it. The energy, of course, comes from the Sun. kairosfocus: For a computer or a protein, the functionally specific organisation per prescriptive information and programmed step by step contingent actions, is not mechanical necessity, nor is it credibly chance. From a thermodynamic perspective, people who build computers are just heat machines. You feed them fuel, they move mechanically, and when you stop feeding them, they wind down. There's nothing about "specified complexity" in the laws of thermodynamics. Zachriel
Z, you are on a tangent chase on word plays that obfuscate the pivotal differences; rhetoric not insight. I have long since pointed out that F dot dx is work. For ice, that work is a built in mechanical ordering. For a computer or a protein, the functionally specific organisation per prescriptive information and programmed step by step contingent actions, is not mechanical necessity, nor is it credibly chance. I suggest you re-read Orgel and Wicken above from 40 years ago. KF kairosfocus
kairosfocus: Water crystallises in the patterns due to built in ordering forces that make the crystallisation a mechanical necessity once sufficient random kinetic energy (thermal agitation) is removed by cooling. In other words, it requires thermodynamic work. kairosfocus: Wiring diagram functional organisation is a very small cluster in the space of available contingent possibilities under the same conditions, which is where the needle in haystack challenge comes in. So is a snowflake in the absence of thermodynamic work, as per your own example of molecules of gas going to one side of the room. Without work being done, there are no computers or snowflakes. Zachriel
Z, your lame attempt to equate crystallisation of a polar molecule with wiring-diagram-directed constructive work fails. Water crystallises in the patterns due to built in ordering forces that make the crystallisation a mechanical necessity once sufficient random kinetic energy (thermal agitation) is removed by cooling. Wiring diagram functional organisation is a very small cluster in the space of available contingent possibilities under the same conditions, which is where the needle in haystack challenge comes in. Hence the need for programmed organisation to get it to a functional form. And usually, first time up, there has to be a lot of troubleshooting to fix errors in our designs. KF kairosfocus
Mung: Is work even possible without entropy? If you mean without an increase in overall entropy, the answer is no. Zachriel
Zachriel: And it is just as unlikely that water will form into snowflakes — in the absence of work. Is work even possible without entropy? Mung
kairosfocus: For just one simple comparison, the spontaneous emergence of a case where the O2 molecules in the room in which you sit all rush to one end is abstractly statistically possible but so overwhelmed by the bulk clusters of configs — the equilibrium — that such is unobservable on the gamut of the observed cosmos. Mung: Exactly! And it is just as unlikely that water will form into snowflakes — in the absence of work. Zachriel
For just one simple comparison, the spontaneous emergence of a case where the O2 molecules in the room in which you sit all rush to one end is abstractly statistically possible...
Exactly! Mung
kairosfocus: This is a case where heat and energy flows through machinery that generates first shaft work then transforms that into functionally specific, organising constructive work and emission of waste heat are all pivotal to why 2LOT is consistent with the assembly of a computer. If you mean work is involved, then sure. Is that what you mean? kairosfocus: Your snowflake was an analogy that failed, you confuse complexity with specified complexity. It’s not an analogy, and there’s nothing about “specified complexity” in the laws of thermodynamics. Zachriel
Z, if you think my answer with explanatory remarks is irrelevant, with all due respect, it shows that you do not understand the subject. This is a case where heat and energy flows through machinery that generates first shaft work and then transforms that into functionally specific, organising constructive work, with emission of waste heat; all of which is pivotal to why 2LOT is consistent with the assembly of a computer. This is not a simplistic yes-no issue. The key point being, raw energy flows are not reasonably expected to produce the constructive work. Your underlying problem is most likely that you want molecular noise in Darwin's pond or the like to do the same constructive work, confronting you with the needle-in-haystack problem with config spaces that dwarf the search resources of the observed cosmos. For over 100 years now, too, the laws of thermodynamics, classical form, are rooted in statistical considerations and config or phase spaces. In that deeper context, the issue of accessing deeply isolated, complex and specific clusters of configs is deeply and directly connected to the laws. For just one simple comparison, the spontaneous emergence of a case where the O2 molecules in the room in which you sit all rush to one end is abstractly statistically possible but so overwhelmed by the bulk clusters of configs -- the equilibrium -- that such is unobservable on the gamut of the observed cosmos. The spontaneous emergence of FSCO/I will predictably be unobservable, absent energy converters giving shaft work to drive programmed constructors, for much the same reason. Dismissive talking points about "irrelevance" in this context simply tell me that you are resisting the key -- highly material -- points, as they do not fit the simplistic notion that effectively magic happens in open systems. Not when complexity and functional specificity lead to needle-in-haystack search challenges beyond the reasonable scope of the observed cosmos. KF kairosfocus
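A minimal sketch of why that O2 case is unobservable in practice (idealised: independent molecules, each equally likely to be in either half of the room; the molecule count is an assumed round figure):

```python
import math

N = 10**27  # assumed number of gas molecules in a room (order of magnitude)

# Probability that every molecule is found in one chosen half: (1/2)^N.
log10_p = -N * math.log10(2)
print(f"P(all in one half) ~ 10^({log10_p:.3g})")  # ~10^(-3.01e+26)
```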
kairosfocus: So your assumption that I did not respond is false — and I bolded to make that plain. Yes, you did answer a simple yes and no question with a lot of irrelevant words. We are in agreement, though, on this: Building and using a computer is consistent with the laws of thermodynamics. kairosfocus: Your snowflake was an analogy that failed, you confuse complexity with specified complexity. It’s not an analogy, and there’s nothing about “complex specificity” in the laws of thermodynamics. Zachriel
Alicia Cartelli:
The first living things had to develop metabolic systems that counteracted entropy and life must constantly continue to do this.
More evidence that Alicia doesn't know what life is. What Alicia no doubt meant to say is that non-living systems had to "counteract entropy" in order for living systems to come into existence, but she knows how silly that sounds. Mung
Once you leave the molecular level and start talking about entire cells/tissues/organs, entropy really isn’t part of the conversation.
LoL. And yet we have Zachriel: So building and using computers is consistent with the laws of thermodynamics. Mung
AC, the CONVERSATION, maybe -- one dominated by an admitted evolutionary materialistic scientism. But in reality the cell is a molecular-scale functional organised entity, and its workhorse components are proteins created by step by step molecular-scale assemblers. This implies accounting for molecular components, cell types, spatial organisation to function, tissues in integrated organs, with associated systems and networks in an embryologically and ecologically feasible body plan. 50 cell types is a reasonable estimate, and overall requisites are of order 10 - 100+ mn base pairs to account for the novel proteins (and presumably some regulatory capacity). With 100+ mn actually consistently observed. As just 1,000 bits of FSCO/I already overwhelms the blind chance and necessity needle in haystack search challenge on observed cosmos resources, there is only one observed causal factor capable of such designs: intelligently, purposefully directed configuration. KF kairosfocus
Entropy is an important part of the equation for the origin of life, yearning. The first living things had to develop metabolic systems that counteracted entropy and life must constantly continue to do this. Once you leave the molecular level and start talking about entire cells/tissues/organs, entropy really isn't part of the conversation. Alicia Cartelli
Z, First, I took time to explain in brief how a PC is made and in so doing, where the accord with thermodynamics principles arises; also explaining briefly what work is: forced, ordered motion . . . dW = F dot dx. I have no desire to try to go into mixed first and second law analysis here; I assume you know the Sankey diagram of a heat engine. So your assumption that I did not respond is false -- and I bolded to make that plain. Second, there seems to be a problem of the misconstrued argument rooted in what seem to be serious conceptual gaps. If that is the case, you may find here on http://www.angelfire.com/pro/kairosfocus/resources/Info_design_and_science.htm#shnn_info and here on http://www.angelfire.com/pro/kairosfocus/resources/Info_design_and_science.htm#thermod helpful. KF kairosfocus
You seem unable to respond directly to arguments as they are raised. LoL. Good one! Mung
You seem unable to respond directly to arguments as they are raised. kairosfocus: Your snowflake was an analogy that failed, you confuse complexity with specified complexity. It’s not an analogy, and there’s nothing about “complex specificity” in the laws of thermodynamics. kairosfocus: When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. The exported waste energy secures consistency with Lex2 Th, and illustrates the only empirically warranted source for the relevant FSCO/I, intelligently directed configuration. So building and using computers is consistent with the laws of thermodynamics. Zachriel
Z, the pivotal case is Darwin's pond or the like. There is no reproduction mechanism; that has to be explained or accounted for on physics and chemistry in the pond. And yes, that is OOL. Start there, as it is the plainest case, and one where there is no magic of differential reproductive success to cloud the core issue. I add, not that origin of body plans requiring 10 - 100+ mn base pairs will help much, with the need for novel proteins and integrated systems. KF PS: There is a lot about work in thermo-d, and you propose to extract constructive work assembling functionally specific, complex and info-rich organisation from the blind chance and mechanical necessity of the pond etc. Where FSCO/I is known to sharply constrain configs relative to the set of possibilities, posing a beyond-astronomical needle-in-haystack search. At 1,000 bits of info for, let's just say, a string molecule, the set of possibilities, relative to what the observed cosmos could do, would be as a haystack dwarfing the cosmos to a single straw. This is the reason why it is very reasonable that we only observe FSCO/I coming from design and construction per wiring diagram. PPS: Your snowflake was an analogy that failed, you confuse complexity with specified complexity. PPPS: Likewise, it is patent that I answered as to how constructive work to create the FSCO/I in a PC accords with 2LOT: by exporting waste heat and using constructive machines and energy converters that act according to a plan. Note:
When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. The exported waste energy secures consistency with Lex2 Th, and illustrates the only empirically warranted source for the relevant FSCO/I, intelligently directed configuration.
You are trying to get constructive work to build huge FSCO/I without a plan or constructive machines, out of lucky noise. Contrast how proteins are constructed per instructions in the ribosome, a clue as to what is reasonable. All the way back to Thaxton et al this was addressed. kairosfocus
Zachriel:
Evolution.
Define "evolution" and then produce the evidence that it can produce CSI from scratch.
It’s not an analogy, and there’s nothing about “complex specificity” in the laws of thermodynamics.
CSI requires magnitudes of order. And that doesn't come from disorder.
When a human builds and uses a computer, is that consistent with the laws of thermodynamics?
No, if nature, operating freely, did, that would violate thermodynamics. And the same goes for a living organism. Virgil Cain
kairosfocus: You are asking for lucky noise to create FSCO/I. No. Evolution. kairosfocus: And indeed, the point is that the high contingency and complex specificity are tied to different aspects of the snowflake. Your analogy collapses. It's not an analogy, and there's nothing about "complex specificity" in the laws of thermodynamics. kairosfocus: PS: When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. You didn't answer the question. When a human builds and uses a computer, is that consistent with the laws of thermodynamics? Zachriel
Would work even be possible without entropy? Mung
PPS: Work being best summed up as forced, ordered motion at macro or micro levels. Functionally specific organisation arises from step by step sequences of work that assemble an entity per its wiring diagram, hence construction work arising from shaft work. Heat engines provide a partial conversion of random molecular agitation to shaft work, and that is a main reason why this cannot be 100% efficient per the Carnot theorem. Ordered motion can much more readily be converted into work, as with wind flow and turbines or electrical currents etc. Heat engines reject waste heat to the environment. kairosfocus
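A minimal numerical sketch of that Carnot ceiling (the reservoir temperatures are illustrative assumptions):

```python
# Carnot theorem: no heat engine operating between a hot reservoir at T_hot
# and a cold sink at T_cold can exceed eta = 1 - T_cold/T_hot (in kelvin).
def carnot_efficiency(T_hot, T_cold):
    return 1.0 - T_cold / T_hot

# Illustrative reservoirs: boiler-like heat source vs. ambient sink.
print(f"eta = {carnot_efficiency(T_hot=500.0, T_cold=300.0):.0%}")  # 40%
```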
Z, that is directly connected; are you aware of the statistical basis for entropy and the second law? You are asking for lucky noise to create FSCO/I. Second, there is patently no programmed inclination in nature to form particular D/RNA and AA sequences. Otherwise the flexibility in D/RNA and in proteins would be impossible. That is, the flexibility and highly informational character of such molecules precisely undermines any claim or suggestion that we deal with the structuring dynamic that imposes sixfold symmetry on snowflakes. And, where snowflakes are highly variable, that too would be a case where the flexibility is not preprogrammed to give particular outcomes. And indeed, the point is that the high contingency and complex specificity are tied to different aspects of the snowflake. Your analogy collapses. I think you need to attend to what Orgel and Wicken had to say in the 1970's:
Wicken, 1979: ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and commonplace initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [[“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65. (Emphases and notes added. Nb: “originally” is added to highlight that for self-replicating systems, the blueprint can be built-in.)] Orgel, 1973: . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [--> this is of course equivalent to the string of yes/no questions required to specify the relevant "wiring diagram" for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002, also cf here, here and here (with here on self-moved agents as designing causes).] One can see intuitively that many instructions are needed to specify a complex structure. [--> so if the q's to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [--> do once and repeat over and over in a loop . . . ] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196. Of course, that immediately highlights OOL, where the required self-replicating entity is part of what has to be explained (cf. Paley here), a notorious conundrum for advocates of evolutionary materialism; one, that has led to mutual ruin documented by Shapiro and Orgel between metabolism first and genes first schools of thought, cf here. Behe would go on to point out that irreducibly complex structures are not credibly formed by incremental evolutionary processes and Menuge et al would bring up serious issues for the suggested exaptation alternative, cf. his challenges C1 - 5 in the just linked.
Finally, Dembski highlights that CSI comes in deeply isolated islands T in much larger configuration spaces W; for biological systems, functional islands. That puts up serious questions for origin of dozens of body plans reasonably requiring some 10 - 100+ mn bases of fresh genetic information to account for cell types, tissues, organs and multiple coherently integrated systems. Wicken's remarks a few years later, as already cited, now take on fuller force in light of the further points from Orgel at pp. 190 and 196 . . . ]
For years, it has been pointed out that this is the source context for discussions of CSI and more particularly functionally specific CSI, i.e. FSCO/I. KF PS: When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. The exported waste energy secures consistency with Lex2 Th, and illustrates the only empirically warranted source for the relevant FSCO/I, intelligently directed configuration. Your suggestion to confine intelligence to humans collapses by failing the giggle test. kairosfocus
kairosfocus: the pivotal issue is spontaneous origin of functionally specific complex organisation The question concerned thermodynamic entropy. kairosfocus: Under such circumstances, such spontaneous formation is maximally implausible for the same reason that there is a strong tendency to the predominant cluster of microstates. So are snowflakes, given particles randomly arranged. When a human builds and uses a computer, is that consistent with the laws of thermodynamics? Zachriel
Z, the pivotal issue is spontaneous origin of functionally specific complex organisation and associated information in Darwin's warm pond or the like, without mechanisms to couple energy to step by step procedures that perform constructive work. Under such circumstances, such spontaneous formation is maximally implausible for the same reason that there is a strong tendency to the predominant cluster of microstates. This also extends to the problem of elaborating novel body plans without an intelligent author of relevant instructions etc. KF kairosfocus
ayearningforpublius: So my question is — what is the role of entropy in the Neo-Darwinian explanation of life … how life develops … how life came to be … etc. While the Second Law of Thermodynamics states that overall entropy will increase, it is quite possible for areas within a system to experience decreased entropy. Consider a simple example as an analogy. Energy from the Sun flows through the Earth's surface system, then exits into the cold of space. As it does so, water evaporates, whirlpools form in the air and water, water crystallizes and falls as snow, glaciers move towards the sea, and many other examples of low entropy are brought about as the energy flows through the Earth's system. From a thermodynamic view, think of life as a complex eddy in the stream of energy. ayearningforpublius: And as a bonus Q/A: how, if at all, does entropy relate to my questions @4? They don't relate particularly. Biological processes are all consistent with the laws of thermodynamics. Evolution is consistent with the laws of thermodynamics. Thermodynamics is not an explanation for life, but any explanation for life has to be consistent with thermodynamics. When a human builds and uses a computer, is that consistent with the laws of thermodynamics? Zachriel
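A minimal sketch of the entropy bookkeeping behind that picture (assumed round figures: sunlight absorbed at an effective ~5800 K, Earth re-radiating at ~255 K; per joule of energy throughput):

```python
# Entropy flux per joule of heat: dS = Q / T. Earth absorbs sunlight at a
# high effective temperature and re-emits the same energy at a low one, so
# entropy exported greatly exceeds entropy imported -- leaving room for
# local decreases (snow crystals, whirlpools, etc.).
T_SUN = 5800.0   # K, effective solar radiation temperature (assumed)
T_EARTH = 255.0  # K, Earth's effective emission temperature (assumed)

Q = 1.0  # one joule passing through the system
s_in = Q / T_SUN
s_out = Q / T_EARTH
print(f"entropy in:  {s_in:.2e} J/K")                 # ~1.7e-4
print(f"entropy out: {s_out:.2e} J/K")                # ~3.9e-3
print(f"net export:  {s_out - s_in:.2e} J/K per J")   # positive
```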
Kindergarten Q/A again: This thread moved to quite a debate about entropy, and I'm seeking an answer: In very simple language, my reading of this word translates to "decay." Things over time decay and waste away; abandoned cars rust away, abandoned buildings crumble and fall, my life decays, and unless it is ended catastrophically, some of my critical parts will decay and all the rest of me will die. So my question is -- what is the role of entropy in the Neo-Darwinian explanation of life ... how life develops ... how life came to be ... etc. And yes, please understand that my exposure to Biology textbooks and a class I took way back when are distant memories, so please don't refer me to those resources. And as a bonus Q/A: how, if at all, does entropy relate to my questions @4? Thanks ayearningforpublius
No, actually I'm still not convinced you've actually opened a biology textbook, Mungy. First of all, if you had ever opened a general biology textbook and read the early chapters, you'd see that entropy is consistently thought of as a measure of disorder. Here are some examples:

Molecular Biology of the Cell, chapter 2: "The amount of disorder in a system can be quantified. The quantity that we use to measure this disorder is called the entropy of the system: the greater the disorder, the greater the entropy. Thus, a third way to express the second law of thermodynamics is to say that systems will change spontaneously toward arrangements with greater entropy."

Molecular Cell Biology, chapter 2: "Entropy S is a measure of the degree of randomness or disorder of a system. Entropy increases as a system becomes more disordered and decreases as it becomes more structured."

Protein Structure and Function, chapter 1: "Entropy is a measure of randomness or disorder."

The Cell: A molecular approach, chapter 2: "The change in free energy (ΔG) of a reaction combines the effects of changes in enthalpy (the heat that is released or absorbed during a chemical reaction) and entropy (the degree of disorder resulting from a reaction) to predict whether or not a reaction is energetically favorable."

Second, let's see what your favorite book says about entropy: "..entropy is a measure of the number of different ways of rearranging the system." Your book says entropy is a measure of disorder, without actually saying the word "disorder." This is because physicists know "entropy=disorder" is a bit of an oversimplification, but it works. In fact, those are the exact words that will come out of every professor's mouth when they teach about entropy in college-level biology classes. For instance, Ahmet Yildiz from UC Berkeley, who uses your book in his biophysics course, says that "entropy (degree of disorder) gets maximized in spontaneous processes in isolated systems."

Third, I'd put money on the fact that you used google ebooks to search "order" and "disorder" in this book you love to talk about so much. So no, I still don't think you've opened up a biology book, and even if you have, you haven't learned a single thing about biology. Don't let the door hit you on the way out Mungy. Alicia Cartelli
Alicia Cartelli:
Mungy, why do you think that chapter is titled “Entropy Rules!”?
I can tell you what I do know. The chapter never once claims that entropy has anything at all to do with order/disorder. The word disorder doesn't even appear in the chapter, and while the word order appears twice, its use on those two occasions has nothing at all to do with order/disorder as you're using it. And that's from a textbook on the Physical Biology of the Cell. Still think I haven't opened a biology textbook? Mung
Atomic Structure Of Graphene Ordered or disordered? High entropy or low entropy? You decide. Mung
Atomic Structure of Diamond Ordered or disordered? High entropy or low entropy? You decide. Mung
Mung, Happy Christmas. As Orgel and Wicken pointed out so long ago now, order [as in a crystal] is not to be confused with either randomness [e.g. a tar] or organisation. Especially functionally specific, complex wiring diagram organisation and associated information. Such as this text and functional DNA both exemplify. And thereby lieth a long tale. KF PS: L K Nash is a double thumbs up! (I wish I had been smart enough as an undergrad to go to the Chem section! At least, I eventually figured out that probably the best physics etc textbook writers were the Russians. Thanks to a second hand bookshop in Barbados, and Mir Publishers, now first privatised then defunct. Jackpot was the Communist Party bookshop in Jamaica.) kairosfocus
Merry Christmas kairosfocus, You've been a good friend for a long time. Thank you. Mung
According to the old view, the second law was viewed as a 'law of disorder'. The major revolution in the last decade is the recognition of the "law of maximum entropy production" or "MEP" and with it an expanded view of thermodynamics showing that the spontaneous production of order from disorder is the expected consequence of basic laws. http://www.entropylaw.com/ YIKES! Mung
The diagrams above have generated a lively discussion, partly because of the use of order vs disorder in the conceptual introduction of entropy. It is typical for physicists to use this kind of introduction because it quickly introduces the concept of multiplicity in a visual, physical way with analogies in our common experience. Chemists, on the other hand, often protest this approach because in chemical applications order vs disorder doesn't communicate the needed ideas on the molecular level and can indeed be misleading. The very fact of differences of opinion on the use of order and disorder can itself be instructive. http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html Mung
In physics, the terms order and disorder designate the presence or absence of some symmetry or correlation in a many-particle system. https://en.wikipedia.org/wiki/Order_and_disorder_%28physics%29 Mung
Merry Christmas all … A good and lively discussion, and I hope everyone gained something here. Merry Christmas! I did gain something, but not without an increase in entropy! Mung
Merry Christmas all ... A good and lively discussion, and I hope everyone gained something here. ayearningforpublius
Before buying on amazon always check half.com Virgil Cain
Haha, looking at the price on amazon now I remember why I hadn't bought it. Ah well, settled for a used copy. Mung
hmm.. thought I had that one, guess I need to get it. Still don't have the Jaynes text either. The books I do have though aren't too shabby :)

Arieh Ben-Naim - A Farewell to Entropy: Statistical Thermodynamics Based on Information
Arieh Ben-Naim - Statistical Thermodynamics: With Applications to the Life Sciences
H.B. Callen - Thermodynamics
H.B. Callen - Thermodynamics And An Introduction to Thermostatistics
Amnon Katz - Principles of Statistical Mechanics: The Information Theory Approach
A.I. Khinchin - Mathematical Foundations of Statistical Mechanics
Lewis and Randall - Thermodynamics
Leonard K. Nash - Elements of Statistical Thermodynamics
H.C. Van Ness - Understanding Thermodynamics

Fortunately for Alicia I haven't given these away. iirc the Callen texts explicitly derive thermodynamics from information theory. As you say, the connection between thermodynamics and information theory is well known and well-established and has been for decades. Mung
Mung, have a look in Robertson’s Statistical Thermophysics, Ch 1. Will do. Mung
Mung, Harry Robertson, in using aircraft in a control tower's area, reduced to molecular size and numbers, as an analogy, pointed out that the loss of access to detailed information means we cannot extract work from detailed knowledge of the position and motion of each a/c. So we are forced to treat the whole system [thermodynamic sense] as though it were random. In this context, the increased number of possibilities for distribution of particle-level mass and energy consistent with gross-scale observable states can often be associated with qualitative concepts of increased disorder -- randomness here being (understandably) associated with disorder. A classic case is free expansion, from filling half a container to accessing the whole container on suddenly removing a barrier. But that is by no means a definition or a detailed, one-size-covers-all physical insight. KF PS: Mung, have a look in Robertson's Statistical Thermophysics, Ch 1. kairosfocus
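A minimal numerical sketch of that free-expansion case (ideal gas, volume doubling; the mole count is illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Free expansion of an ideal gas from half a container into the whole:
# dS = n * R * ln(V2/V1), even though no heat flows and no work is done.
n = 1.0            # moles (illustrative)
V2_over_V1 = 2.0   # half container -> whole container

dS = n * R * math.log(V2_over_V1)
print(f"dS = {dS:.2f} J/K")  # ~5.76 J/K per mole for a doubling
```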
Aleta, the correction I had to make above -- that your truncated clip manages to reverse the actual point on information and thermodynamics being conceded by the Wiki article -- suffices to show what is going on. It is a further telling point that you are busily doubling down. We take due note, and move on. The bottomline for record, as noted long since from that and other Wiki articles speaking against interest:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann’s constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
KF PS: Aleta, FYI as one trained in Physics, I have had relevant exposure to classical and statistical thermodynamics as a necessary core part of that training. I am not a self-proclaimed "expert" or autodidact. The thermodynamics stands on its own merits. kairosfocus
Aleta:
Is it not true that in some areas, notably thermodynamics, one way to think about entropy is as a measure of order vs disorder?
I'm probably not the one to ask, hehe. People can certainly think of things that way, but it leads to unnecessary confusion. Take a look at the picture with the rocks at this link: http://physics.info/temperature/ Does calling the energy ordered or disordered really add anything of value? Does it tell us anything about entropy? Mung
We have seen that the Boltzmann distribution can be derived on the basis of counting arguments as a reflection of the number of ways of partitioning energy between a system and a reservoir. A completely different way of deriving the Boltzmann distribution is on the basis of information theory and effectively amounts to making a best guess about the probability distribution given some limited knowledge about the system such as the average energy. - Physical Biology of the Cell (p. 253)
That's pretty amazing, that the Boltzmann distribution can be derived from information theory. (Not really.) Mung
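To make the book's claim concrete, here is a minimal Python sketch -- a toy illustration only, not the book's own derivation; the four energy levels, the inverse temperature, and the perturbation size are all made-up values. It builds the Boltzmann distribution p_i = exp(-beta*E_i)/Z and then checks that nudging the probabilities in a direction that preserves both normalization and the average energy can only lower the Shannon entropy:

import numpy as np

E = np.array([0.0, 1.0, 2.0, 3.0])   # toy energy levels, in units where kT = 1
beta = 1.0                            # inverse temperature
p = np.exp(-beta * E)
p /= p.sum()                          # normalize by the partition function Z

H = -(p * np.log(p)).sum()            # Shannon entropy of the Boltzmann distribution

# Perturb p along a direction that keeps sum(q) = 1 and the mean energy unchanged.
A = np.vstack([np.ones_like(E), E])   # the two linear constraints
d = np.linalg.svd(A)[2][-1]           # a direction in the null space of the constraints
q = p + 0.02 * d                      # a small feasible perturbation; q stays positive here

H_q = -(q * np.log(q)).sum()
print(H_q < H)                        # True: Boltzmann maximizes entropy at fixed mean energy

That is the "best guess given limited knowledge" point in code: of all distributions consistent with the stipulated average energy, the Boltzmann distribution is the one that assumes the least.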
I know nothing about any discussion with Granville, whoever he is, and I am just a layperson, albeit somewhat well-read. There are many places where it is stated that, for instance, entropy is increasing in the universe as whole, that in a closed system entropy will increase, and that local decreases in entropy are possible in an open system where there is an outside source of energy. In all these uses, the state of increased entropy is said to be more disordered. Obviously, entropy is a much more complicated subject than this, and it appears it is used differently in different disciplines, even though the various uses are related by some underlying concepts. Are the things I am saying here wrong? Is it not true that in some areas, notably thermodynamics, one way to think about entropy is as a measure of order vs disorder? Aleta
The idea of statistical mechanics is to deliver a probability distribution that tells us the probability of all the different microstates. - Physical Biology of the Cell Mung
I know, lol. I said the same thing about Alicia. Mung
This is confusing. When Granville wrote about entropy and disorder evos jumped all over him for not understanding entropy and thermodynamics. And now Aleta is here defending his usage. Virgil Cain
Aleta, even if kf disagreed with me and said so, and even though I have the greatest respect for kf, it would not change my mind. Last I checked ordered and disordered are not terms of physics or thermodynamics. How much order is there in that there system? Less than before sir, we just observed an increase in entropy! Who is to say the equilibrium state is not the most ordered? Mung
Mung:
Entropy is a special case of the Shannon Measure of Information. It has to do with probability distributions, not order or disorder.
Wikipedia:
In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution) is a probability distribution, probability measure, or frequency distribution of particles in a system over various possible states.
Mung
It makes a difference in this discussion, Mung, because I think your statement that entropy has nothing to do with moving from order to disorder is wrong, and I would think kf, a self-proclaimed expert on these matters, ought to know that. I know, and I know more than I did by reading a bit, that the concept of entropy is used in several different areas - one in which it has to do with order/disorder and one in which it does not - and that those two ways are related. To say entropy has nothing to do with changes in order is to ignore the first use and look exclusively at the second. And yes, the universe is, as far as we know, becoming less ordered - its entropy is increasing. (Note that even saying that shows that change in entropy and change in disorder are being used here in a way that you say doesn't exist.) I didn't know that question was on the table. Aleta
Aleta: Are you willing and able to answer that simple question? What difference does it make whether he answers or not? Why don't you go look up what Boltzmann had to say? Why don't you answer the question about whether the entropy of the universe is increasing or decreasing, and whether the universe is becoming more or less ordered? Mung
And kf, I ask again: "Here's a quick Yes/No question: do you think it is correct to say, 'Entropy has nothing to do with moving from order to disorder' in respect to thermodynamic entropy?" Are you willing and able to answer that simple question? I will take your willingness to answer this question, or not, to be revealing of your sincere interest in discussion rather than in perpetual argumentativeness. Aleta
kf, you choose to not read accurately, but rather to always create divisions. You just wrote, "Aleta, your truncation was false and misleading as the immediately following words went on to show exactly the connexion between information and thermodynamic entropy you seem to imagine does not exist." But I wrote that "Entropy in physics and entropy in information theory are related but different concepts." "Related" means that there is a connection. In no way did my statement state or imply that I "seemed to imagine that there was no connection." You are the one being intellectually dishonest. Aleta
Aleta, your truncation was false and misleading as the immediately following words went on to show exactly the connexion between information and thermodynamic entropy you seem to imagine does not exist. No, those words are not merely interesting, they are key and highly specific on the issue of the statistical weight of clusters of microstates and the resulting information gap that ties the two views together, even showing how quantification applies. Maybe you need to know about Jaynes and the Information view of thermodynamics in Physics. But, you were responsible to read context, especially in such a situation. And your non-responsiveness when that context is pointed out is revealing, and not to your advantage. KF kairosfocus
Good for me. Mung
You're the only person on the planet who thinks that, Mungy. Alicia Cartelli
Mungy, why do you think that chapter is titled “Entropy Rules!”? Because nothing "counteracts" entropy. Not even a transition from disorder to order. Mung
We counteract entropy for about 80 years, yearning, then we die. Bacteria have been doing it for billions of years. Mungy, why do you think that chapter is titled "Entropy Rules!"? Alicia Cartelli
Alicia: This is not even Biology 101. It's Life 101 (except for the young, who are invincible). No one is able to counteract entropy. We all die . . . or haven't you noticed? ayearningforpublius
"Life is able to counteract entropy" is true >99% of the time. The exceptions are when Mungy comes up with the few ridiculous examples of things that are obviously not alive, but do counteract entropy to some extent. If you'd like me to refine my definition, then: "life counteracts entropy through constant energy transformations." Alicia Cartelli
I think the missing ingredients are duct tape and superglue. Mung
Well Mung, that is because those scientists who might possibly intelligently design life in a lab arose by unintelligent causes. :roll: Virgil Cain
Yes it is interesting but it doesn’t get us any closer to answering the materialistic OoL question.
But Virgil, all we need is the very idea that life might be intelligently designed in a lab to allow us to believe that life might have arisen by unintelligent causes in the real world. Mung
From comment 35:
See Wickstead et al., Patterns of kinesin evolution reveal a complex ancestral eukaryote with a multifunctional cytoskeleton, BMC Evolutionary Biology 2010.
It is all speculation based on the assumption kinesin did evolve. AND it requires starting with the very things that require an explanation in the first place. Not to mention it only covers mere evolution which could easily mean evolution by design. Only true believers would be satisfied with such explanations. If only our opponents applied their "skepticism" universally.
No one knows how the first cell arose. However, there is some interesting work being done by the Szostak Lab.
Yes it is interesting but it doesn't get us any closer to answering the materialistic OoL question. Virgil Cain
Zachriel @81: Thanks for your answers, appreciate it. As to enrolling in university ... would love to, but age and finances probably rule that out. If I were to enroll, my target would be micro-biology of some sort aimed towards medical research. Can you offer a scholarship? Perhaps football. ayearningforpublius
ayearningforpublius: What I would like is a credible evolutionary description that tells me how Kinesin and the many other cellular machines came to be
Kinesin was answered @35. Many of your other questions were answered @52.
ayearningforpublius: and for closure to that explanation, I would like an evolutionary explanation of how the cell itself came into being.
No one knows how the first cell arose. However, there is some interesting work being done by the Szostak Lab. http://molbio.mgh.harvard.edu/szostakweb/
ayearningforpublius: These are reasonable questions, and students should be entitled to well substantiated, insult free, answers (or speculations).
Have you considered enrolling in university? Zachriel
FYI: 78 was in reply to kf at 76, but that's not clear because 77 intervened. Aleta
Alicia Cartelli:
And I didn’t answer the snowflake question because it is asinine. Yes snowflakes are ordered, as are micelles when you place lipids in a water environment.
Now connect that to your comments about entropy. And then consider: if the non-living world can also counteract entropy, then how does "something that is able to counteract entropy" define what life is? That, after all, was your original claim that started this all off.
Certain molecules have emergent properties that allow them to counteract entropy, but they are non-living because life only occurs at the macromolecular level.
Name a "non-living" molecule that has emergent properties that allows it to counteract entropy. Mung
I "truncated" (it's called quoting) just a short quote because it made the point concisely, and not many people are interested in the extensive quotes that you like to provide. Your quotes make some interesting points about how the two fields of entropy are related, but to say that my not including such material is "highly misleading to the point of disregarding truth" is a silly statement. Here's a quick Yes/No question: do you think Mung is correct when he says, “Entropy has nothing to do with moving from order or disorder” in respect to thermodynamic entropy? Aleta
Virgil Cain @73 Thanks! Re your comment " ... must be familiar with the fact that unguided evolution cannot account for any of them." I don't expect such an admission from a hardened evolutionist who is afraid to "let a divine foot in the door." What I would like is a credible evolutionary description that tells me how Kinesin and the many other cellular machines came to be, and for closure to that explanation, I would like an evolutionary explanation of how the cell itself came into being. "You don't understand evolution/biology ... " is not an acceptable explanation in this classroom. I would also like an explanation of how the massive amounts of digital data -- meaningful, useful and usable information -- found in DNA came to be, and an explanation as to how cell differentiation and the precise timing of same came about. These are reasonable questions, and students should be entitled to well substantiated, insult free, answers (or speculations). ayearningforpublius
Aleta, Why did you truncate -- or has Wiki done its usual thing? Here is the key context from a few years ago as clipped in Section A, my always linked through my handle:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Quite a different picture, nuh? Here is how it currently reads:
At a multidisciplinary level, however, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamic entropy, as explained by statistical mechanics, should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics, with the constant of proportionality being just the Boltzmann constant. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states of the system that are consistent with the measurable values of its macroscopic variables, thus making any complete state description longer. (See article: maximum entropy thermodynamics). Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total thermodynamic entropy does not decrease (which resolves the paradox). Landauer's principle imposes a lower bound on the amount of heat a computer must generate to process a given amount of information, though modern computers are far less efficient.
In short, your clip was highly misleading to the point of disregarding truth. I think you have some 'splaining to do. KF PS: The remark on strings of Y/N qs to specify state should be familiar to those who have seen me discuss FSCO/I in terms of descriptions of node-arc meshes. kairosfocus
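To put a number on the "yes/no questions" remark in the passage quoted above: if a macrostate is consistent with W equally likely microstates, pinning down the exact microstate takes log2(W) yes/no questions (each halving the candidates), and the same count expressed thermodynamically is S = kB ln W. A minimal Python sketch, with W chosen purely for illustration:

import math

kB = 1.380649e-23          # Boltzmann constant, J/K
W = 2**20                  # suppose ~a million microstates share the known macrostate

bits = math.log2(W)        # minimum yes/no questions needed to specify the microstate
S = kB * math.log(W)       # the same missing information as thermodynamic entropy

print(bits)                # 20.0 questions
print(S)                   # ~1.9e-22 J/K

The second number also illustrates the "right off the scale" remark in the excerpt: even twenty bits of missing information amounts to a thermodynamic entropy so small that ordinary lab-scale entropies correspond to astronomically many bits.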
MNY, language -- one of the seven words. KF kairosfocus
Entropy in physics and entropy in information theory are related but different concepts, Mung.
The inspiration for adopting the word entropy in information theory came from the close resemblance between Shannon's formula and very similar known formulae from statistical mechanics. ... At an everyday practical level the links between information entropy and thermodynamic entropy are not evident. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the minuteness of Boltzmann's constant kB indicates, the changes in S / kB for even tiny amounts of substances in chemical and physical processes represent amounts of entropy that are extremely large compared to anything in data compression or signal processing. Furthermore, in classical thermodynamics the entropy is defined in terms of macroscopic measurements and makes no reference to any probability distribution, which is central to the definition of information entropy.
from Wikipedia Aleta
Alicia the clueless:
Don’t worry yearning, I am very familiar with the kinesins, dyneins, and myosins.
Then you must be familiar with the fact that unguided evolution cannot account for any of them. Cheers, Virgil Cain Virgil Cain
Alicia Kartelli wrote: Mohammad, did you forget to take your pills today? I don't take pills, because I keep my emotional life in order with prayer, and all manner of subjectivity. Presently about 50 percent of students at Harvard suffer from debilitating depression during their student career. That pattern of an epidemic of college depression is repeated throughout the USA, and across the world. http://www.adaa.org/finding-help/helping-others/college-students/facts http://www.healthline.com/health/depression/college-students Harvard is now a shithole of depression, by reasonable judgement, that's how far things have come. That is the result of evolution theory functioning as a catalyst in the commonly human head vs heart struggle, the head destroying the heart. Subjectivity is an inherently creationist concept. Subjectivity works by choosing what the agency of a decision is, that procedure results in an opinion. But evolution theory kills all room for agency, removes subjectivity, hence students become depressed. Who Alicia is as being the owner of her decisions, choosing to say what she does, is a matter of opinion. That means the answer to this question can only be reached by choosing it, and any chosen answer would be logically valid. And I choose "puke" as the answer of who she is in her soul. The usual mister Spock personality, the heart dead, and so on. I mean, who ever saw any evolutionist who had any kind of impressive emotional life? mohammadnursyamsu
Mung, on a statistical level, a useful way to view entropy is as a metric of the number of ways mass and energy at micro level may be arranged consistent with macro level state defining variables. KF kairosfocus
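A toy worked example of that statistical view -- an "Einstein solid" of N oscillators sharing q indistinguishable energy quanta; the model choice, the helper name, and the numbers are mine, purely for illustration. The macrostate fixes only the totals (N, q), and S/kB = ln W counts the microscopic arrangements consistent with them:

from math import comb, log

def entropy_over_kB(N, q):
    # W = number of ways to distribute q indistinguishable quanta among N oscillators
    W = comb(q + N - 1, q)
    return log(W)              # Boltzmann: S / kB = ln W

print(entropy_over_kB(10, 10))    # ~11.4
print(entropy_over_kB(10, 100))   # ~29.1: adding energy multiplies the microstate count

This is also the mechanism behind the Wikipedia passage quoted above: adding heat raises entropy precisely because it increases the number of microscopic arrangements consistent with the macrostate.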
You tell me why it is, Mungy. Demonstrate your knowledge of biology and chemistry. And I didn’t answer the snowflake question because it is asinine. Yes snowflakes are ordered, as are micelles when you place lipids in a water environment. Certain molecules have emergent properties that allow them to counteract entropy, but they are non-living because life only occurs at the macromolecular level. Alicia Cartelli
p.s. Is the universe becoming more or less ordered over time? Mung
Aleta:
So I wonder what Mung thinks entropy is about if it’s not about changes between order and disorder?
First, if one defines ordered as the lowest entropy state and disordered as the highest entropy state, then one is either begging the question or arguing in a circle (pretty much the same thing). So what we need is the relationship between order/disorder and entropy that doesn't beg the question. Second, because I can come up with counter-examples. Notice how Alicia doesn't answer the snowflake question? Third, because I am 100% sure that I know more about entropy than most people. :) Entropy is a special case of the Shannon Measure of Information. It has to do with probability distributions, not order or disorder. Mung
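A short sketch of the distinction being drawn here, with toy probabilities chosen only for illustration (the helper name shannon_bits is mine): the Shannon measure is a function of the probability distribution alone, so relabeling or rearranging the states -- making them look more or less "orderly" -- changes nothing:

import math

def shannon_bits(p):
    # Shannon measure of information of a discrete distribution, in bits
    return -sum(x * math.log2(x) for x in p if x > 0)

print(shannon_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits: uniform, the maximum for 4 states
print(shannon_bits([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits: sharply peaked, little uncertainty
print(shannon_bits([0.01, 0.01, 0.97, 0.01]))   # ~0.24 bits: identical under any reordering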
Dear Alicia, Chapter 6 of the textbook Physical Biology of The Cell (Second Edition) is titled Entropy Rules! Why do you suppose that is? What do you say that we use that chapter as our guide? I'm up for it if you are. Mung
Mung says, "Entropy has nothing to do with moving from order or disorder." Here are some statements from the internet, which all says that entropy is a measure of disorder, and they all talk about a change in entropy as a key concept. So I wonder what Mung thinks entropy is about if it's not about changes between order and disorder? http://www.merriam-webster.com/dictionary/entropy entropy 1 : a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder, that is a property of the system's state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system; broadly : the degree of disorder or uncertainty in a system 2 a : the degradation of the matter and energy in the universe to an ultimate state of inert uniformity b : a process of degradation or running down or a trend to disorder http://physics.about.com/od/glossary/g/entropy.htm Definition: Entropy is the quantitative measure of disorder in a system. The concept comes out of thermodynamics, which deals with the transfer of heat energy within a system. Instead of talking about some form of "absolute entropy," physicists generally talk about the change in entropy that takes place in a specific thermodynamic process. Wikipedia In thermodynamics, entropy (usual symbol S) is a measure of the number of specific realizations or microstates which may realize a thermodynamic system in a defined state specified by macroscopic observables. Entropy is commonly understood as a measure of disorder. According to the second law of thermodynamics the entropy of an isolated system never decreases; such a system will spontaneously proceed towards thermodynamic equilibrium, the configuration with maximum entropy. Systems that are not isolated may decrease in entropy, provided they increase the entropy of their environment by at least that same amount. http://www.vocabulary.com/dictionary/entropy The idea of entropy comes from a principle of thermodynamics dealing with energy. It usually refers to the idea that everything in the universe eventually moves from order to disorder, and entropy is the measurement of that change. Aleta
Don’t worry yearning, I am very familiar with the kinesins, dyneins, and myosins. I do not need to watch animations (even if you and your friends here think this counts as “understanding biology”). Also, I never said I was a teaching major. I know plenty of people who do computational work in biology; whether they got their start in biology or in computer science, none of them seem to have a problem with evolution. You are a dying breed. Mungy, you know next to nothing about biology (and apparently chemistry), but I guess that is a little more than most people. Congrats! Alicia Cartelli
Alicia Cartelli:
Entropy is the amount of disorder in a system and it is a fact that things tend to move from order to disorder. Life counteracts this move toward disorder.
Snowflakes are ordered. Did they get that way by going from more ordered to less ordered? Did they get that way by counteracting entropy? Entropy has nothing to do with moving from order to disorder. Didn't they teach you that in your biology classes? No? Shame. Mung
Alicia, I likewise am 100% sure that I know more about biology than most people. So you're not giving us much to go on there. :) Mung
Alicia; I'm going to leave you with a bit of advice that was given to me back in 1970 when I was a graduating Math major. Dr. Smith advised us to get into computer programming, as computers were the up-and-coming thing. Dr. Smith was right back then, and he would have been right saying the same thing up until 2009 when I retired from the field, and would be right were he saying the same thing today. This advice has been more than validated in my own life and career, and I would just like to pass it along (not in vain I hope). I offer this to you because another professor, from the Perelman Medical School at the University of Pennsylvania, elaborated on Dr. Smith's advice in a conversation I had with him about a year ago. We didn't talk specifically about Dr. Smith or his advice, but the Penn professor's advice fits quite nicely. We were talking about machines in the cells, and in particular the Kinesin (if you haven't seen these animations, you should seek them out, watch and learn more of them.) The Dr. advised me to check into a couple of hot fields today: Computational Biology and Systems Biology - which I have been doing. As to how all this translates to life advice to you, I would suggest that you change your major from teaching to Computer Science with emphasis on design. Then take on a biology minor or even dual degrees in Biology and Computer Science. You would then be well positioned to jump into fields such as Biomimetics, Systems Biology and Computational Biology. You could get in on the ground floor and wind up with an exciting career if that is what you seek. And you would learn a great deal about design in complex systems. I've had my career, and it was a good one, but if I could start anew, these fields would be it for me. And -- I do wish you well. ayearningforpublius
Mungy, I don’t pretend to know it all. But I am 100% sure that I know more about biology than most people. Entropy is the amount of disorder in a system and it is a fact that things tend to move from order to disorder. Life counteracts this move toward disorder. If you need to be told that snowflakes are non-living, then I’m just going to see myself out. Yearning, you are beyond confused. First we talk about the process of evolution, now you want to talk about embryologic development. You should make up your mind because this is obviously way too much biology for you to handle at once. The “information and manufacturing processes arrived in that tiny space” when a gamete from both the mom and dad fused. Again, Bio 101. I hope you are not trying to take over the role of “teacher” here, because boy, that would be the day. Mohammad, did you forget to take your pills today? Alicia Cartelli
Alicia unaware:
Entropy is basically when things move toward a more disordered state. All life counteracts this, while non-life does not.
Life is intelligently designed to counteract that, while non-life requires our help or it meets that certain fate. Cheers, Virgil Cain Virgil Cain
Alicia Kartelli wrote: "Mungy, the difference between you and I is that I don’t have to lie through my teeth." All people know for a fact that freedom is real and relevant in the universe, that things in the universe are chosen. They come to this knowledge naturally at a very early age, even without religious instruction. Evolution theory is in denial of this fact that they know to be true. So evolution theory requires to promote falsehoods, which the people know themselves are falsehoods. In the future students will be taught that once there was a time that scientists denied that freedom is real and relevant in the universe. The students will be astonished to hear of the majority of the most educated men and women denying the totally obvious. How could the majority have been liars? It must have been total chaos then. And yes it was chaos, the holocaust, the impending threat of thermonuclear war, environmental destruction. All sorts of anti-freedom ideologies like nazism, communism, materialism, atheism, came from the universities where evolution theory had taken hold, resulting in mayhem, world wars. mohammadnursyamsu
Alicia: It seems obvious that you haven't watched the embryonic development of the chicken egg. I could have made it easier for you to find at https://www.youtube.com/watch?v=-Ah-gT0hTto It's only 4 minutes long so why don't you watch it and then get back to me. Yes I can focus on a single species at a certain point in time, and that's what this video so clearly illustrates. Not counting external sunshine, food and water, all that is required to develop that chicken from fertilized egg to the frying pan is contained right there from the beginning in the first fertilized egg. We see blue-print like information being transformed into a working body plan. We see the results of a computer like mechanism assembling chicken parts in a timely sequence resulting in a hatched chick, and eventually the bird in a pan. So what I see from biology is information and a mechanism to translate that information in a manufacturing like sense to a fully working model of a Rhode Island Red. In other words, an Intelligent Design. Similar to the various major brand auto manufacturing factories producing Fords, Volvos and John Deeres from plans and via manufacturing processes. Now watch the 4 minute video and explain to me again how all this information and these manufacturing processes arrived in that very tiny space. I think you can get it Alicia ... I think you are smart enough to sit back and really think about this, and I really believe you can pass this course. You are worth it and you can do it. ayearningforpublius
Alicia, I prefer to let you know-it-alls continue in your smug superiority. Are snowflakes ordered or disordered? Are snowflakes alive? Is life ordered or disordered, and is life getting more ordered over time or less ordered over time? Entropy has nothing to do with moving from order to disorder. Didn't they teach you that in biology class? Mung
Yearning, apparently you have forgotten everything you learned. This is quite obvious in the way you talk about biology. When you talk about evolution in the way that we are, you can’t really focus in on a single species at a certain point in time. You love to talk about evolution and its occurrence over “deep time” and then you say “well the systems that I have or this other organism has right now are all required.” You’re a walking contradiction. Mungy, the difference between you and I is that I don’t have to lie through my teeth. I don’t have to use the internet when I talk about biology either because I don’t go to Google U like you do. And again, you can open all the books you want, but that doesn’t mean you actually learned anything. Why not just say “I understand basic biology,” if you think that you do? It’s almost like you’re actually trying to admit you don’t know anything about biology. Entropy is basically when things move toward a more disordered state. All life counteracts this, while non-life does not. Alicia Cartelli
That is true. I just open them the normal way. Virgil Cain
Well, Virgil, it's just obvious that you have never cracked open a biology book. Mung
Zachriel:
Lungs evolved, along with air bladders, in fish. Lungfish are intermediate.
That is the propaganda but propaganda is not evidence. The fossil record shows fish -> tetrapods -> fish-a-pods whereas Common Descent predicts fish -> fish-a-pods -> tetrapods. Zachriel's willful ignorance, while entertaining, is still ignorance and should be ignored. Virgil Cain
ayearningforpublius: A strange way of addressing my questions
It's an important point, though not a comprehensive answer. The evolution of metazoan complexity is complex and only partly understood. As always, it's best to start with establishing the historical transition. Given the evidence for common descent, some of the pieces start to fall into place. The fossil record shows that the earliest life was unicellular. This was followed by primitive sponges, then more complex metazoa including bilaterians. Based on fossil and molecular evidence, metazoan evolution probably started with colonial cell behavior, followed by increasing cell differentiation. The basic transition is from diploblastic to triploblastic. From there, protostomic to deuterostomic. Whole-genome sequencing is helping to unravel the details.
ayearningforpublius: My lungs
Lungs evolved, along with air bladders, in fish. Lungfish are intermediate.
ayearningforpublius: My heart … My circulatory system …
The four-chamber heart evolved from a simpler heart. There are ample extant intermediates. The first internal circulatory system appeared in ancestors of triploblasts, and endothelium appeared in ancestral vertebrates. Again, there are extant intermediates. Flatworms lack a vascular system or coelom, but circulate nutrients in the gap between the ectoderm and endoderm. The next step was a coelom with cilia providing simple circulation.
ayearningforpublius: My muscular system …
Cell contraction to create mechanical movement is deeply rooted in the tree of life. Sponges don't have muscles, but have some of the proteins associated with muscles. As for your own striated muscles, see Seipel & Schmid, Evolution of striated muscle: Jellyfish and the origin of triploblasty, Developmental Biology 2005.
ayearningforpublius: My digestive system …
Some organisms only have a single opening for their digestive system; however, you have two! The story of your digestive system is the story of your anus. Yes, you are an elaborated deuterostome, with appendages to stuff food into one end. Your digestive system develops the same way it does in a fish or a starfish. Zachriel
Life, all life, is miraculous. Even Alicia. :) Mung
Alicia Cartelli:
Mungy, you can give away all the textbooks you’d like. That doesn’t mean you actually opened them.
And you can say anything at all you like here and it doesn't mean you're not lying through your teeth. See how that works? The fact is that I did open them all, each and every one. What does it mean, exactly, to counteract entropy, and how do we know whether or not counteracting entropy is actually taking place? Is there a measuring stick? Mung
ayearningforpublius@4: Since I failed miserably in answering my own questions @29, I guess I'll take another crack at answering those questions. This time I will forgo sarcasm and just use some basic biology. The answers lie in basic biology, in particular cellular biology, as follows: The body systems I list all arise from a single fertilized cell. The various types of body systems develop as the cells split and differentiate into cells of the various parts necessary to produce the "directed" end product: muscle, brain, nerve, bone cells, etc. The various cell types required, for example, to build a heart are directed to do so as the overall body grows and develops into the final finished product, e.g. a seagull. And what is the Intelligently Designed mechanism at the root of all of this? Information is contained at the cellular level by such mechanisms as DNA and probably other mechanisms as well. This information can be likened to a set of blueprints containing all that is needed to construct and maintain a living animal. Also at the cellular level are mechanisms that act much like computer hardware and software that determine the type of cell to be produced, and when in the sequence of development those cells are expressed to become muscle, bone, nerve, brain, etc. The sequence I've described can be seen vividly in the video "Flight the Genius of Birds" at: http://flightthegeniusofbirds.com/ produced by Discovery Institute. A sequence in this video shows a time-lapse development of a fertilized chicken egg - I believe this was done with sonogram technology. The fascinating and instructive part of this video is that it shows the complete body plan of the bird apparent in the second day -- front/back, top/bottom, left/right. And then it continues to the hatching of the egg 28 days later. I believe what I have described here is a reasonable case study for what can be inferred to be a case for design and intelligence in what we actually see in nature. Now will someone present a reasonable and believable explanation as to how purely naturalistic processes (with time) result in what is present at the beginning in that single fertilized chicken egg? ayearningforpublius
Alicia: FYI -- no sarcasm in my previous two comments. ayearningforpublius
Alicia @33: Your words: " ... When I say “These body systems do not all have to be present in a living organism,” that is exactly what I mean. In fact, none of them are required for life. They can all be completely absent, and you can still have a living organism. ... " A strange way of addressing my questions, and I suggest you word-smith just a bit lest you leave the impression that my human body can somehow live without the body systems I list. My human body does require most of these systems for life -- some are essential, while the loss of others would only cause a degradation of some sort. Same with fish; they have their own set of essential body systems. Same with birds. Same with mollusks ... and on and on. You say " ... You think that because humans are complex and can’t live without these systems, therefore no other organism can. These systems must be present “at the same time,” you like to say. Well you are very wrong. ... " No Alicia, I am not wrong here. By now we both know that the list of body systems I presented was specific to humans, and more generally speaking to most mammals. I could use a different list for fish and a different one for mollusks and a different one for bacteria. But in each case there would be a necessary and definable list of body systems required for life to exist in that particular type of life -- and yes, they must be extant at the same time -- my heart dies ... I die. ayearningforpublius
Alicia @33: You should not be so quick to claim knowledge of what another is capable of doing. The fact is I have taken and passed a college level biology class. I don't recall my grade, but it most likely was an A or B which would be consistent with my other courses in my BA Math/Physics studies. I also don't recall if there was an evolution component, but if there were I most likely would have aced it since at the time I was an Atheist and evolution would have been right up my alley. Would I pass such a course today? I don't see any reason why I wouldn't. ayearningforpublius
Alicia Cartelli- YOU can open all of the biology textbooks you like. That doesn't mean you understand what is in them. :razz: Virgil Cain
Mungy, you can give away all the textbooks you'd like. That doesn't mean you actually opened them. And once again you're wrong on all counts. Alicia Cartelli
Alicia Cartelli:
If you want to boil it down, life is something that is able to counteract entropy.
You're starting to sound like a Creationist. Nothing counteracts entropy. You're probably just confused about negentropy (negative entropy). Mung
Oh Alicia, I've probably given away more biology textbooks than you've ever opened. Mung
If Alicia ever opened a biology textbook she would have known that Basic biological reproduction is irreducibly complex Virgil Cain
Really, Daniel? I know of a number of professors of religion, with very good understandings of the theologies of several religions, who are themselves agnostics or atheists. In fact, the more deeply one studies comparative religion, the more likely it is for many to realize how much those religions are human inventions, and thus to not "believe in" any of them. So I can't begin to understand why it is inconceivable to you that one can understand theology without believing it is true. Aleta
Keep in mind that it isn’t necessary to believe in God to pass a course in theology.
Impossible to keep in mind. Can't even be conceived. Daniel King
Oh Mungy. If you ever opened a biology book, you wouldn't have to ask me that question. The first chapter of every general biology book will explain to you that we consider life anything capable of reproduction, growth and development, metabolism, evolutionary adaptation, and responding to the environment. If you want to boil it down, life is something that is able to counteract entropy. Alicia Cartelli
Like most on here, you should at the very least learn some basic biology, before you try to talk about its complexities. I won’t hold my breath though.
Dear Alicia, what is life? When you can answer that, perhaps you'll be competent to teach biology. Until then, you're just pretending. Mung
See Wickstead et al., Patterns of kinesin evolution reveal a complex ancestral eukaryote with a multifunctional cytoskeleton, BMC Evolutionary Biology 2010.
This is all speculation based on the assumption they did evolve and were not designed. But it does demonstrate the low level of evidence evos use to support their claims, not even realizing they only support a vague concept of "evolution" and do not support natural selection or drift. BTW I would not only pass an introductory course in biology, I would show the professor how ignorant we really are when it comes to evolution. Virgil Cain
ayearningforpublius: And I fully agree I probably would not pass an introductory course on biology
Keep in mind that it isn't necessary to believe in God to pass a course in theology.
ayearningforpublius: if I went into the microbiological research area where apparently designed machines such as Kinesin are abundant, and there is apparently no evolutionary explanation for them
See Wickstead et al., Patterns of kinesin evolution reveal a complex ancestral eukaryote with a multifunctional cytoskeleton, BMC Evolutionary Biology 2010.
ayearningforpublius: It is claimed that evolution is not goal directed — true?
Correct.
ayearningforpublius: It just happens over the course of ‘deep-time’ — true?
Not correct, any more than the Solar System "just happened". Zachriel
Alicia says:
Humans (and every other species on Earth today) are the product of billions of years of evolution ...
That is the propaganda, anyway. It has nothing to do with science, though. BTW basic biology doesn't support your claims about evolution. Virgil Cain
You would fail an evolutionary biology course, and likely a general bio course, based on your current understanding. You doing research in any field of biology right now is laughable. And maybe you should avoid using “sarcasm” while trying to demonstrate an understanding of something. You just come off as an idiot when people can’t tell where the sarcasm stops and your actual attempts to sound intelligent begin. When I say “These body systems do not all have to be present in a living organism,” that is exactly what I mean. In fact, none of them are required for life. They can all be completely absent, and you can still have a living organism. You, as a human, require all these systems, but life in general does not. Humans (and every other species on Earth today) are the product of billions of years of evolution, and during these billions of years species have existed with many different combinations and variants of these systems. Your problem is that you have a human-centric view of evolution. You think that because humans are complex and can’t live without these systems, therefore no other organism can. These systems must be present "at the same time," you like to say. Well, you are very wrong. Yes, each system has a specific function, but the evolution of these systems did not have in mind the structure/function that we see today. They are adaptations that have been formed over millennia, have constantly been reformed, and continue to be reformed. Like most on here, you should at the very least learn some basic biology before you try to talk about its complexities. I won’t hold my breath though. Alicia Cartelli
Well it seems I have flunked my introductory re-education courses in evolution. And I fully agree I probably would not pass an introductory course on biology -- that is, I would probably fail and indeed be drummed out of school when I took the evolutionary part of the course. But biology has many sub-disciplines where I would probably do quite well, especially if I went into the microbiological research area where apparently designed machines such as Kinesin are abundant, and there is apparently no evolutionary explanation for them nor even a practical need to look for an evolutionary explanation in order to accomplish research and advance knowledge. If there were an evolutionary explanation, I think I would have found it by now ... and I have looked and not found it. And yet research proceeds. Another area in which I, as well as my teachers, may have failed is basic reading comprehension. Even though I prefaced my latest, somewhat hare-brained story here as sarcasm, I failed to get that point across to my teachers, so I will remind them again that it was sarcasm -- maybe we all need remedial courses here. Dictionary definition here: sar·casm [ˈsärˌkaz(ə)m] NOUN 1. the use of irony to mock or convey contempt: "his voice, hardened by sarcasm, could not hide his resentment" synonyms: derision · mockery · ridicule · scorn · sneering · scoffing · [more] But teacher, I do have a problem when you point out an error in my screed. You say "These body systems do not all have to be present in a living organism." Should I assume here that you meant "at the same time with one another"? Well then I would ask, if that were true, which of the items, or set of items, I mentioned in my long list above could I get along without and still function as a reasonably normal human being? I realize some of these are redundant, but would not that redundancy itself be a pointer to design, if only in the sense that if one member of that redundancy failed or were not there, there would be a degradation of the full-up system? Other functional capabilities/parts such as smell, taste and hearing are of lesser life-critical importance, but again, loss of them results in a net degradation (spoken as one having a serious hearing loss.) So please teachers, answer my basic questions at 4 above in terms that can be readily understood without hand waving, paper mache mountains, flexible Polaroid film strips, or just-so story telling - and don't answer by telling me that I "just don't understand." Tell me in simplistic fashion what YOU understand, and then relay that to the rest of us. Otherwise I am afraid I may indeed have to go to the real re-education camps, complete with rubber hoses and the loss of Politically Correct food such as at Oberlin College. ayearningforpublius
“I have this gigantic box full of parts. But how do I get them all put together in one complete package (like myself), and put together all at the same time?” I’m going to skip everything before this because it’s pretty much a string of thoughts I would expect a second grader to come up with and it also has nothing to do with evolution. Now, yearning, what you have said here is completely and utterly incorrect. These body systems do not all have to be present in a living organism, and in fact the vast majority of living organisms on this planet do not have any of them. You are so confused about evolution and science in general that it hurts. As Zach said, you would fail a basic evolutionary biology class. Fail it miserably, I might add. Alicia Cartelli
ayearningforpublius: I am a person that does better using illustrations or visuals, so let me see if I can paint some sort of mental picture of evolution as I now understand it.
You would not pass an introductory course on biology based on what you wrote. Try https://en.wikipedia.org/wiki/Evolution Zachriel
I hope by now that most have understood the sarcasm in which I have written here on this thread. I will continue here with sarcasm, but wish to offer this bit of seriousness as a prelude to the rest of my remarks. I am uncomfortable by nature in using any style of speech or writing that attacks, mocks or ridicules others personally, or needlessly embarrasses. Name calling is not in my nature and I strive to resist the urge. Thus please realize and accept my attempt here to de-personalize and genericize my remarks to the topic at hand. When referring to individuals commenting here I will use a generic “Jane” or “John” as appropriate. So let me continue … with sarcasm turned on for effect. ** ** ** ** Thank you Jane and John for your remarks, even though they are contrary to my own views. Jane remarked “Yearning, you demonstrate a complete lack of understanding of the basics of evolution.” On the contrary, Jane -- thanks to these very brief lessons I now believe I do understand evolution. I am a person that does better using illustrations or visuals, so let me see if I can paint some sort of mental picture of evolution as I now understand it. Ready … begin! So we have this thing called “deep time” in which all of this evolution takes place. As an illustration let's use a very large blanket, tarp or one of those cloaks like I saw a famous illusionist use one time at a Las Vegas show. This covering is quite large, because not only does it have to contain all those things that cause (oops – used a bad word there, perhaps a future lesson will give me a more correct vocabulary) evolution to happen, but it must also have room for all of those little bitty steps along the way to a final product, and also a place to collect the trash of those things that just did not work out – perhaps they can be recovered and used later for some other purpose (oops – again I’ll have to find another word.) So under this very large blanket, we have a very large assemblage of things (atoms I suppose) that are randomly moving around and often bumping into one another and moving one another in some different direction or another. OK so far? Now some of these atoms start lining up in peculiar ways that are not random at all, and in fact turn into what we now know as “elements.” Gold, hydrogen, oxygen, nitrogen are a few examples of this evolution that takes place under this “deep time” tarp. Eventually (I told you this is a very big drape, tent or whatever) these accidental bumps create (oops again – darn it’s real hard to come up with descriptive words that describe … well describe things that are made) Eureka! I’ve just hit on it: “made” is what I will use in place of those other incorrect and forbidden words … whew -- I sure feel better now. Sorry, lost my train of thought there. What I was saying was that all of these accidental bumps produce these things called “organic” and “inorganic” chemistry from which most things are made (hey that worked didn’t it?) From what I learnt a long time ago, inorganic chemistry is what things like rocks are made of (sure am liking that word), and organic chemistry is what my dog is made of. Maybe I got sidetracked a bit there with this chemistry lesson, but from what I learnt in one of my correspondence re-education courses, evolution doesn’t care about how those organic and inorganic molecules originated (were made?) but only what happened later under that “deep time” blanket.
In any case, somehow we wind up with this huge universe-sized collection of molecules and plenty of time to play around and make something (made, make … wow I'm getting the hang of this.) So a long … long … long … long time ago, we wound up with a universe full of things that were tinkered with by this thing called evolution and wound up with quite a tool box of parts – now what to do with them … hmmm. I still have that long list of finished parts that I began with earlier in this lesson (thread), things like My lungs … My heart … My circulatory system … My muscular system … My digestive system … and so on, and I have this gigantic box full of parts. But how do I get them all put together in one complete package (like myself), and put together all at the same time? In another correspondence re-education course I learnt that there was never a first man or a first woman, so evolution must have failed along those lines and figured out a different, but not better, way to build me and my dog. I say different but not better because I also learnt that there is no such thing as better as we now understand the word. So somewhere under this huge veil are many, many grotesque and ugly body parts that were originally unintended to be me – evolution just had to give up on that idea and come up with something better. And as we now see -- it did. It unintentionally invented the very first fertilized egg and then after many, many more millions of years of jiggering around somehow a sort of blueprint evolved containing enough information, detailed plans (can I use that word?) and machine-looking gadgets to very compactly and efficiently invent me. Thus was born the first thing called DNA and soon thereafter me and my dog. Thus the (just-so) story of life and how it came to be! And thanks to my teachers Jane and John I was actually able to peek under that huge blanket and actually see what was under there making all those wonderful things – like me and my dog. So I pulled up a corner and stepped in. And discovered the secret of life … Faith! ayearningforpublius
Isn't it funny teacher #2 is doing the exact thing he's telling the student not to? Vy
Design in life systems is a legitimate inference from the facts, and is a legitimate path of scientific exploration.
Explore to your heart's content. Take a bold step on that path. Don't waste your time here. Daniel King
Rivers are designed! I always suspected it, but was never really sure. Mung
Seversky @24 and others: I don't know if this is a diversion or something, but questions about rivers and canals seem to lose sight of the central issue here -- that being ... what is the most rational and plausible explanation of the vast development and diversity of life on this planet, how best to positively exploit knowledge of it for the benefit of mankind -- and very importantly, the freedom to explore answers to questions that arise from this central issue. Design in life systems is a legitimate inference from the facts, and is a legitimate path of scientific exploration. People like Dawkins and the NCSE asserting "illusions" and "appearance" of design don't make it any less real -- no matter how many paper mache mountains or Polaroid film strips are presented by the smartest man in the world. Canals and rivers -- a lot like the question of how many angels dance on the head of a pin? ayearningforpublius
Vy @ 17
Such amazing teachers. #1 prefers ad hominem, #2 postures, and #3 focuses on semantics and then replies with a question
The question is not just one of semantics but an attempt to illustrate what is at the heart of this debate. My answer to that question is to compare a river and a canal. From my perspective, they both have the same function, which is to channel water from one place to another. But only in the case of the canal do we know that was also the goal, the purpose in the mind of the designers and builders. So, to put the question another way, since we know that the canal was designed are we entitled to infer that a river must also have been designed? Seversky
Ya sure Mungy? You've been wrong about pretty much everything else. Alicia Cartelli
Alicia Cartelli: I hope you don’t teach science. We know you don't. Mung
Alicia Cartelli:
Evolution is not “goal directed” in that there is no plan for the future.
Blind watchmaker evolution is not goal directed but Intelligent Design evolution is. Evolutionary algorithms demonstrate the power of directed evolution. BTW, your "concepts" are all made-up and depend on a failed philosophy. Virgil Cain
“That’s your problem right there.” Well said, Dan. Yearning, you demonstrate a complete lack of understanding of the basics of evolution. Evolution is not “goal directed” in that there is no plan for the future. There is variation in a population and this often translates to variation in reproductive success. The traits that better suit an organism to its environment are more likely to be passed on and over “deep-time” become an adaptation in the population. Virtually every part of your body performs some type of function, but the steps to evolve those organs were not “goal directed” in that evolution was not on a path specifically to those organs. You need to learn the distinction between current function and the evolution of function; the first has a “goal” (function), the second does not (except simply to better adapt a species to its environment). You’ll never graduate from that school if you continue to demonstrate a complete misunderstanding of even the most basic concepts. I hope you don’t teach science. Alicia Cartelli
what is the difference – if any – in meaning between “goal” and “function”? Neither one can be accounted for under current physics. So, scientifically speaking, none. Mung
AYFP: "Then why is it that virtually every part of my frail body seems to be goal directed — every part — why is that?" As Seversky said, good question. But let me ask you a different questions. Why do natural features appear to be goal directed? Some examples: Ocean basins - for containing the oceans. River valleys - for directing water to lakes or oceans. Lakes - for receiving water from rivers. Clouds - to provide water for our crops The English Channel - to separate the English from the French. Limestone - to provide material for our buildings Chrystals - to provide a means for husbands to get out of trouble with their wives. George Edwards
Such amazing teachers. #1 prefers ad hominem, #2 postures, and #3 focuses on semantics and then replies with a question. Brilliant! Vy
ayearningforpublius @ 4
I have a few serious kindergarten level questions about this thing called evolution: It is claimed that evolution is not goal directed — true? It just happens over the course of ‘deep-time’ — true? Then why is it that virtually every part of my frail body seems to be goal directed — every part — why is that?
A good question, although the operative word here is “seem”. I also have a question for you: in your mind, what is the difference - if any - in meaning between “goal” and “function”? Seversky
I suggest, start from worldviews analysis. And beware that one's perceptions may be coloured by the good-eyes/bad-eyes problem highlighted by Jesus of Nazareth in his Sermon on the Mount: if the light in you, as you think, is in reality darkness, how great is your darkness. kairosfocus
See, there you go ayfp. Daniel King, another fine teacher. Keeping it simple. Simple is as simple does. Mung
All I want is a few simple classes with easy answers.
That's your problem right there. Daniel King
All I want is a few simple classes with easy answers. Hopefully Alicia will be able to teach. ayearningforpublius
Dear Axel, Alicia is here to teach, not to learn. Don't confuse the poor lass. Mung
Alicia, like the vast majority of atheists, ayfp, is forever doomed to remain a captive in Wonderlandia, where anything can be - 'cept God's foot in the door: living, breathing proof that a high, native intelligence of the worldly persuasion, plus an excellent education, is no substitute for a love of truth for its own sake, and all too often makes of them the monkeys they wish to be. Axel
"Actually, one can often detect BS without knowing much about the topic at hand, ..." I agree. i could point you to several OPs that would prove your point. George Edwards
Been in that school for many decades ... have yet to graduate, but have been paying the bills right along. So will the next class be like the first one? I hope not; I was hoping I could get some real, thought-out answers to my kindergarten-level questions -- questions and answers I would hope you yourself would be interested in. How about it, Alicia ... ayearningforpublius
Welcome to the school of hard knocks. Where can I send the tuition bill? Alicia Cartelli
Thanks, Alicia ... I didn't expect the classes to start so soon, but I'm thrilled that, as my new teacher, you chose "Insult and Demean" as the first topic. This should be fun to learn, and I can hardly wait to try it myself on my 2nd-grade students. I didn't know it was going to be this easy! I think I might be ready for my next lesson ... perhaps it's "How to Insult Your Own Family Members!" ayearningforpublius
You have to be educated in the first place to be "re-educated." Alicia Cartelli
I have a few serious kindergarten-level questions about this thing called evolution: It is claimed that evolution is not goal directed -- true? It just happens over the course of 'deep time' -- true? Then why is it that virtually every part of my frail body seems to be goal directed -- every part -- why is that? Some examples: My skin ... My skeletal system ... My brain ... My visual system ... My hearing system ... My memory system ... My lungs ... My heart ... My circulatory system ... My muscular system ... My digestive system ... My stomach ... My liver ... My kidneys ... My small intestine ... My large intestine ... My colon ... My sense of touch ... My sense of smell ... My sense of taste ... My balance system ... My sense of place and orientation ... My thinking and creative mind ... My immune system ... My reproductive system ... And why do I see the same 'goal-directed' things when I am watching the sea gulls at the beach? Did I just read that evolution is not goal directed? Since I don't seem to be meekly going along with the program, should I sign up for one of those 're-education' camps? I hear they have been very effective in other countries. Can I get a PhD at the end of my 're-education'? Will my friends then like me and quit calling me names like IDiot? Will I still be able to think for myself after I am 're-educated'? Again ... just a few kindergarten-level questions ... ayearningforpublius
Some may say, "But hey, whatever the photon is doing in the double slit while it is in its infinite dimension/information state, we at least know that it is travelling at the speed of light!" Yet Special Relativity is just about as mysterious as a photon existing in an infinite dimension/information state.
"The laws of relativity have changed timeless existence from a theological claim to a physical reality. Light, you see, is outside of time, a fact of nature proven in thousands of experiments at hundreds of universities. I don’t pretend to know how tomorrow can exist simultaneously with today and yesterday. But at the speed of light they actually and rigorously do. Time does not pass." Richard Swenson - More Than Meets The Eye, Chpt. 12 “..the distinction between past, present, and future is only an illusion, however tenacious this illusion may be.” – Albert Einstein – March 1955 (of note: he passed away in April of that year)
And all this is before we even get to the profound mysteries surrounding 'the observer' in the double slit! :) Moreover, Feynman, in his role in developing QED, played an integral part in unifying special relativity with quantum mechanics:
Theories of the Universe: Quantum Mechanics vs. General Relativity
Excerpt: The first attempt at unifying relativity and quantum mechanics took place when special relativity was merged with electromagnetism. This created the theory of quantum electrodynamics, or QED. It is an example of what has come to be known as relativistic quantum field theory, or just quantum field theory. QED is considered by most physicists to be the most precise theory of natural phenomena ever developed.
http://www.infoplease.com/cig/theories-universe/quantum-mechanics-vs-general-relativity.html
This unification was accomplished by “brushing infinity under the rug.”
THE INFINITY PUZZLE: Quantum Field Theory and the Hunt for an Orderly Universe
Excerpt: In quantum electrodynamics, which applies quantum mechanics to the electromagnetic field and its interactions with matter, the equations led to infinite results for the self-energy or mass of the electron. After nearly two decades of effort, this problem was solved after World War II by a procedure called renormalization, in which the infinities are rolled up into the electron’s observed mass and charge, and are thereafter conveniently ignored. Richard Feynman, who shared the 1965 Nobel Prize with Julian Schwinger and Sin-Itiro Tomonaga for this breakthrough, referred to this sleight of hand as “brushing infinity under the rug.”
http://www.americanscientist.org/bookshelf/pub/tackling-infinity
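Schematically, what gets "brushed under the rug" is this (a textbook-level sketch; the exact coefficient depends on conventions): the one-loop self-energy calculation gives the electron a mass correction that grows without bound as the cutoff \Lambda is removed,

\delta m \;\sim\; \frac{3\alpha}{4\pi}\, m \,\ln\!\left(\Lambda^2 / m^2\right) \;\to\; \infty \quad (\Lambda \to \infty)

Renormalization then writes the observed mass as m_{\text{obs}} = m_0 + \delta m and works only with m_{\text{obs}}, so the divergence is absorbed into a measured number and thereafter ignored.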
Feynman rightly expresses his unease with "brushing infinity under the rug" here:
“It always bothers me that in spite of all this local business, what goes on in a tiny, no matter how tiny, region of space, and no matter how tiny a region of time, according to laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out. Now how can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one stinky tiny bit of space-time is going to do?” - Richard Feynman – one of the founding fathers of QED (Quantum Electrodynamics)
Quote taken from the 6:45 minute mark of the following video: https://www.youtube.com/watch?v=obCjODeoLVw
I don’t know about Feynman, but as for myself, being a Christian Theist, I find it rather comforting to know that it takes an ‘infinite amount of logic to figure out what one stinky tiny bit of space-time is going to do’:
John 1:1 - "In the beginning was the Word, and the Word was with God, and the Word was God."
Of note: 'the Word' in John 1:1 is translated from 'Logos' in Greek. Logos is also the root word from which we derive our modern word 'logic'.
http://etymonline.com/?term=logic
This is my current favorite quote by Feynman,
The Scientific Method - Richard Feynman - video
Quote: "If it disagrees with experiment, it’s wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, it doesn’t matter how smart you are who made the guess, or what his name is… If it disagrees with experiment, it’s wrong. That’s all there is to it."
https://www.youtube.com/watch?v=OL6-x0modwY
The reason that is my current favorite Feynman quote is because of the following recent experimental evidence verifying Dr. Behe's 1 in 10^20 'edge of evolution':
Michael Behe - Observed Limits of Evolution - video - Lecture delivered in April 2015 at Colorado School of Mines
25:56 minute quote - "This is not an argument anymore that Darwinism cannot make complex functional systems; it is an observation that it does not."
27:50 minute mark: no known, or unknown, evolutionary process helped.
https://www.youtube.com/watch?v=9svV8wNUqvA
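For context, the rough arithmetic behind the 1 in 10^20 figure, as the argument is commonly presented (a sketch of the reasoning, not a defense of its premises): if resistance is assumed to require two specific point mutations, each arising at roughly 1 in 10^10, then

P \approx 10^{-10} \times 10^{-10} = 10^{-20}

which is why the number is anchored to chloroquine resistance in malaria, where parasite populations of that order have actually been observed.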
Verse and Music:
1 Thessalonians 5:21 - but test everything; hold fast what is good.
Amy Grant - Breath Of Heaven
https://www.youtube.com/watch?v=L8_475FKJWQ&list=RDrgGaQWCCjR0&index=2
bornagain
It is interesting to note what Feynman himself says about quantum mechanics, particularly the double slit:
“The double-slit experiment has in it the heart of quantum mechanics. In reality, it contains the only mystery.” - Richard Feynman

According to a Physics World poll conducted in 2002,[1] the most beautiful experiment in physics is the two-slit experiment with electrons. According to Richard Feynman,[2] this classic gedanken experiment “has in it the heart of quantum mechanics” and “is impossible, absolutely impossible, to explain in any classical way.”
Feynman, R.P., Leighton, R.B., and Sands, M. (1965). The Feynman Lectures on Physics, Volume 3, Section 1-1, Addison-Wesley.
http://thisquantumworld.com/wp/the-mystique-of-quantum-mechanics/two-slit-experiment/
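As an aside, Feynman's "impossible to explain in any classical way" remark has a precise, standard formulation (ordinary textbook quantum mechanics, nothing beyond it). With both slits open, the amplitudes are added before the probability is taken:

P = |\psi_1 + \psi_2|^2 = |\psi_1|^2 + |\psi_2|^2 + 2\,\mathrm{Re}(\psi_1^{*}\psi_2)

It is the cross term 2 Re(\psi_1^* \psi_2) that produces the interference fringes; a classical particle picture allows only P = P_1 + P_2, with no such term.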
Anton Zeilinger stated this in regards to the double slit:
"The path taken by the photon is not an element of reality. We are not allowed to talk about the photon passing through this or this slit. Neither are we allowed to say the photon passes through both slits. All this kind of language is not applicable." - Anton Zeilinger - Double Slit Experiment. Is anything real? - video http://www.youtube.com/watch?v=ayvbKafw2g0 "We know what the particle is doing at the source when it is created. We know what it is doing at the detector when it is registered. But we do not know what it is doing in-between." Anton Zeilinger - Double Slit Experiment – video http://www.metacafe.com/watch/6101627/
Actually, contrary to what Zeilinger stated, and according to Feynman himself, who had a lead role in developing Quantum Electrodynamics (QED), not only do we not know what the photon is doing 'in between' in the double slit experiment as it is traveling, we really don't even know how photons are emitted and absorbed in the first place.
Quantum Electrodynamics
The key components of Feynman's presentation of QED are three basic actions.[1]:85
* A photon goes from one place and time to another place and time.
* An electron goes from one place and time to another place and time.
* An electron emits or absorbs a photon at a certain place and time.
These actions are represented in a form of visual shorthand by the three basic elements of Feynman diagrams: a wavy line for the photon, a straight line for the electron and a junction of two straight lines and a wavy one for a vertex representing emission or absorption of a photon by an electron. These can all be seen in the adjacent diagram. It is important not to over-interpret these diagrams. Nothing is implied about how a particle gets from one point to another. The diagrams do not imply that the particles are moving in straight or curved lines. They do not imply that the particles are moving with fixed speeds. The fact that the photon is often represented, by convention, by a wavy line and not a straight one does not imply that it is thought that it is more wavelike than is an electron. The images are just symbols to represent the actions above: photons and electrons do, somehow, move from point to point and electrons, somehow, emit and absorb photons. We do not know how these things happen, but the theory tells us about the probabilities of these things happening.
https://en.wikipedia.org/wiki/Quantum_electrodynamics#Introduction
And although, according to Zeilinger, we cannot say exactly what the photon is doing in the double slit between emission and absorption, we do know that, while a photon is doing whatever it is doing in the double slit, the photon is mathematically defined as being in an infinite dimensional state, a state that takes an infinite amount of information to describe.
The Unreasonable Effectiveness of Mathematics in the Natural Sciences – Eugene Wigner – 1960
Excerpt: We now have, in physics, two theories of great power and interest: the theory of quantum phenomena and the theory of relativity.,,, The two theories operate with different mathematical concepts: the four dimensional Riemann space and the infinite dimensional Hilbert space,
http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html

Wave function
Excerpt: "wave functions form an abstract vector space",,, This vector space is infinite-dimensional, because there is no finite set of functions which can be added together in various combinations to create every possible function.
http://en.wikipedia.org/wiki/Wave_function#Wave_functions_as_an_abstract_vector_space

Explaining Information Transfer in Quantum Teleportation: Armond Duwell, University of Pittsburgh
Excerpt: In contrast to a classical bit, the description of a (photon) qubit requires an infinite amount of information. The amount of information is infinite because two real numbers are required in the expansion of the state vector of a two state quantum system (Jozsa 1997, 1)
http://www.cas.umt.edu/phil/faculty/duwell/DuwellPSA2K.pdf

Quantum Computing – Stanford Encyclopedia
Excerpt: Theoretically, a single qubit can store an infinite amount of information, yet when measured (and thus collapsing the Quantum Wave state) it yields only the classical result (0 or 1),,,
http://plato.stanford.edu/entries/qt-quantcomp/#2.1
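To put Duwell's "two real numbers" remark in concrete, standard textbook notation (a sketch, nothing beyond ordinary quantum mechanics): up to an irrelevant global phase, a pure qubit state can be written as

|\psi\rangle = \cos(\theta/2)\,|0\rangle + e^{i\phi}\,\sin(\theta/2)\,|1\rangle, \qquad \theta \in [0,\pi],\ \phi \in [0,2\pi)

Specifying the continuous real parameters \theta and \phi exactly would take infinitely many bits, yet a measurement in the |0\rangle, |1\rangle basis returns only a single classical bit, which is precisely the contrast the Stanford Encyclopedia excerpt above is drawing.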
bornagain
The accusation of pseudoscience is used by partisans to attack ideas they don't like. Until it's settled what science is, it's open to misuse. Evolution is not science; it's a misuse of science's name. So it follows that there are more misuses, and finally everyone complains. Darwin was the first big name to misuse science to back up a hunch. Robert Byers
