Uncommon Descent Serving The Intelligent Design Community

But is this fair to Feynman?


From Simon Oxenham at BigThink:

How to Use the Feynman Technique to Identify Pseudoscience

Last week a new study made headlines worldwide by bluntly demonstrating the human capacity to be misled by “pseudo-profound bullshit” from the likes of Deepak Chopra, infamous for making profound sounding yet entirely meaningless statements by abusing scientific language.

The researchers correlate belief in pseudo-profundities with all kinds of things Clever People Aren’t Supposed to Like, and one suspects the paper wouldn’t survive replication. So why is this a job for Feynman?

Richard Feynman (1918-1988)

This is all well and good, but how are we supposed to know that we are being misled when we read a quote about quantum theory from someone like Chopra, if we don’t know the first thing about quantum mechanics?

Actually, one can often detect BS without knowing much about the topic at hand, because it often sounds deep but doesn’t reflect common sense. Anyway, from Feynman,

I finally figured out a way to test whether you have taught an idea or you have only taught a definition. Test it this way: You say, ‘Without using the new word which you have just learned, try to rephrase what you have just learned in your own language. Without using the word “energy,” tell me what you know now about the dog’s motion.’ You cannot. So you learned nothing about science. That may be all right. You may not want to learn something about science right away. You have to learn definitions. But for the very first lesson, is that not possibly destructive? More.

It won’t work, because many people who read pop science literature do so for the same reason others listen to Deepak Chopra: they want to be reassured against their better judgement or the evidence, whether it’s that there are billions of habitable planets out there, or that chimpanzees are entering the Stone Age, or that everything is a cosmic accident, or whatever the current schtick is.

And Feynman won’t help them, nor will a bucket of ice water. And it’s not fair to drag ol’ Feynman into it just because he said some true things like,

The first principle is that you must not fool yourself and you are the easiest person to fool.

Give the guy a break.

That said, Feynman (1918–1988) may have, through no fault of his (long-deceased) own, played a role in getting a science journalist dumped recently on suspicious grounds. See “Scientific American may be owned by Nature, but it is run by Twitter.”

Follow UD News at Twitter!

Hat tip: Stephanie West Allen at Brains on Purpose

Comments
ES, where also the functional or meaningful clusters are zones in a space of configuration states that is overwhelmingly non-functional. This is where the search across states issue comes in. KF
kairosfocus
January 14, 2016 at 06:25 AM PST
ES, needle in haystack blind search challenge. KF
kairosfocus
January 14, 2016 at 05:34 AM PST
KF, I agree, of course. I just think it needs to be clarified that everything you mention concerns semantic/semiotic considerations, not a syntactic Shannon information/thermodynamic discourse. Shannon information does not capture the function or meaning of a message.
EugeneS
January 14, 2016 at 04:49 AM PST
At what scope of needle in haystack search challenge, on what degree of limited resources, does appeal to >>the magic words ‘fluctuation’ or ‘luck’>> become in effect a materialist miracle of the gaps?
kairosfocus
January 11, 2016 at 09:38 PM PST
Cf: https://uncommondescent.com/popular-culture/id-and-the-overton-window-batna-march-of-folly-issue/
kairosfocus
January 11, 2016 at 09:34 PM PST
Dr Selensky, Appreciated. I agree, Shannon info is a measure of capacity. And the nearer to a flat random distribution of glyphs, the higher the nominal capacity, due to squeezing out redundancies. That said, capacity in a context of observed functionality based on specificity of configuration is a measure of that functionality, not just of capacity. And, functional specificity dependent on configuration in a nodes-and-arcs mesh yields a way to quantify complexity. Also, wiring diagram or description based configuration constrains working vs non-working configs. Thus, the islands of function in seas of non-function effect and the needle in haystack blind search challenge. Where, the first issue is to get from Darwin's pond to first working life with networks, communication systems, control systems, and the key von Neumann code-using kinematic self-replication integrated with a functioning metabolic entity for mass and energy flows with proper waste disposal. All of which must be embryologically and ecologically feasible. Yes, irreducibly complex, functionally specific organisation and ever growing complexity are all involved. As are issues on configuration spaces (direct -- wiring diagrams) and by way of descriptions to get complexity indices. That does bring to bear statistical issues, most evidently in the case of the origin based on chem and phys in that Darwin pond or the like environment. I find the notion of a vast, interconnected continent of functional forms dubious, though that seems to be implicit in the tree of life icon. The protein fold domain patterns by contrast immediately put that under question, with a large number of isolated clusters in AA sequence space. Similarly, the idea of in effect a golden search is problematic. A search can be seen as a subset of a config space, so the set of possible searches would be the power set, cardinality 2^C for a space of size C. Thus, exponentially harder and harder in succession. Search for search is a real challenge. So, I come back to the focal issue: FSCO/I is an empirically reliable, strong sign of design as cause. Including the case of comms systems and linked control systems. With codes involved. That is what needs to be faced. KF
kairosfocus
January 11, 2016 at 06:39 AM PST
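The power-set point in the comment above can be made concrete with a quick numeric sketch (Python; the 100-bit space is an illustrative assumption, far smaller than the 500-1,000 bit thresholds discussed in this thread):

```python
from math import log10

# A search samples a subset of the config space, so the set of possible
# searches is the power set: a space of size C admits 2^C candidate searches.
bits = 100                       # illustrative: a deliberately modest space
C = 2 ** bits                    # config space size, ~1.27e30
log10_searches = C * log10(2)    # log10(2^C): size of the search-for-search space

print(f"configs ~ 10^{bits * log10(2):.0f}")          # ~10^30
print(f"possible searches ~ 10^{log10_searches:.2e}")  # ~10^(3.8e29)
```

For a 1,000-bit space the same arithmetic gives roughly 10^(3.2e300) candidate searches, which is the sense in which search-for-search is "exponentially harder and harder".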
KF, "The same that grounds the 2nd law and explains it." Here is my understanding of the issue. I do not claim to be an expert in this area, so I may easily be wrong. I think that the 2nd law is fundamentally statistical. I believe that the argument from the second law is not as strong as the argument from semantics/semiotics (and, consequently, FCSI). 'Needle in the haystack' is just an illustration. Critics may argue that a natural physical process (allegedly yet unknown) that produced such and such structures may not have been ergodic (just as evolution is not ergodic), and therefore did not need to traverse the entire space of possibilities. To any argument based on probabilities they can always respond with the magic words 'fluctuation' or 'luck'. That is an unfortunate loophole for them :) But design detection need not absolutely depend on it. In my estimation, the way to shut them up (well, at least those of them who are intellectually honest) is to consider that this fluctuation is indeed intelligently designed because (a) it is functional, (b) it absolutely relies on irreducible complexity, and (c) it uses code (to the degree that the system that translates the code is itself coded with the same code). Shannon information does not reflect any of that. It does not reflect utility. There may be more Shannon information in gibberish text than in a meaningful message. With all due respect to statistical considerations, the iron argument in favor of ID [again, in my personal estimation] is the irreducibility of the complex system {code+processor} that yields pragmatic (non-physical) utility/function.
EugeneS
January 11, 2016 at 02:11 AM PST
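EugeneS's point that Shannon information does not track meaning can be illustrated with a minimal sketch (Python; the sample strings are made up for illustration):

```python
import math, random, string
from collections import Counter

def entropy_bits_per_char(s: str) -> float:
    """Empirical Shannon entropy of a string, in bits per character."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

meaningful = "to be or not to be that is the question " * 25
gibberish = "".join(random.choices(string.ascii_lowercase + " ",
                                   k=len(meaningful)))

print(entropy_bits_per_char(meaningful))  # ~3.3 bits/char: redundancy lowers it
print(entropy_bits_per_char(gibberish))   # ~4.75 bits/char, near log2(27)
```

The gibberish scores higher per character yet carries no function: the Shannon measure quantifies capacity, exactly as discussed above.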
JC, on long track record and the thread above, I am not so sure. In any case, the distinction needs to be clearly put. KF
kairosfocus
January 10, 2016 at 10:19 AM PST
KF: "Mung was being satirical, as per usual. KF" As, I believe, was Zachriel.
Jonas Crump
January 10, 2016 at 08:41 AM PST
Seversky, No one of consequence takes such a view. As in, compare Clausius' key example used to give the 2nd law in quantifiable form. Then ask what is happening at micro level to the ways that we can arrange mass and energy, then ask how we get to particular arrangements that are FSCO/I rich in light of search space challenges, in highly contingent circumstances, as distinct from freezing and linked crystallisation per mechanical necessity. Mung was being satirical, as per usual. KF
kairosfocus
January 9, 2016 at 11:06 AM PST
Mung: The obvious conclusion is that to freeze water requires design. If you assume that entropy can never decrease locally absent design, then that would be the conclusion.
Zachriel
January 9, 2016 at 08:08 AM PST
Mung, at cosmological level, the cluster of properties of water is a strong indicator of cosmological design. But that is an aside. Water forms crystals by mechanical necessity, which is distinct from the high contingency involved in FSCO/I. KF
kairosfocus
January 8, 2016 at 01:51 PM PST
The obvious conclusion is that to freeze water requires design.
Mung
January 8, 2016 at 01:30 PM PST
GD, this epitomises your problem:
You could accept that heat flow is sufficient to produce relevant types of information. If you go this way, your overall argument is, as far as I can see, dead.
To suggest such as an option at all means that you have patently overlooked the fact that we have a sol system and/or a wider cosmos of finite and limited resources: 10^57 or 10^80 atoms, 10^17 s, 10^12 - 14 atomic level interactions per s. As a direct result, for just, say, the atomic paramagnetic substance or its coins equivalent, 1000 2-state elements, flipped at 10^12 - 14 times/s, and with one such set per atom as observer, in that duration the atomic resources of the observed cosmos would be able to search something like one straw to a haystack that dwarfs the whole observed cosmos, translating the 1.07*10^301 possibilities into haystack blind needle search terms. Jaynes' informational thermodynamics view is important in the context of insight into the nature of entropy as a phenomenon connected to the macro-micro distinction. When we ask what further info is typically required to specify microstate given macrostate, we are pointing to the degrees of freedom available to a system under given PVT etc conditions, and to the degree of uncertainty and search implied in trying to define a particular microstate. Trying to turn this about into a pushing of the notion that random noise of molecular agitation and/or heat flows linked thereto is sufficient to create information shows a fundamental failure to understand what is involved in blind needle in haystack search under relevant conditions. Yes, in watching something freeze into a crystal (such as water), you do gain a fair degree of information about its condition as it is transformed into a crystal, with rather rigid structural restrictions, up to the usual issues on dislocations, inclusions, etc. That has everything to do with a transformation of mechanical necessity, and nothing to do with the constructive work required to produce aperiodic, functionally specific, contingent and complex key-lock fitting functional components. Where, I repeat, just one 300-monomer protein requires 300 * 4.32 bits/AA = 1296 bits as a baseline, though some redundancies etc are neglected in that first estimate. (That's where Durston's work on functional vs ground state etc comes in.) Let me again clip, in fuller and augmented form, that key couple of paragraphs from Wikipedia c. April 2011, which here makes surprising concessions -- knowing its ideological tendency:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
The very fact that you thought to put such an option on the table, that heat transfer could create adequately functional information of sufficient degree, simply indicates to me that you have not adequately addressed the issue of blind search, multiplied by the common failure to differentiate between information holding or transmitting capacity -- what the Shannon metric measures -- and actual concrete functionally specific working configuration based information such as is commonly used in text, computation etc, and as is seen in R/DNA and proteins. The blind, needle in haystack search is inevitable on evolutionary materialist, blind chance and necessity based views. And, those who adhere to or go along with it typically overlook the deeply contextual functional specificity of working information and organisation of components per wiring diagram, node and arc networks. Where, functional specificity locked to arrangement of acceptable parts in the right orientation and arrangement, with appropriate coupling, deeply constrains acceptable configurations. The only actually (and routinely) observed source of such FSCO/I is not blind search in config spaces. It is intelligently directed, purposeful configuration, as you experienced in composing your post of objections. To achieve that, energy has to first be converted into orderly shaft or flow work through some conversion mechanism or other, then, in accord with a purpose and/or plan, put to work in an appropriate step by step sequence of constructive operations, until the resultant functionally organised entity comes into being and is fine tuned to specific acceptable form. This is the only empirically and reliably confirmed cause of FSCO/I, and it is backed up by the import of the needle in haystack search challenge as outlined above. That is the context in which complex specified information, and particularly functionally specific complex organisation and associated information, is a reliable sign of design as cause. To overturn that inference, simply show that blind chance and mechanical necessity actually produce such, beyond 500 - 1,000 bits. Where, per a basic estimate, a unicellular life form ab initio needs 100 - 1,000 kbases or thereabouts, and a multicellular body plan needs 10 - 100+ mn or thereabouts on similar basic estimates and observations. At, as an initial number, 2 bits capacity per base. Those are orders of magnitude beyond the threshold composed using the toy example of a string of coins or a simple paramagnetic substance. Where, every additional bit doubles the scope of a config space. Your multi-lemma unfortunately set up a strawman target, which it duly skewered. But that is nothing to do with the actual issue on the table, to focus the importance of the micro-state, config space framework for analysis and to understand the search challenge implied in appeals to blind chance and mechanical necessity as performing relevant constructive work on the scope of the implied information required. Where of course, info storing or moving capacity is not quite the same as actual functionally specific complex organisation and/or associated information. KF
kairosfocus
January 8, 2016 at 01:35 AM PST
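The figures in the comment above (2^1000 ~ 1.07*10^301, the cosmic sampling budget, and the 1,296-bit protein baseline) can be reproduced with a minimal sketch (Python, using only the numbers quoted in the comment):

```python
from math import log10, log2

# Config space of 1,000 two-state elements (coins / paramagnetic atoms)
log10_configs = 1000 * log10(2)    # 2^1000 ~ 1.07e301 possibilities
print(f"2^1000 ~ 10^{log10_configs:.1f}")

# Generous observation budget: 10^80 atoms x 10^14 flips/s x 10^17 s
log10_samples = 80 + 14 + 17       # 10^111 total observations

# Fraction of the space that could ever be examined
print(f"fraction sampled <= 10^{log10_samples - log10_configs:.0f}")  # ~10^-190

# Baseline info estimate for a 300-AA protein: log2(20) bits per residue
print(f"300-AA protein baseline: {300 * log2(20):.1f} bits")  # ~1296.6
```

The 10^-190 fraction is the "one straw to a cosmos-dwarfing haystack" figure; the last line reproduces the 300 * 4.32 = 1296 bits estimate.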
Kairosfocus, let me turn one of your questions back at you. Do you understand the statement:
In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer.
I don't think you do, because if you did you'd realize that to the extent it's relevant, it severely undercuts your argument. You disagree? Then consider a simple example: compare the entropy of liquid water vs that of ice, and compare their information content (using Jaynes' equivalence). The heat of fusion of water is 79.71 Joules/gram, and since it melts at 273.15 Kelvin (= 0 Celsius = 32 Fahrenheit), the entropy difference between a gram of ice and a gram of water (both at 273.15 K) is S_water - S_ice = 1 g * 79.71 J/g / 273.15 K = 0.2918 J/K. Jaynes' equivalence gives a conversion ratio between thermodynamic and Shannon entropies of 1 bit = k_B * ln(2) = 9.570e-24 J/K (where k_B is the Boltzmann constant). With that conversion, the difference in Shannon entropy of the ice vs the water is 0.2918 J/K / 9.570e-24 J/K = 3.049e22 bits. (If you didn't follow that, here's a more detailed derivation: applying Boltzmann's equation S = k_B * ln(w) gives us S_water - S_ice = k_B * (ln(w_water) - ln(w_ice)) = k_B * ln(w_water/w_ice), so w_water/w_ice = e ^ ((S_water - S_ice) / k_B) = e ^ (0.2918 J/K / 1.380e-23 J/K) = e ^ 2.114e22 = 2 ^ 3.051e22. Since the water has 2 ^ 3.051e22 times as many possible microscopically distinct states, it would take an additional 3.051e22 bits of information to specify exactly which state it actually was in. Note that this differs slightly from my earlier result due to rounding.) What this means is that if you watch a gram of water freeze, you gain 3.049e22 bits (over 3 zettabytes) of information about its precise physical state. I say "gain", but you haven't really "learned" anything in the usual sense. It's a bit like watching someone sort a shuffled deck of cards: afterward, you know a lot more about the order the cards are in, even though you didn't strictly "learn" anything. And that's for a single gram of water. Watching a kilogram (2.2 pounds) freeze would gain you 3.049e25 bits. Watching a lake freeze... would be a lot of information. There is an important difference between freezing water vs sorting cards, though: I assumed there was an intelligent agent (a human) sorting the cards, but water freezes simply due to heat flowing out of it. Actually, even the observer (i.e. "you" in my description above) isn't necessary, since the change is in the amount of information in the system's macroscopic state, and that is a property of the system itself, and doesn't depend on whether anyone's watching it. The point of all of this? If you accept Jaynes' view, then ridiculously huge gains of information can happen due to heat flow, with no intelligent agent, CSI, FSCI/O, or any such thing involved. As I see it, you have several possible avenues for responding to this challenge:
- You could show that there's something seriously wrong with my analysis. Note that you'd need to find an actual problem with it, not just reject it because it doesn't match your views. Good luck with this.
- You could reject Jaynes' view on the connection between thermodynamic and Shannon entropies; but since the equivalence works mathematically, and that seems to be a critical part of your argument...
- You could accept the result, but claim that this type of information doesn't have any connection to the types of information (CSI, FSCI/O, etc) that you're concerned with. This is more-or-less the view that Zachriel has been pushing you toward, and the only one that makes sense to me. But again, that pretty much wrecks the link to thermodynamics that you've been pushing.
- You could accept that heat flow is sufficient to produce relevant types of information. If you go this way, your overall argument is, as far as I can see, dead.
So what'll it be?
Gordon Davisson
January 7, 2016 at 11:48 PM PST
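A minimal sketch reproducing Gordon Davisson's arithmetic (Python; the 79.71 J/g heat-of-fusion figure is the one he quotes, taken as given):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
L_f = 79.71           # heat of fusion figure quoted above, J/g
T = 273.15            # melting point of water, K

# Entropy difference between 1 g of water and 1 g of ice at the melting point
dS = L_f / T                        # ~0.2918 J/K

# Jaynes' equivalence: 1 bit of Shannon entropy = k_B * ln(2) J/K
dS_bits = dS / (k_B * math.log(2))  # ~3.05e22 bits

print(f"dS = {dS:.4f} J/K  ->  {dS_bits:.3e} bits")
```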
Mung: Why do we value diamonds over brains? Do we now?
Zachriel
January 7, 2016 at 12:49 PM PST
How many possible arrangements of matter are there that form neither diamonds nor brains? Why do we value diamonds over brains?
Mung
January 7, 2016 at 08:43 AM PST
Z, it is clear that you are not attending to the substantial matter on the table. The record is there. And BTW, the issue of config spaces and linked states is prior to both eq and non-eq th-d. The basic difficulty is you are trying to get to a blind-search-infeasible result without an adequate means. KF
kairosfocus
January 7, 2016 at 07:05 AM PST
kairosfocus: enough has already been said Followed by more than 2700 words. If you can't answer simple questions, we'll just assume you can't, and leave you to your self-indulgence. Mung: You agree with me, but I am the one who "gets it." We are in agreement! Entropy has to do with the number of distinguishable microstates. There are empirical methods for estimating standard entropy. Mung: Who freaking cares? Kairosfocus has expressed extended opinions about the relationship of "FSCO/I" and thermodynamic entropy. We wish to explore this question by asking him a few questions. Mung: Are you using equilibrium thermodynamics or non-equilibrium thermodynamics? Kairosfocus is making reference to equilibrium thermodynamics. To avoid the problem of ambiguity in measuring entropy for a system not in equilibrium, we suggested comparing an unused microprocessor sitting on a shelf and a like mass of diamond.
Zachriel
January 7, 2016 at 05:43 AM PST
PS: Just for reference, here, again, are Orgel and Wicken:
ORGEL: . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [--> this is of course equivalent to the string of yes/no questions required to specify the relevant "wiring diagram" for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002] One can see intuitively that many instructions are needed to specify a complex structure. [--> so if the q's to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [--> do once and repeat over and over in a loop . . . which can be implicit in the molecular spatial architecture] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196. Of course, that immediately highlights OOL, where the required self-replicating entity is part of what has to be explained, a notorious conundrum for advocates of evolutionary materialism; one that has led to mutual ruin documented by Shapiro and Orgel between metabolism first and genes first schools of thought. Behe would go on to point out that irreducibly complex structures are not credibly formed by incremental evolutionary processes and Menuge et al would bring up serious issues for the suggested exaptation alternative, cf. his five challenges. Finally, Dembski highlights that CSI comes in deeply isolated islands T in much larger configuration spaces W, for biological systems functional islands. That puts up serious questions for origin of dozens of body plans reasonably requiring some 10 - 100+ mn bases of fresh genetic information to account for cell types, tissues, organs and multiple coherently integrated systems. Wicken's remarks a few years later, as already cited, now take on fuller force in light of the further points from Orgel at pp. 190 and 196 . . . ] WICKEN: ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [i.e. “simple” force laws acting on objects starting from arbitrary and commonplace initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information.
It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65.]
kairosfocus
January 7, 2016 at 12:15 AM PST
F/N: For the record, I note in summary points long since highlighted but ignored: >>Kairosfocus’s position (we won’t call it an argument)>> a --> Genetic fallacy, in teeth of evidence that the issues are far more than what some idiosyncratic bloggist says. In effect tantamount to a confession of intent not to engage in rational discussion but an exercise of distraction, distortion, denigration and dismissal >>is that the arrangements of matter that make up a biological structure are minuscule in number compared to the possible arrangements, so are unlikely to occur by chance.>> b --> Neatly left out [and as highlighted since Orgel and Wicken]: high contingency, aperiodic, functionally specific organisation which, by contrast with many other accessible configs, will be deeply isolated. c --> There is a search, and there is not a determination per the mechanical necessity which prevails in crystalline order, establishing unit cells and the repetition of such to give the macro-structure of the crystal. >>However, the arrangements of microstates that make up a diamond are even more minuscule compared to their possible arrangements.>> d --> Way up in this thread (and repeatedly in citation from Orgel and Wicken), it was pointed out that the crystalline, periodic structure is driven by the mechanical necessity of the circumstances of crystal formation. Diamonds in fact form under high pressure and temperature conditions, and under our more common circumstances are metastable. e --> So, perhaps Z wishes to suggest that similar forces of mechanical necessity force the formation of, say, protein or DNA chains. The problem is, the chaining chemistry is in fact standard bonding of the diverse monomer components, forming a backbone where any possible monomer may follow any other. Whether peptide or sugar-phosphate. As Z knows or should know . . . R/DNA chains (sugar-phosphate) and protein chains (peptide bond) are highly contingent. That is how R/DNA can carry a code, and it is how diverse proteins are based on the particular specific sequence, folding and function post folding. f --> Further to this, the key chemistry for function is on the side-chains, in effect at right angles to the chains. That is how, on prong height, 3-letter codon sequences in an mRNA chain will accept relevant anticodon tRNAs, which in turn use a standard CCA tip to catch the particular AA from a "loading enzyme" . . . which is what imposes the code. This leads to loading of particular AAs in accord with a code. g --> Which code is not imposed by any mechanical necessity. Indeed, researchers have extended it. >>All this means in terms of thermodynamics is that entropy has to be exported at the expense of usable energy.>> h --> This is little more than anything can happen in an open system. i --> The issue is that, under relevant circumstances, just as with the coin chain or the paramagnetic substance binary models above, there is high contingency for D/RNA chains and for AA chains. In effect, 4 possibilities per base and 20 possibilities per AA, respectively. This gives rise to a configuration space that expands exponentially, as 4^n or 20^n respectively, n being chain length. j --> Under relevant circumstances for diamond formation, C-atoms are effectively equivalent, and there is no effective contingency; it is just the matter of forming the tetrahedral covalent bond structure. k --> It is in that context that there is, for these molecules, a very large space of possibilities that has in it islands of function. For proteins, in AA sequence space, due to folding, fitting with relevant other components etc to function. l --> For D/mRNA, the encoding of the functionally specific information from start/load methionine, extending to stop, with three different stop codes. I leave off the editing of mRNA before sending it to the ribosome, but that becomes important too. m --> In other cases, both biological and otherwise [e.g. the Abu 6500 3c reel], there are many relevant alternative arrangements of parts, only a very small fraction of which will be functionally relevant, due to the need to be correctly arranged and coupled per a wiring diagram as Wicken describes. n --> BTW, anyone notice how my repeated reference to Wicken -- functionally specific info rich organisation, wiring diagram arrangement -- and Orgel -- complex specified organisation with distinction from both random arrangements and crystalline order -- keeps on being studiously ignored in the rush to personalise and polarise, then dismiss? >>There’s nothing about biological organisms, about human activity, or about evolution, that is contrary to the laws of thermodynamics.>> o --> The issue is not the "laws" -- which there is no dispute do apply -- but the underlying dynamics of config or phase spaces that give rise to the patterns of consistent behaviour/phenomena that may be summarised. p --> In short, the issue is being strawmannised. q --> The evidence and analysis per Clausius is, energy importation tends to increase entropy by increasing the number of ways that energy and mass at micro level may be arranged consistent with the macro conditions. On the informational view, the entropy metric can be seen as the further info required to specify microstate given macro conditions consistent with a set of possibilities. r --> The further evidence is, per massive observation [and that without exception], that functionally specific complex organisation is effected by direct intelligent configuration, or else by programs and/or wiring diagrams that are information rich and functionally specific. s --> Further to this, inflowing energy is typically converted into shaft or flow work through an energy converter, and this is then used, in accord with a purpose and/or plan, in a controlled sequence of steps, to configure per the diverse possibilities as limited by the agent involved or the program and wiring diagram etc involved. t --> Waste energy is then ejected in some way, dissipating in a low temperature reservoir. In some cases, e.g. sweating, that is by way of evaporation, taking advantage of the work required to break up the bonds in water as a condensed polar molecular substance with high heat capacity. u --> So, the issue comes back to ordered vs random vs functionally organised sequence complexity, where discussion on strings is WLOG, as strings can in principle be used to create the relevant wiring diagrams and programs. That is, there is a preservation of information involved -- which raises the onward question, per the config space issues as outlined, of the origin of such information that specifies arrangement per function. v --> Consistently, when we observe (on a trillion member base), it is intelligent direction that accounts for the origin. w --> While the needle in haystack challenge facing blind mechanical necessity and/or blind chance is that the resources of the sol system or the observed cosmos are soon exhausted while being unable to come up with a credible search, given that config spaces exponentiate per S^n for S-state-per-position chains of length n. Hence, the needle in haystack blind search challenge. _____________ Now, nothing in the above outline is new, though there is expansion on some points for this thread -- little did I suspect that I would need to state the nature of AA and D/mRNA chains again, and why such strings are made up from highly contingent chains capable of 4.32 or 2 bits per member of info carrying capacity, on the direct estimation of the set of possibilities . . . as Shannon himself used in his original paper long since. It is amazing to see how, after years and years, objectors still seem to be unable to simply accurately describe the issue as we see it. You don't have to agree, just give a fair summary, then provide good and cogent argument as to your own view. It is fair comment that there is no observationally confirmed means to create FSCO/I rich systems per blind chance and/or mechanical necessity only, but there is a trillion member observational basis that shows this to be accounted for by intelligently directed configuration. If you doubt, think about how the objecting comment you compose comes about, as a case of a string made up from 128-state elements in ASCII code, in turn tracing to 7-bit core elements, with augmentation. KF
kairosfocus
January 7, 2016 at 12:05 AM PST
Zachriel: Now you're getting it! You agree with me, but I am the one who "gets it." Got it. Zachriel: Which has more available microstates, a brain or a like mass of diamonds? Who freaking cares? Are you using equilibrium thermodynamics or non-equilibrium thermodynamics? Does the statistical mechanical basis of thermodynamics change based on your whimsy?
Mung
January 6, 2016 at 07:20 PM PST
F/N: Sewell has aptly put the issue, never mind some odd points on terminology:
. . . The second law is all about probability, it uses probability at the microscopic level to predict macroscopic change: the reason carbon distributes itself more and more uniformly in an insulated solid is, that is what the laws of probability predict when diffusion alone is operative. The reason natural forces may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy. The second law of thermodynamics is the reason that computers will degenerate into scrap metal over time, and, in the absence of intelligence, the reverse process will not occur; and it is also the reason that animals, when they die, decay into simple organic and inorganic compounds, and, in the absence of intelligence, the reverse process will not occur. The discovery that life on Earth developed through evolutionary "steps," coupled with the observation that mutations and natural selection -- like other natural forces -- can cause (minor) change, is widely accepted in the scientific world as proof that natural selection -- alone among all natural forces -- can create order out of disorder, and even design human brains, with human consciousness. Only the layman seems to see the problem with this logic. In a recent Mathematical Intelligencer article ["A Mathematician's View of Evolution," The Mathematical Intelligencer 22, number 4, 5-7, 2000] I asserted that the idea that the four fundamental forces of physics alone could rearrange the fundamental particles of Nature into spaceships, nuclear power plants, and computers, connected to laser printers, CRTs, keyboards and the Internet, appears to violate the second law of thermodynamics in a spectacular way.1 . . . . What happens in a[n isolated] system depends on the initial conditions; what happens in an open system depends on the boundary conditions as well. As I wrote in "Can ANYTHING Happen in an Open System?", "order can increase in an open system, not because the laws of probability are suspended when the door is open, but simply because order may walk in through the door.... If we found evidence that DNA, auto parts, computer chips, and books entered through the Earth's atmosphere at some time in the past, then perhaps the appearance of humans, cars, computers, and encyclopedias on a previously barren planet could be explained without postulating a violation of the second law here . . . But if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here." Evolution is a movie running backward, that is what makes it special. THE EVOLUTIONIST, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn't, that atoms would rearrange themselves into spaceships and computers and TV sets . . .
KF
kairosfocus
January 6, 2016 at 05:17 PM PST
Z, enough has already been said, and when answers have been pointed out you have consistently ignored them, now pretending they were not given. The last time around was a few posts ago, at 256; I am not going back to that loop ad nauseam. It is not to be overlooked that all these side tracks do not affect the force of the core needle in haystack in Darwin's pond issue, but they sure make ways to not discuss it. There is enough for those who are willing, and nothing will ever be enough for the unwilling. KF PS: Memory was wrong, T & A was 2005: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1208958/pdf/1742-4682-2-29.pdf cf fig 4 esp for OSC, RSC, FSC, which directly relates to the relevant component of entropy. As in, with a repetitive unit bloc [esp. a single atom] there is little room for variability. Random patterns will give highest Shannon info carrying cap values; in functional patterns, redundancies will reduce cap per symbol. The link to further info to describe microstate should be plain from the resistance of a random string to compression. And discussion on strings is WLOG. PPS: It was long since pointed out, but studiously ignored, that due to the needle in haystack challenge, FSCO/I-configuring work is consistently directed by direct intelligence or by programme, as is universally observed. Importing raw energy tends to make entropy rise. Open systems do not magically make probabilistic search challenge issues vanish.
kairosfocus
January 6, 2016 at 04:49 PM PST
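The PS's order/randomness/compression point can be illustrated with a minimal sketch (Python; zlib stands in for any general-purpose compressor):

```python
import os, zlib

n = 10_000
ordered = b"AB" * (n // 2)    # crystal-like periodic order: one repeated unit
randomized = os.urandom(n)    # no order, no function: maximal Shannon capacity

# An ordered string compresses to almost nothing; a random string resists
# compression, reflecting its higher information-carrying capacity.
print(len(zlib.compress(ordered, 9)))     # a few dozen bytes
print(len(zlib.compress(randomized, 9)))  # ~10,000 bytes: incompressible
```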
Mung: Now you're getting it! How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form brains? Now you're getting it! Which has more available microstates, a brain or a like mass of diamonds? Mung: So what? Ask kairosfocus. It's his position. Kairosfocus's position (we won't call it an argument) is that the arrangements of matter that make up a biological structure are minuscule in number compared to the possible arrangements, so are unlikely to occur by chance. However, the arrangements of microstates that make up a diamond are even more minuscule compared to their possible arrangements. All this means in terms of thermodynamics is that entropy has to be exported at the expense of usable energy. There's nothing about biological organisms, about human activity, or about evolution, that is contrary to the laws of thermodynamics.
Zachriel
January 6, 2016 at 04:00 PM PST
Zachriel: How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form diamonds? Now you're getting it! How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form brains? Now, after having done the appropriate calculations, you have an entropy value for each. So what?
Mung
January 6, 2016 at 03:46 PM PST
kairosfocus: it is now clear that you are not attending to what I have said We have attempted to address your points, but you have made no effort whatsoever. kairosfocus: The same sort of reasoning that explains why per relative statistical weight, the bulk clusters utterly dominate thermodynamic outcomes, grounding the 2nd law. How many possible arrangements of carbon microstates are there compared to how many possible arrangements of carbon microstates that form diamonds?
Zachriel
January 6, 2016 at 03:22 PM PST
Z, it is now clear that you are not attending to what I have said, much less what Orgel and Wicken pointed out 40 years ago. (I will again point you to L K Nash's string of coins example and Mandl's paramagnetic substance of say 1,000 atoms with B-field alignment parallel/antiparallel, and the obvious resulting binomial distribution of possibilities. As this is directly parallel to a string of bits, it will readily be seen that it will have clusters of meaningful strings in its config space -- every possible 1,000 bit combination -- but the bulk of possibilities will be near 50-50 in no particular order. This toy example has much to teach.) I pointed to the underlying statistical view and analysis on config spaces (or more broadly phase spaces), which sets up how a commonly observed phenomenon, functionally specific, complex organisation, fits in as giving a case of identifiable and predictably small clusters relative to the space as a whole. That sets up the sort of needle in haystack circumstance that leads to there being a maximal unlikeliness of landing in such zones by blind chance and mechanical necessity. The same sort of reasoning explains why, per relative statistical weight, the bulk clusters utterly dominate thermodynamic outcomes, grounding the 2nd law. Recall, for 1,000 bits worth of configs, the number of possible observations for 10^80 atoms, 10^17 s and 10^12 - 14 states/s per atom would only be able to sample as one straw to a haystack that would dwarf the observed cosmos. As to your ad nauseam demands for answers to questions where you have already ignored cogent answers to essentially the same questions, your behaviour does not lead me to more than what has already been pointed out: highly ordered states are low information, random states have the highest info carrying capacity, organised states will be intermediate. And the Jaynes point, that entropy is an index of the additional information needed to specify microstate given the macro observable state, obtains. This has no relevance to the already well established point that a blind needle in haystack search will be maximally unlikely to find FSCO/I clusters by chance and necessity. The point that it seems you cannot bring yourself to face. KF
kairosfocus
January 6, 2016 at 02:37 PM PST
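The binomial claim in the comment above -- that near-50-50, no-particular-order outcomes dominate a 1,000-coin config space -- can be checked exactly with a minimal sketch (Python, exact big-integer arithmetic):

```python
from math import comb

N = 1000
total = 2 ** N    # ~1.07e301 configurations in all

# Configurations whose heads-count falls within +/-50 of the 500-heads
# mean (roughly +/-3 standard deviations, since sigma = sqrt(250) ~ 15.8)
near_5050 = sum(comb(N, k) for k in range(450, 551))
print(near_5050 / total)    # ~0.9986
```

Outcomes within about three standard deviations of the 50-50 mean account for roughly 99.86% of all 2^1000 configurations, which is the "overwhelming statistical weight of the bulk clusters" referred to above.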
Mung: A world without thermodynamics, that's your position? Not at all. The second law of thermodynamics says that overall thermodynamic entropy must increase. It says that if a snowflake forms from water vapor, which is a decrease in entropy, the entropy must be exported to the surroundings. When an organism converts food into protein, which is a decrease in entropy, the entropy must be exported to the surroundings. When a human refines metals to make a machine, which is a decrease in entropy, the entropy must be exported to the surroundings. kairosfocus: for over 100 years since Boltzmann, Maxwell and Gibbs et al, the underlying statistical analysis has been the core of thermodynamics, and particularly of the 2nd law. You didn't answer. If you are claiming that FSCO/I is part of thermodynamics, then please tell us which has lower thermodynamic entropy: an unused microprocessor on a shelf, or a diamond of like mass. If you can't, then it's clear that you can't substantiate your claims.
Zachriel
January 6, 2016 at 08:53 AM PST
Dr Selensky, pardon, but no one is arguing that thermodynamics or its linked microstates picture and info holding capacity (i.e. Shannon info) directly points to design. Just that analysis on configurational possibilities -- microstates -- is foundational and valid in thermo-d, with implications once we see the impact of the vast number of possibilities. The same that grounds the 2nd law and explains it. Beyond that, it is being stated that there is as a matter of fact a valid approach from Jaynes et al, which identifies entropy as an index of the further info left to specify microstate once state has been defined at macro level; in effect comparable to accessible possibilities for distribution of mass and energy at micro level, or to degrees of freedom compatible with a given macro level state. In that context, functionally organised clusters of states at micro level will come in islands of function deeply isolated in the config space of possibilities. So, blind chance and necessity transitions from arbitrary initial conditions will be maximally unlikely to find them, due to the overwhelming statistical weight of dominant but non-functionally organised clusters and limited sol system or cosmos scale resources: the needle in haystack challenge. In typical stats terms, if you take a few samples you do not have a reasonable expectation to see the extreme far tail. And function is itself an observable. Under such circumstances the best causal explanation for complex, functionally organised states is design. That is an induction. KF
kairosfocus
January 6, 2016 at 07:40 AM PST