Uncommon Descent Serving The Intelligent Design Community

But is this fair to Feynman?

Categories: Culture, Intelligent Design, News, Science

From Simon Oxenham at BigThink:

How to Use the Feynman Technique to Identify Pseudoscience

Last week a new study made headlines worldwide by bluntly demonstrating the human capacity to be misled by “pseudo-profound bullshit” from the likes of Deepak Chopra, infamous for making profound-sounding yet entirely meaningless statements by abusing scientific language.

The researchers correlate believing pseudo-profundities with all kinds of things Clever People Aren’t Supposed to Like, and one suspects the paper wouldn’t survive replication. So why is this a job for Feynman?

Richard Feynman (1918-1988)

This is all well and good, but how are we supposed to know that we are being misled when we read a quote about quantum theory from someone like Chopra, if we don’t know the first thing about quantum mechanics?

Actually, one can often detect BS without knowing much about the topic at hand, because it often sounds deep but doesn’t reflect common sense. Anyway, from Feynman,

I finally figured out a way to test whether you have taught an idea or you have only taught a definition. Test it this way: You say, ‘Without using the new word which you have just learned, try to rephrase what you have just learned in your own language. Without using the word “energy,” tell me what you know now about the dog’s motion.’ You cannot. So you learned nothing about science. That may be all right. You may not want to learn something about science right away. You have to learn definitions. But for the very first lesson, is that not possibly destructive? More.

It won’t work because many people who read pop science literature do so for the same reason others listen to Deepak Chopra: they want to be reassured against their better judgement or the evidence, whether it’s that there are billions of habitable planets out there, that chimpanzees are entering the Stone Age, that everything is a cosmic accident, or whatever the current schtick is.

And Feynman won’t help them, nor will a bucket of ice water. And it’s not fair to drag ol’ Feynman into it just because he said some true things like,

The first principle is that you must not fool yourself and you are the easiest person to fool.

Give the guy a break.

That said, Feynman (1918–1988) may have, through no fault of his (long-deceased) own, played a role in getting a science journalist dumped recently on suspicious grounds. See “Scientific American may be owned by Nature, but it is run by Twitter.”

Follow UD News at Twitter!

Hat tip: Stephanie West Allen at Brains on Purpose

Comments
You seem unable to respond directly to arguments as they are raised.
kairosfocus: Your snowflake was an analogy that failed, you confuse complexity with specified complexity.
It’s not an analogy, and there’s nothing about “complex specificity” in the laws of thermodynamics.
kairosfocus: When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. The exported waste energy secures consistency with Lex2 Th, and illustrates the only empirically warranted source for the relevant FSCO/I, intelligently directed configuration.
So building and using computers is consistent with the laws of thermodynamics.
Zachriel
December 28, 2015 at 3:24 PM PDT
Z, the pivotal case is Darwin's pond or the like. There is no reproduction mechanism; that has to be explained or accounted for on physics and chemistry in the pond. And yes, that is OOL. Start there, as it is the plainest case, and one where there is no magic of differential reproductive success to cloud the core issue. I add, not that origin of body plans requiring 10 - 100+ mn base pairs will help much, with need for novel proteins and integrated systems. KF
PS: There is a lot about work in thermo-d, and you propose to extract constructive work assembling functionally specific, complex and info-rich organisation from the blind chance and mechanical necessity of the pond etc., where FSCO/I is known to sharply constrain configs relative to the set of possibilities, posing a beyond-astronomical needle-in-haystack search. At 1,000 bits of info for, let's just say, a string molecule, the set of possibilities relative to what the observed cosmos could do would be as a haystack dwarfing the cosmos to a single straw. This is the reason why it is very reasonable that we only observe FSCO/I coming from design and construction per wiring diagram.
PPS: Your snowflake was an analogy that failed; you confuse complexity with specified complexity.
PPPS: Likewise, it is patent that I answered as to how constructive work to create the FSCO/I in a PC accords with 2LOT, by exporting waste heat and using constructive machines and energy converters that act according to a plan. Note:
When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. The exported waste energy secures consistency with Lex2 Th, and illustrates the only empirically warranted source for the relevant FSCO/I, intelligently directed configuration.
You are trying to get constructive work to build huge FSCO/I without a plan or constructive machines, out of lucky noise. Contrast how proteins are constructed per instructions in the ribosome, a clue as to what is reasonable. All the way back to Thaxton et al this was addressed.
kairosfocus
December 28, 2015 at 2:42 PM PDT
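For readers who want to check the arithmetic behind the 1,000-bit claim in the comment above, here is a minimal sketch. The figures for atoms, age, and event rate are commonly cited round numbers assumed for illustration; they are not taken from the comment.

```python
# Order-of-magnitude check of the "1,000 bits" search-space claim above.
# All cosmological figures below are assumed round numbers.
from math import log10

bits = 1000
configs_log10 = bits * log10(2)        # log10 of the number of distinct 1,000-bit strings (~301)

atoms = 1e80                           # assumed atoms in the observable cosmos
age_seconds = 1e17                     # assumed age of the cosmos in seconds
events_per_atom_per_s = 1e45           # assumed (very generous) event rate per atom

events_log10 = log10(atoms) + log10(age_seconds) + log10(events_per_atom_per_s)  # 142

print(f"configurations     ~ 10^{configs_log10:.0f}")                  # ~10^301
print(f"events available   ~ 10^{events_log10:.0f}")                   # ~10^142
print(f"fraction reachable ~ 10^{events_log10 - configs_log10:.0f}")   # ~10^-159
```

The sketch only shows how the numbers are obtained; whether that arithmetic settles anything about origins is exactly what the thread disputes.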
Zachriel:
Evolution.
Define "evolution" and then produce the evidence that it can produce CSI from scratch.
It’s not an analogy, and there’s nothing about “complex specificity” in the laws of thermodynamics.
CSI requires magnitudes of order. And that doesn't come from disorder.
When a human builds and uses a computer, is that consistent with the laws of thermodynamics?
No. If nature, operating freely, did that, it would violate thermodynamics. And the same goes for a living organism.
Virgil Cain
December 28, 2015 at 12:35 PM PDT
kairosfocus: You are asking for lucky noise to create FSCO/I.
No. Evolution.
kairosfocus: And indeed, the point is that the high contingency and complex specificity are tied to different aspects of the snowflake. Your analogy collapses.
It's not an analogy, and there's nothing about "complex specificity" in the laws of thermodynamics.
kairosfocus: PS: When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines.
You didn't answer the question. When a human builds and uses a computer, is that consistent with the laws of thermodynamics?
Zachriel
December 28, 2015 at 12:22 PM PDT
Would work even be possible without entropy?
Mung
December 28, 2015 at 12:06 PM PDT
PPS: Work is best summed up as forced, ordered motion at macro or micro levels. Functionally specific organisation arises from step-by-step sequences of work that assemble an entity per its wiring diagram, hence construction work arising from shaft work. Heat engines provide a partial conversion of random molecular agitation to shaft work, and that is a main reason why this cannot be 100% efficient per the Carnot theorem. Ordered motion can much more readily be converted into work, as with wind flow and turbines or electrical currents etc. Heat engines reject waste heat to the environment.
kairosfocus
December 28, 2015 at 11:25 AM PDT
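The Carnot limit mentioned in the comment above (heat engines cannot be 100% efficient) has a simple closed form, eta = 1 - T_cold/T_hot. A minimal sketch; the temperatures are illustrative choices, not values from the comment.

```python
# Carnot bound on heat-engine efficiency: eta = 1 - T_cold / T_hot.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of input heat convertible to shaft work."""
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("need T_hot > T_cold > 0 (kelvin)")
    return 1.0 - t_cold_k / t_hot_k

# e.g. a boiler at 800 K rejecting waste heat to surroundings at 300 K
print(f"Carnot limit: {carnot_efficiency(800.0, 300.0):.1%}")   # 62.5%, never 100%
```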
Z That is directly connected, are you aware of the statistical basis for entropy and the second law? You are asking for lucky noise to create FSCO/I. Second, there is patently no programmed inclination in nature to form particular D/RNA and AA sequences. Otherwise the flexibility in D/RNA and in proteins would be impossible. That is the flexibility and highly informational character of such molecules precisely undermines any claim or suggestion that we deal with the structuring dynamic that imposes sixfold symmetry on snowflakes. And, where snowflakes are highly variable, that too would be a case where the flexibility is not preprogrammed to give particular outcomes. And indeed, the point is that the high contingency and complex specificity are tied to different aspects of the snowflake. Your analogy collapses. I think you need to attend to what Orgel and Wicken had to say in the 1970's:
Wicken, 1979: ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and common- place initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [[“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65. (Emphases and notes added. Nb: “originally” is added to highlight that for self-replicating systems, the blue print can be built-in.)] Orgel, 1973: . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague idea can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure. [--> this is of course equivalent to the string of yes/no questions required to specify the relevant "wiring diagram" for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002, also cf here, here and here (with here on self-moved agents as designing causes).] One can see intuitively that many instructions are needed to specify a complex structure. [--> so if the q's to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions. [--> do once and repeat over and over in a loop . . . ] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes. [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196. Of course, that immediately highlights OOL, where the required self-replicating entity is part of what has to be explained (cf. Paley here), a notorious conundrum for advocates of evolutionary materialism; one, that has led to mutual ruin documented by Shapiro and Orgel between metabolism first and genes first schools of thought, cf here. Behe would go on to point out that irreducibly complex structures are not credibly formed by incremental evolutionary processes and Menuge et al would bring up serious issues for the suggested exaptation alternative, cf. his challenges C1 - 5 in the just linked. 
Finally, Dembski highlights that CSI comes in deeply isolated islands T in much larger configuration spaces W, for biological systems functional islands. That puts up serious questions for origin of dozens of body plans reasonably requiring some 10 - 100+ mn bases of fresh genetic information to account for cell types, tissues, organs and multiple coherently integrated systems. Wicken's remarks a few years later as already were cited now take on fuller force in light of the further points from Orgel at pp. 190 and 196 . . . ]
For years, it has been pointed out that this is the source context for discussions of CSI and more particularly functionally specific CSI, i.e. FSCO/I. KF
PS: When we design, construct and use computers, we carry out organised work sequences that are controlled informationally and use work-producing machines. The exported waste energy secures consistency with Lex2 Th, and illustrates the only empirically warranted source for the relevant FSCO/I, intelligently directed configuration. Your suggestion to confine intelligence to humans collapses by failing the giggle test.
kairosfocus
December 28, 2015 at 10:37 AM PDT
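Orgel's "minimum number of instructions" idea quoted in the comment above can be loosely illustrated by using compressed size as a crude stand-in for description length. This is our illustration, not Orgel's or the commenter's method, and compression only separates repetition from randomness; it says nothing about whether a sequence is functional or "specified."

```python
# Crude description-length contrast: a repetitive (crystal-like) string has a
# short recipe, a random string does not. zlib size is only a rough proxy.
import os
import zlib

ordered = b"AB" * 500              # simple repetition: "do AB, repeat 500 times"
random_ = os.urandom(1000)         # random mixture: no recipe much shorter than itself

for label, s in (("ordered", ordered), ("random", random_)):
    packed = zlib.compress(s, 9)
    print(f"{label:8s} raw = {len(s):4d} bytes, compressed = {len(packed):4d} bytes")
```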
kairosfocus: the pivotal issue is spontaneous origin of functionally specific complex organisation
The question concerned thermodynamic entropy.
kairosfocus: Under such circumstances, such spontaneous formation is maximally implausible for the same reason that there is a strong tendency to the predominant cluster of microstates.
So are snowflakes, given particles randomly arranged. When a human builds and uses a computer, is that consistent with the laws of thermodynamics?
Zachriel
December 28, 2015 at 8:38 AM PDT
Z, the pivotal issue is spontaneous origin of functionally specific complex organisation and associated information in Darwin's warm pond or the like, without mechanisms to couple energy to step by step procedures that perform constructive work. Under such circumstances, such spontaneous formation is maximally implausible for the same reason that there is a strong tendency to the predominant cluster of microstates. This also extends to the problem of elaborating novel body plans without an intelligent author of relevant instructions etc. KF
kairosfocus
December 28, 2015 at 8:29 AM PDT
ayearningforpublius: So my question is — what is the role of entropy in the Neo-Darwinian explanation of life … how life develops … how life came to be … etc. And yes, please understand that my exposure to Biology textbooks and a class I took way back when are distant memories, so please don’t refer me to those resources.
While the Second Law of Thermodynamics states that overall entropy will increase, it is quite possible for areas within a system to experience decreased entropy. Consider a simple example as an analogy. Energy from the Sun flows through the Earth's surface system, then exits into the cold of space. As it does so, water evaporates, whirlpools form in the air and water, water crystallizes and falls as snow, glaciers move towards the sea, and many other examples of low entropy are brought about as the energy flows through the Earth's system. From a thermodynamic view, think of life as a complex eddy in the stream of energy.
ayearningforpublius: And as a bonus Q/A, how and does entropy relate to my questions @4?
They don't relate particularly. Biological processes are all consistent with the laws of thermodynamics. Evolution is consistent with the laws of thermodynamics. Thermodynamics is not an explanation for life, but any explanation for life has to be consistent with thermodynamics. When a human builds and uses a computer, is that consistent with the laws of thermodynamics?
Zachriel
December 28, 2015 at 7:45 AM PDT
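The "local decrease, global increase" point in the comment above can be put in numbers with elementary entropy book-keeping: heat Q leaving a hot source lowers its entropy by Q/T_hot, while the cold surroundings gain Q/T_cold, and the total change is positive. The temperatures and Q below are illustrative assumptions.

```python
# Entropy book-keeping for heat flowing from a hot source to cold surroundings.
Q = 1000.0        # joules transferred (illustrative)
T_hot = 5800.0    # kelvin, roughly solar-surface temperature (assumed)
T_cold = 300.0    # kelvin, roughly Earth-surface temperature (assumed)

dS_hot = -Q / T_hot        # the hot source loses entropy (a local decrease)
dS_cold = +Q / T_cold      # the cold sink gains more
dS_total = dS_hot + dS_cold

print(f"hot: {dS_hot:+.3f} J/K, cold: {dS_cold:+.3f} J/K, total: {dS_total:+.3f} J/K")
# total > 0: a subsystem can become more ordered while overall entropy still rises
```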
Kindergarten Q/A again: This thread moved to quite a debate about entropy, and I'm seeking an answer. In very simple language, my reading of this word translates to "decay." Things over time decay and waste away; abandoned cars rust away, abandoned buildings crumble and fall, my life decays and, unless ended catastrophically, some of my critical parts will decay and all the rest of me will die.
So my question is -- what is the role of entropy in the Neo-Darwinian explanation of life ... how life develops ... how life came to be ... etc. And yes, please understand that my exposure to Biology textbooks and a class I took way back when are distant memories, so please don't refer me to those resources.
And as a bonus Q/A, how and does entropy relate to my questions @4?
Thanks
ayearningforpublius
December 28, 2015 at 7:30 AM PDT
No, actually I’m still not convinced you’ve actually opened a biology textbook, Mungy.
First of all, if you had ever opened a general biology textbook and read the early chapters, you’d see that entropy is consistently thought of as a measure of disorder. Here are some examples:
Molecular Biology of the Cell, chapter 2: “The amount of disorder in a system can be quantified. The quantity that we use to measure this disorder is called the entropy of the system: the greater the disorder, the greater the entropy. Thus, a third way to express the second law of thermodynamics is to say that systems will change spontaneously toward arrangements with greater entropy.”
Molecular Cell Biology, chapter 2: “Entropy S is a measure of the degree of randomness or disorder of a system. Entropy increases as a system becomes more disordered and decreases as it becomes more structured.”
Protein Structure and Function, chapter 1: “Entropy is a measure of randomness or disorder.”
The Cell: A Molecular Approach, chapter 2: “The change in free energy (ΔG) of a reaction combines the effects of changes in enthalpy (the heat that is released or absorbed during a chemical reaction) and entropy (the degree of disorder resulting from a reaction) to predict whether or not a reaction is energetically favorable.”
Second, let’s see what your favorite book says about entropy: “…entropy is a measure of the number of different ways of rearranging the system.” Your book says entropy is a measure of disorder, without actually saying the word “disorder.” This is because physicists know “entropy = disorder” is a bit of an oversimplification, but it works. In fact, those are the exact words that will come out of every professor’s mouth when they teach about entropy in college-level biology classes. For instance, Ahmet Yildiz from UC Berkeley, who uses your book in his biophysics course, says that “entropy (degree of disorder) gets maximized in spontaneous processes in isolated systems.”
Third, I’d put money on the fact that you used Google ebooks to search “order” and “disorder” in this book you love to talk about so much.
So no, I still don’t think you’ve opened up a biology book, and even if you have, you haven’t learned a single thing about biology. Don’t let the door hit you on the way out, Mungy.
Alicia Cartelli
December 27, 2015 at 9:49 AM PDT
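The textbook line quoted above, that entropy measures "the number of different ways of rearranging the system," is essentially Boltzmann's S = k_B ln W. A toy sketch for N two-state particles, with W the multiplicity of a macrostate; the system and the numbers are made up for illustration.

```python
# S = k_B * ln(W) for a toy system of N two-state particles.
from math import comb, log

k_B = 1.380649e-23   # J/K

def entropy(n_particles: int, n_excited: int) -> float:
    W = comb(n_particles, n_excited)   # number of microstates with that many excited
    return k_B * log(W)

N = 100
for n in (0, 10, 50):                  # all ground, slightly mixed, evenly mixed
    print(f"n_excited = {n:3d}   S = {entropy(N, n):.3e} J/K")
# the evenly mixed macrostate has the most arrangements, hence the largest entropy
```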
Alicia Cartelli:
Mungy, why do you think that chapter is titled “Entropy Rules!”?
I can tell you what I do know. The chapter never once claims that entropy has anything at all to do with order/disorder. The word disorder doesn't even appear in the chapter, and while the word order appears twice, its use on those two occasions has nothing at all to do with order/disorder as you're using it. And that's from a textbook on the Physical Biology of the Cell. Still think I haven't opened a biology textbook?
Mung
December 26, 2015 at 9:26 PM PDT
Atomic Structure of Graphene. Ordered or disordered? High entropy or low entropy? You decide.
Mung
December 26, 2015 at 6:00 PM PDT
Atomic Structure of Diamond. Ordered or disordered? High entropy or low entropy? You decide.
Mung
December 26, 2015 at 5:58 PM PDT
Mung, Happy Christmas. As Orgel and Wicken pointed out so long ago now, order [as in a crystal] is not to be confused with either randomness [e.g. a tar] or organisation, especially functionally specific, complex wiring-diagram organisation and associated information, such as this text and functional DNA both exemplify. And thereby lieth a long tale. KF
PS: L. K. Nash is a double thumbs up! (I wish I had been smart enough as an undergrad to go to the Chem section! At least, I eventually figured out that probably the best physics etc. textbook writers were the Russians. Thanks to a second-hand bookshop in Barbados, and Mir Publishers, now first privatised then defunct. Jackpot was the Communist Party bookshop in Jamaica.)
kairosfocus
December 25, 2015 at 1:04 AM PDT
Merry Christmas, kairosfocus. You've been a good friend for a long time. Thank you.
Mung
December 24, 2015 at 8:03 PM PDT
According to the old view, the second law was viewed as a 'law of disorder'. The major revolution in the last decade is the recognition of the "law of maximum entropy production" or "MEP" and with it an expanded view of thermodynamics showing that the spontaneous production of order from disorder is the expected consequence of basic laws. http://www.entropylaw.com/
YIKES!
Mung
December 24, 2015 at 8:00 PM PDT
The diagrams above have generated a lively discussion, partly because of the use of order vs disorder in the conceptual introduction of entropy. It is typical for physicists to use this kind of introduction because it quickly introduces the concept of multiplicity in a visual, physical way with analogies in our common experience. Chemists, on the other hand, often protest this approach because in chemical applications order vs disorder doesn't communicate the needed ideas on the molecular level and can indeed be misleading. The very fact of differences of opinion on the use of order and disorder can itself be instructive. http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html
Mung
December 24, 2015 at 7:58 PM PDT
In physics, the terms order and disorder designate the presence or absence of some symmetry or correlation in a many-particle system. https://en.wikipedia.org/wiki/Order_and_disorder_%28physics%29
Mung
December 24, 2015 at 7:55 PM PDT
Merry Christmas all … A good and lively discussion, and I hope everyone gained something here.
Merry Christmas! I did gain something, but not without an increase in entropy!
Mung
December 24, 2015 at 7:07 PM PDT
Merry Christmas all ... A good and lively discussion, and I hope everyone gained something here.
ayearningforpublius
December 24, 2015 at 6:56 PM PDT
Before buying on Amazon, always check half.com.
Virgil Cain
December 24, 2015 at 6:25 PM PDT
Haha, looking at the price on Amazon now I remember why I hadn't bought it. Ah well, settled for a used copy.
Mung
December 24, 2015 at 6:10 PM PDT
Hmm... thought I had that one, guess I need to get it. Still don't have the Jaynes text either. The books I do have though aren't too shabby :)
Arieh Ben-Naim - A Farewell to Entropy: Statistical Thermodynamics Based on Information
Arieh Ben-Naim - Statistical Thermodynamics: With Applications to the Life Sciences
H. B. Callen - Thermodynamics
H. B. Callen - Thermodynamics and an Introduction to Thermostatistics
Amnon Katz - Principles of Statistical Mechanics: The Information Theory Approach
A. I. Khinchin - Mathematical Foundations of Statistical Mechanics
Lewis and Randall - Thermodynamics
Leonard K. Nash - Elements of Statistical Thermodynamics
H. C. Van Ness - Understanding Thermodynamics
Fortunately for Alicia, I haven't given these away. IIRC the Callen texts explicitly derive thermodynamics from information theory. As you say, the connection between thermodynamics and information theory is well known and well established and has been for decades.
Mung
December 24, 2015 at 6:07 PM PDT
Mung, have a look in Robertson's Statistical Thermophysics, Ch 1.
Will do.
Mung
December 24, 2015 at 5:44 PM PDT
Mung, Harry Robertson, in using aircraft in a control tower's area reduced to molecular size and numbers as an analogy, pointed out that the loss of access to detailed information means we cannot extract work from detailed knowledge of the position and motion of each a/c. So we are forced to treat the whole system [thermodynamic sense] as though it were random. In this context, the increased number of possibilities for distribution of particle-level mass and energy consistent with gross-scale observable states can often be associated with qualitative concepts of increased disorder -- randomness here being (understandably) associated with disorder. A classic case is free expansion from a half container to accessing the whole container through suddenly removing a barrier. But that is by no means a definition or a detailed, one-size-covers-all physical insight. KF
PS: Mung, have a look in Robertson's Statistical Thermophysics, Ch 1.
kairosfocus
December 24, 2015 at 5:30 PM PDT
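The free-expansion case in the comment above has a standard closed form: when a gas of N molecules is suddenly allowed to fill twice the volume, the entropy rises by N k_B ln 2 (equivalently nR ln 2 per mole). A minimal sketch using one mole for illustration.

```python
# Entropy increase for free expansion into double the volume: dS = N * k_B * ln(2).
from math import log

k_B = 1.380649e-23      # J/K
N_A = 6.02214076e23     # molecules per mole

N = N_A                 # one mole of gas, chosen for illustration
delta_S = N * k_B * log(2)

print(f"dS for doubling the volume of one mole ~ {delta_S:.2f} J/K")   # ~5.76 J/K
```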
Aleta, the correction I had to make above -- that your truncated clip manages to reverse the actual point on information and thermodynamics being conceded by the Wiki article -- suffices to show what is going on. It is a further telling point that you are busily doubling down. We take due note, and move on. The bottomline for record, as noted long since from that and other Wiki articles speaking against interest:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann’s constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
KF
PS: Aleta, FYI, as one trained in Physics, I have had relevant exposure to classical and statistical thermodynamics as a necessary core part of that training. I am not a self-proclaimed "expert" or autodidact. The thermodynamics stands on its own merits.
kairosfocus
December 24, 2015 at 5:21 PM PDT
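The Jaynes and Landauer points in the passage quoted above can be sketched numerically: the Gibbs entropy of a distribution over microstates equals k_B ln 2 times its Shannon entropy in bits, and erasing one bit at temperature T dissipates at least k_B T ln 2 of heat. The probabilities and temperature below are illustrative assumptions.

```python
# Shannon entropy in bits, the corresponding Gibbs entropy, and the Landauer bound.
from math import log, log2

k_B = 1.380649e-23   # J/K

def shannon_bits(p):
    """Shannon entropy (bits) of a discrete distribution p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]        # assumed probabilities of four microstates
H = shannon_bits(p)                  # 1.75 bits
S = k_B * log(2) * H                 # corresponding Gibbs entropy in J/K

T = 300.0                            # kelvin (assumed)
landauer_per_bit = k_B * T * log(2)  # minimum heat to erase one bit, ~2.9e-21 J

print(f"H = {H} bits, S = {S:.3e} J/K, erase cost/bit at 300 K = {landauer_per_bit:.2e} J")
```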
Aleta:
Is it not true that in some areas, notably thermodynamics, one way to think about entropy is as a measure of order vs disorder?
I'm probably not the one to ask, hehe. People can certainly think of things that way, but it leads to unnecessary confusion. Take a look at the picture with the rocks at this link: http://physics.info/temperature/
Does calling the energy ordered or disordered really add anything of value? Does it tell us anything about entropy?
Mung
December 24, 2015 at 5:20 PM PDT
We have seen that the Boltzmann distribution can be derived on the basis of counting arguments as a reflection of the number of ways of partitioning energy between a system and a reservoir. A completely different way of deriving the Boltzmann distribution is on the basis of information theory and effectively amounts to making a best guess about the probability distribution given some limited knowledge about the system such as the average energy. - Physical Biology of the Cell (p. 253)
That's pretty amazing, that the Boltzmann distribution can be derived from information theory. (Not really.)
Mung
December 24, 2015 at 5:09 PM PDT
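The passage quoted above says the Boltzmann distribution is what maximum-entropy inference yields when the only constraint beyond normalisation is the average energy. A minimal numeric sketch: pick a target mean energy, solve for beta by bisection, and the resulting p_i proportional to exp(-beta E_i) is that maximum-entropy distribution. The energy levels and target are made-up illustrative numbers.

```python
# Boltzmann distribution as the max-entropy distribution with a fixed mean energy.
from math import exp

levels = [0.0, 1.0, 2.0, 3.0]      # energy levels (arbitrary units, assumed)
target_mean_E = 0.8                # the single constraint besides normalisation

def boltzmann(beta):
    weights = [exp(-beta * E) for E in levels]
    Z = sum(weights)
    return [w / Z for w in weights]

def mean_energy(beta):
    return sum(p * E for p, E in zip(boltzmann(beta), levels))

lo, hi = 0.0, 50.0                 # mean energy falls monotonically as beta grows
for _ in range(100):               # plain bisection on beta
    mid = (lo + hi) / 2
    if mean_energy(mid) > target_mean_E:
        lo = mid
    else:
        hi = mid

beta = (lo + hi) / 2
print(f"beta ~ {beta:.3f}, p = {[round(p, 3) for p in boltzmann(beta)]}")
```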