Uncommon Descent: Serving the Intelligent Design Community

Here’s the Rundown on the Latest Evolution Blackball Operation


University of Texas El Paso mathematics professor Granville Sewell wrote a paper on how the second law of thermodynamics bears on the theory of evolution. The paper was peer-reviewed and accepted for publication. But after a blogger complained, the journal, Applied Mathematics Letters (AML), pulled the article, in violation of its own professional standards. That evolutionary blackball operation ended up costing the journal $10,000 in attorney's fees. Read more

Comments
H'mm: Seems to me life is based on a functionally specific, complex, organised nanotech automaton that uses algorithms and codes, effects a von Neumann self-replicator and more, all using c-Chemistry, aqueous-medium informational macromolecules. In turn, those molecules are controlled products of a specific algorithmic assembly process. And it turns out the protein folds are apparently isolated to 1 in 10^70 or thereabouts in proteinome space. Denton's summary was:
To grasp the reality of life as it has been revealed by molecular biology, we must magnify a cell a thousand million times until it is twenty kilometers in diameter [so each atom in it would be “the size of a tennis ball”] and resembles a giant airship large enough to cover a great city like London or New York. What we would then see would be an object of unparalleled complexity and adaptive design. On the surface of the cell we would see millions of openings, like the port holes of a vast space ship, opening and closing to allow a continual stream of materials to flow in and out. If we were to enter one of these openings we would find ourselves in a world of supreme technology and bewildering complexity. We would see endless highly organized corridors and conduits branching in every direction away from the perimeter of the cell, some leading to the central memory bank in the nucleus and others to assembly plants and processing units. The nucleus itself would be a vast spherical chamber more than a kilometer in diameter, resembling a geodesic dome inside of which we would see, all neatly stacked together in ordered arrays, the miles of coiled chains of the DNA molecules. A huge range of products and raw materials would shuttle along all the manifold conduits in a highly ordered fashion to and from all the various assembly plants in the outer regions of the cell. We would wonder at the level of control implicit in the movement of so many objects down so many seemingly endless conduits, all in perfect unison. We would see all around us, in every direction we looked, all sorts of robot-like machines . . . . We would see that nearly every feature of our own advanced machines had its analogue in the cell: artificial languages and their decoding systems, memory banks for information storage and retrieval, elegant control systems regulating the automated assembly of components, error fail-safe and proof-reading devices used for quality control, assembly processes involving the principle of prefabrication and modular construction . . . . However, it would be a factory which would have one capacity not equaled in any of our own most advanced machines, for it would be capable of replicating its entire structure within a matter of a few hours . . . . Unlike our own pseudo-automated assembly plants, where external controls are being continually applied, the cell's manufacturing capability is entirely self-regulated . . . . [Denton, Michael, Evolution: A Theory in Crisis, Adler, 1986, pp. 327-331.]
That is precisely not a case where we have a pretty loose macro-level description and a comparatively very large number of microstates of molecules or their atomic components compatible with it. But it sounds to me uncommonly like there is a struggle to accept the reality that is there before us. KF
kairosfocus
May 1, 2012, 09:33 AM PDT
I would say the way to refute Sewell is by demonstrating that blind and undirected chemical processes can produce a living organism from non-living matter.
Joe
May 1, 2012, 08:28 AM PDT
@Arthur Hunt at 11, Barry responds: Arthur, you are correct. Life is not like a game of Scrabble. The information systems in each and every cell are vastly more complex and specified than the words developed in Scrabble. But to paraphrase Inigo Montoya, “I do not think your observation means what you think it means.”
Barry Arrington
May 1, 2012, 08:24 AM PDT
Just so my point is not missed: Sewell claims that some mythical second law precludes evolution, seemingly because he sees life as a low-entropy proposition. However, the fact of the matter is that life is a high-entropy state of affairs. This fact renders Sewell's long-running monologue on the subject irrelevant. No amount of blathering about imaginary concepts (which is what FSCO/I and its many synonyms are) can change this.
Arthur Hunt
May 1, 2012, 05:37 AM PDT
ES & BA (and AH): The issue is indeed the difference between functionally specific, complex -- and indeed here, cybernetic -- organisation and associated information, and that sort of ordering that will spontaneously arise in nature. Unfortunately, there has been such a talking-point storm in recent years that the significance of this distinction has been lost sight of. Now, AH, FYI, it is indeed true that chance and intelligence are both sources of highly contingent outcomes, and it is true that mechanical necessity left to itself will produce highly predictable outcomes on given initial circumstances. Howbeit, once we see highly specific, complex, functional configurations, such are going to be so deeply isolated in the space of possibilities that it is utterly implausible that mechanical necessity and blind chance will lead to such on the gamut of our observed cosmos. Indeed, it is strictly logically and physically possible that BA's blender full of Scrabble letters will toss out in perfect sequence the words of this post or a similar message in English. However, such configs are so deeply isolated in the space of possibilities that this is not feasible on the gamut of our observed cosmos -- even if the cosmos were magically transformed into blenders, power stations, Scrabble letters and mechanisms to repeat the exercise impossibly fast for the whole of the cosmos' lifespan. (A small numerical sketch of this isolation argument follows this comment.) That is one of the key points that design thinkers have made for decades now, since Hoyle spoke about tornadoes, junkyards and 747s. If objectors had taken time to think carefully about the key point being made, much unnecessary confusion and contention would have been saved over these past thirty years. But, when a lot of smart and educated folks keep failing to see a fundamentally simple and easily shown point, there is likely an ideological reason at work, backed up by polarisation of discussion. Let us hope that AH will pause and reflect sufficiently to see why he has missed what should be plain on a few moments' reflection. Mr Hunt, please notice: we all know the difference between shaking oil and water up and letting them settle out, and letting an ink spot diffuse in a glass of water. We also know that if you decant the contents of a living cell into a test tube and allow diffusion to occur [notice how Sewell stresses the underlying issue of diffusion in his writings], a cell will not re-emerge, with all but certainty. Similarly, we know why, in a warm little electrified pond with precursors, it is unreasonable to expect that a nanotech, informational-polymer-based, digital-code-using information system constituting a metabolising, von Neumann self-replicating automaton will spontaneously emerge, even with opportunities on the gamut of the observed cosmos. So, we have every epistemic right to demand just what is not on offer -- empirical demonstration -- if we are to surrender the inductively well-founded principle that such FSCO/I is a characteristic sign pointing to its routinely observed source: design. It is the same demand, in the end, as insisting that before we surrender thermodynamics, we need to SEE a perpetuum mobile. And a similar case applies to the organisation of novel body plans by chance variation and differential reproductive success -- notice, not to adaptation of the same, or to loss of functions that give some trench-warfare advantage or other in distressed environments. KF
kairosfocus
May 1, 2012, 04:01 AM PDT
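To make the isolation argument in the comment above concrete, here is a minimal Python sketch of the arithmetic. The 500-character message length, 27-symbol alphabet, and the ~10^150 bound on blind trials are illustrative assumptions made for this sketch, not figures taken from the comment itself.

```python
import math

# Illustrative assumptions only: alphabet size, message length, and the
# trial bound are chosen for the sketch, not taken from the comment above.
ALPHABET = 27          # 26 letters plus a space
MESSAGE_LEN = 500      # characters in a short English message
MAX_TRIALS_EXP = 150   # log10 of a generous bound on blind trials

# log10 of the number of possible 500-character strings over this alphabet.
space_exp = MESSAGE_LEN * math.log10(ALPHABET)
print(f"configuration space: ~10^{space_exp:.0f}")                    # ~10^716

# Even 10^150 blind trials sample a vanishing fraction of that space.
print(f"fraction searchable: ~10^{MAX_TRIALS_EXP - space_exp:.0f}")   # ~10^-566
```

On these assumptions, blind search could ever visit only about one part in 10^566 of the space, which is the sense in which such configurations are said to be "deeply isolated" in the space of possibilities.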
F/N: Let me clip from my second link above, citing Wikipedia as inadvertently testifying against known interest on the link between entropy and information -- or, more exactly, on the missing information regarding which microstate is compatible with a given macrostate:
Further to this, we may average the information per symbol in the communication system thusly (giving in terms of -H to make the additive relationships clearer): -H = p1 log p1 + p2 log p2 + . . . + pn log pn, or H = -SUM [pi log pi] . . . Eqn 5. (A short numerical illustration of Eqn 5 follows the Wikipedia excerpt below.) H, the average information per symbol transmitted [usually measured as bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1 below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate. >>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
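To make Eqn 5 above concrete, here is a minimal Python sketch of the average-information-per-symbol calculation; the four-symbol probability sets are invented examples, not data from the sources cited.

```python
import math

def shannon_entropy(probs):
    """Average information per symbol, H = -SUM[p_i log2 p_i], in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum uncertainty for four symbols
skewed  = [0.97, 0.01, 0.01, 0.01]   # one outcome nearly certain
certain = [1.0, 0.0, 0.0, 0.0]       # outcome known in advance

print(shannon_entropy(uniform))  # 2.0 bits/symbol
print(shannon_entropy(skewed))   # ~0.24 bits/symbol
print(shannon_entropy(certain))  # 0.0 bits/symbol
```

Note how H falls from its maximum (the flat distribution) to zero (one probability equal to unity), matching Robertson's observation below that a probability of 1 corresponds to complete information.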
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles, i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc.], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
And, in more detail (pp. 3 - 6, 7, 36; cf. Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Semura here and here):
. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . .
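A minimal sketch of the numerical bridge described in the excerpts above: one bit of missing microstate information corresponds to kB ln 2 of thermodynamic entropy, and, per Landauer, erasing it at temperature T costs at least kB T ln 2 of heat. The 300 K figure is an assumed room temperature, used only for illustration.

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, J/K

def entropy_of_bits(n_bits):
    """Thermodynamic entropy (J/K) corresponding to n_bits of missing information."""
    return n_bits * K_B * math.log(2)

def landauer_cost(n_bits, temperature_k):
    """Minimum heat (J) to erase n_bits at the given temperature (K)."""
    return entropy_of_bits(n_bits) * temperature_k

print(f"1 bit = {entropy_of_bits(1):.3e} J/K")                     # ~9.570e-24 J/K
print(f"erasing 1 bit at 300 K >= {landauer_cost(1, 300):.3e} J")  # ~2.871e-21 J
```

The smallness of these numbers is the point the Wikipedia excerpt makes about Boltzmann's constant: everyday thermodynamic entropy changes dwarf anything seen in data compression or signal processing.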
In short, CH seems to be unaware of, or improperly dismissive towards, developments in this view of thermodynamics. KF
kairosfocus
May 1, 2012, 02:19 AM PDT
Onlookers (and CH): Sadly, sockpuppet status for CH is apparently confirmed. Having issued a challenge on thermodynamics, then having been answered by a link or two that directly address it, CH reverts to bluster and more of the same. If you cannot do the courtesy of reading and responding to that which anticipates your concerns, CH, then that is a sign that you are not here for serious discussion but to push tendentious talking points and to spew uncivil personalities (as you did regarding a professor of mathematics commenting on the significance of partial differential equations, PDEs). Until you change that approach, CH, there is no point in attempting dialogue with one who is evidently only here to shout. G'day, GEM of TKI
kairosfocus
May 1, 2012, 02:07 AM PDT
I agree with Eric Anderson at #11. Order should not be confused with organisation. As a matter of fact, there is a theorem due to Klimontovich which states that the further away an open system is from the dynamic equilibrium state, the more probable the loss of entropy in it becomes. This is an analogue of Boltzmann's famous theorem for an isolated system (evolution of an isolated system towards an equilibrium state). I am skeptical of GS's results. However, I acknowledge that I may have missed something in his argument or did not fully understand it. In fairness, he was talking about the rate of energy exchanges, which is not the same as energy deltas. Having said all this, I would emphasise again that order is not the same as organisation/functionality/semantics/formal control. Nobody has yet demonstrated that self-organisation (i.e. spontaneous formation) of multiple parts into a hierarchical and controlled whole is a plausible phenomenon. It just does not happen in nature. Self-ordering routinely happens. Formal control, i.e. the intention of a system towards a goal, cannot credibly arise spontaneously in a goal-inert environment, without intelligent agency. "The First Gene" by David Abel deals with all these issues nicely.
Eugene S
May 1, 2012, 01:48 AM PDT
Arthur Hunt: You are confusing order with specified complexity. Those are two very different things.
Eric Anderson
April 30, 2012, 10:01 PM PDT
Hi Barry, Life is not like a game of Scrabble. Which makes your metaphor rather pointless.
Arthur Hunt
April 30, 2012, 09:32 PM PDT
@Arthur Hunt re [9], And then put a bunch of Scrabble letters in that same blender, hit the puree button, dump it out, and voila! you get the sonnets of Shakespeare. It’s magic.
Barry Arrington
April 30, 2012, 09:14 PM PDT
Um, I don't suppose it would help to point out that increases in entropy actually promote the assembly of macromolecular structures in a cell. Biochemically speaking, the statement (which I believe is implied in Sewell's piece here, as well as in his many other essays on the subject) that macroscopic ordering in the cell is disallowed by the second law of thermodynamics is uninformed gibberish. Look at things this way: shake up a mixture of salad oil and water (run a tornado through the cruet) and then let it sit a spell. The mixture - bajillions of molecules - will always spontaneously assemble into a highly ordered state. This is akin to a tornado-stricken town spontaneously rebuilding, with the important difference that the liquids actually do spontaneously order. Better still, this macroscopic ordering is entropy-driven. It happens because of the second law, not in spite of it. This is what happens in cells. And this is why Sewell's illustrations are irrelevant when it comes to life and evolution.
Arthur Hunt
April 30, 2012, 09:00 PM PDT
'chris haynes': yes, the relation of TSL (the second law) to information is fascinating. Sewell did not make it up, and you look ignorant for not knowing anything about it.
butifnot
April 30, 2012, 08:17 PM PDT
Here is my reply to the Lloyd piece in the Mathematical Intelligencer.
Granville Sewell
April 30, 2012, 05:32 PM PDT
No answers? I say you're bluffing. That's why you can't defend Dr Sewell properly. And I say nobody can. Instead you revert to the tactics of science-establishment groupies. You insult those who dare question Sewell's competence. You distract with a link to a long string of generalities. And you invoke the names of big shots like Lewis, to prove God knows what point. Let's deal with Sewell and his claims. Dr Sewell uses entropy to describe information. Entropy has units of joules per DEGREE KELVIN. That's temperature! Temperature and information? I say Dr Sewell is either loony, or a charlatan. You disagree. Fine, but defend him properly. Tell us how information entropy is related to temperature. 1) Take DNA. What temperature applies to its information? 2) When information is lost, what temperature is used to find the entropy increase?
chris haynes
April 30, 2012, 05:19 PM PDT
Yeah I think it's entirely possible he's a sock puppet. Note the ridiculousness of his #1:
"As a creationist, I deplore those who censure* him."
Just to note, as a space cowboy, I deplore censorship* myself; and as a stamp collector, I find it simply unbearable. ;-)
"…his paper is unintelligible gobbledegook." … "I suspect he is a charlatan, and that he has neither a coherent statement nor a definition."
Ironically, chris haynes states that he deplores "those who censure*" and then proceeds to censure Sewell in a manner quite lacking any hint of class or grace. Never one to miss an opportunity, he responds to an article discussing the unfair treatment of Granville Sewell by dishing out more unfair treatment. Double irony bonus points have been awarded here. He then goes on to fault PaV for ad hominem and sweeping generalities. *snicker* Yeah, sock puppet or oblivious bumbler, or both. Either way it would be amusing in another context -- as a creationist, that is.
material.infantacy
April 30, 2012, 04:25 PM PDT
CH: If you are serious -- and right now you unfortunately reek of sock puppet (of which we have had waves here at UD) -- start here and follow the onward links. Notice the use of the Clausius expression and the bridge to the micro-statistical view. Here, on the bridge to info, noting Gilbert Newton Lewis especially, will also help. And BTW, PAV, one of UD's most serious commenters and an occasional contributor, is dead right. KF
kairosfocus
April 30, 2012, 03:47 PM PDT
Respectfully, you didn't answer the question. As you said, "the second law and entropy have been defined". True. But not by Dr Sewell. Take entropy. It has been defined as: "Entropy is a property of a system equal to the lost work, divided by the temperature of the reservoir used to determine the lost work." Its units are joules per degree Kelvin. Dr Sewell uses entropy to describe information. 1) Take DNA. What temperature applies to its information? 2) When information is lost, why does the entropy increase less when the information is at high temperature? I hold that such questions reveal the nonsense of Dr Sewell's work. As a Creationist, I resent that his nonsense damages our outstanding reputation for clear thought. You disagree. Okay, but instead of ad hominems and sweeping generalities, please just answer the questions.
chris haynes
April 30, 2012, 01:58 PM PDT
Chris haynes: You say you're a "Creationist", and yet you think and write like an evolutionist. So which is it? The "Second Law" and "entropy" have already been defined. Consult standard thermodynamics texts. He invented neither term; he's simply pointing out the more precise understanding of entropy that is lost sight of when people claim that a "decrease of entropy is allowed" as long as it's an "open system." But maybe you can't follow his argument.
PaV
April 30, 2012, 01:12 PM PDT
Dr Sewell wrote a paper on "the Second Law of Thermodynamics". As a Creationist, I deplore those who censure him. However, it might be a blessing to Creationism, as his paper is unintelligible gobbledegook. For openers, he didn't bother to state the Second Law, or define entropy, although he applies them, somehow, to information and to evolution. I suspect he is a charlatan, and that he has neither a coherent statement nor a definition. Take entropy. It has units of joules per degree Kelvin. How does temperature apply to information? How is it that the "entropy of information" is less, or greater, at higher temperatures? Perhaps one of Dr Sewell's defenders could let us know.
chris haynes
April 30, 2012, 12:44 PM PDT
