
Granville Sewell’s important contribution to physics: Entropy-X


Abstract:   In Sewell’s discussion of entropy flow, he defines “Entropy-X”, a very useful concept that should clarify misconceptions of entropy promulgated by Wikipedia and by scientists unfamiliar with thermodynamics. Sewell’s important contribution is to argue that one can and should reduce the “atom-entropy” into subsets of mutually exclusive “entropy-X”. Mathematically, this is like factoring an N x M matrix into block-diagonal form, by showing that the cross terms between blocks do not contribute to the total. Each of the blocks on the diagonal then corresponds to a separately computed entropy, or “Entropy-X”.  This contribution not only clarifies many of the misunderstandings of laypersons, it also provides a way for physicists to overcome their confusion regarding biology.

– o –

Introduction:     Entropy was initially discussed in terms of thermodynamics, as a quantity that came out of the energy, temperature, and work formulas.  Ludwig Boltzmann found a way to relate billiard-ball counting statistics to this thermodynamic quantity, with a formula now inscribed on his tombstone:  S = k ln(Ω). The right-hand side of this equation contains the logarithm of the number of possible ways to arrange the atoms. The left-hand side is the usual thermodynamic quantity. Relating the two different worlds of counting and heat is the constant “k”, now called the “Boltzmann constant”.
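
As a concrete illustration (a toy two-state system, not anything from Sewell's paper), the counting side of the formula is a one-liner in Python:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

    def boltzmann_entropy(omega: int) -> float:
        """S = k ln(Omega): entropy from a count of accessible microstates."""
        return k_B * math.log(omega)

    # 100 independent two-state particles ("coins"): Omega = 2^100.
    print(boltzmann_entropy(2**100))  # ~9.6e-22 J/K, since ln(Omega) is only ~69.3

The role of the conversion constant is visible here: the counting side produces a pure number, and k turns it into thermodynamic units.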

Now for the shocking part: there is no theory that predicts its value. It is a conversion constant that is experimentally determined. It works best when the real physical system approximates billiard balls, such as the noble gases. The constant gets progressively worse, or needs more adjustments, as the gas becomes N2 (diatomic) or CO2 (triatomic). Going from gas to liquid introduces even more corrections, and by the time we get to solids we use a completely different formula.

For example, the 1991 Nobel Prize in Physics (Pierre-Gilles de Gennes) was awarded for studying how long oily molecules move around in a liquid, because not every state of rearrangement is accessible to tangled strings.  So 100 years after Boltzmann, we are just now tackling liquids and gels and trying to understand their entropy.

Does that mean we don’t know what entropy is?

No, it means that we don’t have a neat little Boltzmann factor for relating thermodynamic-S to counting statistics. We still believe that it is conserved; we just can’t compute the number very easily.  This is why Granville Sewell uses “X-entropy” to describe all the various forms of order in the system. We know they must be individually conserved, barring some conversion between the various types of entropy in the system, but we can’t compute them very well.

Nor is it simply that the computation gets too large. For example, in a 100-atom molecule, the entropy is NOT computed by looking at all 100! permutations of the atoms, simply because many of those arrangements are energetically impossible.  Remember, when Boltzmann wrote “ln(Ω)” he was counting the possible states of the system. If a state is too energetic, it isn’t accessible without a huge amount of energy.  In particle physics, this limitation is known as “spontaneous symmetry breaking”, and it is responsible for all the variation we see in our universe today.

So rather than counting “atom states”, we assemble atoms into molecules and form new entities that act as complete units, as “molecules”, and then the entropy consists of counting “states of the molecule”–a much smaller number than “states of the atoms of the molecules”.  Molecules form complexes, and then we compute “states of the molecular complexes”. And complexes form structures, such as membranes. Then we compute “states of the structures”. This process continues to build as we approach the size of a cell, and then we have to talk about organs and complete organisms, and then ecologies and Gaia. The point is that our “unit” of calculation is getting larger and larger as the systems display larger and larger coherence.
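
A toy calculation shows why this matters. Assume, purely for illustration, that every independent unit has roughly the same number of accessible states:

    import math

    def ln_omega(n_units: int, states_per_unit: int) -> float:
        """ln(Omega) for n independent units with equal state counts."""
        return n_units * math.log(states_per_unit)

    STATES = 1000          # accessible states per unit -- an illustrative guess
    atoms = 1_000_000
    print(ln_omega(atoms, STATES))           # counting atom states
    print(ln_omega(atoms // 100, STATES))    # same matter as 100-atom molecules
    print(ln_omega(atoms // 10_000, STATES)) # ... as 10,000-atom complexes

Each level of coherence divides the number of independent units, and ln(Ω) falls in proportion.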

Therefore it is completely wrong to talk about single-atom entropy and the entropy of sunlight when we are discussing structural entropy, for as Granville and previous commentators have said, the flow of energy in sunlight with a coherence length of one angstrom cannot explain the decameter coherence of a building.

So from consideration of the physics, it is possible to construct a hierarchical treatment of entropy which enables entropy to address the cell, but in 1991 we had barely made it to the oily-liquid stage of development. So on the one hand, contrary to what many commentators imagine, physicists don’t know how to compute an equivalent “Boltzmann equation” for the entropy of life; but on the other hand, Granville is also right: we don’t need to compute the Boltzmann entropy to show that it must be conserved.

– o –

Mathematical Discussion:   Sewell’s contribution is to recognize that there must be a hierarchical arrangement of atoms that permits the intractable problem of calculating the entropy to be treated as a sum of mutually exclusive sub-entropies.

This contribution can be seen in the difference between “Entropy-chromium” and “Entropy-heat” that he introduces in his paper, where Entropy-chromium is the displacement of chromium atoms in a matrix of iron holding the velocity of the atoms constant, and Entropy-heat considers a variation in velocities while holding the positions constant.  These two types of entropy are separated by a large energy barrier, on the order of several eV per atom, that prevents them from interconverting. At sufficiently high temperature–say, the temperature at which the iron-chromium alloy was poured–the chromium atoms have sufficient mobility to overcome the energy barrier and move around. But at room temperature, they are immobile.  So in the creation event of the chromium bar, the entropy was calculated for both position and velocity, but as it cooled, “spontaneous symmetry breaking” produced two smaller independent entropies from the single larger one.
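
A back-of-envelope Arrhenius estimate shows why the two entropies stop interconverting as the bar cools; the 3 eV barrier and 10^13 Hz attempt frequency below are illustrative assumptions, not figures from Sewell's paper:

    import math

    k_B_eV = 8.617333e-5  # Boltzmann constant, eV/K

    def hop_rate(barrier_eV: float, T: float, attempt_hz: float = 1e13) -> float:
        """Arrhenius estimate of how often an atom hops over an energy barrier."""
        return attempt_hz * math.exp(-barrier_eV / (k_B_eV * T))

    print(hop_rate(3.0, 1800.0))  # near the pour temperature: ~4e4 hops per second
    print(hop_rate(3.0, 300.0))   # room temperature: ~4e-38 hops per second -- frozen in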

Now the beginning of this calculation is the full-up atom entropy where everything is accessible. This “big-bang” entropy gives at least 7 degrees of freedom for each atom. That is, the number of bins available for each atom is at least 7–one for the species, 3 that give the position in x,y,z and 3 that give the velocity in Vx,Vy,Vz.  We could subdivide species into all the different quantum numbers that define atoms, but for the moment we’ll ignore that nuance.  In addition, the quantization of space and velocity into “Planck” sizes (about 10^-35 meters for length) means that our bins do not always hold a real number, but a quantized length or velocity specified by an integer number of Planck sizes.  But again, the real quantization is that atoms don’t overlap, so we can use a much coarser quantization of 10^-10 meters, or angstrom, atomic lengths. The reason this is important is that we are reducing ln(Ω) by restricting the number of states of the system that we need to consider.
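
As a sketch of this binning (every number below is chosen only for illustration):

    import math

    def ln_omega_binned(n_atoms, species, pos_bins_per_axis, vel_bins_per_axis):
        """ln(Omega) when each atom independently occupies one bin per degree
        of freedom: species x (x,y,z position) x (Vx,Vy,Vz velocity)."""
        bins_per_atom = species * pos_bins_per_axis**3 * vel_bins_per_axis**3
        return n_atoms * math.log(bins_per_atom)

    # A 1 cm box at angstrom (1e-10 m) resolution: 1e8 position bins per axis.
    coarse = ln_omega_binned(10**20, 92, 10**8, 10**6)
    # The same box at 100x finer position resolution.
    fine = ln_omega_binned(10**20, 92, 10**10, 10**6)
    print(coarse, fine)  # finer bins give a strictly larger ln(Omega)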

But if S = k ln(Ω), then this means we are mathematically throwing away entropy! How is that fair?

There is a curious result in quantum mechanics that says if we can’t distinguish two particles, then there is absolutely no difference if they are swapped. This is another way of saying that their position entropy is zero. So if we have two states of a system, separated by a Planck length, but can’t tell the difference, it doesn’t contribute to the entropy.  Now this isn’t to say that we can’t invent a system that can tell the difference, but since a Planck length corresponds to light of gamma-ray energies and beyond, we really have to go back to the Big Bang to find a time when this entropy mattered.   This reconfirms our assumption that as a system cools, it loses entropy and has fewer and fewer states available.

But even this angstrom coarse-graining in position, represented by “Entropy-Chromium”, is still too fine for the real world, because biology is not made out of noble gases bouncing in a chamber. Instead, life exists in a matrix of water, an H2O molecule a few tenths of a nanometer across. Just as we cannot tell the difference if we swap the two hydrogen atoms in the water molecule around, we can’t tell the difference if we swap two water molecules around. So the quantization gets even coarser, and the number of states of the system shrinks, simply because the atoms form molecules.
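
The indistinguishability point can be made quantitative with the standard N! (Gibbs) correction; the particle and bin counts below are arbitrary:

    import math

    def ln_omega(bins: int, n: int, distinguishable: bool) -> float:
        """ln(Omega) for n particles spread over `bins` single-particle states.
        For indistinguishable particles, swaps are the same microstate, so we
        divide Omega by n!, i.e. subtract ln(n!) = lgamma(n + 1)."""
        ln_w = n * math.log(bins)
        if not distinguishable:
            ln_w -= math.lgamma(n + 1)
        return ln_w

    print(ln_omega(10**6, 1000, distinguishable=True))   # ~13816
    print(ln_omega(10**6, 1000, distinguishable=False))  # ~7904: swaps don't count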

A very similar argument holds for the velocities. A hydrogen atom can’t have every velocity possible because it is attached to an oxygen atom. That chemical bond means that the entire system has to move as a unit. But QM tells us that as the mass of a system goes up, the wavelength goes down, which is to say, the number of velocities we have to consider in our binning is reduced as we have a more massive system. Therefore the velocity entropy drops as the system becomes more chemically bound.
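
A quick check of that scaling with the de Broglie relation λ = h/(mv), holding the speed fixed (a simplification; real thermal speeds also fall with mass, which shrinks the wavelength even faster):

    h = 6.62607015e-34    # Planck constant, J*s
    amu = 1.66053907e-27  # atomic mass unit, kg

    def de_broglie(mass_kg: float, speed: float) -> float:
        """de Broglie wavelength: lambda = h / (m v)."""
        return h / (mass_kg * speed)

    v = 500.0  # m/s, a representative thermal speed (illustrative)
    for name, m in [("H atom", 1 * amu), ("H2O", 18 * amu), ("protein", 1e4 * amu)]:
        print(name, de_broglie(m, v))  # ~8e-10, ~4e-11, ~8e-14 meters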

And of course, life is mostly made out of polymers of 100’s or 1000’s of nanometers in extent, which have even more constraints as they get tangled around each other and attach or detach from water molecules. That was what the 1991 Nobel prize was about.

Mathematically, we can write the “Big Bang” entropy as an N x M matrix, where N is the number of particles and M the number of bins. As the system cools and becomes more organized, the entropy is reduced, and the matrix becomes “block-diagonal”, where blocks can correspond to molecules, polymer chains, cell structures, organelles, cells, etc.
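
The payoff of block-diagonal structure is that the total multiplicity factorizes, Omega = Omega_1 x Omega_2 x ..., so the sub-entropies simply add; a sketch with made-up block sizes:

    import math

    k_B = 1.380649e-23  # J/K

    def total_entropy(block_multiplicities):
        """If the state space factors into independent blocks,
        Omega = product(Omega_i), so S = k * sum(ln Omega_i)."""
        return k_B * sum(math.log(w) for w in block_multiplicities)

    blocks = [10**30, 10**12, 10**5]  # e.g. molecules, chains, structures
    print(total_entropy(blocks))      # identical to k * ln(10^47)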

Now here is the key point.

If we only considered the atom positions, and not these molecular and macro-molecular structures, the matrix would not be in block-diagonal form. Then when we computed the Boltzmann entropy, ln(Ω), we would have a huge number of states available. But in fact, biology forms so much structure that ln(Ω) is greatly reduced.  All those cross-terms in the matrix are empty, because they are energetically inaccessible, or topologically inaccessible (see the 1991 Nobel prize). Sewell is correct: there is a tremendous amount of order (or reduced entropy) that is improperly calculated if life is considered as a ball of noble gas.

Let me say this one more time. Sewell is not only correct about the proper way to calculate entropy, he has made a huge contribution in arguing for the presence of (nearly) non-convertible forms of sub-entropy.

– o –

Thermodynamic Comment:   Some have argued that this “sub-entropy” of Sewell’s can be explained by some sort of spontaneous symmetry breaking due to the influx of energy. We have talked about “cooling” causing spontaneous symmetry breaking, which is consistent with the idea that higher temperatures have higher entropy, but the idea that “heating” can also drive symmetry breaking and lowered entropy is incoherent. This is simply because thermodynamically dE = T dS, or dS = dE/T, which is to say, energy brings entropy and temperature simultaneously.

Let’s look at this a bit more closely and apply it to the Earth.  The Sun is at 5500 K, and therefore its energy comes principally in the form of yellow photons.  The Earth’s global temperature averages out to about 300 K, so it emits infrared photons.   In thermodynamic equilibrium, the energy entering has to balance the energy leaving (or the Earth would heat up!).  Since the entropy of the photons hitting the Earth is almost twenty times smaller than the entropy of the photons leaving it, the Earth must be a net source of entropy. Ergo, sunlight should be randomizing every molecule in the Earth and preventing life from happening.
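
The “almost twenty times” follows from the standard result that blackbody radiation carries entropy S = (4/3) E/T; the 4/3 prefactor cancels in the ratio:

    T_sun, T_earth = 5500.0, 300.0  # K: incoming sunlight vs outgoing infrared

    # Equal energy in and out (steady state), so the entropy ratio is just:
    print(T_sun / T_earth)  # ~18.3 -- almost twenty times more entropy leaving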

Does this make sense? I mean, everybody and their brother says that entropy can decrease if you have a heat engine in the system. Energy comes into your refrigerator as low-entropy energy. Energy billows out of the coils in the back as high-entropy heat. But inside the fridge is a low-entropy freezer.  Couldn’t this apply to Earth? (This is the “compensation” argument.)

Well, if the Earth had a solar-powered refrigerator, and some insulation, and some fancy piping, yes. But of course, all that is an even lower-entropy system than the electricity, so we are invoking bigger and bigger systems to get the entropy in the small freezer to go down. The more realistic comparison is without all that special pleading. Imagine you leave your freezer door open accidentally and leave for the weekend. What will you find? A melted freezer and a very hot house, because your fridge would run continuously trying to cool something down that warmed even faster. In fact, this is how dehumidifiers function. So your house would be hot and dry, with a large puddle of coolish water in the kitchen. Some fridges dump that water into a pan under the heating coils, which would evaporate the water and we would then be back to our original state but at a higher temperature. Would the entropy of the kitchen be greater or smaller than if you had unplugged the fridge for the weekend? Greater, because of all that heat.

And in fact, this is the basic argument for why objects that sit in space for long periods of time have surfaces that are highly randomized.  This is why Mars can’t keep methane in its atmosphere. This is why Titan has aerosols in its atmosphere. This is why the Moon has a sterile surface.

If the Earth is unique, there is more than simply energy flow from the Sun that is responsible.

Comments
KF, since you mentioned my freezing water example, let me remind you that you still haven't rebutted my simple 4-step argument:
CS3, I’ve mentioned this a couple of times already but people (including you) haven’t picked up on it, so let me try again. When Granville argues against the compensation idea, he is unwittingly arguing against the second law itself. It’s easy to see why. Imagine two open systems A and B that interact only with each other. Now draw a boundary around just A and B and label the contents as system C. Because A and B interact only with each other, and not with anything outside of C, we know that C is an isolated system (by definition). The second law tells us that the entropy cannot decrease in any isolated system (including C). We also know from thermodynamics that the entropy of C is equal to the entropy of A plus the entropy of B. All of us (including Granville) know that it’s possible for entropy to decrease locally, as when a puddle of water freezes. So imagine that system A is a container of water that becomes ice. Note:

1. The entropy of A decreases when the water freezes.
2. The second law tells us that the entropy of C cannot decrease.
3. Thermodynamics tells us that the entropy of C is equal to the entropy of A plus the entropy of B.
4. Therefore, if the entropy of A decreases, the entropy of B must increase by at least an equal amount to avoid violating the second law.

The second law demands that compensation must happen. If you deny compensation, you deny the second law. Thus Granville’s paper is not only chock full of errors, it actually shoots itself in the foot by contradicting the second law! It’s a monumental mess that belongs nowhere near the pages of any respectable scientific publication. The BI organizers really screwed up when they accepted Granville’s paper.
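
To make the four steps concrete, here is a quick numeric check with stock textbook values (1 kg of water freezing at 273.15 K, latent heat of fusion about 334 kJ/kg, surroundings at 263 K); the numbers are illustrative:

    # Numeric check of the 4-step argument above (illustrative values).
    Q = 334_000.0             # J, latent heat released by A: 1 kg of water freezing
    T_A, T_B = 273.15, 263.0  # K: freezing water, colder surroundings

    dS_A = -Q / T_A           # step 1: entropy of A decreases (~ -1223 J/K)
    dS_B = +Q / T_B           # the heat received by B raises its entropy (~ +1270 J/K)
    print(dS_A + dS_B)        # ~ +47 J/K: the total for C = A + B does not decrease
    assert dS_A + dS_B >= 0   # steps 2-4: compensation preserves the second law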
keiths
July 9, 2013, 12:44 AM PDT
CS3: Well said, and well cited. The second law of thermodynamics has the peculiarity of being rooted in a broad principle and pattern of observation, leading to a plurality of statements of varying breadth. In the original context of heat engines they will naturally be very close, but some are more narrow, others less so. For instance, one of the first formulations was that one could not build a heat engine whose only function would be to convert heat into work (i.e. random molecular motion cannot wholly be converted into orderly forced motion). That is, it banned a certain type of perpetual motion machine as impossible; in effect daring you to provide a successful counter-example.

The familiar formulation in terms of isolated systems and net entropy being at least preserved was itself already a widening from that and the like. The formulation in statistical terms, based on more or less accessible clusters of states that are discernible at micro level and consistent with a macro-level set of conditions, that systems tend to migrate to clusters of higher statistical weight, sometimes phrased higher thermodynamic probability, is an analysis of the former in light of the understanding that emerged across C19 and into early C20, that matter was empirically established as atomic and molecular. (Boltzmann's sad end was in part due to how the fierce objections he encountered to his atomism excited his condition. That gravestone marker has a sad point to it; it is not merely celebratory of an achievement.)

Next, there is a tendency to suggest that the law has applications only to isolated systems and that open ones are unconstrained by it. That early formulation in terms of forbidding a certain class of perpetual motion machine gives the lie to that. And in fact the formulation in terms of isolated systems is meant to give an ideal context that then grounds what happens when we move to the cases of wider interest, first the heat engine -- obviously an open system. And in studying thermodynamics, as soon as the law is put, it is combined with the first law to be used in analysis of such.

Going further, the isolated system context examines subsystems open to heat flows [in the relevant terminology, "closed" systems, "open" ones being those open to mass and energy flows, etc.], and the changes that occur once a quantum of heat is passed due to temperature difference. It shows that once a subsystem B receives such a quantum, its entropy, defined in terms of a quantity increasing as dS >= (-d'Q/T_A) + (+d'Q/T_B), tends to rise in the absence of a compensating change elsewhere (which requires an energy conversion mechanism such as a heat engine, coupled to the intake of energy, and will normally also require exhausting energy as waste heat to a heat sink, C). The loss of entropy of the donating subsystem A is then algebraically compensated by the higher value of the rise in B. Moving to the analytical level on the micro picture, the rise in the number of ways energy and mass can be arranged at micro level so far increases that it exceeds the number of ways lost by A.

GS's key contribution is that he has aptly highlighted that there is another empirically confirmed analytical factor at work: diffusion-like processes, including heat spreading. Such processes tend to undo concentrations across time, spreading out the concentrated item. Such is driven by random walks through accessible interactions by atoms, molecules, etc.
Random walks leading to spreading out are then tied to the point that clusters of states that are rare in the overall state/phase space -- we here reckon with not only positions and masses but momentum etc. (which are tied to energy measures) -- continue to be rare relative to the overwhelming bulk clusters, whether or not a system is isolated or open to energy and/or mass inflows or outflows. That may seem trivial, but it is pivotal, and it is obviously easily missed. It means that there will be a spontaneous tendency of systems to gravitate in the state space to accessible configuration clusters that are dominant, rather than to those that are rare and isolated: we deal with cases where the numbers of possibilities are beyond astronomical, even for something as simple as 500 - 1,000 coins; that means that random walk based processes are not likely to be able to access such, as the atomic and temporal resources of our solar system or the observed cosmos are not adequate to search a significant part of the phase space. That is, we are inherently unable to sample enough of the space of possibilities to make the observation of vanishingly rare, special and specific clusters reasonably observable. (E.g. we have no reason to expect to see 500H on tossing coins, even if we were to expend major efforts on such on the gamut of the atomic and temporal resources of our solar system. [Significantly, the ideologically motivated objectors we are dealing with here at UD refuse to acknowledge this point. Not on grounds that it is poorly warranted, but evidently because they think that since any one state is as improbable as any other, we should show no surprise to see any state appear. To do this, they refuse to acknowledge the significance of clusters of possibilities that dominate a space of states.]) Consequently, the observation of "counterflow" leading to such isolated clusters needs to be explained on mechanisms that make them not vanishingly improbable. Let me reproduce the analysis that I presented at 6 above (and in a previous thread) to draw this out a bit more:
For instance (following a useful simple model of diffusion in Yavorsky and Pinski's nice elementary Physics [MIR, 1974]), if we have ten each of white and black marbles in two rows in a container:

||**********||
||0000000000||

There is but one way to be as shown, but over 63,000 ways to be 5 B/W in each row, and the 6:4 and 4:6 cases are close to such numbers: 44,000+. So, if the system is sufficiently agitated for balls to swap positions at random on a regular basis, it soon moves towards a dominant cluster near the 5:5 peak and "forgets" the initial state. This basic pattern works for heat, diffusion of ink drops in beakers of water, C atoms in a bar of steel, and more. The basic reason for such does not go away if the system is opened up to energy and mass flows, and the point that rare, specifically and simply describable states will seldom be revisited or found, for enough complexity -- 500 - 1,000 bits -- soon becomes that such states are beyond the reach of the solar system's or the observed cosmos' search capacity. RS's point that there are states that can be locked away from interaction, so that it is reasonable to partition entropy accounting, is also quite useful. My own emphasis is that we need to see the difference between what diffusion-like factors/forces will strongly tend to do and what produces shaft work thence constructive work ending in FSCO/I. Let me try a second diagram using textual features:

Heat transfer in an isolated system:
|| A (at T_a) --> d'Q --> B (at T_b) ||
dS >= (-d'Q/T_a) + (+d'Q/T_b), Lex 2 Th in a common form

Heat engine, leaving off the isolation of the whole:
A --> d'Q_a --> B' =====> D (shaft work)
Where also, B' --> d'Q_b --> C, heat disposal to a heat sink

Where of course the sum of heat inflows d'Q_a will equal the sum of the rise in internal energy of B, dU_b, work done on D, dW, and heat transferred onwards to C, d'Q_b. The pivotal questions are: the performance of constructive work by dW, the possibility that this results in FSCO/I, the possibility that B itself shows FSCO/I enabling it to couple energy and do shaft work, and the suggested idea that d'Q's can spontaneously lead to shaft work constructing FSCO/I as an informational free lunch. By the overwhelming statistical weights of accessible microstates consistent with the gross flows outlined, this is not credibly observable on the gamut of our solar system or the observed cosmos. There is but one empirically, reliably observed source for FSCO/I: design. The analysis on statistical weights of microstates and random forces such as diffusion and the like shows why. Which brings us to the point Hoyle was making: letting his tornado stand in for an agitating energy source triggering configs at random, tornadoes passing through junkyards don't build jumbo jets or even the instruments on their panels. But similar intelligently directed expenditures, through precise patterns of work, routinely do build jumbo jets.
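
Those marble counts are easy to verify directly; with a 10-marble top row, the number of arrangements with w white marbles up top is C(10,w) x C(10,10-w):

    from math import comb

    # 10 white + 10 black marbles; the top row holds w whites and 10-w blacks.
    ways = {w: comb(10, w) * comb(10, 10 - w) for w in range(11)}
    print(ways[5])            # 63504 -- the "over 63,000" 5:5 split
    print(ways[6], ways[4])   # 44100 each -- the "44,000+" 6:4 and 4:6 cases
    print(ways[0], ways[10])  # 1 each -- the fully sorted arrangements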
Now, yes, the 10 W + 10 B balls example is a toy example, just like the 500 coins. They are accessible, amenable to calculation, and help build a broader, deeper intuition by grounding a concept and providing a reference point. In this case, we see what diffusion is about, at a simple level. One that makes sense of the observation that, say, once we drop an ink drop in a beaker of water, over time it spreads out and eventually becomes dispersed, but never do we see it spontaneously reforming. Once similar forces and factors are at work, the same sort of dispersive pattern will obtain, for the same reason of dominant clusters of possibilities that are accessible. GS is right to highlight that diffusion-like spreading-out processes are central to our understanding of the second law. RS properly highlights that when energy access barriers obtain, we can have partitioning of dispersive effects leading to a proper subscripting of our diffusion and entropy accounting. X-entropy or a similar terminology -- though not particularly common -- is reasonable.

And the next point is now quite plain. We cannot reasonably expect diffusion-like disorganising forces to give us access to rare clusters, given the limits of our relevant search resources on earth, in the solar system, in the observed cosmos. Functionally specific, complex organisation and associated information are not credibly accessible through diffusion-like disorganising, disordering forces. And we can properly see why, even without having to precisely calculate exact probabilities (yet another hyperskeptical objection), once we recognise the force of sampling theory for blind samples made with resources inadequate to capture more than a small snapshot of the bulk of a distribution of accessible possibilities. Where also, the very definition of FSCO/I itself underscores how rare it will be: multiple, well matched parts need to be correctly arranged and coupled together for function, thus sharply constraining the set of acceptable configurations. As an example, the letters in this post are sharply constrained by the rules of English text, in order to function effectively as a message. So, if the locations or selection of character states from the ASCII set were by contrast to be selected blindly, we would very soon have gibberish due to the equivalent of a diffusion-like process: fieghqvkehju . . . Likewise, constraining forces similar to those of crystallisation (e.g., KS's attempt to use the freezing of water) will produce repetitive order, not information-rich functional organisation: FGFGFGFGFGF . . .

With these on the table, we can see how the design inference on signs such as FSCO/I is rooted in underlying analysis of thermodynamics considerations. Now, of course, I do not expect ideologues to accept or even recognise the above, but that is beside the point. Our interest is in reasonable discussion in light of evidence and analysis, leading to a better insight into empirical reality, not the sort of agenda pushing that has led to the rhetorical mess and worse we see all about. KF

kairosfocus
July 9, 2013, 12:28 AM PDT
Granville:
I am well aware that the idea that “entropy” is a single quantity which measures, in units of thermal entropy, all types of disorder is widespread, and promoted for the same reason that you promote it, that is one of the main points my BioComplexity article refutes. But it is patently false. There are applications of the second law which are quantifiable, such as the diffusion of chromium given in my scenario (B), which obviously have nothing to do with heat or energy,
Of course your scenario B has "to do with heat or energy"! How do you think the chromium diffuses through the steel bar if not by work being done on the chromium atoms, and thus the thermal entropy decreasing!
just with probability
"Probability" is a meaningless concept without reference to the generative process under which we compute the probability distribution. In the case of your chromium, it is the thermal energy in the bar - the jiggling in a uniform distribution of directions that makes uniformly diffused chromium the most probable final outcome.
When Asimov talked about the entropy decrease associated with the development of the human brain, he was obviously NOT talking about any change in thermal entropy,
You think a human brain can develop without any change in thermal entropy? It can't even function without changes in thermal entropy! That's why people like me can measure which parts of the brain are active at any time - by measuring proxies for brain metabolism!
and many other examples of “entropy” increases are given in many texts which obviously have nothing to do with thermal entropy, such as those I mentioned earlier.
It's not obvious to me that your examples "have nothing to do with thermal entropy". In fact it seems extremely clear that they have!

Elizabeth B Liddle
July 9, 2013, 12:23 AM PDT
There may be all kinds of applications of the general principle that improbable things are more improbable than probable things; indeed it's what underlies the principle of null hypothesis testing. But that doesn't make the 2nd Law of thermodynamics not about thermodynamics. Or is it possible that Granville and CS3 are confusing heat, which is measured in joules, with temperature, which is measured in degrees? If Granville's point is not about energy, i.e. something measured in joules, then it is not about the 2nd Law of thermodynamics, and obviously in that case, any counter argument based on the assumption that it is, e.g. about compensation and external energy sources, will miss its mark. But if this is the case, then Granville should stop invoking the 2nd Law, or, at the very least, make it clear that he is only using it metaphorically. CS3: Can you give me an example of a broader practical application of the 2nd Law that is not about thermodynamic energy? Where thermodynamic energy is defined as the wiki entry has it:
The internal energy of a system, also often called the thermodynamic energy, includes other forms of energy in a thermodynamic system in addition to thermal energy, namely forms of potential energy that do not influence temperature and do not absorb heat, such as the chemical energy stored in its molecular structure and electronic configuration, and the nuclear binding energy that binds the sub-atomic particles of matter.
Elizabeth B Liddle
July 9, 2013, 12:00 AM PDT
And if you don’t believe me that all statements of the 2nd Law of thermodynamics must be equivalent to be statements of the 2nd law, try this.
They are equivalent in the context of thermal entropy. Some can also be applied more broadly.

CS3
July 8, 2013, 10:26 PM PDT
More details from University Physics by Young and Freedman, in a section entitled "Microscopic Interpretation of Entropy" in the chapter "The Second Law of Thermodynamics":
Entropy is a measure of the disorder of the system as a whole. To see how to calculate entropy microscopically, we first have to introduce the idea of macroscopic and microscopic states. Suppose you toss N identical coins on the floor, and half of them show heads and half show tails. This is a description of the large-scale or macroscopic state of the system of N coins. A description of the microscopic state of the system includes information about each individual coin: Coin 1 was heads, coin 2 was tails, coin 3 was tails, and so on. There can be many microscopic states that correspond to the same macroscopic description. For instance, with N=4 coins there are six possible states in which half are heads and half are tails. The number of microscopic states grows rapidly with increasing N; for N=100 there are 2^100 = 1.27x10^30 microscopic states, of which 1.01x10^29 are half heads and half tails. The least probable outcomes of the coin toss are the states that are either all heads or all tails. It is certainly possible that you could throw 100 heads in a row, but don't bet on it: the possibility of doing this is only 1 in 1.27x10^30. The most probable outcome of tossing N coins is that half are heads and half are tails. The reason is that this macroscopic state has the greatest number of corresponding microscopic states. To make the connection to the concept of entropy, note that N coins that are all heads constitutes a completely ordered macroscopic state: the description "all heads" completely specifies the state of each one of the N coins. The same is true if the coins are all tails. But the macroscopic description "half heads, half tails" by itself tells you very little about the state (heads or tails) of each individual coin. We say that the system is disordered because we know so little about its microscopic state. Compared to the state "all heads" or "all tails", the state "half heads, half tails" has a much greater number of possible microstates, much greater disorder, and hence much greater entropy (which is a quantitative measure of disorder). Now instead of N coins, consider a mole of an ideal gas containing Avogadro's number of molecules. The macroscopic state of this gas is given by its pressure p, volume V, and temperature T; a description of the microscopic state involves stating the position and velocity for each molecule in the gas. At a given pressure, volume, and temperature the gas may be in any one of an astronomically large number of microscopic states, depending on the positions and velocities of its 6.02x10^23 molecules. If the gas undergoes a free expansion into a greater volume, the range of possible positions increases, as does the number of possible microscopic states. The system becomes more disordered, and the entropy increases. We can draw the following general conclusion: For any system the most probable macroscopic state is the one with the greatest number of corresponding microscopic states, which is also the macroscopic state with the greatest disorder and the greatest entropy.
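
The textbook's figures can be checked in a couple of lines of Python:

    from math import comb

    total = 2**100        # all microstates of 100 tossed coins
    half = comb(100, 50)  # microstates with exactly half heads
    print(f"{total:.3e}") # ~1.268e+30
    print(f"{half:.3e}")  # ~1.009e+29
    print(half / total)   # ~0.08: even the most probable macrostate is only ~8%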
Sewell's statement follows directly from this: in an isolated system, the reason natural forces (such as tornados) "may turn a spaceship, or a TV set, or a computer into a pile of rubble but not vice-versa is also probability: of all the possible arrangements atoms could take, only a very small percentage could fly to the moon and back, or receive pictures and sound from the other side of the Earth, or add, subtract, multiply and divide real numbers with high accuracy."

CS3
July 8, 2013, 09:40 PM PDT
One more, from Chemistry by Zumdahl and Zumdahl.
The natural progression of things is from order to disorder, from lower entropy to higher entropy. To illustrate the natural tendency toward disorder, you only have to think about the condition of your room. Your room naturally tends to get messy (disordered), because an ordered room requires everything to be in its place. There are simply many more ways for things to be out of place than for them to be in their places. As another example, suppose you have a deck of playing cards ordered in some particular way. You throw these cards into the air and pick them all up at random. Looking at the new sequence of the cards, you would be very surprised to find that it matched the original order. Such an event would be possible, but very improbable. There are billions of ways for the deck to be disordered, but only one way to be ordered according to your definition. Thus the chances of picking the cards up out of order are much greater than the chance of picking them up in order. It is natural for disorder to increase. Entropy is a thermodynamic function that describes the number of arrangements (positions and/or energy levels) that are available to a system existing in a given state. Entropy is closely associated with probability. The key concept is that the more ways a particular state can be achieved, the greater is the likelihood (probability) of finding that state. In other words, nature spontaneously proceeds toward the states that have the highest probabilities of existing. This conclusion is not surprising at all. The difficulty comes in connecting this concept to real-life processes. For example, what does the spontaneous rusting of steel have to do with probability? Understanding the connection between entropy and spontaneity will allow us to answer such questions. We will begin to explore this connection by considering a very simple process, the expansion of an ideal gas into a vacuum. Why is this process spontaneous? The driving force is probability. Because there are more ways of having the gas evenly spread throughout the container than there are ways for it to be in any other possible state, the gas spontaneously attains the uniform distribution. ... Nature always moves toward the most probable state available to it.
CS3
July 8, 2013, 09:34 PM PDT
Of note: Two papers investigate the thermodynamics of quantum systems - July 8, 2013 Excerpt: As one of the pillars of the natural sciences, thermodynamics plays an important role in all processes that involve heat, energy, and work. While the principles of thermodynamics can predict the amount of work done in classical systems, for quantum systems there is instead a distribution of many possible values of work. Two new papers published in Physical Review Letters have proposed theoretical schemes that would significantly ease the measurement of the statistics of work done by quantum systems.,,, "Fundamentally, we could start exploring quantum thermodynamics, which puts together a genuine quantum approach and the rock-solid foundations of thermodynamics," he said. "We (and a few other researchers) are trying to do it from an information theoretic viewpoint, hoping to get new insight into this fascinating area.,,, http://phys.org/news/2013-07-papers-thermodynamics-quantum.html related interest: "Is there a real connection between entropy in physics and the entropy of information? ....The equations of information theory and the second law are the same, suggesting that the idea of entropy is something fundamental..." Siegfried, Dallas Morning News, 5/14/90, [Quotes Robert W. Lucky, Ex. Director of Research, AT&T, Bell Laboratories & John A. Wheeler, of Princeton & Univ. of TX, Austin] Maxwell's demon demonstration (knowledge of a particle's position) turns information into energy - November 2010 Excerpt: Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a "spiral-staircase-like" potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information. http://www.physorg.com/news/2010-11-maxwell-demon-energy.html Demonic device converts information to energy - 2010 Excerpt: "This is a beautiful experimental demonstration that information has a thermodynamic content," says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. "This tells us something new about how the laws of thermodynamics work on the microscopic scale," says Jarzynski. http://www.scientificamerican.com/article.cfm?id=demonic-device-converts-inform Quantum knowledge cools computers: New understanding of entropy – June 2011 Excerpt: No heat, even a cooling effect; In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. 
Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.” http://www.sciencedaily.com/releases/2011/06/110601134300.htm Information and entropy – top-down or bottom-up development in living systems? A.C. McINTOSH - Dr Andy C. McIntosh is Professor of Thermodynamics and Combustion Theory at the University of Leeds (the highest teaching/research rank in the U.K. university hierarchy). Excerpt: This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate. http://journals.witpress.com/paperinfo.asp?pid=420

bornagain77
July 8, 2013, 07:07 PM PDT
Since some continue to misrepresent the literature to fit their own views:
But nobody is claiming that solely because there is an entropy increase somewhere, any old thing can happen somewhere nearby. What we claim is that if a system is not isolated, work can be done on that system that can result in greater order, although system doing the work will necessarily experience increased entropy.
Yet again, you are imposing your view on Styer, Bunn, and others who have made the compensation argument. If they were trying to argue, as you do, that energy makes the development of complex organisms not extremely improbable, they would not (to quote the Styer paper) estimate that “due to evolution, each individual organism is 1000 times “more improbable” than the corresponding individual was 100 years ago.” They would say something like, “Organisms may seem to be getting more improbable each century, but, in fact, they are actually becoming more probable, due to the actions of the four fundamental forces and the solar influx.” They may well see energy as related to the processes forming organisms, in that if there were no energy, nothing would happen, but they are clearly not arguing that the energy makes these processes not improbable. If they did not think anything improbable was happening, then there would be no need for them to convert the probabilities of improbable events into an entropy and compare that to a different type of entropy to satisfy an inequality. And, even if the energy were causing these events, it makes no sense for them to try to convert from the original improbability of what happened to how much energy is needed. It takes energy to flip coins, but it takes no more energy to flip all heads than to flip half heads and half tails. Even if the processes increasing the improbability of the organisms are the exact same processes as those increasing the thermal entropy, this accounting is completely invalid. If I think energy is simply making something, for example, a plant forming a flower, not improbable (and I would agree in this case), I say, as you do, that energy is making that something not improbable. Perhaps I provide some details of a mechanism by which that might be the case. If I want to know how much energy is required, I analyze the mechanism, or perhaps perform an experiment if possible. I do not compute the ratio of the number of microstates of “flower” to the number of microstates of “dirt” and plug it into the Boltzmann formula to see how much energy I need, not even as an upper or lower bound. I only do that if I am trying to compensate improbable events with events that, if reversed, would be more improbable, according to some global accounting scheme. As I challenged keiths in another thread,
Can you explain how the methodology used by Styer and Bunn cannot be used to show that “anything, no matter how improbable, can happen in a system as long as the above criterion is met?” Just substitute the probability ratio of, say, a set of a thousand coins going from half heads and half tails to all heads in place of their estimate for the increase in improbability of organisms due to evolution. Plug that into the Boltzmann formula, and compare to the thermal entropy increase. If its magnitude is less, the Second Law is satisfied.
Furthermore, I challenge you to explain how their methodology helps make the claim you are trying to make. Again, to summarize their methodology, which no one has been able to defend:
They estimate how much more “improbable” some organism is than an ancestral organism, plug that into the Boltzmann formula, and then multiply by the number of organisms and divide by the time taken to evolve, to get a value, in Joules per degree Kelvin per second, for the rate of entropy decrease due to the evolution. Then, they compare this value to the value for the rate of increase in entropy in the cosmic microwave background. So long as the magnitude of the evolution entropy decrease is less than the magnitude of the cosmic microwave background increase, they conclude that “the second law of thermodynamics is safe.”
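
Worked numerically, the coin substitution from the challenge above looks like this (a sketch; the 1000-coin figures are illustrative):

    import math
    from math import comb

    k_B = 1.380649e-23  # J/K

    # Styer-style accounting applied to 1000 coins going from half heads and
    # half tails (U_i microstates) to all heads (U_f = 1 microstate).
    U_i = comb(1000, 500)
    U_f = 1
    dS_coins = k_B * (math.log(U_f) - math.log(U_i))
    print(dS_coins)     # ~ -9.5e-21 J/K

    # For comparison, 1 J of heat entering a room at 300 K raises entropy by:
    print(1.0 / 300.0)  # ~3.3e-3 J/K -- about 10^17 times larger in magnitude

By that accounting, the thousand-coin miracle is "compensated" many orders of magnitude over, which is exactly the reductio.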
Hopefully you can forgive Sewell for writing a paper that responds to the arguments in the literature rather than to the personal views of UD posters. ------------------------------------------------------ If Sewell’s arguments have nothing to do with the Second Law, then why do all of these have to do with the Second Law? 1) Isaac Asimov publishes an article in the Smithsonian Institute Journal, entitled “In the game of energy and thermodynamics, you can’t even break even”, arguing that, when applying the Second Law of Thermodynamics to the improbability of organisms due to evolution, there is no conflict because the increase in the complexity of organisms is compensated by the increase in entropy of the Sun. To quote the article itself:
You can argue, of course, that the phenomenon of life may be an exception [to the second law]. Life on earth has steadily grown more complex, more versatile, more elaborate, more orderly, over the billions of years of the planet’s existence. From no life at all, living molecules were developed, then living cells, then living conglomerates of cells, worms, vertebrates, mammals, finally Man. And in Man is a three-pound brain which, as far as we know, is the most complex and orderly arrangement of matter in the universe. How could the human brain develop out of the primeval slime? How could that vast increase in order (and therefore that vast decrease in entropy) have taken place? Remove the sun, and the human brain would not have developed…. And in the billions of years that it took for the human brain to develop, the increase in entropy that took place in the sun was far greater; far, far greater than the decrease that is represented by the evolution required to develop the human brain.
2) Daniel Styer publishes an article in the American Journal of Physics, entitled “Entropy and Evolution”, arguing that, when applying the Second Law of Thermodynamics to the improbability of organisms due to evolution, there is no conflict because the increase in the complexity of organism is compensated by the increase in entropy of the cosmic microwave background. His paper is a quantitative version of the compensation argument frequently made in textbooks and by prominent Darwinists such as Isaac Asimov and Richard Dawkins. To quote the article itself:
Does the second law of thermodynamics prohibit biological evolution?…Suppose that, due to evolution, each individual organism is 1000 times “more improbable” than the corresponding individual was 100 years ago. In other words, if Ui is the number of microstates consistent with the specification of an organism 100 years ago, and Uf is the number of microstates consistent with the specification of today’s “improved and less probable” organism, then Uf = 10^-3 Ui.
Presumably the entropy of the Earth’s biosphere is indeed decreasing by a tiny amount due to evolution, and the entropy of the cosmic microwave background is increasing by an even greater amount to compensate for that decrease. But the decrease in entropy required for evolution is so small compared to the entropy throughput that would occur even if the Earth were a dead planet, or if life on Earth were not evolving, that no measurement would ever detect it.
3) Emory Bunn publishes an article in the American Journal of Physics, entitled “Evolution and the Second Law of Thermodynamics”, arguing that, when applying the Second Law of Thermodynamics to the improbability of organisms due to evolution, there is no conflict because the increase in the complexity of organisms is compensated by the increase in entropy of the cosmic microwave background. His estimate of the improbability of life due to evolution is “more generous” than Styer’s. To quote the article itself:
We now consider (dS/dt)_life… far from being generous, a probability ratio of Ui/Uf = 10^-3 is probably much too low. One of the central ideas of statistical mechanics is that even tiny changes in a macroscopic object (say, one as large as a cell) result in exponentially large changes in the multiplicity (that is, the number of accessible microstates). I will illustrate this idea by some order of magnitude estimates. First, let us address the precise meaning of the phrase “due to evolution.” If a child grows up to be slightly larger than her mother due to improved nutrition, we do not describe this change as due to evolution, and thus we might not count the associated multiplicity reduction in the factor Ui/Uf. Instead we might count only changes such as the turning on of a new gene as being due to evolution. However, this narrow view would be incorrect. For this argument we should do our accounting in such a way that all biological changes are included. Even if a change like the increased size of an organism is not the direct result of evolution for this organism in this particular generation, it is still ultimately due to evolution in the broad sense that all life is due to evolution. All of the extra proteins, DNA molecules, and other complex structures that are present in the child are there because of evolution at some point in the past if not in the present, and they should be accounted for in our calculation… We conclude that the entropy reduction required for life on Earth is far less than |dS_life| ~ 10^44 k… the second law of thermodynamics is safe.
4) Bob Lloyd publishes a viewpoint in the Mathematical Intelligencer backing Styer and Bunn, arguing that, when applying the Second Law of Thermodynamics to the improbability of organisms due to evolution, there is no conflict because the increase in the organism complexity is compensated by the increase in entropy of the cosmic microwave background. To quote the article itself:
The qualitative point associated with the solar input to Earth, which was dismissed so casually in the abstract of the AML paper, and the quantitative formulations of this by Styer and Bunn, stand, and are unchallenged by Sewell’s work.
------------------------------------------------------ More quotations from textbooks and articles that apply the general form of the Second Law: From University Physics by Young and Freedman, in the Chapter “The Second Law of Thermodynamics”:
There is a relationship between the direction of a process and the disorder or randomness of the resulting state. For example, imagine a tedious sorting job, such as alphabetizing a thousand book titles written on file cards. Throw the alphabetized stack of cards into the air. Do they come down in alphabetical order? No, their tendency is to come down in a random or disordered state. In the free expansion of a gas, the air is more disordered after it has expanded into the entire box than when it was confined in one side, just as your clothes are more disordered when scattered all over your floor than when confined to your closet.
From a different edition of University Physics, in a section about "building physical intuition" about the Second Law:
A new deck of playing cards is sorted out by suit (hearts, diamonds, clubs, spades) and by number. Shuffling a deck of cards increases its disorder into a random arrangement. Shuffling a deck of cards back into its original order is highly unlikely.
From Basic Physics by Kenneth Ford:
Imagine a motion picture of any scene of ordinary life run backward. You might watch...a pair of mangled automobiles undergoing instantaneous repair as they back apart. Or a dead rabbit rising to scamper backward into the woods as a crushed bullet re-forms and flies backward into a rifle while some gunpowder is miraculously manufactured out of hot gas. Or something as simple as a cup of coffee on a table gradually becoming warmer as it draws heat from its cooler surroundings. All of these backward-in-time views and a myriad more that you can quickly think of are ludicrous and impossible for one reason only - they violate the second law of thermodynamics. In the actual scene of events, entropy is increasing. In the time reversed view, entropy is decreasing.
From General Chemistry, 5th Edition, by Whitten, Davis, and Peck:
The Second Law of Thermodynamics is based on our experiences. Some examples illustrate this law in the macroscopic world. When a mirror is dropped, it can shatter...The reverse of any spontaneous change is nonspontaneous, because if it did occur, the universe would tend toward a state of greater order. This is contrary to our experience. We would be very surprised if we dropped some pieces of silvered glass on the floor and a mirror spontaneously assembled… The ideas of entropy, order, and disorder are related to probability.
From Isaac Asimov in "In the game of energy and thermodynamics, you can't even break even":
We have to work hard to straighten a room, but left to itself, it becomes a mess again very quickly and very easily.... How difficult to maintain houses, and machinery, and our own bodies in perfect working order; how easy to let them deteriorate. In fact, all we have to do is nothing, and everything deteriorates, collapses, breaks down, wears out — all by itself — and that is what the second law is all about.
CS3
July 8, 2013, 07:02 PM PDT
Obviously this Ford guy and his text are wrong. :roll:

Joe
July 8, 2013, 05:48 PM PDT
And despite such an overwhelming rate of detrimental mutations, I have yet to find even a single unambiguously beneficial mutation in humans that did not come at a cost of compromising some other molecular function.
Human Genome in Meltdown - January 11, 2013 Excerpt: According to a study published Jan. 10 in Nature by geneticists from 4 universities including Harvard, “Analysis of 6,515 exomes reveals the recent origin of most human protein-coding variants.”,,,: "We estimate that approximately 73% of all protein-coding SNVs [single-nucleotide variants] and approximately 86% of SNVs predicted to be deleterious arose in the past 5,000 -10,000 years. The average age of deleterious SNVs varied significantly across molecular pathways, and disease genes contained a significantly higher proportion of recently arisen deleterious SNVs than other genes.",,, As for advantageous mutations, they provided NO examples,,, http://crev.info/2013/01/human-genome-in-meltdown/
In fact, the loss of morphological traits over time, for all organisms found in the fossil record, was/is so consistent that it was made into a 'scientific law':
Dollo's law and the death and resurrection of genes: Excerpt: "As the history of animal life was traced in the fossil record during the 19th century, it was observed that once an anatomical feature was lost in the course of evolution it never staged a return. This observation became canonized as Dollo's law, after its propounder, and is taken as a general statement that evolution is irreversible." http://www.pnas.org/content/91/25/12283.full.pdf+html And this extends to the molecular level as well,, Dollo’s law, the symmetry of time, and the edge of evolution - Michael Behe Excerpt: We predict that future investigations, like ours, will support a molecular version of Dollo's law:,,, Dr. Behe comments on the finding of the study, "The old, organismal, time-asymmetric Dollo’s law supposedly blocked off just the past to Darwinian processes, for arbitrary reasons. A Dollo’s law in the molecular sense of Bridgham et al (2009), however, is time-symmetric. A time-symmetric law will substantially block both the past and the future. http://www.evolutionnews.org/2009/10/dollos_law_the_symmetry_of_tim.html
As well, Dr. Sanford has noted that this process of decay is to be expected down to the molecular level, in his book 'Genetic Entropy':
Dr. John Sanford "Genetic Entropy and the Mystery of the Genome" 1/2 - video http://www.youtube.com/watch?v=pJ-4umGkgos
Thus Darwinists can play word games all they want, hoping to deceive others, but the plain, 'common sense' fact is that what we see happening all around us at the macro-level, of things growing old and decaying (and dying), holds true at the molecular level as well. There is no empirical evidence that Darwinists can appeal to which shows that this process of decay does not hold for life as well:
further notes on extensive repair mechanisms in DNA https://uncommondescent.com/evolution/the-hole-of-the-slot/#comment-462076
Verse and music:
Psalm 102:25-27 Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end. Johnny Cash - Ain't No Grave [Official HD] - The Johnny Cash Project - song starts around 2:50 minute mark http://www.youtube.com/watch?v=WwNVlNt9iDk
bornagain77
July 8, 2013 at 4:50 PM PDT
Despite what our Darwinian detractors would prefer people to believe, there is good reason that Dr. Sewell calls the SLOT 'the common sense law of physics':
The common sense law of physics - Granville Sewell - July 2010 Excerpt: Yesterday I spoke with my wife about these questions. She immediately grasped that chaos results on the long term when she would stop caring for her home. https://uncommondescent.com/intelligent-design/the-common-sense-law-of-physics/
Everything around us tends towards disorder and decay. Everyone can see this happening; it is everywhere. If something is new, it will get old. If something is born, it will grow old and die. PERIOD. That is what makes it such an extraordinary claim on the Darwinists' part that material processes, acting without intelligent input, can organize themselves into microbes containing the equivalent of 10^12 bits of information, in complete contradiction to what we know from this 'common sense law of physics':
“a one-celled bacterium, e. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information in science jargon, this would be the same as 10^12 bits of information. In comparison, the total writings from classical Greek Civilization is only 10^9 bits, and the largest libraries in the world – The British Museum, Oxford Bodleian Library, New York Public Library, Harvard Widener Library, and the Moscow Lenin Library – have about 10 million volumes or 10^12 bits.” – R. C. Wysong http://books.google.com/books?id=yNev8Y-xN8YC&pg=PA112&lpg=PA112

Molecular Biophysics – Information theory. Relation between information and entropy: - Setlow-Pollard, Ed. Addison Wesley Excerpt: Linschitz gave the figure 9.3 x 10^12 cal/deg or 9.3 x 10^12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz' deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures. http://www.astroscu.unam.mx/~angel/tsb/molecular.htm
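A quick sanity check of the Setlow-Pollard arithmetic (a minimal Python sketch, added for illustration; note that the stated 4 x 10^12 bits follows only if the quoted entropy figure is read as 9.3 x 10^-12 cal/deg for a single cell, so the exponent's minus sign appears to have been lost in transcription):

    import math

    k = 1.380649e-23            # Boltzmann constant, J/K
    S = 9.3e-12 * 4.184         # entropy of one bacterial cell in J/K (from cal/deg)
    H = S / (k * math.log(2))   # Setlow-Pollard relation: H = S / (k ln 2)
    print(f"H = {H:.1e} bits")  # ~4.1e12 bits, matching the quoted 4 x 10^12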
Well, perhaps common sense is not enough and the atheists have something to back their claim up? I mean, weird things are discovered in science all the time, right? That is not the case here. Sidestepping the ludicrous compensation (open system) argument to look at the empirical evidence itself, we find deep concordance with what common sense tells us should be the case. OOL research is completely blocked in on every side by the SLOT's relentless grip, and if we look at life itself we find that the tendency of things to decay (SLOT) is in full force all the way down to the molecular level:
“The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain – Michael Behe – December 2010 Excerpt: In its most recent issue The Quarterly Review of Biology has published a review by myself of laboratory evolution experiments of microbes going back four decades.,,, The gist of the paper is that so far the overwhelming number of adaptive (that is, helpful) mutations seen in laboratory evolution experiments are either loss or modification of function. Of course we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent.,,, I dub it “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain. http://behe.uncommondescent.com/2010/12/the-first-rule-of-adaptive-evolution/

List Of Degraded Molecular Abilities Of Antibiotic Resistant Bacteria: Excerpt: Resistance to antibiotics and other antimicrobials is often claimed to be a clear demonstration of “evolution in a Petri dish.” ,,, all known examples of antibiotic resistance via mutation are inconsistent with the genetic requirements of evolution. These mutations result in the loss of pre-existing cellular systems/activities, such as porins and other transport systems, regulatory systems, enzyme activity, and protein binding. http://www.trueorigin.org/bacteria01.asp

Mutations: when benefits level off – June 2011 – (Lenski’s e-coli after 50,000 generations) Excerpt: After having identified the first five beneficial mutations combined successively and spontaneously in the bacterial population, the scientists generated, from the ancestral bacterial strain, 32 mutant strains exhibiting all of the possible combinations of each of these five mutations. They then noted that the benefit linked to the simultaneous presence of five mutations was less than the sum of the individual benefits conferred by each mutation individually. http://www2.cnrs.fr/en/1867.htm?theme1=7
In fact, the detrimental nature of mutations in humans is overwhelming: scientists have already cataloged over 100,000 mutational disorders.
Inside the Human Genome: A Case for Non-Intelligent Design - Pg. 57 By John C. Avise Excerpt: "Another compilation of gene lesions responsible for inherited diseases is the web-based Human Gene Mutation Database (HGMD). Recent versions of HGMD describe more than 75,000 different disease causing mutations identified to date in Homo-sapiens."
I went to the mutation database website cited by John Avise and found:
HGMD®: Now celebrating our 100,000 mutation milestone! http://www.hgmd.org/
I really question their use of the word 'celebrating'. (Of note, apparently someone with a sense of decency has now removed the word 'celebrating'.)
bornagain77
July 8, 2013 at 4:50 PM PDT
Elizabeth, another quote from Ford's text:
Heat flow is so central to most applications of thermodynamics that the second law is sometimes stated in this "restricted" form: (4) heat never flows spontaneously from a cooler to a hotter body. Notice that this is a statement about macroscopic behavior, whereas the more general and fundamental statements of the second law, which make use of the ideas of probability and disorder, refer to the submicroscopic structure of matter.
This statement (4) is basically equivalent to my (1) (thermal entropy cannot decrease in an isolated system) and to several other statements, but not to the two more general statements. I am well aware that the idea that "entropy" is a single quantity which measures, in units of thermal entropy, all types of disorder is widespread, and promoted for the same reason that you promote it; refuting that idea is one of the main points of my BioComplexity article. It is patently false. There are applications of the second law which are quantifiable, such as the diffusion of chromium given in my scenario (B), which obviously have nothing to do with heat or energy, just with probability, and many others, not easily quantifiable, which also have nothing to do with heat or energy. When Asimov talked about the entropy decrease associated with the development of the human brain, he was obviously NOT talking about any change in thermal entropy, and many texts give other examples of "entropy" increases which obviously have nothing to do with thermal entropy, such as those I mentioned earlier.
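The chromium example can be made quantitative. A minimal sketch, assuming a Boltzmann-style functional for the "chromium entropy" (the particular choice below is an editorial illustration, not necessarily the one used in the BioComplexity article): let $C(\mathbf{x},t)$ be the chromium concentration in an isolated solid, obeying the diffusion equation. Then

$$\frac{\partial C}{\partial t} = D\,\nabla^2 C, \qquad S_X = -\int_V C \ln C \, dV,$$

and integrating by parts with no flux through the boundary gives

$$\frac{dS_X}{dt} = D \int_V \frac{|\nabla C|^2}{C}\, dV \;\ge\; 0,$$

a monotone increase that involves only the probability of arrangements of chromium atoms; no heat or energy term appears anywhere.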
Granville Sewell
July 8, 2013 at 4:26 PM PDT
I don't think that this comment will get me any friends, but I think that the entropy Dr. Sewell is talking about is an example of something true and self-evident but maybe not yet quantifiable by science. It's like the difference between red and green. There was a time when we could not measure the wavelength of light; we only knew that red and green were different because we observed it to be so. We generally observe things to go from more ordered to less ordered, so evolution is a surprising hypothesis. There should be something special explaining why life can go from less ordered to more ordered and not violate this principle. Some argue that the high energy of the sun can explain the increase in order. But then, why doesn't the sun create life on the Moon or Venus? Or create computers or some other complex thing? It seems that those who profess evolutionism must show, with convincing evidence, that the sun can accomplish this feat. The burden shouldn't be on Granville to disprove it.
Collin
July 8, 2013 at 4:03 PM PDT
Granville:
Statement (1) is “in an isolated system, thermal entropy cannot decrease.” (3) is “in an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability” How could these two possibly be equivalent??? (3) is obviously more general than (1), there are other statements that are equivalent to (1), but (2) and (3) are not. And the author of this textbook obviously understands that (2) and (3) have more general applications than to energy, as he applies them to things like autos colliding and dead rabbits decaying. And Asimov even applies them to order in a house, for example, and he was hardly a creationist. Nice try.
Granville: there is ONE 2nd Law of thermodynamics. If several statements of the law appear to differ, then at least one of those statements is inadequate and does not belong in a textbook, or you are using an unintended definition of one of the words. You are a mathematician, Granville: you know that equations can be stated in many different ways. But if they are the same equation, then those statements are equivalent. You cannot say: oh, but here's another version of the same equation, and by this version of it, X is possible, but by that version of it, X is not. Here are your three versions:

1. In an isolated system, thermal entropy cannot decrease.
2. In an isolated system, the direction of spontaneous change is from order to disorder.
3. In an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability.

Let's do this semi-formally. Each statement begins with: [In an isolated system]. Statements 2 and 3 then have the phrase: [the direction of spontaneous change is from]. 2 has: [order] to [disorder]. 3 has: [an arrangement of lesser probability] to [an arrangement of greater probability]. Therefore, if the statements are equivalent: order = an arrangement of lesser probability, and disorder = an arrangement of greater probability. Turning to statement 1, then [thermal entropy cannot decrease] = [the direction of spontaneous change is from order to disorder] = [the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability]. Therefore [thermal entropy] = [probability of an arrangement] = [orderedness].

We cannot therefore interpret "probability of an arrangement" or "orderedness" to mean anything other than "thermal entropy", or the statements would not be equivalent statements of the 2nd Law of thermodynamics. There could well be (and is) another kind of entropy, e.g. Shannon entropy, which has a similar definition, but the 2nd Law of thermodynamics is not about Shannon entropy, and there is no law that says that Shannon entropy cannot increase in an isolated system, because that wouldn't have any meaning. You might say that Shannon entropy cannot increase without intelligence, for example, and Dembski's Law of Conservation of Information might boil down to that, I don't know, but it isn't the 2nd Law of thermodynamics.

Now, a tidy room could well have less thermal entropy than an untidy room, with things clustered on shelves, with the capacity to smash on the floor, and thus do work. But not necessarily. A nice neatly made bed, for instance, with all the sheets smoothed to their lowest potential energy level, probably has greater entropy than a messed-up bed with the capacity for the pillow to slide onto the floor. Similarly, a sugar molecule has more thermal entropy than the carbon dioxide and water molecules that existed before the plant converted them to sugar through photosynthesis. Indeed, when we do work (whether housework or even having a wild party), we might find we tend to reduce entropy in our surroundings by leaving them in a configuration less likely than that which would "spontaneously" occur in the absence of us doing work.

So your examples mostly are of changes in the arrangement of the energy states of the elements of an isolated (or non-isolated) system. And nobody is suggesting (not even you, I don't think) that the entropy changes implied by evolution are not changes in the energetic configuration of a system.
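For reference, the standard identification underlying this equivalence claim is Boltzmann's relation: with $\Omega$ the number of accessible microstates of an arrangement,

$$S = k \ln \Omega, \qquad \Delta S \ge 0 \iff \Omega_{\text{final}} \ge \Omega_{\text{initial}},$$

so "entropy cannot decrease" and "spontaneous change runs toward arrangements of greater probability" coincide exactly when the "probability" in statement (3) is read as the multiplicity $\Omega$ counted over the system's accessible microstates.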
Metabolism is fundamental to living things - without a metabolism, living things are not alive - and their metabolism is what makes them need energy resources (food) in order to survive and breed. So of course evolution is all about thermal entropy - biology is all about how living creatures reduce their thermal entropy at the cost of increased entropy in their "exhaust" (to put it politely). But they are, of course, not isolated systems, so nothing they do requires any violation of the 2nd Law, whether it is feed, breed-with-variance, or die; and if they feed, breed-with-variance and die, natural selection is going to occur. And if you don't believe me that all statements of the 2nd Law of thermodynamics must be equivalent to be statements of the 2nd Law, try this. :D
Elizabeth B Liddle
July 8, 2013 at 3:35 PM PDT
Elizabeth,
If evolution doesn’t violate the first statement, then it doesn’t violate any, because all the statements are equivalent
Statement (1) is "in an isolated system, thermal entropy cannot decrease." (3) is "in an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability" How could these two possibly be equivalent??? (3) is obviously more general than (1), there are other statements that are equivalent to (1), but (2) and (3) are not. And the author of this textbook obviously understands that (2) and (3) have more general applications than to energy, as he applies them to things like autos colliding and dead rabbits decaying. And Asimov even applies them to order in a house, for example, and he was hardly a creationist. Nice try.
Granville Sewell
July 8, 2013 at 2:07 PM PDT
Fixed the misspelled name in the title and abstract. It was correct elsewhere.

The problem several people have with "entropy" is that they confuse it with a substance. For statistical physics, it is a shorthand for discussing the number of states of the system, while for thermodynamicists, it is related to both energy and temperature. The only place, and I stress it again, that statistics and thermal physics overlap is when we are discussing atomic gases. Then we can use Boltzmann's equation. Everywhere else, we can't convert statistics into thermal properties, and probably not even one statistic into another. For example, one could convert a computer code into 1's and 0's and measure its statistical entropy, but if I use that computer code to, say, sort all the books in the library into alphabetical order, does that produce a constant that relates the entropy of "computer code" to the entropy of "library catalogues"? I don't think so. One needs a "converter", and the efficiency of a converter depends on design, not physics.

Why is this important? Well, the theorems for conservation of entropy all come out of thermodynamics. What Boltzmann did was show how this could be converted to statistical mechanics. Thus the peculiar "ordering" of atoms has all the same conservation properties as the thermal physics of collections of atoms. This was the power of the equation. We may not have a conversion constant for other forms of ordering, but the existence of these constants for noble gas atom ordering strongly suggests that other forms of ordering are also conserved. This provided a beachhead into the sorts of ordering that Granville refers to, and we can fruitfully discuss the conservation of "Entropy-X", even if we don't know how to calculate it.

I tried to describe how far physicists had gone in calculating the entropy of complex systems, but in another sense this is a red herring. That is, we almost never use the entropy in a calculation, only the change in entropy. And in complicated systems, the change in entropy is a path-dependent function. Or to say it another way, it is dS/dx that is important, not S itself, or perhaps integral[(dS/dx) dx]. So for example, when a nitroglycerine molecule dissociates, the reordering of the chemical bonds releases energy, and the process is driven by the increase in entropy of the gas products over the molecular precursor. It is the local dS/dx that drives the reaction so very quickly.

By analogy then, Granville doesn't have to calculate the entropy of his jet airliner, simply the gradients in the entropy from junkyard to airliner. Likewise, life has enormous gradients, both spatially and temporally, which should determine the direction of the reaction, but don't, because the exact opposite is observed. And yes, this violation of entropy gradients is a direct violation of the 2nd Law of Thermodynamics. There may be ways to get around this with long-range forces and correlations. But then, all of statistical mechanics presupposes that there are no long-range correlations, so more than thermodynamics is lost if we invoke long-range forces.
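The one clean overlap described above, where counting states reproduces the measured thermodynamic entropy, can be made concrete with the Sackur-Tetrode formula for a monatomic ideal gas. A minimal Python sketch (helium at roughly standard conditions, chosen as an illustration):

    import math

    k = 1.380649e-23             # Boltzmann constant, J/K
    h = 6.62607015e-34           # Planck constant, J s
    N = 6.02214076e23            # atoms in one mole
    m = 4.0026 * 1.66053907e-27  # mass of a helium atom, kg
    T = 298.15                   # temperature, K
    V = 0.0244                   # molar volume at ~1 atm, m^3

    # The thermal de Broglie wavelength sets the size of a "cell" in state counting.
    lam = h / math.sqrt(2 * math.pi * m * k * T)

    # Sackur-Tetrode: S = N k [ ln( V / (N lam^3) ) + 5/2 ]
    S = N * k * (math.log(V / (N * lam**3)) + 2.5)
    print(f"S = {S:.0f} J/K per mole")  # ~126 J/(K mol); the measured value is ~126.2

For diatomic and polyatomic gases, rotational and vibrational state counts have to be bolted on, and the clean correspondence already begins to erode, which is the point made above.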
Robert Sheldon
July 8, 2013 at 1:49 PM PDT
At the 21:00 minute mark of the following video, Dr Suarez explains why photosynthesis needs a ‘non-local’, beyond space and time, cause to explain its effect:
Nonlocality of Photosynthesis – Antoine Suarez – video – 2012 http://www.youtube.com/watch?v=dhMrrmlTXl4&feature=player_detailpage#t=1268s
Now as a Theist, I, of course, have a 'non-local', beyond space and time, cause to appeal to in order to explain how the 'non-local' coherence of photosynthesis is possible. Verse and Music:
1 John 1:5 This is the message we have heard from him and proclaim to you, that God is light, and in him is no darkness at all. Toby Mac (In The Light) – music video http://www.youtube.com/watch?v=5_MpGRQRrP0
,,,whereas the atheists have only crickets chirping when it comes to any coherent explanation of how 'non-local' photosynthesis is even possible in the first place:
Cricket Chirping http://www.youtube.com/watch?v=CQFEY9RIRJA
bornagain77
July 8, 2013 at 1:46 PM PDT
Some of the 'coincidences' of photosynthesis reach all the way down to foundational physics and chemistry (i.e. the universe was 'set up' for photosynthesis) and are just plain 'spooky' to behold, as Dr. Michael Denton briefly elaborates here:
Michael Denton: Remarkable Coincidences in Photosynthesis – podcast http://www.idthefuture.com/2012/09/michael_denton_remarkable_coin.html
In fact, there is an irreducibly complex molecular machine at the heart of photosynthesis:
The ATP Synthase Enzyme – exquisite motor necessary for first life – video http://www.youtube.com/watch?v=XI8m6o0gXDY

ATP Synthase, an Energy-Generating Rotary Motor Engine – Jonathan M. May 15, 2013 Excerpt: ATP synthase has been described as “a splendid molecular machine,” and “one of the most beautiful” of “all enzymes” .,, “bona fide rotary dynamo machine”,,, If such a unique and brilliantly engineered nanomachine bears such a strong resemblance to the engineering of manmade hydroelectric generators, and yet so impressively outperforms the best human technology in terms of speed and efficiency, one is led unsurprisingly to the conclusion that such a machine itself is best explained by intelligent design. http://www.evolutionnews.org/2013/05/atp_synthase_an_1072101.html

Thermodynamic efficiency and mechanochemical coupling of F1-ATPase – 2011 Excerpt: F1-ATPase is a nanosized biological energy transducer working as part of FoF1-ATP synthase. Its rotary machinery transduces energy between chemical free energy and mechanical work and plays a central role in the cellular energy transduction by synthesizing most ATP in virtually all organisms.,, Our results suggested a 100% free-energy transduction efficiency and a tight mechanochemical coupling of F1-ATPase. http://www.pnas.org/content/early/2011/10/12/1106787108.short?rss=1

The 10 Step Glycolysis Pathway In ATP Production: An Overview - video http://www.youtube.com/watch?v=8Kn6BVGqKd8

At the 6:00 minute mark of the following video, Chris Ashcraft, PhD – molecular biology, gives us an overview of the Citric Acid Cycle, which is, after the 10 step Glycolysis Pathway, also involved in ATP production: Evolution vs ATP Synthase - Molecular Machine - video http://www.metacafe.com/watch/4012706

Glycolysis and the Citric Acid Cycle: The Control of Proteins and Pathways - Cornelius Hunter - July 2011 http://darwins-god.blogspot.com/2011/07/glycolysis-and-citric-acid-cycle.html
Yet photosynthesis presents a far, far more difficult challenge to Darwinists than just explaining how all these extremely complex mechanisms for converting raw energy into useful ATP energy ‘just so happened’ to ‘randomly’ come about so as to enable higher life to be possible in the first place. In what I find to be a very fascinating discovery, it turns out that photosynthetic life, which is an absolutely vital link that all higher life on earth depends on for food, uses ‘non-local’, beyond space and time, quantum mechanical principles to accomplish photosynthesis:
Quantum Mechanics at Work in Photosynthesis: Algae Familiar With These Processes for Nearly Two Billion Years – Feb. 2010 Excerpt: “We were astonished to find clear evidence of long-lived quantum mechanical states involved in moving the energy. Our result suggests that the energy of absorbed light resides in two places at once — a quantum superposition state, or coherence — and such a state lies at the heart of quantum mechanical theory.”,,, “It suggests that algae knew about quantum mechanics nearly two billion years before humans,” says Scholes. http://www.sciencedaily.com/releases/2010/02/100203131356.htm
bornagain77
July 8, 2013 at 1:45 PM PDT
As pointed out before, I liked this comment from Dr. Sheldon in the OP:
'Let’s look at this a bit closer and apply it to the Earth. The Sun is at 5500K, and therefore its energy comes principally in the form of yellow photons. The Earth global temperature averages out to about 300K, so it emits infrared photons. In thermodynamic equilibrium, the energy entering has to balance the energy leaving (or the Earth would heat up!). Since the entropy of the photons hitting the Earth have almost twenty times less (entropy) than the entropy of the photons leaving the Earth, the Earth must be a source for entropy. Ergo, sunlight should be randomizing every molecule in the Earth and preventing life from happening.'
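The "almost twenty times" factor can be checked in one line, assuming blackbody radiation on both sides: the entropy carried per unit energy scales as 1/T, and the exact prefactor (4/3) cancels in the ratio.

    T_sun, T_earth = 5500.0, 300.0   # K: solar photosphere vs. mean terrestrial emission
    print(T_sun / T_earth)           # ~18.3: each joule radiated by the Earth carries
                                     # roughly 18x the entropy of a joule arriving from the Sun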
The reason I liked Dr. Sheldon's statement on light coming from the sun is that it agrees with some of the empirical evidence I've been recently gathering from another angle. I’ve always found the compensation (open system) argument from atheists to be a very disingenuous argument on their part, since the second law was formulated right here on earth, an open system, in the first place! Moreover, most of the electromagnetic emission coming from the sun is either harmful or useless for life. Yet the harmful and useless energy coming from the sun is precisely the portion that is most constrained from reaching the earth:
Fine Tuning Of Light to the Atmosphere, to Biological Life, and to Water – graphs http://docs.google.com/Doc?docid=0AYmaSrBPNEmGZGM4ejY3d3pfMTljaGh4MmdnOQ

Fine Tuning Of Universal Constants, Particularly Light – Walter Bradley – video http://www.metacafe.com/watch/4491552
In fact, the following video, at the 5:00 minute mark, reveals that these specific frequencies of light (that enable plants to manufacture food and astronomers to observe the cosmos) represent less than 1 trillionth of a trillionth (10^-24) of the universe’s entire range of electromagnetic emissions.
Privileged Planet - The Extreme Fine Tuning of Light for Life and Scientific Discovery – video http://www.metacafe.com/w/7715887
A very interesting side note to all this is that DNA is optimized to prevent damage from light:
DNA Optimized for Photostability Excerpt: These nucleobases maximally absorb UV-radiation at the same wavelengths that are most effectively shielded by ozone. Moreover, the chemical structures of the nucleobases of DNA allow the UV-radiation to be efficiently radiated away after it has been absorbed, restricting the opportunity for damage. http://www.reasons.org/dna-soaks-suns-rays
i.e. If radiation from the sun were really driving the decrease in entropy of life on earth, then why in blue blazes is optimized photostability present in DNA to prevent the incoming energy from the sun from having any effect on DNA??? It is yet another sheer disconnect between what the empirical evidence actually says and what Darwinists claim for reality. A disconnect that they will never really honestly address. But to move on past my disgust for Darwinists: even though the energy coming from the sun is very constrained by the atmosphere (and by the magnetic field, etc.?) in such a way as to 'just so happen' to only allow the useful part of light through to the surface of the earth, in the following video Dr. Thomas Kindell points out that even that tiny sliver of 1 in 10^24 (a trillionth of a trillionth) of raw energy coming from the sun, that is allowed to reach the earth, is still destructive in its effect on the earth and must be channeled into useful energy (ATP) by photosynthesis:
Evolution Vs. Thermodynamics – Open System Refutation – Thomas Kindell – video http://www.metacafe.com/watch/4143014
And indeed, very much contrary to evolutionary expectations, we now have evidence for complex photosynthetic life appearing suddenly, as soon as water appeared, in the oldest sedimentary rocks ever found on earth:
The Sudden Appearance Of Photosynthetic Life On Earth – video http://www.metacafe.com/watch/4262918

U-rich Archaean sea-floor sediments from Greenland – indications of +3700 Ma oxygenic photosynthesis (2003) http://adsabs.harvard.edu/abs/2004E&PSL.217..237R
,,,yet photosynthesis is a very, very complex process which is certainly not conducive to any easy materialistic explanation:
“There is no question about photosynthesis being Irreducibly Complex. But it’s worse than that from an evolutionary perspective. There are 17 enzymes alone involved in the synthesis of chlorophyll. Are we to believe that all intermediates had selective value? Not when some of them form triplet states that have the same effect as free radicals like O2. In addition if chlorophyll evolved before antenna proteins, whose function is to bind chlorophyll, then chlorophyll would be toxic to cells. Yet the binding function explains the selective value of antenna proteins. Why would such proteins evolve prior to chlorophyll? and if they did not, how would cells survive chlorophyll until they did?” – Uncommon Descent blogger

Evolutionary biology: Out of thin air – John F. Allen & William Martin: The measure of the problem is here: “Oxygenetic photosynthesis involves about 100 proteins that are highly ordered within the photosynthetic membranes of the cell.” http://www.nature.com/nature/journal/v445/n7128/full/445610a.html

The Miracle Of Photosynthesis – electron transport – video http://www.youtube.com/watch?v=hj_WKgnL6MI
bornagain77
July 8, 2013 at 1:44 PM PDT
Granville: thank you for your response.
The approach many scientists take toward ID is to “define” science so that it excludes ID, and then declare “ID is not science” so they don’t have to deal with the issue of whether or not ID is true.
That is not my position. I do think that some (many) papers on ID are not valid science (do not draw justified conclusions from their data, or make flawed arguments), but I see no intrinsic reason why the theory that life was created by a designer is not a perfectly good topic for scientific investigation, even if the designer is postulated to have properties not possessed by other known entities.
As CS3 pointed out on another thread, what you are trying to do here is very similar. In my Bio-Complexity article, I quoted three common statements of the second law, taken from a typical general physics text: the first is apparently the only one you accept as valid, and I acknowledged that evolution has little to do with this statement, it is the more general statements (2) and (3) that are relevant. You are determined to limit the second law so that, by definition, there is no conflict with evolution
No, this is not the right reading of my position. Firstly, those three statements of the 2nd Law of thermodynamics are deemed in textbooks to be equivalent, not statements of slightly different laws. If one statement is not consistent with the others, it is either too loosely worded or being interpreted too loosely. There is a single 2nd Law of thermodynamics, and although verbal formulations may vary, and even allow for ambiguities, they all mean the same thing. Moreover, I do not interpret things so that they agree with some position I wish to preserve. I am a working scientist, and would far rather prove myself wrong than kid myself that I was correct. I simply do not understand what would motivate anyone to do such a thing.
, and you want us to believe that only “creationists” apply it more generally (to things like tornados running backward), which is patently false.
I want you to believe no such thing.
So there is not much left to discuss, IF you insist that (1) is the only legitimate statement of the second law, then you can declare victory, because I admit evolution doesn’t violate this statement.
If evolution doesn't violate the first statement, then it doesn't violate any, because all the statements are equivalent. The third one you quote is rather loose, because it refers to "probability" as though it is a property of an arrangement, rather than the property of an arrangement given a generative process. Obviously very improbable things won't happen (that isn't a conclusion of the 2nd Law, but the truism that makes it self-evident). But things that are very improbable under some conditions (still air suddenly forming a vortex) are highly probable under others (a convection current generated by a sun-warmed patch of earth) that are specifically not disallowed under the 2nd Law of thermodynamics. Taking the word "probability" out of any context that gives the circumstances under which a thing is probable is to misinterpret that 3rd statement. Which is, in fact, perfectly correct, as it says that
In an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability
This is absolutely true, because it talks about an isolated system. That system can contain subsystems with arrangements that would be improbable if those subsystems were isolated, but of course they are not: they are in contact with the rest of the super-system, which can do work on them, at the cost of increased entropy in the rest of the super-system. Your counter-argument to the "compensation" story is not correct. It is perfectly true that the fact that entropy is increasing in some remote unconnected system won't make spaceships appear here, and it is also true that the fact that entropy is increasing in my cup of coffee won't mean that my dishes will wash themselves and put themselves away. But nobody is claiming that solely because there is an entropy increase somewhere, any old thing can happen somewhere nearby. What we claim is that if a system is not isolated, work can be done on that system that can result in greater order, although the system doing the work will necessarily experience increased entropy. The sun does not explain life on earth. It merely provides the potential for work to be done (via temperature gradients, for instance). The complicated part is explaining what that work was and why. But it does counter the claim that work could not have been done because of the 2nd Law.
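The bookkeeping behind this "compensation" claim is simple; a stock example (the numbers are an editorial illustration, not from the comment) is water freezing in a room held at 263 K:

$$\Delta S_{\text{sys}} \approx -\frac{6010\ \text{J/mol}}{273\ \text{K}} \approx -22.0\ \text{J/(K mol)}, \qquad \Delta S_{\text{surr}} \approx +\frac{6010\ \text{J/mol}}{263\ \text{K}} \approx +22.9\ \text{J/(K mol)},$$

so the ice becomes more ordered locally while the total, $\Delta S_{\text{sys}} + \Delta S_{\text{surr}} \approx +0.9$ J/(K mol) $> 0$, still increases. The surroundings must actually receive the latent heat, i.e. be thermally connected, exactly as stipulated above.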
But I believe laws of Nature are defined by Nature, not by man. If Isaac Newton had stated the law of gravity as “the Earth attracts apples”, you could say that technically, oranges falling to the ground has nothing to do with the law of gravity. But I would be interested in knowing if the law of gravity could really be generalized beyond apples.
I agree that laws are discovered, not invented, so at least we agree on something! I'd be very interested in your answer to my question at 9. Cheers, Lizzie
Elizabeth B Liddle
July 8, 2013 at 1:20 PM PDT
Elizabeth, The approach many scientists take toward ID is to "define" science so that it excludes ID, and then declare "ID is not science" so they don't have to deal with the issue of whether or not ID is true. As CS3 pointed out on another thread, what you are trying to do here is very similar. In my Bio-Complexity article, I quoted three common statements of the second law, taken from a typical general physics text: the first is apparently the only one you accept as valid, and I acknowledged that evolution has little to do with this statement, it is the more general statements (2) and (3) that are relevant. You are determined to limit the second law so that, by definition, there is no conflict with evolution, and you want us to believe that only "creationists" apply it more generally (to things like tornados running backward), which is patently false. So there is not much left to discuss, IF you insist that (1) is the only legitimate statement of the second law, then you can declare victory, because I admit evolution doesn't violate this statement. But I believe laws of Nature are defined by Nature, not by man. If Isaac Newton had stated the law of gravity as "the Earth attracts apples", you could say that technically, oranges falling to the ground has nothing to do with the law of gravity. But I would be interested in knowing if the law of gravity could really be generalized beyond apples.
Granville Sewell
July 8, 2013 at 12:48 PM PDT
bornagain77: You of course disagree with me profoundly, and I with you (mostly), but I need to make it clear to you: I am not dishonest. Fallible, sure, but we all are. Grumpy, occasionally acid, sure. But I never intend to deceive anyone, and I am as prepared as anyone to follow the evidence and argument where it leads. I don't expect to change your opinion of me, but "for record" as kairosfocus would say, I state here plainly: I never deliberately say things that I do not believe to be true. Apart from anything else - what would be the point?
Elizabeth B Liddle
July 8, 2013 at 12:10 PM PDT
Bornagain, I liked "irrevelant" better. Fits with the Alice in Wonderland idea. :) But I do kind of think that Dr. Sewell should address her concerns if they are reasonable, because I find them interesting.
Collin
July 8, 2013 at 12:06 PM PDT
Keiths, I also find it a little cringe-inducing when the accolades start being handed out. I don't want hero-worship to be encouraged in ID. Only critical thinking and mutual respect.
Collin
July 8, 2013 at 12:00 PM PDT
Oh but Mr. Fox, don't be so jealous; you are a very close second in my book for people who could care less about truth, and do their damnedest to obfuscate it! But Elizabeth just has you beat on style. Maybe if you put more smiley faces on your deceptions!
bornagain77
July 8, 2013 at 11:55 AM PDT
Phil "Mr-pots-and-kettles" Cunningham has the bare-faced cheek to complain of someone else of "irrelevant rabbit holes she dreams up to obfuscate the matter." Irony meters must be smoking everywhere!Alan Fox
Alan Fox
July 8, 2013 at 11:51 AM PDT
pardon: "irrelevant rabbit holes",,bornagain77
bornagain77
July 8, 2013 at 11:49 AM PDT
Dr. Sewell, I suggest you ignore Elizabeth's posts since, as far as I can tell, she has absolutely no intention of ever being honest with the evidence. Hundreds of thousands of words have been wasted on the 'move the goalpost' tactics she employs, and in my honest opinion your time would be well spent elsewhere than chasing her down whatever irrevelant rabbit holes she dreams up to obfuscate the matter.
bornagain77
July 8, 2013 at 11:46 AM PDT
I have one more question?
Don't tell me you are a Kylie fan, too!
Alan Fox
July 8, 2013 at 11:45 AM PDT