Uncommon Descent Serving The Intelligent Design Community

The illusion of organizing energy

Categories: Biophysics, Intelligent Design

The 2nd law, in its statistical-thermodynamics formulation, states that in a closed system any natural transformation moves towards the more probable states. Organized states are the more improbable ones, so transformations spontaneously tend towards non-organization, so to speak. Since evolution would be spontaneous organization, evolution conflicts with the 2nd law.
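
To see what "more probable states" means concretely, here is a toy Python sketch (an illustrative example, with an arbitrary system size of 100 two-state particles): the perfectly ordered macrostate corresponds to a single microstate, while the mixed 50/50 macrostate corresponds to an enormous number of them, so random transformations overwhelmingly end up in mixed states.

```python
from math import comb

N = 100  # toy system: 100 two-state particles (e.g. coins)

total = 2 ** N                 # all possible microstates
ordered = 1                    # "all in the same state": exactly one microstate
mixed = comb(N, N // 2)        # the 50/50 macrostate: number of microstates

print(f"P(fully ordered) = {ordered / total:.1e}")   # ~7.9e-31
print(f"P(exact 50/50)   = {mixed / total:.1e}")     # ~8.0e-02
# A random shuffle is roughly 10^29 times more likely to land in the 50/50
# macrostate than in the fully ordered one.
```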

The tendency expressed in the 2nd law governs all physical phenomena and is clearly evident in everyday life, where, for example, systems that were working yesterday are broken today, while broken systems do not repair themselves and stay broken until an intelligent intervention. In short, things break down and do not self-repair; all the more reason they do not self-organize. All of this can be related to the trend of the 2nd law.

Faced with this evidence, a usual objection is that Earth is not a closed system because it receives radiant energy from the Sun, so the 2nd law doesn’t apply. Such energy — evolutionists say — would provide the organizing power for evolution. Here we will see, in very simple terms, why this is nothing but a naive illusion.

In my previous post I noted how, according to general systems theory, organization always shows two different aspects: power and control. Energy is related to the power that the system needs to work; control is related to everything that pertains to the “intelligence” of the system, which governs both energy/matter and information in the system. Notice that control even has to organize the very energy powering the system. If energy really had the organizing capability evolutionists believe, one would ask why systems theory makes such a distinction in the first place. (In philosophical terms, in a sense, the above distinction is related to the distinction between action and knowledge. Action without knowledge is only agitation and disorder. We will see below how power/energy without control is even destructive.)

Everyone knows what energy is: the capability to do work. Mechanical work is defined as a force producing a displacement. A moving object has kinetic energy due to its speed. Thermal energy is due to the disordered motions of the molecules making up matter. Electric energy is carried by a flow of charge, such as electrons in a conductor. Chemical energy is a kind of potential energy able to power chemical reactions. Radiant energy is carried by light and other electromagnetic radiation.
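
For concreteness, a couple of these textbook definitions as a minimal Python sketch (the force, distance, mass, and speed values are invented for the example):

```python
# Mechanical work: W = F * d (force times displacement along the force)
force_N = 50.0          # newtons (illustrative value)
displacement_m = 3.0    # metres  (illustrative value)
work_J = force_N * displacement_m
print(f"work done: {work_J} J")           # 150.0 J

# Kinetic energy: E_k = 1/2 * m * v^2
mass_kg = 2.0
speed_m_s = 10.0
kinetic_J = 0.5 * mass_kg * speed_m_s ** 2
print(f"kinetic energy: {kinetic_J} J")   # 100.0 J
```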

Energy can power systems, but it can never create the organized system in the first place. In short, energy is the fuel, not the engine. For example, in photosynthesis, which plants use to convert light energy into chemical energy, the light energy presupposes a photosynthetic system already in place. The light energy doesn’t create the photosynthesis system, just as photons don’t create the photovoltaic cell that outputs electric current.

In none of the definitions of “energy” is there anything that could lead us to think energy can turn probable states into improbable ones. Consequently, energy cannot change the situation with respect to the 2nd law: energy cannot create organization, which always implies highly improbable states. Indeed, the opposite holds: uncontrolled energy is, per se, destructive. Example: an abandoned building is slowly but inexorably destroyed by the natural forces of the environment over a few centuries. If we increase the energy, say with a flood, it can be destroyed in a few days. With more energy, a tornado can destroy it in minutes. Finally, with the energy of a bomb, we can destroy the building in a few seconds. The more the energy, the faster the destruction.

If we consider the physical principle of mass–energy equivalence we reach the same conclusion. Mass per se has nothing to do with real organization. Mass and matter are simply the initial support/substance on which a higher principle — intelligence/essence — must operate to obtain a final organized system.

In general, what energy can do is speed up processes/transformations. But since transformations go towards the more probable states, uncontrolled energy, far from helping evolution, could even worsen its problems, because it accelerates the trend towards non-organization. The moral is that invoking uncontrolled energy to reverse the trend of the 2nd law is counterproductive for evolutionists.
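
As a toy illustration of this speeding-up point (the particle count, site count, and hop rates below are arbitrary choices for the example), consider particles that all start on one site, an "organized" arrangement, and hop at random; giving the system more hops per time step, a crude stand-in for more energy, only makes it reach the spread-out, most probable arrangement sooner:

```python
import random

def steps_to_spread(n_particles=200, n_sites=20, hops_per_step=1):
    """All particles start on site 0; count time steps until no single site
    holds more than half of them (i.e. the 'organized' start has dissolved)."""
    positions = [0] * n_particles
    steps = 0
    while max(positions.count(s) for s in range(n_sites)) > n_particles // 2:
        for _ in range(hops_per_step):
            i = random.randrange(n_particles)
            positions[i] = (positions[i] + random.choice((-1, 1))) % n_sites
        steps += 1
    return steps

random.seed(0)
print("low-energy run :", steps_to_spread(hops_per_step=1))   # many steps
print("high-energy run:", steps_to_spread(hops_per_step=25))  # far fewer steps
```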

An objection evolutionists could raise is: energy can power and greatly speed up chemical reactions, so they can produce life. There are two problems with this objection.
(1) Chemical reactions usually go towards equilibrium, the more probable state, so they do not overturn the 2nd law at all (see the sketch after this list).
(2) In this context the naturalistic origin of life claimed by evolutionism is a non sequitur. In the hierarchy of biological organization, chemical reactions sit at the lowest level. Between this level and the final organization of organisms there are countless layers of complexity, involving increasingly higher kinds of abstraction and formalism, which are unattainable by mere chemistry.
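
Regarding point (1), a minimal sketch of a reversible toy reaction A <-> B (the rate constants and time step are invented for the example): from any starting composition the concentrations relax to the same equilibrium ratio fixed by the rate constants, which is exactly the "more probable state" behaviour described above.

```python
# Toy reversible reaction A <-> B integrated with simple Euler steps.
kf, kr = 2.0, 1.0      # forward / reverse rate constants (illustrative)
dt = 0.001             # time step, seconds

for a0, b0 in [(1.0, 0.0), (0.0, 1.0)]:        # two opposite starting mixtures
    a, b = a0, b0
    for _ in range(20_000):                     # 20 s of simulated time
        net = kf * a - kr * b                   # net conversion rate A -> B
        a -= net * dt
        b += net * dt
    print(f"start A={a0}, B={b0} -> A={a:.3f}, B={b:.3f}, B/A={b/a:.2f}")
# Both runs end near B/A = kf/kr = 2.0: the reaction settles at equilibrium
# regardless of where it starts; adding energy (faster rates) only gets it
# there sooner.
```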

A similar evolutionist objection is that in 1953 Miller and Urey conducted an experiment in which organic compounds such as amino acids formed when thermal and electrical energy was supplied to a mixture of methane, ammonia, hydrogen, and water. Again, there is no new organization here. The compounds obtained are exactly the probable transformations the system was able to produce under those circumstances. In fact, if one repeats the Miller/Urey experiment one gets the same results again. This shows that nothing improbable happens; rather, something very probable, almost certain, does. No violation of the 2nd law. Obviously, here too there is an abyss between the Miller/Urey amino acids and the organization of life, even if we consider only a single unicellular organism.

To sum up, the 2nd law, in the context of statistical thermodynamics, provides a fundamental reason why a naturalistic origin of life is impossible. Resorting to energy doesn’t solve the problem, because energy is not a source of organization; rather the reverse: uncontrolled energy can cause destruction (= non-organization). Only intelligence is a source of organization, and as such can explain the arising of life, the most organized thing in the cosmos.

Comments
Zachriel
Overall entropy increases, but parts of a system can decrease in entropy.
Irrelevant. Your "decreases in entropy" have nothing to do with organization. In my example of the "car-in-the-wild" (#164), the car's organization always decreases. Tell me where your "decreases in entropy" are that make the car's organization increase.
niwrad
March 13, 2015, 09:58 AM PDT
Eric #167,
OOL hypotheses are constrained by the second law. Any OOL hypothesis that violates the second law is a non-starter. They are also constrained by the first law. Any OOL hypothesis that violates the conservation of energy is a non-starter. They are also constrained by the law of conservation of charge. Any proposed OOL hypothesis that doesn't conserve charge is a non-starter. And so on, for the other laws of physics.
Is the law of conservation of charge a problem for OOL? No, because no one is proposing OOL hypotheses that violate the law of conservation of charge. Does the law of conservation of charge have to be "held at bay" in order for OOL to be viable? No. It remains in full force, like all the other laws of physics.
Is the first law a problem for OOL? No, because no one is proposing OOL hypotheses that violate the conservation of energy. Does the first law have to be "held at bay" in order for OOL to be viable? No. It remains in full force, all the time.
Is the second law a problem for OOL? No. No one is proposing OOL hypotheses that violate the second law. Must the second law be "held at bay" for OOL to be viable? Of course not.
As much as you'd like to believe otherwise, the second law is a problem neither for OOL nor for evolution. It's time to accept that and move on.
keith s
March 13, 2015, 09:43 AM PDT
Box: It seems that all three have conceded the obvious fact that – like gravity poses an obstacle to things that want to go up – the 2nd law poses an obstacle to organization.
There is a law concerning the gravitational force. There is no law that says things can't go up. There is a law that says the overall entropy of a closed, isolated system will increase over time. There is no law that says the entropy of part of the system can't decrease over time.
Box: Materialism does not accommodate such a force.
Work and energy, not force. And work is available in non-organic systems.
Box: systems go towards probability, while unguided evolution is improbability
Overall entropy increases, but parts of a system can decrease in entropy.
Eric Anderson: Even in its conventional narrow sense, they recognize that the 2nd Law presents a significant hurdle to the naturalistic abiogenesis story and that it is a legitimate scientific question to ask how that hurdle could be overcome.
Any solution for abiogenesis has to be consistent with the 2nd law, of course.
Eric Anderson: 1. Do you or do you not acknowledge (as abiogenesis researchers do) that the constraints of the 2nd Law are relevant to the abiogenesis of a living organism under purely natural conditions?
Of course. All physical processes have to be consistent with the 2nd law of thermodynamics.
Eric Anderson: 2. If so, is it legitimate for someone to ask what kinds of conditions must exist and what principles might need to be implemented in order to overcome these constraints?
Of course. If someone proposes a theory of abiogenesis that is not consistent with the 2nd law of thermodynamics, then that's a significant problem for the theory. But that's the real 2nd law of thermodynamics, not the made-up one.
Zachriel
March 13, 2015, 09:21 AM PDT
kf, Sooooo, no calculations, huh? https://www.youtube.com/watch?v=G2eUopy9sd8
DNA_Jock
March 13, 2015, 09:20 AM PDT
D, irrelevant actually, as the organisation separately needs to be accounted for as specifying states -- recall, a functionally specific state is macro-observable. There is a question of energy coupling. Entropy, recall, is a state function and is additive; not all clumped states are equal -- the work of clumping and the work of functional organisation are separable and highlight this. I suggest you read Thaxton et al, TMLO chs 7 - 9, to see what I am pointing out. Just because you are at [macro-]molecular scales does not move these issues off the table. Hence the force of my nanobots and microjets thought exercise in App A of my always linked, which also has the ds clump vs ds organised issue. KF
kairosfocus
March 13, 2015, 09:13 AM PDT
For all of this blah, blah, blah and the evos still can't come up with a methodology to test the claims of their position. They can't post unguided evolution's entailments because there aren't any. The same goes for their OoL. But that is what happens when one's position is nothing more than "stuff happens and some stuff sticks around". So even if the 2LoT is not a threat to their claims, everything else is. ;)
Joe
March 13, 2015, 08:06 AM PDT
kf, Based on the equations that you provided above, how much water would I need to melt to account for the informational content of the human genome? I'll accept answers that are off by six orders of magnitude, but you will have to show your calculations.
DNA_Jock
March 13, 2015, 07:57 AM PDT
D, did you appreciate that entropy is inherently informational -- as, say, G. N. Lewis pointed out in 1930? That is what is pivotal, as it means mere shuffling across a config space will not plausibly give rise spontaneously to organisation beyond a reasonable threshold of complexity. FSCO/I starts at 500 - 1,000 bits of info, which swamps the atomic-temporal resources of the sol system and observed cosmos respectively. Where the chemistry of chaining of D/RNA and AAs simply does not constrain the information-bearing side chains that are "perpendicular," similar to how many Swiss Army Knife attachments share the same spring mechanism. No, blind chemistry does not adequately account for functionally specific complex interactive organisation and associated information. KF
kairosfocus
March 13, 2015, 07:51 AM PDT
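
For readers who want to check the configuration-count arithmetic behind the "500 - 1,000 bits" threshold cited in kairosfocus's comment above, a quick sketch (the ~10^150 search bound is the figure the comment itself uses, not an independent estimate):

```python
import math

for bits in (500, 1000):
    # number of distinct configurations of `bits` two-state elements
    exponent = bits * math.log10(2)
    print(f"{bits} bits -> about 10^{exponent:.1f} configurations")
# 500 bits  -> about 10^150.5
# 1000 bits -> about 10^301.0
# Both meet or exceed the ~10^150 states cited in the comment as the search
# capacity of the observed cosmos.
```
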
kf, So you agree that the "informational" entropy is small compared with the "thermodynamic" entropy. Cool. That's really the point. Next up, chemistry.
DNA_Jock
March 13, 2015, 07:44 AM PDT
F/N: Let me snip from my always linked note regarding these topics: I: Section A: >> Further to this, we may average the information per symbol in the communication system thusly (giving in termns of -H to make the additive relationships clearer): - H = p1 log p1 + p2 log p2 + . . . + pn log pn or, H = - SUM [pi log pi] . . . Eqn 5 H, the average information per symbol transmitted [usually, measured as: bits/symbol], is often termed the Entropy; first, historically, because it resembles one of the expressions for entropy in statistical thermodynamics. As Connor notes: "it is often referred to as the entropy of the source." [p.81, emphasis added.] Also, while this is a somewhat controversial view in Physics, as is briefly discussed in Appendix 1below, there is in fact an informational interpretation of thermodynamics that shows that informational and thermodynamic entropy can be linked conceptually as well as in mere mathematical form. Though somewhat controversial even in quite recent years, this is becoming more broadly accepted in physics and information theory, as Wikipedia now discusses [as at April 2011] in its article on Informational Entropy (aka Shannon Information, cf also here):
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing. But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also,another article remarks: >> in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate .>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the vary large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles; i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.) For, as he astutely observes on pp. vii - viii: . . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . . And, in more details, (pp. 3 - 6, 7, 36, cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Samura here and here): . . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . 
macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.] As is discussed briefly in Appendix 1, Thaxton, Bradley and Olsen [TBO], following Brillouin et al, in the 1984 foundational work for the modern Design Theory, The Mystery of Life's Origins [TMLO], exploit this information-entropy link, through the idea of moving from a random to a known microscopic configuration in the creation of the bio-functional polymers of life, and then -- again following Brillouin -- identify a quantitative information metric for the information of polymer molecules. For, in moving from a random to a functional molecule, we have in effect an objective, observable increment in information about the molecule. This leads to energy constraints, thence to a calculable concentration of such molecules in suggested, generously "plausible" primordial "soups." In effect, so unfavourable is the resulting thermodynamic balance, that the concentrations of the individual functional molecules in such a prebiotic soup are arguably so small as to be negligibly different from zero on a planet-wide scale. By many orders of magnitude, we don't get to even one molecule each of the required polymers per planet, much less bringing them together in the required proximity for them to work together as the molecular machinery of life. The linked chapter gives the details. More modern analyses [e.g. Trevors and Abel, here and here], however, tend to speak directly in terms of information and probabilities rather than the more arcane world of classical and statistical thermodynamics, so let us now return to that focus; in particular addressing information in its functional sense, as the third step in this preliminary analysis. >> II: Appendix 1: >> Let us reflect on a few remarks on the link from thermodynamics to information: 1] TMLO: In 1984, this well-received work provided the breakthrough critical review on the origin of life that led to the modern design school of thought in science. The three online chapters, as just linked, should be carefully read to understand why design thinkers think that the origin of FSCI in biology is a significant and unmet challenge to neo-darwinian thought. (Cf also Klyce's relatively serious and balanced assessment, from a panspermia advocate. Sewell's remarks here are also worth reading. So is Sarfati's discussion of Dawkins' Mt Improbable.) 2] But open systems can increase their order: This is the "standard" dismissal argument on thermodynamics, but it is both fallacious and often resorted to by those who should know better. 
My own note on why this argument should be abandoned is: a] Clausius is the founder of the 2nd law, and the first standard example of an isolated system -- one that allows neither energy nor matter to flow in or out -- is instructive, given the "closed" subsystems [i.e. allowing energy to pass in or out] in it. Pardon the substitute for a real diagram, for now: Isol System: | | (A, at Thot) --> d'Q, heat --> (B, at T cold) | | b] Now, we introduce entropy change dS >/= d'Q/T . . . "Eqn" A.1 c] So, dSa >/= -d'Q/Th, and dSb >/= +d'Q/Tc, where Th > Tc d] That is, for system, dStot >/= dSa + dSb >/= 0, as Th > Tc . . . "Eqn" A.2 e] But, observe: the subsystems A and B are open to energy inflows and outflows, and the entropy of B RISES DUE TO THE IMPORTATION OF RAW ENERGY. f] The key point is that when raw energy enters a body, it tends to make its entropy rise. This can be envisioned on a simple model of a gas-filled box with piston-ends at the left and the right: ================================= ||::::::::::::::::::::::::::::::::::::::::::|| ||::::::::::::::::::::::::::::::::::::::::::||=== ||::::::::::::::::::::::::::::::::::::::::::|| ================================= 1: Consider a box as above, filled with tiny perfectly hard marbles [so collisions will be elastic], scattered similar to a raisin-filled Christmas pudding (pardon how the textual elements give the impression of a regular grid, think of them as scattered more or less hap-hazardly as would happen in a cake). 2: Now, let the marbles all be at rest to begin with. 3: Then, imagine that a layer of them up against the leftmost wall were given a sudden, quite, quite hard push to the right [the left and right ends are pistons]. 4: Simply on Newtonian physics, the moving balls would begin to collide with the marbles to their right, and in this model perfectly elastically. So, as they hit, the other marbles would be set in motion in succession. A wave of motion would begin, rippling from left to right 5:As the glancing angles on collision will vary at random, the marbles hit and the original marbles would soon begin to bounce in all sorts of directions. Then, they would also deflect off the walls, bouncing back into the body of the box and other marbles, causing the motion to continue indefinitely. 6: Soon, the marbles will be continually moving in all sorts of directions, with varying speeds, forming what is called the Maxwell-Boltzmann distribution, a bell-shaped curve. 7: And, this pattern would emerge independent of the specific initial arrantgement or how we impart motion to it, i.e. this is an attractor in the phase space: once the marbles are set in motion somehow, and move around and interact, they will soon enough settle into the M-B pattern. E.g. the same would happen if a small charge of explosive were set off in the middle of the box, pushing our the balls there into the rest, and so on. And once the M-B pattern sets in, it will strongly tend to continue. (That is, the process is ergodic.) 8: A pressure would be exerted on the walls of the box by the average force per unit area from collisions of marbles bouncing off the walls, and this would be increased by pushing in the left or right walls (which would do work to push in against the pressure, naturally increasing the speed of the marbles just like a ball has its speed increased when it is hit by a bat going the other way, whether cricket or baseball). Pressure rises, if volume goes down due to compression. (Also, volume of a gas body is not fixed.) 
9: Temperatureemerges as a measure of the average random kinetic energy of the marbles in any given direction, left, right, to us or away from us. Compressing the model gas does work on it, so the internal energy rises, as the average random kinetic energy per degree of freedom rises. Compression will tend to raise temperature. (We could actually deduce the classical — empirical — P, V, T gas laws [and variants] from this sort of model.) 10: Thus, from the implications of classical, Newtonian physics, we soon see the hard little marbles moving at random, and how that randomness gives rise to gas-like behaviour. It also shows how there is a natural tendency for systems to move from more orderly to more disorderly states, i.e. we see the outlines of the second law of thermodynamics. 11: Is the motion really random? First, we define randomness in the relevant sense: In probability and statistics, a random process is a repeating process whose outcomes follow no describable deterministic pattern, but follow a probability distribution, such that the relative probability of the occurrence of each outcome can be approximated or calculated. For example, the rolling of a fair six-sided die in neutral conditions may be said to produce random results, because one cannot know, before a roll, what number will show up. However, the probability of rolling any one of the six rollable numbers can be calculated. 12: This can be seen by the extension of the thought experiment of imagining a large collection of more or less identically set up boxes, each given the same push at the same time, as closely as we can make it. At first, the marbles in the boxes will behave very much alike, but soon, they will begin to diverge as to path. The same overall pattern of M-B statistics will happen, but each box will soon be going its own way. That is, the distribution pattern is the same but the specific behaviour in each case will be dramatically different. 13: Q: Why? 14: A: This is because tiny, tiny differences between the boxes, and the differences in the vibrating atoms in the walls and pistons, as well as tiny irregularites too small to notice in the walls and pistons will make small differences in initial and intervening states -- perfectly smooth boxes and pistons are an unattainable ideal. Since the system is extremely nonlinear, such small differences will be amplified, making the behaviour diverge as time unfolds. A chaotic system is not predictable in the long term. So, while we can deduce a probabilistic distribution, we cannot predict the behaviour in detail, across time. Laplace's demon who hoped to predict the future of the universe from the covering laws and the initial conditions, is out of a job. 15: To see diffusion in action, imagine that at the beginning, the balls in the right half were red, and those in the left half were black. After a little while, as they bounce and move, the balls would naturally mix up, and it would be very unlikely indeed — through logically possible — for them to spontaneously un-mix, as the number of possible combinations of position, speed and direction where the balls are mixed up is vastly more than those where they are all red to the right, all alack to the left or something similar. (This can be calculated, by breaking the box up into tiny little cells such that they would have at most one ball in them, and we can analyse each cell on occupancy, colour, location, speed and direction of motion. 
thus, we have defined a phase or state space, going beyond a mere configuration space that just looks at locations.) 16: So, from the orderly arrangement of laws and patterns of initial motion, we see how randomness emerges through the sensitive dependence of the behaviour on initial and intervening conditions. There would be no specific, traceable deterministic pattern that one could follow or predict for the behaviour of the marbles, through we could work out an overall statistical distribution, and could identify overall parameters such as volume, pressure and temperature. 17: For Osmosis, let us imagine that the balls are of different size, and that we have two neighbouring boxes with a porous wall between them; but only the smaller marbles can pass through the holes. If the smaller marbles were initially on say the left side, soon, they would begin to pass through to the right, until they were evenly distributed, so that on average as many small balls would pass left as were passing right, i.e., we see dynamic equilibrium. [this extends to evaporation and the vapour pressure of a liquid, once we add in that the balls have a short-range attraction that at even shorter ranges turns into a sharp repulsion, i.e they are hard.] 18: For a solid, imagine that the balls in the original box are now connected through springs in a cubical grid. The initial push will now set the balls to vibrating back and forth, and the same pattern of distributed vibrations will emerge, as one ball pulls on its neigbours in the 3-D array. (For a liquid, allow about 3% of holes in the grid, aned let the balls slide over one another, making nes connextions, some of them distorted. The fixed volume but inability to keep a shape that defines a liquid will emerge. The push on the liquid will have much the same effect as for the solid, except that it will also lead to flows.) 19: Randomness is thus credibly real, and naturally results from work on or energy injected into a body composed of microparticles, even in a classical Newtonian world; whether it is gas, solid or liquid. Raw injection of energy into a body tends to increase its disorder, and this is typically expressed in its temperature rising. 20: Quantum theory adds to the picture, but the above is enough to model a lot of what we see as we look at bulk and transport properties of collections of micro-particles. 21: Indeed, even viscosity comes out naturally, as . . . if there are are boxes stacked top and bottom that are sliding left or right relative to one another, and suddenly the intervening walls are removed, the gas-balls would tend to diffuse up and down from one stream tube to another, so their drift verlocities will tend to even out, The slower moving stream tubes exert a dragging effect on the faster moving ones. 22: And many other phenomena can be similarly explained and applied, based on laws and processes that we can test and validate, and their consequences in simplified but relevant models of the real world. 23: When we see such a close match, especially when quantum principles are added in, it gives us high confidence that we are looking at a map of reality. Not the reality itself, but a useful map. 
And, that map tells us that thanks to sensitive dependence on initial conditions, randomness will be a natural part of the micro-world, and that when energy is added to a body its randomness tends to increase, i.e we see the principle of entropy, and why simply opening up a body to receive energy is not going to answer to the emergence of funcitonal internal organisation. 24: For, organised states will be deeply isolated in the set of possible configurations. Indeed, if we put a measure of possible configurations in terms of say binary digits, bits, if we have 1,000 two-state elements there are already 1.07*10^301 possible configs. The whole observed universe searching at one state per Planck time, could not go through enough states of its 10^80 or so atoms, across its thermodynamically credible lifespan -- about 50 mn times the 13.7 BY said to have elapsed form the big bang -- to go through more than about 10^150 states. That is, the whole cosmos could not search more than a negligible fraction of the space. The hay stack could be positively riddled with needles, but at that rate we have not had any serious search at all.. 25: That is, there is a dominant distribution, not a detailed plan a la Laplace’s (finite) Demon who could predict the long term path of the world on its initial conditions and sufficient calculating power and time. 26: But equally, since short term interventions that are subtle can have significant effects, there is room for the intelligent and sophisticated intervention; e.g. through a Maxwell’s Demon who can spot faster moving and slower moving molecules and open/shut a shutter to set one side hotter and the other colder in a partitioned box. Providing he has to take active steps to learn which molecules are moving faster/slower in the desired direction, Brillouin showed that he will be within the second law of thermodynamics. . . . So, plainly, for the injection of energy to instead do predictably and consistently do something useful, it needs to be coupled to an energy conversion device. g] When such energy conversion devices, as in the cell, exhibit FSCI, the question of their origin becomes material, and in that context, their spontaneous origin is strictly logically possible but -- from the above -- negligibly different from zero probability on the gamut of the observed cosmos. (And, kindly note: the cell is an energy importer with an internal energy converter. That is, the appropriate entity in the model is B and onward B' below. Presumably as well, the prebiotic soup would have been energy importing, and so materialistic chemical evolutionary scenarios therefore have the challenge to credibly account for the origin of the FSCI-rich energy converting mechanisms in the cell relative to Monod's "chance + necessity" [cf also Plato's remarks] only.) h] Now, as just mentioned, certain bodies have in them energy conversion devices: they COUPLE input energy to subsystems that harvest some of the energy to do work, exhausting sufficient waste energy to a heat sink that the overall entropy of the system is increased. 
Illustratively, for heat engines -- and (in light of exchanges with email correspondents circa March 2008) let us note: a good slice of classical thermodynamics arose in the context of studying, idealising and generalising from steam engines [which exhibit organised, functional complexity, i.e FSCI; they are of course artifacts of intelligent design and also exhibit step-by-step problem-solving processes (even including "do-always" looping!)]: | | (A, heat source: Th): d'Qi --> (B', heat engine, Te): --> d'W [work done on say D] + d'Qo --> (C, sink at Tc) | | i] A's entropy: dSa >/= - d'Qi/Th j] C's entropy: dSc >/= + d'Qo/Tc k] The rise in entropy in B, C and in the object on which the work is done, D, say, compensates for that lost from A. The second law -- unsurprisingly, given the studies on steam engines that lie at its roots -- holds for heat engines. l] However for B since it now couples energy into work and exhausts waste heat, does not necessarily undergo a rise in entropy having imported d'Qi. [The problem is to explain the origin of the heat engine -- or more generally, energy converter -- that does this, if it exhibits FSCI.] m] There is also a material difference between the sort of heat engine [an instance of the energy conversion device mentioned] that forms spontaneously as in a hurricane [directly driven by boundary conditions in a convective system on the planetary scale, i.e. an example of order], and the sort of complex, organised, algorithm-implementing energy conversion device found in living cells [the DNA-RNA-Ribosome-Enzyme system, which exhibits massive FSCI]. n] In short, the decisive problem is the [im]plausibility of the ORIGIN of such a FSCI-based energy converter through causal mechanisms traceable only to chance conditions and undirected [non-purposive] natural forces. This problem yields a conundrum for chem evo scenarios, such that inference to agency as the probable cause of such FSCI -- on the direct import of the many cases where we do directly know the causal story of FSCI -- becomes the better explanation. As TBO say, in bridging from a survey of the basic thermodynamics of living systems in CH 7, to that more focussed discussion in ch's 8 - 9: While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The "evolution" from biomonomers of to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme namely, the formation of protein and DNA from their precursors. It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today.11 Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered . . . [Bold emphasis added. 
Cf summary in the peer-reviewed journal of the American Scientific Affiliation, "Thermodynamics and the Origin of Life," in Perspectives on Science and Christian Faith 40 (June 1988): 72-83, pardon the poor quality of the scan. NB:as the journal's online issues will show, this is not necessarily a "friendly audience."] 3] So far we have worked out of a more or less classical view of the subject. But, to explore such a question further, we need to look more deeply at the microscopic level. Happily, there is a link from macroscopic thermodynamic concepts to the microscopic, molecular view of matter, as worked out by Boltzmann and others, leading to the key equation: s = k ln W . . . Eqn.A.3 That is, entropy of a specified macrostate [in effect, macroscopic description or specification] is a constant times a log measure of the number of ways matter and energy can be distributed at the micro-level consistent with that state [i.e. the number of associated microstates; aka "the statistical weight of the macrostate," aka "thermodynamic probability"]. The point is, that there are as a rule a great many ways for energy and matter to be arranged at micro level relative to a given observable macro-state. That is, there is a "loss of information" issue here on going from specific microstate to a macro-level description, with which many microstates may be equally compatible. Thence, we can see that if we do not know the microstates specifically enough, we have to more or less treat the micro-distributions of matter and energy as random, leading to acting as though they are disordered. Or, as Leon Brillouin, one of the foundational workers in modern information theory, put it in his 1962 Science and Information Theory, Second Edition: How is it possible to formulate a scientific theory of information? The first requirement is to start from a precise definition. . . . . We consider a problem involving a certain number of possible answers, if we have no special information on the actual situation. When we happen to be in possession of some information on the problem, the number of possible answers is reduced, and complete information may even leave us with only one possible answer. Information is a function of the ratio of the number of possible answers before and after, and we choose a logarithmic law in order to insure additivity of the information contained in independent situations [as seen above in the main body, section A] . . . . Physics enters the picture when we discover a remarkable likeness between information and entropy. This similarity was noticed long ago by L. Szilard, in an old paper of 1929, which was the forerunner of the present theory. In this paper, Szilard was really pioneering in the unknown territory which we are now exploring in all directions. He investigated the problem of Maxwell's demon, and this is one of the important subjects discussed in this book. The connection between information and entropy was rediscovered by C. Shannon in a different class of problems, and we devote many chapters to this comparison. We prove that information must be considered as a negative term in the entropy of a system; in short, information is negentropy. The entropy of a physical system has often been described as a measure of randomness in the structure of the system. We can now state this result in a slightly different way: Every physical system is incompletely defined. 
We only know the values of some macroscopic variables, and we are unable to specify the exact positions and velocities of all the molecules contained in a system. We have only scanty, partial information on the system, and most of the information on the detailed structure is missing. Entropy measures the lack of information; it gives us the total amount of missing information on the ultramicroscopic structure of the system. This point of view is defined as the negentropy principle of information [added links: cf. explanation here and "onward" discussion here -- noting on the brief, dismissive critique of Brillouin there, that you never get away from the need to provide information -- there is "no free lunch," as Dembski has pointed out ; ->) ], and it leads directly to a generalization of the second principle of thermodynamics, since entropy and information must, be discussed together and cannot be treated separately. This negentropy principle of information will be justified by a variety of examples ranging from theoretical physics to everyday life. The essential point is to show that any observation or experiment made on a physical system automatically results in an increase of the entropy of the laboratory. It is then possible to compare the loss of negentropy (increase of entropy) with the amount of information obtained. The efficiency of an experiment can be defined as the ratio of information obtained to the associated increase in entropy. This efficiency is always smaller than unity, according to the generalized Carnot principle. Examples show that the efficiency can be nearly unity in some special examples, but may also be extremely low in other cases. This line of discussion is very useful in a comparison of fundamental experiments used in science, more particularly in physics. It leads to a new investigation of the efficiency of different methods of observation, as well as their accuracy and reliability . . . . [From an online excerpt of the Dover Reprint edition, here. Emphases, links and bracketed comment added.] 4] Yavorski and Pinski, in the textbook Physics, Vol I [MIR, USSR, 1974, pp. 279 ff.], summarise the key implication of the macro-state and micro-state view well: as we consider a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various different possible arrangements at random, and spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random, towards the more thermodynamically probable macrostates, i.e the ones with higher statistical weights. So "[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state." [p. 
284.] This is in effect the statistical form of the 2nd law of thermodynamics. Thus, too, the behaviour of the Clausius isolated system above is readily understood: importing d'Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B, that the resulting rise in B's entropy swamps the fall in A's entropy. Moreover, given that FSCI-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe. (Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e W], we instead move to entropy, through s = k ln W. Part of how this is done can be seen by imagining a system in which there are W ways accessible, and imagining a partition into parts 1 and 2. W = W1*W2, as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa, but it is far more convenient to have an additive measure, i.e we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant and is in effect the universal gas constant, R, on a per molecule basis, i.e we divide R by the Avogadro Number, NA, to get: k = R/NA. The two approaches to entropy, by Clausius, and Boltzmann, of course, correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate, that the classical observation that entropy naturally tends to increase, is readily apparent.) 5] The above sort of thinking has also led to the rise of a school of thought in Physics -- note, much spoken against in some quarters, but I think they clearly have a point -- that ties information and thermodynamics together. Robertson presents their case; in summary: . . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and open systems here; the debate here is eye-opening on rhetorical tactics used to cloud this and related issues . . . ] S({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn A.4] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . .[pp.3 - 6] S, called the information entropy, . . . correspond[s] to the thermodynamic entropy, with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . 
A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context [p. 7] . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [p. 36.] [Robertson, Statistical Thermophysics, Prentice Hall, 1993. (NB: Sorry for the math and the use of text for symbolism. However, it should be clear enough that Roberson first summarises how Shannon derived his informational entropy [though Robertson uses s rather than the usual H for that information theory variable, average information per symbol], then ties it to entropy in the thermodynamic sense using another relation that is tied to the Boltzmann relationship above. This context gives us a basis for looking at the issues that surface in prebiotic soup or similar models as we try to move from relatively easy to form monomers to the more energy- and information- rich, far more complex biofunctional molecules.)] >> >>>>>>>>>>>>>>>>> Entropy, then, can be understood in terms of avg missing info to specify microstate given macrostate defined by lab scale parameters that attach to clusters of microscopic states in a phase space. From this, we may see how 2LOT is inherently informational and probabilistic in character. This renders attempts to wall off a "traditional" sense of entropy and of the 2nd law from information and organisation issues. Where also, the question of creating FSCO/I rich energy converters by simply opening up highly contingent systems to have raw energy flowing in falls apart for much the same reason as Sewell highlighted. KFkairosfocus
March 13, 2015, 07:23 AM PDT
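
The Yavorski and Pinski ball-counting figures quoted inside the long comment above (63,504 and 44,100) can be checked with two binomial coefficients; this assumes the straightforward reading of the example, i.e. ten positions per row and the white balls distributed between the two rows:

```python
from math import comb

print(comb(10, 5) * comb(10, 5))   # 63504: five white balls in each row of ten
print(comb(10, 6) * comb(10, 4))   # 44100: a 6-4 split (one of the two such splits)
```
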
niwrad @ 159
Mind you, I never deleted a comment in my life (mine or other’s).
That is great.
My #152 says exactly the truth and I confirm it.
But you are wrong. *Statistical* thermodynamics is about entropy. Simple explanation here: if the 10^14 cells in the human body are rearranged, I get an entropy of k x ln(10^14!) = 1.38x10^-23 x (14 ln[5] + 14 ln[2])! = 8.27×10^12.
Me_Think
March 13, 2015, 07:23 AM PDT
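
For anyone wanting to evaluate the k x ln(10^14!) expression invoked in Me_Think's comment above, here is a small sketch using Stirling's approximation via lgamma; it computes only that combinatorial term and makes no claim about how the comment's quoted figure was arrived at.

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
N = 1e14                  # the 10^14 cells mentioned in the comment

ln_N_factorial = math.lgamma(N + 1)      # ln(N!) for large N (Stirling-accurate)
print(f"ln(10^14!)     ~ {ln_N_factorial:.3e}")            # ~3.12e+15
print(f"k * ln(10^14!) ~ {k_B * ln_N_factorial:.3e} J/K")  # ~4.3e-08 J/K
```
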
Zachriel, keith s (and anyone else): Let's cut to the chase:
1. Do you or do you not acknowledge (as abiogenesis researchers do) that the constraints of the 2nd Law are relevant to the abiogenesis of a living organism under purely natural conditions?
2. If so, is it legitimate for someone to ask what kinds of conditions must exist and what principles might need to be implemented in order to overcome these constraints?
Eric Anderson
March 13, 2015, 07:20 AM PDT
Zachriel @139:
Please note they are using the term thermodynamics in its conventional “narrow sense”.
Exactly. Even in its conventional narrow sense, they recognize that the 2nd Law presents a significant hurdle to the naturalistic abiogenesis story and that it is a legitimate scientific question to ask how that hurdle could be overcome. (And that is without even considering the arguably greater challenges with organizational entropy and informational entropy.)
Eric Anderson
March 13, 2015, 07:15 AM PDT
Wind coming from the north poses an obstacle for things wanting to go north. An obstacle even for sailboats:
Sailboats cannot sail directly into the environmental wind, nor on a course that is too close to the direction from which the wind is blowing. The range of directions into which a boat cannot sail is called the 'no-go zone'. [wiki]
But it can be said that, absent any wind, sailing is not possible. Or one could say that a northern wind "provides energy" for sailboats to go north. And of course many other clever paradoxical statements can be construed. However, anyone will understand that "wind coming from the north poses an obstacle for things wanting to go north".
Box
March 13, 2015, 07:10 AM PDT
Hangonasec #161
to see the 2nd Law as purely a destructive, disorganising principle is to completely misunderstand it.
If you leave a system in the wild (e.g. a car), what will be its destiny? Will it increase its organization, or decrease it? It will decrease its organization. That is the 2nd law. So, is it an organizing or a disorganizing principle? I answer the latter. Do you answer the former? Well, put your car in the wild... (I prefer to keep my car maintained.)
niwrad
March 13, 2015, 06:42 AM PDT
H, nope, the electrostatic force, absent the organised string of highly specifically sequenced AAs, would not result in a stable functional protein. Where the organisation to be explained is just that sequence, antecedent to the folding of the protein -- remember, any one of 20 AAs can follow any other, and the evidence is that proteins are rare in AA space, not to mention wider organic chemistry space. Spell that: interfering cross-reactions, especially in Darwin's warm salty lightning-struck pond or the like -- which is where all of this needs to start. The blindness to such is one of the most revealing aspects of this whole exchange. KF
kairosfocus
March 13, 2015, 06:16 AM PDT
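To give a combinatorial sense of what "rare in AA sequence space" means, a one-line sketch (the 150-residue chain length is an arbitrary illustrative choice):

# With 20 possible amino acids at each of L positions, the sequence space holds 20^L chains.
L = 150
sequence_space = 20 ** L
print(f"20^{L} has {len(str(sequence_space))} digits, i.e. roughly 10^{len(str(sequence_space)) - 1} possible sequences")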
Hangonasec, Very well put. I would encourage those who think that the 2LoT is somehow a problem to read hangonasec's post repeatedly, until they understand it.DNA_Jock
March 13, 2015, 06:02 AM PDT
Box @158
The logical question then raised is: WHAT FORCE creates and maintains organization and in so doing balances (or overcomes) the 2nd law? Materialism does not accommodate such a force.
The electrostatic force, principally. Responding to it causes molecular species to adopt lower-energy configurations, if they can. Those lower-energy configurations frequently appear more 'ordered' than the original separate parts. Most commonly, the apparent ordering in biology is coupled to the dissociation of one or more phosphate bonds: the entropy of the entire system goes up, but that of the little bit we focus upon - the complex molecule, in this simple illustration - goes down.

This neither 'balances' nor 'overcomes' the 2nd Law. It is driven by it. As entropy is the inverse of the capacity to do useful work, energetic redistribution - the natural tendency of systems to shed energy - can be coupled to 'organisation' by tapping into that capacity. Of course the energy must get into those phosphate bonds in the first place. In living things it comes from proton gradients - again, coupling energy transfer to equilibration - but that's a puzzle prior to 'self-sustaining' replication.

And of course entropy can also be inimical to organisation. Bonds can spontaneously break, and so on. But once replication is in train, multiple copies provide insurance.

Be all that as it may, to see the 2nd Law as purely a destructive, disorganising principle is to completely misunderstand it. It is simply a process of equilibration which can provide energy. The dynamic, cyclic nature of Life's long-term refusal to actually reach equilibrium may seem mysterious, but its energetic relations are not. We are powered by the 2nd Law, brains and all.Hangonasec
March 13, 2015, 05:48 AM PDT
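A schematic free-energy bookkeeping example of the coupling idea above (the numbers are textbook-style approximations, not figures from this discussion): an energetically unfavourable step can proceed when it is coupled to ATP hydrolysis, because the combined free-energy change is negative.

# Free-energy coupling, illustrative standard values in kJ/mol
dG_unfavourable_step = +14.0   # e.g. a synthesis step that would not proceed on its own (assumed)
dG_ATP_hydrolysis    = -30.5   # ATP -> ADP + Pi, commonly quoted standard value

dG_coupled = dG_unfavourable_step + dG_ATP_hydrolysis
print(f"Coupled dG ~ {dG_coupled:+.1f} kJ/mol -> "
      f"{'spontaneous' if dG_coupled < 0 else 'non-spontaneous'}")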
(...) the question, rather, is why things don’t fall completely apart — as they do, in fact, at the moment of death. What power holds off that moment — precisely for a lifetime, and not a moment longer? [Stephen L. Talbott - "The Unbearable Wholeness of Beings"]
Box
March 13, 2015, 05:23 AM PDT
Zachriel, you continue to obfuscate again and again to avoid acknowledging what matters: spontaneous transformations go towards more probable states, while unguided evolution would require going towards highly improbable ones. Ergo they are incompatible. Me_Think #153
You might want to delete your comment [#152] before a lot more people see it and embarrass you.[you may delete this comment too]
Mind you, I never deleted a comment in my life (mine or others'). My #152 says exactly the truth and I confirm it.niwrad
March 13, 2015, 04:48 AM PDT
I don't expect an answer to my question from Keith, Zachriel or DNA_Jock - who wants me to read up on how airplanes can achieve stable flight. It seems that all three have conceded the obvious fact that - just as gravity poses an obstacle to things that want to go up - the 2nd law poses an obstacle to organization. The logical question then raised is:
WHAT FORCE creates and maintains organization and in so doing balances (or overcomes) the 2nd law?
Materialism does not accommodate such a force.Box
March 13, 2015, 04:29 AM PDT
Box: So it’s “work” that creates and maintains order – that balances (or overcomes) the 2nd law. What is this force that you term “work”? Work is not force, but force acting over a distance. It's a standard term in physics and mechanics. http://en.wikipedia.org/wiki/Work_%28physics%29Zachriel
March 13, 2015, 04:07 AM PDT
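For concreteness, the definition being pointed to is W = F × d (a scalar, measured in joules); a small worked example with assumed figures:

g = 9.81       # gravitational acceleration, m/s^2
mass = 2.0     # kg (assumed)
height = 3.0   # m (assumed)

force = mass * g        # weight to be overcome, in newtons
work = force * height   # work done lifting at constant speed: W = F * d
print(f"W = {work:.1f} J")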
Z: Statistical thermodynamics yields the same answer as classical thermodynamics. They are related by Boltzmann's equation, S = k log W. (ETA: W is the number of microstates, not work.) If you consider the formula, you will find that the number of possible microstates dwarfs any common notion of order. k is Boltzmann's constant, 1.3806488 × 10^-23 J/K. If we use Dembskian logic, then a small piece of quartz is so unlikely to form without intelligent agency that it will never happen in the entire universe over the entirety of its history, or something something.Zachriel
March 13, 2015, 04:04 AM PDT
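To illustrate the scale involved, a short sketch (illustrative): since S = k ln W with k ≈ 1.38 × 10^-23 J/K, even a one-joule-per-kelvin change in entropy corresponds to an astronomically large factor in the number of microstates.

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
delta_S = 1.0        # J/K, chosen for illustration

# S = k ln W  =>  delta_S = k ln(W2/W1)  =>  W2/W1 = exp(delta_S / k)
ln_ratio = delta_S / k_B
print(f"W2/W1 = exp({ln_ratio:.3e}) ~ 10^({ln_ratio / math.log(10):.3e})")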
Zachriel,
Box: what is this “ordering force” – or organizational force – that balances the 2nd law?
Zachriel: Work. - - - Actually, with regards to thermodynamics, it’s work that creates and maintains order. Nature is full of examples, e.g. a raindrop.
So it’s "work" that creates and maintains order - that balances (or overcomes) the 2nd law. What is this force that you term "work"? What exactly is this "work" that creates and maintains e.g. an organism?Box
March 13, 2015, 03:57 AM PDT
Box: Exactly!! One needs a "sufficient force working in the opposite direction to create lift". Well … Similarly wrt the second law one needs a "sufficient force working in the opposite direction to create ORDER"!

Actually, with regards to thermodynamics, it's work that creates and maintains order. Nature is full of examples, e.g. a raindrop.

Box: what is this "ordering force" – or organizational force – that balances the 2nd law?

Work.

niwrad: Indeed at the beginning the OP speaks of *statistical* thermodynamics, and nowhere does it contain the term "entropy".

Statistical thermodynamics yields the same answer as classical thermodynamics. They are related by Boltzmann's equation, S = k log W. (ETA: W is the number of microstates, not work.) Keep in mind that thermodynamic entropy is a specific, well-defined, extensive, measurable property.Zachriel
March 13, 2015, 03:35 AM PDT
niwrad @ 152 You might want to delete your comment before a lot more people see it and embarrass you. [you may delete this comment too]Me_Think
March 13, 2015, 02:48 AM PDT
Zachriel
You might want to correct the original post, which concerns thermodynamic entropy so there is no confusion.
There is nothing to correct. Indeed at the beginning the OP speaks of *statistical* thermodynamics, and nowhere does it contain the term "entropy". Entropy is a term you introduced to obfuscate.niwrad
March 13, 2015, 02:43 AM PDT
Please read the rest of my post.
Likewise, people who view cell division as NOT violating the 2LoT explain about the export of energy to a heat sink, and how that allows local decreases in entropy to occur.
The same applies to computers, spaceships, brains and libraries.DNA_Jock
March 12, 2015, 07:28 PM PDT
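A minimal bookkeeping sketch of the heat-sink point (all numbers assumed for illustration): a local entropy decrease is compatible with the 2nd law provided the heat dumped into the surroundings raises their entropy by at least as much.

T_surroundings = 310.0   # K, roughly body temperature (assumed)
dS_local = -1.0e-3       # J/K, hypothetical local decrease during an ordering process
Q_exported = 0.5         # J of heat exported to the surroundings (assumed)

dS_surroundings = Q_exported / T_surroundings   # entropy gained by the heat sink
dS_total = dS_local + dS_surroundings
print(f"dS_total = {dS_total:+.2e} J/K -> "
      f"{'allowed by the 2nd law' if dS_total >= 0 else 'forbidden'}")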
DNA_Jock: I rather like Eric’s gravity/planes analogy.
That's good to hear because I intend to use it a lot.
DNA_Jock: As Zachriel and keith s have noted, there is no need to “keep gravity at bay” for a plane to fly. Gravity is there, fully in force (sorry) throughout the flight.
Yep. And they both point out that a "sufficient force working in the opposite direction" is necessary to balance or overcome gravity. What - under materialism - is a "sufficient force" to balance (or overcome) the 2nd law - meant in its statistical sense - wrt e.g. computers, spaceships, brains and libraries?Box
March 12, 2015, 06:26 PM PDT
I rather like Eric's gravity/planes analogy. As Zachriel and keith s have noted, there is no need to "keep gravity at bay" for a plane to fly. Gravity is there, fully in force (sorry) throughout the flight.

Furthermore, when confronted by a flight 'skeptic' who asks "What allows the airplane to stay aloft, given the law of gravity?", people who view flight as non-magical do NOT just dismiss the question with a rhetorical flick of the finger by saying, "Well, some things fly, so obviously the law of gravity doesn't prevent things from flying. And so your question is bunk." NO, people who view flight as non-magical patiently explain about upthrust and the deflection of the air caused by the wings. Yes, fluid mechanics is tricky, and some questions might not have entirely satisfactory answers, but we are pretty confident that there is no reason to invoke flying-carpet-magic.

Likewise, people who view cell division as NOT violating the 2LoT explain about the export of energy to a heat sink, and how that allows local decreases in entropy to occur.

Finally, the analogy is apt because: IF gravity did NOT exist, then airplanes would be unable to fly (in a controlled manner). No gravity => no atmosphere => nothing for the control surfaces to control => CRASH. Rockets, on the other hand, would work really well on very little fuel. :)

In an analogous way, the 2LoT sets up the conditions that allow stable, far-from-equilibrium states to arise. 2LoT is equivalent to the arrow of time. Every single irreversible process that occurs, including the production of "computers, TV sets and telephones", occurs thanks to the 2LoT. Regarding
How could a stable far-from-equilibrium system arise under purely natural conditions, in light of the 2nd Law? And what enables such a system to continue to exist over long periods, given the 2nd Law?
These arise naturally all the time. Any mixture of oxygen and a flammable gas is such a system. You can mix hydrogen, methane, etc. with oxygen and nothing happens. It is quite stable until a spark or suitable catalyst is introduced, because there is a kinetic barrier that prevents the system from moving promptly to equilibrium. If it weren't for such kinetic barriers (and the 2LoT) there wouldn't be life.
Ready to experiment, you're ready to be burned / If it wasn't for some accidents then some would never ever learn
ETA: kinetic barriers are also the answer to Box's questionDNA_Jock
March 12, 2015, 06:02 PM PDT
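To put a rough number on the kinetic-barrier point (the activation energy and pre-exponential factor are assumed, order-of-magnitude values): the Arrhenius relation k = A exp(-Ea/RT) shows why a hydrogen/oxygen mixture, though thermodynamically unstable, reacts imperceptibly slowly at room temperature yet very fast once a spark raises the local temperature.

import math

R = 8.314     # gas constant, J/(mol K)
A = 1.0e13    # pre-exponential factor, 1/s (assumed order of magnitude)
Ea = 180e3    # activation energy, J/mol (assumed, representing a high kinetic barrier)

for T in (298.0, 2000.0):   # room temperature vs. flame/spark temperature
    rate = A * math.exp(-Ea / (R * T))
    print(f"T = {T:6.0f} K -> rate constant ~ {rate:.3e} per second")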
