Intelligent Design thermodynamics and information

At Quanta: Physicists Rewrite the Fundamental Law That Leads to Disorder


The second law of thermodynamics is among the most sacred in all of science, but it has always rested on 19th century arguments about probability. New arguments trace its true source to the flows of quantum information.

The connection between the 2nd Law of Thermodynamics and information is receiving further grounding through studies of quantum entanglement. The importance of this research is that the boundaries of what cannot happen naturally, as implied by the 2nd Law, are shifting away from the realm of probabilities into definite impossibilities.

In all of physical law, there’s arguably no principle more sacrosanct than the second law of thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.

Is the rise of entropy merely probabilistic, or can it be straightened out by use of clear quantum axioms?
Maggie Chiang for Quanta Magazine

But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved). Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?

A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information.

Many-particle systems that are more disordered and have higher entropy vastly outnumber ordered, lower-entropy states, so molecular interactions are much more likely to end up producing them. The second law seems then to be just about statistics: It’s a law of large numbers. In this view, there’s no fundamental reason why entropy can’t decrease — why, for example, all the air molecules in your room can’t congregate by chance in one corner. It’s just extremely unlikely.
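A toy calculation (my illustration, not from the article) makes the "law of large numbers" point concrete: if each of N gas molecules independently ends up in the left or right half of a room, the macrostates near an even split contain overwhelmingly more microstates than the all-in-one-corner macrostate, whose probability falls off as 2^(1-N).

```python
from math import comb

# N molecules, each independently in the left or right half of a room.
# Microstates: 2**N equally likely left/right assignments.
# Macrostate "k molecules on the left" contains C(N, k) microstates.
N = 100
total = 2 ** N

all_in_one_half = 2 / total  # all-left or all-right
near_even = sum(comb(N, k) for k in range(45, 56)) / total

print(f"P(all {N} molecules in one half) = {all_in_one_half:.3e}")
print(f"P(45-55 of {N} on the left)      = {near_even:.3f}")
```

Even at a mere hundred molecules the all-in-one-half probability is around 10^-30; for the ~10^27 molecules in a real room, "extremely unlikely" becomes indistinguishable from impossible in practice.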

Yet this probabilistic statistical physics leaves some questions hanging. It directs us toward the most probable microstates in a whole ensemble of possible states and forces us to be content with taking averages across that ensemble.

But the laws of classical physics are deterministic — they allow only a single outcome for any starting point. Where, then, can that hypothetical ensemble of states enter the picture at all, if only one outcome is ever possible?

David Deutsch, a physicist at Oxford, has for several years been seeking to avoid this dilemma by developing a theory of (as he puts it) “a world in which probability and randomness are totally absent from physical processes.” His project, on which Marletto is now collaborating, is called constructor theory. It aims to establish not just which processes probably can and can’t happen, but which are possible and which are forbidden outright.

Recently, Marletto, working with the quantum theorist Vlatko Vedral at Oxford and colleagues in Italy, showed that constructor theory does identify processes that are irreversible in this sense — even though everything happens according to quantum mechanical laws that are themselves perfectly reversible. “We show that there are some transformations for which you can find a constructor for one direction but not the other,” she said.

The connection between quantum mechanics and thermodynamics stems from the tendency of a system’s quantum wave function to evolve over time in such a way as to increase the uncertainty (and decrease the information) an observer can have about the system at a later time.

A pure state is one for which we know all there is to be known about it. But when two objects are entangled, you can’t fully specify one of them without knowing everything about the other too. The fact is that it’s easier to go from a pure quantum state to a mixed state than vice versa — because the information in the pure state gets spread out by entanglement and is hard to recover. It’s comparable to trying to re-form a droplet of ink once it has dispersed in water, a process in which the irreversibility is imposed by the second law.
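A short numerical sketch of this point (my own illustration, using numpy): start from an entangled two-qubit pure state, trace out one qubit, and the qubit that remains is in a mixed state with nonzero von Neumann entropy; the information of the pure state now lives in correlations that a local observer cannot see.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # discard numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): a pure, entangled two-qubit state.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_full = np.outer(phi, phi.conj())       # density matrix of the pure state

# Partial trace over qubit B leaves qubit A's reduced state.
rho_4 = rho_full.reshape(2, 2, 2, 2)       # indices: (a, b, a', b')
rho_A = np.einsum('abcb->ac', rho_4)       # sum over b = b'

print(round(von_neumann_entropy(rho_full), 6))  # 0.0: pure, fully specified
print(round(von_neumann_entropy(rho_A), 6))     # 1.0: maximally mixed qubit
```

The global state carries zero entropy while each part carries maximal entropy, which is exactly the sense in which specifying one entangled object requires knowing everything about the other.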

So here the irreversibility is “just a consequence of the way the system dynamically evolves,” said Marletto. There’s no statistical aspect to it.

Read the full article here.

4 Replies to “At Quanta: Physicists Rewrite the Fundamental Law That Leads to Disorder”

  1. kairosfocus says:

    Interesting, though: on Copenhagen, isn’t probability baked into quantum theory?

  2. kairosfocus says:

    EH, I excerpt SEP, as a thought sparker:

    1. Quantum Mechanics as a Probability Calculus

    It is uncontroversial (though remarkable) that the formal apparatus of quantum mechanics reduces neatly to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the “quantum logic” of projection operators on a Hilbert space.[1] Moreover, the usual statistical interpretation of quantum mechanics asks us to take this generalized quantum probability theory quite literally—that is, not as merely a formal analogue of its classical counterpart, but as a genuine doctrine of chances. In this section, I survey this quantum probability theory and its supporting quantum logic.[2]

    [For further background on Hilbert spaces, see the entry on quantum mechanics. For further background on ordered sets and lattices, see the supplementary document: The Basic Theory of Ordering Relations. Concepts and results explained in these supplements will be used freely in what follows.]

    1.1 Quantum Probability in a Nutshell

    The quantum-probabilistic formalism, as developed by von Neumann [1932], assumes that each physical system is associated with a (separable) Hilbert space H, the unit vectors of which correspond to possible physical states of the system. Each “observable” real-valued random quantity is represented by a self-adjoint operator A on H, the spectrum of which is the set of possible values of A. If u is a unit vector in the domain of A, representing a state, then the expected value of the observable represented by A in this state is given by the inner product ⟨Au, u⟩. The observables represented by two operators A and B are commensurable iff A and B commute, i.e., AB = BA. (For further discussion, see the entry on quantum mechanics.)

    1.2 The “Logic” of Projections

    As stressed by von Neumann, the {0,1}-valued observables may be regarded as encoding propositions about—or, to use his phrasing, properties of—the state of the system. It is not difficult to show that a self-adjoint operator P with spectrum contained in the two-point set {0,1} must be a projection; i.e., P² = P. Such operators are in one-to-one correspondence with the closed subspaces of H. Indeed, if P is a projection, its range is closed, and any closed subspace is the range of a unique projection. If u is any unit vector, then ⟨Pu, u⟩ = ||Pu||² is the expected value of the corresponding observable in the state represented by u. Since this observable is {0,1}-valued, we can interpret this expected value as the probability that a measurement of the observable will produce the “affirmative” answer 1. In particular, the affirmative answer will have probability 1 if and only if Pu = u; that is, u lies in the range of P.

    Von Neumann concludes that

    … the relation between the properties of a physical system on the one hand, and the projections on the other, makes possible a sort of logical calculus with these. However, in contrast to the concepts of ordinary logic, this system is extended by the concept of “simultaneous decidability” which is characteristic for quantum mechanics. (1932: 253)

    Let’s examine this “logical calculus” of projections. Ordered by set-inclusion, the closed subspaces of H form a complete lattice, in which the meet (greatest lower bound) of a set of subspaces is their intersection, while their join (least upper bound) is the closed span of their union.
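The definitions in the SEP excerpt above are easy to check numerically. A small numpy sketch (my own illustration, not part of the SEP entry): compute an expectation value ⟨Au, u⟩, confirm that the Pauli observables X and Z fail to commute (so are not commensurable), and verify that a projection P satisfies P² = P with ||Pu||² giving the probability of the affirmative answer.

```python
import numpy as np

# --- Observables and expectation values (section 1.1) -----------------
X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X, self-adjoint
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z, self-adjoint

u = np.array([1, 1], dtype=complex) / np.sqrt(2)  # unit vector |+>

def expectation(A, u):
    """Expected value <Au, u> of the observable A in the state u."""
    return float(np.vdot(u, A @ u).real)

print(round(expectation(X, u), 6))   # 1.0: |+> is the +1 eigenvector of X
print(round(expectation(Z, u), 6))   # 0.0: Z is maximally uncertain in |+>
print(np.allclose(X @ Z, Z @ X))     # False: X and Z are not commensurable

# --- Projections as {0,1}-valued observables (section 1.2) ------------
v = np.array([1, 0], dtype=complex)
P = np.outer(v, v.conj())            # projection onto span{|0>}
assert np.allclose(P @ P, P)         # idempotent: P^2 = P
assert np.allclose(P, P.conj().T)    # self-adjoint

w = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)
print(round(np.linalg.norm(P @ w) ** 2, 6))  # 0.3: probability of answer 1

# If u lies in the range of P (Pu = u), answer 1 has probability 1.
assert np.allclose(P @ v, v)
```

This is the generalized probability calculus the excerpt describes: states assign probabilities to projections, and non-commuting observables are exactly the ones that cannot be decided simultaneously.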


  3. PaV says:

    The upshot of this theoretical thinking turns on something late in the article. Here it is:

    Quantum resource theories allow a kind of zooming in on the fine-grained details of the classical second law. We don’t need to think about huge numbers of particles; we can make statements about what is allowed among just a few of them. When we do this, said Yunger Halpern, it becomes clear that the classical second law (final entropy must be equal to or greater than initial entropy) is just a kind of coarse-grained sum of a whole family of inequality relationships. . . . . . .

    In other words, in resource theories there seem to be a whole bunch of mini-second laws.

    Instead of statistical ensembles, we now deal with the ‘average’ of these “mini-second laws.” IOW, a finer level of averaging.


    The resource-theory approach, said physicist Markus Müller of the University of Vienna, “admits a fully mathematically rigorous derivation, without any conceptual or mathematical loose ends, of the thermodynamic laws and more. . . . a reconsideration of what one really means by thermodynamics” — it is not so much about the average properties of large ensembles of moving particles, but about a game that an agent plays against nature to conduct a task efficiently with the available resources.

    In the end, though, it is still about information. The discarding of information — or the inability to keep track of it — is really the reason why the second law holds, Yunger Halpern said.
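The “whole bunch of mini-second laws” can be made concrete with a standard example from the quantum-resource-theory literature (my illustration; the article excerpt does not spell out the family, so treat the specific choice as an assumption): under a doubly stochastic “noisy” operation, the output distribution is majorized by the input, so every Rényi entropy H_α is non-decreasing — one inequality per α, with the familiar second law as the α = 1 (Shannon/von Neumann) member of the family.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha of a probability vector p (base 2)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):                 # alpha -> 1 limit: Shannon
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1 - alpha))

# A doubly stochastic map only mixes: the output is majorized by the
# input, so *every* H_alpha is non-decreasing, not just alpha = 1.
p_in = np.array([0.7, 0.2, 0.1])
T = np.full((3, 3), 1 / 3) * 0.5 + np.eye(3) * 0.5  # doubly stochastic
p_out = T @ p_in

for alpha in [0.5, 1.0, 2.0, 5.0]:
    assert renyi_entropy(p_out, alpha) >= renyi_entropy(p_in, alpha)
print("all sampled mini-second laws satisfied")
```

Each α gives an independent constraint on which transformations are possible, which is the sense in which the classical second law is a coarse-grained summary of a whole family of inequalities.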

  4. kairosfocus says:

    There is of course an informational school of thermodynamics, with Jaynes as a key voice.
