
Has a recent find brought us closer to understanding why time goes only one way?

[Image: wooden hourglass. Passage of time, image/S. Sepp]

Rob Sheldon comments. But first, in the interests of explaining time’s arrow, researchers go back to the Big Bang:

The researchers have discovered simple, so-called “universal” laws governing the initial stages of change in a variety of systems consisting of many particles that are far from thermal equilibrium. Their calculations indicate that these systems — examples include the hottest plasma ever produced on Earth and the coldest gas, and perhaps also the field of energy that theoretically filled the universe in its first split second — begin to evolve in time in a way described by the same handful of universal numbers, no matter what the systems consist of.

The findings suggest that the initial stages of thermalization play out in a way that’s very different from what comes later. In particular, far-from-equilibrium systems exhibit fractal-like behavior, which means they look very much the same at different spatial and temporal scales. Their properties are shifted only by a so-called “scaling exponent” — and scientists are discovering that these exponents are often simple fractions like 1/2 and −1/3. For example, particles’ speeds at one instant can be rescaled, according to the scaling exponent, to give the distribution of speeds at any time later or earlier. All kinds of quantum systems in various extreme starting conditions seem to fall into this fractal-like pattern, exhibiting universal scaling for a period of time before transitioning to standard thermalization.

Natalie Wolchover, “The Universal Law That Aims Time’s Arrow” at Quanta

The idea is that the direction of time’s arrow was set during a prescaling period.
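To make the rescaling idea concrete, here is a minimal numerical sketch of that kind of universal scaling. The exponents and the Gaussian “scaling function” below are illustrative stand-ins, not values from the papers:

```python
import numpy as np

# Toy dynamical-scaling demo: f(v, t) = s**alpha * f_s(s**beta * v), s = t/t0.
# alpha, beta, and the Gaussian f_s are illustrative assumptions, not the
# exponents or distributions measured in the experiments discussed above.
alpha, beta = 0.5, -1.0 / 3.0
t0 = 1.0

def f_s(x):
    return np.exp(-x**2)      # stand-in "universal" scaling function

def f(v, t):
    s = t / t0
    return s**alpha * f_s(s**beta * v)

# Collapse test: rescaling the distribution observed at any time recovers
# the single function f_s, which is what "particles' speeds at one instant
# can be rescaled to give the distribution at any time later" amounts to.
x = np.linspace(-3.0, 3.0, 7)
for t in (1.0, 2.0, 4.0):
    s = t / t0
    assert np.allclose(s**-alpha * f(s**-beta * x, t), f_s(x))
print("distributions at all times collapse onto one scaling function")
```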

Our physics color commentator Rob Sheldon offers:

People sometimes wonder why physicists think so much about the arrow of time, but I’ll try to guess at their motivations:

a) The usual suspects.

Physicists are an arrogant lot, and part of the schtick is sounding erudite, edgy, and incomprehensible. Time travel, wormholes, multiverses, mirror matter, and time going backward are all good candidates for impressive cocktail conversations. And strangely enough, the public eats it up. Having dispensed with religion and its priests, they are starved for something that sounds quasi-mathematical (to exclude the crystals and incense crowd), and cloak physicists with the vestments of authority. Physicists, for their part, rarely turn down the role.

b) The unusual suspects.

The myth and worldview that we and the universe are an accident (the marriage between the Big Bang and Darwin) has been hard to sustain recently. Too many events in the history of the universe seem designed or caused. If causation can be eliminated, then design can be sidestepped. One way to eliminate causation is to have time go backward. Hence there is a perverse attempt to do away with time, because it supports causation and a Designer. As an added bonus, doing away with causation also does away with morality, sin and guilt.

Hey, it might even lead to time travel, someone is sure to say.

The Long Ascent: Genesis 1–11 in Science & Myth, Volume 1, by Robert Sheldon

Rob Sheldon is the author of The Long Ascent: Genesis 1–11 in Science & Myth

See also: Would backwards time travel unravel spacetime?

Economist: Can time go backwards?

Astrobiologist: Why time travel can’t really work

Carlo Rovelli: Future time travel only a technological problem, not a scientific one. Rovelli: A starship could wait [near a black hole] for half an hour and then move away from the black hole, and find itself millennia in the future.

Rob Sheldon’s thoughts on physicists’ “warped” view of time: An attempt to force complete symmetry on a universe that does not want to be completely symmetrical

At the BBC: Still working on that ol’ time machine… BBC: “But using wormholes for time travel won’t be straightforward.” Indeed not. Unless everything is absolutely determined, some wise person from the future has already gone back through a wormhole and altered the present so that we can’t go anywhere.

Is time travel a science-based idea? (2017)

Apparently, a wormhole is our best bet for a time machine (2013)

and

Does a Time Travel Simulation Resolve the “Grandfather Paradox”?

Follow UD News at Twitter!

Comments
PS: Walker and Davies are also relevant:
In physics, particularly in statistical mechanics, we base many of our calculations on the assumption of metric transitivity, which asserts that a system’s trajectory will eventually [--> given "enough time and search resources"] explore the entirety of its state space – thus everything that is phys-ically possible will eventually happen. It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense. We are left in a bit of a conundrum with respect to the problem of specifying the initial conditions necessary to explain our world. A key point is that if we require specialness in our initial state (such that we observe the current state of the world and not any other state) metric transitivity cannot hold true, as it blurs any dependency on initial conditions – that is, it makes little sense for us to single out any particular state as special by calling it the ’initial’ state. If we instead relax the assumption of metric transitivity (which seems more realistic for many real world physical systems – including life), then our phase space will consist of isolated pocket regions and it is not necessarily possible to get to any other physically possible state (see e.g. Fig. 1 for a cellular automata example).
[--> or, there may not be "enough" time and/or resources for the relevant exploration, i.e. we see the 500 - 1,000 bit complexity threshold at work vs 10^57 - 10^80 atoms with fast rxn rates at about 10^-13 to 10^-15 s leading to inability to explore more than a vanishingly small fraction on the gamut of Sol system or observed cosmos . . . the only actually, credibly observed cosmos]
Thus the initial state must be tuned to be in the region of phase space in which we find ourselves [--> notice, fine tuning], and there are regions of the configuration space our physical universe would be excluded from accessing, even if those states may be equally consistent and permissible under the microscopic laws of physics (starting from a different initial state). Thus according to the standard picture, we require special initial conditions to explain the complexity of the world, but also have a sense that we should not be on a particularly special trajectory to get here (or anywhere else) as it would be a sign of fine–tuning of the initial conditions. [ --> notice, the "loading"] Stated most simply, a potential problem with the way we currently formulate physics is that you can’t necessarily get everywhere from anywhere (see Walker [31] for discussion). ["The “Hard Problem” of Life," June 23, 2016, a discussion by Sara Imari Walker and Paul C.W. Davies at Arxiv.]
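To make the bracketed resource estimate concrete, a minimal order-of-magnitude sketch (the atom count and reaction rate are the figures quoted above; the 10^17 s search time is an added round number of cosmological order):

```python
from math import log2

# Order-of-magnitude search-resource estimate, per the figures above.
atoms_sol = 10**57      # atoms in the Sol system (quoted figure)
rate = 10**14           # fast reaction rate, ~1 event per 10^-14 s
seconds = 10**17        # assumed search time, roughly cosmological order
states_explored = atoms_sol * rate * seconds

print(log2(states_explored))     # ~292 bits' worth of configurations searched
print(2**500 / states_explored)  # ~3e62: a 500-bit space dwarfs the search
```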
Food for thought.

kairosfocus
August 10, 2019 at 4:10 AM PDT
FF, a set of puzzles. Obviously, we inhabit what we call now, which changes in a successive, causally cumulative pattern. Thus we identify past, present (ever advancing), future. In that world, there are entities that, while they may change, have a coherent identity: be-ings. As a part of change, energy flows and does so in a way that gradually dissipates its concentrations, hence time's arrow.

This flows from the statistical properties of phase spaces for systems with large [typ 10^18 - 26 and up] numbers of particles. Thus we come to the overwhelming trend towards clusters of microstates with dominant statistical weight. We can typify by pondering a string of 500 to 1,000 coins and their H/T possibilities. If you want a more "physical" analogue, try a paramagnetic substance with as many elements in a weak B-field that specifies alignment with or against its direction. We then see that overwhelmingly near 50-50 in H/T or W/A dominates in a sharply peaked binomial distribution (a numerical check follows the Orgel quote below).

And, if the pattern of changes is not mechanically forced and/or intelligently guided, the resulting bit patterns overwhelmingly will have no particularly meaningful or configuration-dependent functionally specific, information-rich configuration. Indeed, this is another way of putting the key design inference: that on this sort of statistical analysis, functionally specific, complex organisation and/or associated information (FSCO/I, the relevant subset of CSI) is a highly reliable index of intelligently directed configuration as material causal factor. Where, too, discussion on such a binomial distribution is WLOG, as any state or configuration, in principle -- and using some description language and universal constructor to give effect -- can be specified as a sufficiently long bit string. Trivially, try AutoCAD and NC instructions for machines.

As a result, we see that Orgel is right:
living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . . [HT, Mung, fr. p. 190 & 196:] These vague ideas can be made more precise by introducing the idea of information. Roughly speaking, the information content of a structure is the minimum number of instructions needed to specify the structure.
[--> this is of course equivalent to the string of yes/no questions required to specify the relevant J S Wicken "wiring diagram" for the set of functional states, T, in the much larger space of possible clumped or scattered configurations, W, as Dembski would go on to define in NFL in 2002, also cf here, -- here and -- here -- (with here on self-moved agents as designing causes).]
One can see intuitively that many instructions are needed to specify a complex structure. [--> so if the q's to be answered are Y/N, the chain length is an information measure that indicates complexity in bits . . . ] On the other hand a simple repeating structure can be specified in rather few instructions.  [--> do once and repeat over and over in a loop . . . ] Complex but random structures, by definition, need hardly be specified at all . . . . Paley was right to emphasize the need for special explanations of the existence of objects with high information content, for they cannot be formed in nonevolutionary, inorganic processes [--> Orgel had high hopes for what Chem evo and body-plan evo could do by way of info generation beyond the FSCO/I threshold, 500 - 1,000 bits.] [The Origins of Life (John Wiley, 1973), p. 189, p. 190, p. 196.]
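As promised above, a quick numerical check of how sharply peaked the 500-coin binomial distribution is (a minimal sketch; the +/-10-head window is an arbitrary illustrative choice):

```python
from math import comb

n = 500
# Probability that 500 fair coins land within +/-10 heads of the 250 mean:
near_half = sum(comb(n, k) for k in range(240, 261)) / 2**n
print(near_half)   # ~0.63: a narrow band about 50-50 dominates the space
# Versus the odds of hitting any one specified 500-bit configuration:
print(1 / 2**n)    # ~3.1e-151
```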
Thus, we come to the heart of why the design inference on FSCO/I is so reliable: trillions of observed cases of its origin; uniformly, caused by intelligently directed configuration. So, the design inference has underpinnings in the informational view of statistical thermodynamics, on which entropy is seen as a metric of the average missing information to specify the particular microstate, given only the macro-observable state. Thus, too, the degree of randomness of a particular configuration is strongly tied to required "minimum" description length. A truly random configuration essentially has to be quoted to describe it. A highly orderly (so, repetitive) one can be specified as: unit cell, repeated n times in some array. Information-rich descriptions are intermediate and are meaningful in some language: language is naturally somewhat compressible but is not reducible to repeating unit cells. Consequently, the design inference on FSCO/I is robust.

We also recognise a space-time matrix and presence of fields, so that there is a context of interactions of beings in the common world. However, simultaneity turns out to be an observer-relative phenomenon, and energy and information face a speed limit, c. Further, massive bodies of planetary, stellar or galactic scale sufficiently warp the space-time fabric to have material effects. Along the way, the micro scale is quantised, leading also to a world of virtual particles below the Einstein energy-time uncertainty threshold that shapes field interactions and may have macro effects up to the level of cosmic inflation and even bubbles expanding into sub-cosmi.

There are also issues of rationally free mind rising above what computing substrates can do as dynamic-stochastic systems, interacting with brains through quantum influences. Reality is complex yet somehow unified -- the first metaphysical problem contemplated as a philosophical general issue in our civilisation, the one and the many. And that in turn points to the importance of the world-root that springs up into the causal-temporal, successive, cumulative order that we inhabit. KF

kairosfocus
August 10, 2019 at 3:54 AM PDT
FourFaces @4: "When will this crackpottery disappear from physics?" Evolutionists cling to a crackpot theory and nothing seems to deter them. As long as physicists want to be science fiction writers, you will have physicists pushing for its existence.

BobRyan
August 8, 2019 at 11:26 PM PDT
It's always frustrating to read articles on time's arrow or time travel. In one camp, we have the Star Trek physics fanatics who believe in time travel in any direction. In the other camp, we have those who believe only in travel toward the future. But both camps are wrong. It is logically impossible for time to change at all, in any direction. We are always in the present, a continually changing present. This is easy to prove. Changing time is self-referential. Changing time (time travel) would require a velocity in time, which would have to be given as v = dt/dt = 1, which is of course nonsense. That's it. This is the only proof you need. And this is the reason that nothing can move in spacetime and why Karl Popper called spacetime "Einstein's block universe in which nothing happens." When will this crackpottery disappear from physics?

FourFaces
August 8, 2019 at 7:06 PM PDT
I would add to the motivation the desire to be philosophers, rather than physicists. Darwinists are easily drawn to the realm of fantasy.

BobRyan
August 8, 2019 at 5:22 AM PDT
Atheistic materialists have tried to get around the Quantum Zeno effect by postulating that interactions with the environment are sufficient to explain it.
Perspectives on the quantum Zeno paradox - 2018 Excerpt: The references to observations and to wavefunction collapse tend to raise unnecessary questions related to the interpretation of quantum mechanics. Actually, all that is required is that some interaction with an external system disturb the unitary evolution of the quantum system in a way that is effectively like a projection operator. https://iopscience.iop.org/article/10.1088/1742-6596/196/1/012018/pdf
Yet, the following interaction-free measurement of the Quantum Zeno effect demonstrated that the presence of the Quantum Zeno effect can be detected without interacting with a single atom.
Interaction-free measurements by quantum Zeno stabilization of ultracold atoms – 14 April 2015 Excerpt: In our experiments, we employ an ultracold gas in an unstable spin configuration, which can undergo a rapid decay. The object—realized by a laser beam—prevents this decay because of the indirect quantum Zeno effect and thus, its presence can be detected without interacting with a single atom. http://www.nature.com/ncomms/2015/150414/ncomms7811/full/ncomms7811.html?WT.ec_id=NCOMMS-20150415
In short, the Quantum Zeno effect, regardless of how atheistic materialists may feel about it, is experimentally shown to be a real effect that is not reducible to any materialistic explanation. And thus the original wikipedia statement, “an unstable particle, if observed continuously, will never decay”, stands as a true statement. In fact, in order to further falsify the atheist’s belief that the observer is unnecessary to explain the Quantum Zeno effect, advances in quantum information theory have now shown that, as the following article states, “an object's entropy is always dependent on the observer”.
Quantum knowledge cools computers: New understanding of entropy - June 1, 2011 Excerpt: Recent research by a team of physicists,,, describe,,, how the deletion of data, under certain conditions, can create a cooling effect instead of generating heat. The cooling effect appears when the strange quantum phenomenon of entanglement is invoked.,,, The new study revisits Landauer's principle for cases when the values of the bits to be deleted may be known. When the memory content is known, it should be possible to delete the bits in such a manner that it is theoretically possible to re-create them. It has previously been shown that such reversible deletion would generate no heat. In the new paper, the researchers go a step further. They show that when the bits to be deleted are quantum-mechanically entangled with the state of an observer, then the observer could even withdraw heat from the system while deleting the bits. Entanglement links the observer's state to that of the computer in such a way that they know more about the memory than is possible in classical physics.,,, In measuring entropy, one should bear in mind that an object does not have a certain amount of entropy per se, instead an object's entropy is always dependent on the observer. Applied to the example of deleting data, this means that if two individuals delete data in a memory and one has more knowledge of this data, she perceives the memory to have lower entropy and can then delete the memory using less energy.,,, No heat, even a cooling effect; In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that "more than complete knowledge" from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, "This doesn't mean that we can develop a perpetual motion machine." The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what's known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says "We're working on the edge of the second law. If you go any further, you will break it." http://www.sciencedaily.com/releases/2011/06/110601134300.htm
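For scale, the classical Landauer bound that the researchers revisit is a one-line computation (room temperature is an assumed example value; the study above concerns when entanglement lets deletion beat this bound):

```python
from math import log

# Landauer's principle: erasing one bit classically costs at least
# Q = k_B * T * ln(2) of heat.
k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0             # assumed room temperature, K
print(k_B * T * log(2))   # ~2.87e-21 J per erased bit
```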
And as the following 2017 article states: James Clerk Maxwell (said), “The idea of dissipation of energy depends on the extent of our knowledge.”,,, quantum information theory,,, describes the spread of information through quantum systems.,,, Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in (quantum) information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”,,,
The Quantum Thermodynamics Revolution – May 2017 Excerpt: the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.” In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory — “a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology — a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year — is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply. They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information — the abstract 1s and 0s by which physical states are distinguished and knowledge is measured.,,, Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in (quantum) information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”,,, https://www.quantamagazine.org/quantum-thermodynamics-revolution/
Again to repeat that last sentence, “Now in (quantum) information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”,,, This statement is just fascinating! Why in blue blazes should the finely tuned entropic actions of the universe, entropic actions which happen to explain the origin of time itself, even care if I am consciously observing them unless 'the observer’ really is more foundational to reality than the finely tuned 1 in 10^10^123 entropy of the universe is? To state the obvious, this finding of entropy being “a property of an observer who describes a system.” is very friendly to a Mind First, and/or to a Theistic view of reality. For instance,,,
Romans 8:20-21 For the creation was subjected to frustration, not by its own choice, but by the will of the one who subjected it, in hope that the creation itself will be liberated from its bondage to decay and brought into the glorious freedom of the children of God. “We have the sober scientific certainty that the heavens and earth shall ‘wax old as doth a garment’…. Dark indeed would be the prospects of the human race if unilluminated by that light which reveals ‘new heavens and a new earth.’” Sir William Thomson, Lord Kelvin (1824 – 1907) – pioneer in many different fields, particularly electromagnetism and thermodynamics. Psalm 102:25-27 Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end.
Thus, all in all, this finding that even the supposed undirected random aftermath of particle collisions is instead found to be governed by a ‘few numbers’ that reflect ‘structure and simplicity’, as well as the recent finding in quantum information theory, (i.e. “Now in (quantum) information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”),,, those recent findings in science should give the atheist, who presupposes undirected randomness to be the creator of all things, severe pause for his belief in undirected randomness. Indeed, if an atheist were ever inclined to be honest with himself, these recent findings should be more than enough cause for the atheist to abandon his worldview altogether and to become a Theist, even to become a Christian Theist.

bornagain77
August 8, 2019 at 4:30 AM PDT
as to:
The Universal Law That Aims Time’s Arrow Excerpt: The ideas about the early universe aren’t easily testable. But around 2012, the researchers realized that a far-from-equilibrium scenario also arises in experiments — namely, when heavy atomic nuclei are smashed together at nearly the speed of light in the Relativistic Heavy Ion Collider in New York and in Europe’s Large Hadron Collider. These nuclear collisions create extreme configurations of matter and energy, which then start to relax toward equilibrium. You might think the collisions would produce a complicated mess. But when Berges and his colleagues analyzed the collisions theoretically, they found structure and simplicity. The dynamics, Berges said, “can be encoded in a few numbers.” The pattern continued. Around 2015, after talking to experimentalists who were probing ultracold atomic gases in the lab, Berges, Gasenzer and other theorists calculated that these systems should also exhibit universal scaling after being rapidly cooled to conditions extremely far from equilibrium. Last fall, two groups — one led by Markus Oberthaler of Heidelberg and the other by Jörg Schmiedmayer of the Vienna Center for Quantum Science and Technology — reported simultaneously in Nature that they had observed fractal-like universal scaling in the way various properties of the 100,000-or-so atoms in their gases changed over space and time. “Again, simplicity occurs,” said Berges, who was one of the first to predict the phenomenon in such systems. “You can see that the dynamics can be described by a few scaling exponents and universal scaling functions. And some of them turned out to be the same as what was predicted for particles in the early universe. That’s the universality.” The researchers now believe that the universal scaling phenomenon occurs at the nanokelvin scale of ultracold atoms, the 10-trillion-kelvin scale of nuclear collisions, and the 10,000-trillion-trillion-kelvin scale of the early universe. “That’s the point of universality — that you can expect to see these phenomena on different energy and length scales,” Berges said.,,, Prescaling suggests that when a system first evolves from its initial, far-from-equilibrium condition, scaling exponents don’t yet perfectly describe it. ,,, If this idea is borne out by future experiments, prescaling may be the nocking of time’s arrow onto the bowstring. https://www.quantamagazine.org/the-universal-law-that-aims-times-arrow-20190801/
So even the supposed random aftermath of particle collisions is instead found to be governed by a ‘few numbers’ that reflect structure and simplicity. To repeat:
You might think the collisions would produce a complicated mess. But when Berges and his colleagues analyzed the collisions theoretically, they found structure and simplicity. The dynamics, Berges said, “can be encoded in a few numbers.”
As well, in the article they admitted that they don’t know where these ‘few numbers’ that govern the seemingly ‘complicated mess’ of particle collisions, etc., came from. To repeat:
“Prescaling suggests that when a system first evolves from its initial, far-from-equilibrium condition, scaling exponents don’t yet perfectly describe it. ,,, If this idea is borne out by future experiments, prescaling may be the nocking of time’s arrow onto the bowstring.”
Exactly “Who” nocked “time’s arrow onto the bowstring”? The fact that even the aftermath of particle collisions is not random, but is governed by ‘a few numbers’ that reflect ‘structure and simplicity’, should, if atheists were ever inclined to question the validity of their worldview, give the atheist severe pause about his presupposition that undirected randomness is the creator of all things. This finding reminds me of the following. Ludwig Boltzmann, an atheist, when he linked entropy and probability, did not, as Max Planck, a Christian Theist, points out in the following link, think to look for a constant for entropy:
The Austrian physicist Ludwig Boltzmann first linked entropy and probability in 1877. However, the equation as shown [S = k log W], involving a specific constant, was first written down by Max Planck, the father of quantum mechanics, in 1900. In his 1918 Nobel Prize lecture, Planck said: “This constant is often referred to as Boltzmann's constant, although, to my knowledge, Boltzmann himself never introduced it – a peculiar state of affairs, which can be explained by the fact that Boltzmann, as appears from his occasional utterances, never gave thought to the possibility of carrying out an exact measurement of the constant. Nothing can better illustrate the positive and hectic pace of progress which the art of experimenters has made over the past twenty years, than the fact that since that time, not only one, but a great number of methods have been discovered for measuring the mass of a molecule with practically the same accuracy as that attained for a planet.” http://www.daviddarling.info/encyclopedia/B/Boltzmann_equation.html
I hold that the primary reason why Boltzmann, an atheist, never thought to carry out, or even propose, a precise measurement for the constant on entropy is that he, as an atheist, thought he had arrived at the ultimate ‘random’ explanation for how everything in the universe operates when he linked probability with entropy. i.e. In linking entropy with probability, Boltzmann, again an atheist, thought he had reduced everything that happens in the universe to a ‘random’ chance basis. To him, as an atheist, I hold that it would simply be unfathomable for him to conceive that even the ‘random chance’ (probabilistic) events of entropy in the universe should ever be constrained by a constant that would limit the effects of ‘random’ entropic events of the universe. Whereas on the contrary, to a Christian Theist such as Planck, it is expected that even these seemingly random entropic events of the universe should be bounded by a constant. In fact, modern science was born precisely out of such thinking:
‘Men became scientific because they expected Law in Nature, and they expected Law in Nature because they believed in a Legislator. In most modern scientists this belief has died: it will be interesting to see how long their confidence in uniformity survives it. Two significant developments have already appeared—the hypothesis of a lawless sub-nature, and the surrender of the claim that science is true.’ Lewis, C.S., Miracles: a preliminary study, Collins, London, p. 110, 1947.
As to the initial entropy of the universe (the researchers basically conceded that they don’t know how the initial state of entropy for the universe was arrived at), entropy is, by a wide margin, the most finely tuned of the initial conditions of the Big Bang. It is finely tuned to an almost incomprehensible degree of precision, 1 part in 10 to the 10 to the 123rd power. As Roger Penrose himself stated, “This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123.”
“This now tells us how precise the Creator’s aim must have been: namely to an accuracy of one part in 10^10^123.” Roger Penrose - How special was the big bang? – (from the Emperor’s New Mind, Penrose, pp 339-345 – 1989) “The time-asymmetry is fundamentally connected to with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the “source” of the Second Law (Entropy).” Roger Penrose - The Physics of the Small and Large: What is the Bridge Between Them?
In the following video, Dr. Bruce Gordon touches upon just how enormous that number truly is. Dr. Gordon states, “you would need a hundred million, trillion, trillion, trillion universes our size, with a zero on every proton and neutron in all of those universes, just to write out this number. That is how fine tuned the initial entropy of our universe is.”
“An explosion you think of as kind of a messy event. And this is the point about entropy. The explosion in which our universe began was not a messy event. And if you talk about how messy it could have been, this is what the Penrose calculation is all about essentially. It looks at the observed statistical entropy in our universe. The entropy per baryon. And he calculates that out and he arrives at a certain figure. And then he calculates, using the Bekenstein-Hawking formula for Black-Hole entropy, what sort of entropy could have been associated with the singularity that would have constituted the beginning of the universe. So you've got the numerator, the observed entropy, and the denominator, how big it (the entropy) could have been. And that fraction turns out to be 1 over 10 to the 10 to the 123rd power. Let me just emphasize how big that denominator is so you can gain a real appreciation for how small that probability is. So there are 10^80th baryons in the universe. Protons and neutrons. Now suppose we put a zero on every one of those. OK, how many zeros is that? That is 10^80th zeros. This number has 10^123rd zeros. OK, so you would need a hundred million, trillion, trillion, trillion universes our size, with a zero on every proton and neutron in all of those universes, just to write out this number. That is how fine tuned the initial entropy of our universe is. And if there were a pre-Big Bang state and you had some bounces, then that fine tuning (for entropy) gets even finer as you go backwards, if you can even imagine such a thing.” Dr. Bruce Gordon - Contemporary Physics and God, Part 2 - video, 1:50 minute mark: https://youtu.be/ff_sNyGNSko?t=110
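Gordon’s bookkeeping can be sanity-checked with integer arithmetic (the 10^80 baryon count is the figure quoted above):

```python
# Writing out 10^(10^123) in decimal needs 10^123 zeros; compare that with
# the ~10^80 baryons available per universe-sized sheet of "paper".
digits_needed = 10**123
baryons_per_universe = 10**80
print(digits_needed // baryons_per_universe)  # 10^43 universes' worth of baryons
```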
Moreover, it is said that “Entropy explains time; it explains every possible action in the universe;,,”,, “Even gravity,,,, can be expressed as a consequence of the law of entropy.,,,”
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy.,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
In fact, entropy is also the primary reason why our own material, temporal bodies grow old and eventually die in this universe,,,
Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220 Dr. John Sanford Lecture at NIH (National Institute of Health): Genetic Entropy - Can Genome Degradation be Stopped? https://www.youtube.com/watch?v=2Mfn2upw-O8
And yet, although entropy "explains every possible action in the universe" and is the primary reason our material bodies grow old and die in this universe, it is interesting to note the Quantum Zeno effect's very counter-intuitive relation to entropy. An old entry in wikipedia described the Quantum Zeno effect as such: “an unstable particle, if observed continuously, will never decay.”
Perspectives on the quantum Zeno paradox - 2018 The quantum Zeno effect is,, an unstable particle, if observed continuously, will never decay. https://iopscience.iop.org/article/10.1088/1742-6596/196/1/012018/pdf
Likewise, the present day entry on wikipedia about the Quantum Zeno effect also provocatively states that "a system can't change while you are watching it"
Quantum Zeno effect Excerpt: Sometimes this effect is interpreted as "a system can't change while you are watching it" https://en.wikipedia.org/wiki/Quantum_Zeno_effect
And here is a fairly recent experiment which verified the ‘Quantum Zeno Effect’:
'Zeno effect' verified—atoms won't move while you watch - October 23, 2015 Excerpt: Graduate students,, created and cooled a gas of about a billion Rubidium atoms inside a vacuum chamber and suspended the mass between laser beams.,,, In that state the atoms arrange in an orderly lattice just as they would in a crystalline solid.,But at such low temperatures, the atoms can "tunnel" from place to place in the lattice.,,, The researchers demonstrated that they were able to suppress quantum tunneling merely by observing the atoms.,,, The researchers observed the atoms under a microscope by illuminating them with a separate imaging laser. A light microscope can't see individual atoms, but the imaging laser causes them to fluoresce, and the microscope captured the flashes of light. When the imaging laser was off, or turned on only dimly, the atoms tunneled freely. But as the imaging beam was made brighter and measurements made more frequently, the tunneling reduced dramatically.,,, The experiments were made possible by the group's invention of a novel imaging technique that made it possible to observe ultracold atoms while leaving them in the same quantum state.,,, The popular press has drawn a parallel of this work with the "weeping angels" depicted in the Dr. Who television series – alien creatures who look like statues and can't move as long as you're looking at them. There may be some sense to that. In the quantum world, the folk wisdom really is true: "A watched pot never boils." http://phys.org/news/2015-10-zeno-effect-verifiedatoms-wont.html
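The “watched pot” behaviour has a simple textbook caricature: project a two-level system back onto its initial state N times during one flip, and the survival probability climbs toward 1. A toy sketch (illustrative parameters, not those of the Rubidium experiment above):

```python
import numpy as np

# Toy quantum Zeno model: a two-level system Rabi-flipping at frequency
# omega, projected back onto its initial state N times during one flip.
omega = 1.0
T = np.pi / omega   # an unmeasured system has fully flipped by time T

for N in (1, 10, 100, 1000):
    dt = T / N
    p_step = np.cos(omega * dt / 2) ** 2   # survival probability per interval
    print(N, p_step ** N)   # ~0, 0.78, 0.976, 0.998: watching freezes the flip
```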
bornagain77
August 8, 2019 at 4:29 AM PDT
