Astronomy News

Universe is accelerating – not so fast?

From Eurekalert:

The team, led by UA astronomer Peter A. Milne, discovered that type Ia supernovae, which have been considered so uniform that cosmologists have used them as cosmic “beacons” to plumb the depths of the universe, actually fall into different populations. The findings are analogous to sampling a selection of 100-watt light bulbs at the hardware store and discovering that they vary in brightness.

“We found that the differences are not random, but lead to separating Ia supernovae into two groups, where the group that is in the minority near us are in the majority at large distances — and thus when the universe was younger,” said Milne, an associate astronomer with the UA’s Department of Astronomy and Steward Observatory. “There are different populations out there, and they have not been recognized. The big assumption has been that as you go from near to far, type Ia supernovae are the same. That doesn’t appear to be the case.”

The discovery casts new light on the currently accepted view of the universe expanding at a faster and faster rate, pulled apart by a poorly understood force called dark energy. This view is based on observations that resulted in the 2011 Nobel Prize for Physics awarded to three scientists, including UA alumnus Brian P. Schmidt. More.

Rob Sheldon kindly writes to say,

The 2011 Nobel Prize in Physics was awarded to Perlmutter, Schmidt and Riess for “proving” that the universe is accelerating. Since the Big Bang happened 13.7 billion years ago, the only thing that might account for this acceleration was a larger volume. Another way to say it is that if spacetime has a pressure, then the bigger it got, the more it expanded. Yet another way to view this term is as an “anti-gravity” term that balances gravity. That’s why Einstein put it into his equation: he wanted the universe to be unchanging, eternal and static. De Sitter proved that this solution was unstable, collapsing at the slightest perturbation, and Hubble showed that the galaxies were moving away from each other, i.e., not static at all. Einstein removed the term, supposedly calling it “his greatest mistake.”

But by the 1970s, theorists were putting it back in. For one thing, the galaxies were distributed unevenly throughout the sky like a lace tablecloth, and no one knew why there were these gigantic holes or voids with no galaxies. Only by putting some “anti-gravity” back in was it possible to simulate the voids in the cosmology models.

Therefore, when Perlmutter and colleagues used “calibrated” Type Ia supernovae to estimate distances (since every SNIa was supposed to have the same intrinsic brightness), they could map distance (redshift) against brightness and see what the universe had been doing. Sure enough, the most distant, red-shifted galaxies were also the faintest, by <1% or so, and Perlmutter argued that this could only be caused by the acceleration of the universe and anti-gravity.
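
The standard-candle logic Sheldon describes can be sketched numerically. The snippet below is a minimal illustration, not anyone's actual pipeline: the observed magnitude (m = 24.0) is made up, and M = -19.3 is a commonly cited ballpark for an SNIa peak absolute magnitude. It inverts the distance modulus m − M = 5·log10(d/10 pc) and shows how a small error in the assumed intrinsic brightness biases the inferred distance:

```python
def luminosity_distance_mpc(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc); return d in Mpc."""
    parsecs = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return parsecs / 1e6

# Illustrative assumptions only: a made-up observed magnitude and a
# commonly quoted SNIa peak absolute magnitude.
M_standard = -19.3
m_observed = 24.0

# Distance inferred if every SNIa really had the standard brightness.
d_assumed = luminosity_distance_mpc(m_observed, M_standard)

# If this supernova is actually intrinsically fainter by 0.02 mag (~2% in flux),
# the correct distance is smaller; assuming the standard M overestimates it,
# and that excess apparent dimming is what gets read as extra acceleration.
d_corrected = luminosity_distance_mpc(m_observed, M_standard + 0.02)

print(d_assumed, d_corrected, d_corrected / d_assumed)
```

A 0.02 mag offset shifts the inferred distance by roughly 1%, which is the scale of effect at issue in the supernova debate.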

Now many people asked: What about dust? What about variations in SNIa brightness? What about magnetic fields? Maybe stars were less “metallic” in the distant past, and this made them fainter? “No,” said Perlmutter, “we took all that into account and it doesn’t explain it. The only alternative is anti-gravity.”

This is argument by exhaustion. This is why global warming is true: “We took into account all other natural forms of warming, so the remainder must be anthropogenic.” This is why Darwin is true: “We took into account all other explanations of change over time, and the remainder must be evolution.” And this is why there are no man-eating tigers in my town: I have a lucky charm that prevents them from coming within 10 miles of me.

Anyway, a decade or so after Perlmutter’s paper was peer reviewed and the Nobel Prize was awarded, we now have evidence that not all SNIa are created equal: some are intrinsically brighter than others, and the ones he was looking at were intrinsically fainter.

Thoughts?

Note: Man-eating tigers in the Ottawa, Canada region had better have several really heavy winter coats of hair. Nothing else will save them.

See also: Dark Energy Mission

8 Replies to “Universe is accelerating – not so fast?”

  1. bornagain77 says:

    I wonder if Dr. Sheldon would be willing to comment on what effect, if any, this finding will have on the 1 in 10^120 fine-tuning of the cosmological constant.

    Hugh Ross PhD. – Scientific Evidence For Cosmological Constant (1 in 10^120 Expansion Of The Universe)
    http://www.metacafe.com/watch/4347218/

    Here are the 9 lines of evidence that Dr. Ross mentioned in the preceding video

    Accumulating Evidence for Dark Energy and Supernatural Design – 2011
    Excerpt: I (Hugh Ross) often refer to nine different lines of observational evidence that establish dark energy’s reality and dominance in my talks. These nine are:
    1.radial velocities of type Ia supernovae;
    2.WMAP of the cosmic microwave background radiation (CMBR);
    3.ground-based measures of the CMBR;
    4.Sloan Digital Sky Survey of galaxies and galaxy clusters;
    5.Two-Degree Field Survey of galaxies;
    6.gravitational lens measurements of distant galaxies and quasars;
    7.distributions of radio galaxies;
    8.galaxy velocity distributions; and
    9.x-ray emissions from galaxy clusters.

    In the last several years, astronomers have added seven additional lines of observational evidence confirming the reality of the finely tuned cosmological constant, bringing the total to sixteen. These seven are:

    10.Lyman-alpha forest measurements;
    11.polarization measures of the cosmic microwave background radiation;
    12.stellar ages;
    13.cosmic inhomogeneities;
    14.gamma-ray bursts;
    15.evolution of galaxy clustering; and
    16.galaxy cluster angular size measurements.
    http://www.reasons.org/article.....y-articles

    Here is the paper from the atheistic astrophysicists, that Dr. Ross referenced in the preceding video, that speaks of the ‘disturbing implications’ of the finely tuned expanding universe (1 in 10^120 cosmological constant):

    Disturbing Implications of a Cosmological Constant – Dyson, Kleban, Susskind (each a self-proclaimed atheist) – 2002
    Excerpt: “Arranging the universe as we think it is arranged would have required a miracle.,,,”
    “The question then is whether the origin of the universe can be a naturally occurring fluctuation, or must it be due to an external agent which starts the system out in a specific low entropy state?”
    page 19: “An unknown agent [external to time and space] intervened [in cosmic history] for reasons of its own.,,,”
    Page 21 “The only reasonable conclusion is that we don’t live in a universe with a true cosmological constant”.
    http://arxiv.org/pdf/hep-th/0208013.pdf

    Besides the evidence that Dr. Ross listed for the 1 in 10^120 finely tuned expansion of the universe, the following paper clearly indicates that we do live in a universe with a ‘true cosmological constant’, a cosmological constant that is not reducible to a materialistic basis. Thus, the atheistic astrophysicists are at a complete loss to explain why the universe expands in such a finely tuned way, whereas Theists are vindicated once again in their belief that the universal constants are truly transcendent of any possible materialistic explanation!

    Dark energy alternatives to Einstein are running out of room – January 9, 2013
    Excerpt: Last month, a group of European astronomers, using a massive radio telescope in Germany, made the most accurate measurement of the proton-to-electron mass ratio ever accomplished and found that there has been no change in the ratio to one part in 10 million at a time when the universe was about half its current age, around 7 billion years ago. When Thompson put this new measurement into his calculations, he found that it excluded almost all of the dark energy models using the commonly expected values or parameters.
    If the parameter space or range of values is equated to a football field, then almost the whole field is out of bounds except for a single 2-inch by 2-inch patch at one corner of the field. In fact, most of the allowed values are not even on the field. “In effect, the dark energy theories have been playing on the wrong field,” Thompson said. “The 2-inch square does contain the area that corresponds to no change in the fundamental constants, (a ‘true cosmological constant’), and that is exactly where Einstein stands.”
    http://phys.org/news/2013-01-d.....-room.html

    Also of note, the following paper finds that the proton-electron mass ratio does not vary in strong gravitational fields either:

    Physical constant is constant even in strong gravitational fields – Sep 19, 2014
    Excerpt: An international team of physicists has shown that the mass ratio between protons and electrons is the same in weak and in very strong gravitational fields.,,,
    The idea that the laws of physics and its fundamental constants do not depend on local circumstances is called the equivalence principle. This principle is a cornerstone to Einstein’s theory of general relativity.,,,
    The researchers compared the proton-electron mass ratio near the surface of a white dwarf star to the mass ratio in a laboratory on Earth. White dwarf stars, which are in a late stage of their life cycle, have collapsed to less than 1% of their original size. The gravitational field at the surface of these stars is therefore much larger than that on Earth, by a factor of 10,000. The physicists concluded that even under these strong gravitational conditions, the proton-electron mass ratio is the same within a margin of 0.005%. In both cases, the proton mass is 1836.152672 times as big as the electron mass.,,,
    http://phys.org/news/2014-09-p.....ields.html

    Here are the verses from the Bible which Dr. Ross listed, which were written well over 2000 years before the discovery of the finely tuned expansion of the universe, that speak of God ‘Stretching out the Heavens’; Job 9:8; Isaiah 40:22; Isaiah 44:24; Isaiah 48:13; Zechariah 12:1; Psalm 104:2; Isaiah 42:5; Isaiah 45:12; Isaiah 51:13; Jeremiah 51:15; Jeremiah 10:12. The following verse is my favorite out of the group of verses:

    Job 9:8
    He alone stretches out the heavens and treads on the waves of the sea.

    The Truman Show – Truman walking on water – screenshot picture
    http://gaowsh.files.wordpress......0-pm-2.jpg

  2. ppolish says:

    Less dark energy (ie smaller cosmo constant), more dark matter, same normal matter. Although I’ve put on a couple pounds.

  3. Rob Sheldon says:

    BA@1
    Hugh Ross, before he started his apologetics group, was a post-doc in astronomy at CalTech. Despite his conversion at a church there, astronomy was his first love, and we all know how significant that is. So I would assume that much of his astronomical statements come from identifying and approving of the “standard science” of the astronomy community. That is, one can find outliers like Sir Fred Hoyle or the Burbidges, but these are not the people Ross identifies with. For whatever reason, I’m an eternal outsider, so frankly I identify more with Sir Fred than, say, with Baron Martin Rees. (Even Hoyle saw himself as a quixotic knight compared to the establishment Rees.)

    So that addresses the psychology of why Ross might prefer the “standard” Lambda-CDM model. If we were discussing the physics, I would argue that the current spate of computer models have far too many adjustable dials in them to be metaphysically convincing. For reasons that Peter Woit discusses in his book “Not Even Wrong”, the various branches of physics have had far too much money thrown at hastily constructed models. Unlike the Manhattan Project, which also had a lot of money thrown at it, the current crop of models has little data, and nothing as momentous as winning a war with Japan to motivate it. As a consequence, physicists are swayed by prestige and grant money to buy into an establishment “model” which often has little in the way of data to support it. In case you think I’m exaggerating: Global Climate Models have received $1 billion a year for the past 20 years; LIGO, a $1bn set of gravitational-wave observatories, is now trying to get funding for a satellite; and CERN just spent $10bn finding the Higgs boson (a booby prize for not finding SUSY particles).

    With those kinds of money pots, it is no wonder that science gets derailed, and I am 99.9% sure that this has happened to the Lambda-CDM “standard” cosmology model.

    So what do I make of the 10^120 mistake in estimating dark energy? Well, since I don’t believe in dark energy, the mistake is not quantitative, but qualitative; it’s not a gazillion percent wrong, it’s infinitely wrong. Or to borrow Pauli’s phrase, “it isn’t even wrong”.

  4. bornagain77 says:

    as to my question in post 1:

    As long as the universe is accelerating at all, it appears that this finding will have no effect on the 1 in 10^120 fine-tuning for the cosmological constant, since the accelerating expansion of the universe only ‘implies’ (i.e. empirically verifies) a slight positive value for the cosmological constant and is not the means by which the fine-tuning is derived.

    What is the cosmological constant paradox, and what is its significance? David H. Bailey – 1 Jan 2015
    Excerpt: Perhaps the most startling “cosmic coincidence” that modern scientists have noted in the structure of our universe is the fine-tuning of the cosmological constant [Vilenkin2006, pg. 121-126]. The paradox derives from the fact that when one calculates, based on known principles of quantum mechanics, the “zero-point mass density” or the “vacuum energy density” of the universe, focusing for the time being on the electromagnetic force, one obtains the incredible result that empty space “weighs” 10^93 grams per cc. The actual average mass density of the universe is 10^-28 grams per cc [Susskind2005, pg. 70-78], which is roughly 120 orders of magnitude lower than the predicted value. As Stephen Hawking has quipped, this is arguably the most spectacular failure of a physical theory in history [Davies2007, pg. 147]. The “cosmological constant” of Einstein’s general relativity equations is linearly related to the zero-point mass density. Einstein originally posited a nonzero value for the cosmological constant, but after the expansion of the universe was discovered, he lamented that this was his greatest blunder and set the constant to zero [Davies2007, pg. 58].
    Physicists, who have fretted over this paradox for decades, have noted that calculations such as the above involve only the electromagnetic force, and so perhaps when the contributions of the other known forces are included (bosons give rise to positive terms, whereas fermions give rise to negative terms), all terms will cancel out to exactly zero, as a consequence of some unknown, yet-to-be-discovered fundamental principle of physics. When “supersymmetry” was theorized in the 1970s, it was thought that it would meet this requirement, but when it was later discovered that our universe is not precisely supersymmetric, this explanation was abandoned. In any event, until recently physicists remained hopeful that some yet-to-be-discovered principle would imply that the positive and negative terms of the zero-point mass density (and thus the cosmological constant) precisely cancel out to zero.
    These hopes were shattered with the 1998 discovery that the expansion of the universe is accelerating, which implies that the cosmological constant (and the zero-point mass density) must be slightly nonzero. This “dark energy,” which is the unknown force accelerating the universe, also appears to be just what is needed to fill the 70% “missing mass” of the universe, namely the mass needed to explain the observed fact that space is very nearly flat (i.e., locally it appears to be almost perfectly rectilinear) [Panek2011]. But this means that physicists are left to explain the startling fact that the positive and negative contributions to the cosmological constant cancel to 120-digit accuracy, yet fail to cancel beginning at the 121st digit. This is an even stranger paradox! Curiously, this observation is in accord with a prediction made by physicist Steven Weinberg in 1987, who argued from basic principles that the cosmological constant must be zero to within one part in roughly 10^120, or else the universe either would have dispersed too fast for stars and galaxies to have formed, or else would have recollapsed upon itself long ago [Susskind2005, pg. 80-82].,,,
    In short, the recent discovery of the accelerating expansion of the universe and the implied slightly positive value of the cosmological constant constitutes, in the words of physicist Leonard Susskind (who is an atheist), a “cataclysm,” a “stunning reversal of fortunes” [Susskind2005, pg., 22, 154]. It is literally shaking the entire field of theoretical physics, astronomy and cosmology to its foundations.,,,
    http://www.sciencemeetsreligio.....nstant.php
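
[The “roughly 120 orders of magnitude” in the excerpt above is just the gap between the two densities it quotes, as a quick check shows; the density values are taken directly from the excerpt, not independently verified here:]

```python
import math

# Densities quoted in the excerpt above, in grams per cubic centimetre:
predicted = 1e93   # naive quantum "vacuum energy" (zero-point) estimate
observed = 1e-28   # measured average mass density of the universe

gap_in_orders = math.log10(predicted / observed)
print(gap_in_orders)  # ~121, i.e. the "roughly 120 orders of magnitude"
```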

  5. bornagain77 says:

    Thanks, Dr. Sheldon, for your answer at 3. As you can see from my reference at 4 (while I don’t personally believe in ‘Dark Energy’ either), I believe the 1 in 10^120 fine-tuning holds for the cosmological constant as long as any acceleration of the expansion of the universe is confirmed to any degree.

    Perhaps, if I read him right, it even holds without any acceleration?!?

    I’m not versed enough in the issue to say for sure whether the fine-tuning would hold without any acceleration, but it seems that even prior to the supposed discovery of acceleration, i.e. ‘Dark Energy’, the materialistic models trying to account for the 1 in 10^120 fine-tuning were falling apart.

    from the reference

    “Physicists, who have fretted over this paradox for decades, have noted that calculations such as the above involve only the electromagnetic force, and so perhaps when the contributions of the other known forces are included (bosons give rise to positive terms, whereas fermions give rise to negative terms), all terms will cancel out to exactly zero, as a consequence of some unknown, yet-to-be-discovered fundamental principle of physics. When “supersymmetry” was theorized in the 1970s, it was thought that it would meet this requirement, but when it was later discovered that our universe is not precisely supersymmetric, this explanation was abandoned. In any event, until recently physicists remained hopeful that some yet-to-be-discovered principle would imply that the positive and negative terms of the zero-point mass density (and thus the cosmological constant) precisely cancel out to zero.”

  6. bornagain77 says:

    semi OT: Here is a fairly recent lecture by Gerald L. Schroeder

    R&B Science: Lecture by Professor Gerald L. Schroeder on “Proof of God in Two Steps”
    https://www.youtube.com/watch?v=M0H6zS-cxCU

  7. tjguy says:

    Crev.info has a good write-up on this press release:

    Type 1a supernovae, vital to estimates of the size and expansion of the universe, are not uniform. This has cosmic implications.

    A team from the University of Arizona has news of cosmic proportions. For many years, Type 1a supernovae have been considered “standard candles” at all distances. This allows astronomers to calculate cosmic distances, and in fact was used to deduce the accelerated expansion of the universe in the late 1990s that won three astronomers a Nobel Prize.

    Without a known cause for cosmic acceleration, astronomers have proposed some unknown kind of “dark energy” as the cause. All of this has relied on the assumption of uniformity of Type 1a supernovae (see 9/30/12, 3/15/08, and 11/01/06).

    A UA press release now calls that assumption into question. Using data from the Swift satellite, which measures stars in ultraviolet light, the UA astronomers found that Type 1a’s fall into two classes. The nearest ones are redder than the more distant ones. Reporter Daniel Stolte titled his article, “Accelerating universe? Not so fast.”

    “Since nobody realized that before, all these supernovae were thrown in the same barrel. But if you were to look at 10 of them nearby, those 10 are going to be redder on average than a sample of 10 faraway supernovae.”

    The authors conclude that some of the reported acceleration of the universe can be explained by color differences between the two groups of supernovae, leaving less acceleration than initially reported. This would, in turn, require less dark energy than currently assumed.

    The team is unable to put a number on how much the figures will need to be adjusted, saying further work is needed.

    Meanwhile, though, another mission is seeking to measure dark energy if it exists. The DESI spectroscope (Dark Energy Spectroscopic Instrument) will be fit at the prime focus of the Mayall 4-meter telescope at Kitt Peak (see photo) to measure 30 million galaxies’ worth of the universe in 3-D.

    PhysOrg reports on the instrument built at University of Michigan:

    Cosmologists suspect a mysterious property called dark energy. Although it is thought to comprise 75 percent of the universe, its nature and the physics behind it are still mysteries.

    DESI will create a high-definition, 3-D map of a swath of the universe going back 10 billion light-years. By exploring how structure in the universe has evolved through time, scientists hope to uncover the tug-of-war between the forces of gravity and dark energy.

    The UM astronomers need to talk to the UA astronomers.

    If you win a Nobel Prize for a false conclusion, do you have to give the money back?

    Note once again how assumptions play crucial roles in models that try to understand observations.

    Perhaps dark energy is real, though less than expected. There’s an outside chance though, if supernova populations are not as uniform as even the UA astronomers believe, that there is no dark energy at all. This should be a lesson in assumptions and unknowns in science.

    Conclusions in theory are only tentative. They can always be overthrown by later findings.

    How many other assumptions, on which we build our MODELS, are false and we just don’t know it?

  8. ppolish says:

    “How many other assumptions, on which we build our MODELS, are false and we just don’t know it?”

    History tells us every single one to a varying degree, Tjguy. But NS + RM is way way off the mark. Way.
