Uncommon Descent Serving The Intelligent Design Community

New analysis: Planck data more in line with Standard Model than originally thought?


So Nature notes:

A new analysis of data on the infant universe released earlier this year indicates that the initial findings are more in line with the standard cosmological picture than had originally been thought.

Apparently, there is a possible “systematic error” in previous measurements at certain levels, as well as calibration and mechanical issues.

“With an experiment as sensitive as Planck, the great challenge is fully understanding all of the systematics,” says Spergel. …

One issue is that slight variations can make a great difference.

Cosmologist Michael Turner of the University of Chicago in Illinois quips, “Precision cosmology is hard; accurate cosmology is even harder.”

Put another way, the universe doesn’t tell us when we are right, or when we are wrong. We just have to keep checking.

Comments
Yes, I understand that cosmology also deals with the present. Since work in the present can be tested and basically verified, I have no problem accepting its findings to the extent that bias does not influence the interpretation. I don't understand it all, but I do know there seems to be some experimental evidence for the existence of dark matter; whether what exists actually supports the Standard Model remains to be seen.

tjguy
December 21, 2013, 12:46 AM PDT
Hi tjguy, cosmology is not just the study of history; it studies the present and the future too. Many particles have been discovered by studying cosmic rays. Half the world is powered by nuclear reactors, which wouldn't be possible if we hadn't figured out the correct values; delicate medical imaging and advanced medical treatments would also not be possible if we had the wrong values. National defense, the world's navigation, air traffic, communication, entertainment, and of course space programs all depend on the correct values of innumerable cosmic quantities that we have calculated and checked for accuracy. So yes, we did get many values right.

Our universe has millions of variables which are still a mystery to us. Many variables can be accurately calculated and confirmed only at higher energy levels and when technology matures. For example, we had to wait for the LHC to be built to measure the mass of the Higgs boson. For particles heavier than the top quark, we need energies in the tens of TeV, which requires heavy funding, land, and cooperation from many countries. We got very high-accuracy measurements of our universe because of WMAP and Planck; we had to wait years for data like those obtained from the WMAP and Planck satellites. Given the complexity of the universe and equipment accuracy errors, a value will never be as accurate as we want, but it will be within 2 sigma, which is fine for most applications.

I would also like to emphasize that cosmology is not just calculations; it is more about logical thinking. Let me explain the case of dark matter. Where did we get the idea? As you know, the orbital speed of a planet (or any orbiting object) decreases with its distance from the Sun (the central mass), so the orbital speed of a distant planet is low, and planets nearer the Sun have higher orbital speeds.

We might expect the same when we observe the stars in galaxies: that a distant star's orbital velocity around the center of the galaxy will be low and that stars near the galactic center will move faster. But, surprise, we find that the velocities of near and distant stars are roughly the same! This means there must be matter even at the edge of the galaxy; otherwise the distant stars would have flown off their orbits, because without that matter there would be no centripetal force on them. That is the hunch, but can we confirm it? Yes. We know that v^2 = GM/r, where v is the orbital velocity of a star. Now if v^2 is constant, then the term GM/r must also be constant. Since we know G (the gravitational constant) and r, we can easily calculate the mass enclosed within the distance r. We can also get the visible mass of the galaxy from Planck/WMAP data, and we can estimate the possible invisible mass by observing the gravitational lensing effect. What we find is that the observed mass is not equal to the calculated mass! So there has to be matter which we haven't detected yet, and that matter is what is called dark matter. Of course there are other explanations for the difference between calculated and observed mass, but currently dark matter is the most popular hypothesis.

In fact, logically we already know roughly what dark matter should be. It has to be heavier than the top quark, or we would have detected the particle at the LHC; it is not electrically charged, or we would have seen it affected by the electromagnetic force; and it does not feel the strong nuclear force, or we would have detected it in clumps (light would bend around a large lump of dark matter). We also know it is not a force-carrier particle, because those are not stable and disappear within a fraction of a second. So our best bet is WIMPs, or Weakly Interacting Massive Particles. What we have to look for is matter which is heavy, has neutral charge, and interacts only via the weak nuclear force.

This can be probed only by studying such matter's interaction with the Higgs boson, and that is why the Higgs boson is called the "Higgs portal": it may help us identify a newer world of particles.

selvaRajan
December 19, 2013, 05:56 PM PDT
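The rotation-curve reasoning in the comment above can be sketched numerically. Below is a minimal Python sketch using v^2 = GM/r; the rotation speed, radii, and "visible mass" are illustrative round numbers of roughly Milky Way scale (assumptions for the example, not measured survey values). The point is only that a flat rotation curve implies an enclosed dynamical mass that grows linearly with radius and soon outruns any plausible visible mass.

```python
# Sketch of the rotation-curve argument for dark matter.
# A flat rotation curve v(r) ~ const implies an enclosed (dynamical) mass
# M = v^2 * r / G that grows linearly with r.
# All numerical values below are illustrative assumptions, not measurements.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # one kiloparsec in metres

def dynamical_mass(v, r):
    """Mass enclosed within radius r implied by circular speed v (v^2 = GM/r)."""
    return v ** 2 * r / G

v_flat = 220e3               # assumed flat rotation speed, ~220 km/s
visible_mass = 6e10 * M_SUN  # assumed visible (stars + gas) mass

for r_kpc in (10, 30, 50):
    m_dyn = dynamical_mass(v_flat, r_kpc * KPC)
    print(f"r = {r_kpc:2d} kpc: dynamical mass = {m_dyn / M_SUN:.1e} M_sun, "
          f"ratio to visible = {m_dyn / visible_mass:.1f}")
```

Nothing here depends on astronomy libraries; if v stays flat while r grows, the implied enclosed mass exceeds the assumed visible mass by an ever larger factor at large radii, which is the discrepancy attributed to dark matter.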
Selva, it seems you are fairly knowledgeable on this. I am not, but the problem with cosmology is that it deals with history that is unobservable and unrepeatable. We cannot test our theories to see if they are right. We can try to use models and computer programs to test them, but the variables are so elusive that we can never really know if we are right. Assumptions abound! Why would we even think that we are right? Fooling around with the variables until we get them to work: is that science? What have we proven? Nothing, except that IF our variables are correct, perhaps it could have happened by totally natural means. But did it? No one really knows. Look at what these scientists said:
“With an experiment as sensitive as Planck, the great challenge is fully understanding all of the systematics,” says Spergel. …
One issue is that slight variations can make a great difference.
Cosmologist Michael Turner of the University of Chicago in Illinois quips, “Precision cosmology is hard; accurate cosmology is even harder.”
It is just something we need to keep in mind when we deal with the unobservable past. Historical science is nowhere near as accurate as science we do here and now using the scientific method. There is a huge difference between the two!

tjguy
December 19, 2013, 04:08 AM PDT
When the 217 GHz data was reanalyzed, the Planck 2013 data showed that the Hubble constant is lower and the matter density is higher (by 2.5 sigma) than previously measured, and now we are back to the standard model. It is good news, not bad! The anomaly was only in the 217 GHz Planck 2013 data. Incomplete removal of the 4 K line caused the error; the 4 K line is produced by electromagnetic interference between the Joule-Thomson cooler drive electronics and the readout electronics. This has been rectified, and we eagerly await the Planck 2014 data.

selvaRajan
December 18, 2013, 05:53 PM PDT
