Uncommon Descent Serving The Intelligent Design Community

Cosmology: The strength of gravity is an open question


Says Ars Technica:

You might expect that, all these years after Newton, we might have a good measure of his gravitational constant, G. As the authors of a new paper on the topic note, there are plenty of reasons to want a good measure of G “given the relevance of the gravitational constant in several fields ranging from cosmology to particle physics, and in the absence of a complete theory linking gravity to other forces.”

Yet most of our measurements of G come from an updated version of a device designed by Henry Cavendish back in the 1700s. And rather annoyingly, these measurements don’t agree with each other—they’re all close to a single value, but their error bars don’t consistently overlap. Now, researchers have made a new measurement of G using a method that certainly wasn’t available in the 1700s: interference between clouds of ultracold atoms. And the value that they have come up with doesn’t agree with many of the other measurements, either. …

A new measuring system may help by providing a different set of experimental errors—facts established slowly by degrees.
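The "error bars don't consistently overlap" point can be made concrete with the numbers quoted later in the thread. A minimal sketch, assuming the cold-atom result 6.67191(99) × 10^-11 and, for the comparison value, the CODATA 2010 recommendation 6.67384(80) × 10^-11 (the latter is an assumption of this sketch, not stated in the article):

```python
# Check whether two measurements of G agree to three significant figures
# yet have non-overlapping 1-sigma error bars.
# Cold-atom value is quoted later in this thread; the CODATA 2010
# comparison value is an assumption of this sketch.

def interval(value, sigma):
    """Return the 1-sigma interval (low, high) for a measurement."""
    return value - sigma, value + sigma

def overlaps(a, b):
    """True if two (low, high) intervals share any point."""
    return a[0] <= b[1] and b[0] <= a[1]

g_atom = (6.67191e-11, 0.00099e-11)    # cold-atom interferometry
g_codata = (6.67384e-11, 0.00080e-11)  # CODATA 2010 (assumed here)

i1 = interval(*g_atom)
i2 = interval(*g_codata)

# Both central values round to 6.67e-11 at three significant figures...
print(f"{g_atom[0]:.3g}", f"{g_codata[0]:.3g}")  # 6.67e-11 6.67e-11
# ...but the 1-sigma intervals do not overlap:
print(overlaps(i1, i2))  # False
```

This is exactly the situation described: the values are "all close to a single value, but their error bars don't consistently overlap," which is the classic signature of unaccounted-for systematic error rather than statistical noise.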

Interesting to contrast this with the popular simile: “sure as the law of gravity”

Follow UD News at Twitter!

@Mapou Isn't the Higgs boson the subatomic particle responsible for giving things mass? If so, what would make it different from the graviton? Also, if gravity is indeed instantaneous and not reducible to particle physics, can it even have waves or fields or the like? Can it even be manipulated? VunderGuy
@Mapou What do you think of the graviton and what do you think of the Higgs Boson and how does that tie into it?
I believe any talk of graviton or any other particle to explain gravity is crackpottery. I agree with Newton that gravity is instantaneous because its universality (it affects everything the same way) suggests that it is caused by violations to the principle of the conservation of energy. It is instantaneous because all conservation principles are inherently nonlocal. I don't know much about the Higgs particle to comment intelligently on it. However, given the mountain of cr@p that current physics is built on, I am deeply suspicious of any pronouncement from the physics community. Mapou
Yet most of our measurements of G come from an updated version of a device designed by Henry Cavendish back in the 1700s.
Incorrect attribution of device design: the famous "Cavendish experiment" and "Cavendish apparatus" were designed not by Cavendish but by the geologist John Michell. http://www.public.iastate.edu/~lhodges/Michell.htm cantor
@Mapou What do you think of the graviton and what do you think of the Higgs Boson and how does that tie into it? VunderGuy
PPPS: I should clarify: the Saros affects mean sea level as measured at a given location. So do factors such as subsidence or emergence of land, etc. Strictly, one needs a 19-year cycle of tide-gauge measurement to track it, but a year or a few will suffice for a good idea, given the background knowledge base. Strictly, though, that injects a degree of circularity. It is very hard not to have built-in circles in experimental work, much less in work that uses computer simulations, etc. kairosfocus
PPS: The value of G is not strongly tied to the sort of resonances that lead to H, He, O and C as the most abundant elements in the observed cosmos, with N nearby (and, IIRC, fifth for our galaxy). That pattern gives us stars and galaxies, the gateway to the rest of the periodic table, water with its astonishing properties, organic chemistry's connector-block element, and proteins. Sir Fred Hoyle was right to point to this pattern as a first pivotal manifestation of fine tuning, even though the values involved do not run to huge numbers of decimals. This looks like a put-up job on the physics behind our cosmos, and points to there being no blind forces of consequence in physics, chemistry or biology.

In plain words, independent of whether we ever get to a reasonable prebiotic soup that does somehow throw up living cells, or whether we show that lucky-noise-driven variation can feed body-plan-level origination through successive survival-based culling, we have evidence that points to a cosmos set up to facilitate the existence of C-chemistry, aqueous-medium, cell-based life on terrestrial planets in galactic habitable zones, orbiting the right sort of Pop I second-generation stars with high metallicity. And, in my view, that is where design theory should first point . . . it decisively undercuts the 150 years of indoctrination on the world of life.

Then, with that in hand, we are in a position to ask pointed questions, and politely but firmly insist on sound and prudent answers, about the sampling of config spaces given planetary, solar system and observed-cosmos scale resources, regarding the plausibility of the origin of codes, algorithms and supportive complex functional organisation by blind chance and mechanical necessity. Those questions will very rapidly show the un-indoctrinated onlooker that something is very wrong in the state of confident-manner assertions, official pronouncements and force-backed policies regarding dominant schools of thought and linked education approaches on origins science. KF kairosfocus
F/N: Report in Nature:
About 300 experiments have tried to determine the value of the Newtonian gravitational constant, G, so far, but large discrepancies in the results have made it impossible to know its value precisely [1]. The weakness of the gravitational interaction and the impossibility of shielding the effects of gravity make it very difficult to measure G while keeping systematic effects under control. Most previous experiments performed were based on the torsion pendulum or torsion balance scheme as in the experiment by Cavendish [2] in 1798, and in all cases macroscopic masses were used. Here we report the precise determination of G using laser-cooled atoms and quantum interferometry. We obtain the value G = 6.67191(99) × 10^-11 m^3 kg^-1 s^-2 with a relative uncertainty of 150 parts per million (the combined standard uncertainty is given in parentheses). Our value differs by 1.5 combined standard deviations from the current recommended value of the Committee on Data for Science and Technology [3]. A conceptually different experiment such as ours helps to identify the systematic errors that have proved elusive in previous experiments, thus improving the confidence in the value of G. There is no definitive relationship between G and the other fundamental constants, and there is no theoretical prediction for its value against which to test experimental results. Improving the precision with which we know G has not only a pure metrological interest, but is also important because of the key role that G has in theories of gravitation, cosmology, particle physics and astrophysics and in geophysical models.
So there will be a need to ponder why the two diverge, and it will likely take time. I suspect that effects like where we are in the ~18-year Saros moon-sun-earth cycle and its associated tidal effects may count; we cannot shield gravity, and it will be hard to compensate for astronomical effects without effectively building in circularities, similar to how school-level V, I and R measurements with moving-coil meters [and now digital ones] build in circularities because they all rely on Ohm's Law relationships. That means things like time of day, location on earth's surface, nearby mountains, rock densities, orientation of apparatus and the year in which such measurements are taken will affect the result in ways that will be hard to remove without implicitly building in an expected value for G and biasing the values. Certainly, the Saros materially affects mean sea level, which points to a significant effect. KF

PS: Let's see if those who are busily spreading accusations to the effect that design thinkers don't understand or reasonably address the empirical evidence and control-on-theory issue will be willing to adjust their views. kairosfocus
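The abstract's "1.5 combined standard deviations" figure can be reproduced directly from the quoted numbers. A minimal sketch, assuming the CODATA 2010 recommended value 6.67384(80) × 10^-11 (the abstract cites this reference value without stating it, so it is an assumption here):

```python
import math

# Reproduce the "1.5 combined standard deviations" discrepancy from the
# quoted Nature abstract. The cold-atom value and uncertainty are as
# quoted; the CODATA 2010 value is an assumption of this sketch.

g_atom, u_atom = 6.67191e-11, 0.00099e-11      # cold-atom interferometry
g_codata, u_codata = 6.67384e-11, 0.00080e-11  # CODATA 2010 (assumed)

# Combined standard uncertainty: the two uncertainties added in quadrature.
u_combined = math.hypot(u_atom, u_codata)

# Discrepancy between the central values, in combined standard deviations.
n_sigma = abs(g_codata - g_atom) / u_combined
print(f"{n_sigma:.1f}")  # 1.5

# Relative uncertainty of the cold-atom value, in parts per million.
ppm = u_atom / g_atom * 1e6
print(f"{ppm:.0f} ppm")  # 148 ppm, close to the quoted "150 parts per million"
```

Note that quadrature addition of the uncertainties is only justified if the two experiments' errors are independent; the whole point of the non-overlapping results is that some shared or unrecognized systematic may be violating exactly that assumption.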
News: Non-overlapping error bars point to systematic errors in the measurement approaches. The torsion experiments and the cold-atom experiments agree in the first three significant figures, 6.67 × 10^-11 N m^2 kg^-2; not high precision, but maybe we have to live with that until we figure out more clearly the underlying issues with how we set out to measure. Reminds me of the old saying that structures fail by their weakest mechanism, usually the one no one had figured out; we need to ask ourselves hard questions about whether we are overlooking something. It is worth remembering that in the 1920s there were values for atomic weights and the like that were thought to be known to certain error bars, but later experiments showed errors beyond those bars. So, we need to wait until we understand more closely what is biasing the experiments, whether the old-style torsion-balance ones (notoriously ticklish to work with, as anyone who has used a ballistic galvanometer will recall) or the new ones (which sound even more ticklish and subject to thermodynamic noise effects, hence the big chill involved). And the root issue is a fight for more decimals in agreement. KF kairosfocus
This is a problem that would not exist if physicists truly understood gravity, a thousand relativists claiming otherwise notwithstanding. Mapou
