Philip Cunningham writes to tell us of an interesting experiment by quantum physicist Anton Zeilinger and colleagues that pushed the “free-will loophole” back to 7.8 billion years ago, using quasars to determine measurement settings:
Abstract: In this Letter, we present a cosmic Bell experiment with polarization-entangled photons, in which measurement settings were determined based on real-time measurements of the wavelength of photons from high-redshift quasars, whose light was emitted billions of years ago; the experiment simultaneously ensures locality. Assuming fair sampling for all detected photons and that the wavelength of the quasar photons had not been selectively altered or previewed between emission and detection, we observe statistically significant violation of Bell’s inequality by 9.3 standard deviations, corresponding to an estimated p value of ≲ 7.4 × 10^−21. This experiment pushes back to at least ∼ 7.8 Gyr ago the most recent time by which any local-realist influences could have exploited the “freedom-of-choice” loophole to engineer the observed Bell violation, excluding any such mechanism from 96% of the space-time volume of the past light cone of our experiment, extending from the big bang to today. – Anton Zeilinger et al., “Cosmic Bell Test Using Random Measurement Settings from High-Redshift Quasars”, 14 June 2018
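As a rough cross-check on the quoted significance: a 9.3-standard-deviation result, converted to a one-sided Gaussian tail probability, lands on the order of 10^−21. This is a generic sigma-to-p conversion, not the paper’s own statistical analysis, so it is only an illustrative sanity check:

```python
import math

def one_sided_p(sigma):
    """One-sided tail probability of a standard normal at the given sigma."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

print(f"9.3 sigma -> p ~ {one_sided_p(9.3):.1e}")  # on the order of 7e-21
```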
Cunningham adds, “It should be noted that this present experiment is a vast improvement over their last Cosmic Bell Test which only went back 600 years.”
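Both cosmic Bell tests evaluate a CHSH-type inequality, for which any local-realist model obeys S ≤ 2 while quantum mechanics predicts values up to 2√2. A minimal numeric sketch with the standard angle choices (the correlation function below is the textbook prediction for polarization-entangled pairs, not the papers’ measured data):

```python
import math

def E(a, b):
    # Textbook quantum correlation for a singlet-like polarization-entangled
    # pair measured at polarizer angles a and b (radians).
    return -math.cos(2 * (a - b))

# Standard CHSH settings: (a, a2) for Alice, (b, b2) for Bob.
a, a2 = 0.0, math.pi / 4
b, b2 = math.pi / 8, 3 * math.pi / 8

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # 2*sqrt(2) ~ 2.828, above the local-realist bound of 2
```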
From Quanta 2017:
In the first of a planned series of “cosmic Bell test” experiments, the team sent pairs of photons from the roof of Zeilinger’s lab in Vienna through the open windows of two other buildings and into optical modulators, tallying coincident detections as usual. But this time, they attempted to lower the chance that the modulator settings might somehow become correlated with the states of the photons in the moments before each measurement. They pointed a telescope out of each window, trained each telescope on a bright and conveniently located (but otherwise random) star, and, before each measurement, used the color of an incoming photon from each star to set the angle of the associated modulator. The colors of these photons were decided hundreds of years ago, when they left their stars, increasing the chance that they (and therefore the measurement settings) were independent of the states of the photons being measured.
And yet, the scientists found that the measurement outcomes still violated Bell’s upper limit, boosting their confidence that the polarized photons in the experiment exhibit spooky action at a distance after all.
Nature could still exploit the freedom-of-choice loophole, but the universe would have had to delete items from the menu of possible measurement settings at least 600 years before the measurements occurred (when the closer of the two stars sent its light toward Earth). “Now one needs the correlations to have been established even before Shakespeare wrote, ‘Until I know this sure uncertainty, I’ll entertain the offered fallacy,’” Hall said.
Next, the team plans to use light from increasingly distant quasars to control their measurement settings, probing further back in time and giving the universe an even smaller window to cook up correlations between future device settings and restrict freedoms. – Natalie Wolchover, “Experiment Reaffirms Quantum Weirdness” at Quanta
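The scheme described above — using the color of an incoming astronomical photon to set a modulator — amounts to thresholding the detected wavelength into one bit. A toy sketch (the cutoff wavelength and the 0/1 mapping are placeholder values for illustration, not the experiment’s actual hardware parameters):

```python
RED_BLUE_CUTOFF_NM = 700.0  # assumed dividing wavelength, for illustration only

def setting_from_wavelength(wavelength_nm):
    """Map a detected photon's wavelength to a binary modulator setting:
    'bluer' than the cutoff selects setting 0, 'redder' selects setting 1."""
    return 0 if wavelength_nm < RED_BLUE_CUTOFF_NM else 1

print(setting_from_wavelength(650.0))  # bluer photon  -> 0
print(setting_from_wavelength(800.0))  # redder photon -> 1
```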
And here is another interesting recent experiment by Anton Zeilinger (and about 70 other researchers) that ensured unpredictable measurement settings in a Bell test by drawing on the free-will choices of 100,000 human participants, rather than having a physical randomizer determine the measurement settings:
Abstract: A Bell test, which challenges the philosophical worldview of local realism against experimental observations, is a randomized trial requiring spatially-distributed entanglement, fast and high-efficiency detection, and unpredictable measurement settings. While technology can perfect the first two of these, and while technological randomness sources enable device-independent protocols based on Bell inequality violation, challenging local realism using physical randomizers inevitably makes assumptions about the same physics one aims to test. Bell himself noted this weakness of physical setting choices and argued that human free will could rigorously be used to assure unpredictability in Bell tests. Here we report a suite of local realism tests using human choices, avoiding assumptions about predictability in physics. We recruited ~100,000 human participants to play an online video game that incentivizes fast, sustained input of unpredictable bits while also illustrating Bell test methodology. The participants generated 97,347,490 binary choices, which were directed via a scalable web platform to twelve laboratories on five continents, in which 13 experiments tested local realism using photons, single atoms, atomic ensembles, and superconducting devices. Over a 12-hour period on 30 Nov. 2016, participants worldwide provided a sustained flow of over 1000 bits/s to the experiments, which used different human-generated bits to choose each measurement setting. The observed correlations strongly contradict local realism and other realist positions in bi-partite and tri-partite scenarios. Project outcomes include closing of the freedom-of-choice loophole, gamification of statistical and quantum non-locality concepts, new methods for quantum-secured communications, a very large dataset of human-generated randomness, and networking techniques for global participation in experimental science.
The BIG Bell Test Collaboration, “Challenging local realism with human choices” at arXiv, 20 May 2018
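The architecture the abstract describes — measurement settings chosen from an externally supplied stream of unpredictable bits rather than from a local physical randomizer — can be sketched in simulation. Here the outcome statistics are sampled from the textbook quantum correlation, and a seeded pseudorandom generator stands in for the human-generated bit stream; all names and parameters are illustrative, not the collaboration’s actual pipeline:

```python
import itertools
import math
import random

def chsh_from_bits(bit_stream, n_trials=200_000, seed=1):
    """Estimate the CHSH statistic S from a simulated Bell test whose
    measurement settings are drawn from an external bit stream."""
    rng = random.Random(seed)  # samples outcomes only, never settings
    angles_a = (0.0, math.pi / 4)              # Alice's two polarizer angles
    angles_b = (math.pi / 8, 3 * math.pi / 8)  # Bob's two polarizer angles
    sums = [[0, 0], [0, 0]]
    counts = [[0, 0], [0, 0]]
    for _ in range(n_trials):
        x, y = next(bit_stream), next(bit_stream)  # settings from the stream
        # Textbook prediction E = -cos(2(a - b)) for a singlet-like pair.
        corr = -math.cos(2 * (angles_a[x] - angles_b[y]))
        a_out = 1 if rng.random() < 0.5 else -1
        # Bob matches Alice with probability (1 + corr) / 2, which
        # reproduces <a_out * b_out> = corr.
        b_out = a_out if rng.random() < (1 + corr) / 2 else -a_out
        sums[x][y] += a_out * b_out
        counts[x][y] += 1
    E = [[sums[x][y] / max(counts[x][y], 1) for y in (0, 1)] for x in (0, 1)]
    return abs(E[0][0] - E[0][1] + E[1][0] + E[1][1])

# Stand-in for the human-generated stream: a seeded pseudorandom source.
source = random.Random(42)
S = chsh_from_bits(source.getrandbits(1) for _ in itertools.count())
print(round(S, 2))  # near 2*sqrt(2) ~ 2.83, above the local-realist bound of 2
```

The point of the design is that the setting bits never come from the laboratory’s own physics: any genuinely unpredictable external stream leaves the estimated S near 2√2.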
See also: Suarez: Quantum nonlocal correlations come from outside space-time