November 2014

Multiverse Collisions May Dot the Sky
Early in cosmic history, our universe may have bumped into another — a primordial clash that could have left traces in the Big Bang’s afterglow.

"But on larger scales, exponential expansion continues forever, and new bubble universes are continually being created. Each bubble is deemed a universe in its own right, despite being part of the same space-time, because an observer could not travel from one bubble to the next without moving faster than the speed of light. And each bubble may have its own distinct laws of physics. “If you buy eternal inflation, it predicts a multiverse,” Peiris said."

www.quantamagazine.org/20141110-multiverse-collisions-may-dot-the-sky/

Image Credit: ALMA (ESO/NAOJ/NRAO), NSF
Explanation: Why does this giant disk have gaps? The exciting and probable answer is: planets. A mystery is how planets massive enough to create these gaps formed so quickly, since the HL Tauri star system is only about one million years old. The picture on which the gaps were discovered was taken with the new Atacama Large Millimeter Array (ALMA) of telescopes in Chile. ALMA imaged the protoplanetary disk, which spans about 1,500 light-minutes across, in unprecedented detail, resolving features as small as 40 light-minutes. The low-energy light used by ALMA was also able to peer through an intervening haze of gas and dust. The HL Tauri system lies about 450 light-years from Earth. Studying HL Tauri will likely give insight into how our own Solar System formed and evolved. #APOD #AstronomyPictureOfTheDay
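
A quick unit check of those numbers (our sketch, not part of the APOD text): a 40 light-minute feature seen from 450 light-years subtends roughly 0.035 arcseconds, which is indeed the scale of ALMA's long-baseline resolution.

    import math

    # Angle subtended by a 40 light-minute feature at 450 light-years.
    MIN_PER_YEAR = 365.25 * 24 * 60        # light-minutes per light-year
    distance_lmin = 450 * MIN_PER_YEAR     # HL Tauri's distance in light-minutes
    feature_lmin = 40                      # smallest resolved feature

    angle_rad = feature_lmin / distance_lmin
    angle_arcsec = math.degrees(angle_rad) * 3600
    print(f"{angle_arcsec:.3f} arcsec")    # ~0.035 arcsec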

Radical science, pushing the envelope on the radical but rational.

Watch Newton's gravity at work...

www.youtube.com/watch?v=E43-CfukEgs#t=124

Programme website: www.bbc.co.uk/programmes/p0276q28 Brian Cox visits NASA’s Space Power Facility in Ohio to see what happens when a bowling ball and ...
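
The physics behind the demonstration, in two lines (our sketch; the 30 m drop height is illustrative, not taken from the programme): in vacuum the fall time t = sqrt(2h/g) depends only on height and gravity, never on mass, so the bowling ball and feathers land together.

    import math

    def fall_time(height_m, g=9.81):
        # Time to fall from rest through height_m, ignoring air resistance;
        # note that mass never enters.
        return math.sqrt(2 * height_m / g)

    print(f"{fall_time(30.0):.2f} s")  # ~2.47 s, bowling ball and feather alike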

PREPARING THE FUTURE: "MINERvA scientist Jorge Morfin of Fermilab proposed a special course for graduate students and postdocs to learn the nuts, bolts and subtleties of interactions between neutrinos and a nucleus made up of multiple protons and neutrons. Culminating months of organizing committee effort, the nine-day NuSTEC Training in Neutrino-Nucleus Scattering Physics, held at Fermilab, concluded on Wednesday...."

►READ more about it here: www.fnal.gov/pub/today/archive/archive_2014/today14-11-04.html

RSN is happy to endorse Gabriel Rothblatt for Congress.

Why? He's the only NewSpace advocate out there willing to put our money where Uncle Sam's mouth is. If you want a Congressman who's 100% pro-space, it's him.

He's the closest thing the radical science community (read: H+, Singularity, Futurist) has to a real LEADER and advocate in Congress. In a time when we need people who understand the wide scope of where humanity is headed, people like him in high office are needed more than ever. So GOOD LUCK, Gabriel! Let's hope the Space Coast knows who represents its best interests. - RSN Staff

Humanity is altering Earth’s life support system....

youtu.be/_EWOrZQ3L-c

Produced by the International Geosphere-Biosphere Programme and Globaia and funded by the UN Foundation. The data visualization summarises and visualizes sev...

David Finley: Tell me again how the sun is altering the Earth's life support system

Heisenberg and Wi-Fi... (Stay with me on this.) Here's the connection:

*************Heisenberg's uncertainty principle and Wi-Fi************

"When I first started teaching, I was stumped by a student who asked me if quantum mechanics affected anything in daily life. I said that the universe is fundamentally quantum mechanical and therefore it affects everything, but this didn't satisfy him. Since then, I've been noticing examples everywhere.

One surprising example is the effect of Heisenberg's uncertainty principle on Wi-Fi communication (wireless internet). Heisenberg's uncertainty principle is usually described as a limit on knowledge of a particle's position and speed: The better you know its position, the worse you know its speed. However, it is a general principle with many consequences. The most common in particle physics is that the shorter a particle's lifetime, the worse you know its mass. Both of these formulations are far removed from everyday life, though.

In everyday life, the wave nature of most particles is too small to see. The biggest exception is radio and light, which are wave-like in daily life and only particle-like (photons) in the quantum realm. In radio terminology, Heisenberg's uncertainty principle is called the bandwidth theorem, and it states that the rate at which information is carried over a radio band is proportional to the width of that band. Bandwidth is the reason that radio stations with nearly the same central frequency can sometimes be heard simultaneously: Each is broadcasting over a range of frequencies, and those ranges overlap. If you try to send shorter pulses of data at a higher rate, the range of frequencies broadens.

Although this theorem was developed in the context of Morse code over telegraph systems, it applies just as well to computer data over Wi-Fi networks. A typical Wi-Fi network transmits 54 million bits per second, or 18.5 nanoseconds per bit (zero or one). Through the bandwidth theorem, this implies a frequency spread of about 25 MHz, but the whole Wi-Fi radio dial is only 72 MHz across. In practice, only three bands can be distinguished, so only three different networks can fill the same airwaves at the same time. As the bit rate of Wi-Fi gets faster, the bandwidth gets broader, crowding the radio dial even more.

Mathematically, the Heisenberg uncertainty principle is just a special case of the bandwidth theorem, and we can see this relationship by comparing units. The lifetime of a particle can be measured in nanoseconds, just like the time for a computer to emit a zero or a one. A particle's mass, which is a form of energy, can be expressed as a frequency (for example, 1 GeV is a quarter of a trillion trillion Hz). Uncertainty in mass is therefore a frequency spread, which is to say, bandwidth.

Although it's fundamentally the same thing, the numerical scale is staggering. A computer network comprising decaying Z bosons could emit 75 million petabytes per second, and its bandwidth would be 600 trillion GHz wide."

—Jim Pivarski

►►http://www.fnal.gov/pub/today/archive/archive_2014/today14-11-20.html
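
The quoted figures check out numerically (our sketch; the bandwidth theorem's exact proportionality constant is convention-dependent, so treat these as order-of-magnitude reproductions, with the Z width taken at its PDG value of about 2.5 GeV):

    # Wi-Fi side: 54 Mbit/s implies ~18.5 ns per bit and a ~25 MHz spread.
    bit_rate = 54e6                               # bits per second (802.11g)
    ns_per_bit = 1e9 / bit_rate
    print(f"{ns_per_bit:.1f} ns per bit")         # 18.5 ns
    spread_hz = 1 / (2 * ns_per_bit * 1e-9)
    print(f"{spread_hz / 1e6:.0f} MHz spread")    # ~27 MHz, i.e. roughly 25 MHz
    print(round(72 / 25), "bands fit in 72 MHz")  # 3

    # Z boson side: read the decay width as a frequency spread via E = h*f.
    H_GEV_S = 4.136e-24                           # Planck constant h in GeV*s
    GAMMA_Z_GEV = 2.495                           # Z boson width (PDG value)
    bandwidth_hz = GAMMA_Z_GEV / H_GEV_S
    print(f"{bandwidth_hz / 1e9:.2e} GHz")        # ~6.0e14 GHz = 600 trillion GHz
    pb_per_s = bandwidth_hz / 8 / 1e15            # bit rate ~ bandwidth, in PB/s
    print(f"{pb_per_s:.2e} PB/s")                 # ~7.5e7 = 75 million PB/s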

John Stewart Bell FRS (28 June 1928 – 1 October 1990) was a Northern Irish physicist, and the originator of Bell's theorem, a significant theorem in quantum physics regarding hidden variable theories.

Bell's theorem is a no-go theorem that draws an important distinction between quantum mechanics (QM) and the world as described by classical mechanics. In its simplest form, Bell's theorem states:

"No physical theory of local hidden variables can ever reproduce all of the predictions of quantum mechanics."

In the early 1930s, the philosophical implications of the current interpretations of quantum theory troubled many prominent physicists of the day, including Albert Einstein. In a well-known 1935 paper, Einstein and co-authors Boris Podolsky and Nathan Rosen (collectively "EPR") sought to demonstrate by a paradox that QM was incomplete. This provided hope that a more-complete (and less-troubling) theory might one day be discovered. But that conclusion rested on the seemingly reasonable assumptions of locality and realism (together called "local realism" or "local hidden variables", often interchangeably). In the vernacular of Einstein: locality meant no instantaneous ("spooky") action at a distance; realism meant the moon is there even when not being observed. These assumptions were hotly debated within the physics community, notably between Nobel laureates Einstein and Niels Bohr.

In his groundbreaking 1964 paper, "On the Einstein Podolsky Rosen paradox", physicist John Stewart Bell presented an analogy (based on spin measurements on pairs of entangled electrons) to EPR's hypothetical paradox. Using their reasoning, he said, a choice of measurement setting here should not affect the outcome of a measurement there (and vice versa). After providing a mathematical formulation of locality and realism based on this, he showed specific cases where this would be inconsistent with the predictions of QM theory.

In experimental tests following Bell's example, now using quantum entanglement of photons instead of electrons, John Clauser and Stuart Freedman (1972) and Alain Aspect et al. (1981) demonstrated that the predictions of QM are correct in this regard, although they relied on additional unverifiable assumptions that open loopholes for local realism. The present status is that no conclusive, loophole-free Bell test has been performed. While such a test would not demonstrate that QM is complete, a loophole-free violation would force one to reject at least one of the principles of locality, realism, or freedom (rejecting the last leads to alternative, superdeterministic theories). Two of these logical possibilities, non-locality and non-realism, correspond to well-developed interpretations of quantum mechanics and have many supporters; this is not the case for the third logical possibility, non-freedom. Conclusive experimental evidence of the violation of Bell's inequality would drastically reduce the class of acceptable deterministic theories but would not falsify absolute determinism, which Bell himself described as '...not just inanimate nature running on behind-the-scenes clockwork, but with our behaviour, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined'.

These three key concepts – locality, realism, freedom – are highly technical and much debated. In particular, the concept of realism is now somewhat different from what it was in discussions in the 1930s. It is more precisely called counterfactual definiteness; it means that we may think of outcomes of measurements that were not actually performed as being just as much part of reality as those that were made. Locality is short for local relativistic causality. Freedom refers to the physical possibility to determine settings on measurement devices independently of the internal state of the physical system being measured.

Cornell solid-state physicist David Mermin has described the various appraisals of the importance of Bell's theorem within the physics community as ranging from "indifference" to "wild extravagance". Lawrence Berkeley particle physicist Henry Stapp declared: "Bell's theorem is the most profound discovery of science."

en.wikipedia.org/wiki/Bell's_theorem

Bell's critique of von Neumann's proof

Bell's interest in hidden variables was motivated by the existence in the formalism of quantum mechanics of a "movable boundary" between the quantum system and the classical apparatus:

A possibility is that we find exactly where the boundary lies. More plausible to me is that we will find that there is no boundary. ... The wave functions would prove to be a provisional or incomplete description of the quantum-mechanical part, of which an objective account would become possible. It is this possibility, of a homogeneous account of the world, which is for me the chief motivation of the study of the so-called 'hidden variable' possibility.

Bell was impressed that in the formulation of David Bohm’s nonlocal hidden variable theory, no such boundary is needed, and it was this which sparked his interest in the field of research. Bell also criticized the standard formalism of quantum mechanics on the grounds of lack of physical precision:

For the good books known to me are not much concerned with physical precision. This is clear already from their vocabulary. Here are some words which, however legitimate and necessary in application, have no place in a formulation with any pretension to physical precision: system, apparatus, environment, microscopic, macroscopic, reversible, irreversible, observable, information, measurement. .... On this list of bad words from good books, the worst of all is 'measurement'.

But if he were to thoroughly explore the viability of Bohm's theory, Bell needed to answer the challenge of the so-called impossibility proofs against hidden variables. Bell addressed these in a paper entitled "On the Problem of Hidden Variables in Quantum Mechanics". (Bell had actually written this paper before his paper on the EPR paradox, but it did not appear until two years later, in 1966, due to publishing delays.) Here he showed that John von Neumann's argument does not prove the impossibility of hidden variables, as was widely claimed, due to its reliance on a physical assumption that is not valid for quantum mechanics—namely, that the probability-weighted average of the sum of observable quantities equals the sum of the average values of each of the separate observable quantities. Bell subsequently claimed, "The proof of von Neumann is not merely false but foolish!" In this same work, Bell showed that a stronger effort at such a proof (based upon Gleason's theorem) also fails to eliminate the hidden variables program. The supposed flaw in von Neumann's proof had been previously discovered by Grete Hermann in 1935, but did not become common knowledge until after it was rediscovered by Bell. However, in 2010, Jeffrey Bub published an argument that Bell (and, implicitly, Hermann) had misconstrued von Neumann's proof, claiming that it does not attempt to prove the absolute impossibility of hidden variables, and is actually not flawed after all. (Thus, it was the physics community as a whole that had misinterpreted von Neumann's proof as applying universally.) Bub provides evidence that von Neumann understood the limits of his proof, but there is no record of von Neumann attempting to correct the near-universal misinterpretation which lingered for over 30 years and exists to some extent to this day. Von Neumann's proof does not in fact apply to contextual hidden variables, as in Bohm's theory.
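
To see the failed assumption concretely (our illustrative sketch, not from the Wikipedia article): quantum averages are additive even for non-commuting observables, but the definite values a hidden-variable state would have to assign cannot be, as Bell's own spin-1/2 counterexample shows.

    import numpy as np

    # Pauli matrices: each has eigenvalues +1 and -1.
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])

    print(np.linalg.eigvalsh(sx))        # [-1.  1.]
    print(np.linalg.eigvalsh(sy))        # [-1.  1.]
    print(np.linalg.eigvalsh(sx + sy))   # [-1.414  1.414]

    # A dispersion-free state would assign values v(sx), v(sy) in {-1, +1},
    # whose sum is -2, 0, or +2 -- never +/-sqrt(2). So no assignment of
    # definite values can satisfy v(sx + sy) = v(sx) + v(sy); von Neumann's
    # additivity requirement excludes such states rather than disproving them.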

Bell test experiments

Bell test experiments or Bell's inequality experiments are designed to demonstrate the real world existence of certain theoretical consequences of the phenomenon of entanglement in quantum mechanics which could not possibly occur according to a classical picture of the world, characterised by the notion of local realism. Under local realism, correlations between outcomes of different measurements performed on separated physical systems have to satisfy certain constraints, called Bell inequalities. John Bell derived the first inequality of this kind in his paper "On the Einstein-Podolsky-Rosen Paradox". Bell's Theorem states that the predictions of quantum mechanics cannot be reproduced by any local hidden variable theory.

The term "Bell inequality" can mean any one of a number of inequalities satisfied by local hidden variables theories; in practice, in present day experiments, most often the CHSH; earlier the CH74 inequality. All these inequalities, like the original inequality of Bell, by assuming local realism, place restrictions on the statistical results of experiments on sets of particles that have taken part in an interaction and then separated. A Bell test experiment is one designed to test whether or not the real world satisfies local realism.

en.wikipedia.org/wiki/Bell_test_experiments
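
The CHSH violation itself takes only a few lines to verify (our illustrative sketch): for the singlet state, QM predicts the correlation E(a, b) = -cos(a - b) between spin measurements along angles a and b, and the CHSH combination then reaches 2*sqrt(2), while any local hidden-variable model is bounded by 2.

    import numpy as np

    def E(a, b):
        # Singlet-state correlation predicted by quantum mechanics.
        return -np.cos(a - b)

    a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(abs(S))  # 2.828... = 2*sqrt(2), violating |S| <= 2

    # For contrast, a simple local model (shared hidden direction lam,
    # outcomes +/-1 from the sign of a cosine) never exceeds the bound:
    rng = np.random.default_rng(0)
    lam = rng.uniform(0, 2 * np.pi, 200_000)

    def E_lhv(a, b):
        return np.mean(np.sign(np.cos(a - lam)) * -np.sign(np.cos(b - lam)))

    S_lhv = E_lhv(a1, b1) - E_lhv(a1, b2) + E_lhv(a2, b1) + E_lhv(a2, b2)
    print(abs(S_lhv))  # ~2.0: pinned at the classical bound, far from 2*sqrt(2)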

Conclusions from experimental tests

In 1972 the first of many experiments that have shown (under the extrapolation to ideal detector efficiencies) a violation of Bell's inequality was conducted. Bell himself concluded from these experiments that "It now seems that the non-locality is deeply rooted in quantum mechanics itself and will persist in any completion." This, according to Bell, also implied that quantum theory is not locally causal and cannot be embedded into any locally causal theory. Bell regretted that the results of the tests did not agree with the concept of local hidden variables:

For me, it is so reasonable to assume that the photons in those experiments carry with them programs, which have been correlated in advance, telling them how to behave. This is so rational that I think that when Einstein saw that, and the others refused to see it, he was the rational man. The other people, although history has justified them, were burying their heads in the sand. ... So for me, it is a pity that Einstein's idea doesn't work. The reasonable thing just doesn't work.

Bell seemed to have become resigned to the notion that future experiments would continue to agree with quantum mechanics and violate his inequality. Referring to the Bell test experiments, he remarked:

It is difficult for me to believe that quantum mechanics, working very well for currently practical set-ups, will nevertheless fail badly with improvements in counter efficiency ...

Some people continue to believe that agreement with Bell's inequalities might yet be saved. They argue that in the future much more precise experiments could reveal that one of the known loopholes, for example the so-called "fair sampling loophole", had been biasing the interpretations. Most mainstream physicists are highly skeptical about all these "loopholes", admitting their existence but continuing to believe that Bell's inequalities must fail.

Bell remained interested in objective 'observer-free' quantum mechanics. He felt that at the most fundamental level, physical theories ought not to be concerned with observables, but with 'be-ables': "The beables of the theory are those elements which might correspond to elements of reality, to things which exist. Their existence does not depend on 'observation'." He remained impressed with Bohm's hidden variables as an example of such a scheme and he attacked the more subjective alternatives such as the Copenhagen interpretation.

en.wikipedia.org/wiki/John_Stewart_Bell

Fjordan Andersen: "Thus, it was the physics community as a whole that had misinterpreted von Neumann's proof as applying universally." I lol'd. Silly physicists.
