Saturday, July 30, 2016

More on nonlocality and realism

I mentioned Musser and nonlocality, and Lumo adds:
George Musser identified himself as the latest promoter of the delusion started by John Bell, the delusion saying that the world has to be "non-local" ...

The truth is just the opposite one, of course. Locality works perfectly – at least in non-gravitational context. ...

All the differences between classical physics and quantum mechanics are consequences of the nonzero commutators in quantum mechanics i.e. the uncertainty principle. There are absolutely no other differences between classical physics and quantum mechanics. That fact also means that whenever the commutators between the relevant quantities are zero or negligible, the difference between classical physics and quantum mechanics becomes zero or negligible, too.

The uncertainty principle is the actual reason why it's inconsistent in quantum mechanics to assume that the observables have their values before they're actually observed.
Yes, and that is how the quantum mechanics textbooks have explained it since before EPR, Bohm, and Bell.
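
To make the commutator point concrete, here is a minimal numerical sketch (my own illustration in Python with numpy, not anything from Lumo's post). The spin observables sigma_x and sigma_z do not commute, so the uncertainty principle forbids assigning both a definite value at once, while any observable trivially commutes with itself:

```python
import numpy as np

# Pauli matrices: the spin-1/2 observables (in units of hbar/2)
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(a, b):
    """Return the commutator [a, b] = ab - ba."""
    return a @ b - b @ a

# Noncommuting observables: [sigma_x, sigma_z] = -2i*sigma_y, which is nonzero,
# so x-spin and z-spin cannot both have definite values at the same time.
print(commutator(sigma_x, sigma_z))

# Any observable commutes with itself, so repeated measurements of the
# same quantity show no quantum surprises.
print(commutator(sigma_z, sigma_z))   # zero matrix
```

Whenever all the relevant commutators vanish, as in the last line, the quantum description reduces to the classical one, which is Lumo's point.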

Here is a new paper on retrocausality in quantum mechanics, and the authors keep talking about how they want to believe in "realism" and reality. By realism they mean that physical systems have definite values before they're actually observed. Quantum mechanics shows that observables cannot have definite values before observation, so realism becomes the hope that the state can be fully described by some other (hidden) variables before observation.

So realism is not a belief in the real world, but a belief in a mathematical abstraction of reality. This word usage seems peculiar to me, as I would have said that I believed in realism until I found out what they mean by the term. In quantum mechanics, realism has been a dead end for 80 years.

Wednesday, July 27, 2016

Not to reveal to us the real nature of things

Much of the confusion over the interpretation of quantum mechanics concerns whether the mathematical wave functions reveal to us the true (ontic) nature of electrons and photons.

Henri Poincaré explained it in his famous 1902 book:
The object of mathematical theories is not to reveal to us the real nature of things; that would be an unreasonable claim. Their only object is to co-ordinate the physical laws with which physical experiment makes us acquainted, the enunciation of which, without the aid of mathematics, we should be unable to effect. Whether the ether exists or not matters little — let us leave that to the metaphysicians; what is essential for us is, that everything happens as if it existed, and that this hypothesis is found to be suitable for the explanation of phenomena. After all, have we any other reason for believing in the existence of material objects? That, too, is only a convenient hypothesis; only, it will never cease to be so, while some day, no doubt, the ether will be thrown aside as useless. [Science and Hypothesis, chap. 12, 1st paragraph]
Yes, I could not say it better today. The object of mathematical theories is to make sense of quantified experiments, not to be a perfect description of the real nature of electrons.

The aether is a convenient hypothesis, because our best theories assume a pervasive and uniform quantum field. Even in a vacuum, it has energy, Lorentz symmetries, gauge fields, and virtual particles. Quantum electrodynamics is all about perturbations to that quantum vacuum. In the early XXc, the aether was cast aside as useless, but it keeps coming back under different names.

It is foolish to think that the true nature of the electron is its wave function. That is just a mathematical device for predicting observables. Attempts to clarify the nature of electrons with hidden variables, as by Bell and his followers, are even more foolish.

Sunday, July 24, 2016

Musser tries to explain nonlocality

George Musser writes on the Lumo blog about nonlocality:
The situation is nonlocal inasmuch as we are speaking of joint properties of spatiotemporally separated objects. We know the singlet electrons have a total spin of zero, but we cannot ascribe either particle a definite spin in advance of measurement. If you object to the word “nonlocal” in this context, fine. I would also be happy with “nonseparable,” “delocalized,” or “global.” ...

The real issue is how to explain the phenomenology of correlations. I know that Luboš does not think highly of the EPR paper (neither did Einstein), but it is the usual starting point for this discussion, so let us focus on the most solid part of that paper: the dilemma it presents us with. Given certain assumptions, to explain correlated outcomes, we must either assign some preexisting values to the properties of entangled particles or we must imagine action at a distance. Einstein recoiled from the latter possibility — he was committed to (classical) field theory. The former possibility was later ruled out by Bell experiments. So, presumably we need to question one of the assumptions going into the argument, and that’s where we go down the interpretive rabbit hole of superdeterminism, Everettian views, and so forth, none of which is entirely satisfactory, either. We seem to be stuck. ...

If you disagree, fine. Tell me what is going on. Give me a step-by-step explanation of how particle spins show the observed correlations even though neither has a determinate value in advance of being measured.
He is saying that if you want an intuitive understanding of "what is going on", then you have to accept either action at a distance or contradictions with experiment. Both of those are unacceptable.

The way out of this conundrum, as the textbooks have explained for 85 years, is to reject the idea that particle spin can be modeled by classical (pre-quantum) objects. By "what is going on", he means something he can relate to by personal experience. In other words, he wants a classical interpretation.

Classical interpretations are made impossible by the noncommuting observables, or by Bell's theorem, or by any of the several other ways the point has been explained.
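
To see the obstruction numerically, here is a short sketch of my own (in Python; the angles and the CHSH combination are the standard textbook ones, not anything taken from Musser's post). Quantum mechanics predicts that the singlet spin correlation at detector angles a and b is E(a,b) = -cos(a-b), and plugging the usual settings into the CHSH combination gives 2√2 ≈ 2.83, above the bound of 2 that any model assigning preexisting values must obey:

```python
import numpy as np

def E(a, b):
    """Quantum singlet-state spin correlation for detector angles a, b (radians)."""
    return -np.cos(a - b)

# Standard CHSH detector settings (radians)
a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

# CHSH combination; any model with preexisting (classical) values obeys |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # ~2.828 = 2*sqrt(2), exceeding the classical bound
```

The Bell test experiments confirm the quantum value, which is why no assignment of values in advance of measurement can reproduce the correlations.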

When you observe a particle's spin, you change its state into something that has a classical interpretation. But just temporarily. If you then measure spin in a different direction, you are back to non-classical behavior.

The supposed nonlocality is just an illusion. The experiments only seem nonlocal if you try to match them to a classical model.

I don't know why this is so difficult. I am just saying what the textbooks have said for decades.

Thursday, July 21, 2016

Einstein’s process of discovery

Einstein idolizer John D. Norton writes a lot of nonsense about Einstein, with the latest being "How Einstein Did Not Discover." It includes:
11. The Power of a Single Experiment

The example of Einstein’s discovery of the light quantum will illustrate another popular myth about what powered Einstein’s discoveries. There is, in each case, a single, perplexing, powerful, decisive, crucial experiment. Only Einstein, it is said, was able to interpret the experiment correctly and arrive at a great discovery.

This myth is best known through the example of the Michelson-Morley experiment. Contrary to many reports, Einstein did not formulate the theory as a direct response to its null outcome. The mistake is an easy one to make. It was long standard for pedagogic accounts of special relativity to begin with an account of the experiment and jump directly to special relativity. The pattern starts with Einstein’s (1907) early review. It introduces the Michelson-Morley experiment and no others in its opening pages. Holton’s (1969) analysis of the myth is standard and includes numerous examples. To it, we should add that the null result of the Michelson-Morley experiment was unhelpful and possibly counter-productive in Einstein’s investigations of an emission theory of light, for the null result is predicted by an emission theory.
As Norton notes, Einstein's 1907 article and most modern textbooks introduce the Michelson-Morley experiment (MMX) as being crucial to the development of special relativity. The papers that announced the discovery of the FitzGerald contraction, the Lorentz transformation, local time, and spacetime geometry all explained these concepts as consequences of MMX.

The null result could be explained by a light emission theory, or by a stationary Earth, or by an aether drag. So MMX alone did not prove special relativity. Other experiments caused the experts to reject those possibilities.

Einstein did not cite MMX or those other theories and experiments because his work was derivative. He was just giving a reformulation of Lorentz's theory, but not recapitulating Lorentz's theoretical and experimental arguments.

Einstein historians have to do a funny dance around this subject, because the relativity textbooks don't make any sense. They say that the MMX was crucial, and they say that Einstein invented special relativity, but Einstein denied that MMX had anything to do with his work.

There is a larger reason for denying the importance of MMX. Philosophers and science historians today are anti-positivists, and they deny that scientific breakthrus are based on crucial experiments. Relativity was considered the first great breakthru of the XXc, so the positivist-haters need to have some way of saying that it was not driven by experiment.

It seems possible that someone could have predicted special relativity from abstract principles of causality, or from mathematical analysis of electromagnetism. But that is not how it happened. It was positivist analysis of experiments.

Monday, July 18, 2016

Air Force funds quantum computers

NextGov reports:
The Air Force wants white papers that describe new ways quantum computing could help achieve its mission, according to an amended Broad Agency Announcement posted Friday. Eventually, the government could provide a test-bed where a contractor might install, develop and test a quantum computing system, according to the announcement.

Last year, the Air Force announced it had about $40 million available to fund research into, and the eventual maintenance and installation of a quantum system -- a branch of emerging computing technology that relies on the mechanics of atomic particles to process complex equations. ...

The Air Force is among several other federal groups interested in quantum.

Last year, for instance, the Intelligence Advanced Research Projects Activity, which focuses on research and development, said it planned to award a multiyear grant to IBM to build out a component of a quantum computer. A true quantum computer might be useful, IARPA program manager David Moehring told Nextgov then, because it might be applied to complex questions like the "Traveling Salesman Problem" -- what's the best way for a salesman to visit several different locations?
$40M is not much money to the Air Force, but it shows how money is pouring into the field.

Most quantum computing projects are not even very expensive, by the standards of modern physics experiments.

I liked this comment:
String theory, multiple universes, complexity, quantum teleportation... these are to Physics what Division I football is to college, which is to say, it sells tickets and opens purse strings. No one is going to buy a book on Newtonian physics and relive their junior year in high school. But let Brian Greene write something crazy and out there about a "Holographic Universe" or somesuch and the peeps will scoop it up, and maybe even decide to become physics and math majors, and there are lots of worse results than that. So let the alumni donate for the football team, and let the googley-eyed high schoolers all plan on high-paying and fulfilling careers as Quantum Mechanics. It puts butts in the seats...
So do most physicists realize that 90% of the public image of Physics is garbage, but quietly go along with it because it keeps the funding dollars coming in?

Sometimes I think that I am just posting the obvious on this blog. Maybe everyone knows it, but cannot say. I can say it because I am not part of the Physics money machine.

Tuesday, July 12, 2016

Google wants us to worry about quantum computing

Google is trying to make us nervous about quantum computing:
Google is working on safeguarding Chrome against the potential threat of quantum computers, the company announced today.
At least they admit that the quantum computers may be impossible:
Quantum computers exist today but, for the moment, they are small and experimental, containing only a handful of quantum bits. It's not even certain that large machines will ever be built, although Google, IBM, Microsoft, Intel and others are working on it. ... quantum computers could undermine the security of the entire internet.
Jonah Lehrer is back with a new book, after a spectacular rise and fall as a science writer. Before his fall, I accused him of fabricating an Einstein quote, but no one cared about that.

Thursday, July 7, 2016

When evidence is too good to be true

Phys.org reported this in January:
Under ancient Jewish law, if a suspect on trial was unanimously found guilty by all judges, then the suspect was acquitted. This reasoning sounds counterintuitive, but the legislators of the time had noticed that unanimous agreement often indicates the presence of systemic error in the judicial process, even if the exact nature of the error is yet to be discovered. They intuitively reasoned that when something seems too good to be true, most likely a mistake was made.

In a new paper to be published in The Proceedings of The Royal Society A, a team of researchers, Lachlan J. Gunn, et al., from Australia and France has further investigated this idea, which they call the "paradox of unanimity."

"If many independent witnesses unanimously testify to the identity of a suspect of a crime, we assume they cannot all be wrong," coauthor Derek Abbott, a physicist and electronic engineer at The University of Adelaide, Australia, told Phys.org. "Unanimity is often assumed to be reliable. However, it turns out that the probability of a large number of people all agreeing is small, so our confidence in unanimity is ill-founded. This 'paradox of unanimity' shows that often we are far less certain than we think."

The researchers demonstrated the paradox in the case of a modern-day police line-up, in which witnesses try to identify the suspect out of a line-up of several people. The researchers showed that, as the group of unanimously agreeing witnesses increases, the chance of them being correct decreases until it is no better than a random guess.

This is an important point, and a paradox. If someone tells you that scientists are unanimous about global warming, vaccine policy, cosmic inflation, or Donald Trump, you should be suspicious.

Of course the textbooks are unanimous on many things, such as energy conservation. So we should not reject all that textbook knowledge. But most of those things only got into the textbooks after some healthy debate about the pros and cons.
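
For a rough sense of the arithmetic behind the paradox, here is a toy Bayesian sketch (my own model and numbers, not the calculation in the Gunn et al. paper). Suppose each witness independently identifies the right person 90% of the time, and there is a 1% prior chance that the line-up procedure is biased in a way that makes every witness point at the same innocent person. A long unanimous streak then becomes better explained by the biased procedure than by everyone independently being right, so the posterior confidence in the identification falls as the number of unanimous witnesses grows:

```python
# Toy model of the "paradox of unanimity" (illustrative numbers, not the paper's).
p_correct = 0.90   # chance an honest, independent witness picks the true culprit
p_biased = 0.01    # prior chance the line-up procedure is systematically biased,
                   # making every witness pick the same (wrong) person

def prob_correct_given_unanimous(n):
    """P(identification correct | n unanimous witnesses), ignoring the unlikely
    case of a sound procedure producing unanimity on the wrong person."""
    p_unanimous_sound = (1 - p_biased) * p_correct ** n
    p_unanimous_biased = p_biased * 1.0
    return p_unanimous_sound / (p_unanimous_sound + p_unanimous_biased)

for n in [1, 3, 5, 10, 20, 50]:
    print(n, round(prob_correct_given_unanimous(n), 3))
```

With these numbers, the confidence drops from about 99% for a single witness to roughly a third for fifty unanimous witnesses, even though only a tiny 1% chance of systemic error was assumed.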

Wednesday, July 6, 2016

Comparing science to poetry

Philosophers sometimes complain that they get no respect from scientists.

The NY Times has a running series of essays, and in the latest one the author denies that there is a scientific method because, he says, science is like poetry:
In 1970, I had the chance to attend a lecture by Stephen Spender. He described in some detail the stages through which he would pass in crafting a poem. He jotted on a blackboard some lines of verse from successive drafts of one of his poems, asking whether these lines (a) expressed what he wanted to express and (b) did so in the desired form. He then amended the lines to bring them closer either to the meaning he wanted to communicate or to the poetic form of that communication.

I was immediately struck by the similarities between his editing process and those associated with scientific investigation and began to wonder whether there was such a thing as a scientific method. Maybe the method on which science relies exists wherever we find systematic investigation. In saying there is no scientific method, what I mean, more precisely, is that there is no distinctly scientific method.

There is meaning, which we can grasp and anchor in a short phrase, and then there is the expression of that meaning that accounts for it, whether in a literal explanation or in poetry or in some other way. Our knowledge separates into layers: Experience provides a base for a higher layer of more conceptual understanding. This is as true for poetry as for science. ...

James Blachowicz is a professor emeritus of philosophy at Loyola University Chicago
Science finds objective truths about the world. Poetry just expresses thoughts in an entertaining way. If he cannot see the difference, he deserves no respect.

Saturday, July 2, 2016

The crisis in Physics

NPR Radio reports:
Of course, there are many scientists who continue to see great promise in string theory and the multiverse. But, as Marcelo and I wrote in The New York Times last year, it all adds up to muddied waters and something some researchers see as a "crisis in physics."

Smolin and Unger believe this crisis is real — and it's acute. They pull no punches in their sense that the lack of empirical data has led the field astray.
The "crisis" here is that we have good physical theories that explain nearly everything that is observed. Theoretical physicists like to speculate about unobservable parallel universes, but then they have no data to test their ideas.