Wednesday, September 24, 2014

Aaronson writing quantum computing book

MIT complexity theorist Scott Aaronson announces:
A few months ago, I signed a contract with MIT Press to publish a new book: an edited anthology of selected posts from this blog, along with all-new updates and commentary.  The book’s tentative title (open to better suggestions) is Speaking Truth to Parallelism: Dispatches from the Frontier of Quantum Computing Theory.
His book will surely include his post with the most hits:
For better or worse, I’m now offering a US$100,000 award for a demonstration, convincing to me, that scalable quantum computing is impossible in the physical world. This award has no time limit other than my death, and is entirely at my discretion (though if you want to convince me, a good approach would be to convince most of the physics community first).
He explains that he was driven to make this offer by skepticism from me and others about the possibility of quantum computers. He has been active in throwing cold water on over-hyped claims about quantum computers, but he is also stung by criticism that he is devoting his life to analyzing something that may not even be physically possible. He has confessed to envying scientists in other fields, who can point to the intrinsic worth of their subject, while quantum computing theorists have to rely on bogus claims about practical applications.

If I collect my quantum mechanics posts for a book, I'll be sure to mention his offer.

Among physicists, the common views on quantum computing, in order of decreasing popularity, are: (1) quantum computers have already been built, and they will eventually have enough qubits to be useful; (2) scalable quantum computing has not been demonstrated but is a logical consequence of quantum mechanics and advanced engineering should eventually make it possible; and (3) achieving super-Turing computing is like building a perpetual motion machine, and is unlikely to ever be achieved.

The popular press would lead you to believe opinion (1). Aaronson stands for opinion (2), and I agree with him that (1) is wrong. I believe in opinion (3), for reasons explained here. I could be proved wrong, of course.

In addition to those reasons, I have some philosophical differences with him that contribute to our divergent views.

I subscribe to an epistemic, rather than ontic, interpretation of quantum mechanics. That is, I accept the Copenhagen interpretation that was promoted by Bohr and has been generally accepted since the 1930s, and what Mermin now calls QBism. Aaronson paints a picture of our universe as weirdly intermediate between local and nonlocal. The psi-ontic physicists are the ones who are forever saying that quantum mechanics does not make sense, and that philosophical principles require an unobservable multiverse.

I subscribe to logical positivism, so I am very skeptical about what cannot be demonstrated. My preference is for a more positivist interpretation than even what Bohr proposed.

My slogan is Natura non facit saltus (nature does not make jumps). Leibniz used this phrase to attack the "occult qualities" of an action-at-a-distance theory.

Here are Lumo and Gell-Mann sensibly dismissing many-worlds and nonlocality:
Gell-Mann spends several minutes by arguing that the feature of Everett's ideology that there are "many worlds that are equally real" is operationally meaningless. The comment may only mean that the theory treats the possible alternative histories on equal footing, except for their generally different probabilities. But only one of those actually takes place in the "real", experience-based sense of the word "actually". ...

However, that changes totally after 11:50 when Gell-Mann starts to talk about the "foolishness" often associated with the entanglement ("Einstein-Podolsky-Rosen-Bohm effect", using his words). He treats this issue at some length in his book; I hope he meant The Quark and the Jaguar.

OK, where did the "foolishness" come from? Gell-Mann says that the bulk of John Bell's work was right but he introduced words that were prejudicial such as "nonlocal". People often say that there is something nonlocal about the EPR phenomena but the only correct similar statement that they could mean, Gell-Mann emphasizes (and I often do, too) is that a classical interpretation of what is happening would require nonlocality (or negative probabilities). But the world is not classical, and no nonlocality is needed because the world is quantum mechanical. As far as Gell-Mann can tell, it's like giving a bad name to a dog and sticking with it.
Many of the quantum computing enthusiasts subscribe to many-worlds, and some of them argue that the mysterious speedup is going to come from computation in parallel universes. Guru David Deutsch says that, and Brian Cox just said something similar on the BBC. Aaronson does not go that far, but he does stress that the key to understanding quantum mechanics is negative probability. Gell-Mann has a much more sensible view.
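Gell-Mann's point that only a classical account of the EPR correlations would require nonlocality (or negative probabilities) can be illustrated with a quick calculation. Here is my own sketch, not taken from either blog: the CHSH combination of correlations for a spin singlet, with the standard choice of measurement angles, exceeds the bound of 2 that every local hidden-variable model must obey.

```python
import math

def E(a, b):
    # Quantum correlation for a spin singlet measured along
    # directions separated by angle (a - b): E = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH measurement settings, in radians.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, -math.pi / 4

# Any local hidden-variable model satisfies |S| <= 2,
# but quantum mechanics reaches 2*sqrt(2) here.
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(abs(S))  # 2*sqrt(2), about 2.828
```

The violation is a fact about the correlations themselves; it tells you nothing about parallel universes, only that no classical local model reproduces the quantum predictions.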


  1. So some people claim that quantum computing would involve calculations being made in parallel universes. Presumably that implies the existence of a vast number of parallel universes where people have built quantum computers that do calculations in our universe. Would that cause any effects that could be observed?

  2. Good question. No one has ever seen any such effects, of course. I don't know how the MWI advocates would try to answer that.