Thursday, October 31, 2019

Aaronson explains qubits in the NY Times

Scott Aaronson announces his New York Times op-ed on quantum supremacy. His personal interest in this is greater than I thought, as he says the NY Times forced him to reveal:
Let’s start with applications. A protocol that I came up with a couple years ago uses a sampling process, just like in Google’s quantum supremacy experiment, to generate random bits. ... Google is now working toward demonstrating my protocol; it bought the non-exclusive intellectual property rights last year.
He was the outside reviewer of the Google paper published in Nature. So he had a big hand in the editorial decision to say that this was quantum supremacy. Aaronson claims the credit for Google confirming that quantum computers can be used for generating random numbers. And Google paid Aaronson for the privilege.

I am not accusing Aaronson of being crooked here. I am sure his motives are as pure as Ivory Soap. But he sure has a lot invested in affirming quantum supremacy based on random number generation. Maybe the journal Nature should have also required this disclosure.

He admits:
The computer revolution was enabled, in large part, by a single invention: the transistor. ... We don’t yet have the quantum computing version of the transistor — that would be quantum error correction.
So we don't have real qubits yet.

Aaronson has spent many years trying to convince everyone that there is a right way and a wrong way to explain qubits. Here is the wrong way:
For a moment — a few tens of millionths of a second — this makes the energy levels behave as quantum bits or “qubits,” entities that can be in so-called superpositions of the 0 and 1 states.

This is the part that’s famously hard to explain. Many writers fall back on boilerplate that makes physicists howl in agony: “imagine a qubit as just a bit that can be both 0 and 1 at the same time, exploring both possibilities simultaneously.”
So here is his better version:
Here’s a short version: In everyday life, the probability of an event can range only from 0 percent to 100 percent (there’s a reason you never hear about a negative 30 percent chance of rain). But the building blocks of the world, like electrons and photons, obey different, alien rules of probability, involving numbers — the amplitudes — that can be positive, negative, or even complex (involving the square root of -1). Furthermore, if an event — say, a photon hitting a certain spot on a screen — could happen one way with positive amplitude and another way with negative amplitude, the two possibilities can cancel, so that the total amplitude is zero and the event never happens at all. This is “quantum interference,” and is behind everything else you’ve ever heard about the weirdness of the quantum world.
Really? I may be dense, but I don't see how this is any better. He insists that the key is realizing that probabilities can be negative, or imaginary.

But this is just nonsense. There are no negative probabilities in quantum mechanics, or anywhere else.

We do have interference. Light shows interference patterns, as all waves do. There is nothing the slightest bit strange about waves showing interference. But Aaronson insists on saying that the interference comes from negative probabilities. I don't see how that is mathematically accurate, or helpful for understanding quantum mechanics.
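To be fair to the amplitude arithmetic itself, here is a minimal sketch (my own illustration, not from the op-ed) of the two-path cancellation Aaronson describes: each path alone would give probability 1/2, but the amplitudes add before squaring, so the combined probability is zero.

```python
import math

# Two paths a photon could take to the same spot on a screen.
# Each path contributes a complex amplitude; a probability is |amplitude|^2.
amp_path1 = complex(+1 / math.sqrt(2), 0)
amp_path2 = complex(-1 / math.sqrt(2), 0)

# Either path alone would put the photon there with probability 1/2 ...
p1 = abs(amp_path1) ** 2
p2 = abs(amp_path2) ** 2

# ... but the amplitudes add BEFORE squaring, and here they cancel exactly.
total_amp = amp_path1 + amp_path2
p_total = abs(total_amp) ** 2

print(p1, p2, p_total)  # 0.5, 0.5, 0.0 (up to rounding)
```

Note that nothing here is a negative probability: the amplitudes are signed (or complex), but every probability computed from them is between 0 and 1.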


  1. Negative probability, huh? No. For the same reason that you can't have 200% certainty of anything. Using a '%' symbol for quantities above 100 or below zero kind of misses the point of what 'percent' means.

    I bet he believes in negative amounts of time too. I can't remember how many times my calculus teacher had to explain to the class that not every answer a calculation produces is useful, and that some are discarded outright.

  2. I think he has issues because he mixes up 2-3 metaphors/ideas in a very confusing way---because the metaphors themselves are like that.

    He wants to use the word ``amplitude'' (while dropping the adjective ``complex,'' which may perhaps be OK for the NY Times). He emphasizes amplitudes obviously because Feynman liked that word (cf. the small QED book). But then, since the layman wouldn't know about complex exponentials, he can't connect these ``amplitudes'' with interference or waves. So the word ``interference'' comes up rather suddenly, with a jerk as it were. Then, since our poor fellow works in the QC field, he also has to use only two states for any superposition. There, he loses the guy who had caught on to ``interference'' (as of waves). Finally, he must also have ``probability'' thrown in, because this is QM (though he would be careful not to emphasize that the QC output is always probabilistic, in principle). The only guy who can connect with ``probability'' perhaps would be the one who understands ``amplitude,'' but by now not just interference but also two-state qubits have gone out the window.

    It's all there in the kind of metaphors he picks up. They can't connect together very easily.

    But it beats me why he doesn't drop all that QM-mystifying boilerplate of the Feynmanian kind, and start directly with the Bloch sphere.

    He could simplify the Bloch sphere to a circle---say, a clock face with only one, continuously movable, hand. Such a simplification would be OK; this is just the NY Times. (Insert caveats if necessary.)

    Then he could say that a single qubit's state is like having the clock hand pointing to some time, say the 2.5 o'clock position or the 4 o'clock position. When we do a measurement, the measurement process works in such a way that it snaps the hand to either the 12 o'clock position or the 6 o'clock position. Only one of the two positions. No other. Always. Like how an electrical switch snaps to either the "on" or the "off" position. Further, which position the hand snaps to is random, but with definite probabilities. The probability of the hand swinging to one of the two positions is related to how close the hand is to that position. In QM, before measurement, a hand at some other position (say the 2.5 o'clock position) is said to be in a superposition of the vertically up *and* the vertically down positions (the 12 o'clock and the 6 o'clock positions).
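This clock metaphor maps directly onto the standard Bloch-sphere formula: a hand at polar angle theta (measured from 12 o'clock) snaps to 12 with probability cos^2(theta/2). A rough Python sketch (function name and seed are my own, for illustration):

```python
import math
import random

def measure_clock_qubit(hour, rng=random.Random(0)):
    """A hand at `hour` (0-12 on the clock face) represents a qubit state.
    Measurement snaps it to 12 o'clock (|0>) or 6 o'clock (|1>), with
    P(12 o'clock) = cos^2(theta/2), theta being the angle from 12 o'clock."""
    theta = (hour % 12) * (2 * math.pi / 12)
    p_up = math.cos(theta / 2) ** 2
    return 12 if rng.random() < p_up else 6

# The closer the hand starts to 12 o'clock, the more often it snaps to 12.
samples = [measure_clock_qubit(2.5) for _ in range(100_000)]
frac_12 = samples.count(12) / len(samples)
print(frac_12)  # near cos^2(2.5 * pi / 12) ~ 0.63
```

A hand at exactly 3 o'clock (theta = 90 degrees) gives the 50/50 case, matching the familiar equal superposition.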

    Then he could continue, if he wanted, to say that a many-qubit QC is like many clocks, side by side, all connected together in some complicated way. There is no probability for a single clock anymore; the probability is only for all the hands of all the clocks taken together.

    I am sure he (and his editor) could express the idea above in a much better and compact way.

    But then, following a Bloch sphere-based explanation would require that he give up mentioning amplitudes, interference, etc.---all of Feynman's favorite words. He would also have to give up the sexy-fying idea that the ``building blocks'' (or ``bricks'') of *solid* matter are not solid---that they are not even things, just hazy *probabilistic* clouds of mathematics in a high-dimensional Hilbert space. He would lose this sexy-fying aspect.

    Basically, he would have to give up the idea that simpler metaphors can be used in explaining QM.

    But if he insists on using the metaphors he does (like ``amplitudes''), then there is no simple way to connect them.

    That's basically why he ends up going around in circles, in a chaotic way.

    He should consider my suggestion.

    [You too should like my suggestion---the Bloch sphere was originally invented much earlier by Poincaré---the same mathematician who originated the movement of denying the Lorentz aether.]