Monday, March 31, 2014

High precision needed for quantum computing

Craig Feinstein asks:
Leonid Levin said, "Exponential summations used in QC require hundreds if not millions of decimal places accuracy. I wonder who would expect any physical theory to make sense in this realm."
Peter Shor replies:
If you believe the fault-tolerant threshold theorem for quantum computers, you do not require hundreds of digits of accuracy.

Levin does not believe this theorem. More precisely, he believes that the hypotheses required for the theorem to work do not apply to the actual universe.

I believe his mental model of quantum mechanics resembles the idea that the physics of the universe is being simulated on a classical machine which has floating point errors. I don't believe this is true. ...

The real question is whether the rules of the universe are exact unitary evolution or something else. If they're exact unitary evolution and you have locality of action (quantum field theories, including QED, satisfy these) then the fault-tolerant threshold theorem holds. If the universe has extra levels of weirdness under the quantum field theory, then it's not clear the hypotheses are satisfied.
I am not sure who is right here. Quantum mechanics is a linear theory and has been verified to high precision in some contexts. But a linear theory is nearly always an approximation to a nonlinear theory, and I don't think that the quantum computer folks have shown that they are operating within a valid approximation.
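To make the worry concrete, here is a minimal numerical sketch (my own toy model, not anything from Shor or Levin): evolve a random state with an exactly unitary map, and with a copy of that map carrying a small nonunitary defect of assumed size eps. The unitary map preserves the norm to machine precision indefinitely; the defective one drifts, which is the kind of compounding error that fault tolerance only provably handles under the theorem's hypotheses.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # dimension of a toy Hilbert space

# Build an exactly unitary U = exp(-iH) from a random Hermitian H.
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals)) @ evecs.conj().T

eps = 1e-4  # assumed size of a hypothetical nonunitary defect
V = U + eps * rng.standard_normal((d, d))  # slightly nonunitary map

psi = rng.standard_normal(d) + 1j * rng.standard_normal(d)
psi /= np.linalg.norm(psi)
psi_u, psi_v = psi.copy(), psi.copy()

for _ in range(10_000):
    psi_u = U @ psi_u  # exact unitarity: norm stays 1 to machine precision
    psi_v = V @ psi_v  # tiny defect: the norm error compounds step by step

print(abs(np.linalg.norm(psi_u) - 1))  # ~1e-13, just float roundoff
print(abs(np.linalg.norm(psi_v) - 1))  # many orders of magnitude larger
```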

Shor assumes unitarity, but there are interpretations of quantum mechanics that are not unitary, and no one has proved them wrong. So how do we know nature is really unitary?

If unitarity is some physically observed law, like conservation of momentum, then we should have error bars that show just how close to unitary the world is, and with what confidence in different situations.
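To show what such an "error bar on unitarity" might even look like, here is a hypothetical sketch: take a process matrix M reconstructed from experiment (faked here as a random unitary plus a small assumed defect) and report how far M†M sits from the identity.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4

# Fake "tomography" result: a random unitary Q plus a small assumed defect.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)))
M = Q + 1e-3 * rng.standard_normal((d, d))

# Deviation from unitarity: zero if and only if M is exactly unitary.
defect = np.linalg.norm(M.conj().T @ M - np.eye(d), ord=2)
print(f"deviation from unitarity (spectral norm of M†M - I): {defect:.1e}")
```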

If unitarity is a metaphysically necessary truth, derived from the conservation of probability, then how have so many textbooks managed to get by with the Copenhagen interpretation?

I say that quantum computing is a vast extrapolation of known physics, and extrapolations are unreliable.

In other news:
An international team of researchers has created an entanglement of 103 dimensions with only two photons, beating the previous record of 11 dimensions.

The discovery could represent an advance toward better encryption of information and quantum computers with much higher processing speeds, according to a statement by the researchers.

Until now, to increase the “computing” capacity of these particle systems, scientists have mainly turned to increasing the number of qubits (entangled particles), up to 14 particles. ...

“The most immediate practical use is expected to be in secure communication,” Huber explained to KurzweilAI in an email interview.
I haven't read the paper but I am pretty sure that there is no practical application to secure communication. I would have expected them to claim that all those dimensions could be used for quantum computing.
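For readers wondering what "103 dimensions with only two photons" means in state-vector terms, here is a sketch (my own illustration, not the paper's construction): a maximally entangled state of two d-level systems has Schmidt rank d, and its entanglement entropy is log2(d), about 6.7 ebits for d = 103.

```python
import numpy as np

d = 103
# |psi> = (1/sqrt(d)) * sum_k |k>|k>, written as a d x d coefficient matrix.
C = np.eye(d) / np.sqrt(d)

s = np.linalg.svd(C, compute_uv=False)  # Schmidt coefficients
p = s**2                                # Schmidt probabilities
entropy = -np.sum(p * np.log2(p))

print("Schmidt rank:", int(np.sum(s > 1e-12)))       # 103
print(f"entanglement entropy: {entropy:.2f} ebits")  # log2(103) = 6.69
```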
