Friday, October 4, 2019

Google scooped by unconventional p-bit computer

It is funny how quantum computing evangelist Scott Aaronson is flummoxed by being scooped by a rival technology. Aaronson comments on a Nature paper entitled Integer factorization using stochastic magnetic tunnel junctions (warning: paywalled); see also here for a university press release. He writes:

The authors report building a new kind of computer based on asynchronously updated “p-bits” (probabilistic bits). A p-bit is “a robust, classical entity fluctuating in time between 0 and 1, which interacts with other p-bits … using principles inspired by neural networks.” They build a device with 8 p-bits, and use it to factor integers up to 945. They present this as another “unconventional computation scheme” alongside quantum computing, and as a “potentially scalable hardware approach to the difficult problems of optimization and sampling.”

A commentary accompanying the Nature paper goes much further still — claiming that the new factoring approach, “if improved, could threaten data encryption,” and that resources should now be diverted from quantum computing to this promising new idea, one with the advantages of requiring no refrigeration or maintenance of delicate entangled states. (It should’ve added: and how big a number has Shor’s algorithm factored anyway, 21? Compared to 945, that’s peanuts!)

Since I couldn’t figure out a gentler way to say this, here goes: it’s astounding that this paper and commentary made it into Nature in the form that they did.

This is funny. While Google keeps mum in order to over-dramatize its silly result, a rival group steals the spotlight with non-quantum technology.
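
To get a feel for what the p-bit scheme is doing, here is a toy software caricature in Python (my own sketch, not the authors' hardware; the energy function |N - p*q|, the single-bit update rule, and every parameter below are assumptions). Factoring is posed as a stochastic search over the bits of two candidate factors, with each bit fluctuating between 0 and 1 and biased toward lower-energy configurations.

import math
import random

def pbit_factor(N, bits=6, beta=0.01, steps=200_000, restart=5_000, seed=None):
    # Encode candidate factors p and q in `bits` bits each, flip one randomly
    # chosen bit per step, and accept the flip with a Boltzmann-like probability
    # based on the "energy" |N - p*q|.  Periodic random restarts keep the walk
    # from getting stuck.  This is a software caricature of probabilistic bits,
    # not the Nature paper's hardware, energy function, or update rule.
    rng = random.Random(seed)

    def decode(v):
        p = sum(b << i for i, b in enumerate(v[:bits]))
        q = sum(b << i for i, b in enumerate(v[bits:]))
        return p, q

    def energy(v):
        p, q = decode(v)
        return abs(N - p * q)

    x, E = [], 0
    for step in range(steps):
        if step % restart == 0:                 # periodic random restart
            x = [rng.randint(0, 1) for _ in range(2 * bits)]
            E = energy(x)
        i = rng.randrange(len(x))
        x[i] ^= 1                               # propose flipping one p-bit
        E_new = energy(x)
        if E_new <= E or rng.random() < math.exp(-beta * (E_new - E)):
            E = E_new                           # accept the flip
        else:
            x[i] ^= 1                           # reject: undo the flip
        if E == 0:
            p, q = decode(x)
            if p > 1 and q > 1:                 # skip trivial factorizations
                return p, q
    return None

print(pbit_factor(945))   # typically prints a nontrivial pair such as (27, 35)

With 6 bits per factor there are only 2^12 = 4,096 configurations, so 945 is easy for any search method; the real question is how the approach scales to numbers anyone cares about.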

Aaronson is annoyed that a non-quantum technology is making extravagant claims, but exactly how is Google's quantum computing effort any better?

Apparently Google refuses to compete in any meaningful way, as Aaronson says:
How large a number Google could factor, by running Shor’s algorithm on its current device, is a semi-interesting question to which I don’t know the answer. My guess would be that they could at least get up to the hundreds, depending on how much precompilation and other classical trickery was allowed. The Google group has expressed no interest in doing this, regarding it (with some justice) as a circus stunt that doesn’t showcase the real abilities of the hardware.
A circus stunt? Obviously the results would be embarrassingly bad for Google.

Others have claimed to use quantum computers to factor 15 or 21, but those were circus stunts that failed to show any evidence of a quantum speedup.

An interesting quantum computer result would factor numbers with Shor's algorithm, and show how the work scales with the size of the number.
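
To make "how the work scales" concrete, here is a back-of-envelope comparison (only a sketch: constant factors, error correction, and hardware details are ignored, and the formulas are just the standard textbook asymptotics, roughly n^3 quantum gates for Shor on an n-bit number versus the sub-exponential cost of the general number field sieve):

import math

def shor_gate_estimate(n_bits):
    # Very rough asymptotic gate count for Shor's algorithm on an n-bit number:
    # on the order of n**3, dominated by modular exponentiation.  Constant
    # factors and error-correction overhead are ignored.
    return n_bits ** 3

def gnfs_work_estimate(n_bits):
    # Heuristic cost of the general number field sieve, the best known classical
    # factoring algorithm: exp((64/9)^(1/3) (ln N)^(1/3) (ln ln N)^(2/3)).
    # Again, constants are ignored; this is only the asymptotic shape.
    ln_n = n_bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

for n in (16, 64, 256, 1024, 2048):
    print(f"{n:5d}-bit number:  Shor ~ {shor_gate_estimate(n):.1e} gates,"
          f"  classical GNFS ~ {gnfs_work_estimate(n):.1e} operations")

The point is that the quantum cost grows polynomially while the best known classical attack grows sub-exponentially, so a convincing demonstration would show real hardware tracking the polynomial curve rather than factoring one small number.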

Also:
But as I explained in the FAQ, running Shor to factor a classically intractable number will set you back thousands of logical qubits, which after error-correction could translate into millions of physical qubits. That’s why no one can do it yet.
And that is why we will not see true quantum supremacy any time soon. All Google has is a fancy random number generator.
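
As a rough check on the quoted overhead, the arithmetic does come out that way (the figures below are illustrative assumptions, not numbers from Google, Aaronson, or the Nature paper):

# Back-of-envelope logical-to-physical qubit overhead for surface-code-style
# error correction.  Every number here is an illustrative assumption.

logical_qubits = 4000                          # assumed qubits to run Shor on an RSA-sized number
code_distance = 25                             # assumed surface-code distance
physical_per_logical = 2 * code_distance ** 2  # roughly 2*d^2 data + measurement qubits

total_physical = logical_qubits * physical_per_logical
print(f"~{physical_per_logical} physical qubits per logical qubit")
print(f"~{total_physical:,} physical qubits in total")  # ~5 million with these assumptions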
