Wednesday, October 1, 2014

Quantum crypto hype on PBS Nova

I just watched the PBS NOVA episode Rise of the Hackers:
A new global geek squad is harnessing cryptography to stay a step ahead of cybercriminals.
At the heart of almost everything we do on the web is a special kind of number, a prime number. ... But what makes them so special to codes is when you combine two of them. ... If ... you've got to figure out the two original primes, the only way is trial and error. ...

This system of public and private keys is known as the RSA algorithm. That beautiful piece of mathematics has fundamentally changed the world around us.
I do not think that these explanations are very helpful. It is true that most SSL web servers use RSA, but Diffie-Hellman came first and is arguably superior for most purposes. That was the real breakthrough. And trial and error is not the only way to break RSA, as other factoring methods, such as the number field sieve, are asymptotically much faster.
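To make the point concrete, here is a toy Python sketch (my own illustration, nothing from the show) of Pollard's rho method, one of the elementary factoring algorithms that already beats trying primes one by one; the number field sieve is asymptotically far better still.

```python
# Illustrative sketch: factoring a small RSA-style modulus without trial division,
# using Pollard's rho method (far slower than the number field sieve, but already
# much better than trying every prime in turn).
import math
import random

def pollard_rho(n):
    """Return a nontrivial factor of the composite number n."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y, c, d = x, random.randrange(1, n), 1
        while d == 1:
            x = (x * x + c) % n          # tortoise takes one step
            y = (y * y + c) % n
            y = (y * y + c) % n          # hare takes two steps
            d = math.gcd(abs(x - y), n)
        if d != n:
            return d                     # otherwise retry with a new constant c

n = 10403  # = 101 * 103, a toy "RSA modulus"
p = pollard_rho(n)
print(p, n // p)  # recovers the two primes
```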
It has made the hunt for very very large prime numbers one of the most important quests in the field of mathematics. And here is the current largest, all 5000 pages of it.
This is nonsense. These numbers have almost no importance to either mathematics or cryptography. They are just a silly curiosity.
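For what it is worth, the record primes are Mersenne primes of the form 2^p - 1, found by grinding through the Lucas-Lehmer test on enormous exponents. Here is a rough sketch of that test on tiny exponents, just to show there is no deep mystery in the hunt:

```python
# Sketch of the Lucas-Lehmer test, the method behind the record Mersenne primes
# (2**p - 1).  The real record exponents run to tens of millions of digits; tiny
# exponents are used here just to show the recurrence.
def is_mersenne_prime(p):
    """For an odd prime p, 2**p - 1 is prime iff s_(p-2) == 0 mod (2**p - 1)."""
    m = 2 ** p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

print([p for p in (3, 5, 7, 11, 13, 17, 19) if is_mersenne_prime(p)])
# [3, 5, 7, 13, 17, 19]  -- 2**11 - 1 = 2047 = 23 * 89 is correctly rejected
```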
The mathematics of the world of the very small means that things can be in many places at the same time. ... They may be a way to crack the world's most powerful codes.
Not really. Saying that particles can be in more than one place at the same time is just an interpretation of the wave function, and has never been observed. So you can choose to believe in it or not, and no one can disprove you.

They go on to explain qubits as being 0 and 1 at the same time, so that a quantum computer can calculate both simultaneously. Scott Aaronson argues on his blog that this is a bad explanation, because it leads you to believe that quantum computers are more powerful than they really are.
It blows the doors off of RSA encryption. Right. All we need is more and more qubits, a larger quantum computer. Really all that's left to do is to scale up this particular architecture.
This is just nonsense. They have not made a scalable qubit, and there is no known way to scale up what they have. They have never achieved that quantum speedup.
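Aaronson's complaint is easy to see even in a two-line simulation. Here is a numpy toy model (my own sketch, not anyone's hardware): a qubit after a Hadamard gate is in an equal superposition of 0 and 1, but any measurement hands you a single random bit, so "computing both at once" is not something you ever get to read out.

```python
# Toy simulation of the point at issue: a qubit after a Hadamard gate is in an
# equal superposition of 0 and 1, but each measurement yields a single random bit.
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                      # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(state) ** 2            # Born rule: [0.5, 0.5]

samples = np.random.choice([0, 1], size=10, p=probs)
print(samples)  # ten independent measurements, each a single 0 or 1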

Seth Lloyd says:
Quantum computers are particularly fine for ... or for simulating what happens when a black hole collapses, or for that matter, a recent experiment that we did to actually implement a version of time travel.
He is dreaming here.
But quantum mechanics also provides methods of communicating securely in a way that is guaranteed by the laws of physics.
More nonsense from Lloyd.
Quantum cryptography is already used by folks who want extreme security. By banks and by agencies whose job is to protect information. And nowadays there are a number of commercial companies who actually build quantum cryptographic systems. For a fee, you too can communicate in complete and utter privacy guaranteed by the laws of quantum mechanics.
Before you pay that fee, you should realize: Every one of those systems has been broken. No bank or agency uses them for anything important. They cannot be used with routers or any other internet equipment. They cannot authenticate who you are communicating with. Even that supposed quantum mechanics guarantee is just a probabilistic statement that is much weaker than you can get with RSA or other methods.
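To see how weak that "guarantee" is, here is the standard back-of-the-envelope arithmetic (my own sketch, assuming an idealized textbook BB84 run against a simple intercept-resend eavesdropper): she causes an error on each compared check bit with probability 1/4, so her chance of escaping notice is only (3/4)^n in the number of check bits, and everything else about the system is ordinary engineering that can and does fail.

```python
# Probability that an intercept-resend eavesdropper goes unnoticed in an idealized
# BB84 check, as a function of the number n of compared check bits: (3/4)**n.
for n in (10, 50, 100, 500):
    print(n, (3 / 4) ** n)
# 10   ~5.6e-2
# 50   ~5.7e-7
# 100  ~3.2e-13
# 500  ~3.5e-63
```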

Lloyd admits that these systems are susceptible to an impostor at the other end, but then implies that all crypto systems have this same problem. But that is not true. The main technical advance of RSA was actually not encryption, but authentication. When you log into Ebay or Amazon, your browser verifies that you are really connecting to Ebay or Amazon. In the worst case, you could be connecting to someone who has hijacked the DNS servers and stolen Ebay's RSA secret key, or who has built a quantum computer to factor the RSA public key. Neither has ever happened. Quantum crypto offers no such assurances, and you would be on your own to verify that a cable connects directly from your computer to the Ebay server. It can detect certain kinds of tampering, but it cannot detect someone hijacking the whole line.
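For contrast, here is a toy Python sketch of RSA authentication with absurdly small numbers (an illustration of the idea only, not how TLS actually formats anything): the server proves it holds the private key by signing a challenge, and anyone holding the public key can check it.

```python
# Toy RSA sign/verify with tiny numbers, to show what "authentication" buys you.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (modular inverse of e)

challenge = 1234                     # stand-in for a hashed handshake message
signature = pow(challenge, d, n)     # only the private-key holder can produce this
assert pow(signature, e, n) == challenge  # anyone can verify it with the public key
print("signature verified")
```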

I don't know how respectable professors like Lloyd can shamelessly over-hype their field like this. They are charlatans. Even Aaronson is not this bad.

A better popular explanation can be found in this Wired article on the double slit experiment:
To summarize, we’ve arrived at a pattern of fringes that’s built up one particle at a time. But when you try to work out exactly how those particles got there, you find that they didn’t take the left route, they didn’t take the right route, they didn’t take both routes, and they didn’t take neither route. As MIT professor Allan Adams puts it, that pretty much exhausts all the logical possibilities!

An electron is not like a wave of water, because unlike a wave, it hits a screen at a single location. An electron is not like a baseball, because when you send a bunch of them through a double slit, they interfere and create patterns of fringes. There is no satisfying analogy that anyone can give you for what an electron is.
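You can see that fringe build-up in a few lines of numpy (an idealized toy model of my own, ignoring the single-slit envelope): each simulated particle lands at one point, yet the fringes appear once many hits accumulate.

```python
# Minimal sketch of the double-slit picture: each particle lands at ONE point,
# but the landing positions are drawn from an interference pattern, so fringes
# emerge only after many hits.  (Idealized units; slit envelope ignored.)
import numpy as np

x = np.linspace(-5, 5, 1000)            # position on the screen
intensity = np.cos(np.pi * x) ** 2      # two-slit interference pattern
prob = intensity / intensity.sum()

hits = np.random.choice(x, size=5000, p=prob)   # 5000 single-particle detections
counts, edges = np.histogram(hits, bins=50)
print(counts)   # the histogram shows the fringes built up one particle at a time
```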
Some people prefer to think of the electron as being two places at once, or being a wave, or even as being immune to the usual laws of logic. If one of those makes you feel better, then go ahead. But there is no perfect classical analogy of the sort sought by Einstein and Bell. Those who use these analogies to promote quantum crypto or computing are just charlatans.

Update: I mentioned above a point that Aaronson likes to make on his blog, and he just made it again:
Several readers have expressed disapproval and befuddlement over the proposed title of my next book, “Speaking Truth to Parallelism.” ...

What it means, of course, is fighting a certain naïve, long-ago-debunked view of quantum computers—namely, that they would achieve exponential speedups by simply “trying every possible answer in parallel”—that’s become so entrenched in the minds of many journalists, laypeople, and even scientists from other fields that it feels like nothing you say can possibly dislodge it. The words out of your mouth will literally be ignored, misheard, or even contorted to the opposite of what they mean, if that’s what it takes to preserve the listener’s misconception about quantum computers being able to solve NP-hard optimization problems by sheer magic. ...

Coincidentally, this week I also got an email from a longtime reader of this blog, saying that he read and loved Quantum Computing Since Democritus, and wanted my feedback on a popular article he’d written about quantum computing.  What was the gist of the article?  You guessed it: “quantum computing = generic exponential speedups for optimization, machine learning, and Big Data problems, by trying all the possible answers at once.”

These people’s enthusiasm for quantum computing tends to be so genuine, so sincere, that I find myself unable to blame them—even when they’ve done the equivalent of going up to Richard Dawkins and thanking him for having taught them that evolution works for the good of the entire species, just as its wise Designer intended.  I do blame the media and other careless or unscrupulous parties for misleading people about quantum computing, but most of all I blame myself, for not making my explanations clear enough.  In the end, then, meeting the “NP space” folks only makes me want to redouble my efforts to Speak Truth to Parallelism: eventually, I feel, the nerd world will get this point.
The reasoning is this: (1) quantum mechanics proves that an electron can be in two places at once, as a cat can be in a superposition of dead and alive; (2) a qubit is a computer 0-1 bit that stores a superposition of the two possible states; and (3) a quantum computer processes qubits like a regular computer processes bits, thereby computing all possible data values at once.
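Step (3) is where the sleight of hand happens, and a few lines of numpy make the point (again a toy sketch of my own, with an arbitrary function f): even if f were somehow evaluated on an equal superposition of all inputs, a measurement hands you one random input-output pair, which is no better than evaluating f on a randomly chosen input classically.

```python
# Why "trying all the answers at once" fails: measuring an equal superposition of
# all n-bit inputs yields just ONE random (x, f(x)) pair, not all of them.
import numpy as np

n = 4
inputs = np.arange(2 ** n)
f = lambda x: (x * x + 1) % 7            # some arbitrary function

amplitudes = np.ones(2 ** n) / np.sqrt(2 ** n)   # equal superposition over all inputs
probs = np.abs(amplitudes) ** 2

x = np.random.choice(inputs, p=probs)    # measurement picks a single branch
print(x, f(x))                           # one input-output pair, not all 16
```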

I am glad to see that Aaronson is writing a whole book on why this reasoning is wrong, but I fear that he will not get to the heart of the matter, because of his faith in quantum computing. The root of the problem is that it is just not true that electrons can be in two places at once, or have negative probability.

His blog readers do not like his new book title, and one suggests: “Everything You Believe About Quantum Computing is a Lie”. I can agree with that title. So he will not use it. He wants to hang on to the lie that quantum computers are almost certainly possible in the future. (I think we agree that they are not possible with today's technology.)

2 comments:

  1. Roger,
    In relation to the double slit experiment: there are other interpretations of the data that do not depend upon contradictory linguistics. Yes there is a pattern. The foolishness is the assumption that the electrons were the sole cause of the interference. For grief's sake, play with the charge of the surface with the slit in it. The thing is practically if not literally an electron gun. Think CRT. Think of how that electron gun somehow 'magically' manages to draw a picture. Play with the charge across your surface with the double slit and watch the damn diffraction light show. I'm sure glad someone else thought of that or DOH!!!! If an electron beam can be played with so easily that it can be manipulated into drawing an image on phosphor-covered glass, what could an unintentional background charge across a plate do? Has anyone who performed this double slit experiment ever watched interference patterns and waves on an old TV screen? They were actually kind of fun, and as easy to generate as mom turning on the vacuum cleaner.


  2. And when will Intel/Microsoft/etc... introduce encrypted RAM to stop memory hackers from obtaining personal information from the computers of Target/Chase/etc..., which would also stop the game hackers from ruining multiplayer games that various foreign intelligence agencies play all day to destroy our children's morale?
