Thursday, September 1, 2016

Quantum Hanky-Panky

MIT professor Seth Lloyd gives this argument for quantum computers:
My motto about this is that if a little quantum hanky-panky will allow you to reproduce a little faster, then by God, you're going to use quantum hanky-panky. It turns out that plants and bacteria and algae have been using quantum hanky-panky for a billion years to make their lives better.

Indeed, what's happening in quantum information and quantum computing is it's become more and more clear that quantum information is a universal language for how nature behaves. ...

There is an old idea, originally due to Richard Feynman, that you could use quantum computers to simulate other quantum systems — a quantum analog computer. Twenty years ago, I wrote the first algorithms for how you could program the quantum computers we have now to explore how their quantum systems behave. ...

Thinking about the future of quantum computing, I have no idea if we're going to have a quantum computer in every smart phone, or if we're going to have quantum apps or quapps, that would allow us to communicate securely and find funky stuff using our quantum computers; that's a tall order. It's very likely that we're going to have quantum microprocessors in our computers and smart phones that are performing specific tasks.

This is simply for the reason that this is where the actual technology inside our devices is heading anyway. If there are advantages to be had from quantum mechanics, then we'll take advantage of them, just in the same way that energy is moving around in a quantum mechanical kind of way in photosynthesis. If there are advantages to be had from some quantum hanky-panky, then quantum hanky-panky it is.
The fallacy here is that our personal computers are already using some quantum hanky-panky. You need quantum mechanics to understand CPU chips, memory chips, disc drives, and most of the other electronics. They use quantum hanky-panky as much as plants do in photosynthesis.

Yes, Feynman was right that you could use a quantum machine to simulate a quantum world, for the simple reason that the world simulates itself, while simulating it on a digital computer is computationally inefficient.

The interesting question is whether a quantum computer can do a super-Turing digital computation. I doubt it.

Lloyd recites how many brilliant physicists have worked on this question for 30 years, and now thousands are working on it. He compares progress to the early days of digital computers.

I don't buy it. Within about 5 years of building the earliest digital computers, they were simulating the weather and nuclear bomb explosions. The quantum computer folks are still working on connecting one qubit with another. The fact that so many smart people are working on it, with such meager results, only suggests that it may not be possible.

All of the smart high-energy physicists were believers in SUSY (supersymmetry) for 30 years also, and some of them are just now paying off bets after the LHC has failed to find any SUSY particles. So yes, when all the experts say that something is theoretically possible, they may all be wrong.

Lloyd explains:
Quantum computers work by storing and processing information at the most microscopic level. For instance, if you have an electron, you could call an electron spinning out like this zero, you could call an electron spinning like that one, and you could call an electron spinning like that zero and one at the same time, which is the primary feature of quantum computation. A quantum bit—qubit—can be zero and one at the same time; this is where quantum computers gain their power over classical computers.
This is like saying Schroedinger's cat is alive and dead at the same time. If we could do that, maybe we could have a horse plow a field and pull a carriage at the same time, and thus do twice as much work.

That is impossible, because it violates energy principles. But Lloyd wants the qubit to do two computations at the same time, and thus twice as much computational work on each low-level operation. String those together, and you get exponentially as much work.
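To make the numbers concrete, here is a minimal sketch (my illustration, not Lloyd's) of how a qubit is described mathematically, and why the claimed gain is exponential: a qubit is just a pair of complex amplitudes, and describing n qubits classically takes 2^n amplitudes, not 2n.

```python
from math import sqrt

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# "Zero and one at the same time" just means both amplitudes are nonzero;
# a measurement yields 0 with probability |a|^2 and 1 with probability |b|^2.
a, b = 1 / sqrt(2), 1 / sqrt(2)
print(round(abs(a) ** 2, 3), round(abs(b) ** 2, 3))  # 0.5 0.5

# The catch: the joint state of n qubits takes 2**n amplitudes to write down.
for n in (1, 2, 10, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```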

That would violate computational principles, but the quantum computer folks do not believe them.

In fairness, Scott Aaronson would say that I am taking advantage of Lloyd's oversimplified explanation, and that the qubit does not really do two things at the same time. It just does some quantum hanky-panky with negative probabilities that is too subtle for fools like me to understand, and that is enuf for a big computational gain.

The quantum computers are now getting a lot bigger. Now, we have tens of bits, soon we'll have 50 bits, then we'll have 500 bits, because now there's a clear path to how you build a larger scale quantum computer.
If they had 50 qubits, they would have enuf for a super-Turing computation. The first physicist to demonstrate that would be a likely candidate for a Nobel prize. Based on the current hype, it should happen in the next year or so.
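For what it is worth, 50 qubits is roughly where brute-force classical simulation runs out of memory, which is where that number comes from. A back-of-the-envelope calculation (my arithmetic, assuming 16 bytes per amplitude for complex double precision):

```python
# Simulating n qubits classically takes 2**n complex amplitudes.
# At 16 bytes per amplitude (complex double), 50 qubits needs ~16 pebibytes,
# far beyond any single classical machine.
n = 50
amplitudes = 2 ** n
bytes_needed = amplitudes * 16
print(f"{amplitudes:,} amplitudes")
print(f"{bytes_needed / 2 ** 50:.0f} PiB of memory")
```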

So if it happens, you will hear about it. Then you can leave a comment here telling me that I am wrong. Otherwise, you should start to wonder if maybe Lloyd and the thousands of others might be wrong.



  2. Thanks. I will post some comments on that quantum computer hype.

    1. I'm very glad they are trying hard to build them, because their failure will demonstrate to everyone that nature doesn't work by magic.

    2. Really? Take a look!

      I don't think QCs will work because of some mythical cancellation or simultaneity. However, remember how stupid mathematicians can and have been:

      The major questions in geometry were how to square a circle, trisect an angle, inscribe regular polygons and do inverse trig. They can ALL be done with Archimedes' Spiral, constructed only with a piece of string unwrapped from a circle. Gauss and others somehow thought a piece of string was not a basic instrument of construction, although people have used strings to make ellipses since antiquity.

      We have the moronic theorems of Cantor, Godel & Turing about impossibility, but they are nothing of the sort. Asking a system you don't know is consistent whether it's consistent is moronically stupid. It's obvious without a proof that logical explosion could have it tell you it is. It in no way says that a formal system could not produce all arithmetical truths. Transfinite numbers, "uncomputable" numbers and completed infinities are pure nonsense involving the impredicativity mentioned by Poincare and more recently Solomon Feferman. Not a single real number has been truly demonstrated. Finite math rids us of ALL, and I mean ALL, the paradoxes of analysis and set theory.

      P vs. NP is also hardly obvious when you see that the question is poorly posed. "Finite" length programs with pre-computation (a finite set of hard cases) and constant-time jumps (my additional observation about the poor modeling by clumsy Turing machines) even make Don Knuth wonder.

      People like Seth Lloyd cannot even understand the meaning of Maxwell's Demon and thermodynamics. No absolute proofs have been given to refute reversibility. Read Maxwell's letters and you will understand that he was saying that such a demon could do no work and was not a physical thing but was only entertained to explain a circular reasoning of statistics.

      Some things do seem impossible in physics but some of the most prominent examples are not demonstrated by the "skeptic" community. Anyone with a fifth-grade education (say a degree from libtard UCLA) can understand that energy requirements are too great for deep-space travel, even before considering general relativity. The non-local nature of faster-than-light propagation also seems unlikely when considering how people can't even understand Gell-Mann's invocation of Bertlmann's socks.

      However, autistics never see the spirit side of life. They'll never truly understand this universe and I wouldn't be too proud of skepticism as a movement. It has led to a lot of bloodbaths.

    3. Let me not forget quintic polynomials. Here is my little bisection script: All kinds of things are possible when previous times thought they weren't. Numerical methods, optimization algos, non-parametric statistics, perturbation theory (asymptotics), etc... solve just about everything to arbitrary accuracy. Even chiral fermions are being simulated with lattice gauge theory. Fairly simple 4-D gauge theory can basically unify physics. My point is that we don't need a supercomputer anyway.
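      [The bisection script referred to above is not shown; a minimal reconstruction (mine, not the commenter's original) applied to a quintic would look something like:]

```python
def bisect(f, lo, hi, tol=1e-12):
    """Find a root of f in [lo, hi], given f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# x**5 - x - 1 = 0 has no solution in radicals (its Galois group is S5),
# but bisection pins down its real root to any desired accuracy.
root = bisect(lambda x: x ** 5 - x - 1, 1.0, 2.0)
print(round(root, 6))  # 1.167304
```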

  3. You think that Google will admit failure? I doubt it.

    1. Good point. But they could always spin it as a scientific discovery that QM is false.

    2. It's not QM that's false, it's the notion that "entanglement" is a real physical property.

    3. The experts in QC are convinced that QM is false iff large scale QC is impossible.

  4. I'm all for PRIVATE investors wasting their money. When the government gets involved however... there goes the economy. Science is not a product of committees, politics and bureaucracy.

  5. Roger, my condolences on your mom's passing!

    Dave R

  6. Roger, my deepest sympathies to you and all of your family today.


  7. This comment has been removed by the author.

  8. "Researchers at the University of Rochester have moved beyond the theoretical in demonstrating that an unbreakable encrypted message can be sent with a key that's far shorter than the message—the first time that has ever been done."