My motto about this is that if a little quantum hanky-panky will allow you to reproduce a little faster, then by God, you're going to use quantum hanky-panky. It turns out that plants and bacteria and algae have been using quantum hanky-panky for a billion years to make their lives better.

The fallacy here is that our personal computers are already using some quantum hanky-panky. You need quantum mechanics to understand CPU chips, memory chips, disc drives, and most of the other electronics. They use quantum hanky-panky as much as plants do in photosynthesis.
Indeed, what's happening in quantum information and quantum computing is it's become more and more clear that quantum information is a universal language for how nature behaves. ...
There is an old idea, originally due to Richard Feynman, that you could use quantum computers to simulate other quantum systems — a quantum analog computer. Twenty years ago, I wrote the first algorithms for how you could program the quantum computers we have now to explore how other quantum systems behave. ...
Thinking about the future of quantum computing, I have no idea if we're going to have a quantum computer in every smart phone, or if we're going to have quantum apps, or "quapps," that would allow us to communicate securely and find funky stuff using our quantum computers; that's a tall order. It's very likely that we're going to have quantum microprocessors in our computers and smart phones that are performing specific tasks.
This is simply for the reason that this is where the actual technology inside our devices is heading anyway. If there are advantages to be had from quantum mechanics, then we'll take advantage of them, just in the same way that energy is moving around in a quantum mechanical kind of way in photosynthesis. If there are advantages to be had from some quantum hanky-panky, then quantum hanky-panky it is.
Yes, Feynman was right that you could use a quantum machine to simulate a quantum world, for the simple reason that the world simulates itself, and that simulating it on a digital computer is computationally inefficient.
The interesting question is whether a quantum computer can do a super-Turing digital computation. I doubt it.
Lloyd recites how many brilliant physicists have worked on this question for 30 years, and now thousands are working on it. He compares progress to the early days of digital computers.
I don't buy it. Within about five years of building the earliest digital computers, they were simulating the weather and nuclear bomb explosions. The quantum computer folks are still working on connecting one qubit with another. The fact that so many smart people are working on it with such meager results only shows that it may not be possible.
All of the smart high-energy physicists were believers in SUSY (supersymmetry) for 30 years also, and some of them are only now paying off bets, since the LHC has failed to find any SUSY particles. So yes, when all the experts say that something is theoretically possible, they may all be wrong.
Quantum computers work by storing and processing information at the most microscopic level. For instance, if you have an electron, you could call an electron spinning like this zero, you could call an electron spinning like that one, and you could call an electron spinning like that zero and one at the same time, which is the primary feature of quantum computation. A quantum bit, or qubit, can be zero and one at the same time; this is where quantum computers gain their power over classical computers.

This is like saying Schroedinger's cat is alive and dead at the same time. If we could do that, maybe we could have a horse plow a field and pull a carriage at the same time, and thus do twice as much work.
That is impossible, because it violates energy principles. But Lloyd wants the qubit to do two computations at the same time, and thus twice as much computational work on each low-level operation. String those together, and you get exponentially more work.
That would violate computational principles, but the quantum computer folks do not believe them.
In fairness, Scott Aaronson would say that I am taking advantage of Lloyd's oversimplified explanation, and that the qubit does not really do two things at the same time. It just does some quantum hanky-panky with negative probabilities that is too subtle for fools like me to understand, and that is enough for a big computational gain.
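To be concrete about what that subtlety amounts to: a qubit is just a pair of complex amplitudes, and amplitudes (unlike probabilities) can be negative and cancel. This little Python sketch, my own illustration and not anything from Lloyd, shows a Hadamard gate creating a superposition and a second Hadamard interfering the amplitudes back to zero:

```python
import math

# A qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measuring yields 0 with probability |a|^2 and 1 with probability |b|^2.

def hadamard(state):
    """Apply the Hadamard gate; note the minus sign that enables cancellation."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)        # the state |0>

plus = hadamard(zero)    # equal superposition: each outcome has probability 1/2
print(plus)

back = hadamard(plus)    # the minus sign cancels the |1> amplitude:
print(back)              # back to |0> (up to floating-point rounding)
```

The cancellation in the last step is the interference that any genuine quantum speedup must exploit; it is not the same as "doing two computations at once."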
The quantum computers are now getting a lot bigger. Now, we have tens of bits, soon we'll have 50 bits, then we'll have 500 bits, because now there's a clear path to how you build a larger scale quantum computer.

If they had 50 qubits, they would have enough for a super-Turing computation. The first physicist to demonstrate that would be a likely candidate for a Nobel prize. Based on the current hype, it should happen in the next year or so.
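The 50-qubit figure is roughly where brute-force classical simulation becomes impractical, since the state vector doubles with every added qubit. A back-of-the-envelope calculation in Python (my own arithmetic, not Lloyd's):

```python
# Storing the full state of n qubits classically takes 2**n complex amplitudes.
# At 16 bytes per amplitude (two 64-bit floats), memory doubles with each qubit.

def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n):,} bytes")
# 10 qubits fit in kilobytes, 30 qubits need ~16 GiB,
# and 50 qubits need ~18 petabytes -- beyond any classical machine.
```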
So if it happens, you will hear about it. Then you can leave a comment here telling me that I am wrong. Otherwise, you should start to wonder if maybe Lloyd and the thousands of others might be wrong.