Modern computers are not unlike the looms of the industrial revolution: They follow programmed instructions to weave intricate patterns. With a loom, you see the result in a cloth or carpet. With a computer, you see it on an electronic display.

If they are right, that is. Too bad they are not right.
Now a group of physicists and computer scientists who are funded by Microsoft are trying to take the analogy of interwoven threads to what some believe will be the next great leap in computing, so-called quantum computing.
If they are right, their research could lead to the design of computers that are far more powerful than today's supercomputers and could solve problems in fields as diverse as chemistry, materials science, artificial intelligence and code-breaking.
The proposed Microsoft computer is mind-bending even by the standards of the mostly hypothetical world of quantum computing.

No progress. No speedup. All based on hypothetical ideas that have never been observed.
Conventional computing is based on a bit that can be either a 1 or a 0, representing a single value in a computation. But quantum computing is based on qubits, which simultaneously represent both zero and one values. If they are placed in an “entangled” state — physically separated but acting as though they are connected — with many other qubits, they can represent a vast number of values simultaneously.
And the existing limitations of computing power are thrown out the window.
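The exponential blowup behind that claim is easy to see from the classical side: simply writing down the state of n qubits takes 2**n complex amplitudes. A minimal sketch (a plain state vector in pure Python, no quantum library assumed):

```python
import math

def uniform_superposition(n):
    """Classical description of n qubits in an equal superposition:
    one complex amplitude per basis state, 2**n of them in total."""
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = uniform_superposition(10)
print(len(state))  # 10 qubits already need 1024 amplitudes
# The probabilities of all basis states sum to 1:
print(round(sum(abs(a) ** 2 for a in state), 10))
```

Every added qubit doubles the storage a classical simulation needs, which is why even a few hundred entangled qubits could not be simulated directly on any conventional machine.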
In the approach that Microsoft is pursuing, which is described as “topological quantum computing,” precisely controlling the motions of pairs of subatomic particles as they wind around one another would manipulate entangled quantum bits. Although the process of braiding particles takes place at subatomic scales, it is evocative of the motions of a weaver overlapping threads to create a pattern.
By weaving the particles around one another, topological quantum computers would generate imaginary threads whose knots and twists would create a powerful computing system. Most important, the mathematics of their motions would correct errors that have so far proved to be the most daunting challenge facing quantum computer designers. ...
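The "knots and twists" have a concrete mathematical counterpart. As a toy illustration (using the standard two-by-two braid matrices quoted in the literature for so-called Fibonacci anyons, not anything specific to Microsoft's design), braiding neighboring particles applies fixed unitary matrices, and the weaving consistency rule — braiding strands 1-2-1 must equal braiding 2-1-2 — can be checked numerically:

```python
import cmath
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def mat_mul(a, b):
    """Multiply two 2x2 complex matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# R-matrix: exchanging two neighboring Fibonacci anyons multiplies the two
# fusion channels by fixed phases (standard values from the literature).
R = [[cmath.exp(-4j * math.pi / 5), 0],
     [0, cmath.exp(3j * math.pi / 5)]]

# F-matrix: change of fusion basis; for Fibonacci anyons it is its own inverse.
F = [[1 / PHI, 1 / math.sqrt(PHI)],
     [1 / math.sqrt(PHI), -1 / PHI]]

sigma1 = R                          # braid strands 1 and 2
sigma2 = mat_mul(mat_mul(F, R), F)  # braid strands 2 and 3: change basis, braid, change back

# The braid relation: weaving 1-2-1 and 2-1-2 produce the same operation.
lhs = mat_mul(mat_mul(sigma1, sigma2), sigma1)
rhs = mat_mul(mat_mul(sigma2, sigma1), sigma2)
print(max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2)) < 1e-9)
```

Because the computation depends only on the topology of the braid, not on the exact particle trajectories, small physical perturbations leave the result unchanged — which is the error-correction argument the article alludes to.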
On Thursday, however, an independent group of scientists reported in the journal Science that they had so far found no evidence of the kind of speedup that is expected from a quantum computer in tests of a 503-qubit D-Wave computer. The company said through a spokesman that the kinds of problems the scientists evaluated would not benefit from the D-Wave design.
Microsoft’s topological approach is generally perceived by scientists as the riskiest, because the type of exotic anyon particle needed to generate qubits has not been definitively proved to exist.
How many years can the press write these articles without any concrete results to point to?
For some time, many thought quantum computers were useful only for factoring huge numbers — good for N.S.A. code breakers but few others. But new algorithms for quantum machines have begun to emerge in areas as varied as searching large amounts of data or modeling drugs. Now many scientists believe that quantum computers could tackle new kinds of problems that have yet to be defined.

Freedman was joking. He is a genius in a different field, but this is not going to happen.
Indeed, when Mr. Mundie asked Dr. Freedman what he might do with a working quantum computer, he responded that the first thing he would program it to do would be to model an improved version of itself.
A lot of smart people are betting on quantum computing, and you are not going to take my word over theirs. But this is not like string theory, where smart people can just go on forever with untestable bogus theories. The quantum computer folks are claiming to build a machine that can outperform conventional computers. These claims have been going on for 20 years or so, and are likely to go on for another 20 years without any such machine showing a demonstrable advantage. How long are you going to believe the hype without results?
I say it will not happen, for reasons explained on this blog.