Scott Aaronson writes: "Several people have asked me to comment about a Financial Times opinion piece entitled The Quantum Computing Bubble (subtitle: “The industry has yet to demonstrate any real utility, despite the fanfare, billions of VC dollars and three Spacs”). The piece is purely deflationary — not a positive word in it — though it never goes so far as to suggest that QC is blocked by any Gil-Kalai-like fundamental principle, nor does it even evince curiosity about that question." The FT article itself is paywalled, so I have not read it.
Aaronson always likes to draw a distinction between saying something is impossible, and saying something is impossible because it is blocked by a fundamental principle.
For example, saying that perpetual motion machines are impossible is not as satisfying as saying perpetual motion machines are impossible because the First Law of Thermodynamics says energy is conserved.
Okay, maybe, but it depends on how convincing the principle is.
Aaronson concedes that the article is right that it is not yet known whether it is "possible to build a large-scale, fault-tolerant quantum computer."
He adds: "As for applications, my position has always been that if there were zero applications, it would still be at least as scientifically important to try to build QCs as it was to build the LHC, LIGO, or the James Webb telescope. If there are real applications, such as simulating chemical dynamics, or certifiable randomness — and there very well might be — then those are icing on the cake." That is because he likes to study quantum complexity theory. But the practical applications may well be negative.
It is possible that the biggest practical application of QC will be to destroy the security of the internet communications that everyone uses every day.
We should build a quantum computer because even if it doesn't do anything useful... or even work, it will provide overpriced employment for experts who have nothing more useful to do. Yay science.
There is nothing more ridiculous than trying to design a machine that has no specific applications. This isn't how engineering works.
Roger, Scott was never challenged on the thermodynamic limits of QCs.
ReplyDelete"All quantum computers are reversible computers and, as such, are constrained thermodynamically; the operating speed of a physically realisable reversible computer scales linearly with the amount of heat or entropy it generates (i.e., the more reversible a computer is, the slower it operates). Reversible computers require a small external force with each step to drive them forward. The speed of the step scales linearly with the applied force, which also scales linearly with the energy dissipated with each step. Thus the speed of operation scales linearly with the entropy released. For example the entropy released during adiabatic switching (an example of an implementation of reversible computing) scales linearly with the speed of operation: storing a bit dissipates CV2 *RC/t, of energy where t is the ramp time over which the voltage rises linearly from base. (“Instantaneous” storage releases CV2/2.) Brownian Turing machines also require a small driving force with each step; again their forward velocity scales with the energy dissipated. For a general quantum computer the Heisenberg uncertainty time-energy principle implies a bound of dE > h/dt, with each step (dE = energy released per step, dt = time for step to complete, h = Planck’s constant) and we would expect dE to be released as heat giving TdS = dE > h/(T dt), where T is the ambient temperature and dS is the entropy released per step.
Another constraint is that a quantum computer must avoid decoherence, which implies that the total amount of entropy released over an entire computation must be O(k), where k is Boltzmann's constant. If dS >> k, then the quantum computer decoheres into O(exp(dS/k)) independent microstates (by the Planck-Boltzmann law); any subsequent attempt by an external agency to read off the result of the computation will only be able to access a fraction O(exp(-dS/k)) of these microstates; garbage is read. For a computation that requires M steps, this means the average entropy release per step must be O(k/M) or less for the quantum computer to function.
Putting these two results together, we see that a quantum computer that completes processing in M steps takes at least hM²/kT, i.e., O(M²), time to complete. This is a completely general result that applies to all quantum computers. No amount of shielding from external decoherence (which is believed to be their main problem) will get around this other problem, which arises from internal decoherence. Lowering the temperature actually slows the computation down."
https://megasociety.org/noesis/179
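As a sanity check on the quoted bound, here is a short Python sketch. The chain of reasoning in the quote is: the entropy budget per step is dS ≤ k/M, so the heat per step is dE = T·dS ≤ kT/M, and Heisenberg's dt > h/dE then gives dt > hM/kT per step, hence a total time > hM²/kT. The step count M and the 20 mK operating temperature below are illustrative assumptions of mine, not figures from the linked essay.

# Back-of-the-envelope check of the quoted bound: t >= h*M^2/(k*T).
h = 6.626e-34   # Planck's constant, J*s
k = 1.381e-23   # Boltzmann's constant, J/K

def min_runtime_seconds(M, T):
    """Lower bound on total runtime for an M-step computation at temperature T (kelvin)."""
    return h * M**2 / (k * T)

# Illustrative numbers: a billion-step computation in a 20 mK dilution refrigerator.
M = 1e9
T = 0.02
t = min_runtime_seconds(M, T)
print(f"t >= {t:.2e} s, about {t / 3.15e7:.0f} years")

If the premises held, even a billion-step computation at dilution-fridge temperatures would need roughly 76 years.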
"It is shown that spontaneous symmetry breaking imposes a fundamental limit to the time that a system can stay quantum coherent...This universal timescale turns out to be t_spon ≃ 2πNħ/(k_B*T)."
https://arxiv.org/abs/cond-mat/0408357
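Plugging numbers into that formula gives a feel for the scale; the values of N and T below are my own illustrative choices, not values from the paper.

# Evaluate the quoted coherence timescale t_spon ~ 2*pi*N*hbar/(k_B*T).
import math

hbar = 1.055e-34  # reduced Planck constant, J*s
k_B = 1.381e-23   # Boltzmann's constant, J/K

def t_spon(N, T):
    """Coherence-time limit from the quoted formula, for N degrees of freedom at temperature T (kelvin)."""
    return 2 * math.pi * N * hbar / (k_B * T)

T = 0.02  # kelvin, a typical dilution-refrigerator temperature
for N in (1e3, 1e6, 1e9):
    print(f"N = {N:.0e}: t_spon ~ {t_spon(N, T):.2e} s")

At 20 mK this works out to about 2.4 microseconds for N = 10³, 2.4 milliseconds for N = 10⁶, and 2.4 seconds for N = 10⁹.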
"one might wonder what help it could be to know that, in principle, a computer can be operated by spending zero energy but, in practice, this is obtained only under the condition that all the operations are performed adiabatically, i.e., extremely slowly."
https://www.mdpi.com/1099-4300/21/9/822/htm
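The same speed-for-dissipation tradeoff can be seen numerically using the CV²·RC/t expression quoted earlier; the capacitance, voltage, and resistance below are generic CMOS-scale values I chose for illustration, not figures from either source.

# Dissipation vs. ramp time for adiabatic switching, per the quoted CV^2*RC/t formula.
C = 1e-15   # node capacitance, farads
V = 1.0     # swing voltage, volts
R = 1e3     # channel resistance, ohms

E_abrupt = 0.5 * C * V**2  # "instantaneous" switching releases CV^2/2

for t_ramp in (1e-9, 1e-6, 1e-3):
    E_adiabatic = C * V**2 * (R * C) / t_ramp  # valid for t_ramp >> RC
    print(f"ramp {t_ramp:.0e} s: dissipates {E_adiabatic:.1e} J (abrupt: {E_abrupt:.1e} J)")

Every factor of 1000 in slowdown buys a factor of 1000 less dissipation, which is the quoted point: you approach zero energy only by operating "extremely slowly."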