Quantum mechanics allows us to calculate that rate, so that we can put the theory of decoherence to the test. Serge Haroche and colleagues at the École Normale Supérieure in Paris first did that in 1996 by measuring decoherence of an atom held in a device called a ‘light trap’ and interacting with photons. The loss of interference between states of the atom owing to decoherence, as calculated from quantum theory, matched the experimental observations perfectly. And in 2003 a team at the University of Vienna led by Anton Zeilinger and Markus Arndt watched interference vanish between the quantum waves of large molecules, as they altered the rate of decoherence by gradually admitting a background gas into the chamber where the interference took place, so that the gas molecules would collide with those in the matter waves. Again, theory and experiment tallied well.

Decoherence is a phenomenally efficient process, probably the most efficient one known to science. For a dust grain 100th of a millimetre across floating in air, it takes about 10⁻³¹ seconds: a million times faster than the passage of a photon of light across a single proton! Even in the near-isolation of interstellar space, the ubiquitous photons of the cosmic microwave background – the Big Bang’s afterglow – will decohere such a grain in about one second.

So, for objects approaching the macroscopic scale under ordinary conditions, decoherence is, to all practical purposes, inevitable and instantaneous: you can’t keep them looking ‘quantum’. It’s almost as if the laws of quantum physics that make the world are contrived to hide those very laws from anything much bigger than atom-sized, tricking us into thinking that things just have to be the way we experience them.

LuMo quibbles about this, and explains:
Decoherence is an effective process – perhaps a gedanken process – which is irreversible and erases the information about the relative complex phases. You start with an observed physical system, like a cat. Decoherence will ultimately pick "dead" and "alive" as the preferred basis.

What they don't explain is that decoherence is what destroys quantum computers.
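LuMo's point – that decoherence erases the relative complex phases while leaving the "dead"/"alive" probabilities intact – can be sketched with a toy 2×2 density matrix. The exponential damping and the decay time below are illustrative assumptions, not derived physical values:

```python
import cmath, math

# Toy density matrix for a "cat" superposition (|dead> + |alive>)/sqrt(2).
# Diagonal entries are the probabilities; the off-diagonal entry carries
# the relative complex phase, which decoherence damps away.
def rho(t, tau=1.0):
    # Assumed exponential damping of coherence with timescale tau.
    coherence = 0.5 * cmath.exp(1j * 0.3) * math.exp(-t / tau)
    return [[0.5, coherence],
            [coherence.conjugate(), 0.5]]

for t in [0.0, 1.0, 10.0]:
    r = rho(t)
    print(f"t={t:>4}: P(dead)={r[0][0]:.2f}, |coherence|={abs(r[0][1]):.3f}")
```

The probabilities never change; only the off-diagonal phase information dies off, which is why "dead" and "alive" end up as the preferred basis.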
When you hear about some hypothetical quantum computer doing some fantasy computation like factoring a 200-digit integer, it has to do it all within that decoherence time. But as Ball says, decoherence is nearly instantaneous.
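The budget can be made concrete with back-of-envelope arithmetic. The gate time and coherence time below are illustrative assumptions, not measured values; the point is that the number of sequential gate operations is capped by the ratio of the two:

```python
# Illustrative numbers only (assumed, not from any real device):
coherence_time = 100e-6   # 100 microseconds of coherence, assumed
gate_time = 50e-9         # 50 nanoseconds per gate, assumed

# Every gate of the computation must fit inside the coherence window.
max_sequential_gates = round(coherence_time / gate_time)
print(f"gates before decoherence: ~{max_sequential_gates}")

# At Ball's dust-grain timescale of ~1e-31 s, the budget is zero gates.
print(round(1e-31 / gate_time))
```
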
When Ball says "Decoherence is a phenomenally efficient process", he means that it is efficient at destroying any possibility of super-Turing computation.
It is almost as if the laws of physics are contrived to prevent us from doing quantum supremacy computations.
Decoherence is a fancy word for why quantum weirdness does not show up at a macroscopic level. I am in a minority, but I say that it is also why quantum weirdness does not enable super-Turing computation.