Friday, January 31, 2014

Multiverse tries to escape the supernatural

The WSJ published this letter:
Multiverse Doesn't Help Explain Origins
Kurt Gödel's "Incompleteness Theorem" basically proves mathematically that the answer to the origin of anything, even the physical universe, always lies outside of the thing itself.

Peter Woit's review of Max Tegmark's "Our Mathematical Universe" (Books, Jan. 18) emphasizes the role of math in physics but leaves out the work of 1930s mathematician Kurt Gödel and his "Incompleteness Theorem." This theorem basically proves mathematically that the answer to the origin of anything, even the physical universe, always lies outside of the thing itself. Therefore the origin of this universe that is incredibly fine-tuned in over 100 parameters, must have a supernatural agent outside of itself. A natural agent would have to be included in the encircled physical universe.

The multiverse concept basically says that there are an infinite number of universes out there, and we just happen to live in the one and only one where all the dials were randomly set "just right." In a nutshell, perhaps many are simply trying to intellectually escape from accountability to this supernatural agent.
James Kraft
Green Bay, Wis.
Gödel's Incompleteness Theorem says nothing of the kind. Tegmark's answer is that Gödel's theorems rely on infinities, and the physical universe has no infinities. Yet Tegmark also says that there are an infinite number of universes in the multiverse.

The universe is fine-tuned, but I am not sure why that implies a supernatural agent. It has been known for centuries that the Earth is fine-tuned for life.

Tuesday, January 28, 2014

Dyson against wave-function collapse

Famous mathematical physicist Freeman Dyson answers the Edge 2014 question, WHAT SCIENTIFIC IDEA IS READY FOR RETIREMENT?:
The Collapse Of The Wave-Function

Fourscore and seven years ago, Erwin Schrödinger invented wave-functions as a way to describe the behavior of atoms and other small objects. According to the rules of quantum mechanics, the motions of objects are unpredictable. The wave-function tells us only the probabilities of the possible motions. When an object is observed, the observer sees where it is, and the uncertainty of the motion disappears. Knowledge removes uncertainty. There is no mystery here.

Unfortunately, people writing about quantum mechanics often use the phrase "collapse of the wave-function" to describe what happens when an object is observed. This phrase gives a misleading idea that the wave-function itself is a physical object. A physical object can collapse when it bumps into an obstacle. But a wave-function cannot be a physical object. A wave-function is a description of a probability, and a probability is a statement of ignorance. Ignorance is not a physical object, and neither is a wave-function. When new knowledge displaces ignorance, the wave-function does not collapse; it merely becomes irrelevant.
Dyson was a genius, but this is nonsense. Observing an electron does not just remove uncertainty; it alters the electron.

The wave-function may not be a physical object, but it still collapses. Dyson says that it does not collapse, but becomes irrelevant and is replaced with a new wave-function. That is what collapse means -- the old wave-function is projected to a subspace based on the observation.
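
To spell out what that projection is, here is the standard textbook rule (the von Neumann projection postulate), in notation that is mine, not Dyson's. If the observable has the spectral decomposition A = \sum_k a_k P_k, then observing the outcome a_k on a state |\psi\rangle updates it as
\[
|\psi\rangle \;\longrightarrow\; \frac{P_k|\psi\rangle}{\lVert P_k|\psi\rangle\rVert},
\qquad \text{with probability } \Pr(a_k) = \lVert P_k|\psi\rangle\rVert^2 .
\]
Call that a collapse or call it replacing an irrelevant wave-function with a new one; the mathematics is the same projection onto a subspace.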

I thought that he was going to advocate the many worlds interpretation (MWI), as its supporters are the main ones who argue against collapse of the wave-function. They argue that the collapsing part of the wave-function is really escaping to a parallel universe. The argument is based on a belief that wave-function uncertainty should be some sort of conserved quantity like energy, so they postulate a vast collection of unobservable alternate universes.

Dyson is botching up an explanation of conventional quantum mechanics. I guess that is better than advocating some completely unscientific multiverse idea.

David Deutsch likes the MWI, and hence dislikes wave-function collapse, but his answer objects to quantum jumps:
The term "quantum jump has entered everyday language as a metaphor for a large, discontinuous change. It has also become widespread in the vast but sadly repetitive landscape of pseudo-science and mysticism. ...

OK, maybe some physicists still subscribe to an exception to that, namely the so-called "collapse of the wave function" when an object is observed by a conscious observer. But that nonsense is not the nonsense I am referring to here. ...

Quantum jumps are an instance of what used to be called "action at a distance": something at one location having an effect, not mediated by anything physical, at another location. Newton called this "so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it". And the error has analogues in fields quite distant from classical and quantum physics. For example in political philosophy the "quantum jump" is called revolution, and the absurd error is that progress can be made by violently sweeping away existing political institutions and starting from scratch. In the philosophy of science it is Thomas Kuhn's idea that science proceeds via revolutions—i.e. victories of one faction over another, both of which are unable to alter their respective "paradigms" rationally. In biology the "quantum jump" is called saltation: the appearance of a new adaptation from one generation to the next, and the absurd error is called saltationism.
Deutsch is right about this, as expressed in my motto.

A new article explains that the reality of the wave function is a continuing debate:
It is not exaggerated to claim that one of the major divides in the foundations of non-relativistic quantum mechanics derives from the way physicists and philosophers understand the status of the wave function. On the instrumentalist side of the camp, the wave function is regarded as a mere instrument to calculate probabilities that have been established by previous measurement outcomes.1 On the other “realistic” camp, the wave function is regarded as a new physical entity or a physical field of some sort.
That's right, those are the two main views. Both are tenable, I guess, but you should be suspicious of anyone who makes strong claims based on the reality of the wave function, without recognizing the other view.

Monday, January 27, 2014

Hawking flips on black holes

Fox News reports:
Stephen Hawking now says there are no black holes, doing an about-face on the objects that helped cement his reputation as the world’s preeminent scientist, ...
Here is Hawking's paper, and the New Scientist story. Another account says:
The wheelchair-bound genius has posted a paper online that demolishes modern black hole theory. He says that the idea of an event horizon, from which light cannot escape, is flawed.

It is considered one of the pillars of physics that the incredible gravitational pull created by the collapse of a star will be so strong that nothing can break free...much of this is thanks to Hawking’s own work.

But Hawking smashes this idea by saying that rather than there being an inescapable event horizon, we should think of a far less total “apparent horizon”. And, at a stroke, he has contradicted Albert Einstein.

He sets out his argument in the paper, called Information Preservation and Weather Forecasting For Black Holes, which is likely to send his fellow scientists into a spin.

Hawking writes: “The absence of event horizons means that there are no black holes — in the sense of regimes from which light can't escape to infinity.”
Hawking is the world's most famous physicist, and his biggest accomplishments are in theoretical work supporting the existence of black holes and event horizons. And now he puts out some stupid 2-page paper saying that they don't exist?!

Physics has degenerated to the point where its biggest names give nonsense interviews about nonsense papers claiming to solve nonsense problems. There is no actual scientific evidence brought to bear at all.

Peter Woit criticized Max Tegmark's math multiverse, and found himself saying:
I am not now and never have been a creationist.
He blocks comments on this subject. He seems a bit sensitive to me, as no New York liberal SWPL intellectual wants to be associated with creationists. Not that anyone even accused him of having anything to do with creationists.

Intelligent Design does have something in common with Tegmark's mathematical universe hypothesis. Both say that the universe shows properties of a mathematical design. Both refuse to be limited by materialism. Both are criticized for lacking empirical evidence. Both make arguments like "I find it valuable when the community carefully explores the full range of logical possibilities."

Woit also complains about a new movie about the Earth being the center of the universe, and featuring physicists. Apparently they quote physicists about the anthropic principle, fine-tuning, etc. It may also have some people misinterpreting the Bible. I do think that a lot of physicists say crazy stuff, so they should not be too upset when a movie shows them saying crazy stuff. And yes, I expect more from physicists than I do from Bible scholars.

Update: The movie producer is defensive about the movie, and says it is about the Copernican principle. He refuses to say whether he advocates geocentrism. He quotes:
Einstein on Copernicus: “The struggle, so violent in the early days of science, between the views of Ptolemy and Copernicus would then be quite meaningless. Either CS [coordinate system] could be used with equal justification. The two sentences, ‘the sun is at rest and the earth moves’, or ‘the sun moves and the earth is at rest’, would simply mean two different conventions concerning two different CS [coordinate systems].”
That is correct, and at the time of Copernicus the scientific evidence was against him. Those who say that Copernicus was more scientific than Ptolemy are seriously mistaken.

Tegmark doubles down on his creationist analogy, and argues that it is unscientific to say that multiverse ideas are nonsense without mentioning evidence to the contrary. I have made a point of posting Tegmark's so-called evidence. I say that there is no scientific evidence for the multiverse, except maybe for matter outside our light cone (level I multiverse).

Update: Wikipedia just reinserted this in its list of common misconceptions:
Black holes, contrary to their common image, have the same gravitational effects as any other equal mass in their place. They will draw objects nearby towards them, just as any other planetary body does, except at very close distances.[117] If, for example, the Sun were replaced by a black hole of equal mass, the orbits of the planets would be essentially unaffected. A black hole can act like a "cosmic vacuum cleaner" and pull a substantial inflow of matter, but only if the star it forms from is already having a similar effect on surrounding matter.[118]
This entry originally said that the misconception is that a black hole is like a cosmic vacuum cleaner, but many black holes are in fact sucking in huge amounts of matter. Maybe the physicists like Hawking and Polchinski have more misconceptions than the laymen.
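
The point about orbits is easy to check with standard formulas (ordinary textbook results, not anything specific to the Wikipedia entry): outside a spherically symmetric mass, both Newtonian gravity and the Schwarzschild solution depend only on the total mass M,
\[
F = \frac{GMm}{r^2},
\qquad
ds^2 = -\left(1-\frac{2GM}{c^2 r}\right)c^2\,dt^2 + \left(1-\frac{2GM}{c^2 r}\right)^{-1}dr^2 + r^2\,d\Omega^2 .
\]
The Sun's Schwarzschild radius is about 3 km while the Earth orbits at about 1.5 x 10^11 m, so replacing the Sun with an equal-mass black hole would leave the planetary orbits essentially unchanged, just as the entry says.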

Sunday, January 26, 2014

Physicists promote quantum woo

The London Daily Mail had an article on Quantum physics proves that there IS an afterlife, claims scientist:
By looking at the universe from a biocentric's point of view, this also means space and time don't behave in the hard and fast ways our consciousness tell us it does.

In summary, space and time are 'simply tools of our mind.'

Once this theory about space and time being mental constructs is accepted, it means death and the idea of immortality exist in a world without spatial or linear boundaries.

Theoretical physicists believe that there is infinite number of universes with different variations of people, and situations taking place, simultaneously.

Lanza added that everything which can possibly happen is occurring at some point across these multiverses and this means death can't exist in 'any real sense' either.

Lanza, instead, said that when we die our life becomes a 'perennial flower that returns to bloom in the multiverse.' ...

Lanza cites the double-slit test, pictured, to backup his claims. When scientists watch a particle pass through two slits, the particle goes through one slit or the other. If a person doesn't watch it, it acts like a wave and can go through both slits simultaneously. This means its behaviour changes based on a person's perception

This is nonsense, but I cannot be bothered with all the nonsense in the world. This blog focuses mainly on nonsense from physicists pretending to do physics.

Physicist Phil Moriarty posted a video rant against the above article and its quantum woo. Leftist-atheist-evolutionist Jerry Coyne praises it. I agreed with much of what Moriarty said, but along the way he argues:
You make a measurement on this one [particle], and this [other distant] one responds instantaneously. Not at the speed of light, instantaneously. ... We don't understand it. [at 5:00] ...

Spin is not spin.
No, it has never been shown that a particle responds instantaneously to a distant measurement. He is referring to the phenomenon of entanglement, which is well-understood and explained in textbooks. (BTW, all three volumes of the Feynman lectures are now freely online.) If such nonlocality were ever proved, a Nobel Prize would be given for it, and it would be one of the great discoveries in the history of science.

When genuine physicists recite this nonsense, there is little wonder that non-physicist intellectuals say it, and the popular press reports it. I blame the physicists.

His argument that spin is not spin is also nonsense. Quantum spin is the quantization of classical spin, as sketched below. The electron is spinning, after all. If you treat the electron as a classical particle, you will get some paradoxes, but not just with spin. You get them with position, momentum, charge, and every other observable. Spin is real spin, just like those other observables.
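
Here is the standard sense in which quantum spin is quantized angular momentum; these are ordinary textbook formulas, not anything peculiar to my argument. The spin operators obey the same commutation relations as orbital angular momentum, and the eigenvalues come out quantized:
\[
[S_x,S_y]=i\hbar S_z \ (\text{and cyclic}),
\qquad
S^2|s,m\rangle = \hbar^2 s(s+1)\,|s,m\rangle,
\qquad
S_z|s,m\rangle = \hbar m\,|s,m\rangle ,
\]
with s = 1/2 and m = ±1/2 for the electron. The orbital operators L_x, L_y, L_z satisfy exactly the same algebra; the only difference is that orbital angular momentum allows only integer quantum numbers.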

Philosopher Massimo Pigliucci writes:
I’ve been reading for a while now Jim Baggott’s Farewell to Reality: How Modern Physics Has Betrayed the Search for Scientific Truth, a fascinating tour through cutting edge theoretical physics, led by someone with a physics background and a healthy (I think) dose of skepticism about the latest declarations from string theorists and the like.

Chapter 10 of the book goes through the so-called “black holes war” (BHW) ...

And now comes what Baggott properly refers to as the reality check. Let us start with the obvious, but somehow overlooked, fact that we only have (very) indirect evidence of the very existence of black holes, the celestial objects that were at the center of the above sketched dispute. And let us continue with the additional fact that we have no way of investigating the internal properties of black holes, even theoretically (because the laws of physics as we understand them break down inside a black hole’s event horizon). We don’t actually know whether Hawking radiation is a real physical phenomenon, nor whether black holes do evaporate.

To put it another way, the entire BHW was waged on theoretical grounds, by exploring the consequences of mathematical theories that are connected to, but not at all firmly grounded in, what experimental physics and astronomy are actually capable of telling us. How, then, do we know if any of the above is “true”? Well, that depends on what you mean by truth or, more precisely, to what sort of philosophical account of truth (and of science) you subscribe to.
I previously made similar points in my book, How Einstein Ruined Physics. "Ruined Physics" is another way of saying "Modern Physics Has Betrayed the Search for Scientific Truth". One of my examples is how the black hole war is debated based on faulty theoretical concepts for things like information, with no possibility of scientific evidence either way.

I partially blame the trend on how Einstein idolizers claim to be following in his footsteps. Baggott does not go so far in blaming Einstein.

Coyne hates Pigliucci for calling him on the bad science, philosophy, and theology of the New Atheists like himself. I am not sure about the philosophical issues, but Pigliucci does explain decisively why Coyne and the others are wrong about free will.

Thursday, January 23, 2014

Tegmark on book tour

I criticized Max Tegmark's new book, and I attended his book tour lecture in Santa Cruz.

He mainly tried to impress the audience that the history of science had two big trends: finding the universe to be bigger than expected, and finding it to be more mathematical than expected. He is taking these trends to the logical conclusion, and hypothesizing that the universe includes all imaginable possibilities, and that they are all purely mathematical.

I thought that I had an understanding of what he meant by "mathematical". But now I do not think that he has a coherent idea himself. A student asked: if the universe is reducible to math, is math reducible to axioms, set theory, homotopy type theory, or what? He evaded the question and did not answer it.

In response to another question, he said that he likes infinity, and mathematicians and physicists use infinity all the time, but he does not believe in it. He not only does not believe in infinite cardinals, he does not believe that the real numbers are infinitely divisible. At least not the real numbers that match up to his mathematical universe. By avoiding infinity, he says he also avoids Gödel paradoxes. (Update: See Tegmark's clarification in the comments below.)

This makes very little sense. The Gödel paradoxes occur with just finite proofs about finite natural numbers. I guess he can assume that the universe is some finite discrete automaton with only finitely many measurement values possible, but then the universe is not truly described by differential equations. All of his arguments for the universe being mathematical were based on differential equations.

Tegmark also spent a lot of time arguing that the govt should spend a lot more money trying to reduce risk of future disasters, such as funding the Union of Concerned Scientists or monitoring stray asteroids. He complained that Justin Bieber is more famous than some Russian technician who helped avert war during the Cuban missile crisis.

The trouble with this argument is that his math multiverse philosophy requires him to believe that time, randomness, probability, risk, human caring, emotion, and free will are all illusions. What seems like a choice is really determined. We might appear to be lucky when an asteroid misses the Earth, but a parallel asteroid hits a parallel Earth in a parallel universe, and someone with the same thoughts and feelings as you gets killed. The difference between you and the parallel guy who gets killed is just another illusion.

I asked him about this afterwards, and he claimed that I should care about the outcome of this universe for the same reasons that I put my clothes on in the morning. The woman next to me suggested that I read Sartre, if I wanted to blindly contemplate my own existence. No thanks, he was a Marxist kook.

I also listened to Tegmark's FQXi podcast on his new paper, Consciousness as a State of Matter, which treats consciousness as a state of matter in addition to the solid, liquid, and gas states.

On another blog, Tegmark gave this experimental evidence for his ideas:
a) Observations of the cosmic microwave background by the Planck satellite etc. have make some scientists take cosmological inflation more seriously, and inflation in turn generically predicts (according to the work of Vilenkin, Linde and others) a Level I multiverse.

b) Steven Weinberg’s use of the Level II multiverse to predict dark energy with roughly the correct density before it was observed and awarded the Nobel prize has made some scientists take Level II more seriously.

c) Experimental demonstration that the collapse-free Schrödinger equation applies to ever larger quantum systems appears to have made some scientists take the Level III multiverse more seriously.

Is it really completely obvious that these people are all deluded and that none of these three developments have any bearing on your question?
I can believe that there is matter outside of our observable universe (light cone), and that maybe we will get indirect evidence for it, even tho we cannot see it. Call it another universe if you want. But beyond that, these multiverse arguments are silly. Weinberg's argument was merely an argument about how different dark energy densities could affect galaxy formation. It says nothing about any multiverse. (Lee Smolin gives another argument.) And those quantum experiments provide no evidence against the Copenhagen interpretation, or you would hear about it.

Being a mathematician, my prejudices are toward a Pythagorean view that math explains everything. But Tegmark seems completely misguided to me. He has put himself out there before the public promoting these ideas as legitimate science, but I do not see it as either good math or good physics.

Update: Woit posted some sharper criticism:
The “Mathematical Universe Hypothesis” and Level IV multiverse of Tegmark’s book is not “controversial”. As far as I can tell, no serious scientist other than him thinks these are non-empty ideas. There is a controversy over the string theory landscape, but none here. These ideas are also not “radical”, they are content-free.
That is wishful thinking. The various multiverse ideas, such as many worlds, are increasingly popular. The only serious criticism of Tegmark, as far as I know, is my 2012 FQXi essay.

Wednesday, January 22, 2014

Quantum indefiniteness, not retrocausality

David Ellerman writes:
From the beginning of quantum mechanics, there has been the problem of interpretation, and, even today, the variety of interpretations continues to multiply [21]. ... mathematics itself contains a very basic duality that can be associated with two meta-physical types of reality:

1. the common-sense notion of objectively definite reality assumed in classical physics, and
2. the notion of objectively indefinite reality suggested by quantum physics.

The "problem" of interpreting quantum mechanics (QM) is essentially the problem of making sense out of the notion of objective indefiniteness. ...

There has long been the notion of subjective or epistemic indefiniteness ("cloud of ignorance") that is slowly cleared up with more discrimination and distinctions (as in the game of Twenty Questions). But the vision of reality that seems appropriate for quantum mechanics is objective or ontological indefiniteness. The notion of objective indefiniteness in QM has been most emphasized by Abner Shimony ([34], [35], [36]). ...

In addition to Shimony's "objective indefiniteness" (the phrase used here), other philosophers of physics have suggested related ideas such as:
Peter Mittelstaedt's "incompletely determined" quantum states with "objective indeterminateness" [31],
Paul Busch and Greg Jaeger's "unsharp quantum reality" [4],
Paul Feyerabend's "inherent indefiniteness" [16],
Allen Stairs' "value indefiniteness" and "disjunctive facts" [37],
E. J. Lowe's "vague identity" and "indeterminacy" that is "ontic" [28],
Steven French and Decio Krause's "ontic vagueness" [18],
Paul Teller's "relational holism" [39], and so forth.

Indeed, the idea that a quantum state is in some sense "blurred" or "like a cloud" is now rather commonplace even in the popular literature. The problem of making sense out of quantum reality is the problem of making sense out of the notion of objective indefiniteness that "conflicts sharply with common sense."
The quantum indefiniteness is not so hard to understand, and it is implicit in the uncertainty principle. An electron is really a wave, and its position and momentum cannot be simultaneously observed. You can accept the objectivity of the electron, but the position and momentum are indefinite until observed.
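
That indefiniteness is exactly what the uncertainty relation quantifies; it follows from the non-commuting position and momentum operators (standard textbook material):
\[
[\hat{x},\hat{p}] = i\hbar
\quad\Longrightarrow\quad
\Delta x\,\Delta p \ \ge\ \frac{\hbar}{2} .
\]
A state simply does not carry a sharp position and a sharp momentum at the same time; neither value is definite until the corresponding measurement is made.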

A reader questions whether Bohr really won the Bohr-Einstein debates. F.A. Muller writes:
In his Nobel Lecture of 1969, Murray Gell-Mann notoriously declared that an entire generation of physicists was brainwashed into believing that the interpretation problems of QM were solved, by the Great Dane.
Gell-Mann also called quarks a useful mathematical figment.

I guess a lot of physicists and philosophers still do not accept what Bohr had to say, but it is a historical fact that Bohr's side of the argument is what made it into the textbooks. (Gell-Mann's lecture is not on the Nobel Sweden site, so I could not confirm his statement. He got the prize for discovering quarks, but he was always afraid to say that the quarks were real, until after everyone else accepted them.)

For a more level-headed explanation of quantum mechanics, see Why Delayed Choice Experiments do NOT imply Retrocausality. Those experiments are a little puzzling, and they do demonstrate quantum indefiniteness, but they do not imply retrocausality, nonlocality, or other mystical concepts.

Monday, January 20, 2014

Math multiverse fails falsifiability test

Peter Woit trashes Tegmark's new book on the math universe as grandiose nonsense from a crank:
One answer to the question is Tegmark’s talent as an impresario of physics and devotion to making a splash. Before publishing his first paper, he changed his name from Shapiro to Tegmark (his mother’s name), figuring that there were too many Shapiro’s in physics for him to get attention with that name, whereas “Tegmark” was much more unusual. In his book he describes his method for posting preprints on the arXiv, before he has finished writing them, with the timing set to get pole position on the day’s listing. Unfortunately there’s very little in the book about his biggest success in this area, getting the Templeton Foundation to give him and Anthony Aguirre nearly $9 million for a “Foundational Questions Institute” (FQXi). Having cash to distribute on this scale has something to do with why Tegmark’s multiverse ideas have gotten so much attention, and why some physicists are respectfully reviewing the book.
Tegmark's "mathematical universe" is really a misnomer, because it is a mathematical multiverse. He does not just say that the universe is mathematical, but he postulates a multitude of other unobservable universe based on purely mathematical constructions.

Massimo Pigliucci also explains what is wrong with Tegmark's idea, but does get hung up on some side issues:
Now, the spin of a particle, although normally described as its angular momentum, is an exquisitely quantum mechanical property (i.e., with no counterpart in classical mechanics), and it is highly misleading to think of it as anything like the angular momentum of a macroscopic object.
That is nonsense. Quantum spin is the quantized version of ordinary classical spin angular momentum, as explained here.

Pigliucci also gets hung up on some issues involving Gödel and infinity. So does Tegmark.

A comment tries to defend Tegmark:
Does de Broglie-Bohm and Many Worlds tell you something non-empty about quantum mechanics? Tegmark’s work is in the same vein. Perhaps you don’t find it compelling, but that puts you in the Copenhagen camp that dismissed the utility of such alternate interpretations for years, ie. “shut up and calculate!”. Alternate interpretations don’t say anything interesting if you’re only interested in empirical data or calculated predictions, since all interpretations are formally equivalent. They do have significant explanatory power though, a power that Copenhagen completely lacks.
The de Broglie-Bohm and Many Worlds interpretations are also empty. They have told us nothing, and have no explanatory power.

Scott Aaronson mocks the idea that these interpretations tell us something about determinism:
1. “Bohmian mechanics achieves something amazing and overwhelmingly important: namely, it makes quantum mechanics completely deterministic! Of course, the actual outcomes of actual measurements are just as unpredictable as they are in standard QM, but that’s a mere practical detail.”

2. “Hey, don’t forget Many-Worlds, which also makes QM completely deterministic! Of course, the ‘determinism’ we’re talking about here is at the level of the whole wavefunction, rather than at the level of the actual outcomes you actually experience in your little neck of the woods.”
That's right, if some interpretations of quantum mechanics are deterministic, and some are not, then determinism is not a useful concept. And those who give arguments for or against free will based on determinism are misguided. So it is not true that those interpretations say anything interesting.

Roger Penrose proposed a provocative and speculative mechanism for how quantum mechanics could enable human consciousness and free will, called Orchestrated objective reduction. It is widely regarded as having been falsified. However, new research claims that the theory is still viable.

Sean M. Carroll attacks falsifiability:
String theory and other approaches to quantum gravity involve phenomena that are likely to manifest themselves only at energies enormously higher than anything we have access to here on Earth. The cosmological multiverse and the many-worlds interpretation of quantum mechanics posit other realms that are impossible for us to access directly. Some scientists, leaning on Popper, have suggested that these theories are non-scientific because they are not falsifiable.

The truth is the opposite. Whether or not we can observe them directly, the entities involved in these theories are either real or they are not.
Carroll has acquired a public image as a prominent physicist who speaks out for science and against religion. He is currently on the TCM TV channel promoting old movies with science themes. He is an embarrassment because he promotes crackpot ideas and denies scientific evaluation. I had assumed that he was a Cal Tech professor, but he is not. His blog is on the Wikipedia local blacklist. I don't know why, except maybe for all the silly unfalsifiable ideas he promotes in the name of science.

Aaronson has the good sense to defend Popper and falsifiability, even if he does misunderstand how it would apply to a statement about zebras, but Lumo says that string theory needs a free pass:
Rahul: there are hundreds of rock-solid reasons to be near certain that string theory is right and none of these reasons has anything to do with experiments of the last 40 years. The characteristic scale of string theory – or any other hypothetical unifying theory or theory of quantum gravity – is inaccessible to direct experiments which means that the bulk of pretty much any progress is of mathematical nature. Am I really the first one who tells you about this fact?
Update: Aaronson adds his justification for quantum computing research:
Meanwhile, far away from the din of the circus tent lies the actual truth of the matter: that we’re in more-or-less the same situation with QC that Charles Babbage was with classical computing in the 1830s. We know what we want and we know why the laws of physics should allow it, but that doesn’t mean our civilization happens to have reached the requisite technological level to implement it. With unforeseen breakthroughs, maybe it could happen in 20 years; without such breakthroughs, maybe it could take 200 years or longer. Either way, though, I’d say that the impact QC research has already had on classical computer science and on theoretical and experimental physics, more than justifies the small number of people who work on it (fewer, I’d guess, than the number of people whose job is to add pointless features and bugs to Microsoft Office) to continue doing so.
I guess that there is some worthwhile research that makes extravagant claims in order to get funding. I am not seeing that as a good thing.

Also, Google still hopes for QC performance.

Friday, January 17, 2014

No evidence of quantum speedup

Quantum computing is one of the most overhyped technologies of our day, with nearly all the press stories and experts saying that it is an inevitable consequence of known physics. And yet after maybe $100M of research money over 20 years, it has been a total failure.

MIT quantum computing theorist Scott Aaronson announces:
A few days ago, a group of nine authors (Rønnow, Wang, Job, Boixo, Isakov, Wecker, Martinis, Lidar, and Troyer) released their long-awaited arXiv preprint Defining and detecting quantum speedup, which contains the most thorough performance analysis of the D-Wave devices to date, and which seems to me to set a new standard of care for any future analyses along these lines.
The paper says:
The development of small-scale digital and analog quantum devices raises the question of how to fairly assess and compare the computational power of classical and quantum devices, and of how to detect quantum speedup. ... we find no evidence of quantum speedup when the entire data set is considered ...
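The paper's basic test is roughly this: compare how the time-to-solution scales with problem size for the quantum device and for the best classical code, and see whether the ratio grows with size. Here is a minimal sketch of that comparison in Python, with made-up placeholder timings rather than the paper's data:

import numpy as np

# Hypothetical time-to-solution in seconds for increasing problem sizes N.
# These numbers are illustrative placeholders, not measurements from the paper.
N = np.array([32, 64, 128, 256, 512])
t_classical = np.array([0.01, 0.04, 0.18, 0.80, 3.60])
t_quantum = np.array([0.02, 0.07, 0.30, 1.30, 5.80])

# Rough version of the paper's criterion: a genuine quantum speedup should
# make this ratio grow with N, not stay flat or shrink.
ratio = t_classical / t_quantum
for n, r in zip(N, ratio):
    print(f"N = {n:3d}   classical/quantum time ratio = {r:.2f}")

A ratio that fails to grow with N is what "no evidence of quantum speedup" means in this framework.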
Separately M.I. Dyakonov posts another paper, Prospects for quantum computing: extremely doubtful:
When will we have useful quantum computers? The most optimistic experts say: “In 10 years”, others predict 20 to 30 years, and the most cautious ones say: “Not in my lifetime”. The present author belongs to the meager minority answering “Not in any foreseeable future”, and this paper is devoted to explaining such a point of view.
I agree with him, for reasons stated here, and I have been banned from Aaronson's blog for expressing that view.

(In fairness, it is alleged that Dyakonov does not address this paper from last month.)

I agree with Lumo that Scientific theories need to be falsifiable, and quantum computing has not yet been falsified. But unless someone finds some evidence of a quantum speedup, people are going to stop believing in this nonsense.

In this recent NPR interview, Seth Lloyd and others describe quantum computing as technologically inevitable, but breaking communications in real time might be 5 years away. This is as crazy as saying that we might have a manned space station on Jupiter in 5 years.

The above D-wave device supposedly has 503 qubits. Meanwhile the NSA classifies any research on a mere 3 logical qubits. If the research were showing an increasing number of qubits with a corresponding quantum speedup, then there would be reason to believe that some progress was being made. But whether the device has 1 or 503 qubits, no quantum speedup has been found.

Wednesday, January 15, 2014

The reality of the quantum state

Many physicists have had this idea:
Quantum mechanics predicts observations without telling us what is really going on. Perhaps the elusive truth is hidden in variables that are yet to be discovered.

So look at some reasonable class of theories, and find that they either conflict with quantum mechanics or have unphysical properties that make them much stranger than quantum mechanics.
Ever since Von Neumann in 1932, the consensus has been that the hidden variable theories do not work.

Nevertheless, people keep rediscovering some argument against hidden variables, and acting as if it were a profound result. The latest is Notes on the reality of the quantum state:
Notes on the reality of the quantum state

The physical meaning of the quantum state is an important interpretative problem of quantum mechanics. A long-standing question is whether a pure state relates only to an ensemble of identically prepared systems or directly to the state of a single system. Recently, Pusey, Barrett and Rudolph (PBR) demonstrated that under an independence assumption, the quantum state is a representation of the physical state of a single quantum system [1]. This poses a further interesting question, namely whether psi-ontology can be argued without resorting to nontrivial assumptions such as the independence assumption (cf. Ref. [2-4]). In this Letter, we will show that protective measurements [5,6] already provide such an argument.
A protective measurement is one that gets some info from a quantum state without destroying it. He is trying to imply that the quantum wave function must be some objective physical reality, just because quantum mechanics works so well. But this is no argument against a subjective psi-epistemic interpretation, as discussed in QBism below.

Monday, January 13, 2014

Tegmark book pushes math universe

MIT and FQXi physicist Max Tegmark writes in his new book, Our Mathematical Universe: My Quest for the Ultimate Nature of Reality:
In summary, there are two key points to take away: The External Reality Hypothesis implies that a “theory of everything” (a complete description of our external physical reality) has no baggage, and something that has a complete baggage-free description is precisely a mathematical structure. Taken together, this implies the Mathematical Universe Hypothesis, i.e., that the external physical reality described by the theory of everything is a mathematical structure. So the bottom line is that if you believe in an external reality independent of humans, then you must also believe that our physical reality is a mathematical structure. Everything in our world is purely mathematical – including you. ...

This crazy-sounding belief of mine that our physical world not only is described by mathematics, but that it is mathematics, makes us self-aware parts of a giant mathematical object. As I describe in the book, this ultimately demotes familiar notions such as randomness, complexity and even change to the status of illusions; it also implies a new and ultimate collection of parallel universes so vast and exotic that all the above-mentioned bizarreness pales in comparison, forcing us to relinquish many of our most deeply ingrained notions of reality.
Yes it is a crazy idea. A much more sensible SciAm reader comments:
Confusing mathematics with physical reality (extreme Platonism) has a long and undistinguished history going back to a few ancient Greek philosophers.

No one doubts that mathematics is an effective way to describe the physical world, but to give mathematics some sort of physical substance, or to say there is no physical substance - it's just math, is a fairly bizarre way to understand nature.

Could this conjecture be tested? I doubt it.

Is it science? Not by my definition.

A more sensible view of mathematics is that it is a very effective, but artificial and invented language we use to model nature.

Further, all mathematical models are approximations to physical reality, and thus subject to change and evolution.

The extreme Platonists like Tegmark are true believers who want to lead us into the completely abstract and unscientific world of fantasy, and define it as "reality".

We do not have to follow. ...

Michael S. Turner said in the late 1990s that the "go-go junk bond days of physics were over".

That may have been wishful thinking, or maybe it was just what was PC at the moment, but the fact is that junk-bond physics has grown even more prolific and exotic since then.
He is right -- the junk-bond physics has grown even more prolific and exotic than ever. I explained Tegmark's errors in my FQXi essay.

Tegmark replies:
Thanks Robert for raising these important issues! I discuss them extensively in the book, exploring the full spectrum of views.

Like you and Popper, I view untestable theories as unscientific.

Please beware that parallel universes are not a theory, merely a prediction of certain mathematical theories (such as cosmological inflation and unitary quantum mechanics), which have in turn passed some experimental tests (hence their popularity) but may be ruled out during new experiments in coming years. ...

You asked for more detailed material: you'll find a sample on http://mathematicaluniverse.org (click either "Popular" or "Technical" depending on your taste), and please feel free to ask me direct questions on Facebook as well. The book is of course mainly on uncontroversial but fascinating recent discoveries in modern physics, from cosmology to particle physics, but Scientific American predictably chose to highlight some of the most controversial material.
So he blames SciAm for this choice? Tegmark has spent much of his career promoting the math (multi) universe. See his 2003 SciAm cover story, Parallel Universes - Not just a staple of science fiction, other universes are a direct implication of cosmological observations, and his defense against criticisms, The Multiverse Strikes Back. His own web site says:
Articles about the book
  • Discover magazine, December 2013 issue (excerpts from Chapter 10; paywall disappears December 3 2013)
  • Discover magazine (interview with me about a key idea from the book)
Those articles are titled, "Everything in the Universe Is Made of Math – Including You" and "Is the Universe Actually Made of Math? Cosmologist Max Tegmark says mathematical formulas create reality." So yes, the SciAm excerpt did highlight the "key idea" from the book.

His proposal is silly because no one has ever reduced any physical object to math, not even a photon or electron. By his own admission, he needs to assume that randomness, complexity, and change (over time) are just illusions. These are philosophical issues that were debated by the ancient Greeks. His position is no stronger today than it was 2300 years ago.

Update: Woit reviews Tegmark:
I’m still though left without an answer to the question of why the scientific community tolerates if not encourages all this. Why does Nature review this kind of thing favorably? Why does this book come with a blurb from Edward Witten? I’m mystified.

Thursday, January 9, 2014

Predicting quantum computers

Brian Hayes describes quantum computers and writes:
The hardware doesn’t yet exist, but languages for quantum coding are ready to go.

The year is 2024, and I have just brought home my first quantum computer. When I plug it in and switch it on, the machine comes to life with a soft, breathy whisper from the miniature cryogenic unit. A status screen tells me I have at my disposal 1,024 qubits, or quantum bits, offering far more potential for high-speed calculation than all the gigabits and terabytes of a conventional computer. So I sit down to write my first quantum program. ...

Languages such as QCL and Quipper may well solve the problem of how to write programs for quantum computers. ...

Of course all these questions remain academic until reliable, full-scale quantum computers become available. A company called D-Wave offers machines with up to 512 superconducting qubits, but the architecture of that device is not suited to running the kinds of programs generated by QCL and Quipper; indeed, there’s controversy over whether the D-Wave machine should be called a quantum computer at all. For technologies that can implement quantum circuits with controlled interference and entanglement, the state of the art is roughly a dozen qubits. With those resources, Shor’s algorithm can factor the number 21. Much work remains to be done before my kiloqubit laptop arrives in 2024.
If that is a prediction, then I would like to bet against it. Programs in those languages will never run faster on a quantum computer than on a classical simulator. Hayes never mentions that quantum computers may be impossible, and those dozen qubits are not real.
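
To see why simulating those languages classically is easy for a dozen qubits and hopeless for Hayes's kiloqubit laptop, count the amplitudes that a brute-force state-vector simulator has to store. This is my own back-of-the-envelope sketch, not anything from Hayes's article:

def num_amplitudes(n_qubits):
    """A brute-force state-vector simulator stores 2**n complex amplitudes."""
    return 2 ** n_qubits

for n in (12, 50, 1024):
    amps = num_amplitudes(n)
    # Report the count as a power of ten, since 2**1024 overflows a float.
    print(f"{n:4d} qubits -> 2^{n} ~ 10^{len(str(amps)) - 1} amplitudes")

A dozen qubits is a few thousand amplitudes, which any laptop handles instantly; 1,024 qubits would require about 10^308 amplitudes, vastly more than the number of atoms in the observable universe.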

Monday, January 6, 2014

Misconceptions about QM and free will

Sweden-based physicist Sabine Hossenfelder posts 10 Misconceptions about Free Will:
If somebody talks about a “question that science cannot answer” what they really mean is a question they don’t want an answer to. Science can indeed be very disrespectful to people’s beliefs.
No, saying that science cannot answer a question means that there is no experiment or set of observations to resolve the question.

Are sunsets beautiful? Is Obamacare a failure? Is P = NP? These are legitimate questions, but science cannot answer them.
“Do humans have free will?” is a question I care deeply about. It lies at the heart of how we understand ourselves and arrange our living together. It also plays a central role for the foundations of quantum mechanics. In my darker moods I am convinced we’re not making any progress in quantum gravity because physicists aren’t able to abandon their belief in free will. And from the foundations of quantum mechanics the roadblock goes all the way up to neuroscience and politics.

Yes, I just blamed the missing rational discussion about free will for most of mankind’s problems, including quantum gravity.
There you have the folly of modern physics in a nutshell. Yes of course free will lies at the heart of how we understand ourselves. Yes, free will plays a central role for the foundations of quantum mechanics, as QM is unique among modern science theories in that it accommodates free will, as shown by the free will theorem.

But no, an Einsteinian belief in determinism will never lead to any progress in quantum gravity.

She lays bare her illogical argument:
So, first the facts.

Fact 1: Everything in the universe, including you and your brain, is composed of elementary particles. What these particles do is described by the fundamental laws of physics. Everything else follows from that, in principle. ...

Fact 2: All known fundamental laws of nature are either deterministic or random. To our best present knowledge, the universe evolves in a mixture of both, but just exactly how that mixture looks like will not be relevant in the following.

Having said that, I need to explain just exactly what I mean by the absence of free will:

a) If your future decisions are determined by the past, you do not have free will.

b) If your future decisions are random, meaning nothing can influence them, you do not have free will.

c) If your decisions are any mixture of a) and b) you do not have free will either.
It would be understandable if she said this in the 19th century. But QM is our leading theory, and it teaches that future decisions are not determined by the past, and that they are only random in the sense of not being predictable by others. In other words, QM is completely compatible with free will.

If you think that QM is somehow incompatible with free will, ask yourself this question: Is it possible for any physical theory to be more compatible with free will than QM? I do not see how. Free will is baked into QM.

If the answer is yes, then ask why no one has ever proposed such a theory. If the answer is no, and you still oppose free will, then your opposition to free will is purely philosophical and independent of any physical principles.

Some ancient philosophers opposed free will, so it is possible to take such a view. Just do not pretend that the view is informed by science, because it is not.
The conflict that doesn't go away is this: There's no sense in which you can change your future. ...

"I could go and jump out of a window. That would change my future dramatically."

No it wouldn't. Your future either does or doesn't contain you jumping out the window. There's nothing you can change about that.
The root of her problem is that she does not believe in counterfactuals. She rejects a concept that is easily understood by little kids. I will post more on that.

She believes in Einsteinian determinism, and even likes superdeterministic hidden variables theories, more commonly known as "conspiracy theories".

Lumo addresses her misconceptions one by one.

Saturday, January 4, 2014

NSA classifies quantum computers

According to leaks reported by the Wash. Post, the NSA classifies quantum crypto research:
The National Security Agency is conducting what it calls "basic research" to determine whether it's possible to build a quantum computer that would be useful for breaking encryption. ...

“It seems improbable that the NSA could be that far ahead of the open world without anybody knowing it,” MIT Professor of Electrical Engineering and Computer Science Scott Aaronson told the Post.
We have to keep up with all the others making no progress in this field:
Seth Lloyd, an MIT professor of quantum mechanical engineering, said the NSA’s focus is not misplaced. “The E.U. and Switzerland have made significant advances over the last decade and have caught up to the U.S. in quantum computing technology,” he said.
Here are the qubit details:
(S//REL) Level B QC - Classified theoretical and/or experimental research in the design, physical implementation, and operation of quantum computers, as established by the Laboratory for Physical Sciences/R3. The boundaries are based on the number and quality of qubits, realism and specificity of design, control precision, and detail of analysis. While these boundaries may change over time, as of the publication of this guide, the values are:

(1) (S//REL) Detailed engineering design of 51 or more physical qubits;
(2) (S//REL) Implementation and operation of a high-fidelity 21-or-more physical-qubit device; or
(3) (S//REL) Implementation and operation of three (3) or more logical qubits, with sufficient speed and precision to allow preservation of quantum information and logical gates between the qubits.
So they classify 51 physical qubits, but a mere 3 logical qubits.

A qubit is not really a qubit unless it can preserve quantum information and be operated on by logic gates. So the NSA is using the term "logical qubit" for a true qubit, which has never been demonstrated. Apparently it is customary for quantum computer researchers to claim some large number of qubits, but they are not real qubits. The NSA will have a major top-secret breakthru if it obtains a device with a mere 3 logical qubits.
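
To illustrate the distinction between a physical qubit and a logical qubit, here is a minimal state-vector sketch of the simplest quantum error-correcting code, the 3-qubit bit-flip repetition code: three physical qubits protect one logical qubit against a single bit-flip error. This is my own illustration of the general idea, not the NSA's criterion or anyone's hardware; a real logical qubit also has to survive phase errors and support fault-tolerant gates.

import numpy as np

# Encode one logical qubit alpha|0> + beta|1> into three physical qubits
# as alpha|000> + beta|111> (the 3-qubit bit-flip repetition code).
def encode(alpha, beta):
    psi = np.zeros(8, dtype=complex)
    psi[0b000] = alpha
    psi[0b111] = beta
    return psi

def apply_x(psi, qubit):
    """Flip physical qubit `qubit` (0, 1, or 2): a bit-flip error or a correction."""
    out = np.zeros_like(psi)
    for i, amp in enumerate(psi):
        out[i ^ (1 << qubit)] = amp
    return out

def syndrome(psi):
    """Parities of qubit pairs (0,1) and (1,2). They are the same for every
    basis state in the support, so reading them out does not disturb the
    encoded superposition."""
    i = next(k for k, a in enumerate(psi) if abs(a) > 1e-12)
    b0, b1, b2 = (i >> 0) & 1, (i >> 1) & 1, (i >> 2) & 1
    return (b0 ^ b1, b1 ^ b2)

def correct(psi):
    # Each single bit-flip error gives a distinct syndrome, so it can be undone.
    flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(psi)]
    return psi if flip is None else apply_x(psi, flip)

alpha, beta = 0.6, 0.8
psi = apply_x(encode(alpha, beta), 1)                  # a bit-flip error on physical qubit 1
print(np.allclose(correct(psi), encode(alpha, beta)))  # True: the logical qubit survives

Run it and the final line prints True: the encoded superposition survives the error and the correction.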

I am still banned from Aaronson's blog. He recently posted, in response to skeptics:
In my personal opinion, proving that QC can work would be at least as interesting—purely for fundamental physics, setting aside all applications!—as the discovery of the Higgs boson was. And our civilization (OK, mostly the EU) decided that finding the Higgs boson was worth $11 billion. It’s hard for me to understand how people can simultaneously believe that that was justified, and that spending a comparable amount to prove the reality of beyond-classical computational power in our universe wouldn’t be.
Here is my censored reply:
The EU would not have agreed to spend $11B just to find the Higgs. The LHC was sold as a machine to disprove the Standard Model and find more fundamental laws of physics. If they knew that the LHC was just going to confirm what was already known, plus give a mass value for the Higgs, I doubt that they would have funded it.
Physicists overhype these projects in order to get funding. They are still complaining about how the Texas Superconducting Super Collider got killed, but that was supposed to discover a new unified field theory. The LHC has only produced evidence confirming the theory that was developed and accepted in the 1970s.

Quantum computing has been promised for about 20 years, and will probably be promised for another 20 years. No computational shortcuts have ever been demonstrated. A top-secret NSA 3-qubit computer would not do much either. The NSA can say that it has to do this research to keep up with whoever else might be doing it.

Friday, January 3, 2014

Mermin taking Bohr seriously

N. David Mermin writes in QBism as CBism: Solving the Problem of "the Now":
In a Physics Today Commentary,1 and more carefully, extensively, and convincingly with Chris Fuchs and Ruediger Schack,2 I argued that stubborn longstanding problems in the interpretation of quantum mechanics fade away if one takes literally Niels Bohr’s dictum3 that the purpose of science is not to reveal “the real essence of the phenomena” but to find “relations between the manifold aspects of our experience.”
I have mentioned QBism and Mermin's coauthored paper, with the comment that people should have just listened to Bohr in the first place.

The 2012 Physics Today article starts:
Quantum mechanics is the most useful and powerful theory physicists have ever devised. Yet today, nearly 90 years after its formulation, disagreement about the meaning of the theory is stronger than ever. New interpretations appear every year. None ever disappear.

Probability theory is considerably older than quantum mechanics and has also been plagued from the beginning by questions about its meaning. And quantum mechanics is inherently and famously probabilistic.

For the past decade, Carl Caves, Chris Fuchs, and Ruediger Schack have been arguing that the confusion at the foundations of quantum mechanics arises out of a confusion, prevalent among physicists, about the nature of probability.1 They maintain that if probability is properly understood, the notorious quantum paradoxes either vanish or assume less vexing forms.
I agree with most of that, except that I say that quantum mechanics is not more probabilistic than any other scientific theory.

Many times on this blog I have defended the Copenhagen interpretation against those who say that it is incoherent, unscientific, and obsolete. I repeatedly trash big-shot physicists who promote many-worlds or nonlocal interpretations of quantum mechanics as being somehow necessitated by disproving traditional interpretations. Mermin and Lubos Motl are about the only living physicists whose views on this I have agreed with.

To me, the paradox is why so many smart people find quantum mechanics so paradoxical, when the essentials were so clearly explained by Bohr, Heisenberg, and von Neumann 80 years ago. Mermin's theory for this has to do with a refusal to accept a probability interpretation. I have some other ideas, and I will be posting them.

Mermin defends his article with the opinion of another quantum founder:
a 1931 letter from Erwin Schrödinger to Arnold Sommerfeld:3 "Quantum mechanics forbids statements about what really exists--statements about the object. It deals only with the object-subject relation. Although this holds, after all, for any description of nature, it appears to hold in a much more radical and far-reaching sense in quantum mechanics"
There is also a May 2013 SciAm article on QBism where it is again treated as something new, but it is really just the Copenhagen interpretation.

Anyone proposing a new (or old) quantum interpretation should at least recognize the Bohr-Heisenberg position, as that was considered standard for decades. People today act as if Bohr's position was nonsensical, as can be seen in the recent Dilbert cartoon.

I am not sure why a new name is needed to reiterate what Bohr said. After all, Bohr won those Bohr-Einstein debates back in the 1930s.

Update: Mermin previously promoted some opposing views of quantum mechanics in what he called the "Ithaca interpretation" (he is a professor in Ithaca). He said in 1996:
To live with so many requirements I need room for maneuver. This is provided by adopting, as my sixth and final desideratum, the view that probabilities are objective intrinsic properties of individual physical systems. I freely admit that I cannot give a clear and coherent statement of what this means. The point of my game is to see if I can solve the interpretive puzzles of quantum mechanics, given a primitive, uninterpreted notion of objective probability. ...

It therefore appears that the view of probability underlying the Ithaca interpretation must be anti-Bayesian.
And he said in 1997:
I shall not explore further the notion of probability and correlation as objective properties of individual physical systems, though the validity of much of what I say depends on subsequent efforts to make this less problematic. My instincts are that this is the right order to proceed in: objective probability arises only in quantum mechanics. We will understand it better only when we understand quantum mechanics better. My strategy is to try to understand quantum mechanics contingent on an understanding of objective probability, and only then to see what that understanding teaches us about objective probability.10

10 That objective probability plays an essential role in the quantum mechanical description of an individual system was stressed by Popper, who used the term "propensity". See Karl Popper, Quantum Theory and the Schism in Physics, Rowman and Littlefield, Totowa, New Jersey, 1982. Heisenberg may have had something similar in mind with his term "potentia". While I agree with Popper that quantum mechanics requires us to adopt a view of probability as a fundamental feature of an individual system, I do not believe that he gives anything like an adequate account of how this clears up what he called the "quantum mysteries and horrors".
I do not agree that there is any such thing as "objective probability" or propensity in quantum mechanics, any more than probability figures into other scientific theories.

Apparently Mermin tried to improve quantum mechanics with his own Ithaca interpretation and objective probability. Then he realized that was all a mistake, and switched back to completely subjective probability, and calls it QBism. I don't want to blame him for changing his mind, but it would be nice if he explained that QBism is a throwback to Bohr and its novelty is mainly in correcting errors by himself and others.

A comment below disagrees with me, and cites R.P. Feynman for how illogical QM is. I cannot agree. Feynman's textbook on quantum mechanics is now online, and it is not nonsensical, absurd, or illogical. He gives a coherent exposition of the theory. Yes, he has also been quoted as saying QM is strange and mysterious.