Thursday, October 30, 2014

Silly atheist attack on the Pope

Here is a current academic evolutionist attack on creationism, in today's New Republic. Leftist-atheist-evolutionist Jerry A. Coyne writes Stop Celebrating the Pope's Views on Evolution and the Big Bang. They Make No Sense. Then he announces that he is refusing to read the comments. He quotes the Pope:
The Big Bang, which today we hold to be the origin of the world, does not contradict the intervention of the divine creator but, rather, requires it. ...
Coyne then attacks:
Let’s start with the Big Bang, which, said Francis, requires the intervention of God. I’m pretty sure physicists haven’t put that factor into their equations yet, nor have I read any physicists arguing that God was an essential factor in the beginning of the universe. We know now that the universe could have originated from “nothing” through purely physical processes, if you see “nothing” as the “quantum vacuum” of empty space. Some physicists also think that there are multiple universes, each with a separate, naturalistic origin. Francis’s claim that the Big Bang required God is simply an unsupported speculation based on outmoded theological arguments that God was the First Cause of Everything.
The Pope is not a scientist, and I don't doubt that he uses theological arguments that lack scientific support. My concern here is with scientists misrepresenting the science.

Physicists have no idea whether God or anything was a factor in the beginning of the Big Bang. We have no observational evidence. The closest was supposed to be the BICEP2 data, but that is in serious doubt.

We do not know that the universe could have originated from nothing.

We have no evidence for multiple universes.

Coyne accuses the Pope of unsupported speculation, but the same could be said for multiple universes, or the universe originating from nothing.

Monday, October 27, 2014

Relativity has Lorentz and geometry explanations

I posted about the 3 views of special relativity, which may be called the Lorentz aether theory (LET), Einsteinian special relativity (ESR), and Minkowski spacetime (MST).

Briefly, Lorentz explained the contraction (and the Lorentz transformation, LT) as motion causing an electromagnetic distortion of matter and fields. ESR deduces the LT as a logical consequence from either the Michelson-Morley experiment (as FitzGerald 1889 and Lorentz 1892 did) or from postulates that Lorentz distilled from that experiment and Maxwell's equations (as Einstein 1905 did). MST recasts the LT as symmetries of a 4-dimensional geometry (as per Poincare 1905 and Minkowski 1908).
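
For reference, here is the LT for a boost with velocity v along the x-axis, in the standard modern notation (my addition for readers who want the formulas in front of them, not a quote from any of these papers):

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad
y' = y, \quad z' = z, \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```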

In case you think that I am taking some anachronistic view of MST, I suggest The Non-Euclidean Style of Minkowskian Relativity and Minkowski’s Modern World (pdf) by Scott Walter, and Geometry and Astronomy: Pre-Einstein Speculations of Non-Euclidean Space, by Helge Kragh. Non-Euclidean geometry was crucial for the early acceptance and popularity of special relativity in 1908, and dominated the early textbooks.

While the Lorentzian constructive view is unpopular, it has some explanatory advantages, and is a legitimate view. Last year I tried to add a sentence to that effect to the Wikipedia article on length contraction, but I was overruled by other editors.

Dan Shanahan recently wrote a paper on the explanatory advantages of LET, and asks:
You say:

"In the preferred (Minkowski) view, the LT is not really a transmutation of space and time. Minkowski spacetime has a geometry that is not affected by motion. The LT is just a way to get from one set of artificial coordinates to another."

and later:

"The physics is all built into the geometry, and the contraction seems funny only because we are not used to the non-Euclidean geometry."

I would understand, and could accept, both passages if you were describing the non-Euclidean metric of the gravitational equation where we have the intuitive picture of a curved spacetime. But I cannot see what changes could actually be occurring in flat spacetime in consequence of the rotation described by the LT. You say that “Minkowski spacetime has a geometry that is not affected by motion” and it is here in particular that I am not sure of your meaning.

I would say myself that whatever space might be, neither it nor its geometry could be affected (except in the sense described by general relativity) either by the motion through it of an observer or other object, or (and this is the important point) by changes in that motion. But that cannot be what you are saying for that is the Lorentzian constructive approach.
To answer this, you have to understand that Poincare and Minkowski were mathematicians, and they mean something subtle by a geometrical space.

Roughly, a geometrical space is a set of points with geometrical structures on it. The subtle part is that it must be considered the same as every other isomorphic space. So a space could be given in terms of particular coordinates, but other coordinates can be used isomorphically, so the coordinates are not to be considered part of the space. The space is a coordinate-free manifold with a geometry, and no particular coordinates are necessarily preferred.

The concept is so subtle that Hermann Weyl wrote a 1913 book titled, "The Concept of a Riemann Surface". The concept was that it could be considered a coordinate-free mathematical object. I can say that in one sentence, but Weyl had a lot of trouble explaining it to his fellow mathematicians.

Minkowski space can be described as R4 with either the metric or LT symmetry group (or both), but that is not really the space. It is just a particular parameterization of the space. There is no special frame of reference in the space, and many frames can be used.
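
Here is a minimal numerical sketch of my own (not Poincare's or Minkowski's notation) of what it means for the coordinates to not be part of the space: a boost changes an event's coordinates, but the Minkowski interval, which belongs to the geometry itself, does not change.

```python
import numpy as np

def boost(v):
    """Lorentz boost along x with velocity v, in units where c = 1."""
    g = 1.0 / np.sqrt(1.0 - v * v)  # Lorentz factor
    L = np.eye(4)
    L[0, 0] = L[1, 1] = g
    L[0, 1] = L[1, 0] = -g * v
    return L

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # Minkowski metric, signature (-+++)

x = np.array([2.0, 1.0, 0.5, -0.3])   # one event, in one frame's coordinates
xp = boost(0.6) @ x                    # the same event, in another frame

print(x @ eta @ x)    # -2.66
print(xp @ eta @ xp)  # -2.66 again (up to rounding): the geometry did not change
```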

Poincare introduced Minkowski space in 1905, and gave an LT-invariant Lagrangian for electromagnetism. The point of this is that the Lagrangian is a (covariant) scalar function on the coordinate-free Minkowski space. He then introduced the 4-vector potential, and implicitly showed that it was covariant. Minkowski extended this work in 1907 and 1908, and was more explicit about using a 4-dimensional non-Euclidean manifold, and constructing a covariant tensor out of the electric and magnetic fields. With this proof, the fields became coordinate-free objects on a geometrical space.

This is explained in section 8 of Henri Poincare and Relativity Theory, by A. A. Logunov, 2005.

Thus the physical variables are defined on the 4D geometrical space. The LTs are just isomorphisms of the space that preserve the physical variables.

This concept is like the fact that Newton's F=ma means the same in any units. To do a calculation, you might use pounds, grams, or kilograms, but it does not matter as long as you are consistent. The physical variables do not depend on your choice of units.

Changing from pounds to grams does not affect the force, and the LT does not transmute spacetime. They are just changes of coordinates that might make calculations easier or harder, but do not affect any physical reality.
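
A trivial numeric check of the analogy (my own toy numbers):

```python
# F = m a describes the same physics in any consistent system of units.
m_kg, a_ms2 = 2.0, 3.0
F_newtons = m_kg * a_ms2                 # 6.0 N

m_g, a_cms2 = m_kg * 1000, a_ms2 * 100   # the same mass and acceleration in CGS
F_dynes = m_g * a_cms2                   # 600000 dyn

print(F_dynes == F_newtons * 1e5)        # True: 1 N = 10^5 dyn, the same force
```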

To answer Dan's question, motion is just a change of coordinates, and in MST, the physics is built on top of a geometry that does not depend on any such coordinates. Motion by itself is not a real thing, as it is always relative to something else.

To me, the essence of special relativity is that it puts physics on a 4D non-Euclidean geometry. That is why it bugs me that Einstein is so heavily credited, when he only had a recapitulation of Lorentz's theory, and not the 4D spacetime, symmetry group, metric, covariance, or geometry. Many physicists and historians blame Poincare for believing in an aether with a fixed reference frame. The truth is more nearly the opposite, as Poincare said that the aether was an unobservable convention and proved that symmetries make all the frames equivalent in his theory.

Dan gives this thought experiment, which is his variant of the twin paradox:
Consider two explorers, who we will call Buzz and Mary. They had been travelling, in separate space ships, side by side, in the same direction. But Buzz has veered away to explore a distant asteroid. Mary concludes from her knowledge of the LT that time must now be running more slowly for Buzz and that he and his ship have become foreshortened in the direction that Buzz is travelling relative to her. Buzz observes no such changes either in himself or in his ship. To Buzz, it is in Mary and her ship that these changes have occurred. Buzz is also aware that events that he might previously have regarded as simultaneous are no longer so.

But what has actually changed? ...

For Buzz the LT will describe very well his altered perspective. But it would be as inappropriate to explain length contraction, time dilation and loss of simultaneity as resulting from a physical transformation of space or spacetime as it would be to describe the rotation of an object in 3-space as a rotation of space rather than a rotation in space.
Some books try to avoid this issue by saying that ESR only applies to inertial observers like Mary, not accelerated ones like Buzz. But historically, Lorentz, Poincare, Einstein, and Minkowski all applied special relativity to accelerating objects. General relativity is distinguished by gravitational curvature.

Dan argues in his paper that LET gives a physical explanation for what is going on, and ESR does not. I agree with that. Our disagreement lies with MST.

I say that there are two ways of explaining this paradox, and they are the same two that Poincare described in 1905. You can follow Lorentz, say that everything is electromagnetic, and attribute the LT to distortions in matter and fields. Dan's paper explains this view.

Poincare's second explanation was to say that relativity is a theory about “something which would be due to our methods of measurement.” The LT is not a contraction of matter or of space. It is just a different parameterization of a geometric spacetime that is not affected by motion at all.

Let me give my own thought experiment. Suppose Mary and Buzz are on the equator, traveling north side-by-side toward the North Pole. Assume that the Earth is a perfect sphere. Along the way, Buzz takes a right turn for a mile, then two left turns for a mile each, expecting to meet back up with Mary. Mary waits for him, but does not alter her northerly path. Then they discover that they are not on the same course. What happened?

You can apply Euclidean geometry, draw the great circles, and find that they do not close up. Or you can apply spherical geometry, where Mary and Buzz are traveling in straight lines, and note that squares have angles larger than 90 degrees. The problem is not that matter or space got transmuted. The problem is that Buzz took a non-Euclidean detour but returned with Euclidean expectations.
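
Here is the detour as a little simulation (my own sketch; it assumes a 3959-mile spherical Earth, with great-circle legs and exact 90-degree turns). For one-mile legs the miss is microscopic, so the loop also tries 1000-mile legs to make it visible:

```python
import numpy as np

R = 3959.0  # radius of a perfectly spherical Earth, in miles (assumed)

def advance(p, d, s):
    """Walk distance s along the great circle through p with unit tangent d."""
    a = s / R
    return p * np.cos(a) + d * np.sin(a), -p * np.sin(a) + d * np.cos(a)

def left(p, d):
    """Rotate the heading d by 90 degrees to the left in the tangent plane at p."""
    return np.cross(p, d)

for s in (1.0, 1000.0):
    lat = np.radians(45.0)  # somewhere along the way north
    p0 = np.array([np.cos(lat), 0.0, np.sin(lat)])
    north = np.array([-np.sin(lat), 0.0, np.cos(lat)])

    # Where Buzz expects to rejoin Mary's course: one leg farther north.
    expected, _ = advance(p0, north, s)

    # Buzz: a right turn, then two left turns, one leg each.
    p, d = advance(p0, -left(p0, north), s)  # right turn: head east
    p, d = advance(p, left(p, d), s)         # left turn: head roughly north
    p, d = advance(p, left(p, d), s)         # left turn: head roughly west

    gap = R * np.linalg.norm(p - expected)   # grows roughly like s**3 / R**2
    print(f"legs of {s:g} miles: Buzz misses his expected point by ~{gap:.2g} miles")
```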

In Dan's example, Buzz takes a detour in spacetime. It does not do anything strange to his spaceship or his clocks. The strangeness only occurs when he uses Euclidean coordinates to compare to Mary. Nothing physical is strange, as long as you stay within Minkowski's geometry.

Thus there are two physical explanations for the special relativistic effects. One is the FitzGerald-Lorentz LET, and one is the Poincare-Minkowski geometry. Einstein gives more of an instrumentalist approach to LET, and does not really explain it.

Friday, October 24, 2014

Science publicists attacking religion

Wikipedia has voted not to cite Neil deGrasse Tyson's misquotes. Apparently a majority feels as tho mentioning it would be caving in to a right-wing plot to discredit a prominent atheist, evolutionist, and global warming promoter.

Here is another Tyson appearance from July:
Later, when the topic of randomness came up again, Tyson reminded us of the importance of random events like how the impact that produced the moon shaped our planetary history, or how the impact that wiped out the dinosaurs made it possible for mammals (and humans) to dominate the earth. Maher reminded us of how religious folks hate the idea that random natural events get in the way of their carefully controlled world run by a God who looks over them. To which Tyson replied, “Get over it”.
First, "random events" is a rather unscientific term. There is no way to say that an event is or is not random. Scott Aaronson has foolishly claimed that it is possible in some cases to certify randomness, but even if he were right, the idea does not apply to cosmological impacts.

Second, I have never heard of religious folks who hate the idea of random natural events. On the contrary, they frequently cite such events as evidence for their beliefs in God's plan or supernatural intervention. A big calamity is often called an "act of God".

Bill Maher says stupid stuff all the time. He is hardly worth criticizing, as few people take him seriously. But Tyson is a scientist, and we expect better.

George Johnson writes a NY Times science essay, starting:
Galileo knew he would have the Church to contend with after he aimed his telescope at the skies over Padua and found mountains on the moon and more moons orbiting Jupiter — and saw that the Milky Way was made from “congeries of innumerable stars.” The old order was overturned, and dogma began to give way to science.
No, I doubt it. The Church never contested his telescopic observations, or had a problem with mountains on the Moon or anything like that. The Church did doubt that he had proof of the Earth's motion, because that would require re-interpretation of a couple of Bible passages. And the Church was right about that.
Johnson goes on to quote Dr. Lekson, an archaeologist, on deferring to Native American claims over ancient remains:
“It’s bad for science, but good (I suppose) for the Native American groups involved,” he wrote in an email. “Given that the U.S.A. was founded on two great sins — genocide of Native Americans and slavery of Africans — I think science can afford this act of contrition and reparation.”

But how is letting Indian creationism interfere with scientific research any different from Christian creationism interfering with public education — something that he would surely resist?

Logically they are the same, Dr. Lekson agreed. But we owed the Indians. “I’m given to understand that the double standard rankles,” he said.
Logically the same? I do not see the similarity. Christian creationists do not interfere with scientific research. They only dispute some scientific wisdom about the age of the Earth and some related matters. The Indians are not disputing scientific wisdom. They just want to preserve some land and bones.

I do not mind when scientists say that certain religious views are wrong, but I wish they would demonstrate their scientific outlook, and make sure that their attacks are accurate and grounded in evidence.

Wednesday, October 22, 2014

Defending a physical FitzGerald contraction

Daniel Shanahan writes A Case for Lorentzian Relativity, just published in Foundations of Physics:
For the student of physics, there comes a moment of intellectual pleasure as he or she realizes for the first time how changes of length, time and simultaneity conspire to preserve the observed speed of light. Yet Einstein's theory [1] provides little understanding of how Nature has contrived this apparent intermingling of space and time.

The language of special relativity (SR) may leave the impression that the Lorentz transformation (the LT) describes actual physical changes of space and time. Thus we have Minkowski's confident prediction that,
Henceforth, space by itself, and time by itself, are doomed to fade away into mere shadows and only a kind of union of the two will preserve an independent reality [2].
The impression that the LT involves some physical transmutation of "spacetime" might seem consistent with the change of that nature contemplated in general relativity (GR). ...

The effects described by the LT can be explained in their entirety by changes occurring in matter as it changes inertial frame. This is not to suggest that the LT does not describe a transformation of space and time. But what the LT describes are changes in what is observed, and in the Lorentzian approach offered here, what is observed by an accelerated observer is essentially an illusion induced by changes in that observer.

This view relies crucially on the conclusion reached in Sect. 1 that the LT does not involve actual physical change in the properties of space. But once that conclusion is reached, it becomes apparent that there is something elusive in Einstein's theory, and that it is the Lorentzian approach that better explains the origin of the contraction, dilation and loss of simultaneity described by the LT.

Once the LT is explained from the wave characteristic of matter a good deal else becomes apparent. The de Broglie wave is seen to be a modulation rather than an independent wave, thus explaining the superluminality of this wave, its significance in the Schrödinger equation, and its roles in determining the optical properties of matter and the dephasing that underlies the relativity of simultaneity.

Einstein's bold assertion that the laws of physics must be the same for all observers revealed the elegance of SR and something indeed of the elegance of the universe itself. It is suggested nonetheless that it is a Lorentzian approach that will provide the deeper understanding of the physical meaning of SR.
He is correct that there is some merit to viewing the LT as a physical effect caused by motion. Einstein called this a "constructive" approach in 1919, as opposed to what he called a "principle theory".

The relativity principle was not "Einstein's bold assertion". It was conventional wisdom, until Maxwell and Lorentz suggested electromagnetic tests of it. Then it was Poincare who proposed how it could be a symmetry of nature, including electromagnetism and gravity. Einstein only recited what Lorentz had already said.
But to entertain these thoughts is to embark upon a process of reasoning, associated primarily with Lorentz, that became unfashionable following Einstein's famous paper of 1905 [1]. Lorentz had sought to explain the transformation that bears his name on the basis of changes that occur in matter as it changes velocity. This was, it is suggested, an idea before its time.
Einstein later wrote that he also sought a constructive approach, but could not get it to work.

Historically, it is not true that Einstein's famous 1905 paper caused the constructive view to become unfashionable. Minkowski's 1908 paper did that.

In the preferred (Minkowski) view, the LT is not really a transmutation of space and time. Minkowski spacetime has a geometry that is not affected by motion. The LT is just a way to get from one set of artificial coordinates to another.
The LT was already reasonably well known by 1905. There had been significant contributions to its development, not only from Lorentz and Fitzgerald, but also by (among others) Heaviside, Larmor and Poincaré. It was Heaviside's analysis of the disposition of fields accompanying a charged particle (the "Heaviside ellipsoid") that had suggested to FitzGerald the idea of length contraction[12]. Larmor had described an early form of the LT and discussed the necessity of time dilation [13]. Poincaré had recognized the relativity of simultaneity and had studied the group theoretic properties that form the basis for the covariance of the transformation [14].

But these "trailblazers" (Brown [6], Ch. 4) appear to have missed in varying degrees the full significance of the transformation. It is not only particular phenomena, but all of Nature that changes for the accelerated observer. Lorentz struggled to explain how all aspects of matter could become transformed in equal measure, being discouraged by experimental reports that seemed to show that particles do not contract in the direction of travel (see Brown [6], p. 86). A wider view seems to have been noticed by Poincaré [14], who has been regarded by some as codiscoverer of SR (see, for instance, Zahar [15], and Reignier [16]). But it is not apparent that these earlier investigators saw the changes described by the LT as anything more than mathematical constructs. In his paper of 1905 [1], Einstein simply asserted that the velocity of light, and other properties of Nature, must appear the same for all uniformly moving observers, thereby effecting an immediate reconciliation between Maxwell's equations and classical mechanics.

In 1905, Einstein's approach may have been the only way forward. It was not until 1924, only a few years before the death of Lorentz, and well after that of Poincaré, that de Broglie proposed that matter is also wavelike [9], an idea that might have suggested to Lorentz why molecules become transformed in the same degree as intermolecular forces.
Here he recites this strange criticism that the LT was just a mathematical construct for Lorentz and Poincare, and therefore inferior to the 1905 Einstein view that has been accepted ever since.

A lot of historians and others make this argument. They cannot deny that Lorentz and Poincare had the LT formulas ahead of Einstein, so they say that Einstein had the physics while Lorentz and Poincare just had some mathematical ideas without realizing the physical relevance. In particular, they say that Lorentz had a "local time" that was not necessarily related to the time given by local clocks.

The trouble with their argument is that Lorentz and Poincare explicitly use their theory to explain the Michelson-Morley experiment, and that explanation only makes sense if the distances and times in the LT formulas are the same as those measured by meter sticks and clocks in the lab.

There were three historical formulations of SR, using Einstein's terminology:

  • Principle theory (FitzGerald 1889, Lorentz 1892, Einstein 1905) The LT is a logical consequence of an interpretation of Michelson-Morley, without much explanation of how it works.

  • Constructive theory (Lorentz 1895) The LT is explained by motion causing distortions in the electromagnetic fields that constitute our measuring instruments.

  • Geometric theory (Poincare 1905, Minkowski 1908) The LT reflects the mismatch between Euclidean and non-Euclidean coordinates.

The Michelson-Morley experiment could be interpreted as evidence for (1) a stationary Earth; (2) an emitter (non-wave) theory of light; (3) an aether drift theory; or (4) the relativity principle combined with a constant speed of light. FitzGerald and Lorentz rejected the first 3 possibilities based on other experiments, and were led to option (4). They then deduced the LT as a logical consequence of those principles.

The constructive theory has been unfashionable, but the above paper gives a good account of the merits of that view. I prefer to call this the bottom-up view. If there is a contraction, then the distance between molecules decreases, and this view explains it in terms of the molecular forces. The view is completely consistent with modern quantum field theory, but it is not usually mentioned in the textbooks.

Einstein preferred the principle theory formulation that he got from Lorentz. The Einstein idolizers rave about how great this was, but the geometric theory has been the dominant one among theoretical physicists since 1908.

I would say that the constructive theory has the most physical LT. In it, solid objects really contract because of increases in the forces that hold the molecules together. In the geometric theory, the LT is more of a mathematical illusion. The contraction is an artifact of trying to use Euclidean coordinates on a non-Euclidean spacetime. In a sense, the LT is merely mathematical, because it has more to do with how we measure than with physical changes, but it also explains the Michelson-Morley and other experiments.

Here is a more complete version of the above Minkowski quote:
The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. They are radical. Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.
The geometric view is a mathematical view "sprung from the soil of experimental physics". The story of SR is that Maxwell's equations and experiments like Michelson-Morley drove Lorentz, Poincare, and Minkowski to a series of discoveries culminating in our modern understanding that spacetime has a non-Euclidean geometry. Einstein had almost nothing to do with it, as he ignored Michelson-Morley, did not make any of the key breakthrus, and never even liked the geometric view.

I can understand Shanahan giving a defense of the constructive (bottom-up) view. It is a completely legitimate view that was historically crucial and useful for modern understanding. It deserves more respect, even tho it is not the preferred view. But he is wrong to endorse the opinion that Lorentz and Poincare, but not Einstein, missed the full significance of the LT.

Shanahan is essentially saying that the LT can be considered a transmutation of matter and fields, or a transmutation of space and time. If you emphasize the wave-like nature of matter, he argues, then the LT makes logical and physical sense as a transmutation of matter and fields. It is no stranger than transmuting space and time.

My problem with this dichotomy is that it ignores the Poincare-Minkowski geometric view where no physical things (matter, fields, space, time) get transmuted. The physics is all built into the geometry, and the contraction seems funny only because we are not used to the non-Euclidean geometry.

The full significance of the LT is that it generates the Lorentz group that preserves the non-Euclidean geometry of spacetime. Poincare explained this in his 1905 paper, and explained that it is a completely different view from Lorentz's. Lorentz deduced relativity from electromagnetic theory, while Poincare described his relativity as a theory about “something which would be due to our methods of measurement.”

Einstein's view was essentially the same as Lorentz's, except that Einstein had the velocity addition formula for collinear velocities, and lacked the constructive view.

I am beginning to think that the historians have some fundamental misunderstandings about relativity. SR only became popular after that 1908 Minkowski paper presented it as a non-Euclidean geometry on spacetime, with its metric, symmetry group, world lines, and covariant equations. Publications took off that year, and within a couple of years, textbooks started to be written. Minkowski's geometry theory was the hot theory, not Einstein's.

Today the general relativity (GR) textbooks emphasize the non-Euclidean geometry. The SR textbooks pretend that it is all Einstein's theory, and the geometry takes a back seat. The SR historians appear to not even understand the non-Euclidean geometry at the core of what made the theory popular.

(Shanahan once posted an excellent comment on this blog, and then deleted it. I do not know why. Maybe I will reinstate it as an anonymous comment. I also previously mentioned his current paper.)
Friday, October 17, 2014

S.M. Carroll begs for funding

Physicist Sean M. Carroll is begging for money:
Current research includes:

Emergence and Complexity: ... These issues connect physics to information theory, biology, and even consciousness.

Cosmology, Particle Physics, and Gravitation: Dr. Carroll has pioneered new ways of understanding the dark matter and dark energy that dominate our universe, as well as the origin of the Big Bang and how it determines the arrow of time that distinguishes past from future. ...

Understanding the Quantum Nature of Reality: Quantum mechanics is our most comprehensive and successful way of understanding reality, but even after decades of effort we still don't understand quantum mechanics itself. Dr. Carroll has developed a new approach to understanding how probability arises in quantum mechanics, and is investigating the foundations of quantum theory to better understand the emergence of spacetime, the nature of locality, and the appearance of multiple worlds.

How can you help?
Your contributions will support Dr. Carroll's research as he investigates fundamental challenges in theoretical physics.
If there were any chance Carroll would make significant progress on any of those issues, Caltech would give him tenure and promote him to professor.

I hate to dump on the guy, but he has become a prominent physicist publicity-seeker, and he uses his position to promote crackpot ideas. He believes in many-worlds.

Another big cosmology publicity-seeker is Neil deGrasse Tyson. Wikipedia is still debating inserting info about his bogus quotes. My complaint was that it was not just a misquote, but it was nearly the opposite of what the President said in his justification for war.

Now I happened to listen to a March 2014 podcast on Neil deGrasse Tyson on Why He Doesn't Call Himself an Atheist. Tyson's main point was that he does not like the term "atheist" because it is identified with his friend Richard Dawkins, and Tyson says his views are different because he likes AD, BC, and Christian music, and Dawkins does not. However, nearly everything they said about Dawkins was wrong:
Julia suggested and Tyson agreed that Dawkins uses C.E. and B.C.E instead of BC and AD. This simply isn't true. For example here is a line from The God Delusion, "Arius of Alexandria, in the fourth century AD, denied that Jesus was consubstantial..." (Dawkins 2006, p. 33). Dawkins uses BC and AD. It was also implied that the fact that Tyson enjoys things like Handel's Messiah, Jesus Christ Superstar, and art at cathedrals was somehow different from Dawkins "ardent" atheism. Dawkins has no compunction about liking religiously themed music. He writes, "I once was the guest of the week on a British radio show called Desert Island Discs. You have to choose the eight records you would take with you if marooned on a desert island. Among my choices was Mache dich mein Herze rein from Bach's St Matthew Passion" (Dawkins 2008, pp.110-111).
I can understand Tyson not wanting to get sucked into the atheist social justice warrior crowd, but he needs to learn to get his facts straight. If he is going to give a speech whose main point is to attack Bush or Dawkins, then he should find out what they really say.

Speaking of funding, philosopher Daniel Dennett praises the new book Free: Why Science Hasn't Disproved Free Will:
In recent years a growing gang of cognitive neuroscientists have announced to the world that they have made discoveries that show that “free will is an illusion.” This is, of course, momentous news, which promises or threatens to render obsolete a family of well-nigh sacred values: just deserts (for both praise and blame), guilt, punishment, honour, respect, trust, indeed the very meaning of life. Can it be that scientists have uncovered a secret quirk or weakness in our nervous systems that shows that we are never the responsible authors of our actions? Many have thought that if the physics of our world is deterministic (with no random swerves), free will is impossible whatever the details of brain activity prove to be. Philosophers have spent the last half century or so sorting themselves into two camps on this: “compatibilists” who claim that free will is compatible with determinism in spite of first appearances, and “incompatibilists” who disagree. In his new book Free: Why Science Hasn’t Disproved Free Will, philosopher Alfred Mele sets that apparently interminable wrangle aside and argues that whether you address the “modest free will” of compatibilism or the “ambitious free will” of those who hold out for indeterminism playing a role, the verdict is the same: these scientists have not come close to demonstrating that free will, in either sense, is an illusion. Their ingenious experiments, while yielding some surprising results, don’t have the revolutionary implications often claimed. ...

The mistakes are so obvious that one sometimes wonders how serious scientists could make them.
I have also argued here that those arguments against free will are fallacious.

So Dennett is happy with the book? No, he is a compatibilist atheist, and he does not like some of the author's funding:
So it is important to note that Mele’s research, as he scrupulously announces, and not in fine print, is supported by the Templeton Foundation. ... The Templeton Foundation has a stated aim of asking and answering the “Big Questions,” and its programmes include both science and theology. In fact, yoking its support of science with its support of theology (and “individual freedom and free markets”) is the very core of its strategy. ...

And that, as I persist in telling my friends in science whenever they raise the issue, is why I advise them not to get too close to Templeton.
This is a foolish anti-theology bias. Maybe Dennett is the sort of ardent atheist that Tyson wants to distance himself from. This book can be evaluated on its own merits, and I do not see that the funding makes any difference. No more than other books and articles, anyway.

When BICEP2 announced their inflation evidence, they were saying something that would seem to please their funding agencies. So they were biased to find inflation. The LHC was biased to find the Higgs. If you reject these sorts of biases, you will reject much of modern science.

Wednesday, October 15, 2014

The people who said QC was impossible

MIT complexity theorist Scott Aaronson just gave a talk on his favorite topics in quantum computing complexity, and his slides concluded with:
Single most important application of QC (in my opinion): Disproving the people who said QC was impossible!
Dream on!

At least he admits that no one has disproved us yet. Quantum computing is impossible. If I am right, then the majority of what he says is meaningless.

His title was "When Exactly Do Quantum Computers Provide A Speedup?". The answer would be never. Everyone should just ignore this field until someone actually demonstrates some sort of speedup.

He tries to explain that quantum computers have been overhyped as being able to compute all possible inputs at once, but that is parallelism, and not how quantum computer algorithms really work. On his blog, he says this is caused by common misunderstandings:
Over the years, I’ve developed what I call the Minus-Sign Test, a reliable way to rate popularizations of quantum mechanics.  To pass the Minus-Sign Test, all a popularization needs to do is mention the minus signs: i.e., interference between positive and negative amplitudes, the defining feature of quantum mechanics, the thing that makes it different from classical probability theory, the reason why we can’t say Schrödinger’s cat is “really either dead or alive,” and we simply don’t know which one, the reason why the entangled particles can’t have just agreed in advance that one would spin up and the other would spin down.  Another name for the Minus-Sign Test is the High-School Student Test, since it’s the thing that determines whether a bright high-school student, meeting quantum mechanics for the first time through the popularization, would come away thinking of superposition as

(a) one of the coolest discoveries about Nature ever made, or
(b) a synonym used by some famous authority figures for ignorance.

Despite the low bar set by the Minus-Sign Test, I’m afraid almost every popular article about quantum mechanics ever written has failed it.
This is a stupid test, because there are no "positive and negative amplitudes". The wave function for an electron is a complex-coefficient spinor function. Only in his imaginary world of hypothetical qubits are things so simple.

I do emphasize the interference. There is always a diagram of it on the side of this blog.

But the interference is a lousy explanation for "why the entangled particles can’t have just agreed in advance that one would spin up and the other would spin down." The particles cannot agree in advance because spin (up or down) is not a discrete property of particles. We only get a discrete up or down when we do a measurement, like in a Stern-Gerlach detector. The spin has wave-like properties, like everything else in quantum mechanics, whether or not any interference is taking place.
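
To make the point about amplitudes concrete, here is a minimal sketch (mine, not Aaronson's): interference comes from adding complex amplitudes before squaring, and the relative phase can be anything on the unit circle, not just a minus sign.

```python
import numpy as np

# Two paths to the same detector. Classically the probabilities add;
# in quantum mechanics the complex amplitudes add, then get squared.
for phase in (0.0, np.pi / 2, np.pi):
    a1 = 1 / np.sqrt(2)                   # amplitude for path 1
    a2 = np.exp(1j * phase) / np.sqrt(2)  # path 2, with a relative phase
    classical = abs(a1)**2 + abs(a2)**2   # always 1.0: no interference
    quantum = abs(a1 + a2)**2             # 2.0, 1.0, or 0.0 (relative rates)
    print(f"phase {phase:.2f}: classical {classical:.1f}, quantum {quantum:.1f}")
```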

Aaronson is writing a new book on this subject, because he complains that no one understood these points in his previous books.

Sunday, October 12, 2014

Probability and the arrow of time

The subject of probability is endlessly confusing to most people, and that confusion gets amplified in quantum mechanics. A lot of the mystery of quantum mechanics is rooted in simple misunderstandings about logical reasoning that have little to do with quantum mechanics. I have been arguing this here for a couple of years.

The theory of quantum mechanics predicts probabilities. This should not be troubling, because all other scientific theories do the same thing.

Belief in the many-worlds interpretation (MWI) is based on a belief in time reversal symmetry, followed by a rejection of probability. The MWI advocates don't really believe in probability, because they believe that if there are many possibilities of events occurring in the past, then there must similarly be all those possibilities in the future. Their theory is then untestable, because they refuse to make probabilistic predictions, and say that all possibilities occur in some worlds and it is meaningless to say that some worlds are more likely than others.

You could take any theory with probabilistic predictions, and hypothesize that each probability is really a branch into a parallel universe. Thus you could believe in MWI without even mentioning quantum mechanics. Usually the MWI arguments are couched in quantum technicalities, but they are mainly just a misunderstanding of probability.
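
The point needs no quantum mechanics at all, as a toy sketch shows (my illustration): any stochastic model can be recast as a deterministic tree of branches, and then probability has to be smuggled back in by weighting the branches.

```python
from itertools import product

# Three fair coin flips, recast as a deterministic tree of eight "worlds".
worlds = list(product("HT", repeat=3))
print(len(worlds), "worlds, e.g.", worlds[0])

# In the bare branching picture, "exactly 2 heads is more likely than 3 heads"
# is not even expressible; you recover it only by weighting (here, counting)
# the branches, i.e., by putting the probabilities back in.
print(sum(w.count("H") == 2 for w in worlds) / len(worlds))  # 0.375
```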

Lubos Motl writes:
The very meaning of "probability" violates the time-reversal symmetry

An exchange with the reader reminded me that I wanted to dedicate a special blog post to one trivial point which is summarized by the title. This trivial issue is apparently completely misunderstood by many laymen as well as some low-quality scientists such as Sean Carroll.

This misunderstanding prevents them from understanding both quantum mechanics and classical statistical physics, especially its explanation for the second law of thermodynamics (or the arrow of time). ...

This logical arrow of time is a simple, elementary, and irreducible part of our existence within Nature. But it has consequences. If you think about the comments above and recognize that all these things are as clear as you can get, you should also understand that there is no "measurement problem" in quantum mechanics – the existence of the "a measurement" is tautologically an inseparable part of any statement about the corresponding "probabilities".
He is right about this, and the MWI folks are hung up on this simple point.

Confusion about probability leads to other faulty conclusions. For example, some say that quantum mechanics proves that nature is inherently probabilistic, and hence there must be some underlying reality of all the possibilities that can be represented by hidden variables. No, quantum experiments have proven that wrong.

Physicists should take a math class in probability before they start trying to rewrite the foundations of quantum mechanics.

Monday, October 6, 2014

No Nobels for quantum foundations

Speculation about this year's Nobel physics prize is starting, and Jennifer Ouellette writes:
My pick, just to mix things up a bit, was something a bit more esoteric and theoretical: Wojciech (pronounced Voy-check) Zurek, for his many contributions to the fundamentals of quantum mechanics, most notably decoherence and the no-cloning theorem, which forbids the creation of identical copies of an unknown quantum state, with critical implications for quantum computing, quantum teleportation, and quantum information in general. Decoherence is kind of related to the Schroedinger’s Cat thought experiment, specifically the observation question — the notion that once we “look” inside the box, the wave function collapses into a single reality (the cat is either dead or alive).

Einstein once asked Niels Bohr whether Bohr truly believed that the moon is not really there when we don’t happen to be looking at it. Decoherence answers Einstein’s question. It’s like a built-in fail-safe mechanism, ensuring that a large object made of billions of subatomic particles rarely behaves in a truly coherent fashion. It is extremely difficult to get more than a few atoms to vibrate together, perfectly synchronized, because of interference. In the real world, objects interact constantly with the environment, and decoherence occurs instantaneously. So Schrödinger’s macroscopic-yet-quantum cat is an impossible beast. The slightest interaction with the outside world causes the wave function of super-imposed states to gradually fall out of synch, or decohere. The outside interference constitutes an act of measurement. The moon does not exist in isolation. It interacts with everything around it, including the Sun. The rain of photons from the sun’s rays onto the moon’s surface constitutes a “measurement”: the photons interact with the particles that make up the moon, collapsing their respective wave functions and causing decoherence. This gets rid of any super-imposed states, with no need for conscious human interaction. It’s ingenious, really. Definitely Nobel-worthy.
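
Her verbal picture boils down to a toy calculation (my own sketch, not Zurek's formalism): random phases picked up from the environment average away the off-diagonal terms of the density matrix, leaving what looks like a classical mixture.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(phase):
    """Density matrix of the superposition (|0> + e^{i*phase}|1>)/sqrt(2)."""
    psi = np.array([1.0, np.exp(1j * phase)]) / np.sqrt(2)
    return np.outer(psi, psi.conj())

print(np.round(rho(0.0), 2))   # off-diagonals 0.5: a coherent superposition

# The environment imprints random phases; averaging over them (tracing out
# the environment) wipes out the off-diagonal coherences.
avg = np.mean([rho(ph) for ph in rng.uniform(0, 2 * np.pi, 100_000)], axis=0)
print(np.round(avg, 2))        # off-diagonals ~0: looks like a classical coin flip
```
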
Matt Leifer says:
It means you are a closet Everettian in denial. Zurek is explicit that his work is a development of the Everettian program, although he does not necessarily endorse many-worlds. Still, the only way I can really make sense of Zurek’s work is within a many-worlds context. Zurek would likely disagree, but I think he is working with an unusual definition of “objective”.
I am sure Zurek has done some good work, but I cannot see a Nobel prize for his Quantum Darwinism.

Lumo writes:
You know, decoherence is a genuine process. The insights about it are right and probably needed to explain some aspects of quantum mechanics to those who believe that there is something wrong about quantum mechanics because of what they call the "measurement problem".

But I have increasingly felt that the fans of the word "decoherence" and related words, including Zurek himself, have contributed to the proliferation of quantum flapdoodle – redundant talk about non-existent problems. ...

Decoherence has been around for over 30 years but the obsession with the idea that "something needs to be clarified" hasn't gone away – even though the literature has grown more muddy than it was 30+ years ago.

Even Zurek himself would coin not just one but at least four different words related to decoherence:

pointer states
decoherence
einselection
quantum Darwinism

In reality, the actual idea behind all these things is always the same ...

In other words, "shut up and calculate".
Ouellette also suggests a prize for quantum teleportation. Others have suggested prizes for work testing Bell's theorem or quantum cryptography or quantum computing.

I really don't see how there can be a Nobel prize for foundational work in quantum mechanics when there is no consensus on an interpretation. If anything, the whole field is regressing with more physicists believing in wackier interpretations like many-worlds.

Friday, October 3, 2014

Using Bayes to update odds

The NY Times science section rarely publishes an article about math, and when they do, it is so confused that I wonder if anyone learns anything. Here is the latest, its most emailed story for the day:
Now Bayesian statistics are rippling through everything from physics to cancer research, ecology to psychology. Enthusiasts say they are allowing scientists to solve problems that would have been considered impossible just 20 years ago. And lately, they have been thrust into an intense debate over the reliability of research results.

When people think of statistics, they may imagine lists of numbers — batting averages or life-insurance tables. But the current debate is about how scientists turn data into knowledge, evidence and predictions. Concern has been growing in recent years that some fields are not doing a very good job at this sort of inference. In 2012, for example, a team at the biotech company Amgen announced that they’d analyzed 53 cancer studies and found it could not replicate 47 of them.

Similar follow-up analyses have cast doubt on so many findings in fields such as neuroscience and social science that researchers talk about a “replication crisis”
It is a problem that major cancer studies cannot be replicated, but the suggestion here is that Bayesianism solves the problem by avoiding lists of numbers and using more computing power. That is nonsense.

It quotes Andrew Gelman as a prominent Bayesian guru, but he disavows much of the article, including nearly all the opinions attributed to him. Still, he says that it is a good article, because he expects dumb journalists to screw up the content anyway.

A big example is the Monty Hall problem:
A famously counterintuitive puzzle that lends itself to a Bayesian approach is the Monty Hall problem, in which Mr. Hall, longtime host of the game show “Let’s Make a Deal,” hides a car behind one of three doors and a goat behind each of the other two. The contestant picks Door No. 1, but before opening it, Mr. Hall opens Door No. 2 to reveal a goat. Should the contestant stick with No. 1 or switch to No. 3, or does it matter?

A Bayesian calculation would start with one-third odds that any given door hides the car, then update that knowledge with the new data: Door No. 2 had a goat. The odds that the contestant guessed right — that the car is behind No. 1 — remain one in three. Thus, the odds that she guessed wrong are two in three. And if she guessed wrong, the car must be behind Door No. 3. So she should indeed switch.
This is too confusing to be useful. There are some hidden assumptions about how Mr. Hall opens the door, or the problem cannot be solved. It says "odds" instead of "probability". Furthermore, it doesn't really require Bayesianism because other approaches give the same answer.
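
For comparison, the answer drops out of a blunt frequentist simulation with no Bayesian machinery at all (my sketch; note that the code must make the hidden assumption explicit, namely that Hall always opens an unpicked goat door):

```python
import random

random.seed(1)
trials, stay_wins, switch_wins = 100_000, 0, 0
for _ in range(trials):
    car = random.randrange(3)
    pick = random.randrange(3)
    # The hidden assumption: Hall always opens a door that hides a goat
    # and is not the contestant's pick.
    opened = random.choice([d for d in range(3) if d != pick and d != car])
    other = next(d for d in range(3) if d != pick and d != opened)
    stay_wins += (pick == car)
    switch_wins += (other == car)

print(stay_wins / trials, switch_wins / trials)  # about 1/3 and 2/3
```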

The article is really about Bayesian probability, an increasingly popular interpretation of probability. The interpretation has a cult-like following, where followers insist that it is the most scientific way to look at data.

Bayesianism has its merits, but it is just an interpretation, and is not provably better than others. There are some subtle points here, as some of the confusion over quantum mechanics is rooted in confusion over how to interpret probability. The NY Times should have asked Gelman to review the article for accuracy before publication, but the paper considers that unethical.

The same newspaper reports:
All five research groups came to the conclusion that last year’s heat waves could not have been as severe without the long-term climatic warming caused by human emissions.

“When we look at the heat across the whole of Australia and the whole 12 months of 2013, we can say that this was virtually impossible without climate change,” said David Karoly, a climate scientist at the University of Melbourne who led some of the research.
It seems obvious that if you take a computer model of the Earth's climate, and subtract out some human-attributed warming, then the model will show the Earth not as warm. Or that if you observe some extreme weather, you can attribute it to climate change.

What's missing here is some showing of a net worsening of the climate. Maybe some extreme cold waves were avoided. And what's also missing is a showing that the human influence is significant. Maybe humans made the heat waves worse, but only slightly, and not enuf for anyone to care.

The research papers probably address these points. I get the impression that the NY Times staff wants to convince everyone of various environmental causes associated to global warming, but I don't think these explanations help.

Wednesday, October 1, 2014

Quantum crypto hype on PBS Nova

I just watched PBS NOVA Rise of the Hackers:
A new global geek squad is harnessing cryptography to stay a step ahead of cybercriminals.
At the heart of almost everything we do on the web is a special kind of number, a prime number. ... But what makes them so special to codes is when you combine two of them. ... If ... you've got to figure out the two original primes, the only way is trial and error. ...

This system of public and private keys is known as the RSA algorithm. That beautiful piece of mathematics has fundamentally changed the world around us.
I do not think that these explanations are very helpful. It is true that most SSL web servers use RSA, but Diffie-Hellman came first and is arguably superior for most purposes. That was the real breakthrough. And trial and error is not the only way to break RSA, as other methods are asymptotically much faster.
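
For readers who want to see what the show is hand-waving at, here is RSA as a toy round trip (the standard textbook numbers; real keys use primes hundreds of digits long, and breaking them uses sieve methods, not trial division):

```python
# Toy RSA with absurdly small primes, for illustration only.
p, q = 61, 53
n = p * q                  # 3233: the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # 2753: private exponent (Python 3.8+)

m = 65                     # a message, encoded as a number < n
c = pow(m, e, n)           # encrypt with the public key: 2790
print(pow(c, d, n))        # decrypt with the private key: 65
```
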
It has made the hunt for very very large prime numbers one of the most important quests in the field of mathematics. And here is the current largest, all 5000 pages of it.
This is nonsense. These numbers have almost no importance to either mathematics or cryptography. They are just a silly curiosity.
The mathematics of the world of the very small means that things can be in many places at the same time. ... They may be a way to crack the world's most powerful codes.
Not really. Saying that particles can be in more than one place at the same time is just an interpretation of the wave function, and has never been observed. So you can choose to believe in it or not, and no one can disprove you.

They go on to explain qubits as being 0 and 1 at the same time, so that a quantum computer can calculate both simultaneously. Scott Aaronson argues on his blog that this is a bad explanation, because it leads you to believe that quantum computers are more powerful than they really are.
It blows the doors off of RSA encryption. Right. All we need is more and more qubits, a larger quantum computer. Really all that's left to do is to scale up this particular architecture.
This is just nonsense. They have not made a scalable qubit, and there is no known way to scale up what they have. They have never achieved that quantum speedup.

Seth Lloyd says:
Quantum computers are particularly fine for ... or for simulating what happens when a black hole collapses, or for that matter, a recent experiment that we did to actually implement a version of time travel.
He is dreaming here.
But quantum mechanics also provides methods of communicating securely in a way that is guaranteed by the laws of physics.
More nonsense from Lloyd.
Quantum cryptography is already used by folks who want extreme security. By banks and by agencies whose job is to protect information. And nowadays there are a number of commercial companies who actually build quantum cryptographic systems. For a fee, you too can communicate in complete and utter privacy guaranteed by the laws of quantum mechanics.
Before you pay that fee, you should realize: Every one of those systems has been broken. No bank or agency uses them for anything important. They cannot be used with routers or any other internet equipment. They cannot authenticate who you are communicating with. Even that supposed quantum mechanics guarantee is just a probabilistic statement that is much weaker than you can get with RSA or other methods.

Lloyd admits that these systems are susceptible to an impostor at the other end, but then implies that all crypto systems have this same problem. But that is not true. The main technical advance of RSA was actually not encryption, but authentication. When you log into Ebay or Amazon, it verifies that you are really connecting to Ebay or Amazon. In the worst case, you could be connecting to someone who has hijacked the DNS servers and stolen Ebay's RSA secret key, or who built a quantum computer to factor the RSA public key. Neither has ever happened. Quantum crypto offers no such assurances, and you would be on your own to verify that a cable connects directly from your computer to the Ebay server. It can detect certain kinds of tampering, but it cannot detect someone hijacking the whole line.

I don't know how respectable professors like Lloyd can shamelessly over-hype their field like this. They are charlatans. Even Aaronson is not this bad.

A better popular explanation can be found in this Wired article on the double slit experiment:
To summarize, we’ve arrived at a pattern of fringes that’s built up one particle at a time. But when you try to work out exactly how those particles got there, you find that they didn’t take the left route, they didn’t take the right route, they didn’t take both routes, and they didn’t take neither routes. As MIT professor Allan Adams puts it, that pretty much exhausts all the logical possibilities!

An electron is not like a wave of water, because unlike a wave, it hits a screen at a single location. An electron is not like a baseball, because when you throw in a bunch of them through a double slit, they interfere and create patterns of fringes. There is no satisfying analogy that anyone can give you for what an electron is.
Some people prefer to think of the electron as being two places at once, or being a wave, or even as being immune to the usual laws of logic. If one of those makes you feel better, then go ahead. But there is no perfect classical analogy of the sort sought by Einstein and Bell. Those who use these analogies to promote quantum crypto or computing are just charlatans.

Update: I mentioned above a point that Aaronson likes to make on his blog, and he just made it again:
Several readers have expressed disapproval and befuddlement over the proposed title of my next book, “Speaking Truth to Parallelism.” ...

What it means, of course, is fighting a certain naïve, long-ago-debunked view of quantum computers—namely, that they would achieve exponential speedups by simply “trying every possible answer in parallel”—that’s become so entrenched in the minds of many journalists, laypeople, and even scientists from other fields that it feels like nothing you say can possibly dislodge it. The words out of your mouth will literally be ignored, misheard, or even contorted to the opposite of what they mean, if that’s what it takes to preserve the listener’s misconception about quantum computers being able to solve NP-hard optimization problems by sheer magic. ...

Coincidentally, this week I also got an email from a longtime reader of this blog, saying that he read and loved Quantum Computing Since Democritus, and wanted my feedback on a popular article he’d written about quantum computing.  What was the gist of the article?  You guessed it: “quantum computing = generic exponential speedups for optimization, machine learning, and Big Data problems, by trying all the possible answers at once.”

These people’s enthusiasm for quantum computing tends to be so genuine, so sincere, that I find myself unable to blame them—even when they’ve done the equivalent of going up to Richard Dawkins and thanking him for having taught them that evolution works for the good of the entire species, just as its wise Designer intended.  I do blame the media and other careless or unscrupulous parties for misleading people about quantum computing, but most of all I blame myself, for not making my explanations clear enough.  In the end, then, meeting the “NP space” folks only makes me want to redouble my efforts to Speak Truth to Parallelism: eventually, I feel, the nerd world will get this point.
The reasoning is this: (1) quantum mechanics proves that an electron can be in two places at once, as a cat can be in a superposition of dead and alive; (2) a qubit is a computer 0-1 bit that stores a superposition of the two possible states; and (3) a quantum computer processes qubits like a regular computer processes bits, thereby computing all possible data values at once.

I am glad to see that Aaronson is writing a whole book on why this reasoning is wrong, but I fear that he will not get to the heart of the matter, because of his faith in quantum computing. The root of the problem is that it is just not true that electrons can be in two places at once, or have negative probability.
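
The fallacy in step (3) takes only a few lines to exhibit (my sketch): a superposition assigns an amplitude to every possible answer, but a measurement returns just one of them, sampled at random.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 3
state = np.ones(2 ** n) / np.sqrt(2 ** n)  # equal superposition of all 8 bit strings

# The state carries an amplitude for every basis state, but measuring yields
# exactly ONE outcome, with probability |amplitude|**2. A useful algorithm must
# arrange interference so wrong answers cancel before the measurement; it
# cannot simply read out all the branches.
probs = np.abs(state) ** 2
outcome = int(rng.choice(2 ** n, p=probs))
print(f"measured basis state: {outcome:03b}")
```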

    His blog readers do not like his new book title, and one suggests: “Everything You Believe About Quantum Computing is a Lie”. I can agree with that title. So he will not use it. He wants to hang on to the lie that quantum computers are almost certainly possible in the future. (I think we agree that they are not possible with today's technology.)

    Tuesday, September 30, 2014

    The enemies of good science

    A lot of science bloggers call themselves "skeptics", and are always on the warpath against religion (like Christianity) and pseudoscience (like homeopathy). I am more concerned with bad thinking that corrupts otherwise-intelligent scholars. Here are my main targets.

    Paradigm shifters - They subscribe to T.S. Kuhn's theory that science is all a big popularity contest, with new theories winning out without any rational or measurable advantages, and never making progress towards truth.

    Proof deniers - They fail to appreciate that a mathematical proof can give certain knowledge.

    Goedel fools - They argue that math lacks solid foundations because of some technical logic theorems.

    Einstein idolizers - They treat Einstein as a god, and learn all the wrong lessons from the history of relativity and the Bohr debates.

    Thought experimenters - They endlessly speculate about a black hole interior, or universes before the big bang, or quantum gravity, or other questions outside the domain of observable science.

    Falsifiers - They deny that science teaches us anything, except that older theories have been falsified.

    Dreamers of imaginary worlds - They like to invent fantasy worlds, and put them out under names like the multiverse, many worlds interpretation, and strings in higher dimensions.

    Hidden variable searchers - They are always telling us that quantum mechanics is incompatible with realism because it fails to identify the hidden variables that will unlock all the mysteries of the universe.

    Publicity panderers - They will say whatever gets media attention, such as fantastic claims for quantum communication and computers.

    Political correctness enforcers - On an assortment of topics (global warming, evolution, human biodiversity, etc), they are more interested in silencing their enemies than promoting scientific truth.

    Sunday, September 28, 2014

    Tyson's excuse is absence of evidence

    I mentioned that Tyson used a bogus quote to bash Bush, and he replied:
    I have explicit memory of those words being spoken by the President. I reacted on the spot, making note for possible later reference in my public discourse. Odd that nobody seems to be able to find the quote anywhere -- surely every word publicly uttered by a President gets logged.

    FYI: There are two kinds of failures of memory. One is remembering that which has never happened and the other is forgetting that which did. In my case, from life experience, I’m vastly more likely to forget an incident than to remember an incident that never happened. So I assure you, the quote is there somewhere. When you find it, tell me. Then I can offer it to others who have taken as much time as you to explore these things.

    One of our mantras in science is that the absence of evidence is not the same as evidence of absence.
    That is a stupid mantra. It allows people to promote all sorts of unscientific nonsense.

    Someone could say, "Sure, there is no evidence for psychokinesis, but that does not mean there is evidence against psychokinesis." Or substitute astrology, parallel universes, higher dimensions, quantum computing, or whatever your favorite evidence-free belief happens to be.

    In this case, G.W. Bush's post-9/11 public statements were all logged. The sillier ones ended up in a Michael Moore movie. If the quote is not on the record, then Bush did not say it, at least not in public where Tyson could have heard it.
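    The point can even be put in Bayesian terms. With hypothetical numbers (mine, purely for illustration): if a publicly uttered presidential quote would almost certainly show up in the logs, then failing to find it drives the probability that it was ever said toward zero:

        # Hypothetical numbers, just to illustrate the Bayesian update.
        p_said = 0.5              # prior: Bush publicly said it
        p_found_if_said = 0.99    # public presidential statements get logged
        p_found_if_not = 0.0      # a quote never uttered cannot be found

        p_missing = (1 - p_found_if_said) * p_said + (1 - p_found_if_not) * (1 - p_said)
        posterior = (1 - p_found_if_said) * p_said / p_missing
        print(posterior)          # ~0.0099: absence of evidence is evidence of absence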

    A lot of people believe that the 9/11 WTC hit was just a battle in a multi-century war between Christendom and Islam. Bush is a Christian, so it is fair to assume that he believes Christianity to be superior to Islam. He may have even used the word "crusade" by mistake. But he certainly never publicly framed this as a religious war, with the Christian God on our side.

    Not only is the quote wrong, but the larger point about the President announcing a religious war is wrong also. Bush's statements were similar to what the President said last week:
    At the same time, we have reaffirmed again and again that the United States is not and never will be at war with Islam. Islam teaches peace. ... So we reject any suggestion of a clash of civilizations. Belief in permanent religious war is the misguided refuge of extremists ...
    The other misquotes are in support of stupid points also. I know of areas where the community would be rightly upset if half their schools were below average, since "average" would mean the statewide average on standardized tests, not the district's own median. There are also issues where my personal views have changed 360 degrees. There is nothing wrong with a congressman saying, "I have changed my views 360 degrees on that issue." In some cases, his views might change 180 degrees every time a lobbyist walks into his office. Two such changes, and his views have changed 360 degrees.

    Electrons have to be rotated 720 degrees to get back to their original state, so a clever congressman might say that his views have changed 720 degrees.
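    The 720-degree fact is easy to check with the spin-1/2 rotation operator R(theta) = exp(-i * theta * sigma_z / 2); a quick sketch:

        import numpy as np

        # Rotation of a spin-1/2 state about the z-axis by angle theta,
        # which is diagonal in the z-basis.
        def rotate_z(theta):
            return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

        spin_up = np.array([1.0, 0.0], dtype=complex)

        print(rotate_z(2 * np.pi) @ spin_up)  # [-1, 0]: 360 degrees flips the sign
        print(rotate_z(4 * np.pi) @ spin_up)  # [ 1, 0]: 720 degrees restores the state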

    I am all in favor of making fun of politicians for saying something stupid, but these things are not stupid without additional context. There are plenty of better examples. Just look at the above Obama speech to the UN, where he compares Islamic terrorists to Ferguson policemen. Or look at the dumb stuff VP Biden says all the time.

    Tyson is still refusing to admit that he made up the Bush quote, that Bush's actual statements were more nearly the opposite, that Tyson was trying to score cheap political points, that those points are entirely false, and that Tyson made up the other quotes as well. I was not looking for an apology. But now I know that he has a disregard for the facts when he tells personal anecdotes.

    Update: Tyson now says:
    When eager scrutinizers looked for the quote they could not find it, and promptly accused me of fabricating a Presidential sentence. Lawyers are good at this. They find something that you get wrong, and use it to cast doubt on everything else you say. ...

    My bad. And I here publicly apologize to the President for casting his quote in the context of contrasting religions rather than as a poetic reference to the lost souls of Columbia. ...

    And I will still mention the President’s quote. But instead, I will be the one contrasting what actually happened in the world with what the Bible says: The Arabs named the stars, not Yahweh.
    So he will be disproving the Bible instead of Bush? There must be a better way to credit Arab astronomy, if it really matters that some stars have Arabic names.

    Update: Sean Davis concludes:
    After all the strum and drang, Tyson still doesn’t seem to grasp the main issue here: this wasn’t a misquote. It was a fabrication that deliberately created the exact opposite impression of how reality actually transpired. It was the sort of thing a dishonest politician does, not the sort of behavior you’d expect from a scientist who’s allegedly devoted to studying reality.
    That's right. I would not be piling on if Tyson had just corrected the (mis-)quote and skipped the argument about how the misquote was okay.

    Friday, September 26, 2014

    Ed Witten still believes in string theory

    John Horgan has interviewed the smartest living physicist:
    At a 1990 conference on cosmology, I asked attendees, who included folks like Stephen Hawking, Michael Turner, James Peebles, Alan Guth and Andrei Linde, to nominate the smartest living physicist. Edward Witten got the most votes (with Steven Weinberg the runner-up). Some considered Witten to be in the same league as Einstein and Newton. Witten was and is famous for his work on string theory, which unifies quantum mechanics and relativity and holds that all of nature’s forces—including gravity–stem from infinitesimal particles wriggling in a hyperspace consisting of many extra dimensions.

    Even then, string theory — which some enthusiasts (not including Witten) called a “theory of everything” – was extremely controversial, because there seemed to be no way to confirm experimentally the existence of strings or the extra dimensions they supposedly inhabit.
    Witten has somehow convinced mathematicians and physicists that he is a great genius, even tho he does not do straight math or straight physics. Some of his math ideas have been turned into legitimate proofs by others. I am not sure whether any of his physics ideas have panned out.
    Horgan: Do you see any other rivals for a unified theory of physics?

    Witten: There are not any interesting competing suggestions. One reason, as remarked in “Unravelling,” is that interesting competing ideas (twistor theory, noncommutative geometry, …) tend to be absorbed as part of a larger picture in string theory. The competing interesting ideas have been very fragmentary and have tended to gain power when absorbed in string theory. ...

    Witten: Personally, I hope the landscape interpretation of the universe would turn out to be wrong, as I would like to be able to eventually calculate from first principles the ratio of the masses of the electron and muon (among other things). However, the universe wasn’t made for our convenience. Plenty of leading physicists — prominent examples being Steve Weinberg and Martin Rees – have taken the acceleration of the cosmic expansion seriously as a hint that a landscape interpretation of the universe may be correct.
    This is the opinion of a true believer. Whatever he sees, he finds a way to interpret it to match his beliefs from 30 years ago.
    Horgan: Do you agree with Sean Carroll that falsifiability is overrated as a criterion for distinguishing science from pseudo-science?

    Witten: Scientists aim to get as reliable and precise an understanding of nature as we can. The gold standard is a precise prediction that can be tested in a precise way in a laboratory experiment. Experiments that disprove theories are an important part of the scientific process.

    With that said, it is a little too narrow to claim that science consists of trying to falsify theories because a lot of science consists of trying to discover things. (Chemists who attempt a new synthesis could say they are trying to falsify the hypothesis that this new synthesis won’t work. But that isn’t what they usually say. People who search for life on Mars could say they are trying to falsify the hypothesis that there is no life on Mars. Again, people don’t usually talk that way.)
    Witten attacks a straw man, as nobody ever said that science consists only of trying to falsify theories. As Horgan said, it is a criterion for distinguishing pseudoscience. Karl Popper argued that Sigmund Freud would not accept anything as falsifying his theory of dream interpretation, and hence it is unscientific (and pseudoscience). Popper was right, and falsifiability is a useful criterion.

    Carroll and Witten don't like it because they promote ideas that are no more testable than Freud's dream interpretations. Like the multiverse and string theory.

    Woit points out that Witten responded to Horgan in a 1996 WSJ article:
    There is a high probability that supersymmetry, if it plays the role physicists suspect, will be confirmed in the next decade. The existing accelerators that have a chance of doing so are the proton collider at the Department of Energy’s Fermi Lab in Batavia, Ill., and the electron collider at the European Center for Nuclear Research (CERN) in Geneva.
    Speaking of scientists being wrong, Tyson is apparently still refusing to admit that he has been making up quotes in speeches, and there is a Wikipedia edit war about it. This is pathetic. The Bush quote is almost directly opposite what Bush said. Others have been disgraced for inventing quotes. I try to verify quotes on this blog. Occasionally I'll use a quote that I cannot verify, but then I will say so.

    Wednesday, September 24, 2014

    Aaronson writing quantum computing book

    MIT complexity theorist Scott Aaronson announces:
    A few months ago, I signed a contract with MIT Press to publish a new book: an edited anthology of selected posts from this blog, along with all-new updates and commentary.  The book’s tentative title (open to better suggestions) is Speaking Truth to Parallelism: Dispatches from the Frontier of Quantum Computing Theory.
    His book will surely include his post with the most hits:
    For better or worse, I’m now offering a US$100,000 award for a demonstration, convincing to me, that scalable quantum computing is impossible in the physical world. This award has no time limit other than my death, and is entirely at my discretion (though if you want to convince me, a good approach would be to convince most of the physics community first).
    He explains that he was driven to make this offer by skepticism from me and others about the possibility of quantum computers. He has been active in throwing cold water on the over-hyped claims about quantum computers, but he is also stung by criticisms that he is devoting his life to analyzing something that may not even be physically possible. He has confessed his envy of scientists in other fields, who can point to the intrinsic worth of their work, while quantum computing theorists have to rely on bogus claims about practical applications.

    If I collect my quantum mechanics posts for a book, I'll be sure to mention his offer.

    Among physicists, the common views on quantum computing, in order of decreasing popularity, are: (1) quantum computers have already been built, and they will eventually have enough qubits to be useful; (2) scalable quantum computing has not been demonstrated, but it is a logical consequence of quantum mechanics, and advanced engineering should eventually make it possible; and (3) super-Turing computing is like perpetual motion, and will probably never be achieved.

    The popular press would lead you to believe opinion (1). Aaronson stands for opinion (2), and I agree with him that (1) is wrong. I believe in opinion (3), for reasons explained here. I could be proved wrong, of course.

    In addition to those reasons, I have some philosophical differences with him that contribute to our divergent views.

    I subscribe to an epistemic, rather than ontic, interpretation of quantum mechanics. That is, I accept the Copenhagen interpretation that was promoted by Bohr and generally accepted since the 1930s, and what Mermin now calls QBism. Aaronson paints a picture of our universe as weirdly intermediate between local and nonlocal. The psi-ontic physicists are the ones who are forever saying that quantum mechanics does not make sense, and that philosophical principles require an unobservable multiverse.

    I subscribe to logical positivism, so I am very skeptical about what cannot be demonstrated. My preference is for a more positivist interpretation than even what Bohr proposed.

    My slogan is Natura non facit saltus. Leibniz used this phrase to attack the "occult qualities" of an action-at-a-distance theory.

    Here are Lumo and Gell-Mann sensibly dismissing many-worlds and nonlocality:
    Gell-Mann spends several minutes by arguing that the feature of Everett's ideology that there are "many worlds that are equally real" is operationally meaningless. The comment may only mean that the theory treats the possible alternative histories on equal footing, except for their generally different probabilities. But only one of those actually takes place in the "real", experience-based sense of the word "actually". ...

    However, that changes totally after 11:50 when Gell-Mann starts to talk about the "foolishness" often associated with the entanglement ("Einstein-Podolsky-Rosen-Bohm effect", using his words). He treats this issue at some length in his book; I hope he meant The Quark and the Jaguar.

    OK, where did the "foolishness" come from? Gell-Mann says that the bulk of John Bell's work was right but he introduced words that were prejudicial such as "nonlocal". People often say that there is something nonlocal about the EPR phenomena but the only correct similar statement that they could mean, Gell-Mann emphasizes (and I often do, too) is that a classical interpretation of what is happening would require nonlocality (or negative probabilities). But the world is not classical, and no nonlocality is needed because the world is quantum mechanical. As far as Gell-Mann can tell, it's like giving a bad name to a dog and sticking with it.
    Many of the quantum computing enthusiasts subscribe to many-worlds, as some of them argue that the mysterious speedup is going to come from computation in parallel universes. Guru David Deutsch says that, and Brian Cox just said something similar on the BBC. Aaronson does not go that far, but he does stress that the key to understanding quantum mechanics is negative probability. Gell-Mann has a much more sensible view.
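    Gell-Mann's parenthetical about nonlocality or negative probabilities is just the Bell/CHSH arithmetic. For the singlet state the correlation at analyzer angles a and b is E(a,b) = -cos(a-b), and the standard settings give 2*sqrt(2), beating the bound of 2 that any local classical model must obey. A minimal check:

        import numpy as np

        # Singlet-state spin correlation for analyzer angles a and b.
        def E(a, b):
            return -np.cos(a - b)

        a1, a2 = 0.0, np.pi / 2          # Alice's two settings
        b1, b2 = np.pi / 4, -np.pi / 4   # Bob's two settings

        S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
        print(abs(S))                    # 2*sqrt(2) ~ 2.828
        # Any local hidden-variable model with ordinary (nonnegative)
        # probabilities satisfies |S| <= 2.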

    Monday, September 22, 2014

    BICEP2 just saw cosmic dust

    Here is the current SciAm cover story:
    How Big Bang Gravitational Waves Could Revolutionize Physics
    If the recent discovery of gravitational waves emanating from the early universe holds up under scrutiny, it will illuminate a connection between gravity and quantum mechanics and perhaps, in the process, verify the existence of other universes
    By Lawrence M. Krauss

    In March a collaboration of scientists operating a microwave telescope at the South Pole made an announcement that stunned the scientific world. They claimed to have observed a signal emanating from almost the beginning of time. The putative signal came embedded in radiation left over from the action of gravitational waves that originated in the very early universe — just a billionth of a billionth of a billionth of a billionth of a second after the big bang.

    The observation, if confirmed, would be one of the most important in decades. It would allow us to test ideas about how the universe came to be that hitherto scientists have only been able to speculate about. It would help us connect our best theories of the subatomic (quantum) world with our best theories of the massive cosmos — those based on Einstein's general theory of relativity. And it might even provide compelling (though indirect) evidence of the existence of other universes.
    No, the observation has not been confirmed, and it appears that all BICEP2 saw was some polarization caused by cosmic dust.

    Even if it had been confirmed, I don't see how it could have been evidence for either quantum gravity or the existence of other universes. It is widely believed that the big bang was accelerated by something called inflation, but we do not know the source, magnitude, or duration of the inflation force, or even whether it is reasonable to call it a force. So if we see echoes of the big bang, we are probably seeing inflation waves, not quantum gravity waves. And we are certainly not seeing other universes.

    A clue to the over-hype is that the title says "revolutionize physics" and the first name in the article is Einstein. No, this would not have been some stupid paradigm shift. Einstein did not even believe in the big bang, gravity waves, or quantum mechanics, and probably would not have believed in the multiverse either.

    SciAm also has a couple of letters about free will. A philosopher writes:
    In “The World without Free Will,” Azim F. Shariff and Kathleen D. Vohs assert that a survey revealed that “the more people doubt free will, the less they favor ‘retributive’ punishment” and indicate that the notion of free will is necessary to social order. What constitutes human freedom is a complex matter, fraught with ambiguities that have been debated for millennia. The authors don't clarify the survey's questions. For instance, what if it had asked respondents to rate the relative influence of several factors, such as physical laws, biological impulses, life experiences, the cultural environment, rational deliberation or a sense of self-determination? Wouldn't that have elicited a more nuanced response?
    Yes, more nuanced, as those things cannot be distinguished without a lot of careful definitions. Another writes:
    Shariff and Vohs ask the question “What will our society do if it finds itself without the concept of free will?” But they do little to clarify the issue.
    How much can you say about what people will do if they find out that they do not have free will? If they do not have free will, then they are just robots following their programming. It always seems funny to me when people say that they do not believe in free will, and then try to convince others of various beliefs.

    Update: Lumo is a believer:
    There is a scientific substance. Everyone who is interested in cosmology would love to know whether the imprints of the primordial gravitational waves have been seen. I agree with those who say that this discovery, if true, is the greatest discovery in many years if not decades or a century. I would probably place it above the Higgs boson discovery because unlike the Higgs boson, it wasn't really guaranteed.

    However, we must ask: is the discovery real?

    Of course that I am not 100.00000% sure. But I still think it's significantly more likely than not that the BICEP2 discovery is genuine and the pattern they see is simply not dust. Why? Because it clearly doesn't look like dust.
    And it looks like primordial quantum gravity waves? I don't think so.