Sunday, January 27, 2013

Brain Behind Einstein's Famous Equation

ScienceDaily reports:
Jan. 25, 2013 — A new study reveals the contribution of a little-known Austrian physicist, Friedrich Hasenöhrl, to uncovering a precursor to Einstein's famous equation.


Two American physicists outline the role played by Austrian physicist Friedrich Hasenöhrl in establishing the proportionality between the energy (E) of a quantity of matter and its mass (m) in a cavity filled with radiation. In a paper about to be published in the European Physical Journal H, Stephen Boughn from Haverford College in Pennsylvania and Tony Rothman from Princeton University in New Jersey argue that Hasenöhrl's work, for which he now receives little credit, may have contributed to the famous equation E = mc².

According to science philosopher Thomas Kuhn, scientific progress occurs through paradigm shifts, which depend on the cultural and historical circumstances of groups of scientists. Concurring with this idea, the authors believe the notion that mass and energy should be related did not originate solely with Hasenöhrl. Nor did it suddenly emerge in 1905, when Einstein published his paper, as popular mythology would have it.

Given the lack of recognition for Hasenöhrl's contribution, the authors examined the Austrian physicist's original work on blackbody radiation in a cavity with perfectly reflective walls. This study seeks to identify how the blackbody's mass changes when the cavity moves relative to the observer.

They then explored the reason why the Austrian physicist arrived at an energy/mass correlation with the wrong factor, namely the equation E = (3/8)mc². Hasenöhrl's error, they believe, stems from failing to account for the mass lost by the blackbody while radiating.
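Written side by side (this is just the arithmetic of the factors, not a reconstruction of Hasenöhrl's derivation), the two results assign different effective masses to the same energy:

```latex
% Hasenöhrl's cavity result, solved for the effective mass:
E = \tfrac{3}{8}\, m c^2
  \quad\Longleftrightarrow\quad
  m = \tfrac{8}{3}\,\frac{E}{c^2}
% versus the correct relation:
E = m c^2
  \quad\Longleftrightarrow\quad
  m = \frac{E}{c^2}
```

So Hasenöhrl overestimated the effective mass of the cavity radiation by a factor of 8/3.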

Before Hasenöhrl focused on cavity radiation, other physicists, including French mathematician Henri Poincaré and German physicist Max Abraham, showed the existence of an inertial mass associated with electromagnetic energy. In 1905, Einstein gave the correct relationship between inertial mass and electromagnetic energy, E = mc². Nevertheless, it was not until 1911 that German physicist Max von Laue generalised it to include all forms of energy.
I reported on this paper in Aug. 2011 here and here, with a link to an English translation of Hasenöhrl's 1904 paper.

None of this supports Kuhnian paradigm shifts. Kuhn defined such a shift as an irrational change of viewpoint with no measurable evidence. (Kuhn preferred the terms arational and incommensurable.) The relation between relativistic mass and energy was first predicted by Lorentz in 1899, and experimentally tested in 1902.

Wednesday, January 23, 2013

Poincare's conventionalism

Henri Poincare was the leading mathematical physicist of 1900. Besides discovering relativity, he had a big impact on the philosophy of science. Here are a couple of new papers on him.

Poincare's impact on 20th century philosophy of science:
Poincaré’s conventionalism has thoroughly transformed both the philosophy of science and the philosophy of mathematics. Not only proponents of conventionalism, such as the logical positivists, were influenced by Poincaré, but also outspoken critics of conventionalism, such as Quine and Putnam, were inspired by his daring position. Indeed, during the twentieth century, most philosophers of mathematics and of science engaged in dialogue with conventionalism. As is often the case with such complex clusters of ideas, there is no consensus about the meaning of conventionalism in general, and Poincaré’s original version of it, in particular. Nonetheless, notions such as the under-determination (of theory), empirical equivalence (of incompatible theories), implicit definition, holism and conceptual relativity, all of which can be linked to Poincare's writings (even if not under those very names) have become central to philosophy. This essay explores the impact of some of these notions on twentieth century philosophy of science. In addition to inspiration based on Poincare's actual views, it emphasizes directions based on misreading and unjustified appropriations of Poincaré.
Did Perrin's experiments convert Poincaré to Scientific Realism?
In this paper I argue that Poincaré’s acceptance of the atom does not indicate a shift from instrumentalism to scientific realism. I examine the implications of Poincaré’s acceptance of the existence of the atom for our current understanding of his philosophy of science. Specifically, how can we understand Poincaré’s acceptance of the atom in structural realist terms? I examine his 1912 paper carefully and suggest that it does not entail scientific realism in the sense of acceptance of the fundamental existence of atoms but rather, argues against fundamental entities. I argue that Poincaré’s paper motivates a non-fundamentalist view about the world, and that this is compatible with his structuralism.

Sunday, January 20, 2013

Saturday, January 19, 2013

Explaining quantum observers

Lumo explains quantum mechanics again:
I am writing down this preposterous story because this is exactly the type of thinking that many popular – and, using Sidney Coleman's words, sometimes even not-so-popular – books and articles want you to manipulate you into. GRW and Penrose collapse theories as well as the many-worlds ideology are example models giving special objects the right to "intervene" into Schrödinger's equation, either by discontinuous jumps or collapses or by splitting the world (which is comparably, infinitely ambitious). However, all this reasoning is completely nonsensical. There doesn't exist any systems for which the evolution according to the laws of quantum mechanics such as Schrödinger's equation is replaced by some discontinuous jumps. Quantum mechanics applies to all systems and processes in Nature, regardless of their size, duration, sex, race, and nationality. ...

Needless to say, people are looking for an "objective classical model of reality" that is valid for everyone. But quantum mechanics shows that Nature can't be described in this way. Instead, quantum mechanics tells you that you must understand yourself as an observer who may perceive the values of certain observables and quantum mechanics tells you that observing some values of observables at one moment implies that the probability of observing some combination of other observables at a different moment is something or something else. That's the only thing you may really empirically verify so it's just unphysical to "demand" that science also explains something else (such as an "underlying objective reality").
He is right. As you can see from my slogan, I do not believe in discontinuous jumps. Belief in hidden variables and jumps is misguided.

Sean M. Carroll writes about the poll I posted last week:
I’ll go out on a limb to suggest that the results of this poll should be very embarrassing to physicists. Not, I hasten to add, because Copenhagen came in first, although that’s also a perspective I might want to defend (I think Copenhagen is completely ill-defined, and shouldn’t be the favorite anything of any thoughtful person). The embarrassing thing is that we don’t have agreement.

Think about it — quantum mechanics has been around since the 1920's at least, in a fairly settled form. John von Neumann laid out the mathematical structure in 1932. Subsequently, quantum mechanics has become the most important and best-tested part of modern physics. Without it, nothing makes sense. Every student who gets a degree in physics is supposed to learn QM above all else. There are a variety of experimental probes, all of which confirm the theory to spectacular precision.

And yet — we don’t understand it. Embarrassing. To all of us, as a field (not excepting myself).

I’m sitting in a bistro at the University of Nottingham, where I gave a talk yesterday about quantum mechanics. I put it this way: here in 2013, we don’t really know whether objective “wave function collapse” is part of reality (as the poll above demonstrates).
Lumo rips him for these silly comments, but I think that it is embarrassing that so many physicists like Carroll do not seem to understand what von Neumann elucidated in 1932.

Chad Orzel responds:
He dates this from John von Neumann laying out the mathematical foundations in 1932, which is a little ironic, because about thirty of those eighty years of inaction can be laid at von Neumann’s feet. When he laid out his formulation of quantum mechanics, von Neumann asserted that hidden-variable theories were ruled out mathematically, and his reputation was such that most physicists regarded this as a settled question on that basis. The problem is, he was flat wrong on this point, relying on a mathematical theorem that didn’t actually say what he claimed it did.

The question was eventually re-opened in part by people thinking about it the right way– in particular David Bohm, and then John Bell.
Von Neumann was right about there being no hidden variables. His theorem was perhaps weaker than some people realized, but the search by Bohm and others for hidden variable theories has been a theoretical and experimental failure.

Wednesday, January 16, 2013

What worries a physicist about the quantum

The Edge has its annual question for intellectuals. Lee Smolin writes:
But there is another possibility: that quantum mechanics does not provide an explanation for what happens in individual phenomena because it is incomplete, because it simply leaves out aspects of nature needed for a true description. This is what Einstein believed and it is also what de Broglie and Schroedinger, who made key steps formulating the theory, believed. This is what I believe and my lifelong worry has been how to discover that more complete theory.

A completion of quantum mechanics which allows a full description of individual phenomena is called a hidden variables theory. Several have been invented; one which has been much studied was invented by de Broglie in 1928 and reinvented by David Bohm in the 1950s. This shows its possible, now what we need to do is find the right theory. The best way to do that would be to discover a theory that agreed with all past tests of quantum mechanics but disagreed about the outcomes of experiments with large, complex quantum devices now under development.

We know that such a theory must be radically non-local, in the sense that once two particles interact and separate, their properties are entangled even if they travel far from each other. This implies that information as to the precise outcomes of experiments they may be each subject to has to be able to travel faster than light.

This means that a complete theory of quantum phenomena must contain a theory of space and time. As a result I've long believed that the task of completing quantum mechanics and the challenge of unifying quantum mechanics with spacetime are one and the same problem. I also see the problem of extending our understanding of physics at the cosmological scale to be the same as discovering the world behind quantum mechanics.
This belief in a theory of nonlocal hidden variables is irrational. There is not a shred of theoretical or experimental evidence for it. All attempts at such theories have been miserable failures.

Monday, January 14, 2013

Copenhagen interpretation did not demand realism

NewScientist explains strange new research in quantum physics, and concludes:
So, has Bohr been proved wrong too? Johannes Kofler of the Max Planck Institute of Quantum Optics in Garching, Germany, doesn't think so. "I'm really very, very sure that he would be perfectly fine with all these experiments," he says. The complementarity principle is at the heart of the "Copenhagen interpretation" of quantum mechanics, named after Bohr's home city, which essentially argues that we see a conflict in such results only because our minds, attuned as they are to a macroscopic, classically functioning cosmos, are not equipped to deal with the quantum world. "The Copenhagen interpretation, from the very beginning, didn't demand any 'realistic' world view of the quantum system," says Kofler.

The outcomes of the latest experiments simply bear that out. "Particle" and "wave" are concepts we latch on to because they seem to correspond to guises of matter in our familiar, classical world. But attempting to describe true quantum reality with these or any other black-or-white concepts is an enterprise doomed to failure.

It's a notion that takes us straight back into Plato's cave, says Ionicioiu. In the ancient Greek philosopher's allegory, prisoners shackled in a cave see only shadows of objects cast onto a cave wall, never the object itself. A cylinder, for example, might be seen as a rectangle or a circle, or anything in between. Something similar is happening with the basic building blocks of reality. "Sometimes the photon looks like a wave, sometimes like a particle, or like anything in between," says Ionicioiu. In reality, though, it is none of these things. What it is, though, we do not have the words or the concepts to express.
That's right. Physicists keep cooking up fancy terminology for quantum paradoxes, but the core strangeness is the same as what Bohr explained 80 years ago.

Saturday, January 12, 2013

Paper says all probability is quantum

NewScientist reports:
WHY is there a 1 in 2 chance of getting a tail when you flip a coin? It may seem like a simple question, but the humble coin toss is now at the heart of a lively row about the multiverse. At stake is the ability to calculate which, of an infinite number of parallel universes, is the one that we inhabit.

The debate comes in the wake of a paper posted online a couple of weeks ago by cosmologists Andreas Albrecht and Daniel Phillips, both at the University of California, Davis. They argue that conventional probability theory, the tool we all use to quantify uncertainty in the real world, has no basis in reality. Instead, all problems in probability are ultimately about quantum mechanics. "Every single time we use probability successfully, that use actually comes from quantum mechanics," says Albrecht.

This controversial claim traces back to the uncertainty principle, which says that it is impossible to know both a quantum particle's exact position and its momentum.

Albrecht and Phillips think particle collisions within gases and liquids amplify this uncertainty to the scale of everyday objects. This, they say, is what drives all events, including the outcome of a coin toss. Conventional probability - which says the outcome simply arises from two equally likely possibilities - is just a useful proxy for measuring the underlying quantum uncertainties. ...

There is just one problem. The Born rule breaks down in some situations. The latest theories in cosmology say that our universe is just one part of a vast multiverse containing a large or even infinite number of other "pocket" universes. Some of those universes will be exact copies of our own, right down to a duplicate you. The mathematics behind the Born rule can't cope with this.

"In these situations, the quantum wave function can tell you nothing about which pocket you are in," says Albrecht. That's a problem if we want to predict the properties of our universe, which will look identical to many others at a given point in time, but which can eventually evolve differently due to quantum uncertainty.

Until now, physicists seeking to predict the properties and behaviour of the multiverse have added a sprinkling of conventional probability to reflect the chance of us being in a particular universe. For example, in a multiverse with just two universes, you might add a 50-50 chance of being in either one, just as we instinctively assign the same odds to a coin toss.
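For contrast, the "conventional probability" that the paper wants to ground in quantum mechanics is just the classical frequency picture, in which the 50-50 odds of a coin toss show up as a long-run frequency. A minimal sketch (purely classical; the function name is mine):

```python
import random

def coin_frequency(n_flips, seed=0):
    """Frequency of heads in n_flips tosses of a fair coin (classical model)."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# By the law of large numbers, the observed frequency approaches 1/2,
# with no appeal to quantum mechanics anywhere in the model.
freq = coin_frequency(100_000)
```

Nothing in this calculation cares whether the underlying randomness is quantum or a deterministic pseudorandom generator, which is one way to see why the paper's claim is so contentious.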
It is baffling how someone could think that probability is some sort of physical thing. Probability is just a mathematical interpretation, and the claims of this paper do not make any sense. And even if probabilistic aspects of observable events are attributable to quantum mechanics, that still would not say anything about the multiverse.

Thursday, January 10, 2013

Quantum computing is an open question

Complexity theorist Scott Aaronson reaffirmed that I am banned from his blog, and blocked this comment about quantum computing (QC):
Scott, you are backtracking. Maybe the truth is that no scalable QC is possible. It seems to me that either (1) scalable QC is a consequence of currently established physics, and will be demonstrated someday if civilization does not collapse; (2) scalable QC is a speculative extrapolation of current physics, and its possibility is an open question; or (3) current physics is fundamentally wrong for reasons that are not understood. I thought that you were in camp (2) but now you seem to be in camp (1).
He justifies quantum computing research as being objectively important because he says that an "advanced extraterrestrial civilization" would be doing it also.

I say that scalable QC is an open scientific question. It might be true, and it might be false. Aaronson does not seem to want to accept the idea that it is an open question. He will agree that no one has shown it to be possible, but then he will deny that it could be impossible.

Tuesday, January 8, 2013

New poll on quantum mechanics

Maximilian Schlosshauer, Johannes Kofler, and Anton Zeilinger have posted A Snapshot of Foundational Attitudes Toward Quantum Mechanics:
Foundational investigations in quantum mechanics, both experimental and theoretical, gave birth to the field of quantum information science. Nevertheless, the foundations of quantum mechanics themselves remain hotly debated in the scientific community, and no consensus on essential questions has been reached. Here, we present the results of a poll carried out among 33 participants of a conference on the foundations of quantum mechanics. The participants completed a questionnaire containing 16 multiple-choice questions probing opinions on quantum-foundational issues. Participants included physicists, philosophers, and mathematicians. We describe our findings, identify commonly held views, and determine strong, medium, and weak correlations between the answers. Our study provides a unique snapshot of current views in the field of quantum foundations, as well as an analysis of the relationships between these views. ...

The statements that found the support of a majority -- i.e., answers checked by more than half of the participants -- were, in order of the number of votes received:
1. Quantum information is a breath of fresh air for quantum foundations (76%).
2. Superpositions of macroscopically distinct states are in principle possible (67%).
3. Randomness is a fundamental concept in nature (64%).
4. Einstein's view of quantum theory is wrong (64%).
5. The message of the observed violations of Bell's inequalities is that local realism is untenable (64%).
6. Personal philosophical prejudice plays a large role in the choice of interpretation (58%).
7. The observer plays a fundamental role in the application of the formalism but plays no distinguished physical role (55%).
8. Physical objects have their properties well defined prior to and independent of measurement in some cases (52%).
9. The message of the observed violations of Bell's inequalities is that unperformed measurements have no results (52%). ...

Yet, nearly 90 years after the theory's development, there is still no consensus in the scientific community regarding the interpretation of the theory's foundational building blocks. Our poll is an urgent reminder of this peculiar situation.
Don't take the percentages too seriously -- apparently some people have two different ideas for the message of the Bell experiments.

Update: See Lumo and John Preskill for additional opinions. My take-away is that there is still no consensus on matters that I thought were settled 80 years ago.

Saturday, January 5, 2013

Why solid matter is solid

The current Radiolab podcast says:
It's comforting to think that if you take an object -- a rock, let's say -- and break it down into tinier and tinier more elemental parts, that that's exactly what you end up with: smaller and smaller particles until you reach the smallest. And voila! Those are the building blocks of everything around us.

But as Jim Holt, author of Why Does the World Exist? points out... that's an old worldview that no longer jives with modern-day science. If you start slicing and sleuthing in subatomic particle land -- trying to get to the bottom of what makes matter -- you mostly find empty space. Your hand, your chair ... it's all made up mostly of nothing.
I say the old worldview is more accurate.

It is not true to say that modern-day science teaches that atoms are mostly empty space. First, there is no such thing as empty space, as modern physics teaches that it is filled with pervasive fields that used to be called the aether. Second, atoms are held together by nuclear and electromagnetic fields, and those fields fill up atoms and material objects.

Holt goes on to say that solid matter is solid because of the Pauli Exclusion Principle and the Heisenberg uncertainty principle, and not because of the electrical properties of electrons. This is only partially correct, and not a helpful explanation. For a more technical explanation, I suggest this review.

Holt's attitude seems to be that because we have mathematical descriptions of subatomic particles, matter is just ethereal math, and is not substantial. Physicists sometimes talk this way, so he is not just making it up. First, we have not reduced these atoms to pure math. Second, having mathematical approximations has nothing to do with matter being substantial.

Holt's book made the NY Times list of the 10 best books of 2012, and it was the only science book on the list. Maybe I should read his book to get his full story, as maybe he quotes prominent physicists for his goofy ideas.

Friday, January 4, 2013

Prizes for quantum physics

A reader refers to the 2012 Nobel Prize in Physics as proof that quantum decoherence can be measured, and others have claimed that it is progress towards quantum computing. For completeness, I quote the official citation:
On the verge of a new computer revolution
A possible application of ion traps that many scientists dream of is the quantum computer. In present-day classical computers the smallest unit of information is a bit that takes the value of either 1 or 0. In a quantum computer, however, the basic unit of information – a quantum bit or qubit – can be 1 and 0 at the same time. Two quantum bits can simultaneously take on four values – 00, 01, 10 and 11 – and each additional qubit doubles the amount of possible states. For n quantum bits there are 2^n possible states, and a quantum computer of only 300 qubits could hold 2^300 values simultaneously, more than the number of atoms in the universe.
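The counting in the citation is easy to verify directly; a small sketch (the 10^80 figure for atoms in the visible universe is the usual rough estimate, assumed here, not taken from the citation):

```python
def n_amplitudes(n_qubits):
    """A pure state of n qubits is described by 2**n complex amplitudes."""
    return 2 ** n_qubits

assert n_amplitudes(1) == 2           # one qubit: amplitudes for |0> and |1>
assert n_amplitudes(2) == 4           # two qubits: |00>, |01>, |10>, |11>
assert n_amplitudes(3) == 2 * n_amplitudes(2)   # each extra qubit doubles the count

# 300 qubits exceed the ~10**80 atoms usually quoted for the visible universe
assert n_amplitudes(300) > 10 ** 80
```

Of course, holding 2^300 amplitudes in a state is not the same as reading 2^300 answers out of it, which is where the practical difficulty the citation mentions comes in.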

Wineland’s group was the first in the world to demonstrate a quantum operation with two quantum bits. Since control operations have already been achieved with a few qubits, there is in principle no reason to believe that it should not be possible to achieve such operations with many more qubits. However, to build such a quantum computer is an enormous practical challenge. One has to satisfy two opposing requirements: the qubits need to be adequately isolated from their environment in order not to destroy their quantum properties, yet they must also be able to communicate with the outside world in order to pass on the results of their calculations. Perhaps the quantum computer will be built in this century. If so, it will change our lives in the same radical way as the classical computer transformed life in the last century.
And also this:
David Wineland and Serge Haroche have invented and implemented new technologies and methods allowing the measurement and control of individual quantum systems with high accuracy. Their work has enabled the investigation of decoherence through measurements of the evolution of Schrödinger’s cat-like states, the first steps towards the quantum computer, and the development of extremely accurate optical clocks.
I say that there is no way that the quantum computer will change our lives in the same radical way as the classical computer transformed life in the last century.

There has to be a better scientific term than "evolution of Schrödinger’s cat-like states".

A couple of other quantum physicists have just won the Israeli Wolf Prize. Here is praise for them and their work on ion traps for quantum computing. I do not know whether their work proves me wrong about anything. I look forward to finding out.

Wednesday, January 2, 2013

QC is a failed pathological program

Scott Aaronson attacks this Dyakonov paper for its conclusion that quantum computing is a failed, pathological research program, which will soon die out and be of interest only to sociologists:
This is a brief review of the experimental and theoretical quantum computing. The hopes for eventually building a useful quantum computer rely entirely on the so-called "threshold theorem". In turn, this theorem is based on a number of assumptions, treated as axioms, i.e. as being satisfied exactly. Since in reality this is not possible, the prospects of scalable quantum computing will remain uncertain until the required precision, with which these assumptions should be approached, is established. Some related sociological aspects are also discussed.
I do think that QC is a failed research program.

QC starts with the hypothesis that it is impossible to efficiently simulate a quantum system with a classical (Turing) computer. I suspect that is correct. But the leap to scalable QC seems extremely doubtful to me.

Tuesday, January 1, 2013


My last 2012 issue of the NY Times has some new words not in any dictionary: xenointoxication (poisoning the guest) and infiniphobia (fear of infinity). Natalie Angier writes:
Given infinity’s potential for troublemaking, it’s small wonder the ancient Greeks abhorred the very notion of it. ...

On Pythagoras’ Table of Opposites, “the finite” was listed along with masculinity and other good things in life, while “the infinite” topped the column of bad traits like femininity. “They saw it as a cosmic fight,” Dr. Moore said, “with the finite constantly having to subjugate the infinite.”

Aristotle helped put an end to the rampant infiniphobia by drawing a distinction between what he called “actual” infinity, something that would exist all at once, at a given moment — which he declared an impossibility — and “potential” infinity, which would unfold over time and which he deemed perfectly intelligible. As a result, Dr. Moore said, “Aristotle believed in finite space and infinite time,” and his ideas held sway for the next 2,000 years.

Newton and Leibniz began monkeying with notions of infinity when they invented calculus, ...

With his majestic theory of relativity, Einstein knitted together time and space, quashing old Aristotelian distinctions between actual and potential infinity and ushering in the contemporary era of infinity seeking. Another advance came in the 1980s, when Alan Guth introduced the idea of cosmic inflation, a kind of vacuum energy that vastly expanded the size of the universe soon after its fiery birth. ...

Relativity and inflation theory, said Dr. Aguirre, “allow us to conceptualize things that would have seemed impossible before.”
Reading this, it appears that we have not made much progress since the Greeks. No, Einstein did not knit together time and space. That was done by Lorentz and Poincare, and Einstein did not even understand spacetime until after Minkowski's papers became popular. And Einstein had nothing to do with our understanding of infinity, as far as I know.

Guth's theory is just an interesting hypothesis with no hard evidence.

Ms. Angier raves about the mystical aspects of infinity, and sounds as if she is trying to match that Pythagorean image of women. She seems to think that relativity allows infinity because time is not absolute.

She gives the impression that believing in the multiverse is just like Cantor discovering infinite numbers. It is not.

Mathematical analysis is all about the study of the infinite. But the infinities are usually just shorthands for finitary arguments with precise meanings. When physicists talk about infinities, they are usually very sloppy about what is meant. There is no math to support the infinities of the multiverse.
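A standard example of what "shorthands for finitary arguments" means: an infinite series is defined as a limit of finite partial sums,

```latex
\sum_{n=1}^{\infty} \frac{1}{2^n}
  \;=\; \lim_{N\to\infty} \sum_{n=1}^{N} \frac{1}{2^n}
  \;=\; \lim_{N\to\infty} \left( 1 - \frac{1}{2^N} \right)
  \;=\; 1
```

No infinite object is ever manipulated; every statement about the sum is a precise statement about the finite partial sums.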

String theorist Lumo explains:
People such as Sean Carroll or Brian Greene correctly notice that the microscopic laws of Nature are time-reversal-invariant (more precisely, CPT-invariant if we want to include subtle asymmetries of the weak nuclear force) but they're overinterpreting or misinterpreting this fact. This symmetry doesn't mean that every statement about the future and past may be simply reverted upside down. It only means that the microscopic evolution of particular microstates – pure states – to particular other microstates – pure states – may be reverted.

But no probabilistic statements may actually be reverted in this naive way.
I posted before that unitarity is not a fundamental tenet of quantum mechanics, and a reader accused me of arguing from authority.

Belief in unitarity is rooted in the belief that predicting the future is just like predicting the past. That belief is entirely mistaken, as Lumo explains better than I do. I thought that this stuff was obvious, but prominent physicists keep saying crazy things.
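For reference, here is the textbook claim at issue: Schrödinger evolution is unitary, and unitary maps are invertible, which is what makes the "predicting the past is like predicting the future" picture tempting (this is a sketch of the standard definitions, not an argument for either side):

```latex
|\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad
U(t) = e^{-iHt/\hbar}, \qquad
U^\dagger U = \mathbb{1}
```

Since U is invertible, pure-state evolution can be run backwards: |ψ(0)⟩ = U†|ψ(t)⟩. A measurement that yields probabilistic outcomes is not a map of this form, and that asymmetry is the point of the quote above.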

Now you might notice that I sometimes quote authorities favorably, and sometimes unfavorably. There is no contradiction. If I am writing about how quantum mechanics has been understood for 80 years, then I quote authorities, because they are the ones who define that understanding. But if an expert says something silly, then I criticize it.

In the case of unitarity, I would not mind so much if a physicist said that it was an interesting hypothesis, and wrote a paper exploring the consequences of the hypothesis. It might be true, but it is contrary to the textbooks and contrary to the most common interpretations of the popular experiments. But when a physicist says that it is an essential part of quantum mechanics, he is just wrong.

(Besides the NY Times words, this post has several other words that are not in my dictionary.)