Wednesday, November 14, 2018

Close orbit to Milky Way black hole

The Bad Astronomer (aka Phil Plait) writes:
Because in the paper, a team of astronomers show that they have observed a blob of dust sitting just outside the point of no return of a supermassive black hole, where the gravity is so intense that this material is moving at thirty percent the speed of light. And this wasn’t inferred, deduced, or shown indirectly. No: They measured this motion by literally seeing the blobs move in their observations. ...

Sitting in the exact center of the Milky Way is a supermassive black hole… and astronomers don’t use that adjective lightly. It has a mass over 4 million times that of the Sun, and all of that is squeezed down into a spherical region of space only 20 million kilometers across. The Sun itself is over a million kilometers across, so this is a tiny volume for all that mass. The gravity of such a beast is so immense that if you get too close, you cannot escape. Not even light, which travels at the fastest possible speed in the Universe, can get out. It’s like a dark extremely massive infinitely deep hole. ...

Their motions can be directly seen, and one, called S2, circles the center on an orbit just 16 years long, taking it to within a breathtaking 18 billion kilometers of the exact center.

Using Kepler’s laws of motion, the shapes and periods of the stars’ orbits can be used to find the mass of the object they orbit, and that’s where the 4 million solar mass figure comes from. Yet we see nothing emitting light there, no huge object, no star cluster. It really must be a black hole. Anything else would be extremely bright.
This is interesting, but it really doesn't have much to do with relativity.

The way relativity is usually described, a black hole is a singularity, and not "a spherical region of space only 20 million kilometers across". The distance across is infinite. Or as BA says, "a dark extremely massive infinitely deep hole." And nothing comes within any finite distance of "the exact center", because the exact center is the singularity, with infinite distance to everything.

So I am surprised that BA talks about black holes as if they can exist in Euclidean geometry, without a singularity.

We don't see inside the black hole, so we don't really know. On the outside, it looks spherical. The above paper just describes a plain old Keplerian orbit as it might have been understood four centuries ago. Just one involving bigger masses and faster speeds than have been seen before. We don't see any light from the central mass, but that is just what would have been expected two centuries ago.

It is to BA's credit that he does not lecture us on how this confirms Einstein's view of black holes.

Monday, November 12, 2018

Horgan interviews Maudlin

John Horgan interviews philosopher Tim Maudlin for SciAm. I sometimes trash philosophers, including Maudlin, so I will emphasize where I agree with him.
Filmmaker Errol Morris hates Thomas Kuhn. What’s your take on Kuhn?

The Structure of Scientific Revolutions contains some nice observations on the nature of what Kuhn calls “normal science”, which makes it out to have none of the heroic aspects that Popper insisted on. But when Kuhn goes beyond normal science to “revolutionary science” the book is a disaster. It promotes an irrationalist view of scientific revolutions that is both false and pernicious.
Exactly correct. Kuhn's popularity is a large part of why I trash philosophy of science.
Overwhelmingly most philosophers are atheists or agnostics, which I take to be convergence to the truth. Most are compatibilist about free will and believe in it, which I also take to be convergence to the truth. Almost all believe in consciousness and most don’t have a clue how to explain it, which is wisdom.
This is reassuring.
What’s your take on multiverses and strings and the problem of testability?

Some people have been mesmerized by fancy math. It is not interesting physics in my view, and has had a very, very bad effect on the seriousness of theoretical physics as practiced.
Yes.
Does Gödel’s incompleteness theorem have implications beyond mathematics? Is it a worm in the apple of rationality?

No. Absolutely no one should have ever been surprised that mathematical truth cannot be equated with theoremhood in some finite axiomatic system.
Again, I agree. Gödel's theorem is fascinating and profound for logic and the foundations of mathematics, but nearly all applications outside math in the popular literature are nonsense.

He lost me with his favorite interpretation of quantum mechanics. I have discussed that elsewhere. He also lost me with this:
What’s your position on the status of ethics? Do any moral rules have the same status as mathematical truths? Do you believe in moral progress?

Yes (with qualification) and yes. Already in Republic (Plato again!) we have an argument — a clear and compelling rational argument — that even the highest political office should be open to women. The argument? List what it takes to be a good leader of the state, then note the conditions that distinguish the sexes. There just is zero overlap between the two lists. That is as compelling as a rational argument can be, and it follows that opening all political offices to women (much less acknowledging in law that women should have as much right to vote as men) is objective moral progress. Similarly for invidious legal restrictions by race. The civil rights movement was strict moral progress. That’s as true as 2 + 2 = 4.
Wow. Because of some logical, almost mathematical argument, known to Plato, someone like Hillary Clinton should be President of the USA?!

Donald Trump has that list of qualities. Fearless. Honest. Loyal. Blunt. Likable. Strength of character. True to his word. Alpha. Not intimidated by his enemies. Maintain hundreds of friendships and political alliances. Forceful. Smart. Competent. Just enough of a narcissist Machiavellian sociopath to be effective. Strong moral compass. Unflinching about sticking up for the people he represents. Vision for a better future. Communicates his ideas well. Owned by no one. Shitlord.

Neither Hillary Clinton nor any other woman has these qualities.

Maudlin is probably a typical academic leftist Trump-hater who voted for Hillary Clinton, so I am sure he disagrees. But I do wonder about his list of what it takes to be a good leader of the state. Is there really such a list where Donald Trump and Hillary Clinton do equally well?

Maybe Maudlin is making a joke here. He would probably be ostracized from his profession if he openly supported Trump.

When the thought-control police are forcing you to take a political stand, sometimes the best way is to give an argument that is so unreasonable that no one could take it seriously. Maybe Maudlin is doing that here, and trolling us. Can he really think that supporting Hillary Clinton is like 2 + 2 = 4?

He says he believes in free will. At least he says he believes Brett Kavanaugh has free will. We don't want any more pre-programmed automatons on the Supreme Court, do we? Did he say Kavanaugh has free will as a sneaky way of supporting him?

I should just agree with his arguments that made sense, and not try to decode his political sarcasm. I don't like to get political on this blog anyway.

Saturday, November 10, 2018

Physics rejects counterfactual definiteness

Lubos Motl rants, as part of a defense of string theory:
People enjoying terms such as the "counterfactual definiteness" have two main motivations. One of them is simply their desire to look smart even though almost all of them are intellectually mediocre folks, with the IQ close to 100. This category of people greatly overlaps with those who like to boast about their scores from IQ tests – or who struggle for 10 years to make a journal accept their crackpot paper, so that they can brag to be finally the best physicists in the world (I've never had a problem with my/our papers' getting published). The other is related but more specific: "counterfactual definiteness" was chosen to represent their prejudices that Nature obeys classical physics – which they believe and they're mentally unable to transcend this belief.

If something is called "counterfactual definiteness", it must be right, mustn't it? The person who invented such a complicated phrase must have been smart, listeners are led to believe, so the property must be obeyed in Nature. Wouldn't it otherwise be a giant waste of time that someone invented the long phrase and wrote papers and books about it? Sorry, it's not obeyed, the awkward terminology cannot change anything about it, the people who enjoy using similar phrases have the IQ about 100 and they are simply not too smart, and indeed, all the time was wasted.
He is correct that counterfactual definiteness is not obeyed in Nature, but I doubt that he is right about the term being invented to trick low-IQ ppl into falling for a false concept.

Believing in counterfactual definiteness is like believing in Many-Worlds. It literally means that your counterfactual fantasies have some definite reality. Things that never happened can be discussed as if they did.

Technically, nothing is really definite in Many-Worlds, so maybe it is not the best example. Newtonian mechanics is a better example of counterfactual definiteness.

It is opposite the more conventional quantum mechanical view that "unperformed experiments have no results". You cannot analyze the double-slit experiment by assuming that particles definitely went thru one slit or the other. If you do, then you don't see an interference pattern. We see the interference pattern, so counterfactual definiteness is wrong.
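The slit-by-slit claim can be made concrete in a few lines: adding amplitudes and then squaring produces fringes, while assuming the particle definitely took one slit (so that probabilities add) washes them out. A minimal sketch, with an idealized unit wavelength:

```python
import cmath
import math

K = 2 * math.pi   # wavenumber for unit wavelength

def quantum(d):
    """Add the two slit amplitudes for path difference d, then square."""
    return abs(cmath.exp(0j) + cmath.exp(1j * K * d)) ** 2

def definite(d):
    """Counterfactual definiteness: the particle took one slit or the
    other, so add probabilities instead of amplitudes."""
    return abs(cmath.exp(0j)) ** 2 + abs(cmath.exp(1j * K * d)) ** 2

diffs = [i * 0.05 for i in range(41)]
fringes = [quantum(d) for d in diffs]
flat = [definite(d) for d in diffs]
print(max(fringes), min(fringes))   # swings between 4 and 0: interference
print(max(flat), min(flat))         # stuck at 2: no pattern
```

The cross term between the two amplitudes is the interference pattern; assuming a definite slit deletes exactly that term.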

The essence of Bell's Theorem is that assuming counterfactual definiteness leads to conclusions that contradict quantum mechanics. The sensible conclusion is that counterfactual definiteness is wrong. There are some other possibilities, but they require rejecting more basic scientific principles.
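The quantum side of that contradiction is easy to exhibit. For the singlet state, the correlation between spin measurements along angles a and b is -cos(a-b), and the standard CHSH combination of four such correlations reaches 2√2, above the bound of 2 that counterfactual definiteness plus locality would impose (a sketch of the textbook numbers, not a derivation):

```python
import math

def E(a, b):
    """Singlet-state correlation for spin measurements along angles a, b."""
    return -math.cos(a - b)

# Standard CHSH angle choices
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ~ 2.83, above the local-realist bound of 2
```

Any assignment of definite pre-existing outcomes to all four measurement settings caps |S| at 2, which is the content of the theorem.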

Thinking sensibly about counterfactuals is the key to understanding quantum mechanics. Many of the paradoxes that make it hard to understand quantum mechanics are based on attributing some faulty meaning to a counterfactual.

Thursday, November 8, 2018

Astronomers excited about black holes

NY Times science writer Dennis Overbye writes about the black hole at the center of the Milky Way.

The article mentions Einstein ten times, even tho he had almost nothing to do with the concept.

Black holes were first proposed in 1784. The relativistic equations for a black hole were found by Schwarzschild and a student of Lorentz's, but many mistakenly thought that there was a singularity on the event horizon. Some modern theoretical physicists still think that there is such a singularity, in order to preserve their intuition about information emerging from evaporating black holes.

Much as I like to see relativity research, the astronomy work on black holes does not have much to do with relativity.
Black holes — objects so dense that not even light can escape them — are a surprise consequence of Einstein’s general theory of relativity, which ascribes the phenomenon we call gravity to a warping of the geometry of space and time.
Not really. Since 1784 it has been understood that if gravitational force obeys an inverse square law, and the mass is sufficiently concentrated, then the escape velocity will exceed the speed of light and a black hole results.
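The 1784 argument takes one line: set the Newtonian escape velocity √(2GM/r) equal to c and solve for r = 2GM/c², which happens to coincide with the relativistic Schwarzschild radius. A quick check for a 4-million-solar-mass object (mass value taken from the quoted article):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

M = 4.0e6 * M_SUN      # mass of the Milky Way's central black hole
r = 2 * G * M / C**2   # radius where escape velocity sqrt(2GM/r) = c
print(r / 1e9)         # radius in millions of km
```

The radius comes out around 12 million km, a diameter in the same tens-of-millions-of-km range as the figure quoted in the post above, and again nothing beyond an inverse square law is used.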

Relativity does predict some strange things inside the event horizon of a black hole, but relativity also teaches that none of that is observable, so we will never know. There is no proof that there is any sort of singularity.

While general relativity is commonly described as explaining gravity as the warping of the geometry of space and time, that was not Einstein's view. He denounced this geometrical interpretation. And he did not believe in black holes.
“The road is wide open to black hole physics,” Dr. Eisenhauer proclaimed.
It is true that we are getting a lot more info about black holes. A few decades ago we were not even sure that they exist, and now they are crucial for theories of galaxy formation, for explaining the brightest objects in the universe, and for studying gravity waves.

But all that stuff about singularities, entropy, evaporation, firewalls, information conservation, and quantum gravity are completely out of reach.

Monday, November 5, 2018

Leaving true physics to wither

Bee quotes this NY Times article:
“Unable to mount experiments that would require energies comparable to that of the Big Bang genesis event, Dr. Chodos believes, growing numbers of physicists will be tempted to embrace grandiose but untestable theories, a practice that has more than once led science into blind alleys, dogma and mysticism.

In particular, Dr. Chodos worries that “faddish” particle physicists have begun to flock all too uncritically to a notion called “superstring theory.” […] Deprived of the lifeblood of tangible experiment, physicists will “wander off into uncharted regions of philosophy and pure mathematics,'' says Dr. Chodos, leaving true physics to wither.””
This was conventional wisdom among a lot of physicists in the 1970s. I remember hearing a lecture in the late 1970s explaining the exponentially increasing cost of particle accelerators, and how they will never get to the energies that they need to resolve the questions that they are really interested in. Finding some unified field theory would be a miracle of good luck.

It was known back then that even if susy had merit, there would be dozens of free parameters that would be hopeless to determine experimentally. The string theorists decided that they would determine them by pure theory instead. By the year 2000 or so, it was established that the plan would never work.

Bee just wrote a book on how theoretical physics has lost its way, but it has been lost for 40 years.

Thursday, November 1, 2018

Philosopher defends Many-Worlds

I mentioned the failure of many-worlds, but in fairness, here is a new philosophy paper with another view:
We defend the many-worlds interpretation of quantum mechanics (MWI) against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer's self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world's amplitude.
There is no world's amplitude. This paper is just nonsense.

If MWI really predicted probabilities, or predicted any measurement outcomes, you would not need philosophy papers like this.

The whole point of every other scientific theory is to predict outcomes. If MWI does not, then what is it doing for you?

The paper claims that MWI can make predictions, but it is just a stupid hand wave. There are no physics papers that use MWI to predict an experimental outcome.

Monday, October 29, 2018

Creating the First Quantum Internet

Here is the misguided attempt at quantum crypto:
Scientists in Chicago are trying to create the embryo of the first quantum internet. If they succeed, the researchers will produce one, 30-mile piece of a far more secure communications system with the power of fast quantum computing. From a report:
The key was the realization of an unused, 30-mile-long fiber optic link connecting three Chicago-area research institutions -- Argonne National Lab, Fermi Lab and the University of Chicago. This led to the idea to combine efforts and use the link for what they call the Chicago Quantum Exchange. David Awschalom, an Argonne scientist and University of Chicago professor who is the project's principal investigator, tells Axios that the concept is difficult to grasp, even for experts.
MIT Technology Review elaborates:
The QKD approach used by Quantum Xchange works by sending an encoded message in classical bits while the keys to decode it are sent in the form of quantum bits, or qubits. These are typically photons, which travel easily along fiber-optic cables. The beauty of this approach is that any attempt to snoop on a qubit immediately destroys its delicate quantum state, wiping out the information it carries and leaving a telltale sign of an intrusion. The initial leg of the network, linking New York City to New Jersey, will allow banks and other businesses to ship information between offices in Manhattan and data centers and other locations outside the city.
However, sending quantum keys over long distances requires "trusted nodes," which are similar to repeaters that boost signals in a standard data cable. Quantum Xchange says it will have 13 of these along its full network. At nodes, keys are decrypted into classical bits and then returned to a quantum state for onward transmission. In theory, a hacker could steal them while they are briefly vulnerable.
This is really foolish. We have cheap reliable end-to-end encryption that has not been broken.

The quantum crypto methods are unable to offer similar assurances. They cannot authenticate messages. They cannot do end-to-end encryption, so they require trusted nodes. They are subject to hardware faults, and such faults have been used to break all the commercial equipment.
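The trusted-node weakness is easy to see in a toy model. This is not real QKD, just an XOR one-time-pad sketch in which the link keys k1 and k2 (hypothetical names) stand in for the quantum-generated keys on each hop:

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# End-to-end key Alice wants Bob to share, relayed through one node.
key = secrets.token_bytes(32)
k1 = secrets.token_bytes(32)   # Alice-node link key
k2 = secrets.token_bytes(32)   # node-Bob link key

to_node = xor(key, k1)         # Alice encrypts with the first hop's key
at_node = xor(to_node, k1)     # the node must decrypt to classical bits...
to_bob = xor(at_node, k2)      # ...before re-encrypting for the second hop
at_bob = xor(to_bob, k2)

print(at_bob == key)           # Bob recovers the key
print(at_node == key)          # but so did the node, in the clear
```

With conventional public-key cryptography, Alice can encrypt for Bob directly and the relay never sees plaintext; the quantum scheme structurally cannot do that.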

The big advantage of the quantum crypto is that you are supposed to be able to shut down the network if you detect a possible attack. Who wants that? The whole point of the real internet is to always transmit traffic, regardless of problems. The quantum internet will shut down at the first sign of a problem.

The whole idea of a quantum internet is a scam.

Saturday, October 27, 2018

Stop doing fundamental physics

Lubos Motl writes:
The video starts boldly:
I will talk about string theory not because I think it's interesting but because it's uninteresting and we should stop talking about it.
Holy cow. String theory remains the only game in town and everyone who wants to scientifically investigate any physical phenomena that go beyond effective quantum field theories – whose limitations are self-evident and well-known – simply must learn string/M-theory. There is no known alternative. To "stop talking about it" is almost exactly equivalent to stop doing fundamental physics. ...

On the other hand, Hossenfelder clearly doesn't have any alternative to string theory. She doesn't have any quantum mechanical theory that agrees with Einstein's equations at long distances but preserves the information when the black hole evaporates. But she – and her brain-dead followers – just don't care.
Yeah, I just don't care about such a theory.

Einstein's equations at long distances are the same as for Newtonian gravity. Either way, you can add dark energy, altho, as Bee explains, much of string theory was based on dark energy being negative, and we now know that it is positive.

But string theory preserves the info when a black hole evaporates? That is just nonsense. It makes more sense to talk about the Biblical apocalypse.

Maybe fundamental physics should stop. It is going nowhere. All that brainpower could be put to more productive purposes.

Wednesday, October 24, 2018

Krauss pushed into retirement

I posted before about physicist Lawrence Krauss being silenced.

BuzzFeed brags that it has hounded a physicist out of academia:
Lawrence Krauss, the celebrity physicist who faced dismissal from Arizona State University for violating sexual misconduct policy, has agreed to step down from the school.

In statements posted on Facebook and Twitter on Sunday, Krauss said: “I have chosen to retire from ASU in May, 2019, when I turn 65.”
I am not going to pile on here. I believe he is innocent until proven guilty. Among the accusations are that he made "sexist comments".

Krauss has written some worthwhile popular physics books. If I were going to be offended by his comments, then I would probably be offended by his leftist political views. Making sexist comments is not a crime. Not yet.

Saturday, October 20, 2018

Explaining the failure of Many-Worlds

Philip Ball explains what is wrong with the Many Worlds Interpretation of quantum mechanics:
The MWI is qualitatively different from the other interpretations of quantum mechanics, although that’s rarely recognized or admitted. For the interpretation speaks not just to quantum mechanics itself but to what we consider knowledge and understanding to mean in science. It asks us what sort of theory, in the end, we will demand or accept as a claim to know the world. ...

What the MWI really denies is the existence of facts at all. It replaces them with an experience of pseudo-facts (we think that this happened, even though that happened too). In so doing, it eliminates any coherent notion of what we can experience, or have experienced, or are experiencing right now. We might reasonably wonder if there is any value — any meaning — in what remains, and whether the sacrifice has been worth it. ...

It says that our unique experience as individuals is not simply a bit imperfect, a bit unreliable and fuzzy, but is a complete illusion. If we really pursue that idea, rather than pretending that it gives us quantum siblings, we find ourselves unable to say anything about anything that can be considered a meaningful truth. We are not just suspended in language; we have denied language any agency. The MWI — if taken seriously — is unthinkable. ...

What quantum theory seems to insist is that at the fundamental level the world cannot supply clear “yes/no” empirical answers to all the questions that seem at face value as though they should have one. The calm acceptance of that fact by the Copenhagen interpretation seems to some, and with good reason, to be far too unsatisfactory and complacent. The MWI is an exuberant attempt to rescue the “yes/no” by admitting both of them at once. But in the end, if you say everything is true, you have said nothing.
That's right. Ultimately MWI says nothing that you would want from a scientific theory. There are no facts, predictions, or confirming experiments.

MWI just says that everything that can happen does happen in some parallel world. It allows you to think and believe whatever you want. Probabilities are meaningless. Reality and facts are meaningless.

MWI is just the same as a child's fantasy. The proponents give the impression that it is a scientific theory that gives a detailed explanation of the worlds, with Hilbert space, wave function, Schroedinger equation, atomic forces, etc. Yes, but none of them can explain how all that apparatus tells you anything beyond the simplistic child's fantasy. There are no predictions or confirming experiments.

I used to think that string theory was the epitome of unscientific thinking. But string theory is vastly more reasonable than MWI. String theory at least had some hope of getting some theoretical explanations. MWI explains nothing, and discards almost everything we know about science.

Update: LuMo writes:
Now, Ball has written a text about some conceptual and basically insurmountable problems of the "many-world interpretation" paradigm sometimes used to misinterpret quantum mechanics. Among other things, he focused on the impossibility to define what a "splitting of the Universes" is and when and how many times it takes place. This is of course one of the problems about MWI that I see and often write about – but there are others, too. ...

However, the comment sections are frustrating. Both articles have attracted over 100 comments by now. Pretty much all the most upvoted comments attack Wolchover's and Ball's texts. You can see that none of these people actually understands quantum mechanics and all of them assume that classical physics is right throughout their comments and lives.
I had not noticed that Quanta mag allows comments, because my adblocker blocks them.

I would not bother criticizing MWI, except that it has such a huge following, from leading physicists on down to the general public. Here is one of the dopey comments:
The fact is that the MWI is strictly adherent to the mathematics of quantum physics. There is no extra phenomenon like "observation" (that's just entanglement) there is no extra phenomenon like "waveform collapse" the entangled particle becomes part of a more complex waveform.

MWI doesn't have to justify adding any additional complexity to QM because it doesn't. Copenhagen, Pilot Wave, et. al. are the interpretations that add extra complexity that don't show up in the math, so they're the ones that have to justify that complexity. What the hell is an observer? What the hell is waveform collapse?
MWI doesn't have to define observers because it does not make any predictions.

The math of quantum physics makes predictions that are verified by experiments. MWI makes no such predictions. Therefore MWI does not adhere to the math of quantum physics.

Thursday, October 11, 2018

The decline of relativistic mass

Vesselin Petkov notes how the concept of "relativistic mass" has gone out of fashion:
These facts make the campaign against the concept of relativistic mass both inexplicable and worrisome. Instead of initiating and stimulating research on the origin of relativistic mass (and on the nature of mass in general) in order to achieve a more profound understanding of this fundamental concept in physics, the relativistic mass is not mentioned at all in many publications (see, for example, the well-known textbook [35]) or, if it is mentioned, it is done to caution the readers, that "Most physicists prefer to consider the mass of a particle as fixed" [25, p. 760], that "Most physicists prefer to keep the concept of mass as an invariant, intrinsic property of an object" [32], that "We choose not to use relativistic mass, because it can be a misleading concept" [36] or to warn them [22, p. 1215]:
Watch Out for "Relativistic Mass"

Some older treatments of relativity maintained the conservation of momentum principle at high speeds by using a model in which a particle's mass increases with speed. You might still encounter this notion of "relativistic mass" in your outside reading, especially in older books. Be aware that this notion is no longer widely accepted; today, mass is considered as invariant, independent of speed. The mass of an object in all frames is considered to be the mass as measured by an observer at rest with respect to the object.
As he explains, this opinion is pretty arbitrary, and relativistic mass is analogous to length contraction or time dilation. Yes, it depends on the frame, and it can be a little confusing, but that's relativity.
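The frame dependence in all three cases is the same Lorentz factor γ = 1/√(1 - v²/c²); relativistic mass is just γ times the rest mass, exactly as a moving clock runs slow by γ and a moving rod contracts by γ. A minimal illustration:

```python
import math

def gamma(beta):
    """Lorentz factor for speed beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

# Relativistic mass m_rel = gamma * m, the same gamma that appears in
# time dilation and length contraction.
m = 1.0   # rest mass, arbitrary units
for beta in (0.3, 0.9, 0.99):
    print(beta, gamma(beta) * m)
```

At 30% of light speed the effect is under 5%; it only becomes dramatic as v approaches c.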

Wednesday, October 10, 2018

Biologist defends de-publishing papers

Computational biology professor Lior Pachter writes:
In the case discussed in this blog post, the underlying subtext is pervasive sexism and misogyny in the mathematics profession, and if this sham paper on the variance hypothesis had gotten the stamp of approval of a journal as respected as NYJM, real harm to women in mathematics and women who in the future may have chosen to study mathematics could have been done. It’s no different than the case of Andrew Wakefield‘s paper in The Lancet implying a link between vaccinations and autism. By the time of the retraction (twelve years after publication of the article, in 2010), the paper had significantly damaged public health, and even today its effects, namely death as a result of reduced vaccination, continue to be felt.
He and his liberal colleagues have a funny idea of what science is all about.

Wakefield's paper did not damage public health. It merely suggested a health concern, based on some very limited data. The proper response would have been to do a more thorough study on measles vaccine safety.

Instead the medical authorities blamed Wakefield for reduced confidence in vaccination, so they retracted the paper and stripped Wakefield of his medical license.

Those who suspected a cover-up of vaccine risks had their suspicions confirmed. Nobody would ever publish anything critical of vaccines again, for fear of losing his medical license.

Pachter points out that papers on the evolution of sex differences go back to 1895, at least. So how is it that publishing another one will do real harm to women in mathematics? Pachter doesn't actually explain what is wrong with the paper, except that it is politically incorrect and fails to cite some previous work on the subject.

I do not get confidence in vaccines by having a ban on papers describing vaccine dangers. And I do not think that women should get encouragement in math by banning papers on variance in mathematical ability.

Sunday, October 7, 2018

Claiming quantum mechanics is inconsistent

There is a whole industry of physicists working in quantum foundations who make various arguments that quantum mechanics doesn't make any sense. They can't deny that quantum mechanics correctly predicts experiments, and yet they keep coming up with clever sleight-of-hand thought experiments and paradoxes that supposedly show that the theory does not work.

The whole enterprise is foolish. If there were really such contradictions, then there would be some failure to predict experiments.

Scott Aaronson pauses from his agony of being a Jewish leftist Trump-hating professor in a red state to explain:
So: a bunch of people asked for my reaction to the new Nature Communications paper by Daniela Frauchiger and Renato Renner, provocatively titled “Quantum theory cannot consistently describe the use of itself.” Here’s the abstract:
Quantum theory provides an extremely accurate description of fundamental processes in physics. It thus seems likely that the theory is applicable beyond the, mostly microscopic, domain in which it has been tested experimentally. Here, we propose a Gedankenexperiment to investigate the question whether quantum theory can, in principle, have universal validity. The idea is that, if the answer was yes, it must be possible to employ quantum theory to model complex systems that include agents who are themselves using quantum theory. Analysing the experiment under this presumption, we find that one agent, upon observing a particular measurement outcome, must conclude that another agent has predicted the opposite outcome with certainty. The agents’ conclusions, although all derived within quantum theory, are thus inconsistent. This indicates that quantum theory cannot be extrapolated to complex systems, at least not in a straightforward manner.
The paper's authors separately argue that this proves the many-worlds interpretation.

That conclusion should be enuf to dispose of the argument. The MWI does not predict any experimental outcomes. There is nothing scientific about it. It is like some solipsist saying anything can happen in his imagination.

Aaronson explains the errors in more detail. So does Lubos Motl. Somehow this paper got published in a Nature journal. It has become respectable to trash quantum mechanics with silly arguments.

Friday, October 5, 2018

Fundamental physics is over

About the recent Nobel physics prize, someone commented:
that's not even applied science, that's technology

People do not get that fundamental physics is over (you would not call seriously "string theory" "scientific" would you?).

I know I am repeating what Lord Kelvin said to his embarrassment just before great discoveries in relativistic physics, quantum physics, etc.

Nevertheless, that's truth: everything ends, everything has limits, humanity has limits and science has limits.

The clear indication that we are close to the limit is absence of ANY fundamental discoveries since a long time ago.

We are gradually shifting towards applied science and mere technology. All of three fields, basic science, applied science and technology are essential for humanity, but the fact is that the first one is almost over or probably over already.

Call them for what they are: Nobel Prizes in Technology
I mostly agree with this.

Future historians will look back at the XX century and say that is when the fundamental problems of science got sorted out.

Sure, there are a few things that seem only partially understood, and for which a better understanding seems likely or possible. But for many of those things, it is possible that they will never be better understood than they are today.

What do we have to show for this century? Faster lasers. Gravitational wave detection. Higgs boson detection. Better telescopes. Etc. But we haven't had any significant advances in fundamental physics in about 40 years.

Tuesday, October 2, 2018

Physics was invented and built by men

A reader sends this BBC story:
A senior scientist who said physics "was invented and built by men" has been suspended with immediate effect from working with the European nuclear research centre Cern.

Prof Alessandro Strumia, of Pisa University, made the comments during a presentation organised by the group.

He said, in comments first reported by the BBC's Pallab Ghosh, that physics was "becoming sexist against men".

Cern said on Monday it was suspending Prof Strumia pending an investigation.

It stated that his presentation was "unacceptable".
LuMo compares this to persecuting Galileo here and here.

No woman would be fired for pushing the accomplishments of women or for whining about men. This man was fired for presenting some facts and opinions about men. So his firing proved his point -- physics is sexist against men.

I thought that his punishment was potentially justifiable because he was injecting political opinions into a scientific context. But his talk was to a gender politics workshop where all the other opinions complained about male oppression. They will never get to the truth as long as contrary views are censored.

The Galileo analogy is a little silly. Galileo was allowed to publish his arguments about the Earth going around the Sun. He got into trouble when he said the official Bible interpretations had been proven wrong.

I am writing this as the Nobel Physics prizes are announced. Marie Curie got one about a century ago, and Maria Goeppert Mayer got one in 1963, but no woman since. As usual, I expected three more men to get the prize this year.

Update: One of the three prizewinners was a woman, the first in 55 years.

Thursday, September 20, 2018

Mermin defends Copenhagen Interpretation

N. David Mermin has written many popular essays explaining quantum mechanics, and now he summarizes his views on how to interpret the theory in Making better sense of quantum mechanics.

He prefers something called QBism, but nearly everything he says could be considered a defense of the Copenhagen interpretation.
Much of the ambiguity and confusion at the foundations of quantum mechanics stems from an almost universal refusal to recognize that individual personal experience is at the foundation of the story each of us tells about the world. Orthodox ("Copenhagen") thinking about quantum foundations overlooks this central role of private personal experience, seeking to replace it by impersonal features of a common "classical" external world.
He is drawing a fairly trivial distinction between his QBism view and Copenhagen. He illustrates with this famous (but possibly paraphrased) Bohr quote:
When asked whether the algorithm of quantum mechanics could be considered as somehow mirroring an underlying quantum world, Bohr would answer "There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature."
Mermin's only quibble with this is that he prefers "each of us can say" to "we can say". That is, he doesn't like the way Bohr lumps together everyone's observations and calls it the classical world.

Okay, I guess that distinction makes a difference when discussing Wigner's Friend, a thought experiment where one observer watches another. But for the most part, Mermin likes the Copenhagen interpretation, and successfully rebuts those who say that the interpretation is deficient somehow.

Monday, September 17, 2018

Correcting errors about EPR paradox

Blake C. Stacey writes about the Einstein–Podolsky–Rosen paradox:
Misreading EPR: Variations on an Incorrect Theme

Notwithstanding its great influence in modern physics, the EPR thought-experiment has been explained incorrectly a surprising number of times.
He then gives examples of famous authors who get EPR wrong.

He gets to the heart of the Bohr-Einstein dispute:
EPR write, near the end of their paper, "[O]ne would not arrive at our conclusion if one insisted that two or more physical quantities can be regarded as simultaneous elements of reality only when they can be simultaneously measured or predicted."

The response that Bohr could have made: "Yes."

EPR briefly considered the implications of this idea and then dismissed it with the remark, "No reasonable definition of reality could be expected to permit this."

But that is exactly what Bohr did. A possible reply in the Bohrian vein: "Could a `reasonable definition of reality' permit so basic a fact as the simultaneity of two events to be dependent on the observer's frame of reference? Many notions familiar from everyday life only become well-defined in relativity theory once we fix a Lorentz frame. Likewise, many statements in quantum theory only become well-defined once we have given a complete description of the experimental apparatus and its arrangement."

This is not a quote from anywhere in Bohr's writings, but it is fairly in the tradition of his Warsaw lecture, where he put considerable emphasis on what he felt to be "deepgoing analogies" between quantum theory and relativity.
In spite of all differences in the physical problems concerned, relativity theory and quantum theory possess striking similarities in a purely logical aspect. In both cases we are confronted with novel aspects of the observational problem, involving a revision of customary ideas of physical reality, and originating in the recognition of general laws of nature which do not directly affect practical experience. The impossibility of an unambiguous separation between space and time without reference to the observer, and the impossibility of a sharp separation between the behavior of objects and their interaction with the means of observation are, in fact, straightforward consequences of the existence of a maximum velocity of propagation of all actions and of a minimum quantity of any action, respectively.
This is well put. The aim of EPR is to explain a simple example of entangled particles, and to argue that no reasonable definition of reality could deny simultaneous reality to two observables merely because they cannot be simultaneously measured.

And yet that is a core teaching of quantum mechanics, from about 10 years earlier. Two non-commuting observables cannot be simultaneously measured precisely. That is the Heisenberg uncertainty principle.

Theories that assign definite simultaneous values to observables are called hidden variable theories. All the reasonable ones have been ruled out by the Bell Test Experiments.

Complaining that the uncertainty principle violates pre-conceptions about reality is like complaining that relativity violates pre-conceptions about simultaneity. Of course it does. Get with the program.

There are crackpots who reject relativity because of the Twin Paradox, or some other such surprising effect. The physics community treats them as crackpots. And yet the community tolerates those who get excited by EPR, even tho EPR makes essentially the same mistake.

Saying "maximum velocity of propagation" is a way of saying the core of relativity theory, and saying "minimum quantity of any action" is a way of saying the core of quantum mechanics. The minimum is Planck's constant h, or h-bar. The Heisenberg uncertainties are proportional to this constant. That minimum makes it impossible to precisely measure position and momentum simultaneously, just as the finite speed of light makes it impossible to keep clocks simultaneous.
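The proportionality can be stated precisely. In standard textbook form (the numerical value is the modern measured one, not anything from the sources quoted above), the Heisenberg relation is:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi} \approx 1.055 \times 10^{-34}\ \mathrm{J\,s}.
```

Setting ħ to zero recovers classical mechanics, where both quantities can in principle be measured exactly at once; it is the nonzero minimum of action that forbids this, just as the finite speed c forbids absolute simultaneity.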

Thursday, September 13, 2018

Joint Hubble Lemaitre credit is a bad idea

I mentioned renaming the Hubble Law, as a way to correct history, but it appears that they have made the matter worse.

The respected science historian Helge Kragh writes:
The Hubble law, widely considered the first observational basis for the expansion of the universe, may in the future be known as the Hubble-Lemaître law. This is what the General Assembly of the International Astronomical Union recommended at its recent meeting in Vienna. However, the resolution in favour of a renamed law is problematic in so far as concerns its arguments based on the history of cosmology in the relevant period from about 1927 to the early 1930s. A critical examination of the resolution reveals flaws of a non-trivial nature. The purpose of this note is to highlight these problems and to provide a better historically informed background for the voting among the union's members, which in a few months' time will result in either a confirmation or a rejection of the decision made by the General Assembly.
He notes:
Until the mid-1940s no astronomer or physicist seems to have clearly identified Hubble as the discoverer of the cosmic expansion. Indeed, when Hubble went into his grave in 1953 he was happily unaware that he had discovered the expansion of the universe.
He says the cited evidence that Hubble met with Lemaitre is wrong. Furthermore, there are really two discoveries being confused -- the cosmic expansion and the empirical redshift-distance law. Hubble had a role in the latter, but not the former.
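For reference, the empirical redshift-distance law in dispute is the linear relation (the numerical value is a rough modern one, not Lemaitre's or Hubble's):

```latex
v = H_0\, d, \qquad H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}},
```

where v is the recession velocity inferred from redshift and d is the distance. Lemaitre derived this relation from general relativity and estimated the constant from published data in 1927, two years before Hubble's paper.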

Monday, September 10, 2018

Where exactly does probability enter the theory?

Peter Woit writes:
A central question of the interpretation of quantum mechanics is that of “where exactly does probability enter the theory?”. The simple question that has been bothering me is that of why one can’t just take as answer the same place as in the classical theory: in one’s lack of precise knowledge about the initial state.
Lee Smolin says he is writing a book, and there are 3 options: (1) orthodox quantum mechanics, (2) many-worlds, (3) hidden variable theories, like pilot waves. All attempts at (2) have failed, so he says "My personal view is that option 3) is the only way forward for physics."

This is a pretty crazy opinion. No one has been able to make sense out of probabilities in a many-worlds theory, and Bell test experiments have ruled out all sensible hidden variable theories.

Lubos Motl posts a rant against them, as usual:
Quantum mechanics was born 93 years ago but it's still normal for people who essentially or literally claim to be theoretical physicists to admit that they misunderstand even the most basic questions about the field. As a kid, I was shocked that people could have doubted heliocentrism and other things pretty much a century after these things were convincingly justified. But in recent years, I saw it would be totally unfair to dismiss those folks as medieval morons. The "modern morons" (or perhaps "postmodern morons") keep on overlooking and denying the basic scientific discoveries for a century, too! And this centennial delay is arguably more embarrassing today because there exist faster tools to spread the knowledge than the tools in the Middle Ages.
Lumo is mostly right, but it is possible to blame uncertainties on lack of knowledge of the initial state. It is theoretically possible that if you had perfect knowledge about a radioactive nucleus, then you would know when it would decay.

However it is also true that measurements are not going to give you that knowledge, based on what we know about quantum mechanics. This is what makes determinism more of a philosophical question than a scientific one.

I agree with Lumo that deriving the Born rule is silly. The Born rule is part of quantum theory. Deriving it from something equivalent might please some theorists, but really is just a mathematical exercise with no scientific significance.

This question about the origin of probabilities only makes sense to those who view probability as the essential thing that makes quantum mechanics different from classical mechanics. I do not have that view. Probabilities enter into all of science. It is hard to imagine any scientific theory that can be tested without some resort to a probabilistic analysis. So I don't think that the appearance of probability requires any special explanation. How else would any theory work?

It is very strange that respectable physicists can have such bizarre views about things that were settled about a century ago. I agree with Lumo about that.

Saturday, September 8, 2018

Another claim for QC real soon

Latest quantum computer hype:
Today the [Berkeley-based startup Rigetti] launched a project in the mold of Amazon Web Services (AWS) called Quantum Cloud Services. "What this platform achieves for the very first time is an integrated computing system that is the first quantum cloud services architecture," says Chad Rigetti, founder and CEO of his namesake company. The dozen initial users Rigetti has announced include biotech and chemistry companies harnessing quantum technology to study complex molecules in order to develop new drugs. The particular operations that the quantum end of the system can do, while still limited and error-prone, are nearly good enough to boost the performance of traditional computers beyond what they could do on their own -- a coming milestone called quantum advantage. "My guess is this could happen anytime from six to 36 months out," says Rigetti.
My guess is that their investors said that they require results in 6 to 36 months.

There is no chance that this company will have any success before the funding runs out.

Tuesday, September 4, 2018

Vote to rename law to Hubble-Lemaitre Law

Astronomers have long credited Hubble for discovering the expansion of the universe, even tho he had little to do with it.

If they can decide that Pluto is not a planet, then they can correct this error. Now they will vote on it:
Astronomers are engaged in a lively debate over plans to rename one of the laws of physics.

It emerged overnight at the 30th Meeting of the International Astronomical Union (IAU), in Vienna, where members of the general assembly considered a resolution on amending the name of the Hubble Law to the Hubble-Lemaître Law.

The resolution aims to credit the work of the Belgian astronomer Georges Lemaître and his contribution—along with the American astronomer Edwin Hubble — to our understanding of the expansion of the universe.

While most (but not all) members at the meeting were in favor of the resolution, a decision allowed all members of the International Astronomical Union a chance to vote. Subsequently, voting was downgraded to a straw vote and the resolution will formally be voted on by an electronic vote at a later date.
As the article explains, the Belgian Catholic priest published both the theory and the observational evidence for it, before Hubble had a clue. Hubble did later publish some data confirming Lemaitre's paper, as he had a better telescope, but his data were still crude and not really much better.

It is an amusing historical fact that Einstein, Eddington, and other leading cosmologists clung to the idea of a steady-state universe, while a Catholic priest and Vatican astronomers led the way to convincing everyone that the universe had a beginning in what is now called the Big Bang.
But Hubble was not the first. In 1927, Georges Lemaître had already published an article on the expansion of the universe. His article was written in French and published in a Belgian journal.

Lemaître presented a theoretical foundation for the expansion of the universe and used the astronomical data (the very same data that Hubble used in his 1929 article) to infer the rate at which the universe is expanding.

In 1928, the American mathematician and physicist Howard Robertson also published an article in Philosophical Magazine and Journal of Science, where he derived the formula for the expansion of the universe and inferred the rate of expansion from the same data that were used by Lemaître (a year before) and Hubble (a year after). ...

In January 1930 at the meeting of the Royal Astronomical Society in London, the English astronomer, physicist, and mathematician Arthur Eddington raised the problem of the expansion of the universe and the lack of any theory that would satisfactory explain this phenomenon.

When Lemaître found about this, he wrote to Eddington to remind him about his 1927 paper, where he laid theoretical foundation for the expansion of the universe.
It should be called the Lemaitre Law, or maybe the Lemaitre-Robertson Law, if you want to give an American some credit.

Thursday, August 30, 2018

Modifying gravity is called "cheating"

Gizmodo reports:
A fight over the very nature of the universe has turned ugly on social media and in the popular science press, complete with accusations of “cheating” and ad hominem attacks on Twitter. Most of the universe is hiding, and some scientists disagree over where it has gone.

It’s quite literally a story as old as time. Wherever you look in the cosmos, things don’t seem to add up. Our human observations of the universe’s structure—as far back as we can observe—suggest that there’s around five times more mass than we see in the galaxies, stars, dust, planets, brown dwarfs, and black holes that telescopes have observed directly. We call this mystery mass, or the mystery as a whole, “dark matter.”

Several thousand physicists researching these dark matter-related mysteries will tell you that dark matter is a particle, the way that electrons and protons are particles, that only appears to interact with other known particles via the gravitational pull of its mass. But there are a few dozen physicists who instead think that a set of ideas called “modified gravity” might one day explain these mysteries. Modified gravity would do away with the need for dark matter via a tweak to the laws of gravity. ...

Then, in June, the most sensitive dark matter particle-hunting experiment, called XENON, announced it had once again failed to find a dark matter particle. A story titled “Is Dark Matter Real?” followed in the August issue of Scientific American, ...

“It’s only if you ignore all of modern cosmology that the modified gravity alternative looks viable. Selectively ignoring the robust evidence that contradicts you may win you a debate in the eyes of the general public. But in the scientific realm, the evidence has already decided the matter, and 5/6ths of it is dark.”
In other words, it is a cheat to tweak the laws of gravity to accommodate the anomalous galaxy rotation curves, but not a cheat to hypothesize a new particle.

Hoping for a dark matter particle was one of the main reasons for believing in SUSY, as SUSY requires about 100 new particles. Maybe the lightest one is the dark matter particle.

Another little controversy is whether the evidence for dark matter already contradicts the Standard Model. Not necessarily. Wilczek pushes axions as an explanation that I think is consistent with the SM.

Also, the SM only tries to explain strong, weak, and electromagnetic interactions. Dark matter could be some substance that does not interact with those forces, and thus could exist independently from the SM.
In Gizmodo’s conversations with 13 physicists studying dark matter, a pretty clear picture emerged: Dark matter as an undiscovered population of particles that influence the universe through gravity is the prevailing paradigm for a reason, and will continue as such until a theory comes along with the same predictive power for the universe’s grandest features.
It is odd to call the substance a particle. We only call electrons particles because of how they interact with light, but dark matter does not interact with light.
“Everywhere the dark matter theories make predictions, they get the right answers,” Scott Dodelson, a Carnegie Mellon physics professor, told Gizmodo. But he offered a caveat: “They can’t make predictions as well on small scales,” such as the scales of galaxies.
I am surprised that anyone would brag about a theory that only works on scales much larger than galaxies.

Monday, August 27, 2018

Billion new dollars for quantum computation

Peter Woit announces:
Moving through the US Congress is a National Quantum Initiative Act, which would provide over a billion dollars in funding for things related to quantum computation.
A billion dollars?!

IBM and Google both promised quantum supremacy in 2017. We have no announcement of QS, or any explanation for the failure.

I am not the only one saying it is impossible. See this recent Quanta mag article for other prominent naysayers.

If Congress were to have hearings on this funding, I would expect physicists to be extremely reluctant to throw cold water on lucrative funding for their colleagues. Maybe that is what is keeping Scott Aaronson quiet.

Previously Woit commented:
It’s remarkable to see publicly acknowledged by string theorists just how damaging to their subject multiverse mania has been, and rather bizarre to see that they attribute the problem to my book and Lee Smolin’s. The source of the damage is actually different books, the ones promoting the multiverse, for example this one.
This was induced by some string theorists still complaining about those books that appeared in around 2005.

It is bizarre for anyone to be bothered by some criticism from 13 years ago. The two books did not even say the same thing. You would think that the string theorists would just publish their rebuttal and move on.

Apparently they had no rebuttal, and they depended on everyone going along with the fiction that string theory was working.

Likewise, the quantum computation folks depend on everyone going along with the idea that we are about to have quantum computers (with quantum supremacy), and it will be a big technological advance. We don't need two books on the subject, as it is pretty obvious that IBM and Google are not delivering what they promised.

Saturday, August 25, 2018

Professor arrested for pocketing $4 in tips

Quantum computer complexity theorist Scott Aaronson seems to have survived his latest personal struggle, with his worldview intact.

He bought a smoothie, paid with a credit card, and took the $4 in the tip jar. An employee approached him, and politely explained that the tip jar is for tips. He grudgingly gave $1 back.

The manager then called the cops, and a cop interviewed him to confirm what he had done. He was still oblivious to what was going on, so the cop handcuffed him and arrested him. That got his attention, and the manager agreed to drop the charges when the $4 was returned.

There is a biography about physicist Paul Dirac that calls him "the world's strangest man" because of a few silly anecdotes about him being a stereotypical absent-minded professor. That biographer has not met Scott.

Scott says that it was all the fault of the smoothie maker for not clearly explaining to him that he does not get to take change from the tip jar if he pays with a credit card. Scott is correct that there was a failure of communication, and surely both sides are at least somewhat to blame.

I am not posting this to criticize Scott. Just read his blog where he posts enuf negative info about himself. If I wanted to badmouth him, I would just link to his various posts where he has admitted to being wrong about various things. I am inclined to side with him as a fellow nerd who is frustrated by those who fail to explain themselves in a more logical manner. I am just posting it because I think that it is funny. After all, Scott has been named as one of the 30 smartest people alive and also one of the top 10 smartest people. And yet there are people with about 50 fewer IQ points who have no trouble buying smoothies, or understanding a request to put the tip money back.

Monday, August 6, 2018

Copenhagen is rooted in logical positivism

From an AAAS Science mag book review:
Most physicists still frame quantum problems through the sole lens of the so-called “Copenhagen interpretation,” the loose set of assumptions Niels Bohr and his colleagues developed to make sense of the strange quantum phenomena they discovered in the 1920s and 1930s. However, he warns, the apparent success of the Copenhagen interpretation hides profound failures.

The approach of Bohr and his followers, Becker argues, was ultimately rooted in logical positivism, an early-20th-century philosophical movement that attempted to limit science to what is empirically verifiable. By the mid-20th century, philosophers such as Thomas Kuhn and W. V. O. Quine had completely discredited this untenable view of science, Becker continues. The end of logical positivism, he concludes, should have led to the demise of the Copenhagen interpretation. Yet, physicists maintain that it is the only viable approach to quantum mechanics.

As Becker demonstrates, the physics community’s faith in Bohr’s wisdom rapidly transformed into a pervasive censorship that stifled any opposition.
This is partially correct. Quantum mechanics and the Copenhagen Interpretation were rooted in logical positivism. Much of XX century physics was influenced, for the better, by logical positivism and related views.

It is also true that XX century philosophers abandoned logical positivism, for largely stupid reasons. They decided that there was no such thing as truth.

This created a huge split between the scientific world, which searches for truth, and the philosophical world, which contends that there is no such thing as truth. These views are irreconcilable. Science and Philosophy have become like Astronomy and Astrology. Each thinks that the other is so silly that any conversation is pointless.

Unfortunately, many physicists are now infected with anti-positivist views of quantum mechanics, and say that there is something wrong with it. Those physicists complain, but have gotten nowhere with their silly ideas.

Wednesday, August 1, 2018

Einstein's 1905 relativity had no new dogmas

Lubos Motl writes:
Einstein's breakthrough was far deeper, more philosophical than assumed

Relativity is about general, qualitative principles, not about light or particular objects and gadgets

Some days ago, we had interesting discussions about the special theory of relativity, its main message, the way of thinking, the essence of Einstein's genius and his paradigm shift, and the good and bad ways how relativity is presented to the kids and others. ...

Did the physicists before Einstein spend their days by screaming that the simultaneity of events is absolute? They didn't. It was an assumption that they were making all the time. All of science totally depended on it. But it seemed to obvious that they didn't even articulate that they were making this assumption. When they were describing the switch to another inertial system, they needed to use the Galilean transformation and at that moment, it became clear that they were assuming something. But everyone instinctively thought that one shouldn't question such an assumption. No one has even had the idea to question it. And that's why they couldn't find relativity before Einstein.

Einstein has figured out that some of these assumptions were just wrong and he replaced them with "new scientific dogmas".
They did find relativity before Einstein. With all the formulas. In particular, what Einstein said about simultaneity and synchronization of clocks was straight from what Poincare said five years earlier.

Motl repeats the widespread belief that Einstein found relativity by repudiating conventional wisdom and introducing new dogmas. That is not true at all. The most widely accepted theory on the matter was Lorentz's 1895 theory. Lorentz had already received a Nobel prize for it in 1902.

Einstein's big dogmas were that the speed of light is constant and motion is relative. Einstein later admitted that he got the constant speed of light straight from Lorentz. He also got the relativity postulate from Lorentz, although it was Poincare who really emphasized it, so maybe he got it from Poincare.

Einstein did later argue against rival theories, such as Abraham's, but he never claimed that Lorentz or Poincare were wrong about their relativity theories. Other authors referred to the "Lorentz-Einstein theory", as if there were no difference. Even when Einstein was credited with saying something different from Lorentz, he insisted that his theory was the same as Lorentz's.

Einstein did sometimes pretend to have made conceptual advances in his formulation of special relativity, such as with the aether and local time. But what Einstein said on these matters was essentially the same as what Lorentz said many years earlier.

The formulation of special relativity that is accepted today is the geometric spacetime version presented by Poincare and Minkowski, not Einstein's. Poincare and Minkowski did explain how their view was different from Lorentz's.

Monday, July 30, 2018

Was Copernicus really heliocentric?

Wikipedia currently has a debate on whether the Copernican system was heliocentric. See the Talk pages for Nicolaus Copernicus and Copernican heliocentrism.

This is a diversion from the usual Copernicus argument, which is whether he was Polish or German. He is customarily called Polish, but nation-states were not well defined, and there is an argument that he was more German than Polish.

The main point of confusion is that Copernicus did not really put the Sun at the center of the Earth's orbit. It was displaced by 1/25 to 1/31 of the Earth's orbit radius.

It appears that the center of the Earth's orbit revolved around the Sun, but you could also think of the Sun as revolving around the center of the Earth's orbit.

So is it fair to say that the Sun is at the center of the universe? Maybe if you mean that the Sun is near the center of the planetary orbits. Or that the Sun was at the center of the fixed stars. Or that the word "center" is used loosely to contrast with an Earth-centered system.

In Kepler's system, and Newton's, the Sun is not at the center of any orbit, but at a focus of an ellipse. It was later learned that the Sun itself orbits the center of the Milky Way galaxy, which hosts a supermassive black hole.

I am not sure why anyone attaches such great importance to these issues. Motion is relative, and depends on your frame of reference. The Ptolemy and Copernicus models had essentially the same scientific merit and accuracy. They mainly differed in their choice of a frame of reference, and in their dubious arguments for preferring those frames.

People act as if the Copernicus choice of frame was one of the great intellectual advances of all time.

Suppose ancient map makers put East at the top of the page. Then one day a map maker put North at the top of the page. Would we credit him for being a great intellectual hero? Of course not.

There is an argument that the forces are easier to understand if you choose the frame so that the center of mass is stationary. Okay, but that was not really Copernicus's argument. There were ancient Greeks who thought it made more sense to put the Sun at the center because it was so much larger than the Earth. Yes, they very cleverly figured out that the Sun was much larger. There is a good logic to that also, but it is still just a choice of frame.

Friday, July 27, 2018

How physicists discovered the math of gauge theories

We have seen that if you are looking for theories obeying a locality axiom, and if you adopt a geometrical view, you are inevitably led to metric theories like general relativity, and gauge theories like electromagnetism. Those are the simplest theories obeying the axioms.

Formally, metric theories and gauge theories are very similar. Both satisfy the locality and geometry axioms. Both define the fields as the curvature of a connection on a bundle. The metric theories use the tangent bundle, while the gauge theories use an external group. Both use tensor calculus to codify the symmetries.

The Standard Model of particle physics is a gauge theory, with the group being U(1)xSU(2)xSU(3) instead of U(1), and explains the strong, weak, and electromagnetic interactions.

Pure gauge theories predict massless particles like the photon. The Standard Model also has a scalar Higgs field that breaks some of the symmetries and allows particles to have mass.

The curious thing, to me, is why it took so long to figure out that gauge theories were the key to constructing local field theories.
Newton published his theory of gravity in 1687, but was unhappy about the action-at-a-distance. He would have preferred a local field theory, but he could not figure out how to do it. Maxwell figured it out for electromagnetism in 1865, based on experiments of Faraday and others.

Nobody figured out the symmetries of Maxwell’s equations until Lorentz and Poincare concocted theories to explain the Michelson-Morley experiment, in 1892-1905. Poincare showed in 1905 that a relativistic field theory for gravity resolves the action-at-a-distance paradoxes of Newtonian gravity. After Minkowski stressed the importance of the metric, the geometry, and tensor analysis to special relativity in 1908, Nordstrom, Grossmann, Einstein, and Hilbert figured out how to make gravity a local geometrical theory. Hermann Weyl combined gravity and electromagnetism into what he called a “gauge” theory in 1918.

Poincare turned electromagnetism into a geometric theory in 1905. He put a non-Euclidean geometry on spacetime, with its metric and symmetry group, and used the 4-vector potential to prove the covariance of Maxwell’s equations. But he did not notice that his potential was just the connection on a line bundle.

Electromagnetism was shown to be a renormalizable quantum field theory by Feynman, Schwinger and others in the 1940s. 't Hooft showed that all gauge theories were renormalizable in 1971. Only after 1970 did physicists decide that gauge theories were the fundamental key to quantum field theory, and the Standard Model was constructed in the 1970s. They picked SU(3) for the strong interaction because particles had already been found that closely matched representations of SU(3).

It seems to me that all of relativity (special and general) and gauge theory (electromagnetism and the Standard Model) could have been derived mathematically from general principles, with little or no reference to experiment. It could have happened centuries ago.

Perhaps the mathematical sophistication was not there. Characterizing geometries in terms of symmetry transformations and their invariants was described in Klein's Erlangen Program, 1872. Newton did not use a vector notation. Vector notation did not become popular until about 1890. Tensor analysis was developed after that. A modern understanding of manifolds and fiber bundles was not published until about 1950.

Hermann Weyl was a brilliant mathematician who surely had an intuitive understanding of all these things in about 1920. He could have worked out the details, if he had understood how essential they were to modern physics. Why didn’t he?

Even Einstein, who was supposedly the big advocate of deriving physics from first principles, never seems to have noticed that relativity could be derived from geometry and causality. Geometry probably would not have been one of his principles, as he never really accepted that relativity is a geometrical theory.

I am still trying to figure out who was the first to say, in print, that relativity and electromagnetism are the inevitable consequences of the geometrical and locality axioms. This seems to have been obvious in the 1970s. But who said it first?

Is there even a physics textbook today that explains this argument? You can find some of it mentioned in Wikipedia articles, such as Covariant formulation of classical electromagnetism, Maxwell's equations in curved spacetime - Geometric formulation, and Mathematical descriptions of the electromagnetic field - Classical electrodynamics as the curvature of a line bundle. These articles mention that electromagnetic fields can be formulated as the curvature of a line bundle, and that this is an elegant formulation. But they do not explain how general considerations of geometry and locality lead to the formulation.

David Morrison writes in a recent paper:
In the late 1960s and early 1970s, Yang got acquainted with James Simons, ... Simons identified the relevant mathematics as the mathematical theory of connections on fiber bundles ... Simons communicated these newly uncovered connections with physics to Isadore Singer at MIT ... It is likely that similar observations were made independently by others.
I attended Singer’s seminars in the 1970s, so I can confirm that it was a big revelation to him that physicists were constructing a standard model based on connections on fiber bundles, a subject where he was a leading authority. He certainly had the belief that mathematicians and physicists did not know they were studying the same thing under different names, and he knew a lot of those mathematicians and physicists.

String theorists like to believe that useful physical theories can be derived from first principles. Based on the above, I have to say that it is possible, and it could have happened with relativity. But it has never happened in the history of science.

If there were ever an example where a theory might have been developed from first principles, it would be relativity and electromagnetism. But even in that case, it appears that a modern geometric view of the theory was only obtained many decades later.

Wednesday, July 25, 2018

Clifford suggested matter could curve space

A new paper:
Almost half a century before Einstein expounded his general theory of relativity, the English mathematician William Kingdon Clifford argued that space might not be Euclidean and proposed that matter is nothing but a small distortion in that spatial curvature. He further proposed that matter in motion is not more than the simple variation in space of this distortion. In this work, we conjecture that Clifford went further than his aforementioned proposals, as he tried to show that matter effectively curves space. For this purpose he made an unsuccessful observation on the change of the plane of polarization of the skylight during the solar eclipse of December 22, 1870 in Sicily.
I have wondered why some people credit Clifford, and this article spells it out. He was way ahead of everyone with the idea that our physical space might be non-Euclidean, as an explanation for gravity.

Friday, July 20, 2018

Vectors, tensors, and spinors on spacetime

Of the symmetries of spacetime, several of them are arguably not so reasonable when considering locality.

The first is time reversal. That takes time t to −t, and preserves the metric and some of the laws of mechanics. It takes the forward light cone to the backward light cone, and vice versa.

But locality is based on causality, and the past causes the future, not the other way around. There is a logical arrow of time from the past to the future. Most scientific phenomena are irreversible.

The second dubious symmetry is parity reversal. That is a spatial reflection that reverses right-handedness and left-handedness.

DNA is a right-handed helix, and there is no known life using left-handed DNA. Certain weak interactions (involved in radioactive decay) have a handedness preference. The Standard Model explains this by saying all neutrinos are massless with a left-handed helicity. That is, they spin left compared to the velocity direction. (Neutrinos are now thought to have mass.)

Charge conjugation is another dubious symmetry. Most of the laws of electricity will be the same if all positive charges are changed to negative, and all negative to positive.

Curiously, if you combine all three of the above symmetries, you get the CPT symmetry, and it is believed to be a true symmetry of nature. Under some very general assumptions, a quantum field theory with a local Lorentz symmetry must also have a CPT symmetry.

The final dubious symmetry is the rotation by 360°. It is a surprising mathematical fact that a rotation thru 720° is homotopic to the identity, but a rotation thru 360° is not. You can see this yourself by twisting your belt.

Thus the safest symmetry group to use for spacetime is not the Lorentz group, but the double cover of the connected component of the Lorentz group. That is, the spatial reflections and time reversals are excluded, and a rotation by x° is distinguished from a rotation by x+360°. This is called the spin group.
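As a quick numerical sketch of that sign, here is the rotation about the z-axis in the spin-1/2 representation, exp(−iθσz/2) (the standard form; numpy is assumed). A 360° rotation gives minus the identity on spinors, and only 720° returns to the identity.

```python
import numpy as np

def spin_rot(theta):
    # rotation by theta about the z-axis in the spin-1/2 representation:
    # exp(-i*theta*sigma_z/2), which is diagonal with entries exp(-i*theta/2), exp(+i*theta/2)
    return np.diag(np.exp(-0.5j * theta * np.array([1.0, -1.0])))

assert np.allclose(spin_rot(2*np.pi), -np.eye(2))  # 360 degrees: minus the identity
assert np.allclose(spin_rot(4*np.pi),  np.eye(2))  # 720 degrees: back to the identity
```

The ordinary rotation of vectors by 360° is of course the identity; the −1 appears only in the double-cover representation, which is why spinors see a difference that vectors do not.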

Geometric theories on spacetime are formulated in terms of vectors and tensors. Vectors and tensors are covariant, so that they automatically transform under a change of coordinates. Alternatively, you can say that spacetime is a geometric coordinate-free manifold with a Lorentz group symmetry, and vectors and tensors are well-defined on that manifold.

If we consider spacetime with a spin group symmetry instead of the Lorentz group, then we get a new class of vector-like functions called spinors. Physicists sometimes think of a spinor as a square root of a vector, as you can multiply two spinors and get a vector.

The basic ingredients for a theory satisfying the geometry and locality axioms are thus: a spacetime manifold, a spin group structure on the tangent bundle, a separate fiber bundle, and connections on those bundles. The fields and other physical variables will be spinors and sections.

This is the essence of the Standard Model. Quarks, electrons, and neutrinos are all spinor fields. They have spin 1/2, which means they have 720° rotational symmetry, and not a 360° rotation symmetry. Such particles are also called fermions. The neutrinos are left-handed, meaning they have no spatial reflection symmetry. The Lagrangian defining the theory is covariant under the spin group, and hence well-defined on spacetime.

Under quantum mechanics, all particles with the same quantum numbers are identical, and a permutation of those identical particles is a symmetry of the system. Swapping two identical fermions introduces a factor of −1, just like rotating by 360°. That is what separates identical fermions, and keeps them from occupying the same state. This is called the Pauli exclusion principle, and it is the fundamental reason why fermions can be the building blocks of matter, and form stable objects.

Pauli exclusion for fermions is a mathematical consequence of having a Lorentz invariant quantum field theory.
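A small sketch of that sign rule, using two hypothetical orbitals sampled on a grid: the antisymmetrized two-particle state flips sign when the particles are swapped, and vanishes identically if both particles are put in the same orbital.

```python
import numpy as np

# two hypothetical single-particle orbitals, sampled on a grid (for illustration only)
x = np.linspace(-1.0, 1.0, 5)
phi_a = np.exp(-x**2)
phi_b = x * np.exp(-x**2)

# antisymmetrized two-fermion state: psi(x1,x2) = phi_a(x1)phi_b(x2) - phi_b(x1)phi_a(x2)
psi = np.outer(phi_a, phi_b) - np.outer(phi_b, phi_a)
assert np.allclose(psi, -psi.T)   # swapping the two particles gives a factor of -1

# the same construction with identical orbitals vanishes: Pauli exclusion
psi_same = np.outer(phi_a, phi_a) - np.outer(phi_a, phi_a)
assert np.allclose(psi_same, 0)
```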

All particles are either fermions or bosons. The photon is a boson, and has spin 1. A laser can have millions of photons in the same state, and they cannot be used to build a material substance.

The point here is that under general axioms of geometry and locality, one can plausibly deduce that spinor and gauge theories are the obvious candidates for fundamental physical theories. The Standard Model then seems pretty reasonable, and is actually rather simple and elegant compared to the alternatives.

In my counterfactual anti-positivist history of physics, it seems possible that someone could have derived models similar to the Standard Model from purely abstract principles, and some modern mathematics, but with no experiments.

Wednesday, July 18, 2018

Electric charge and local fields

Studying electric charge requires mathematical structures distinct from spacetime. Charge is conserved, but that conservation law is not related to any transformation of space or time. To measure electric fields, we need functions that measure something other than space and time.

The simplest type of such function would be from spacetime to the complex numbers. But such a function has an immediate problem with the locality axiom. A function like that would allow comparing values at spatially separated points, but no such comparison should be possible. It doesn't make sense to relate physics at one spacetime point to anything outside its light cones.

So we need to have a way of defining functions on spacetime, such that function values can only be compared infinitesimally, or along causal curves. The mathematical construction for this is called a line bundle.

For electromagnetism, imagine that every point in spacetime has a phase, a complex number with norm one. The phase does not mean anything by itself, but the change in phase along a causal curve is meaningful, and is the key to describing the propagation of an electromagnetic field.

A bundle connection is, by definition, the mathematical info for relating the phases along a curve. It is usually given in infinitesimal form, so the total phase change along the curve is given by integrating along the curve.

It turns out that a bundle can be curved, just as spacetime can be curved. The curvature is a direct measure of how the phase can change around a closed loop.
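A minimal numerical illustration of that last statement, assuming a connection 1-form A = (−By/2, Bx/2) whose curvature is the constant B everywhere (this is just a convenient example): the phase accumulated around a closed loop equals the curvature times the enclosed area.

```python
import numpy as np

B = 0.3  # constant curvature (a uniform field strength)

def A(x, y):
    # an assumed connection 1-form (vector potential) with curvature B everywhere
    return -B*y/2, B*x/2

# integrate the connection counterclockwise around the unit square
corners = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
phase = 0.0
for (x0, y0), (x1, y1) in zip(corners, corners[1:]):
    t = np.linspace(0.0, 1.0, 2001)
    ax, ay = A(x0 + (x1 - x0)*t, y0 + (y1 - y0)*t)
    f = ax*(x1 - x0) + ay*(y1 - y0)                     # A . dr/dt along the edge
    phase += (f[:-1] + f[1:]).sum()/2 * (t[1] - t[0])   # trapezoid rule

# phase change around the loop = curvature times enclosed area = B * 1
assert np.isclose(phase, B)
```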

You cannot make a flat map of the surface of the Earth, without distorting distances, and you cannot make a flat map of a curved bundle either. You can choose some function (called a bundle section) that defines a preferred phase at each point. Then any other section is this one multiplied by some complex function on spacetime. Thus any section can be written fs, where f is an ordinary complex function on spacetime, and s is a special section that gives a phase at each point. You can think of the phase as a norm-1 complex value, but that is misleading because the values at one point cannot be directly compared to the values at other points.

For derivatives, we expect a product formula like d(fs) = f ds + s df.

The difficulty is in interpreting ds, as a derivative requires comparing values to nearby points. A bundle connection is what allows infinitesimal comparisons, so in place of ds there is some ω for which we have a formula: d(fs) = f ω + s df.

The ω is what mathematicians call a connection 1-form, and what physicists call a 4-vector potential. It is not necessarily the derivative of a function. Its derivative, dω, is what mathematicians call the curvature 2-form, and what physicists call the electromagnetic field. It does not depend on the choice of the section s.

Maxwell’s equations reduce to interpreting the derivative of the curvature as the charge/current density.

The whole theory of electromagnetism is thus the inevitable consequence of looking for a geometric theory obeying locality.

One can also construct field theories with higher dimensional bundles, replacing the complex numbers C with Cn for some n, and the phase with U(n), the group of nxn unitary matrices. Other groups are possible also, such as SU(n), the group of such matrices with determinant 1. The 1-form ω has values in the Lie algebra, which is the skew-Hermitian complex matrices, if the group is U(n).
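As a sketch of why the skew-Hermitian matrices are the right Lie algebra: exponentiating −iH for a Hermitian H (done below by eigendecomposition, since numpy has no matrix exponential) always produces a unitary matrix, i.e. an element of U(n).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n)) + 1j*rng.standard_normal((n, n))
H = (M + M.conj().T) / 2          # a Hermitian matrix, so -i*H is skew-Hermitian

# exponentiate the skew-Hermitian matrix -i*H via the eigendecomposition of H
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j*w)) @ V.conj().T

assert np.allclose(U.conj().T @ U, np.eye(n))   # the result is unitary
```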

When a field theory is quantized, notorious infinities arise. For field theories based on connections on bundles as above, there is a theory of renormalization to cancel the infinities. It is a generalization of the system for quantum electrodynamics. Other theories are difficult or impossible to renormalize. Physicists settled on gauge theories because they are the only ones to give finite predictions in a quantum theory.

Historical note. This formulation of Maxwell’s equations appears to have been known to Hermann Weyl and others by 1920 or so, but it may not have been explicitly published until 1970 or so.

Friday, July 13, 2018

Infinitesimal analysis of geometry and locality

The geometric axiom would appear to be in conflict with the locality axiom. The geometries are defined by symmetry transformations that relate points to distant points. Under locality, points only relate to nearby points, not distant points.

The resolution of this conflict is that the geometric operations actually act infinitesimally. The Lorentz transformations do not actually act on spacetime, but on the tangent space at a given point.

Here “infinitesimal” is a mathematical shorthand for the limits used in computing derivatives and tangents. Locality allows a point to be related to an infinitesimally close point in its light cone, and that is really a statement about the derivatives at the point.

In general relativity, matter curves spacetime, and spacetime does not necessarily have rotational or other symmetries. But locally, to first order in infinitesimals, a point looks like the Minkowski space described earlier. That is, the tangent space is linear with the metric −dx² − dy² − dz² + c²dt², in suitable coordinates.

A causal path in a curved spacetime is one that is within the light cones at every point. As the light cones are in the tangent space, this is a statement about the tangents to the curve. In other words, the velocity along the path cannot exceed the speed of light.

There is a mathematical theory for curved spaces. A manifold has a tangent space at each point, and if it also has a metric on that, then one can take gradients of functions to get vector fields, find the shortest distance between points, compare vectors along curves, and calculate curvature. All of these things can be defined independently of coordinates on the manifold.

Spacetime (Riemann) curvature decomposes as the Ricci plus Weyl tensors. Technically, the Ricci tensor splits into the trace and trace-free parts, but that subtlety can be ignored for now. The Ricci tensor is a geometric measure of the presence of matter, and is zero in empty space.

There are whole textbooks explaining the mathematics of Riemann curvature, so I am not going to detail it here. It suffices to say that if you want a space that looks locally like Minkowski space, then it is locally described by the metric and Riemann curvature tensor.

The equations of general relativity equate the Ricci tensor to the mass density. More precisely, it is a tensor involving density, pressure, and stress. In particular, solving the dynamics of the solar system means using the equation Ricci = 0, as the sun and planets can be considered point masses in empty space. Einstein's calculation of the precession of Mercury's orbit was based on studying spacetime with Ricci = 0.

Next we look at electric charges.

Wednesday, July 11, 2018

Bee finds free will in reductionism failure

Sabine Hossenfelder writes on the Limits of Reductionism:
Almost forgot to mention I made it 3rd prize in the 2018 FQXi essay contest “What is fundamental?”

The new essay continues my thoughts about whether free will is or isn’t compatible with what we know about the laws of nature. For many years I was convinced that the only way to make free will compatible with physics is to adopt a meaningless definition of free will. The current status is that I cannot exclude it’s compatible.

The conflict between physics and free will is that to our best current knowledge everything in the universe is made of a few dozen particles (take or give some more for dark matter) and we know the laws that determine those particles’ behavior. They all work the same way: If you know the state of the universe at one time, you can use the laws to calculate the state of the universe at all other times. This implies that what you do tomorrow is already encoded in the state of the universe today. There is, hence, nothing free about your behavior.
I don't know how she can get a Physics PhD and write stuff like that.

The universe is not really made of particles. The Heisenberg Uncertainty Principle shows that there are no particles. There are no laws that determine trajectories of particles. We have theories for how quantum fields evolve, and even predict bubble chamber tracks like the ones on the side of this blog. But it is just not true that the state tomorrow is encoded in the state today.

Look at radioactive decay. We have theories that might predict half-life or properties of emissions and other such things, but we cannot say when the decay occurs. For all we know, the atom has a mind of its own and decays when it feels like decaying.
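A toy simulation of this point, with an assumed half-life of 10 time units: the distribution of decay times is sharply predicted by the theory, but individual decay times scatter all over.

```python
import numpy as np

rng = np.random.default_rng(42)
half_life = 10.0

# individual decay times are exponentially distributed; the theory fixes only
# the distribution (median = half-life), not any particular decay time
times = rng.exponential(scale=half_life/np.log(2), size=200_000)

# the half-life (median decay time) is predictable to high accuracy...
assert abs(np.median(times) - half_life) / half_life < 0.02
# ...but single decays are wildly uncertain: the spread exceeds the half-life itself
assert times.std() > half_life
```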

The known laws of physics are simply not deterministic in the way that she describes.

She goes on to argue that the world might still be indeterministic because of some unknown failure of reductionism.

In a way, chaos theory is such a failure of reductionism, because deterministic laws give rise to indeterministic physics. But she denies that sort of failure.

What she fails to grasp is that quantum mechanics is already compatible with (libertarian) free will.

The advocates of many-worlds (MWI) are nuts, but at least they concede that quantum mechanics does not determine your future (in this world). It makes for a range of future possibilities, and your choices will affect which of those possibilities you will get.

Of course the MWI advocates believe that every time you make a choice, you create an evil twin who gets the world with the opposite choice. That belief has no scientific substance to it. But the first part, saying that quantum mechanics allows free choices, is correct.

The other FQXI contest winners also have dubious physics, and perhaps I will comment on other essays.

Friday, July 6, 2018

All physical processes are local

After the geometry axiom, here is my next axiom.

Locality axiom: All physical processes are local.

Locality means that physical processes are only affected by nearby processes. It means that there is no action-at-a-distance. It means that your personality is not determined by your astrological sign. It means that the Moon cannot cause the tides unless some force or energy is transmitted from the Moon to the Earth at some finite speed.

Maxwell's electromagnetism theory of 1861 was local because the effects are transmitted by local fields at the speed of light. Newton's gravity was not. The concept of locality is closely related to the concept of causality. Both require a careful understanding of time. Quantum mechanics is local, if properly interpreted.

Time is very different from space. You can go back and forth in any spatial direction, but you can only go forward in time. If you want to go from position (0,0,0) to (1,1,1), you can go first in the x-direction, and then in the y-direction, or go in any order of directions that you please. Symmetries of Euclidean space guarantee these alternatives. But if you want to go from (0,0,0) at time t=0 to (1,1,1) at t=1, then your options are limited. Locality prohibits you from visiting two spatially distinct points at the same time. Time must always be increasing along your path.

Locality demands that there is some maximum speed for getting from one point to another. Call that speed c, as it will turn out to be the (c for constant) speed of light. No signal or any other causation can go faster than c.

Thus a physical event at a particular point and time can only influence events that are within a radius ct at a time t later. The origin (0,0,0) at t=0 can only influence future events (x,y,z,t) if x²+y²+z² ≤ c²t², t > 0. This is called the forward light cone. Likewise, the origin can only be influenced by past events in the backward light cone, x²+y²+z² ≤ c²t², t < 0. Anything outside those cones is essentially nonexistent to someone at the origin. Such is the causal structure of spacetime.
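In units where c = 1, the forward-cone condition is a one-line test:

```python
import numpy as np

c = 1.0  # work in units where the speed of light is 1

def in_forward_cone(x, y, z, t):
    # events the origin can influence satisfy x^2+y^2+z^2 <= c^2 t^2 with t > 0
    return t > 0 and x*x + y*y + z*z <= (c*t)**2

assert in_forward_cone(0.5, 0, 0, 1.0)        # reachable at sub-light speed
assert not in_forward_cone(2.0, 0, 0, 1.0)    # would require faster-than-light travel
assert not in_forward_cone(0.5, 0, 0, -1.0)   # lies in the past, not the future
```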

Since the world is geometrical, there should be some geometry underlying this causal structure. The geometry is not Euclidean. The simplest such geometry is to assume a metric where distance squared equals −dx² − dy² − dz² + c²dt². This will be positive inside the light cones. It does not make sense to relate to something outside the light cones anyway. Four-dimensional space with this non-Euclidean geometry is called Minkowski space.

The symmetries of spacetime with this metric may be found by an analogy to Euclidean space. Translations and reflections preserve the metric, just as with Euclidean space. A spatial rotation is a symmetry: (x,y,z,t) → (x cos u − y sin u, x sin u + y cos u, z, t). We can formally find additional symmetries by assuming that time is imaginary, and the metric is a Euclidean one on (x,y,z,ict). Poincare used this trick for an alternate derivation of the Lorentz transformations in 1905. Rotating by an imaginary angle iu, and converting the trigonometric functions to hyperbolic ones:
(x,y,z,ict) → (x, y, z cos iu − ict sin iu, z sin iu + ict cos iu)
    = (x, y, z cosh u + ct sinh u, iz sinh u + ict cosh u)
This is a Lorentz (boost) transformation with rapidity u, or with velocity v = c tanh u. The more familiar Lorentz factor is cosh u = 1/√(1 − v²/c²).
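A quick numerical check of these identities, in units where c = 1: the rapidity is u = artanh(v/c), cosh u reproduces the Lorentz factor, and since rapidities simply add, they reproduce the relativistic velocity-addition formula.

```python
import numpy as np

c = 1.0
v = 0.6 * c
u = np.arctanh(v / c)                        # the rapidity of a boost with velocity v
gamma = 1.0 / np.sqrt(1 - v**2/c**2)
assert np.isclose(np.cosh(u), gamma)         # cosh u is the Lorentz factor

# rapidities add, so velocities compose by the relativistic addition formula
v1, v2 = 0.5*c, 0.7*c
u12 = np.arctanh(v1/c) + np.arctanh(v2/c)
assert np.isclose(c*np.tanh(u12), (v1 + v2)/(1 + v1*v2/c**2))
```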

Imaginary time seems like just a sneaky math trick with no physical consequences, but there are some. If a physical theory has functions that are analytic in spacetime variables, then they can be continued to imaginary time, and sometimes this has consequences for real time.

Thus locality has far-reaching consequences. From the principle, it appears that the world must be something like Minkowski space, with Lorentz transformations.

Next, we will combine our two axioms.

Tuesday, July 3, 2018

The world is geometrical

In my anti-positivist counterfactual history, here is the first axiom.

Geometry axiom: The world is geometrical.

The most familiar geometry is Euclidean geometry, where the world is R³ and distance squared is given by dx² + dy² + dz². There are other geometries.

Newtonian mechanics is a geometrical theory. Space is represented by R³, with Euclidean metric. That is a geometrical object because of the linear structure and the metric. Lines can be defined as the shortest distance between points. Planes, circles, triangles, and other geometric objects can be defined.

It is also a geometrical object because there is a large class of transformations that preserve the metric, and hence also the lines and circles determined by the metric. Those transformations are the rotations, reflections, and translations. For example, the transformation (x,y,z) → (x+5,y,z) preserves distances, and takes lines to lines and circles to circles.
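A small sketch of such a symmetry: a random rotation (obtained here from a QR decomposition) composed with the translation (x,y,z) → (x+5,y,z) preserves the distance between any two points.

```python
import numpy as np

rng = np.random.default_rng(0)

# a random rotation matrix, via QR decomposition of a random matrix
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1            # flip a column to make it a proper rotation (det +1)

t = np.array([5.0, 0.0, 0.0])   # the translation (x,y,z) -> (x+5,y,z)
T = lambda v: Q @ v + t          # rotation followed by translation

p, q = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose(np.linalg.norm(T(p) - T(q)), np.linalg.norm(p - q))
```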

Mathematically, a geometry can be defined in terms of some structure like a metric, or the transformations preserving that structure. This view has been known as the Klein Erlangen program since 1872.

The laws of classical mechanics can be written as geometrical equations like F=ma, where F is the force vector, a is the acceleration vector, and m is the mass. All are functions on Euclidean space. What makes F=ma geometrical is not just that it is defined on a geometrical space, or that vectors are used. The formula is geometrical because all quantities are covariant under the relevant transformations.

Classical mechanics does not specify where you put your coordinate origin (0,0,0), or how the axes are oriented. You can make any choice, and then apply one of the symmetries of Euclidean space. Formulas like F=ma will look the same, and so will physical computations. You can even do a change of coordinates that does not preserve the Euclidean structure, and covariance will automatically dictate the expression of the formula in those new coordinates.

One can think of Euclidean space and F=ma abstractly, where the space has no preferred coordinates, and F=ma is defined on that abstract space. Saying that F=ma is covariant with respect to a change of coordinates is exactly the same as saying F=ma is well-defined on a coordinate-free Euclidean space.

The strongly geometrical character of classical mechanics was confirmed by a theorem of Noether that symmetries are essentially the same as conservation laws. Momentum is conserved because spacetime has a spatial translational symmetry, energy is conserved because of time translation symmetry, and angular momentum is conserved because of rotational symmetry.
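A toy illustration of the momentum case, assuming a spring-like force that depends only on the separation of two bodies (and is therefore translation-invariant): integrating the motion numerically keeps the total momentum constant.

```python
import numpy as np

def spring_force(r1, r2, k=2.0):
    # force on body 1 from a potential depending only on r1 - r2 (translation-invariant);
    # body 2 feels the equal and opposite force
    return -k * (r1 - r2)

r1, r2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
v1, v2 = np.array([0.0, 0.3]), np.array([0.0, -0.1])
m1, m2 = 1.0, 3.0
dt = 1e-3

p0 = m1*v1 + m2*v2               # initial total momentum
for _ in range(10_000):          # symplectic Euler integration
    f = spring_force(r1, r2)
    v1 = v1 + (f/m1)*dt
    v2 = v2 - (f/m2)*dt
    r1 = r1 + v1*dt
    r2 = r2 + v2*dt

assert np.allclose(m1*v1 + m2*v2, p0)   # total momentum is unchanged
```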

Next, we look at geometries that make physical sense.