Friday, July 13, 2018

Infinitesimal analysis of geometry and locality

The geometric axiom would appear to be in conflict with the locality axiom. The geometries are defined by symmetry transformations that relate points to distant points. Under locality, points only relate to nearby points, not distant points.

The resolution of this conflict is that the geometric operations actually act infinitesimally. The Lorentz transformations do not actually act on spacetime, but on the tangent space at a given point.

Here “infinitesimal” is a mathematical shorthand for the limits used in computing derivatives and tangents. Locality allows a point to be related to an infinitesimally close point in its light cone, and that is really a statement about the derivatives at the point.

In general relativity, matter curves spacetime, and spacetime does not necessarily have rotational or other symmetries. But locally, to first order in infinitesimals, a point looks like the Minkowski space described earlier. That is, the tangent space is linear with the metric −dx² − dy² − dz² + c²dt², in suitable coordinates.

A causal path in a curved spacetime is one that is within the light cones at every point. As the light cones are in the tangent space, this is a statement about the tangents to the curve. In other words, the velocity along the path cannot exceed the speed of light.

There is a mathematical theory for curved spaces. A manifold has a tangent space at each point, and if it also has a metric on those tangent spaces, then one can take gradients of functions to get vector fields, find the shortest distance between points, compare vectors along curves, and calculate curvature. All of these things can be defined independently of coordinates on the manifold.

Spacetime (Riemann) curvature decomposes into the Ricci and Weyl tensors. Technically, the Ricci tensor splits into the trace and trace-free parts, but that subtlety can be ignored for now. The Ricci tensor is a geometric measure of the presence of matter, and is zero in empty space.

There are whole textbooks explaining the mathematics of Riemann curvature, so I am not going to detail it here. It suffices to say that if you want a space that looks locally like Minkowski space, then it is locally described by the metric and Riemann curvature tensor.

The equations of general relativity say that the Ricci tensor is determined by the mass density. More precisely, by a tensor involving density, pressure, and stress. In particular, solving the dynamics of the solar system means using the equation Ricci = 0, as the sun and planets can be considered point masses in empty space. Einstein's calculation of the precession of Mercury's orbit was based on studying spacetime with Ricci = 0.
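For reference, the field equations can be sketched in the usual notation (not spelled out above): the Ricci tensor and its trace on one side, the stress-energy tensor of density, pressure, and stress on the other; setting the source to zero gives the vacuum equation used for the solar system.

```latex
R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu},
\qquad T_{\mu\nu} = 0 \;\Longrightarrow\; R_{\mu\nu} = 0 .
```

Taking the trace of the vacuum equation forces R = 0, so the vanishing of the full left side is equivalent to Ricci = 0, the equation used for Mercury.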

Next we look at electric charges.

Wednesday, July 11, 2018

Bee finds free will in reductionism failure

Sabine Hossenfelder writes on the Limits of Reductionism:
Almost forgot to mention I made it 3rd prize in the 2018 FQXi essay contest “What is fundamental?”

The new essay continues my thoughts about whether free will is or isn’t compatible with what we know about the laws of nature. For many years I was convinced that the only way to make free will compatible with physics is to adopt a meaningless definition of free will. The current status is that I cannot exclude it’s compatible.

The conflict between physics and free will is that to our best current knowledge everything in the universe is made of a few dozen particles (take or give some more for dark matter) and we know the laws that determine those particles’ behavior. They all work the same way: If you know the state of the universe at one time, you can use the laws to calculate the state of the universe at all other times. This implies that what you do tomorrow is already encoded in the state of the universe today. There is, hence, nothing free about your behavior.
I don't know how she can get a Physics PhD and write stuff like that.

The universe is not really made of particles. The Heisenberg Uncertainty Principle shows that there are no particles. There are no laws that determine trajectories of particles. We have theories for how quantum fields evolve, and they even predict bubble chamber tracks like the ones on the side of this blog. But it is just not true that the state tomorrow is encoded in the state today.

Look at radioactive decay. We have theories that might predict half-life or properties of emissions and other such things, but we cannot say when the decay occurs. For all we know, the atom has a mind of its own and decays when it feels like decaying.
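As a sketch of this point (my illustration, not from any particular theory): the half-life fixes the statistical distribution of decay times, but sampling from it shows that each individual decay instant is left undetermined.

```python
import math
import random

def sample_decay_times(half_life, n, seed=0):
    """Draw n individual decay times from the exponential law.

    The half-life fixes the distribution; the individual times are random.
    """
    rng = random.Random(seed)
    rate = math.log(2) / half_life  # decay constant lambda
    return [rng.expovariate(rate) for _ in range(n)]

times = sample_decay_times(half_life=10.0, n=100_000)
mean = sum(times) / len(times)
# The mean lifetime is half_life / ln 2; any single decay time is unpredictable.
print(round(mean, 1))  # close to 10 / ln 2 ≈ 14.4
```

The theory predicts the ensemble statistics exactly, and says nothing about when any one atom decays.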

The known laws of physics are simply not deterministic in the way that she describes.

She goes on to argue that the world might still be indeterministic because of some unknown failure of reductionism.

In a way, chaos theory is such a failure of reductionism, because deterministic laws give rise to effectively indeterministic physics. But she denies that sort of failure.
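A standard illustration of that point (mine, not Hossenfelder's): the logistic map is a fully deterministic law, yet two starting points differing by 10⁻¹⁰ soon decorrelate, so the outcome is unpredictable in practice.

```python
def logistic_orbit(x0, steps, r=4.0):
    """Iterate the deterministic logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.3, 50)
b = logistic_orbit(0.3 + 1e-10, 50)  # same law, tiny change in initial data
# Early on the orbits agree; after a few dozen steps they are uncorrelated.
early_gap = abs(a[10] - b[10])
late_gap = max(abs(a[i] - b[i]) for i in range(35, 51))
print(early_gap < 1e-3, late_gap > 0.01)
```

Determinism in the law does not give determinism in any practical sense, since no measurement pins down the initial condition to infinite precision.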

What she fails to grasp is that quantum mechanics is already compatible with (libertarian) free will.

The advocates of many-worlds (MWI) are nuts, but at least they concede that quantum mechanics does not determine your future (in this world). It makes for a range of future possibilities, and your choices will affect which of those possibilities you will get.

Of course the MWI advocates believe that every time you make a choice, you create an evil twin who gets the world with the opposite choice. That belief has no scientific substance to it. But the first part, saying that quantum mechanics allows free choices, is correct.

The other FQXI contest winners also have dubious physics, and perhaps I will comment on other essays.

Friday, July 6, 2018

All physical processes are local

After the geometry axiom, here is my next axiom.

Locality axiom: All physical processes are local.

Locality means that physical processes are only affected by nearby processes. It means that there is no action-at-a-distance. It means that your personality is not determined by your astrological sign. It means that the Moon cannot cause the tides unless some force or energy is transmitted from the Moon to the Earth at some finite speed.

Maxwell's electromagnetism theory of 1861 was local because the effects are transmitted by local fields at the speed of light. Newton's gravity was not. The concept of locality is closely related to the concept of causality. Both require a careful understanding of time. Quantum mechanics is local, if properly interpreted.

Time is very different from space. You can go back and forth in any spatial direction, but you can only go forward in time. If you want to go from position (0,0,0) to (1,1,1), you can go first in the x-direction, and then in the y-direction, or go in any order of directions that you please. Symmetries of Euclidean space guarantee these alternatives. But if you want to go from (0,0,0) at time t=0 to (1,1,1) at t=1, then your options are limited. Locality prohibits you from visiting two spatially distinct points at the same time. Time must always be increasing along your path.

Locality demands that there is some maximum speed for getting from one point to another. Call that speed c, as it will turn out to be the (c for constant) speed of light. No signal or any other causation can go faster than c.

Thus a physical event at a particular point and time can only influence events that are within a radius ct at a time t later. The origin (0,0,0) at t=0 can only influence future events (x,y,z,t) if x² + y² + z² ≤ c²t², t > 0. This is called the forward light cone. Likewise, the origin can only be influenced by past events in the backward light cone, x² + y² + z² ≤ c²t², t < 0. Anything outside those cones is essentially nonexistent to someone at the origin. Such is the causal structure of spacetime.
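The light-cone condition is simple enough to state as code. A minimal sketch, in units chosen so that c = 1:

```python
def in_forward_cone(x, y, z, t, c=1.0):
    """True if the event (x,y,z,t) can be influenced by the origin."""
    return t > 0 and x*x + y*y + z*z <= (c*t)**2

def in_backward_cone(x, y, z, t, c=1.0):
    """True if the event (x,y,z,t) can influence the origin."""
    return t < 0 and x*x + y*y + z*z <= (c*t)**2

print(in_forward_cone(0.5, 0.0, 0.0, 1.0))   # reachable by light: True
print(in_forward_cone(2.0, 0.0, 0.0, 1.0))   # would need speed 2c: False
print(in_backward_cone(0.0, 0.0, 0.5, -1.0)) # in the past light cone: True
```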

Since the world is geometrical, there should be some geometry underlying this causal structure. The geometry is not Euclidean. The simplest such geometry is to assume a metric where distance squared equals −dx² − dy² − dz² + c²dt². This will be positive inside the light cones. It does not make sense to relate to anything outside the light cones anyway. Four-dimensional space with this non-Euclidean geometry is called Minkowski space.

The symmetries of spacetime with this metric may be found by an analogy to Euclidean space. Translations and reflections preserve the metric, just as with Euclidean space. A spatial rotation is a symmetry: (x,y,z,t) → (x cos u − y sin u, x sin u + y cos u, z, t). We can formally find additional symmetries by assuming that time is imaginary, and the metric is a Euclidean one on (x,y,z,ict). Poincare used this trick for an alternate derivation of the Lorentz transformations in 1905. Rotating by an imaginary angle iu, and converting the trigonometric functions to hyperbolic ones:
(x,y,z,ict) → (x, y, z cos iu − ict sin iu, z sin iu + ict cos iu)
    = (x, y, z cosh u + ct sinh u, iz sinh u + ict cosh u)
This is a Lorentz (boost) transformation with rapidity u, or with velocity v = c tanh u. The more familiar Lorentz factor is cosh u = 1/√(1 − v²/c²).
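This can be checked numerically: the boost mixes z and ct hyperbolically, preserves the Minkowski interval, and gives the Lorentz factor as cosh u. A sketch in units with c = 1:

```python
import math

def boost_z(x, y, z, t, u):
    """Lorentz boost along z with rapidity u (units where c = 1)."""
    return (x, y,
            z*math.cosh(u) + t*math.sinh(u),
            z*math.sinh(u) + t*math.cosh(u))

def interval2(x, y, z, t):
    """Squared Minkowski interval -x^2 - y^2 - z^2 + t^2."""
    return -x*x - y*y - z*z + t*t

u = 0.7
event = (1.0, 2.0, 3.0, 5.0)
boosted = boost_z(*event, u)
print(abs(interval2(*event) - interval2(*boosted)) < 1e-9)  # interval preserved: True

v = math.tanh(u)                    # velocity of the boost
gamma = 1.0 / math.sqrt(1.0 - v*v)  # familiar Lorentz factor
print(abs(gamma - math.cosh(u)) < 1e-12)  # gamma = cosh u: True
```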

Imaginary time seems like just a sneaky math trick with no physical consequences, but there are some. If a physical theory has functions that are analytic in spacetime variables, then they can be continued to imaginary time, and sometimes this has consequences for real time.

Thus locality has far-reaching consequences. From the principle, it appears that the world must be something like Minkowski space, with Lorentz transformations.

Next, we will combine our two axioms.

Tuesday, July 3, 2018

The world is geometrical

In my anti-positivist counterfactual history, here is the first axiom.

Geometry axiom: The world is geometrical.

The most familiar geometry is Euclidean geometry, where the world is R³ and distance squared is given by dx² + dy² + dz². There are other geometries.

Newtonian mechanics is a geometrical theory. Space is represented by R³, with the Euclidean metric. That is a geometrical object because of the linear structure and the metric. Lines can be defined as the shortest distance between points. Planes, circles, triangles, and other geometric objects can be defined.

It is also a geometrical object because there is a large class of transformations that preserve the metric, and hence also the lines and circles determined by the metric. Those transformations are the rotations, reflections, and translations. For example, the transformation (x,y,z) → (x+5,y,z) preserves distances, and takes lines to lines and circles to circles.
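A minimal numerical check of this point: the translation (x,y,z) → (x+5,y,z) and a rotation about the z-axis both preserve distances.

```python
import math

def dist(p, q):
    """Euclidean distance in R^3."""
    return math.sqrt(sum((a - b)**2 for a, b in zip(p, q)))

def translate_x5(p):
    """The translation (x,y,z) -> (x+5,y,z)."""
    x, y, z = p
    return (x + 5, y, z)

def rotate_z(p, angle):
    """Rotation about the z-axis by the given angle."""
    x, y, z = p
    return (x*math.cos(angle) - y*math.sin(angle),
            x*math.sin(angle) + y*math.cos(angle), z)

p, q = (1.0, 2.0, 3.0), (4.0, 0.0, -1.0)
d = dist(p, q)
print(abs(dist(translate_x5(p), translate_x5(q)) - d) < 1e-12)  # True
print(abs(dist(rotate_z(p, 0.8), rotate_z(q, 0.8)) - d) < 1e-12)  # True
```

Since lines and circles are determined by the metric, these maps take lines to lines and circles to circles.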

Mathematically, a geometry can be defined in terms of some structure like a metric, or the transformations preserving that structure. This view has been known as the Klein Erlangen program since 1872.

The laws of classical mechanics can be written as geometrical equations like F=ma, where F is the force vector, a is the acceleration vector, and m is the mass. All are functions on Euclidean space. What makes F=ma geometrical is not just that it is defined on a geometrical space, or that vectors are used. The formula is geometrical because all quantities are covariant under the relevant transformations.

Classical mechanics does not specify where you put your coordinate origin (0,0,0), or how the axes are oriented. You can make any choice, and then apply one of the symmetries of Euclidean space. Formulas like F=ma will look the same, and so will physical computations. You can even do a change of coordinates that does not preserve the Euclidean structure, and covariance will automatically dictate the expression of the formula in those new coordinates.
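A sketch of what covariance means here: apply the same rotation to F and a, and F = ma holds in the new coordinates whenever it held in the old, because the rotation is linear.

```python
import math

def rotate_z(v, angle):
    """Rotate a vector about the z-axis; a symmetry of Euclidean space."""
    x, y, z = v
    return (x*math.cos(angle) - y*math.sin(angle),
            x*math.sin(angle) + y*math.cos(angle), z)

m = 2.0
a = (1.0, -3.0, 0.5)
F = tuple(m * ai for ai in a)  # F = ma in the original frame

angle = 1.1
F2, a2 = rotate_z(F, angle), rotate_z(a, angle)
residual = max(abs(Fi - m*ai) for Fi, ai in zip(F2, a2))
print(residual < 1e-12)  # F = ma holds in the rotated frame too: True
```

The same check works for translations of the origin, since forces and accelerations are unchanged by them.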

One can think of Euclidean space and F=ma abstractly, where the space has no preferred coordinates, and F=ma is defined on that abstract space. Saying that F=ma is covariant with respect to a change of coordinates is exactly the same as saying F=ma is well-defined on a coordinate-free Euclidean space.

The strongly geometrical character of classical mechanics was confirmed by a theorem of Noether that symmetries are essentially the same as conservation laws. Momentum is conserved because spacetime has a spatial translational symmetry, energy is conserved because of time translation symmetry, and angular momentum is conserved because of rotational symmetry.

Next, we look at geometries that make physical sense.

Friday, June 29, 2018

MWI is the strangest and least believable thing

In Sam Harris and Sean Carroll DEBATE Free Will & Laplace's Demon - YouTube, we get a discussion of many-worlds:
Sam Harris [at 4:00]: This is supposed to be science, right?

It sounds like the strangest and least believable thing [inaudible] So how is it that science, after centuries of being apparently rigorous, and parsimonious, and hard-headed, finally disgorges a picture of reality which seems to be the least believable thing that anyone has ever thought of?

Sean M. Carroll: You've come to the right place!
Carroll goes on to explain his beliefs in many-worlds quantum mechanics, the block universe, Laplace's Demon, determinism, and time reversibility.

The fact that we think we have free will, and we remember the past instead of the future, is all just a psychological illusion caused by the increase in entropy.

I wonder how Carroll ever got to be a physics professor. He is entitled to believe in whatever gods he chooses, but he represents these opinions as consequences of modern science. They are not.

Harris's question nails it. Scientists were rigorous, parsimonious, and hard-headed for centuries, but now the public image of science is dominated by professors like Carroll who present the least believable ideas.

A recent New Scientist article on how to think about the multiverse quotes him:
“One of the most common misconceptions is that the multiverse is a hypothesis,” says Sean Carroll at the California Institute of Technology in Pasadena. “In fact, it is forced upon us. It is a prediction of theories we have good reason to think are correct.”
If the multiverse really were a hypothesis, it would be testable. No, the multiverse is just some anti-positivist philosophical belief that has no empirical support, and does not follow from any accepted theory.

For someone who doesn't believe in libertarian free will, Carroll is intolerant of those who disagree with his leftist agenda. He is sympathetic to a restaurant for refusing to serve a Trump White House employee!

I would think that a science professor, who likes to talk about how great science is, would happily promote the open exchange of political ideas. But no, he apparently thinks that leftist Democrats should not necessarily be civil to Trump supporters.

Wednesday, June 27, 2018

Bell test experiments explained

Eight physicists have written an excellent new paper explaining quantum mechanics from the viewpoint of the Bell test experiments:
Understanding quantum physics through simple experiments: from wave-particle duality to Bell's theorem

Quantum physics, which describes the strange behavior of light and matter at the smallest scales, is one of the most successful descriptions of reality, yet it is notoriously inaccessible. Here we provide an approachable explanation of quantum physics using simple thought experiments that deal with one- and two-particle interference. We derive all relevant quantum predictions using minimal mathematics, without introducing the advanced calculations that are typically used to describe quantum physics. We focus on the two key surprises of quantum physics, namely wave-particle duality, which deals with the counter-intuitive behavior of single quantum particles, and entanglement, which applies to two or more quantum particles and brings out the inherent contradiction between quantum physics and seemingly obvious assumptions regarding the nature of reality. We employ Hardy's version of Bell's theorem to show that so-called local hidden variables are inadequate at explaining the behavior of entangled quantum particles. This means that one either has to give up on hidden variables, i.e. the idea that the outcomes of measurements on quantum particles are determined before an experiment is actually carried out, or one has to relinquish the principle of locality, which requires that no causal influences should be faster than the speed of light. Finally, we describe how these remarkable predictions of quantum physics have been confirmed in experiments. We have successfully used the present approach in a course that is open to all undergraduate students at the University of Calgary, without any prerequisites in mathematics or physics.
It concludes:

Bell's theorem indicates a deep contradiction between quantum physics and EPR's seemingly obvious assumptions about reality. Surprisingly, experiments demonstrate quantum physics to be correct and rule out the intuitively appealing possibility of local hidden variables. This implies that measurement outcomes are either not predetermined at all, or they are determined by nonlocal hidden variables, such as Bohm's pilot wave model.
My only quibble with this is that it is sloppy about the alternatives to hidden variables.

Yes, experiments have ruled out local hidden variables. Nonlocal hidden variables are possible but foolish; who wants a theory where inaccessible variables have magical effects at a distance? Such theories are useless and misleading.
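The paper uses Hardy's version of Bell's theorem; the more familiar CHSH form makes the same point and is easy to sketch. Any assignment of predetermined ±1 outcomes (local hidden variables) bounds the CHSH combination by 2, while the quantum singlet correlation E(a,b) = −cos(a−b) reaches 2√2 at suitable angles.

```python
import math
from itertools import product

# Local hidden variables: each particle carries predetermined outcomes ±1
# for both measurement settings. Enumerate every such assignment.
lhv_max = max(abs(A1*B1 - A1*B2 + A2*B1 + A2*B2)
              for A1, A2, B1, B2 in product([-1, 1], repeat=4))
print(lhv_max)  # 2: the CHSH bound for any local hidden variable model

def E(a, b):
    """Quantum correlation for the singlet state at angles a, b."""
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi/2
b1, b2 = math.pi/4, 3*math.pi/4
chsh = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(round(chsh, 3))  # 2.828 ≈ 2*sqrt(2), exceeding the bound
```

The experiments measure the quantum value, so the predetermined-outcomes model fails.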

This leaves us to conclude, according to the paper, "that measurement outcomes are either not predetermined at all". In other words, it says, we must reject "the idea that the outcomes of measurements on quantum particles are determined before an experiment is actually carried out".

There are actually two possibilities here. One is that photons, and other quantum objects, have a free will of their own where they can decide what properties they will have, within the range of possibilities. That is, the physics is not deterministic.

The other possibility is that the outcomes are all predetermined in some physical sense, but not as mathematical variables like real numbers.

Suppose you fire a photon at a screen, thru a double slit. Maybe its path is determined once it passes the double slit. Maybe it is determined even before the slit. Maybe the photon is still deciding what it wants to be right up to the point that it hits the screen. Our experiments cannot really distinguish between these possibilities.

What we can say is that you cannot model the photon by introducing some non-measured variables like which slit the photon is passing thru. It doesn't work. The experiments show that.

My view is that we do measurements with real numbers because that is the only way we know how to make observations. But there is no reason to believe that a photon can be fully described by real numbers. Photons are strange objects. So are real numbers. But they are not the same.

If you have a single electron, then it appears that you can describe it by a wave function okay. The above paper does a good job of explaining this. But with two particles, they can be entangled. Then wave functions can still predict experiments, according to quantum mechanics, but they don't truly match what appears to be the photon internal structure. After all, photons appear to behave locally, but the wave functions seem somewhat nonlocal in the case of an entangled measurement.

Quantum mechanics is a positivist theory. It predicts experiments, and if that is all you require, then it is a complete theory. But if you think that you are representing the true internals of a photon with wave functions, forget it. It cannot be done. The wave functions are just tools for predicting experiments. The photons could be sentient beings with their own free will and God-given souls, for all we know, as long as they are constrained to obey quantum mechanics.

There are a lot of papers on Bell's theorem that make strong conclusions because of faulty assumptions about reality. The above paper does an excellent job of explaining the experiments that must be accepted. The experiments are not really that strange, as they do not show nonlocality or pilot waves or anything that bizarre. Quantum mechanics is not really so strange, if you let go of your preconceptions about how photons can be modeled.

Note: Quantum field theory teaches that photons are not really fundamental, and discussions about photons like the above are convenient simplifications of fields.

Monday, June 25, 2018

Was relativity positivist or anti-positivist?

A famous string theorist wrote:
Albert Einstein famously believed that, given some general principles, there is essentially a unique way to construct a consistent, functioning universe.
That is much of the rationale behind string theory. Ignore experiment, follow abstract principles, and devise laws of nature.

Put another way, he believed in a top-down approach, instead of a bottom-up approach. See recent postings by Allanach, Lumo, and Bee. They would all like to believe in the top-down approach, but they differ in the extent that they are willing to reconsider their beliefs in the face of contrary LHC evidence.

Sabine Hossenfelder (aka Dr. Bee) has a new book, Lost in Math, that criticizes various purely mathematical approaches to theoretical physics that have failed to get any empirical results. I have not read it.

Some people think that Einstein did this top-down approach with relativity. That is, he just adopted some philosophically attractive postulates, and derived special relativity.

That is not exactly what happened. Maxwell developed his electromagnetic theory from experiments by Faraday and others, and noticed a paradox about the motion of the Earth. The Michelson-Morley experiment tested Maxwell's ideas, and found a symmetry of nature that was not obviously apparent in his equations. Lorentz found a way to reconcile the theory and experiment. Poincare showed that the Lorentz transformations generated a symmetry group for 4-dimensional spacetime, and that Maxwell’s equations could be understood geometrically on spacetime. Einstein adopted two of Lorentz's theorems as postulates, and showed how they could be used to derive the Lorentz transformations. Minkowski elaborated on the non-Euclidean geometry of Poincare’s spacetime, and formulated the version of special relativity that became popular at the time.

There are philosophers who say that getting relativity from experiment is just positivist propaganda. They say Einstein ignored the Michelson-Morley and other experiments, and applied non-empirical thinking. Polanyi even said that special relativity was proposed "on the basis of pure speculation, rationally intuited by Einstein before he had ever heard about it."

So Einstein’s 1905 special relativity paper is widely praised, but some praise it for being positivist, and some for it being anti-positivist!

It was not really either, because it was just an exposition of the ideas of others. The theory was created by Lorentz, Poincare, and Minkowski, and they all explicitly relied on empirical findings in their papers, and did their work independently from Einstein.

I don’t think physics has ever had substantial advances from anti-positivist thinking.

But if there were to be an anti-positivist invention of relativity in some alternate universe, what would it look like? That is what I wish to propose in several subsequent posts.

Friday, June 22, 2018

Quantum mechanics has no action-at-a-distance

Stephen Boughn writes:
There Is No Action at a Distance in Quantum Mechanics, Spooky or Otherwise

I feel compelled to respond to the frequent references to spooky action at a distance that often accompany reports of experiments investigating entangled quantum mechanical states. Most, but not all, of these articles have appeared in the popular press. As an experimentalist I have great admiration for such experiments and the concomitant advances in quantum information and quantum computing, but accompanying claims of action at a distance are quite simply nonsense. Some physicists and philosophers of science have bought into the story by promoting the nonlocal nature of quantum mechanics. In 1964, John Bell proved that classical hidden variable theories cannot reproduce the predictions of quantum mechanics unless they employ some type of action at a distance. I have no problem with this conclusion. Unfortunately, Bell later expanded his analysis and mistakenly deduced that quantum mechanics and by implication nature herself are nonlocal.
He has some additional argument in Making Sense of Bell's Theorem and Quantum Nonlocality.

He is correct. His explanations are routine textbook stuff, with nothing particularly original. None of it would need to be said, except that most of the popular explanations of quantum mechanics insist on saying that the theory is nonlocal. Some physicists even say this. Some even say that Bell proved it.

Boughn's explanations are clear enough, but it would have been nice if he had quoted the respectable authorities who say that quantum mechanics is nonlocal. It would show that he is not attacking a straw man, and justify writing a paper to explain some textbook material.

Thursday, June 21, 2018

For Solstice, celebrate Earth's uniqueness

Astronomers are always claiming that they found a distant planet that might support life, or that there must be millions of Earth-like planets in our galaxy.

But there may be no others. Earth has many unusual properties that are essential for life, and that may never be found elsewhere.

The NY Times reports:
As you mark the longest day of the year, consider the debate among astronomers over whether Earth’s tilt toward the sun helps make life on our world and others possible. ...

The solstice occurs because Earth does not spin upright but leans 23.5 degrees on a tilted axis. Such a slouch, or obliquity, has long caused astronomers to wonder whether Earth’s tilt — which you could argue is in a sweet spot between more extreme obliquities — helped create the conditions necessary for life. ...

Is life only possible on an exoplanet with a tilt similar to ours? ...

Mars’s slouch, for example, is currently akin to Earth’s at 25.19 degrees, but it shifts back and forth between 10 degrees and 60 degrees over millions of years. That means that the seasons and climate of the red planet — which is currently experiencing an extreme dust storm — vary wildly. That could create conditions that make life impossible.

Take Earth as an example. Although our planet’s obliquity is relatively constant, it does change by a mere few degrees. Such slight variations have sent vast sheets of glaciers from the poles to the tropics and entombed Earth within a frozen skin of solid ice. Luckily, Earth has managed to escape these so-called snowball states. But scientists are not sure whether the same will be true for planets like Mars with larger variations in their tilts. ...

As such, a stable tilt just might be a necessary ingredient for life. It’s an interesting finding given that the Earth’s tilt never changes drastically thanks to the Moon. And yet astronomers don’t know how common such moons are within the galaxy, said John Armstrong, an astronomer at Weber State University in Utah. If they turn out to be uncommon across the galaxy, it could mean that such stability — and therefore life — is hard to come by.
That's right, the Earth has a tilt that is stabilized by the Moon, giving regular seasons over billions of years. Having a single large moon is probably very unusual.

Earth also has large amounts of surface water, geologic activity (volcanoes and plate tectonics), a large variety of minerals, etc. It also has a huge outer planet, Jupiter, to further stabilize the orbit. There are probably many other such factors that I don't even know about.

We don't have the ability to detect whether distant planets have these features. But common sense would indicate that they would be very unlikely. I think it is likely that Earth has the only intelligent life of this galaxy.

Wednesday, June 20, 2018

Standard Model of Physics at 50

Yvette Cendes writes in SciAm:
The Standard Model (of Physics) at 50

It has successfully predicted many particles, including the Higgs Boson, and has led to 55 Nobels so far, but there’s plenty it still can’t account for

Just over a half-century ago, the physicist Steven Weinberg published a seminal paper titled “A Model of Leptons” in the journal Physical Review Letters. It was just three pages long, but its contents were revolutionary: in the paper, Weinberg outlined the core of the theory now known as the Standard Model, which governs elementary particles.
If this was really such a great paper, then why didn't anyone notice at the time?

As you can see here, it was only cited once in 1968 (by Salam), once in 1969, and once in 1970.

The paper proposed a model for electroweak interactions. It was the same as quantum electrodynamics, except that the gauge group was U(2) instead of U(1), a Higgs field was added to make the weak force short range, and no one knew how to cancel the infinities.

The idea of using U(2) for weak/electroweak was due to Schwinger and Glashow in 1961, and the idea of using a Higgs was due to Higgs and others in 1964. A scheme for canceling infinities in a gauge theory was published by 't Hooft in 1970.

So Weinberg's paper was no big deal at the time. It did not solve any physical problem. It wasn't much of a theory, because he had no way of computing anything. By 2015, it had 9289 citations, no. 2 on the list.

I am not trying to put down Weinberg, but he and Salam lucked into that Nobel prize. The works of Higgs and 't Hooft were more essential to creating the Standard Model.

The SciAm author ends with a complaint about a physics party:
And one Nobel laureate, who shall go unnamed, proceeded to frame our introduction by stating I was clearly invited because I was pretty, and that I looked old enough to finish my PhD already. (The Nobel Prize in Physics is still such an old boys club that only two women have ever won the prize out of 207 recipients. The last was in 1962 — a greater gap than in any other field, and not for a lack of good scientists.)
I guess she is saying that female physicists were unfairly deprived of Nobel prizes, but maybe they were too pretty, or taking care of babies instead of finishing a PhD.

Monday, June 18, 2018

Video rant against Jewish Physics

I just found an amusing video from a couple of years ago titled Weev talks about relativity. Weev is a well-known internet troll, and doesn't actually say much about relativity, but has two others talking about relativity.

While they talk, the video shows flashes of Donald Trump, and news footage related to Trump. I guess this was posted during Trump's campaign for the Presidency, and Weev was a Trump supporter.

One rants about "Jewish Physics", while the other is a skeptic. The main guy talks about how physics has gotten away from experiment, and says a lot of it is "academic circle jerk ... like the epicircles of astronomy". He means "epicycles".

He argues that Einstein's special relativity was just untestable philosophical ideas, not physics. He just took the math and theory from Lorentz's papers of 10 years earlier, and added some untestable philosophical re-imagining about there being no place of rest. Weev adds that Einstein did not cite his sources.

The skeptic doubts that there could be a Jewish conspiracy about such matters, and the main guy says that it is not really a conspiracy, and not just Jews. It is more a matter that Jews like philosophical unscientific ideas.

The explanations are a little garbled, but most of it is essentially correct. It is true that Einstein's famous 1905 special relativity theory was mathematically and observationally equivalent to what Lorentz had already published, and which Einstein failed to cite. Historians of science agree to this.

It is also true that saying that there is no rest frame, or no aether, is just untestable philosophy. It does not add anything useful to Lorentz's theory.

You might say that it is testable, because you could prove Einstein wrong by finding a rest frame or aether. In fact, you can define a rest frame in terms of the cosmic background microwave radiation, and an aether in terms of the quantum electrodynamic vacuum, but nobody says this proves Einstein wrong.

You might say that Einstein's 1905 paper was a big advance if it were conceptually superior, or if it had led to other advances in the field. But that is not true. Einstein's approach was not seen as much different from Lorentz's at the time. The conceptually superior approach was considered to be the spacetime geometry relativity of Poincare and Minkowski, and subsequent work built on Minkowski's, not Einstein's.

Einstein has sort of a cult following, and most of the followers appear to be non-Jews. So how is this Jewish Physics?

The term "Jewish Physics" is inaccurate and unnecessarily inflammatory, but there can't be any doubt that Einstein is a Jewish saint. He was aggressively promoted and idolized by Jews. I hesitate to call it a religious thing, as Einstein was not very religious and it is the secular Jews who idolize him, not the orthodox Jews.

It is also hard to separate Einstein's science from his ideological beliefs. His deepest beliefs include Zionism, determinism, Jewish pantheism, anti-positivism, and Communism. When he attacked quantum mechanics, he relied on his anti-positivism and determinism. When he promoted his version of relativity over that of Lorentz, he relied on his beliefs, and not on any math or empirical science.

You would think that his membership in Communist front organizations would detract from his popularity, but it does not seem to have had any such effect. It is also well-known that Einstein did not really contribute anything to special relativity, as Whittaker explained in his 1953 book.

Einstein did have substantial scientific accomplishments, but not nearly enough to be TIME Man of the Century. Idolizing him is largely ideological.

On another topic, some freshman physics courses give a problem on what would happen if everyone in China jumped up and down at the same time. Pure fiction, right? Apparently everyone in Mexico jumped at the same time, and it caused an earthquake!

Thursday, June 14, 2018

Rovelli defends ancient philosophy for physics

Carlo Rovelli writes Physics Needs Philosophy. Philosophy Needs Physics.
Against Philosophy is the title of a chapter of a book by one of the great physicists of the last generation: Steven Weinberg, Nobel Prize winner and one of the architects of the Standard Model of elementary particle physics. Weinberg argues eloquently that philosophy is more damaging than helpful for physics - although it might provide some good ideas at times, it is often a straitjacket that physicists have to free themselves from. More radically, Stephen Hawking famously wrote that "philosophy is dead" because the big questions that used to be discussed by philosophers are now in the hands of physicists. Similar views are widespread among scientists, and scientists do not keep them to themselves. Neil deGrasse Tyson, a well known figure in the popularisation of science in America, publicly stated in the same vein: "... we learn about the expanding universe, ... we learn about quantum physics, each of which falls so far out of what you can deduce from your armchair that the whole community of philosophers ... was rendered essentially obsolete."
That's right, many modern physicists have concluded that philosophy is dead to them.

Rovelli disagrees, but his argument is almost entirely based on Aristotle and other long-dead philosophers, and by arguing in favor of philosophical thinking.

His examples:
But the direct influence of philosophy on physics is certainly not limited to the birth of modern physics. It can be recognised in every major step. Take the twentieth century. Both major advances made by twentieth century physics were strongly influenced by philosophy. They would have been inconceivable without the philosophy of the time. Quantum mechanics springs from an intuition due to Heisenberg, grounded in the strongly positivist philosophical atmosphere in which he found himself: one gets knowledge by restricting oneself to what is observable. The abstract of Heisenberg's 1925 milestone paper on quantum theory is explicit about this:

"The aim of this work is to set the basis for a theory of quantum mechanics based exclusively on relations between quantities that are in principle observable."

The same distinctly philosophical attitude nourished Einstein's discovery of special relativity: by restricting to what is observable, we recognise that the notion of simultaneity is misleading. Einstein explicitly recognised his debt to the philosophical writings of Mach and Poincaré. Without these inputs, his special relativity would have been inconceivable. Although not the same, the philosophical influences on Einstein's conception of general relativity were even stronger. Once again, he was explicit in recognising his debt to philosophy, this time to the critical thinking of Leibniz, Berkeley, and Mach.
That's right, the discoveries of relativity and quantum mechanics were strongly influenced by logical positivist thinking.

His history is not quite correct. Einstein did not discover special relativity or the problem of simultaneity. He got most of SR from Lorentz, and got clock synchronization from Poincare. Up to his dying day, he never acknowledged his debt to Poincare. Einstein did not really buy into the positivist philosophy, and rejected positivist thinking about QM; he explicitly disavowed positivism in 1945.

I agree that logical positivism has been important to physics, but it has been a dead philosophy since WWII.

I defend logical positivism, but I am in a very small minority. There are no reputable philosophers who defend it.

Post-WWII philosophers not only reject logical positivism, they reject the scientific method and much of what modern science is all about.

Siding with today's philosophers is essentially the same as being anti-science. Modern philosophy is at war with modern science.

The only philosophers of the last century discussed by Rovelli are Popper and Kuhn, and Rovelli concedes that they have had a negative influence on physics.
I suspect that part of the problem is precisely that the dominant ideas of Popper and Kuhn have misled current theoretical investigations. Physicists have been too casual in dismissing the insights of successful established theories. Misled by Kuhn’s insistence on incommensurability across scientific revolutions, they fail to build on what we already know, which is how science has always moved forward.
Rovelli says his "own technical area", loop quantum gravity, does not make any sense, and he hopes philosophers will help make sense of it. No chance of that. Loop quantum gravity is a dead end. No good physics has come out of that field, and nothing ever will.

His article is really a defense of ancient philosophy. There is no example of any good from modern philosophers.

Wednesday, June 13, 2018

We dummies should not question the super-smart

Dr. Bee has written a book about failed theories in modern physics, and complains:
“By writing [this book], I waived my hopes of ever getting tenure.” ...

I am not tenured and I do not have a tenure-track position, so not like someone threatened me. I presently have a temporary contract which will run out next year. What I should be doing right now is applying for faculty positions. Now imagine you work at some institution which has a group in my research area. Everyone is happily producing papers in record numbers, but I go around and say this is a waste of money. Would you give me a job? You probably wouldn’t. I probably wouldn’t give me a job either. ...

I have never been an easy fit to academia. I guess I was hoping I’d grow into it, but with time my fit has only become more uneasy. At some point I simply concluded I have had enough of this nonsense. I don’t want to be associated with a community which wastes tax-money because its practitioners think they are morally and intellectually so superior that they cannot possibly be affected by cognitive biases. You only have to read the comments on this blog to witness the origin of the problem, as with commenters who work in the field laughing off the idea that their objectivity can possibly be affected by working in echo-chambers. I can’t even.
I haven't read her book, but it is definitely true that a huge amount of money is pumped into worthless theories, but the leading scholars will not tell the truth about them.

This triggers LuMo into one of his usual rants:
Less than 1,000 people are actually being paid as string theorists or something "really close" in the world now, and even if you realistically assume that the average string theorist is paid more than the average person, the fraction of the mankind's money that goes to string theory is some "one millionth" or so. Or 1/100,000 of the money that goes to porn or any other big industry. Moreover, the funds are allocated by special institutions or donors – they're too technical decisions that the taxpayer simply shouldn't make directly. ...

You don't really need to be a string theorist to understand that string theorists are the cream of the cream of the cream. Most people have met someone who belongs to the cream of the cream, e.g. an astronaut. Well, there's some extra selection related to the theoretical physics-related abilities needed to become a string theorist. ...

If someone has dedicated a few years to these matters and he has failed to learn string theory and to understand that it's the only known promising way to go beyond quantum field theory as of 2018, then I can assure you that his IQ is below 150. ...

If you investigate what smart enough people – who have cared about these matters – honestly think about string theory, you may really measure their intelligence in this way. The more they appreciate string theory, the smarter they are.
Okay, I admit it, my IQ is only 149, and I do not see how string theory offers any promise to move quantum field theory forward. Research in the field peaked in the 1990s, and it has not even made any significant progress in the last 20 years. The theory still has no known relationship to any observable phenomenon. It is just a mathematical idea that did not pan out.

Tuesday, June 12, 2018

Elegance is the fuzziest aspect of beauty

Physicist Dr. Bee writes, in connection with her new book:
Elegance is the fuzziest aspect of beauty. It is often described as an element of surprise, the “aha-effect,” or the discovery of unexpected connections. One specific aspect of elegance is a theory’s resistance to change, often referred to as “rigidity” or (misleadingly, I think) as the ability of a theory to “explain itself.”

By no way do I mean to propose this as a definition of beauty; it is merely a summary of what physicists mean when they say a theory is beautiful. General relativity, string theory, grand unification, and supersymmetry score high on all three aspects of beauty. The standard model, modified gravity, or asymptotically safe gravity, not so much.

But while physicists largely agree on what they mean by beauty, in some cases they disagree on whether a theory fulfills the requirements. This is the case most prominently for quantum mechanics and the multiverse.
I do not agree that grand unification and supersymmetry are beautiful. They require 100s of new parameters and particles in the theory, over the standard model's 20 or so.

A comment says:
A beautiful equation is also one that exhibits the fewest free parameters while explaining the most physics. That's why general relativity is beautiful while the Lagrangian of the Standard Model is ugly as hell. They both work, one by itself and the other by brute force, although I would never compare one with the other.
No, I disagree. This is like saying that the periodic table of the chemical elements is ugly as hell, because it has 92+ elements and some irregularities. It was vastly simpler than any other categorization of the 1000s of known substances, and put them into simple patterns.

The standard model is just quarks, electrons, and neutrinos, with some flavors, generations, colors, and anti-particles, and some bosons for transmitting forces.

Monday, June 11, 2018

Denying laws of physics in a multiverse

Einstein spent the last 20 years of his life at the Institute for Advanced Study in Princeton, attacking quantum mechanics and pursuing unified field theory. None of that work amounted to anything.

The current director there is physicist and string theorist Robbert Dijkgraaf. He writes a Quanta mag article:
There Are No Laws of Physics. There’s Only the Landscape.

Scientists seek a single description of reality. But modern physics allows for many different descriptions, many equivalent to one another, connected through a vast landscape of mathematical possibility. ...

Did nature have any choice in picking its fundamental laws? Albert Einstein famously believed that, given some general principles, there is essentially a unique way to construct a consistent, functioning universe. In Einstein’s view, if we probed the essence of physics deeply enough, there would be one and only one way in which all the components — matter, radiation, forces, space and time — would fit together to make reality work, just as the gears, springs, dials and wheels of a mechanical clock uniquely combine to keep time.

The current Standard Model of particle physics is indeed a tightly constructed mechanism with only a handful of ingredients. ...

If our world is but one of many, how do we deal with the alternatives? The current point of view can be seen as the polar opposite of Einstein’s dream of a unique cosmos. Modern physicists embrace the vast space of possibilities and try to understand its overarching logic and interconnectedness. From gold diggers they have turned into geographers and geologists, mapping the landscape in detail and studying the forces that have shaped it.

The game changer that led to this switch of perspective has been string theory. At this moment it is the only viable candidate for a theory of nature able to describe all particles and forces, including gravity, while obeying the strict logical rules of quantum mechanics and relativity. The good news is that string theory has no free parameters. It has no dials that can be turned. ...

Why is this all so exciting for physics? First of all, the conclusion that many, if not all, models are part of one huge interconnected space is among the most astonishing results of modern quantum physics. It is a change of perspective worthy of the term “paradigm shift.”
If I did not see the source, I would say that this is the babbling of a crackpot. But this is a top physicist at a top institution publishing in a respected magazine. See also critical comments by Woit.

This is a good example of the anti-positivist thinking that is the opposite of good science.

Science is all about observing the universe, and developing theories for predicting experiments. Dijkgraaf's approach is to ignore observations, and develop a theory from non-empirical principles.

Some philosophers claim that Einstein discovered relativity with Dijkgraaf-like anti-positivist thinking, but I disprove that in my book and on this blog.

Then Dijkgraaf argues that the exciting part is that the theory has no predictive power at all, and is really just a framework for discussing all possible models.

The proponents of many-worlds theory similarly argue that the exciting part of their theory is that all possibilities can happen, and the theory has no predictive power. Their reasoning is different, but the worthlessness of the result is the same.

So why study something that is so transparently worthless? Because it is a paradigm shift, of course, and philosophers assure us that paradigm shifts are not rational.

This article is an even better example of why I wrote a book on How Einstein Ruined Physics. The whole Physics profession has been infected by the most anti-science thinking imaginable.

LuMo's response to defend string theory:
There's no reason to think that the total number of spacetime dimensions is 4, there is nothing wrong mathematically about the numbers 10 and 11, they're in fact preferred by more detailed calculations, and there's nothing unnatural about the compactification to microscopic radii.

But the main point I wanted to convey is that Dijkgraaf seems to deny the reality in these media altogether. He wrote the text as if no "string wars" have ever taken place. But the string wars did take place more than a decade ago. Dijkgraaf was among those who preferred his convenience and didn't do anything at all to help me and others to defeat the enemy – so the enemy has won the battle for the space in the mainstream media.

If you ever want articles written for the popular magazines – including the Quanta Magazine – about modern theoretical physics to be meaningful again, you will first have to restart the string wars and win them. I am afraid that the society has sufficiently deteriorated over those 10+ years of your inaction that a physical elimination of the enemy may be needed.
Got that? It is no use promoting string theory to the general public unless the enemies of string theory are physically eliminated.

Sunday, June 10, 2018

Congress Democrats want to fund quantum computing

The quantum hype continues:

Quantum computing has made it to the United States Congress. "Quantum computing is the next technological frontier that will change the world, and we cannot afford to fall behind," said Senator Kamala Harris (D-California) in a statement passed to Gizmodo. "We must act now to address the challenges we face in the development of this technology -- our future depends on it." From the report:

The bill introduced by Harris in the Senate focuses on defense, calling for the creation of a consortium of researchers selected by the Chief of Naval Research and the Director of the Army Research Laboratory. The consortium would award grants, assist with research, and facilitate partnerships between the members. Another, yet-to-be-introduced bill, seen in draft form by Gizmodo, calls for a 10-year National Quantum Initiative Program to set goals and priorities for quantum computing in the US; invest in the technology; and partner with academia and industry. An office within the Department of Energy would coordinate the program. Another group would include members from the National Science Foundation, the National Institute of Standards and Technology, the Department of Energy, the office of the Director of National Intelligence to coordinate research and education activity between agencies. Furthermore, the draft bill calls for the establishment of up to five Quantum Information Science research centers, as well as two multidisciplinary National Centers for Quantum Research and Education.
According to some big-money Democrat donors, Harris is their best hope for winning the USA Presidency in 2020. Her father is Jamaican and her mother east Indian, so they think that she will play well to current Democrat identity politics, where white males are despised. She has been successful in California politics, but would be considered a leftist kook in much of the rest of the USA.

Here are some comments:
Sure, nobody could so far put up any evidence that Quantum Computing will ever be able to be more efficient than conventional computing, but hey, let's allocate billions to the belief in the hype.
AFAIK, your post is complete nonsense. It is perfectly well known for which tasks quantum computing will be more efficient than conventional computing and how many functioning Qbits you need (with given error rates). Note that the computational power does not increase linearly when doubling qbits. Apart from the tasks that we know can be solved, there is an ever expanding list of research results of more tasks that quantum computers are suitable for. You have to think of a quantum computer like a giant and fragile (unfortunately) co-processor that is insanely fast for certain tasks, not as a replacement for conventional computers.
But as the poster you rudely accused of posting nonsense wrote, it's never been demonstrated.

There are legitimate reasons to think it will never happen: Noise, cost scaling of maintaining low entropy space, incompatibility between quantum error correction on qbits and doing logic on those qbits.

I'm a sceptic. I don't expect to see the ECDLP for deployed key sizes solved by quantum computers, ever.
I agree with that last comment.
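The scaling claim in the quoted comments is easy to illustrate with a toy calculation. An n-qubit register is described by 2^n complex amplitudes, so the classical cost of simulating it doubles with every added qubit; this is the standard argument for why quantum computational power "does not increase linearly." A minimal sketch:

```python
# Toy illustration: an n-qubit quantum state is described by 2**n
# complex amplitudes, so classical simulation cost doubles with
# each added qubit -- exponential, not linear, growth.
def state_size(n_qubits):
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (10, 20, 30):
    print(n, "qubits ->", state_size(n), "amplitudes")
```

Of course, this exponential state space is a necessary condition for a speedup, not a demonstration that useful speedups will ever be engineered, which is the skeptics' point.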

The physics community is much too corrupt to point out that the Harris bill is a big waste of money.

Friday, June 8, 2018

A foolish unified field theory using E8

Someone asked me about this SciAm article:
A Geometric Theory of Everything
Deep down, the particles and forces of the universe are a manifestation of exquisite geometry
A. Garrett Lisi, James Owen Weatherall
December 1, 2010

This quest for unification is driven by practical, philosophical and aesthetic considerations. When successful, merging theories clarifies our understanding of the universe and leads us to discover things we might otherwise never have suspected. Much of the activity in experimental particle physics today, at accelerators such as the Large Hadron Collider at CERN near Geneva, involves a search for novel phenomena predicted by the unified electroweak theory. In addition to predicting new physical effects, a unified theory provides a more aesthetically satisfying picture of how our universe operates. Many physicists share an intuition that, at the deepest level, all physical phenomena match the patterns of some beautiful mathematical structure.

The current best theory of nongravitational forces — the electromagnetic, weak and strong nuclear force — was largely completed by the 1970s and has become familiar as the Standard Model of particle physics. Mathematically, the theory describes these forces and particles as the dynamics of elegant geometric objects called Lie groups and fiber bundles. It is, however, somewhat of a patchwork; a separate geometric object governs each force. Over the years physicists have proposed various Grand Unified Theories, or GUTs, in which a single geometric object would explain all these forces, but no one yet knows which, if any, of these theories is true.

This article is a good example of why I wrote a book on How Einstein Ruined Physics.

The paper does not solve any physical or mathematical problems. It does not explain any experiments. The only thing going for it is a philosophical belief in unification, which is essentially the opposite of reductionism.

In science, reductionism is a good thing, not unification. The Standard Model reduces the 100s of known particles to a 12-parameter Lie group. The philosophy behind these grand unified models is that this was too much reductionism, and we should use a much larger group that does not separate the forces. The above paper uses a 248-parameter group. That means at least 236 more particles than have ever been seen in nature.
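The parameter counts in the previous paragraph come from the dimensions of the Lie groups involved: the Standard Model gauge group U(1)×SU(2)×SU(3) has 1 + 3 + 8 = 12 generators, while E8 has 248. A quick check of the arithmetic:

```python
# Dimension counts behind the "12-parameter" vs "248-parameter" comparison.
def dim_su(n):
    """Dimension of the Lie group SU(n) is n**2 - 1."""
    return n * n - 1

dim_sm = 1 + dim_su(2) + dim_su(3)   # U(1) x SU(2) x SU(3) gauge group
dim_e8 = 248                          # dimension of the exceptional group E8

print("Standard Model gauge group:", dim_sm)       # 12
print("Extra generators in E8:", dim_e8 - dim_sm)  # 236
```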

The term "simple" in the title does not mean it is a simple model. It means that it uses a mathematical group that is simple in the sense of not being reducible to smaller groups.

Got that? We have a 12-parameter model that matches experiments perfectly, and that allows the fundamental forces to be understood as strong, weak, electromagnetic, and gravity. Lisi and Weatherall say that it is philosophically desirable to trade that for a 248-parameter model that does not match any experiments, but which is supposed to be better because it does not allow treating the fundamental forces separately.

I believe this entire line of research into unified field theories to be wrong-headed. Reductionism is what makes the Standard Model and other good theories great, and not a bad thing. The unified field theorists hate the Standard Model for the same reasons that made it so successful.

These researchers are widely believed to be crackpots, but somehow they got an article in Scientific American. That magazine has really gone downhill.

Wednesday, June 6, 2018

Maudlin attacks positivism in book review

Philosopher of Physics Tim Maudlin writes in a book review, which is really just an excuse to write a nice essay on his favorite topics:
Logical positivism is a very attractive view for people who do not want to worry about what they cannot observe. It is ultimately a theory about meaning, about the content of a theory. According to the positivists, a theory says no more than its observable consequences.

Logical positivism has been killed many times over by philosophers. But no matter how many stakes are driven through its heart, it arises unbidden in the minds of scientists. For if the content of a theory goes beyond what you can observe, then you can never, in principle, be sure that any theory is right. And that means there can be interminable arguments about which theory is right that cannot be settled by observation. ...

Einstein was the great anti-positivist. His position is often called realism, but a better name is perhaps common sense. ...

So Einstein and Bohr were polar opposites in their approach to physics. Einstein demanded a clear and comprehensible account of what is going on in the physical world — at all scales — in space and time. Bohr thought that the key to quantum mechanics was the realization that no such thing could be had.

When the Copenhagen interpretation got imported to the pragmatic soil of the United States, Bohr’s incomprehensible nonsense was replaced by the more concise “shut up and calculate.” That is the philosophy that dominates physics to this day.
Maudlin is partial to interpretations of quantum mechanics that try to theorize about things that cannot be observed.

He is right that philosophers have done everything they can to kill logical positivism, but their arguments are unconvincing to many scientists. Einstein's anti-positivism has been a gigantic failure. Nothing good has come out of it.

On the other hand, all the great Physics triumphs of the XX century have been explicitly or implicitly positivist.

Einstein's early work was considered to be implicitly positivist. Many physicists praise him for what appeared to be positivist thinking. But he repudiated positivism in later life, and pursued unified field theories and attacked quantum mechanics.

"Realism" is a poor name for anti-positivism. The observations are what is real. Speculating about what cannot be seen is not.

The quantum (anti-positivist) realist seeks to find some intuitive simplistic mathematical model of the atom, like the Bohr atom of the old quantum theory. That might be nice, but there is good reason to believe that no such thing exists. The electron is not really a particle or a wave or anything else similar to any macroscopic object. The faulty models of the non-standard quantum interpretations have not been helpful.

As the great physicist Murray Gell-Mann said, after conversations with Putnam, “Bohr brainwashed a generation of physicists.” A vivid illustration of Kuhn’s kinship to Bohr in this respect can be drawn from Morris: “What I hated most about Kuhn’s lectures was the combination of obscurantism and dogmatism. On one hand, he was extremely dogmatic. On the other, it was never really clear about what.” It is no stretch to apply this precise description to Bohr, and not much of one to apply it to The Critique of Pure Reason as well.
There is something very strange about accusing a positivist of being obscure and dogmatic. The positivist is just the opposite.

The positivist talks about what is measurable and demonstrable. His conclusions can all be shown by experiments and logic. He remains silent on matters that cannot be empirically or logically decided.

I agree that Kuhn and later philosophers were obscure and dogmatic. Kuhn claimed to have some grand theory about how science works, but none of it makes any sense on closer inspection. He was mainly famous for his "paradigm shift" book, but no one can agree on what a paradigm shift is.
The difference between indicative propositions about the actual world and counterfactual propositions about mere possibilities is illustrated by these two conditionals: if Lee Harvey Oswald did not shoot John F. Kennedy, then someone else did (indicative and true); and if Oswald had not shot Kennedy, then someone else would have (counterfactual and probably false).
I do think that this sort of confusion about counterfactuals is at the root of some of the gripe about quantum mechanics. If you ask questions like "if the photon went thru this slit, then where would it hit the screen?", then it is hard to see why there is a diffraction pattern on the screen.
Kuhn implicitly accepts the descriptive view. The meanings of theoretical terms such as “mass” are determined by the theories in which they are deployed. Mass as used by Newton means something different from mass as employed by Einstein because the theories they are embedded in are different. Therefore Newtonians cannot really communicate with Einsteinians, Ptolemaic astronomers cannot really communicate with Copernican astronomers, and so on. This is why, for Kuhn, scientific revolutions cannot be settled by rational means: the disputants necessarily speak different languages.

The descriptive view was demolished by Kripke and Putnam in a series of lectures and papers in the 1970s.
Yes, that is at the core of what is wrong with Kuhn. Kuhn deduces that science is an irrational process (or "arational", which is the term he prefers). Denying that science is rational is a direct attack on the whole idea that scientists seek truth.

LuMo writes about a tweet:
The Defeat of Reason: Philosopher Tim Maudlin rebuts the influential relativism of Bohr's interpretation of quantum mechanics and Kuhn's interpretation of science.
— Steven Pinker (@sapinker) June 3, 2018
Pinker has basically endorsed an embarrassing, supportive review by Tim Maudlin of the painful anti-quantum book by Adam Becker. It's no coincidence that the title of Maudlin's review describes quantum mechanics (plus, less importantly, Kuhn's views about the evolution of science as a human enterprise) as a "defeat of reason".
Again, it is odd to conflate Bohr's positivism with Kuhn's arational anti-positivism. Arguing that quantum mechanics is somehow a defeat of reason is really wacky. Pinker should stick to his field, which is psychology and language.

Monday, June 4, 2018

Relativity founded on experiment, not ad hoc

Wikipedia is a great resource on relativity, but of course it cites a lot of oddball reasons for crediting Einstein. In particular, it gives a silly argument cooked up by the Einstein-loving historians about being "ad hoc", such as here:
Eventually, Albert Einstein (1905) was the first[4] to completely remove the ad hoc character from the contraction hypothesis, by demonstrating that this contraction did not require motion through a supposed aether, but could be explained using special relativity, which changed our notions of space, time, and simultaneity.[5] Einstein's view was further elaborated by Hermann Minkowski, who demonstrated the geometrical interpretation of all relativistic effects by introducing his concept of four-dimensional spacetime.[6]
and here:
In 1907 Einstein criticized the "ad hoc" character of Lorentz's contraction hypothesis in his theory of electrons, because according to him it was an artificial assumption to make the Michelson–Morley experiment conform to Lorentz's stationary aether and the relativity principle.[A 25] Einstein argued that Lorentz's "local time" can simply be called "time", and he stated that the immobile ether as the theoretical foundation of electrodynamics was unsatisfactory.[A 26]

FitzGerald and Lorentz looked at the Michelson-Morley experiment, and other experiments, and deduced that the speed of light appeared the same for all observers. This could be explained by a length contraction.

Einstein looked at the work of Lorentz, and postulated that the speed of light was the same for all observers. Then he deduced the same length contraction.

FitzGerald's and Lorentz's works are said to be ad hoc, because they were based on experiment. Einstein's work was not, according to this argument, because he based it on postulates.

Einstein did not have the modern geometrical understanding of relativity. Minkowski demonstrated the geometrical view and elaborated on Poincare's concept of four-dimensional spacetime, but it does not appear that Einstein had any influence on Minkowski.

Special relativity did change our notions of space, time, and simultaneity, but not because of anything Einstein said. Everything Einstein said on these subjects was said earlier and better by Lorentz, Poincare, and Minkowski.

Poincare argued several years earlier that Lorentz's local time can be simply called time. Einstein was just agreeing with Poincare.

I cannot correct Wikipedia, because policy favors the so-called "reliable sources", such as Einstein biographies. The fact is that most of the historians favor Einstein. But if you trace Wikipedia articles to the primary sources, you can see that Einstein's contributions were merely expository.

There is something insidious about this whole concept of a theory being ad hoc. Basing a theory on experiment is a good thing, not a bad thing. If an experiment comes along that challenges your theories, like Michelson-Morley did, then finding some way to modify your theories to accommodate the experiment is just what a good scientist should do.

You might say that you don't just want to fudge your formulas to match the data. It is better to have an underlying theory to explain the fudge. But FitzGerald and Lorentz had exactly that. They had a belief that solid matter was held together by electromagnetic forces, and that changes in the fields would contract the matter. Wikipedia denigrates this, because detailed theories for molecular forces were only worked out later. But nevertheless, their beliefs turned out to be correct, and solid matter is held together by electromagnetic forces. The length contraction can be derived from Lorentz transformations and Maxwell's equations, as Lorentz proved, long before Einstein.

But Einstein had a better explanation, you might say. But that is not true. Einstein did not have the geometrical interpretation that is preferred today, and rejected it for years after others accepted it.

No, the rationale for crediting Einstein is based on anti-scientific ideologies, such as preferring postulates to experiments, and deriding theories grounded in experiment as ad hoc.

For details, see my book, How Einstein Ruined Physics, and postings on this blog, such as this 2017 book update.

Friday, June 1, 2018

How mathematicians connected with physicists

Professor David R. Morrison just posted Geometry and Physics: An Overview:
We present some episodes from the history of interactions between geometry and physics over the past century.
He says "we", but he is the sole author.
In 1954, during the era of minimal communication between mathematics and theoretical physics, C. N. Yang and R. L. Mills [YM54] introduced gauge transformations consisting of locally varying symmetries taking values in a compact Lie group G, and studied physical theories which are invariant under such gauge transformations. These generalized the already-familiar abelian gauge transformations from electromagnetism - the same ones we encountered in Section 1 - for which G = U(1). These gauge theories (or "Yang-Mills theories") eventually became the basis of the Standard Model of particle physics, the formulation of which was finalized in the mid 1970s using the group G = (SU(3) x SU(2) x U(1))/Z6.

In the late 1960s and early 1970s, Yang got acquainted with James Simons, then the mathematics department chair at SUNY Stony Brook where Yang was a professor of physics. In the course of their conversations, Yang and Simons came to recognize that there were important similarities between formulas which were showing up in Yang's work, and formulas which appeared in parts of mathematics which Simons was familiar with. Simons identified the relevant mathematics as the mathematical theory of connections on fiber bundles, and recommended that Yang consult Steenrod's foundational book on the subject [Ste51] (which coincidentally was published just a few years prior to the work of Yang and Mills). Yang found the book difficult to read, but through further discussions with Simons and other mathematicians (including S.-S. Chern) he came to appreciate the power of the mathematical tools which fiber bundle theory offered. ...

Simons communicated these newly uncovered connections with physics to Isadore Singer at MIT who in turn discussed them with Michael Atiyah of Cambridge University. It is likely that similar observations were made independently by others.
I have heard this story directly from Singer, Chern, and others, but I find it hard to believe.

I believe Chern said that Yang took a differential geometry course from him in China, before that 1954 paper was written. So Yang did not really re-invent gauge theories. Yang was also an egomaniac, so maybe he just pretended that he did.

Indeed, the names "gauge theory" and "gauge transformation" date back to some mathematical physics by Hermann Weyl in 1918. Wikipedia says that Pauli popularized the first widely accepted gauge theory in 1941. Some of the main ideas seem to have been published as early as 1914.

At some point it must have been obvious that special relativity could be elegantly described as spacetime with the metric dx² + dy² + dz² − dt², with electromagnetism being a connection on a circle (S¹) bundle. But I cannot find who explicitly said this first.
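In modern textbook notation, the statement amounts to the following (standard formulas, not a reconstruction of any particular historical source):

```latex
% Flat spacetime with the Minkowski metric (sign convention as above):
ds^2 = dx^2 + dy^2 + dz^2 - dt^2

% Electromagnetism as a connection on a U(1) (circle) bundle:
% the potential A_\mu is the connection 1-form, and the field
% strength F is its curvature,
F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu .

% A gauge transformation shifts the connection by an exact form,
% leaving the curvature (the physical E and B fields) unchanged:
A_\mu \to A_\mu + \partial_\mu \lambda
\quad \Rightarrow \quad
F_{\mu\nu} \to F_{\mu\nu} .
```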

Weyl was very close to saying this in 1919, and so was Kaluza, also in 1919. So the idea that this formulation was only discovered in the 1970s is absurd.

One possible explanation is that mathematicians did not realize the importance of bundles, not derived from tangent spaces, until the 1950s. Also, mathematicians quit talking to physicists around 1950. So mathematicians had an intuitive understanding in the 1920s, but would not have expressed it in terms of bundles until the 1950s, and physicists never learned it. I am not sure it ever made it into physics textbooks until recently.

At any rate, the standard model of particle physics is based on replacing the circle with SU(3)xSU(2)xU(1), using the same geometric formalism. So you would think that physicists would consider the geometric interpretation of electromagnetism fundamental and important enough for elementary textbooks.

While Morrison's paper has many examples of mathematical advances related to geometry and physics, the gauge theory of the standard model is the only one that involves genuine physics. The others could be more accurately described as mathematics that was partially inspired by physics, but which does not actually apply to any physical situation.

Wednesday, May 30, 2018

Research shows 2-qubit nuclear computation

One of the supposed applications of quantum computers is to simulate quantum mechanics. Enthusiasts claim that they will be able to calculate the structure of drug molecules and protein folding, and maybe cure cancer and find other miracles.

Here is the latest cutting edge research:
You see, theorists — the potential users of quantum computers — have a dilemma. Quantum computers hold a lot of promise. It is highly likely that a good quantum computer can calculate the properties of things like molecules and atomic nuclei much more efficiently than a classical computer. Unfortunately, the current generation of quantum computers, especially those that the average theorist can get access to, are rather limited. This gives the theorists a challenge: can they make computations less resource-intensive so that they can be performed on the currently available hardware? ...

They reduced the calculation of nucleon energy levels to mostly single-qubit operations, with just a few two-qubit ones thrown in. From this, they were able to calculate the ground state energy and estimate the binding energy (the energy required to break up the nucleus) for a deuterium nucleus.

As with all quantum computations, the results are statistical in nature, so the researchers have to perform the computation many times and take the average result. In this case, the researchers made use of two quantum computers — the IBM QX5 and the Rigetti 19Q — via their publicly available cloud computing APIs. This limited the number of computations that they could perform. Despite this, they obtained results within a few percent of the experimental values.

The calculation itself is nothing special. This particular nucleus has long been solvable with classical computers.
Got that? One or two qubits. You could simulate those qubits on a 1970s era pocket calculator.
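To make the point concrete, here is a minimal sketch of a classical two-qubit state-vector simulator in plain Python: the entire quantum state is just four complex amplitudes. The gate layout and function names are my own illustration, not the researchers' actual code:

```python
import math

# A 2-qubit state is just 4 amplitudes; start in |00>.
state = [1.0, 0.0, 0.0, 0.0]

def apply_gate(state, gate, qubit):
    """Apply a 2x2 single-qubit gate to the given qubit (0 or 1)."""
    new = list(state)
    for i in range(4):
        if (i >> qubit) & 1 == 0:     # pair index i with its partner j
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[j] = gate[1][0] * a + gate[1][1] * b
    return new

def apply_cnot(state, control, target):
    """Swap amplitude pairs (differing in the target bit) where the control bit is 1."""
    new = list(state)
    for i in range(4):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# Hadamard on qubit 0, then CNOT, gives the Bell state (|00> + |11>)/sqrt(2).
state = apply_gate(state, H, 0)
state = apply_cnot(state, 0, 1)
probs = [abs(a) ** 2 for a in state]
print(probs)  # ~[0.5, 0, 0, 0.5]
```

Twenty lines of code and four numbers of storage; a pocket calculator really would suffice.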

This is a very very very long way from doing anything worthwhile.

Quantum computer researchers have also claimed to have factored 15 into 3x5, but most of the cleverness went into reducing the work that the quantum computer had to do, so the quantum computer just had to do a couple of steps that any fool could do by hand.
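For N = 15 the classical parts of Shor's algorithm do essentially all the work. Here is a toy sketch (my own illustration, not any published experiment's code): find the period of a mod 15 by brute force, then read off the factors with the standard classical post-processing:

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n) -- the step a quantum
    computer is supposed to speed up; brute force is instant here."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(a, n):
    """Classical post-processing of Shor's algorithm."""
    r = find_period(a, n)
    if r % 2:
        return None                  # need an even period; retry with another a
    y = pow(a, r // 2, n)
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(7, 15))  # (3, 5): the period of 7 mod 15 is 4
```

The powers of 7 mod 15 are 7, 4, 13, 1, so the period is 4, and the factors fall out by hand, just as the text says.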

Tuesday, May 29, 2018

New book trashes paradigm theory

I mentioned that Oscar-winning documentary filmmaker Errol Morris is writing a book trashing his old philosophy of science professor, Mr. Paradigm Shift. The book is now out: The Ashtray: (Or the Man Who Denied Reality).

The late Thomas Kuhn does seem to have convinced most of academia that there is no objective truth, or progress in science. So his fans will probably trash this book. I haven't seen the book, but Morris is right that Kuhn's central theses don't even make any sense. And yet he was probably the most widely praised philosopher of the last 50 years.

I think that it is fair to say he was the man who denied reality. More than anyone else, he has convinced intellectuals that there is no reality. I don't know whether he really threw an ashtray and kicked out a grad student.

Kuhn went astray by doing a detailed historical analysis of Ptolemaic and Copernican astronomy. He found that Copernican astronomy was not really any simpler or more accurate or having any compelling scientific advantage. And yet it was a revolution in the sense of the Earth revolving around the Sun. So he concluded that scientific revolutions don't have any advantages over the alternatives. Yeah, it was that stupid. It is amazing that so many people bought into his theory of science.

Sunday, May 27, 2018

Arizona revises school science standards

Biology professor Jerry Coyne complains about Arizona watering down education standards for evolution. I am more offended by some of the others, such as:
P2: Objects can affect other objects at a distance
This is clearly wrong. No objects ever affect other objects at a distance. Newton had a theory that gravity worked that way, but even he was not happy about it.

Others are also questionable. Matter is made of quantum fields, not particles.

One says "human processes ... shape the Earth's surface ..." I guess that can be true if humans build a dam, but the effect on the shape of the Earth's surface is extremely tiny.

Coyne doesn't like evolution being called a "theory".
“What we know is true and what we believe might be true but is not proven and that’s the reality,” Diane Douglas, state superintendent of public instruction, tells 3TV/CBS 5. “Evolution has been an ongoing debate for almost 100 years now. There is science to back up parts of it, but not all of it.”

“Not proven”!!!?? She fails to clarify, of course, that nothing is “proven” in science: we just get better and better explanations. But if you use “proven” in the vernacular sense, as something on whose truth you’d bet your house and life savings, then yes, evolution is as “proven” as is the fact that the Earth goes around the Sun and that benzene has six carbon atoms arranged in a ring.
Coyne wrote a pretty good book on this subject, but he needs better examples of scientific facts.

Earth only goes around the Sun if you take the Sun as a frame of reference. So I would not call that a proven fact.

I guess it is okay to say benzene is a ring, but the wave function is more complicated than that.

Update: A comment quotes from the full document:
All objects have an effect on other objects without being in contact with them. In some cases, the effect travels out from the source to the receiver in the form of radiation such as visible light. In other cases, action at a distance is explained in terms of the existence of a field of influence between objects, such as a magnetic, electric, or gravitational field. Gravity is a universal force of attraction between all objects, however large or small, keeping the planets in orbit around the Sun and causing terrestrial objects to fall towards the center of the Earth.
This reads as if it were written 150 years ago. Since about 1880 we have known that visible light is a pulsing electromagnetic field. Since about 1915, gravity has been understood as curved spacetime, and not a force acting at a distance.

Wednesday, May 23, 2018

Giant black hole is cosmic vacuum cleaner

Science popularists used to go around telling people that it is a big myth that a black hole can act as a cosmic vacuum cleaner. The Wikipedia List of common misconceptions used to say so.

The NY Times reports:
Astronomers in Australia now say they have found the hungriest heart in all the cosmos. It is a black hole 20 billion times the mass of the sun eating the equivalent of a star every two days.

The black hole is growing so rapidly, said Christian Wolf, of the Australian National University, who led the team that found it in the depths of time, “that it is probably 10,000 times brighter than the galaxy it lives in.” So bright, that it is dazzling our view and we can’t see the galaxy itself. ...

Black holes are a one-way gate to gravitational oblivion, according to Einstein’s theory of general relativity, but they can only swallow so much, depending on their size; the rest of the matter and energy gets splashed out across space, producing the fireworks popularly known as quasars.

The blaze from material swirling around this newly observed drainpipe into eternity — known officially as SMSS J215728.21-360215.1 — is as luminous as 700 trillion suns, according to Dr. Wolf and his collaborators. If it were at the center of our own galaxy, the Milky Way, it would be 10 times brighter than the moon and bathe the Earth in so many X-rays that life would be impossible.
Based on this, I'd say that it is fair to call a black hole a cosmic vacuum cleaner. No space traveler would want to get anywhere near such a thing.

This is also one of the brightest objects in the universe. Not so black, I guess.

NY Times also reports:
It happens every 405,000 years. The Earth’s orbit gradually changes shape from almost circular to slightly elliptical over a period of 202,500 years, and then starts returning to form over the next 202,500 years — like a metronome swinging side to side.

Right now, we are in an almost perfectly circular orbit around the sun, and soon — within some thousands of years, that is — we will start moving toward the elliptical.

This happens because of the Earth’s gravitational interactions with other planets, especially Jupiter and Venus — Jupiter because it is very large, and Venus because it is very near.
Is this settled science? It seems like it could have far-reaching consequences. Maybe life on Earth was only possible because Earth is stabilized by the Moon, Venus, and Jupiter.

Monday, May 21, 2018

Quantum supremacy delayed to attend funerals

Scott Aaronson announces:
a weeklong visit to Google’s quantum computing group in LA. While we mourned tragedies—multiple members of the quantum computing community lost loved ones in recent weeks — it was great to be among so many friends, and great to talk and think for once about actual progress that’s happening in the world, as opposed to people saying mean things on Twitter. Skipping over its plans to build a 49-qubit chip, Google is now going straight for 72 qubits. And we now have some viable things that one can do, or try to do, with such a chip, beyond simply proving quantum supremacy — I’ll say more about that in subsequent posts.
When you are overdue on a high-profile project, the last thing you want to admit is that your goal is unrealizable.

No, a better strategy is to (1) say that you are attending funerals of family members, and (2) raise the stakes, and say that a higher goal can be achieved instead if only management supplies more time, staff, and money.

Am I being too cynical here? Okay, maybe.

Google and IBM both bragged that they would achieve quantum supremacy in 2017. They said that 50 qubits was the magic threshold. Then they dropped back to 49 qubits, a number that seems carefully chosen to allow them to claim the first real quantum computer, but such that they would not have to show the performance that we expect from quantum supremacy.

2017 ended with no new 49-qubit quantum computer, no quantum supremacy, and no explanation for the failed promises.

Okay, maybe they really did have some funerals to attend. Maybe quantum supremacy is really just around the corner.

I don't believe it. They are stringing along with empty promises, as this community has done for 20 years.

I will be watching for any proof that I am wrong. I will post it as soon as it is announced. Then you can all laugh at me.

But if there is still no quantum supremacy in 5 or 10 years, what will you say then?

Monday, May 14, 2018

Beables and brown cows

A new paper starts:
From its earliest days nearly a century ago, quantum mechanics has proven itself to be a tremendously accurate yet intellectually unsatisfying theory to many. Not the least of its problems is that it is a theory about the results of measurements. As John Bell once said in introducing the concept of `beables', it should be possible to say what is rather than merely what is observed.
This paragraph describes how Physics forked into hard science and philosophical beable-babble.

I am a logical positivist. So I have a simple attitude: when you start talking about things that cannot be observed, you might as well be talking about ghosts. If there is no scientific observational way of saying that you are right or wrong, then it is just opinion, or philosophy, or religion, or some other immaterial belief. It is like you telling me that you like paintings of water lilies. I will not usually even have an opinion as to whether you are right or wrong, because it is not clear that any such opinion makes any sense.

Bohr, Heisenberg, and other creators of quantum mechanics were positivists.

At some point positivism fell out of fashion, and hardly anyone advocates it anymore. But this beable stuff has gone nowhere. No good physics has resulted from beable theory.

The paper tells this story:
When I was in graduate school in Scotland, I was told the following parable by my advisors. An economist, a mathematician, and a logician were on a train traveling north. Just after they passed the Scottish border they noticed a single cow standing in a field. The economist remarked, "That cow is brown. All cows in Scotland must be brown." The mathematician replied, "No, one cow in Scotland is brown." The logician quietly but firmly muttered "No, one side of one cow in Scotland is brown." There are many versions of this parable involving a variety of professions and there are any number of lessons to be taken from it. It is usually meant as a dig at one of the particular professions that is included, especially when told by a member of one of the other professions. At the heart of the parable, though, is an open question: how much can we reasonably infer from a given observation?
The author thinks that the mathematician is the most reasonable of the three.

At least cow color can be measured. Many of the arguments about foundational quantum mechanics involve things that cannot be measured.

Saturday, May 5, 2018

Aaronson discusses whether education is worthless

A new book by radical libertarian economist Bryan Caplan says that public education is a big waste of money, and complexity theorist Scott Aaronson reviews it:
When the US Congress was debating whether to cancel the Superconducting Supercollider, a few condensed-matter physicists famously testified against the project. They thought that $10-$20 billion for a single experiment was excessive, and that they could provide way more societal value with that kind of money were it reallocated to them. We all know what happened: the SSC was cancelled, and of the money that was freed up, 0% — absolutely none of it — went to any of the other research favored by the SSC’s opponents.

If Caplan were to get his way, I fear that the story would be similar. Caplan talks about all the other priorities — from feeding the world’s poor to curing diseases to fixing crumbling infrastructure — that could be funded using the trillions currently wasted on runaway credential signaling. But in any future I can plausibly imagine where the government actually axes education, the savings go to things like enriching the leaders’ cronies and launching vanity wars.

My preferences for American politics have two tiers. In the first tier, I simply want the Democrats to vanquish the Republicans, in every office from president down to dogcatcher, in order to prevent further spiraling into nihilistic quasi-fascism, and to restore the baseline non-horribleness that we know is possible for rich liberal democracies.
No, cost overruns killed the SSC. It was designed and budgeted for a 4cm tube, and they later decided that they needed 5cm, requiring billions of dollars more in superconducting magnets.

The SSC was oversold, but I doubt that Congress realized that. It was supposed to find lots of new physics. The Europeans then went and built the LHC, but all it did was to confirm the Standard Model and measure the Higgs mass.

I am mainly just trying to understand Aaronson's thinking here. He is obviously a typical Jewish leftist intellectual authoritarian here, as he pushes for one-party rule with ample funding for his favorite academic projects.

Does this explain his strange silence about quantum computing? He has been refusing to comment to the press. He has spent much of his life researching the potential for quantum computing, so you'd think that he would be excited by all the current research. Maybe he knows that it is an overhyped dud, but doesn't want to say so because he doesn't want the research money to be diverted into areas of less intellectual interest to him.

Monday, April 30, 2018

Coyne veers above his leftist pay grade

Leftist-atheist-evolutionist-determinist professor Jerry Coyne is ranting against free will again. He is a hard-core determinist (altho maybe not a super-determinist), and he frequently argues against the possibility that determinism or quantum physics leave any room for free will. The problems with his approach are:

1. He says fundamental physics proves that human behavior is deterministic.
2. He often tells people what they ought to do, which seems impossible if there is no free will.
3. He is frustrated that the goals of leftist egalitarianism appear unachievable, if all success is due to unjust privileges.

I will detail these points in turn.
All three philosophers save Beebee are determinists: she argues that science is a long way from determining whether determinism is true, though I think she’s dead wrong here. As Sean Carroll notes, the laws of physics of everyday life are completely understood, and to me that means that determinism is correct (save for the possibility of true quantum indeterminacy, which can’t play a role in any meaningful notion of human agency, since we have no control over our electrons).

Both Beebee and Blackburn evince various degrees of “compatibilism”: that there are some notions of free will that are compatible with determinism. But none of the discussants espouse any form of contracausal free will: that at any time, you could have done something other than what you did. ...

In fact, virtually all philosophers, including compatibilists, are determinists with respect to human behavior, as there is no evidence to the contrary and, as Sean Carroll observes, the physics of everyday life is known pretty completely.
It is bizarre to deduce determinism from fundamental physics. Quantum mechanics is explicitly non-deterministic. Even classical mechanics is non-deterministic, if you take into account measurement error and chaos.
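The chaos point is easy to demonstrate numerically with the logistic map, a standard textbook example (my own illustrative sketch, not tied to any particular physical system): an initial difference far below any conceivable measurement precision grows to order one in a few dozen iterations, so the "deterministic" equations predict nothing in practice.

```python
def logistic_divergence(x0, eps, steps, r=4.0):
    """Iterate two copies of the chaotic logistic map x -> r*x*(1-x)
    from initial conditions eps apart; return the largest gap seen."""
    x, y = x0, x0 + eps
    worst = 0.0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
        worst = max(worst, abs(x - y))
    return worst

# A difference of 1e-12 in the initial condition -- far below any
# realistic measurement error -- grows to order one within about
# 60 iterations, while after 5 iterations it is still invisible.
print(logistic_divergence(0.2, 1e-12, 60))
```

The two trajectories are completely decorrelated by the end, even though each step is an exact deterministic formula.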

Sean M. Carroll's excuse is that he believes in the many-worlds interpretation (MWI) of quantum mechanics. Thus he would say that a coin toss is not random, because the toss causes the world to split into two, one with heads and one with tails. If you think that you are making a free-will decision, all you are really doing is splitting your body and mind into multiple worlds, with each possible decision occurring in different worlds. He wrote a paper once claiming that it is still possible to believe in probabilities in MWI, but nobody accepts it.

Coyne's version of the anti-free-will argument is to say that all physical events are either determined by physical law, or left undetermined by quantum mechanics. If an event is undetermined by physics, then it cannot be determined by a free will decision either, as the mind is governed by physical law.

This argument is not really a scientific argument, as it does not depend on any scientific theories or facts. Ultimately, it is just a tricky way of trying to define away the possibility of free will. Believe it if you want, but there is no science behind it.

The fact is that we don't have a scientific theory of consciousness and free will. I do have everyday experience that convinces me that I have consciousness and free will, and there is no science to the contrary.
I’ve been accused, for instance, of being a compatibilist by using the word “ought”. But to me the word is shorthand for the idea “if you want good consequence X, you should do action Y, and if you don’t, you can be called out.” I see no sense of agency in using that word, or that it plays a role in any form of free will that’s meaningful. Rather, using “ought” is just telling someone that actions have predictable consequences, and you can be shamed/punished/jailed for not doing something that promotes good consequences.
So it is his duty to go around shaming fully-programmed robots for doing what they are programmed to do?

Fine, let's shame fat people for eating too much. Let's shame homosexuals for risking GRIDS, as it used to be called. Is that what he is condoning here? He once called me out for being an idiot.

Rather than refute this view, I am just trying to grok it. How does one get thru life thinking that all his decisions are pre-determined somehow, and still have all these opinions about what other people ought to be doing?
The connection between determinism and the social-justice notion of “privilege” is one worth exploring, ...

This leads to an infinite-dimensional intersectionality in which all forms of undeserved “privilege” should be battled: not to make everybody’s life outcome in society equal, but to ensure that everybody gets the same chance to succeed. Why one form of privilege, say “whiteness” or “maleness”, should get more attention than others depends on whether those traits are the most important in determining equal opportunity as opposed to, say, factors like parental wealth, intelligence, or social class.

That is all above my pay grade, but, as Burkemann notes, we’ll never have equality of outcome, for we’ll always have winners and losers. All we can do is ensure equal opportunity. But that in itself is a huge social task, and it must begin at birth.
Why does he want to give the un-privileged a chance to succeed, if they have no free will and their fates are determined anyway?

He's right that many traits influence success in life, and that we will never have equality of outcome. But then why is Coyne a leftist?

The main difference between right-wingers and left-wingers is that right-wingers accept that there are human differences that cannot be eliminated by public policy. Left-wingers just mindlessly deny knowledge on this subject, and pretend that public policy can perfect and equalize the human condition.

Coyne is amusing because all of the contradictions in his life philosophy are staring him in the face, and he doesn't know what to do about it. How could he, if he rejects free will?