Thursday, December 29, 2022

What is Relativity?

There are two main ways to describe the theory of relativity.

(1) A way of reconciling Maxwell's electromagnetism theory with the motion of the Earth.

(2) Combining space and time into a 4-dimensional spacetime with a non-Euclidean geometry, such that the laws of physics respect that geometry.

Version (1) was the view of Lorentz and Einstein (up to 1912 or so), and also FitzGerald, Maxwell, and other pioneers.

Version (2) was the view of Poincare (1905), Minkowski (1907), Grossmann (1913), and Hilbert (1915). Einstein eventually seemed at times to adopt this view, but he also denied that geometry had any essential role.

Wikipedia defines:

The theory of relativity usually encompasses two interrelated theories by Albert Einstein: special relativity and general relativity, proposed and published in 1905 and 1915, respectively.[1] Special relativity applies to all physical phenomena in the absence of gravity. General relativity explains the law of gravitation and its relation to the forces of nature.[2] It applies to the cosmological and astrophysical realm, including astronomy.[3]

The theory transformed theoretical physics and astronomy during the 20th century, superseding a 200-year-old theory of mechanics created primarily by Isaac Newton.[3][4][5] It introduced concepts including 4-dimensional spacetime as a unified entity of space and time, ...

Typical scholarly historical papers are Elie Zahar's Why did Einstein's Programme supersede Lorentz's?, Part I and Part II. (Paywalled)

It is a historical fact that Einstein's programme had no significant influence, and it was the Poincare-Minkowski geometric view that superseded Lorentz's in 1908. Einstein's programme was called the Lorentz-Einstein theory, and almost nobody thought he was saying anything different from Lorentz. Vladimir Varicak credited Einstein with a more geometric view in 1911, but Einstein published a rebuttal, denying any difference from Lorentz.

In today's textbooks, relativity is the covariance of Maxwell's equations and other laws under the symmetries of spacetime, notably Lorentz transformations.

Wikipedia is correct that relativity "introduced concepts including 4-dimensional spacetime as a unified entity of space and time", but Einstein did not introduce that concept, and did not accept it in 1908 when everyone else did. It is not clear that he ever completely accepted it, as he denied the importance of geometry.

When I say non-Euclidean geometry, I do not just mean the curvature of general relativity, or the hyperbolic geometry of rapidity in special relativity. I mean the geometry that Minkowski so clearly explained in 1907-8, where 4-D spacetime has an indefinite metric and a symmetry group preserving that metric. A geometry is a space with some structure, and a group preserving the structure. For example, see this recent Roger Penrose interview, where he calls it Minkowski geometry. Or see this recent Terry Tao post.
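To spell that out (standard textbook notation, not taken from any of the sources above): Minkowski's geometry is the space R^4 equipped with an indefinite metric, together with the Lorentz group of linear maps preserving it:

    ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2, \qquad
    O(1,3) = \{\Lambda : \Lambda^{\mathsf{T}}\,\eta\,\Lambda = \eta\}, \quad
    \eta = \mathrm{diag}(-1,1,1,1).

Adding translations gives the Poincare group, the full symmetry group of special relativity.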

Thursday, December 22, 2022

Is Psi Ontic or Epistemic?

Here is a typical paper addressing the reality of the wave function in quantum mechanics:
The ontological models framework distinguishes ψ-ontic from ψ-epistemic wave-functions. It is, in general, quite straightforward to categorize the wave-function of a certain quantum theory. Nevertheless, there has been a debate about the ontological status of the wave-function in the statistical interpretation of quantum mechanics: is it ψ-epistemic and incomplete or ψ-ontic and complete?
I do not see how this question makes any sense. The wave function is not directly observable. It is useful for making predictions. What sense is there in wondering whether or not it is real?

Bell's theorem shows that psi cannot be epistemic in the sense of representing our knowledge about an underlying local hidden variable theory. But it could still encode our knowledge about the physical state.

The paper also talks about whether psi is statistical. Again, this makes no sense. All physical theories are statistical in the sense that you can run multiple experiments, and compute statistics on the predictions and outcomes. Quantum mechanics is not much different from classical mechanics in this way. All the theories give error bars on predictions, if you do them right.

Supposedly the PBR Theorem proves that psi is ontic. This is very misleading, at best. If I release an electron from my lab, its wave function will soon be miles wide. Does anyone think that the ontic reality of that electron is miles wide? No, that is absurd. Only our uncertainty is miles wide, because we don't know which way the electron went.

The PBR paper was originally submitted under the title, "The quantum state cannot be interpreted statistically". It is baffling how anyone could think that they proved something so absurd. Everything can be interpreted statistically. How could it not be? When the paper was published, the editors forced them to change the title to "On the reality of the quantum state". Of course the paper cannot prove anything about reality either.

The Schroedinger Cat thought experiment was supposed to dispose of the idea that the wave function is ontic. The cat is not really half dead and half alive. The wave function predicts probabilities consistent with observation, and that is all. Schroedinger and Bohr thought that this was obvious.

A lot of people seem to believe that the Copenhagen interpretation requires believing in a cat that is a superposition of dead and alive. Wikipedia says so. But it also says that Bohr himself did not, and believed that something like decoherence would irreversibly put the cat into the dead state or the alive state, even before anyone opens the box.

Here is a recently-released short video on Max Tegmark - Many Worlds of Quantum Theory. I wonder if these physicists realize how ridiculous they sound when they give layman explanations.

He says he believes that the many worlds are real, because we see electrons being in two places at once in the lab. But no one has ever seen an electron in two places at once. The double-slit experiment can be interpreted as an electron going through both slits, but that is just a wave property of an electron. And even if an electron can be in two places, and humans are made of electrons, it does not follow that humans can be in two places.

Never mind the details. Many worlds theory is an absurdity. He says these worlds are real, and there is no evidence for them whatsoever. Listening to him is like going to a comic book convention and watching someone, dressed in a costume, ramble about some fantasy world as if it were real.

Tegmark ends by saying that many-worlds could be disproved if quantum mechanics were disproved. He does not even admit the obvious possibility that quantum mechanics is correct and many-worlds is not.

Sabine Hossenfelder has a new lecture on The Other Side Of Physics (TEDxNewcastle). It is hard to take anything she says seriously, once you realize that she believes in superdeterminism. She talks about Physics as a tool for answering questions, but that is not true under superdeterminism. That teaches that experimental results cannot be believed, because the outcomes could have been forced by a conspiracy dating back to the Big Bang. In particular, it says that all the quantum experiments could be fake, and that we really live under classical mechanics.

In particular, she argues that the correlations of the Bell test experiments are all wrong because they are artifacts of the experimenters failing to properly randomize the input data, and they always fail because of mysterious constraints.

She also says that Einstein's special theory of relativity showed that there is no such thing as "now". This is because when we see or hear something, it takes time for the signals to reach our brains.

Then she rambles about multiverse theory being ascientific, because the scientific method cannot be applied to it. But surely superdeterminism is even worse, and is unscientific, as it says that the scientific method does not work.

She says that information cannot be destroyed. Seems like more unscientific nonsense to me.

There are certain ideas in Physics that are so outlandish as to discredit anyone who promotes them. I include: many-worlds, superdeterminism, retrocausality, simulation hypothesis, and action-at-a-distance.

Monday, December 19, 2022

Do moving clocks slow down?

Some Russian scholars address this question:
The special theory of relativity has fundamentally changed our views of space and time. The relativity of simultaneity in particular, and the theory of relativity as a whole, still presents significant difficulty for beginners in the theory. ...

The real question is why we can even find real clocks (such as atomic clocks) that work very much like the ideal clocks of relativity. The short answer to this question is that a real clock behaves like an ideal clock because the physics that governs its inner workings is Lorentz invariant. ...

However, this question makes sense in the more ambitious program of Lorentz and Poincaré on how the theory of relativity should be developed. Lorentz noted in 1909 that the main difference between the two programs is that “Einstein simply postulates what we have deduced, with some difficulty and not altogether satisfactorily, from the fundamental equations of the electromagnetic field” [109].

Although the Lorentz-Poincaré program of deriving the Lorentz symmetry rather than postulating it was never completely abandoned [110, 2], it is clear that Einstein’s program replaced it [109, 111], and for good reason: even today, Lorentzian symmetry cannot be deduced from a more fundamental theory.

The phrase “moving clocks slow down” is universally used and has been in common use for over a hundred years, and we realize that our suggestion not to use it, while correct, is not very realistic due to the “intellectual inertia” [112] of the physics community. ...

In our opinion, better pedagogical practice would be to base the special relativity from the very beginning on the Minkowski four-dimensional formalism, which allows the introduction of relativistic concepts without any mention of their Newtonian counterparts [13].

It is amusing to see Russian scholars engaging in the same sort of Einstein idol worship that we see in the West, especially when contrary evidence is staring them in the face.

Yes, Einstein simply postulated what Lorentz and Poincare proved. By this, Lorentz meant that they deduced it from experiments, like Michelson-Morley, and established theory, like Maxwell's equations. Einstein ignored all that, took their conclusions, and made them postulates.

Einstein did not dispute this. His original papers did not have any references, but in later interviews he always said he got the light postulate from Lorentz, and did not use Michelson-Morley himself.

The Minkowski 4-D formalism was invented and published by Poincare in 1905, and popularized by Minkowski in 1908. Einstein eventually accepted it a few years after that.

I am not sure Einstein's program is really so popular in textbooks. All the books I've seen either start with Michelson-Morley, or go straight to 4-D. The books might credit Einstein with these ideas, but he missed them.

For example, here are a couple of recent lectures on spacetime, here and here. Both credit Einstein with discovering relativity, but acknowledge that it was Minkowski's 4-D formalism that caught on among the physicists of the day and still dominates, and that Einstein accepted it only reluctantly, a couple of years later, after arguing against it.

In his 1922 Kyoto lecture “How I Created the Theory of Relativity,” Einstein describes the decisive moment when he became enlightened in conversations with his friend Michel Besso: “My interpretation was really about the concept of time. Namely, time could not be defined absolutely, but is in an inseparable relationship with the signal velocity” [107].

Although the alleged cause of Einstein’s redefinition of time is not known for certain, and it is conceivable that Einstein, consciously or unconsciously, may have borrowed from other authors, most plausibly from Poincaré, more than his writings and sayings suggest [108], without any doubt, this new definition of time was the most shocking and paradoxical aspect of special relativity.

Einstein elevated Lorentz’s local time to the status of “true” time, and for his contemporaries it became the successor to Newtonian time. This immediately gave the theory a paradoxical tinge, since Newtonian time is absolute, while Lorentz’s local time varies depending on the inertial frame of reference.

Einstein said his big breakthrough in 1905 was to understand the local time that Lorentz invented in 1895. Lorentz got the Nobel Prize in 1902, and Poincare's nomination credited him with the ingenious invention of local time.

To answer the title question, the clocks do not slow down, relative to their own world lines. They appear to slow down in another frame, because of the 4-D non-Euclidean geometry.
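In formulas (standard special relativity, spelled out only to make the point concrete): a clock reads the proper time along its own world line,

    \tau = \int \sqrt{1 - v^2/c^2}\; dt,

which follows from the Minkowski metric via c^2\,d\tau^2 = c^2\,dt^2 - dx^2 - dy^2 - dz^2. The factor \sqrt{1 - v^2/c^2} is frame-dependent bookkeeping from the geometry, not a mechanical change inside the clock.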

Discussion of whether the special relativity contractions and dilations are real or apparent has a long history. From Wikipedia:

In the period 1909 to 1913 Varićak had correspondence with Albert Einstein[6] concerning rotation and length contraction where Varićak's interpretations differed from those of Einstein. Concerning length contraction Varićak said that in Einstein's interpretation the contraction is only an "apparent" or a "psychological" phenomenon due to the convention of clock measurements whereas in the Lorentz theory it was an objective phenomenon.[7] Einstein published a brief rebuttal, saying that his interpretation of the contraction was closer to Lorentz's.[8]
See also Reality of length contraction. Einstein still did not accept in 1911 that relativity used a non-Euclidean geometry, and that the contractions and dilations were artifacts of that geometry.

Another new paper argues:

There have been three geometrizations in history. The first one is historically due to the Pythagorean school and Plato, the second one comes from Galileo, Kepler, Descartes and Newton, and the third is Einstein's geometrization of nature. The term geometrization of nature means the conception according to which nature (with its different meanings) is massively described by using geometry.
Einstein would not agree with this. He did not believe that General Relativity geometrizes gravity. He persisted in this view, long after he was credited with geometrizing gravity. Steven Weinberg also did not like that view. Strange, as they sometimes used geometric arguments to solve general relativity problems, and almost everyone else accepts the geometry view.

Wednesday, December 14, 2022

Dynamical Models do not Generate Free Will

Physics professor Scott Aaronson's blog attracts trolls, but a couple of comments there are so glaringly wrong that it is worth showing why.

SR Says:

As pointed out by Mateus Araújo above, QFT is local, so it seems to me that the only ways to reconcile this with the nonlocality of standard QM evidenced by the Bell test are: (1) QFT being totally incorrect in an easily-measurable way, (2) Many-Worlds, so that the appearance of non-locality is only due to our observable universe residing in a slice of the “true” wavefunction, (3) superdeterminism.
If those were really the only consequences of the Bell tests, then the Nobel citation would have said so. It did not.

Fred writes:

“It’s very weird that we feel that we have the free will to perform experiments and the consciousness to feel like we understand them, when, to our best understanding, everything is predetermined by the laws of physics.”

What’s even weirder is that determinism isn’t even the core culprit here! You only have two ingredients at each end of the spectrum: perfect determinism (e.g. a bunch of balls moving around and hitting one another… i.e. every event is caused by a prior event), and pure randomness (everything is uncorrelated noise, i.e. no causality, things happen without a prior cause… very weird too when you think about it).

And then you can mix those two in various amounts, on a continuous scale. QM is a mix of determinism and randomness, somewhere in the middle of the scale. MWI + consciousness also seems to lie in the middle of the scale (the wave function of the universe is determined, but my place as a conscious being on that structure seems random, from my subjective point of view).

When it comes to free will: sure, determinism seems to obviously exclude it… but randomness seems to exclude it too! For the general idea of free will isn’t exactly understood by throwing a dice at every moment a supposed “decision point” happens.

He is right that our dynamical models do not generate free will. That is why it is called free will.

He responds:

You might as well call it pixie magic dust then?

A dynamic system is a system whose state evolves with time (a parameter).

My point is that dynamic systems either evolve following causality (current state is derived from prior state) and/or randomness (current state is independent of prior state), and then any degree of mix of those two things (where events depend partly on prior events and partly on some randomness).

Note that randomness is non-determinism, meaning an event without any cause within the system. Whether that randomness is pure (appearing magically within the system) or is a dependence on causes external to the system is basically the same.

That’s it!

What other ingredient would there be?

No, randomness is not an event without any cause within the system, it is an event without cause in the model. Something is random if your model cannot predict it. There is no such thing as pure randomness.

Free will is a form of randomness that no one is even trying to model.

He moved on to the latest scientific hoax:

Looks like fusion energy supremacy has been demonstrated!

Scott Says: Only “scientific supremacy,” not supremacy where you account for the actual energy cost of the lasers which is still 100x too high. Still good though!

They refer to this story:
Scientists with the U.S. Department of Energy have reached a breakthrough in nuclear fusion.

For the first time ever in a laboratory, researchers were able to generate more energy from fusion reactions than they used to start the process. The total gain was around 150%.

"America has achieved a tremendous scientific breakthrough," Energy Secretary Jennifer Granholm said at a press conference.

The achievement came at the National Ignition Facility (NIF), a $3.5 billion laser complex at Lawrence Livermore National Laboratory in California. For more than a decade, NIF has struggled to meet its stated goal of producing a fusion reaction that generates more energy than it consumes.

To make this claim, they ignore the energy required to run the lasers, and to power the confinement structure. As Scott notes, they are really putting 100x energy in.
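The arithmetic is simple. Here is a rough sketch using the widely reported figures (the ~300 MJ wall-plug number in particular is an approximate published estimate):

    # Widely reported NIF figures (approximate; the ~300 MJ wall-plug
    # energy for firing the lasers is a rough published estimate).
    laser_on_target_MJ = 2.05   # laser energy delivered to the target
    fusion_yield_MJ    = 3.15   # fusion energy released
    wall_plug_MJ       = 300.0  # grid energy used to run the lasers

    target_gain  = fusion_yield_MJ / laser_on_target_MJ  # ~1.5, the "150%"
    overall_gain = fusion_yield_MJ / wall_plug_MJ        # ~0.01
    print(f"target gain:  {target_gain:.2f}")    # ~1.54
    print(f"overall gain: {overall_gain:.4f}")   # ~0.0105, about 100x short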

The analogy here is that this new experiment is supposed to show that it is possible to generate fusion energy, even if the overall efficiency is lousy. Quantum supremacy is supposed to show that it is possible to generate a super-Turing computation, even if it is completely useless.

Update: I added this comment:

@fred: You say that events are either caused by the previous state (determinism), or independent of it (random), and magic pixie dust is the only other possibility.

Consciousness is the other possible cause. My conscious state appears inscrutable to you. You cannot model it, and my decisions appear unpredictable. You might as well call them random, if you cannot predict them. I cannot really explain it either, except to say that I am more sure of it than any of my perceptions. Cogito ergo sum. (I think, therefore I am.)

Readers here might say I could be a troll or an AI bot, and hence not to be believed. Fine, decide for yourself. Do you really think that modern physics has explained causality so thoroughly as to rule out human consciousness? Or do you make decisions everyday that science cannot predict or explain?

Update: Another comment:
They got a breakthrough on Nuclear Fusion. No one now can say quantum computing is the nuclear fusion of the 60s.
Not sure if this is intended to be sarcastic. Quantum computing is more like nuclear fusion than ever, with both fields making big splashes with supremacy claims of no practical significance.

Monday, December 12, 2022

Impossible or Fundamentally Impossible

Here is Scott Aaronson's main argument for quantum computing:
I notice that you never responded about a fault-tolerant QC running Shor’s algorithm. Do you believe that that’s fundamentally possible, or not? If not, what physical principle is going to come in and prevent it? Will you agree that the discovery of that principle would be a revolution in physics?
He draws a distinction between what is impossible, and what is fundamentally impossible. I am not sure the distinction makes any sense.

In other words, QC might be impossible, but it is not fundamentally impossible unless some law of physics forbids it. We have not found that law of physics. The Extended Church-Turing Thesis forbids it, but he says it is not fundamental and it "is still on thin ice."

A response:

From an engineering point of view, there are often unforeseen limitations emerging from complex interactions of different domains such as physics of materials, chemistry, thermodynamics, mechanics, economics, etc. Those different knowledge fields are themselves at a much higher conceptual level compared to the underlying basic physics they all share, so their own laws/heuristics only hold in specific domains with specific assumptions, and all those various models (often highly non linear) just don’t overlap.

There’s nothing in the basic laws of physics explicitly saying that you can’t build a stable stack of quarters from here all the way up to the edge of space.

But do you believe it can be done? Given an existing stack of quarters, it’s trivial to just add one more quarter to it, and then by recursion assume the stack can be arbitrarily high. But that’s not how system scalability works in practice: at some point, what works for 100 quarters won’t work for 1000 quarters, because new problems are introduced: e.g. the wind will screw things up, and if you build your stack inside a tube with a vacuum, you’re now facing another totally different engineering challenge (create a 100 mile-high tube that can contain a vacuum). And, even without air, you’d have to deal with the effects of tides, plate tectonics, strength limitations in alloys, etc.

There’s also no specific law of physics telling us whether building room temperature super-conductors is impossible.

Same about building a stealth bomber that can travel faster than mach 5 at sea level.

It also goes the other way: a hundred years ago, it would have seemed impossible (given the technology of the day, but given pretty much the same laws of physics) to build a gravitational wave detector that could measure changes in distance around 1/10,000th of the diameter of a proton, between two mirrors separated by 4km.

So, for the vast majority of hard engineering problems (and building a QC *is* a hard engineering problem), the fact that there’s no clear black and white basic principle saying it’s impossible isn’t really helping much at all. It wouldn’t be the first time we set out to build something, and then it never happens because various requirements just can’t be met within the same system (often it’s quietly killed because money runs out and people move on to other things because some new engineering progress makes an entirely different problem more exciting to work on).

So maybe QC is like that gravitational wave detector. It seemed impossible for a long time, until some huge technological advances were made.

In the same thread, Aaronson ridicules the idea that Einstein might have thought that ER=EPR. He did not even believe in either wormholes or quantum mechanics. It is not clear today if anyone really believes this wormhole entanglement nonsense. Somehow Einstein did inspire a lot of bogus physics thinking.

Sabine Hossenfelder has a new video on Quantum Uncertainty Simply Explained. She correctly describes the uncertainty principle as an essential part of quantum mechanics, but also explains that all waves obey an uncertainty principle. The uncertainty principle is a way of saying that electrons have wave-like behavior.

Update: Quanta has an article by Robbert Dijkgraaf saying There Are No Laws of Physics. There’s Only the Landscape. That is because he is a string theorist who studies mathematical abstractions that have nothing to do with reality.

Thursday, December 8, 2022

Holographic Wormhole on a Microchip

From Nature, the leading European science journal:
A holographic wormhole in a quantum computer

Physicists have used a quantum computer to generate an entity known as an emergent wormhole. Quantum systems can be linked by entanglement even when separated by extremely long distances. The authors generated a highly entangled quantum state between the two halves of a quantum computer, creating an alternative description, known as a holographic dual, in the form of an emergent wormhole stretched between two exterior regions. They then simulated a message traversing this wormhole. Such exotic physics is part of efforts to reconcile quantum mechanics with the general theory of relativity.

Dr. Quantum Supremacy responds:
Tonight, David Nirenberg, Director of the IAS and a medieval historian, gave an after-dinner speech to our workshop, centered around how auspicious it was that the workshop was being held a mere week after the momentous announcement of a holographic wormhole on a microchip (!!) — a feat that experts were calling the first-ever laboratory investigation of quantum gravity, and a new frontier for experimental physics itself. Nirenberg asked whether, a century from now, people might look back on the wormhole achievement as today we look back on Eddington’s 1919 eclipse observations providing the evidence for general relativity.

I confess: this was the first time I felt visceral anger, rather than mere bemusement, over this wormhole affair. Before, I had implicitly assumed: no one was actually hoodwinked by this. No one really, literally believed that this little 9-qubit simulation opened up a wormhole, or helped prove the holographic nature of the real universe, or anything like that. I was wrong.

That 1919 eclipse was hyped with a NY Times headline: “Men of Science More or Less Agog Over Results of Eclipse Observations”.

Update: Here is the Quanta video.

Almost a century ago, Albert Einstein realized that the equations of general relativity could produce wormholes. But it would take a number of theoretical leaps and a “crazy” team of experimentalists to build one on Google's quantum computer. Read the full article at Quanta Magazine:
Quanta used to be a respectable magazine. As was Nature, which is now filled with woke nonsense.

Monday, December 5, 2022

What is Entanglement?

Entanglement is supposed to be the essence of quantum mechanics, but I wonder about it. Wikipedia defines:
Quantum entanglement is the physical phenomenon that occurs when a group of particles are generated, interact, or share spatial proximity in a way such that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.[1]

Measurements of physical properties such as position, momentum, spin, and polarization performed on entangled particles can, in some cases, be found to be perfectly correlated. For example, if a pair of entangled particles is generated such that their total spin is known to be zero, and one particle is found to have clockwise spin on a first axis, then the spin of the other particle, measured on the same axis, is found to be anticlockwise.

It confusingly says it is a "physical phenomenon", and then says it is a property of how quantum states are described. So it is not a physical property. I removed the word "physical".

My bigger issue is "quantum entanglement is at the heart of the disparity between classical and quantum physics". This is conventional wisdom, so it is reasonable for Wikipedia to say this, but is it true?

Classical systems exhibit entanglement, exactly as it is described here. Suppose you have a system of two balls, and you know their total momentum. Assume that momentum is conserved. Then the balls get separated, and you measure the momentum of one of them. You then know the momentum of the other ball, as it is perfectly correlated and opposite. The momenta must sum to the original total.

You can do the same with angular momentum.
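Here is a toy sketch of that classical correlation, with made-up numbers:

    import random

    # Two balls fly apart with total momentum zero (conserved).
    # Measuring one ball's momentum immediately tells you the other's.
    for _ in range(5):
        p1 = random.gauss(0.0, 1.0)  # ball 1's momentum, unknown until measured
        p2 = -p1                     # ball 2 is perfectly anticorrelated
        print(f"ball 1: {p1:+.3f}  ball 2: {p2:+.3f}  total: {p1 + p2:+.3f}")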

So how is this any different from the quantum entanglement? I do not see any difference.

Anticipating your objections, you might tell me that the quantum momentum and spin are not known until measured, or that Bell's theorem puts limits on certain correlations. Okay, I accept all that, but what does it have to do with the definition of entanglement? Nothing.

Or you might say that entanglement is a nonlocal interaction. Spooky action and all that. But that is a big hoax. Quantum mechanics is a local theory.

A more sophisticated objection says that entanglement is a precious resource that can be used for cryptography, teleportation, and parallel computation. Quantum entangled particles have the necessary magic, and classically entangled ones do not.

This objection is harder to answer, as currently billions of dollars are being spent to try to determine whether any such magic exists. Most say yes, but they have yet to demonstrate anything useful from that magic.

Even if such magic exists, there ought to be a way to define entanglement that makes it clearly a quantum phenomenon, and not a classical one.

Quantum systems are different from classical systems. Bell proved that. The uncertainty principle is different, and so are certain correlations. But I don't see how entanglement is any different.

Lenny Susskind and others have been saying that wormholes and entanglement are the same thing.

YouTube:

Almost a century ago, Albert Einstein realized that the equations of general relativity could produce wormholes. But it would take a number of theoretical leaps and a “crazy” team of experimentalists to build one on Google's quantum computer. Read the full article at Quanta Magazine:
The idea seems to be that if entanglement and wormholes are the same thing, and quantum computers use entanglement to do super-Turing computations, then there should be some wormholes hiding inside a quantum computer. Seems like a joke to me, but I did not read the details.

Update: Peter Woit writes:

The best way to understand the “physicists create wormholes in the lab” nonsense of the past few days is as a publicity stunt ...

I’m hoping that journalists and scientists will learn something from this fiasco and not get taken in again anytime soon. It would be very helpful if both Nature and Quanta did an internal investigation of how this happened and reported the results to the public. Who were the organizers of the stunt and how did they pull it off? ...

his claims in the Quanta video that the result of the Google quantum computer calculation was on a par with the Higgs discovery. Does he really believe this (he’s completely delusional) or not (he’s intentionally dishonest)? ...

Peter Shor says: It seems to me that the string theorists and “it from qubit” community seem to have this unwritten rule that they don’t criticize other members of this community in public.

I used to think that Physics had higher standards than other sciences for truth and professionalism. Apparently not.

Another comment:

I thought you’d be interested to know that the “wormhole created in a quantum computer” story is now being covered in some far-right-wing media. I won’t name them here (they’re very far-right sites, not sure if you’d allow a link here), but they’re essentially saying “isn’t this manifestly stupid? See? Why should we believe scientists when they publish bullshit like this?” and essentially use the story to argue that scientists and science journalists are all a bunch of idiots, hence why should we trust them on vaccines/climate change etc.

This is another consequence of bad publicity stunts like this: it erodes trust in scientists.

Here is one such site. It is so disreputable that Google confiscated its domain name. Possibly the most censored site in the world. It just quotes a NY Times tweet, a Google tweet, a Reuters story, and a couple more tweets, and adds:
This is very obviously fake and it’s goofy that people think it’s real.
I agree with that. It is goofy that people think that this research is real.

Thursday, December 1, 2022

Entanglement Discovered in a Quantum Computer

Lenny Susskind and others have been saying that wormholes and entanglement are the same thing.

YouTube:

Almost a century ago, Albert Einstein realized that the equations of general relativity could produce wormholes. But it would take a number of theoretical leaps and a “crazy” team of experimentalists to build one on Google's quantum computer. Read the full article at Quanta Magazine:
The idea seems to be that if entanglement and wormholes are the same thing, and quantum computers use entanglement to do super-Turing computations, then there should be some wormholes hiding inside a quantum computer. Seems like a joke to me, but I did not read the details.

See Peter Woit for details. At least one physicist calls it a publicity stunt. The quantum computer researchers have burned a lot of money, and need something to show for it.

Update: A comment:

Even if the headline isn’t strictly accurate (a topic for another time, although I think you’re splitting hairs here), what’s the harm? It’s a cool-sounding result which gets people interested in theoretical physics, science more generally. As long as science journalists are driving interest and engagement, I think they’re doing a good job. If you want to discuss bad science journalism, surely a better use of your time would be all the anti-science fake news coming from the populist right in the U.S.
I suspect that this view is common. Over-hyped phony stories generate interest and funding. If you want to be a good Leftist, you should not call out Leftist lies. Instead you should devote that energy to attacking right-wingers!

Update: Scott Aaronson admits that the wormhole story is a big hoax, promoted by physicists who should know better. He also discusses a new paper saying that quantum supremacy is impossible. He says it is no surprise to experts in the field who have known since 2016 that scaling up quantum computers will not work. He is still a believer:

So, though it’s been under sustained attack from multiple directions these past few years, I’d say that the flag of quantum supremacy yet waves. The Extended Church-Turing Thesis is still on thin ice.
That is, he says that he has not been proved wrong yet. Okay, but he hasn't been proved right either.

Monday, November 28, 2022

Aharonov–Bohm effect does not Prove Nonlocality

I heard the suggestion that the Aharonov–Bohm effect proves a form of quantum nonlocality.
The Aharonov–Bohm effect, sometimes called the Ehrenberg–Siday–Aharonov–Bohm effect, is a quantum mechanical phenomenon in which an electrically charged particle is affected by an electromagnetic potential (φ, A), despite being confined to a region in which both the magnetic field B and electric field E are zero.[1] The underlying mechanism is the coupling of the electromagnetic potential with the complex phase of a charged particle's wave function, and the Aharonov–Bohm effect is accordingly illustrated by interference experiments.

The most commonly described case, sometimes called the Aharonov–Bohm solenoid effect, takes place when the wave function of a charged particle passing around a long solenoid experiences a phase shift as a result of the enclosed magnetic field, despite the magnetic field being negligible in the region through which the particle passes and the particle's wavefunction being negligible inside the solenoid. This phase shift has been observed experimentally.[2]

So the effect depends on the potential, and not just the fields.
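For reference, the standard formula: the phase shift around a closed loop is

    \Delta\varphi = \frac{q}{\hbar} \oint \mathbf{A} \cdot d\boldsymbol{\ell} = \frac{q\,\Phi_B}{\hbar},

where \Phi_B is the magnetic flux enclosed by the loop. The phase depends only on the enclosed flux, even though B = 0 everywhere along the electron's path.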

The potential and fields are all locally defined, so what is the problem?

The problem is that only the fields are directly observable, and there is considerable discretion in defining the potential. Sometimes the potential is defined to satisfy a distant condition. This is allowed, because gauge symmetry means it has the same physical effect.

From the viewpoint of differential geometry, the potential is a connection on a complex line bundle, and is a purely local object. It is more fundamental than the fields.

The paradox is that an electron can interfere with itself after going around a non-null-homotopic loop with a flat complex line bundle. Arguably there is something nonlocal about that. I don't think so. It is not like action-at-a-distance at all.

Friday, November 25, 2022

Electrons Are Spinning

Scientific American reports:
Quantum Particles Aren’t Spinning. So Where Does Their Spin Come From?

A new proposal seeks to solve the paradox of quantum spin ...

But despite appearances, electrons don’t spin. They can’t spin; proving that it’s impossible for electrons to be spinning is a standard homework problem in any introductory quantum physics course. If electrons actually spun fast enough to account for all of the spinlike behavior they display, their surfaces would be moving much faster than the speed of light (if they even have surfaces at all). Even more surprising is that for nearly a century, this seeming contradiction has just been written off by most physicists as yet another strange feature of the quantum world, nothing to lose sleep over.

No, this is wrong. Electrons do spin. You only get that paradox if you assume that electrons are very tiny spheres or point particles, but quantum mechanics teaches that electrons are non-classical entities with wave-like properties.
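For the record, here is roughly how that "standard homework problem" goes (a sketch that treats the electron as a classical rigid sphere of the classical electron radius, which is precisely the picture being rejected):

    # Surface speed needed for a classical solid sphere of the
    # "classical electron radius" to carry spin angular momentum hbar/2.
    hbar = 1.055e-34   # J*s
    m    = 9.109e-31   # electron mass, kg
    r    = 2.818e-15   # classical electron radius, m
    c    = 2.998e8     # speed of light, m/s

    # Solid sphere: L = (2/5) m r v, so L = hbar/2 gives v = 5 hbar / (4 m r)
    v = 5 * hbar / (4 * m * r)
    print(f"v = {v:.2e} m/s = {v / c:.0f} c")   # roughly 170 times c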

The article goes on to give the history of quantum spin, and how crucial it is for understanding chemistry and many other things.

But all of these fabulous discoveries, applications, and explanations still leave Goudsmit and Uhlenbeck’s question on the table: what is spin? If electrons must have spin, but can’t be spinning, then where does that angular momentum come from? The standard answer is that this momentum is simply inherent to subatomic particles, and doesn’t correspond to any macroscopic notion of spinning.

Yet this answer is not satisfying to everyone. “I never loved the account of spin that you got in a quantum mechanics class,” says Charles Sebens, a philosopher of physics at the California Institute of Technology.

No, this is silly. The QM textbooks teach that position, momentum, energy, angular momentum, and spin are observables that correspond to the classical variables, but cannot be taken literally about electrons as point particles, as the uncertainty principle prevents such a literal treatment. There is not really any difference between spin and the other variables in this respect.

I previously posted Electrons do spin.

Peter Woit explains:

Despite what Sebens and Carroll claim, it has nothing to do with quantum field theory. The spin phenomenon is already there in the single particle theory, with the free QFT just providing a consistent multi-particle theory. In addition, while relativity and four-dimensional space-time geometry introduce new aspects to the spin phenomenon, it’s already there in the non-relativistic theory with its three-dimensional spatial geometry.
Asking whether electrons really spin is like asking whether they orbit the nucleus of an atom. A century ago, physicists tried to model an atom as classical electron orbits, and figured out that it doesn't work. You need a quantum model. But it is still correct to say that the electrons orbit the nucleus.

Wednesday, November 23, 2022

TV Show on Zero and Infinity

I just watched the latest PBS TV Nova on Zero to Infinity:
Discover how the concepts of zero and infinity revolutionized mathematics.
It was stupid and boring.

A Black woman professor narrated. They always seem to find Blacks and women for these shows. Not sure why. Does PBS have a lot of Black viewers? Is it trying to get more?

I doubt it. My guess is that the typical White liberal PBS viewer gets a good feeling of social justice when a Black woman is lecturing.

Much of the show was about the invention of the Zero. It attributed it to India, and said that Persians and other middle easterners brought it to Europe.

But what was the invention? The use of 0 as a placeholder, or as a counting number, or as a number on the same footing as other numbers?

I looked for some statement from India or Persia saying something like: The counting numbers are { 0, 1, 2, ... }, and for any such numbers, A + B = B + A.

That would show that the author considered 0 to be a number just like 1 and 2.

But on the contrary, the show itself did not even do that. The moderator kept referring to the counting numbers as 1, 2, 3, ..., and not including 0.

Anyone who says that has still not grasped the invention of 0. 0 is a counting number. If I ask you how many apples you have, and you have none, then you answer 0. Maybe you answer -2, if you owe 2 apples. If you say there is no answer, then you have not accepted the 0.

Even the business pages of a typical newspaper rarely treat zero as a number. They often avoid it with various euphemisms.

The show eventually moved on to infinity, but that was not any better. It gave a faulty version of Cantor's diagonal proof of the uncountability of the reals.

Suppose you list .4, .49999..., .009, .0009, .... It said to add 1, mod 10, to each diagonal digit. That gives .5000.... That is the same real number as the 2nd item on the list, with a different decimal representation.

A good proof must somehow take into account that real numbers can have two decimal representations.
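One standard repair is to choose replacement digits that avoid 0 and 9 altogether, so the constructed number has a unique decimal expansion and cannot silently coincide with a listed number written differently. A minimal sketch (hypothetical list):

    # Diagonal argument with replacement digits drawn from {4, 5} only.
    def diagonal(listed_digits):
        # listed_digits[n][n] is the n-th digit of the n-th listed real
        return [5 if listed_digits[n][n] != 5 else 4
                for n in range(len(listed_digits))]

    # The show's list: .4000..., .49999..., .00900..., .000900...
    lst = [[4,0,0,0], [4,9,9,9], [0,0,9,0], [0,0,0,9]]
    print(diagonal(lst))  # [5, 5, 5, 5] -> 0.5555..., which differs from
                          # the n-th listed number in the n-th digit and
                          # has a unique decimal expansion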

Cantor's original proof did not use diagonalization.

The show went on to Zeno's paradoxes and Hilbert's hotel. It was all fairly trivial.

Since it tried to trace the origin of the zero, I thought that it might tell us who invented infinity.

It did talk about approximating π as a limit of an infinite sequence. I guess that idea goes back to the ancient Greeks. The invention of infinitesimal calculus required limits. Those ideas were made rigorous centuries later. Cantor introduced the concept of different infinite cardinals. I am not sure who really first had the modern concept.

These PBS shows appear to be expensively produced, but you can find lots of free YouTube videos that explain the math much better, and are more entertaining.

Monday, November 21, 2022

Making Finitary Deductions About Infinities

That is my 5-word definition of Mathematics. It is what distinguishes Math from every other field.

Some say that Math is the study of numbers, or the use of symbolic notation. But Music uses symbolic notation, and numbers are used by all the hard and soft sciences.

None of the empirical sciences ever encounter infinities. Cosmologists may talk about the universe having infinite extent, but there is no reason to believe that, and we cannot observe that. We only observe finite quantities.

And the sciences never make a finitary deduction either. An experiment might convince us of some fact, but it is really just evidence that makes an outcome 99% likely, or something like that. The experiment has to be refined and redone to become more and more sure of it.

Math has infinities all over the place. This is obviously true about work on limits, but it is also true about elementary statements like the Pythagorean Theorem. There are infinitely many possible right triangles, and the theorem gives a formula about all of them.

Even with all the infinities, the proofs always use a finite set of steps from a finite number of axioms. The proofs about the infinities are always strictly finitary.

Here is the Wikipedia definition of Mathematics:

Mathematics (from Ancient Greek μάθημα; máthēma: 'knowledge, study, learning') is an area of knowledge that includes such topics as numbers (arithmetic and number theory),[2] formulas and related structures (algebra),[3] shapes and the spaces in which they are contained (geometry),[2] and quantities and their changes (calculus and analysis).[4][5][6] Most mathematical activity involves the use of pure reason to discover or prove the properties of abstract objects, which consist of either abstractions from nature or — in modern mathematics — entities that are stipulated with certain properties, called axioms. A mathematical proof consists of a succession of applications of some deductive rules to already known results, including previously proved theorems, axioms and (in case of abstraction from nature) some basic properties that are considered as true starting points of the theory under consideration.

Mathematics is used in science for modeling phenomena, which then allows predictions to be made from experimental laws. The independence of mathematical truth from any experimentation implies that the accuracy of such predictions depends only on the adequacy of the model. Inaccurate predictions, rather than being caused by incorrect mathematics, imply the need to change the mathematical model used.

Here is Britannica:
mathematics, the science of structure, order, and relation that has evolved from elemental practices of counting, measuring, and describing the shapes of objects. It deals with logical reasoning and quantitative calculation, and its development has involved an increasing degree of idealization and abstraction of its subject matter. Since the 17th century, mathematics has been an indispensable adjunct to the physical sciences and technology, and in more recent times it has assumed a similar role in the quantitative aspects of the life sciences.
Here are some dictionaries:
The abstract science of number, quantity, and space. -- Oxford

An abstract representational system studying numbers, shapes, structures, quantitative change and relationships between them. -- Wiktionary

The science of numbers and their operations, interrelations, combinations, generalizations, and abstractions and of space configurations and their structure, measurement, transformations, and generalizations. -- Merriam-Webster

The study of the measurement, relationships, and properties of quantities and sets, using numbers and symbols. -- dictionary.com

The science that deals with the logic of shape, quantity and arrangement. -- Live science

The study of numbers, shapes, and space using reason and usually a special system of symbols and rules for organizing them. -- Cambridge

These are pretty good, but do not distinguish Math from science well. Yes, Math is an area of knowledge that includes numbers, but the mathematicians do proofs, with infinite numbers and finite arguments.

Thursday, November 17, 2022

Sean M. Carroll Goes Woke on Sex

Sean M. Carroll has become one of the leading expositors of advanced Physics, but he has a lot of strange views that will make you skeptical of whatever he tells you.

The biggest is that he believes in the many-worlds alternative to quantum mechanics. This is a belief that anything is possible, and that nothing is more likely than anything else. It is a complete rejection of all modern science.

He has his own rationalization that is mostly circular reasoning.

In his latest Ask Me Anything podcast, he says that he does not see a moral justification for parents spending money on their children's education. He says all children should get the same education.

He is married with no kids.

He is welcome to his opinions, but he does not describe the American situation accurately. In California, his home state until recently, the schools get about 50% of the state budget, and the poor districts get at least as much as the rich districts. The rich are not getting any better educational opportunities.

Some rich parents do send their kids to expensive schools, but the educational opportunities are not much different from public schools.

He has also joined the sex-deniers who say that biological sex is not binary. Biology professor Jerry Coyne is a big fan of Carroll, because of what he says in favor of determinism and against libertarian free will, but schools him on biological sex.

In reality, what they are trying to do is the reverse: adjust scientific reality so that it aligns with social justice. That is, if sex is a spectrum and not binary, then people of different genders can somehow feel that they are in harmony with biological reality. But that’s an example of the “appeal to nature.” The rights of people of different genders, including transsexual people, do not depend on the developmental biology of sex, or of any observations in nature about sex dichotomies.

I’m not going to discuss my claim that sex is binary; I’ve talked about it at length, as did Luana Maroja in her piece at Substack. I’ll just put it out there that the going biological definition of sex is that there are two sexes in vertebrates: males (who produce small mobile gametes) and females (who produce large, immobile gametes). There is no group that produces intermediate types of gametes that can unite with other gametes, so there is nothing beyond these two sexes.

Carroll is not a biologist. He is not to be confused with the somewhat more accomplished scientist, Sean B. Carroll, who really is an expert biologist.

I assume that Sean M. Carroll is smart enough to know the difference between male and female. But it appears that he is willing to recite nonsense in order to please his Leftist Woke fans.

I suggest keeping this in mind when listening to him. He sometimes gives pretty good explanations of textbook physics, but his opinions on big picture physics are dubious, and his moral and political opinions are garbage.

Monday, November 14, 2022

Einstein and the Equivalence Principle

New paper:
Einstein's Happiest Moment: The Equivalence Principle
Paul Worden, James Overduin

Einstein's happiest thought was his leap from the observation that a falling person feels no gravity to the realization that gravity might be equivalent to acceleration. It affects all bodies in the same way because it is a property of spacetime -- its curvature -- not a force propagating through spacetime (like electromagnetic or nuclear forces). When expressed in a way that is manifestly independent of the choice of coordinates, this idea became General Relativity. But the ground for what is now known as the "equivalence principle" was laid long before Einstein, affording a fascinating example of the growth of a scientific idea through the continuous interplay between theory and experiment.

As this article and Wikipedia explain, the equivalence principle goes back centuries. Einstein was very happy about using it in a 1907 paper, but not because gravity was a realization of curvature, as he did not even know what curvature was at the time.

It is my understanding that what Einstein was actually happy about was that he figured out a way to use the principle, together with special relativity, to derive gravitational time dilation.

Special relativity is often described as a theory about constant velocity, but back during the early days, say 1905-1910, it was widely understood to cover accelerating particles also. Poincare proposed a couple of relativistic gravity theories, but the geometry was not understood.

Einstein figured out how to sidestep having a gravity theory, by saying that gravity was like non-gravitational acceleration. That was enough to figure out clocks in a gravitational field.

People often say that general relativity is needed for GPS navigation, but I don't think that is true. It only needs special relativity, and this trick of Einstein.
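A back-of-the-envelope check with standard GPS orbit numbers (a sketch, not a full derivation): the gravitational and velocity effects can be estimated separately and combined, giving the familiar ~38 microseconds per day.

    # GPS satellite clock rate relative to a ground clock, per day.
    GM = 3.986e14       # Earth's gravitational parameter, m^3/s^2
    c  = 2.998e8        # speed of light, m/s
    R_earth = 6.371e6   # Earth radius, m
    r_sat   = 2.657e7   # GPS orbital radius, m
    day = 86400.0       # seconds

    v2 = GM / r_sat                           # circular orbit speed squared
    grav = GM * (1/R_earth - 1/r_sat) / c**2  # satellite clock runs fast
    vel  = v2 / (2 * c**2)                    # satellite clock runs slow

    print(f"gravity:  +{grav * day * 1e6:.1f} us/day")           # ~ +45.7
    print(f"velocity: -{vel * day * 1e6:.1f} us/day")            # ~ -7.2
    print(f"net:      +{(grav - vel) * day * 1e6:.1f} us/day")   # ~ +38.5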

As far as I know, this idea of Einstein was his own, and not plagiarized from anyone else. Maybe that is why he was so happy about it.

Thursday, November 10, 2022

Probability is Subjective

Ulrich J. Mohrhoff writes:
With Mermin, I also hold this truth to be self-evident (though it took me some time to get there), that probabilities are intrinsically subjective. ...

Mermin invokes the celebrated probabilist Bruno de Finetti, who wrote: “The abandonment of superstitious beliefs about the existence of Phlogiston, the cosmic ether, absolute space and time. . . , or Fairies and Witches, was an essential step along the road to scientific thinking. Probability too, if regarded as something endowed with some kind of objective existence, is no less a misleading misconception, an illusory attempt to exteriorize or materialize our actual probabilistic beliefs.”

Taking the mind-independent existence of the external world for granted, de Finetti holds that there is no place for probability in such a world, any more than there is for Phlogiston and the rest.

I agree with this, but do not deny the importance of probability.

All scientific theories are inherently probabilistic. Even classical celestial mechanics, the textbook example of the clockwork deterministic universe, was always probabilistic in practice. Observations in the sky always had errors, and predictions had uncertainty. Linear regression was invented to make probabilistic predictions about celestial orbits.
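A minimal sketch, with made-up data, of how even a "deterministic" linear law comes with error bars:

    import numpy as np

    # Hypothetical noisy observations of a deterministic linear law.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)

    # Least-squares fit; the parameter covariance gives the error bars.
    coeffs, cov = np.polyfit(x, y, 1, cov=True)
    slope_err, intercept_err = np.sqrt(np.diag(cov))
    print(f"slope     = {coeffs[0]:.3f} +/- {slope_err:.3f}")
    print(f"intercept = {coeffs[1]:.3f} +/- {intercept_err:.3f}")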

Monday, November 7, 2022

Quantum Computing Skeptic gives Lecture

Gil Kalai gave a lecture on the impossibility of quantum computers, summarized here:
My argument for the impossibility of quantum computers lies within the scope of quantum mechanics and does not deviate from its principles. In essence, the argument is based on computational complexity and its interpretation, and it is discussed in-depth in my papers which also include a discussion of general conclusions that derive from my argument and relate to quantum physics, alongside suggestions of general laws of nature that express the impossibility of quantum computation.

My argument mostly deals with understanding quantum computers on the intermediate scale (known as NISQ computers, an abbreviation of Noisy Intermediate Scale Quantum), that is, quantum computers of up to at most several hundreds of qubits. It is expected that on this scale we will be able to construct quantum codes of a quality sufficient for the construction of bigger quantum computers. It is further expected that on this scale the quantum computer will achieve computations far beyond the ability of powerful classical computers, that is, will achieve quantum computational supremacy. Google’s Sycamore computer is an example of a noisy intermediate-scale quantum computer.

As specified later, it is my argument that NISQ computers cannot be controlled. Hence:

  1. Such systems cannot demonstrate significant quantum computational advantage.
  2. Such systems cannot be used for the creation of quantum error-correcting codes.
  3. Such systems lead to non-stationary and even chaotic distributions.

Note that he does not say that quantum mechanics is wrong. He denies that quantum computing is a necessary consequence.

A lot of smart people and a lot of research funding say that he is wrong.

Maybe I am just a contrarian, but it seems to me that they should have been able to prove him wrong by now. They have not.

Thursday, November 3, 2022

Dr. Bee Makes Bad Argument for Superdeterminism

Jonte R. Hance and Sabine Hossenfelder posted another short argument for superdeterminism, without admitting that superdeterminism is their real goal.

It starts out complaining that a Physics Nature article about Bell Tests was not completely precise. The Bell Tests prove that quantum mechanics experiments are inconsistent with local hidden variable theories.
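Quantitatively, this is the CHSH form of Bell's theorem: every local hidden variable theory bounds a certain combination of correlations by 2, while quantum mechanics predicts up to 2√2, which is what the experiments see. A sketch using the singlet-state correlation E(a, b) = -cos(a - b):

    import math

    # Quantum singlet-state correlation for analyzer angles a and b.
    def E(a, b):
        return -math.cos(a - b)

    # Standard optimal CHSH angles.
    a, a2 = 0.0, math.pi / 2
    b, b2 = math.pi / 4, 3 * math.pi / 4

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(f"quantum |S| = {abs(S):.3f}")  # 2*sqrt(2) ~ 2.828
    print("local hidden variable bound: |S| <= 2")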

As Bell and others have pointed out, there are some subtle assumptions: that the experimenter can make free choices (no superdeterminism), that the future does not cause the past (no retrocausality), and that experiments have single outcomes (no many-worlds). All of these possibilities are crazy, and no serious person would believe in them. So these are reasonable assumptions.

If their only point was that a precise statement would mention these possibilities, that would be fine. But they go further.

They say that some people believe that they have the free will to do the measurements they choose, and then "It is, in hindsight, difficult to understand how this association came about." That is, they do not understand how people could think that they have the free will to choose equipment settings.

Understanding the implications is even more important now that the experimentally observed violations of Bell’s inequality have been awarded the 2022 Nobel Prize in Physics. Contrary to what is often stated, these observations do not demonstrate that “spooky action at a distance” is real and nature therefore non-local.
The Nobel citation did not say that spooky action is real, or that there is anything wrong with quantum mechanics.
Rather, the observations show that if nature is local, then statistical independence must be violated. We should therefore look for independent experimental evidence that can distinguish the two different options: non-locality and statistical independence, or locality and violations of statistical independence.
No, they are wrong here. The Bell observations show that if nature is local, then the theory must be a non-classical theory like quantum mechanics, or else we have one of the crazy loopholes like superdeterminism, retrocausality, or many-worlds. Saying non-classical is essentially the same as saying no local hidden variables.

Their deceptive title is "Bell's theorem allows local theories of quantum mechanics". That is a completely correct statement, as local quantum mechanics is what all the textbooks teach. But what the body of the paper says is that Bell's theorem allows local superdeterminism, and that is the opposite of quantum mechanics. There is no superdeterministic theory of quantum mechanics.

Believing in superdeterminism is essentially a rejection of all science in the last millennium. So are retrocausality and many-worlds. You can believe in them if you want, but it is quite wrong to say that they are required by locality.

Dr. Bee has started expanding her podcasts to cover science news. She does a competent job, and she is very knowledgeable about Physics. But how can you trust anyone who believes that no one has any free will to do experiments, and that every randomized trial is fake?

Tuesday, November 1, 2022

String Theory may Explain Consciousness

New paper:
Recent proposals in quantum gravity have suggested that unknown systems can mediate entanglement between two known quantum systems, if the mediator itself is non-classical. This approach may be applicable to the brain, where speculations about quantum operations in consciousness and cognition have a long history. ...

Our findings suggest that we may have witnessed entanglement mediated by consciousness-related brain functions. Those brain functions must then operate non-classically, which would mean that consciousness is non-classical.

Roger Penrose was widely mocked for advocating ideas like this. No one has made much progress on the problem of consciousness, and I am skeptical about this, and the next story.

Separately, I heard a rumor that a string theory prediction about holography has been confirmed in a quark-gluon plasma:

a big (not so well-kept) secret I heard the other day. Story goes that some accelerator lab (Fermi?) has been busy smashing heavy ion beams (Au nuclei?) together, creating a quark-gluon plasma. and measuring some QCD observable (say "A") of the chaos that ensues. According to a "holographic principle" (an AdS/CFT-type correspondence), A is equivalently described as some GR (or QG?) observable ("B") on the system comprised of a black hole that arises in the 5D spacetime forming the bulk (interior) of the shell on which the q-g plasma lives as a solution to the QCD equations. The Einstein equations for the evolution of the hole are solvable and B can be calculated. The lab has apparently successfully verified that the "predictions" given by the calculations of B agree with measurements of A. (Secret was leaked by Susskind in a recent talk which can be found on YouTube... my version includes a little reading between the lines and may not be completely accurate.... so I'll speculate further and guess that A is something like rate of change of temperature and B is something like rate of change in entropy, i.e., area of the event horizon. The plasma cools and the hole shrinks due to Hawking radiation?)

This is mind-blowing and, I think, of importance equal to, if not surpassing, that of the confirmation of GR by deflection of starlight during the 1919 eclipse... or of the finding of the Higgs.
I will be watching for more on this. Lenny Susskind gave some related lectures here and here.

Peter Woit has a new post trashing some related claims to testing string theory.

Scott Aaronson is claiming some new results about the complexity of the AdS/CFT correspondence. You have to skip over his previous blog post, where he describes the progressive thesis that he is aligned with:

just like at least a solid minority of Germans turned out to be totally fine with Nazism, however much they might’ve denied it beforehand, so too at least a solid minority of Americans would be fine with — if not ecstatic about — The Handmaid’s Tale made real. Indeed, they’d add, it’s only vociferous progressive activism that stands between us and that dystopia.

And if anyone were tempted to doubt this, progressives might point to the election of Donald Trump, the failed insurrection to maintain his power, and the repeal of Roe as proof enough to last for a quadrillion years.

I have never even heard of any Trump supporters who want anything like The Handmaid's Tale. Only liberals watch the show and read the book, as far as I know. Also there was no insurrection, and no repeal. Abortion law was merely returned to the democratic process. Aaronson sounds like a parody of a left-wing lunatic. I sometimes wonder if he is serious.

Monday, October 31, 2022

What was that Bell Nobel Prize For?

The Bell fans lobbied for a Nobel Prize for 30 years, and they finally got it, so are they happy?

No. See this video, Tim Maudlin Corrects the 2022 Nobel Physics Committee About Bell's Inequality. He says the Nobel citation missed the point.

I don't want to pick a fight with Maudlin, as he is a very smart guy who explains this stuff very well. He has sharp disagreements with others about Bell's theorem, and I describe them here.

Another recent Maudlin video says:

[47:00] The theorem of Bell [and confirming experiments] is the most astonishing thing in the history of Physics.
Among other things, he gives a very good explanation of what is wrong with superdeterminism, as a Bell loophole. Here is a shorter interview.

Here is my view. When quantum mechanics (QM) was discovered in 1926, a lot of smart people wondered whether it was a new type of theory, or if the uncertainties were just disguising an underlying classical theory. John von Neumann was the world's smartest man, and he convinced himself in 1932 that QM was different from any classical theory. Einstein co-wrote a 1935 paper speculating that QM might be completed by adding elements of physical reality. Bell showed in 1964 that the difference between QM and a classical theory could be quantified, and that was later confirmed experimentally by Clauser and the other Nobel prize winners.
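The quantification is usually stated in the CHSH form of Bell's inequality: for detector settings a, a' on one side and b, b' on the other, any local hidden variable theory obeys

    S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2

while quantum mechanics predicts, and the experiments found, values up to 2√2 ≈ 2.83 (Tsirelson's bound).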

So the Bell work is no big deal, as it only confirmed what everyone thought.

Maudlin and the other Bell fans have another view. To be fair to Maudlin, I suggest his paper, What Bell Did, and his exchange with Werner, here and here.

He correctly says that Bell assumed locality, hidden variables, and statistical independence. Statistical independence is assumed by all of science, and is reasonable. Hidden variables are just the Einstein elements of physical reality, and he and Bell argue that any reasonable theory would have them. That leaves locality. The experiments showed that the Bell inequalities are violated, so that means that nature must be nonlocal.

He is right that if you accept hidden variable theories then you have to accept nonlocality. I just do not accept hidden variables.

He is also right that the Nobel citation failed to endorse the nonlocality conclusion.

There are also the superdeterminism and many-worlds loopholes, but Maudlin and the Nobel committee are right to ignore these. That leaves you with a choice -- you can have locality or hidden variables, but you cannot have both.

Maudlin would say that I and the Nobel committee suffer from a misconception that has gone on for decades.

It would take some very compelling evidence to convince me of nonlocality. As Maudlin says, if you snap your fingers, do you believe that what happens in your hand can depend on what happens in a distant galaxy? I say of course not, but Maudlin accepts that it can.

Wouldn't we see some examples of action-at-a-distance?

He gives an example pointing to nonlocality in the Aharonov–Bohm effect. I do not agree, but it requires technical explanation, and maybe I will post separately on it.

Maudlin says:

The reality of nonlocality has been settled. [3rd video, 18:45]
So what is nonlocal? There is no way to change one particle, and have that affect an observable of a distant particle. So the only things that are nonlocal are the mythical hidden variables.

Wikipedia describes Bell's theorem:

Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories given some basic assumptions about the nature of measurement.
Maudlin wants to remove the term "hidden-variable" from the picture, and deny that Bell made such an assumption. You can read Bell's 1964 original paper, and see for yourself that he assumes hidden variables. In later papers he called them "beables" and tried to argue that they could be assumed from first principles. But they have to be assumed somehow.

Discussions of Bell's Theorem sometimes get sidetracked by issues of probability and determinism. Some say Bell proved the world is indeterministic. Some say Einstein EPR objected to indeterminism. This is a red herring. There is some truth to it, but it has to be stated carefully, or it is misleading. Maybe I will make another post on this issue. I would say that Bell proved the impossibility of local hidden variable theories, whether they are deterministic or stochastic. Ultimately all theories are stochastic anyway, as all measurements and predictions have errors.

Friday, October 28, 2022

Why Many-Worlds cannot have Probabilities

More and more physicists say that the Many Worlds Interpretation (MWI) is their favorite interpretation of quantum mechanics (QM). They usually argue that it is simpler, more scientific, more philosophically sensible, and obviously preferable to the nonsensical and inconsistent Copenhagen Interpretation (CI). They stress that it is an interpretation, making all the same predictions as QM/CI.

All of this is false. MWI does not make any predictions that are verifiable by experiment. It says all outcomes are possible. To get a measurable prediction, you have to somehow say that some outcomes are more probable than others. The MWI theory fails to say any worlds are more probable than others. So to get probabilities, you need the Born Rule.
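For reference, the Born Rule is the statement that measuring an observable with eigenstates |i⟩ on a system in state |ψ⟩ gives outcome i with probability

    P(i) = |\langle i|\psi\rangle|^2

It is this postulate, not the unitary Schroedinger evolution, that connects the wave function to observed frequencies.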

Some have argued that there is a way to get the Born Rule in MWI, but the mainstream opinion is that those arguments are circular. For example, see this recent paper:

How Do the Probabilities Arise in Quantum Measurement? Mani L. Bhaumik ...

So far, only some ad hoc propositions such as Born’s rule [5] have allowed the physicists to predict experimental results with uncanny accuracy of better than a part in trillion [6]. But the basic cause of this essential rule has remained shrouded in a veil of mystery. One of the prominent investigators in this field, Wojciech Zurek has attempted to provide a derivation of the Born rule perhaps to make his program comprehensive [7]. But it has faced a stiff resistance from some foremost investigators including one of the giants of physics of our time, Nobel laureate Steven Weinberg.

In his classic textbook, Lectures on Quantum Mechanics, Weinberg states [8, p. 92], “There seems to be a widespread impression that decoherence solves all obstacles to the class of interpretations of quantum mechanics, which take seriously the dynamical assumptions of quantum mechanics as applied to everything, including measurement.” Weinberg goes on to characterize his objection by asserting that the problem with derivation of the Born’s rule by Zurek “is clearly circular, because it relies on the formula for expectation values as matrix elements of operators, which is itself derived from the Born rule.” In [8, p. 26] he questions, “If physical states, including observers and their instruments, evolve deterministically, where do the probabilities come from?" Again in his recent book [9, p. 131], Weinberg questions, “So if we regard the whole process of measurement as being governed by the equations of quantum mechanics, and these equations are perfectly deterministic, how do probabilities get into quantum mechanics?”

Maximilian Schlosshauer and Arthur Fine remark [10], “Certainly Zurek’s approach improves our understanding of the probabilistic character of quantum theory over that sort of proposal by at least one quantum leap.” However, they also criticize Zurek’s derivation of the Born rule for circularity, stating: “We cannot derive probabilities from a theory that does not already contain some probabilistic concept; at some stage, we need to ‘put probabilities in to get probabilities out.’”

The author goes on to argue that he has solved these problems, and found a solution that has eluded physicists since 1926.

Maybe so, but I doubt it. The paper looks as if it reviews some standard QM theory, and shows that measurement outcomes naturally have probabilities. Yes, sure, QM has probabilities. It is when you make the leap to a deterministic unitary theory and MWI that the probabilities disappear.

Weinberg is dead, so we cannot ask him if this paper solves the problem. I doubt that others are persuaded, but we shall see.

In the meantime, I cite this as proof that MWI currently has no way of saying that any outcome is more probable than any other. In other words, it is completely useless as a scientific theory. Anyone who subscribes to it is a crackpot.

Unless this paper solves all the MWI problems. If the MWI advocates endorse this paper as a solution to their problems, then I will take another look at it. But that will not happen. They will just go on ignoring the fact that MWI cannot make any testable prediction.

Here is a podcast interview of Hugh Everett's biographer. Everett is described as having had a hard life, and his MWI theory, which he preferred to call the "relative state" formulation, was not well appreciated in his lifetime. The interviewer, Steve Hsu, is a believer.

They acknowledge that some journals refuse to publish anything in favor of MWI, and maybe half of physicists regard it as outlandish and ridiculous. But they also argue that it is essentially the same as decoherence theory, and that is very well accepted.

It is not the same. Decoherence is an attempt to understand how the wave function collapses, in the absence of an observer. Copenhagen followers regard it as a straightforward extension of known QM. MWI posits that decoherence is accompanied by a split in the universes, making many more.
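A schematic way to see the difference (standard textbook material, in my notation): decoherence says only that environmental interaction suppresses the off-diagonal interference terms of a system's density matrix,

    \rho(t) =
    \begin{pmatrix} |\alpha|^2 & \alpha\beta^* e^{-t/\tau} \\
                    \alpha^*\beta\, e^{-t/\tau} & |\beta|^2 \end{pmatrix}
    \;\longrightarrow\;
    \begin{pmatrix} |\alpha|^2 & 0 \\ 0 & |\beta|^2 \end{pmatrix}
    \quad (t \gg \tau).

Nothing in that math says anything about universes splitting; that is an extra metaphysical posit of MWI.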

Hsu says that the whole universe does not necessarily split; just the observer splits. Okay, but he really wants MWI for cosmology problems where there is no observer. The splits must be huge.

Wednesday, October 26, 2022

An Electron is in Probabilistic Limbo

Science journalist John Horgan writes in SciAm about the recent Nobel Prize for Bell experiments:
Electrons possess a quantum property called spin, which is unlike the spin of a planet or top. Quantum spin is binary; it is either up or down, to use a common notation. Imagine if planets could only spin clockwise, or counterclockwise, with their axes pointed only at the North Star, and in no other direction, and you’re getting the gist of spin. Although quantum spin, like entanglement, makes no sense, it has been verified countless times over the past century.
No, that is not quite right. Quantum spin is measured by a Stern-Gerlach device, and that measures spin in a particular direction. Spin could be in another direction, but if you only measure North-South, you will only see North-South spin.

The concept makes sense. The most confusing thing is that the uncertainty principle prevents knowing the spin in different directions at the same time.

Okay, now you let the electrons fly apart from each other. Then you measure the spin of electron A and find that its spin is up. At that moment, the wave function for both electrons collapses, instantaneously predicting the spin of electron B, even if it is a light-year away. How can that be? How can your measurement of A tell you something about B instantaneously? Entanglement seems to violate special relativity, which says that effects cannot propagate faster than the speed of light. Entanglement also implies that the two electrons, before you measure them, do not have a fixed spin; they exist in a probabilistic limbo.
Horgan has been taking a QM class, but he has been led astray. Spin is not so confusing.

Even one electron by itself is in a probabilistic limbo. Even if you prepare it to have definite spin in a particular direction, measuring spin in other directions will be governed by a probability formula.
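The formula is simple (standard QM for spin-1/2): if the electron is prepared spin-up along one axis and then measured along an axis at angle θ to it,

    P(\text{up}) = \cos^2(\theta/2), \qquad P(\text{down}) = \sin^2(\theta/2).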

He complains about entanglement, but a similar thing happens classically. Suppose you had a two-planet system, and knew the total angular momentum. Then the planets got separated, and you measured the spin of one of them. You would immediately know the spin of the distant planet.
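In symbols, if the total is conserved and known, measuring planet A immediately determines planet B:

    \vec{L}_B = \vec{L}_{\text{total}} - \vec{L}_A

No signal passes between the planets; you are just updating your knowledge.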

Nobody thinks that is spooky, or violates special relativity. So why do people get so excited by Einstein EPR and quantum spin entanglement?

You might say the quantum spin is more mysterious because of the uncertainty principle, and because Bell proved that the correlations are somewhat higher than what you would expect from a local hidden variable theory.
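For the curious, here is a sketch of the standard textbook comparison along the lines of Bell's 1964 paper: the singlet-state quantum correlation -cos θ versus a simple linear local hidden variable model.

    # Sketch of the standard comparison from Bell's 1964 paper: quantum
    # singlet correlation vs. a simple local hidden variable model, as a
    # function of the angle between the two detector settings.
    import numpy as np

    for deg in (0, 45, 90, 135, 180):
        theta = np.radians(deg)
        qm = -np.cos(theta)            # quantum mechanics prediction
        lhv = -1 + 2 * theta / np.pi   # simple LHV model's correlation
        print(f"{deg:3d} deg:  QM {qm:+.3f}   LHV {lhv:+.3f}")

At intermediate angles the quantum correlation is stronger than the LHV one, and that excess is what the Bell tests measure.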

Okay, I agree, those are mysterious. So talk about them! Those mysteries can be understood.

Instead, these articles use smoke and mirrors to exaggerate what is mysterious. I cannot even blame Horgan for this, as he is just parroting popular explanations that are designed to confuse.

Monday, October 24, 2022

Biology Journals have gone Woke

Academic research papers are increasingly woke-infected. Here is a recent biology paper, published in a respectable journal:
Six Principles for Embracing Gender and Sexual Diversity in Postsecondary Biology Classrooms

Conclusions

Biology classrooms represent powerful opportunities to teach sex- and gender-related topics accurately and inclusively. The sexual and gender diversity displayed in human populations is consistent with the diversity that characterizes all biological systems, but current teaching paradigms often leave students with the impression that LGBTQIA2S + people are acting against nature or “basic biology.” This failure of biology education can have dangerous repercussions. ...

Author Biographical

Ash T. Zemenick is a nonbinary trans person who grew up with an economically and academically supportive household to which they attribute many of their opportunities. They are now the manager of the University of California Berkeley's Sagehen Creek Field Station, in Truckee, California, and are a cofounder and lead director of Project Biodiversify, in the United States. Shaun Turney is a white heterosexual transgender Canadian man who was supported in both his transition and his education by his university-educated parents. He is currently on paternity leave from his work as a non–tenure-track course lecturer in biology. Alex J. Webster is a cis white queer woman who grew up in an economically stable household and is now raising a child in a nontraditional queer family structure. She is a research professor in the University of New Mexico's Department of Biology, in Albuquerque, New Mexico, and is a director of Project Biodiversify, in the United States. Sarah C. Jones is a disabled (ADHD) cis white queer woman who grew up in a supportive and economically stable household with two university-educated parents. She is a director of Project Biodiversify, and serves as the education manager for Budburst, a project of the Chicago Botanic Garden, in Chicago, Illinois, in the United States. Marjorie G. Weber is a cis white woman who grew up in an economically stable household. She is an assistant professor in Michigan State University's Plant Biology Department and Program in Ecology, Evolution, and Behavior, in East Lansing, Michigan, and is a cofounder and director of Project Biodiversify, in the United States.

I guess that if an author is "cis white" and from a normal educated family, she has to claim a disadvantage somehow, so she is queer and disabled with ADHD. In most people, queer is an excuse for perverted sexual practices, and ADHD is an excuse for mind-altering drugs.

Nature, perhaps the world's top science journal, has turned over a whole issue to racism. It starts with the last known example of scientific racism:

In 1768, the UK Royal Society commissioned a research ship, HMS Endeavour, to sail to Tahiti in time to witness a transit of Venus across the Sun. But, as researchers later discovered, the UK government and the society had an extra purpose for the voyage: the ship’s captain, James Cook, had been given secret instructions to continue onwards in what became Britain’s colonial takeover of Australia and New Zealand.
I am not sure what is racist about that. Britain probably did not care about the races of the local inhabitants.
The killing of George Floyd at the hands of the Minneapolis police department, and President Donald Trump’s crushing of protests across the United States, has angered the world, and led to marches in cities globally. The repeated killings of Black people in the United States serve as reminders — reminders that should not be needed — of the injustice, violence and systemic inequality that Black Americans continue to experience in every sphere of life.

Black people are more likely than white people to die at the hands of the police;

These are lies. Floyd died of a fentanyl overdose. Trump did not crush protests. Black people are less likely to die at the hands of police. There are only about ten a year who die, and they are nearly always dying as a result of trying to kill an arresting officer.

Since it is a science journal, I expect it to have evidence for its assertions. But the issue keeps claiming racism, and the only examples are trivial. One Black woman complains that when she came to a USA college dormitory from Ghana, her roommate did not want her sitting on her (the roommate's) bed. Another Black geoscientist complained that he was asked in a private email to defend some public accusations he made.

This is bizarrely lame. Most people do not want others sitting on their beds. White scientists have no problem defending what they say. Only a crappy scientist would refuse.

So why doesn't she go back to Ghana if she is being treated so badly? No, the fact is that there is a steady migration of Blacks from Black majority countries to supposedly racist countries like USA and UK. The fact is that USA and UK treat Blacks extremely well, and better than elsewhere.

This issue was supposed to convince me that Black scientists suffer systemic racism, but it convinces me of the opposite. The Black scientists in it are incompetent whiners who cannot give any example of any Black being mistreated, or say anything to justify the affirmative action policies of hiring less competent Blacks over more competent Whites.

Wednesday, October 19, 2022

The Multiverse Pandemic

More and more, Physics popularizers like Sean M. Carroll tell us that we have to accept Many-Worlds, as the only intellectually respectable interpretation of quantum mechanics. Furthermore, it is logically implied by Schroedinger's equation, and Occam's Razor requires acceptance. As a bonus, it is completely deterministic, so we have no free will.

This is so crazy, it deserves to be mocked.

Nicolas Gisin writes in a new paper:

Newton never pretended that his physics were complete. And so, the dictatorship of Determinism was tolerable to free men.

Then came quantum physics. At first, free men celebrated the revolution of intrinsic randomness in the material world. This was the end of the awful dictator Determinism, or so they thought. But this dictator had a son... or was it his grandson?

Determinism returned in the new guise of quantum physics without randomness: everything, absolutely everything, all alternatives, would equally happen, all on an equal footing. Real choices were no longer possible. But the most terrible was still to come: universal entanglement. According to the new multiversal dictator, not only did the material world obey deterministic laws, but it was all one big monstrous piece, everything entangled with everything else. There was no room left for any pineal gland, no possible interface between physics and free-will. The sources of all forces, all fields, everything was part of the big Ψ, the wavefunction of the multiverse, as the dictator bade people call their new God. ...

It’s time to take a step back. I am a free being, I enjoy free-will. I know that much more than anything else. How then, could an equation, even a truly beautiful equation, tell me I’m wrong? I know that I am free much more intimately than I will ever know any equation. Hence, and despite the grandiloquent speeches, I know in my gut that the Schrodinger equation can’t be the full story; there must be something else. “But what?”, reply the dictator’s priests. Admittedly, I don’t know, but I know the multiverse hypothesis is wrong, simply because I know determinism is a sham [4–9]

He is right. You know it is bogus when Carroll tells us that Occam's Razor and the beauty of the Schroedinger equation require us to believe that zillions of new universes are being created every second. When you think you are making a decision, you are just creating new universes where everything possible happens.

You do not have to understand the mathematics of the Schroedinger equation to know that this is foolishness.

Here is a reply to Gisin.

According to compatibilism, it is perfectly possible that our will is compatible with a causally closed world. But this may seem to be a too simplistic semantic trick to avoid the problem, and there is more to be said.

But how can indeterminism allow free-will? How would it help if our decisions are not fully determined by our own present state, but by occasional randomness breaking into the causal chain?

Wouldn’t we be more free if we can determine our next decisions based on how we are now, rather than letting them at the mercy of randomness?

When the super-smart AI robots take over the world, I expect them to use sneaky philosophical arguments like this to convince people that the truest possible freedom is to become a slave to a deterministic algorithm.
But why being restricted to a unique choice would mean more freedom than making all possible choices in different worlds? A world in which we can choose only one thing and all the others are forbidden restricts our freedom. MWI allows us to follow Yogi Berra’s advice,
When you come to a fork in the road, take it.
So true freedom is a child's imagination, where anything can happen.
Could it be true that in MWI the histories in which Shakespeare produced randomly both great and bad literature overwhelmingly dominate the multiverse? If MWI gives the same probabilities as standard QM, Shakespeare should create consistently great or consistently bad literature in most histories.

So how would entanglement limit creativity?

It is hard to believe physicists say this stuff seriously.

Here is another attempted rebuttal. I guess the Gisin paper touched a nerve.

Second, the concept of free will is vague and ill-defined - so it is a shaky basis to build a general argument against a given physical theory. Of course we all experience (and enjoy) the feeling of making our decisions spontaneously and autonomously, and it is comforting to know that this feeling is not in contradiction with our most fundamental understanding of the universe; but in order to understand exactly how physical laws allow for free will (or consciousness, or creativity – call it whatever you like), one needs a physical theory of it, which we currently do not have.
In other words, we all experience free will, but our physics cannot explain it, so we should just go with theories that make it impossible.
Another argument against unitary quantum theory is that its only available interpretation is the so-called “Many-world” interpretation ...

Unitary quantum theory is consistent; it provides a good explanation of all the experimental observations so far, and (unlike some of its stochastic variants) it is also compatible with properties of general relativity, such as locality and the equivalence principle.

Yes, unitary QM is just another name for Many-worlds, and it has never been able to explain any experiments, nor does it have any compatibility advantage with relativity.

In regular QM, you collect some data, make a wave function, compute a prediction, do an experiment, and get a definite outcome. Then you collapse the wave function to incorporate the new info.
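As a minimal sketch of that recipe for a single qubit (illustrative code, not any particular textbook's):

    # Minimal single-qubit sketch of the recipe above: predict with the
    # Born rule, observe one definite outcome, then collapse the state.
    import numpy as np

    rng = np.random.default_rng(1)
    psi = np.array([0.6, 0.8])            # prepared state (normalized)
    basis = np.eye(2)                     # measurement basis |0>, |1>

    probs = np.abs(basis @ psi) ** 2      # prediction: Born rule
    outcome = rng.choice(2, p=probs)      # experiment: definite outcome
    psi = basis[outcome]                  # collapse: incorporate new info
    print(f"probabilities {probs}, observed {outcome}, new state {psi}")

The unitary Schroedinger evolution supplies only the probabilities line; the definite outcome and the collapse are extra steps.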

In a unitary theory, all the possible outcomes that did not happen must live on somehow. We do not see them, so we suppose them to be in the parallel worlds. So the theory does not really predict anything, because it says all things happen invisibly.

Saying that the unitary QM theory is consistent and explains experiments is just nonsense. The theory says anything can happen. It only explains experiments in the sense that whatever we see is one of the possibilities in a theory saying everything is a possibility. That's all. The theory cannot even say that some outcomes are more probable than others.

Here is a recent podcast from another free will denier. He says most physicists say that QM has inherent indeterminacies, but he subscribes to superdeterminism. He goes on to explain that Schroedinger's cat is either alive or dead, and long-term weather predictions may be impossible due to chaos. Okay, but no free will? He returns to the subject at 1:33:20. His only reluctance to accept full determinism is that he does not want to excuse Hitler for moral responsibility.

Monday, October 17, 2022

Science Grants for Equity, not Science

Physicist Lawrence Krauss writes in the WSJ:
Now Even Science Grants Must Bow to ‘Equity and Inclusion’

Forget the Higgs boson and neutrinos. The Energy Department wants to know your diversity plan.

Starting in fiscal 2023, which began Oct. 1, every proposal responding to a solicitation from the Office of Science is required to include a PIER plan, which stands for Promoting Inclusive and Equitable Research, to “describe the activities and strategies of the applicant to promote equity and inclusion as an intrinsic element to advancing scientific excellence.” In the words of the announcement, “The complexity and detail of a PIER Plan is expected to increase with the size of the research team and the number of personnel to be supported.”

When I read this new requirement, I went back to the last grant proposal from our group—which involved exploring gravitational waves, the early universe, Higgs boson physics, neutrino cosmology, dark-matter detection, supersymmetry and black-hole physics. What does any of this have to do with diversity and inclusion? Nothing.

There is a cost to this. I expect American science to go into a long-term decline. Colleges are no longer accepting the best students, universities are not hiring the best professors, and the best research is not being funded.

Thursday, October 13, 2022

Nobel Prize was Not for Non-locality

Commentary about the 2022 Nobel Physics Prize breaks down into two camps: (1) the experiments confirmed quantum mechanics as it has been understood since 1932; or (2) revolutionary experiments show nature is not real or local.

I pointed out that the Nobel committee stuck to (1), and pointedly did not endorse (2).

Another person who noticed this was Physics Philosopher Tim Maudlin, who wrote:

Unfortunately, much of this history has been garbled in the public discussion of Bell’s work and its experimental tests. The Nobel prize committee itself gets it wrong in its press release,
"John Clauser developed John Bell’s ideas, leading to a practical experiment. When he took the measurements, they supported quantum mechanics by clearly violating a Bell inequality. This means that quantum mechanics cannot be replaced by a theory that uses hidden variables."
But that statement is flatly false. Indeed, it was a theory that uses hidden variables—Bohmian mechanics—that inspired Bell to find his inequalities, and that theory makes the correct prediction that the inequalities will be violated.
The Nobel statement is correct if "theory" means "local theory". For the most part, theories have to be local to be scientific. A nonlocal theory can have magical action-at-a-distance.

Bohr, Einstein, Bohm, Bell, and everyone else did not believe in nonlocal theories. Bell and Bohm only wrote about Bohm's theory as a mathematical curiosity. Bell was partially motivated by trying to find a local version of Bohm's theory, not to accept a nonlocal theory.

Maudlin is exceptional in that he believes in nonlocal interpretations. I think Bohmian pilot wave theory is his favorite, and he claims it can be turned into a full interpretation of quantum mechanics.

Much as people like to argue that QM is strange, Bohmian mechanics is far stranger. In it, an electron can be observed in one place, while its ghost causes weird effects elsewhere.

Maudlin concludes:

What Bell’s theoretical work and the subsequent experimental work of Clauser, Aspect and Zeilinger proved was non-locality, not no-hidden-variables. Ultimately, they proved Einstein wrong in his suspicions against spooky action-at-a-distance. And that, surely, deserves the highest honors one can bestow.
Yes, they would deserve the highest honors if they proved non-locality, and that spooky action-at-a-distance really does happen. But the Nobel Prize citation pointedly does not say any of those things. The prize was only for experimental work confirming quantum mechanics as it was understood in 1932.

In his later years, Bell adopted a view that a true theory of nature should be based on hidden variables, for which he coined the term "beables". So hidden variables became a philosophical necessity, and when the Bell test experiments ruled out local hidden variables, he adopted nonlocal hidden variables. I think that Maudlin was persuaded by that argument.

But the mainstream textbooks are not persuaded, and neither is the Nobel committee.