Thursday, December 29, 2022

What is Relativity?

There are two main ways to describe the theory of relativity.

(1) A way of reconciling Maxwell's electromagnetism theory with the motion of the Earth.

(2) Combining space and time into a 4-dimensional spacetime with a non-Euclidean geometry, such that the laws of physics respect that geometry.

Version (1) was the view of Lorentz and Einstein (up to 1912 or so), and also FitzGerald, Maxwell, and other pioneers.

Version (2) was the view of Poincare (1905), Minkowski (1907), Grossmann (1913), and Hilbert (1915). Einstein eventually seemed to adopt this view at times, but he also denied that geometry had any essential role.

Wikipedia defines:

The theory of relativity usually encompasses two interrelated theories by Albert Einstein: special relativity and general relativity, proposed and published in 1905 and 1915, respectively.[1] Special relativity applies to all physical phenomena in the absence of gravity. General relativity explains the law of gravitation and its relation to the forces of nature.[2] It applies to the cosmological and astrophysical realm, including astronomy.[3]

The theory transformed theoretical physics and astronomy during the 20th century, superseding a 200-year-old theory of mechanics created primarily by Isaac Newton.[3][4][5] It introduced concepts including 4-dimensional spacetime as a unified entity of space and time, ...

Typical scholarly historical papers are Elie Zahar's Why did Einstein's Programme supersede Lorentz's, Part I and Part II. (Paywalled)

It is a historical fact that Einstein's programme had no significant influence; it was the Poincare-Minkowski geometric view that superseded Lorentz's in 1908. Einstein's programme was called the Lorentz-Einstein theory, and almost nobody thought he was saying anything different from Lorentz. Vladimir Varicak credited Einstein with a more geometric view in 1911, but Einstein published a rebuttal, denying any difference from Lorentz.

In today's textbooks, relativity is the covariance of Maxwell's equations and other laws under the symmetries of spacetime, notably Lorentz transformations.
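
For reference, a Lorentz boost with velocity v along the x-axis is

    t' = γ (t - v x/c^2),   x' = γ (x - v t),   γ = 1/√(1 - v^2/c^2),

with y and z unchanged. Covariance means that Maxwell's equations take exactly the same form in the primed coordinates as in the unprimed ones.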

Wikipedia is correct that relativity "introduced concepts including 4-dimensional spacetime as a unified entity of space and time", but Einstein did not introduce that concept, and did not accept it in 1908 when everyone else did. It is not clear that he ever completely accepted it, as he denied the importance of geometry.

When I say non-Euclidean geometry, I do not just mean the curvature of general relativity, or the hyperbolic geometry of rapidity in special relativity. I mean the geometry that Minkowski so clearly explained in 1907-8, where 4-D spacetime has an indefinite metric and a symmetry group preserving that metric. A geometry is a space with some structure, and a group preserving the structure. For example, see this recent Roger Penrose interview, where he calls it Minkowski geometry. Or see this recent Terry Tao post.
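
In symbols (using units with c = 1), Minkowski's structure is the indefinite metric

    ds^2 = -dt^2 + dx^2 + dy^2 + dz^2,

and the symmetry group is the Lorentz group, the linear maps Λ satisfying Λ^T η Λ = η, where η = diag(-1, 1, 1, 1). The space together with that group is the geometry, in exactly the sense just described.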

Thursday, December 22, 2022

Is Psi Ontic or Epistemic?

Here is a typical paper addressing the reality of the wave function in quantum mechanics:
The ontological models framework distinguishes ψ-ontic from ψ-epistemic wave-functions. It is, in general, quite straightforward to categorize the wave-function of a certain quantum theory. Nevertheless, there has been a debate about the ontological status of the wave-function in the statistical interpretation of quantum mechanics: is it ψ-epistemic and incomplete or ψ-ontic and complete?
I do not see how this question makes any sense. The wave function is not directly observable. It is useful for making predictions. What sense is there in wondering whether or not it is real?

Bell's theorem shows that psi cannot be epistemic in the sense that it shows our knowledge about an underlying local hidden variable theory. But it could still encode our knowledge about the physical state.

The paper also talks about whether psi is statistical. Again, this makes no sense. All physical theories are statistical in the sense that you can run multiple experiments, and compute statistics on the predictions and outcomes. Quantum mechanics is not much different from classical mechanics in this way. All the theories give error bars on predictions, if you do them right.

Supposedly the PBR Theorem proves that psi is ontic. This is very misleading, at best. If I release an electron from my lab, its wave function will soon be miles wide. Does anyone think that the ontic reality of that electron is miles wide? No, that is absurd. Only our uncertainty is miles wide, because we don't know which way the electron went.
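
To put a number on that claim, here is a back-of-the-envelope Python sketch using the standard spreading formula for a free Gaussian wave packet; the one-micron initial width is my own assumption for illustration:

    import math

    hbar = 1.054571817e-34    # reduced Planck constant, J*s
    m_e = 9.1093837015e-31    # electron mass, kg

    def width(sigma0, t):
        # Standard result for a free Gaussian packet:
        # sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2)
        return sigma0 * math.sqrt(1 + (hbar * t / (2 * m_e * sigma0**2))**2)

    sigma0 = 1e-6             # assumed initial width: 1 micron
    for t in (1.0, 60.0, 3600.0):
        print(f"after {t:6.0f} s: width ~ {width(sigma0, t):.2e} m")
    # after 1 s: ~58 m; after 60 s: ~3.5 km; after an hour: ~208 km

After a minute the packet is kilometers wide, so "miles wide" is no exaggeration. But it is the uncertainty that spreads, not the electron.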

The PBR paper was originally submitted under the title, "The quantum state cannot be interpreted statistically". It is baffling how anyone could think that they proved something so absurd. Everything can be interpreted statistically. How could it not be? When the paper was published, the editors forced them to change the title to "On the reality of the quantum state". Of course the paper cannot prove anything about reality either.

The Schroedinger Cat thought experiment was supposed to dispose of the idea that the wave function is ontic. The cat is not really half dead and half alive. The wave function predicts probabilities consistent with observation, and that is all. Schroedinger and Bohr thought that this was obvious.

A lot of people seem to believe that the Copenhagen interpretation requires believing in a cat that is a superposition of dead and alive. Wikipedia says so. But it also says that Bohr himself did not, and believed that something like decoherence would irreversibly put the cat into the dead state or the alive state, even before anyone opens the box.

Here is a recently-released short video on Max Tegmark - Many Worlds of Quantum Theory. I wonder if these physicists realize how ridiculous they sound when they give layman explanations.

He says he believes that the many worlds are real, because we see electrons being in two places at once in the lab. But no one has ever seen an electron in two places at once. The double-slit experiment can be interpreted as an electron going through both slits, but that is just a wave property of an electron. And even if an electron can be in two places, and humans are made of electrons, it does not follow that humans can be in two places.
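
The wave property at work is just the de Broglie relation and ordinary two-slit interference: an electron of momentum p has wavelength

    λ = h/p,

and for slit separation d and screen distance L the intensity on the screen goes as

    I(x) ∝ cos^2(π d x / (λ L)).

The wave goes through both slits; that is not the same as a particle being in two places.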

Never mind the details. Many worlds theory is an absurdity. He says these worlds are real, and there is no evidence for them whatsoever. Listening to him is like going to a comic book convention and watching someone, dressed in a costume, ramble about some fantasy world as if it were real.

Tegmark ends by saying that many-worlds could be disproved if quantum mechanics were disproved. He does not even admit the obvious possibility that quantum mechanics is correct and many-worlds is not.

Sabine Hossenfelder has a new lecture on The Other Side Of Physics (TEDxNewcastle). It is hard to take anything she says seriously, once you realize that she believes in superdeterminism. She talks about Physics as a tool for answering questions, but that is not true under superdeterminism, which teaches that experimental results cannot be believed, because the outcomes could have been forced by a conspiracy dating back to the Big Bang. It says that all the quantum experiments could be fake, and that we really live under classical mechanics.

In particular, she argues that the correlations of the Bell test experiments are all wrong because they are artifacts of the experimenters failing to properly randomize the input data, and they always fail because of mysterious constraints.

She also says that Einstein's special theory of relativity showed that there is no such thing as "now". This is because when we see or hear something, it takes time for the signals to reach our brains.

Then she rambles about multiverse theory being ascientific, because the scientific method cannot be applied to it. But surely superdeterminism is even worse, and is unscientific, as it says that the scientific method does not work.

She says that information cannot be destroyed. Seems like more unscientific nonsense to me.

There are certain ideas in Physics that are so outlandish as to discredit anyone who promotes them. I include: many-worlds, superdeterminism, retrocausality, simulation hypothesis, and action-at-a-distance.

Monday, December 19, 2022

Do moving clocks slow down?

Some Russian scholars address this question:
The special theory of relativity has fundamentally changed our views of space and time. The relativity of simultaneity in particular, and the theory of relativity as a whole, still presents significant difficulty for beginners in the theory. ...

The real question is why we can even find real clocks (such as atomic clocks) that work very much like the ideal clocks of relativity. The short answer to this question is that a real clock behaves like an ideal clock because the physics that governs its inner workings is Lorentz invariant. ...

However, this question makes sense in the more ambitious program of Lorentz and Poincaré on how the theory of relativity should be developed. Lorentz noted in 1909 that the main difference between the two programs is that “Einstein simply postulates what we have deduced, with some difficulty and not altogether satisfactorily, from the fundamental equations of the electromagnetic field” [109].

Although the Lorentz-Poincaré program of deriving the Lorentz symmetry rather than postulating it was never completely abandoned [110, 2], it is clear that Einstein’s program replaced it [109, 111], and for good reason: even today, Lorentzian symmetry cannot be deduced from a more fundamental theory.

The phrase “moving clocks slow down” is universally used and has been in common use for over a hundred years, and we realize that our suggestion not to use it, while correct, is not very realistic due to the “intellectual inertia” [112] of the physics community. ...

In our opinion, better pedagogical practice would be to base the special relativity from the very beginning on the Minkowski four-dimensional formalism, which allows the introduction of relativistic concepts without any mention of their Newtonian counterparts [13].

It is amusing to see Russian scholars engaging in the same sort of Einstein idol worship that we see in the West, especially when contrary evidence is staring them in the face.

Yes, Einstein simply postulated what Lorentz and Poincare proved. By this, Lorentz meant that they deduced it from experiments, like Michelson-Morley, and established theory, like Maxwell's equations. Einstein ignored all that, took their conclusions, and made them postulates.

Einstein did not dispute this. His original papers did not have any references, but in later interviews he always said he got the light postulate from Lorentz, and did not use Michelson-Morley himself.

The Minkowski 4-D formalism was invented and published by Poincare in 1905, and popularized by Minkowski in 1908. Einstein eventually accepted it a few years after that.

I am not sure Einstein's program is really so popular in textbooks. All the books I've seen either start with Michelson-Morley, or go straight to 4-D. The books might credit Einstein with these ideas, but he missed them.

For example, here are a couple of recent lectures on spacetime, here and here. Both credit Einstein with discovering relativity, but acknowledge that it was Minkowski's 4-D formalism that caught on among physicists of the day and still dominates, and that Einstein accepted it only reluctantly, a couple of years later, after arguing against it.

In his 1922 Kyoto lecture “How I Created the Theory of Relativity,” Einstein describes the decisive moment when he became enlightened in conversations with his friend Michel Besso: “My interpretation was really about the concept of time. Namely, time could not be defined absolutely, but is in an inseparable relationship with the signal velocity” [107].

Although the alleged cause of Einstein’s redefinition of time is not known for certain, and it is conceivable that Einstein, consciously or unconsciously, may have borrowed from other authors, most plausibly from Poincaré, more than his writings and sayings suggest [108], without any doubt, this new definition of time was the most shocking and paradoxical aspect of special relativity.

Einstein elevated Lorentz’s local time to the status of “true” time, and for his contemporaries it became the successor to Newtonian time. This immediately gave the theory a paradoxical tinge, since Newtonian time is absolute, while Lorentz’s local time varies depending on the inertial frame of reference.

Einstein said his big breakthrough in 1905 was to understand the local time that Lorentz invented in 1895. Lorentz got the Nobel Prize in 1902, and Poincare's nomination credited him with the ingenious invention of local time.

To answer the title question, the clocks do not slow down, relative to their own world lines. They appear to slow down in another frame, because of the 4-D non-Euclidean geometry.
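
In that geometry, the effect is a matter of projection. A clock ticks off the proper time along its own worldline,

    dτ^2 = dt^2 - (dx^2 + dy^2 + dz^2)/c^2,

and a clock moving at speed v in some frame satisfies Δt = γ Δτ, with γ = 1/√(1 - v^2/c^2); likewise a moving rod measures L = L_0/γ. Nothing happens to the clock or the rod; different frames just take different slices of the same 4-D objects.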

Discussion of whether the special relativity contractions and dilations are real or apparent has a long history. From Wikipedia:

In the period 1909 to 1913 Varićak had correspondence with Albert Einstein[6] concerning rotation and length contraction where Varićak's interpretations differed from those of Einstein. Concerning length contraction Varićak said that in Einstein's interpretation the contraction is only an "apparent" or a "psychological" phenomenon due to the convention of clock measurements whereas in the Lorentz theory it was an objective phenomenon.[7] Einstein published a brief rebuttal, saying that his interpretation of the contraction was closer to Lorentz's.[8]
See also Reality of length contraction. Einstein still did not accept in 1911 that relativity used a non-Euclidean geometry, or that the contractions and dilations were artifacts of that geometry.

Another new paper argues:

There have been three geometrizations in history. The first one is historically due to the Pythagorean school and Plato, the second one comes from Galileo, Kepler, Descartes and Newton, and the third is Einstein's geometrization of nature. The term geometrization of nature means the conception according to which nature (with its different meanings) is massively described by using geometry.
Einstein would not agree with this. He did not believe that General Relativity geometrizes gravity. He persisted in this view long after he was credited with geometrizing gravity. Steven Weinberg also did not like that view. Strange, since both sometimes used geometric arguments to solve general relativity problems, and almost everyone else accepts the geometric view.

Wednesday, December 14, 2022

Dynamical Models do not Generate Free Will

Computer science professor Scott Aaronson's blog attracts trolls, but a couple of comments there are so glaringly wrong that it is worth showing why.

SR Says:

As pointed out by Mateus Araújo above, QFT is local, so it seems to me that the only ways to reconcile this with the nonlocality of standard QM evidenced by the Bell test are: (1) QFT being totally incorrect in an easily-measurable way, (2) Many-Worlds, so that the appearance of non-locality is only due to our observable universe residing in a slice of the “true” wavefunction, (3) superdeterminism.
If those were really the only consequences of the Bell tests, then the Nobel citation would have said so. It did not.

Fred writes:

“It’s very weird that we feel that we have the free will to perform experiments and the consciousness to feel like we understand them, when, to our best understanding, everything is predetermined by the laws of physics.”

What’s even weirder is that determinism isn’t even the core culprit here! You only have two ingredients at each end of the spectrum: perfect determinism (e.g. a bunch of balls moving around and hitting one another… i.e. every event is caused by a prior event), and pure randomness (everything is uncorrelated noise, i.e. no causality, things happen without a prior cause… very weird too when you think about it).

And then you can mix those two in various amounts, on a continuous scale. QM is a mix of determinism and randomness, somewhere in the middle of the scale. MWI + consciousness also seems to lie in the middle of the scale (the wave function of the universe is determined, but my place as a conscious being on that structure seems random, from my subjective point of view).

When it comes to free will: sure, determinism seems to obviously exclude it… but randomness seems to exclude it too! For the general idea of free will isn’t exactly understood by throwing a dice at every moment a supposed “decision point” happens.

He is right that our dynamical models do not generate free will. That is why it is called free will.

He responds:

You might as well call it pixie magic dust then?

A dynamic system is a system whose state evolves with time (a parameter).

My point is that dynamic systems either evolve following causality (current state is derived from prior state) and/or randomness (current state is independent of prior state), and then any degree of mix of those two things (where events depend partly on prior events and partly on some randomness).

Note that randomness is non-determinism, meaning an event without any cause within the system. Whether that randomness is pure (appearing magically within the system) or is a dependence on causes external to the system is basically the same.

That’s it!

What other ingredient would there be?

No, randomness is not an event without any cause within the system; it is an event without cause in the model. Something is random if your model cannot predict it. There is no such thing as pure randomness.

Free will is a form of randomness that no one is even trying to model.

He moved on to the latest scientific hoax:

Looks like fusion energy supremacy has been demonstrated!

Scott Says: Only “scientific supremacy,” not supremacy where you account for the actual energy cost of the lasers which is still 100x too high. Still good though!

They refer to this story:
Scientists with the U.S. Department of Energy have reached a breakthrough in nuclear fusion.

For the first time ever in a laboratory, researchers were able to generate more energy from fusion reactions than they used to start the process. The total gain was around 150%.

"America has achieved a tremendous scientific breakthrough," Energy Secretary Jennifer Granholm said at a press conference.

The achievement came at the National Ignition Facility (NIF), a $3.5 billion laser complex at Lawrence Livermore National Laboratory in California. For more than a decade, NIF has struggled to meet its stated goal of producing a fusion reaction that generates more energy than it consumes.

To make this claim, they ignore the energy required to run the lasers, and to power the confinement structure. As Scott notes, they are really putting 100x energy in.
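
The arithmetic, using the widely reported figures: the lasers delivered about 2.05 MJ to the target, and the fusion reactions released about 3.15 MJ, for a target gain of 3.15/2.05 ≈ 1.5, the announced "150%". But firing the lasers drew roughly 300 MJ from the grid, so the wall-plug gain was about 3.15/300 ≈ 0.01. That is Scott's factor of 100.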

The analogy here is that this new experiment is supposed to show that it is possible to generate fusion energy, even if the overall efficiency is lousy. Quantum supremacy is supposed to show that it is possible to generate a super-Turing computation, even if it is completely useless.

Update: I added this comment:

@fred: You say that events are either caused by the previous state (determinism), or independent of it (random), and magic pixie dust is the only other possibility.

Consciousness is the other possible cause. My conscious state appears inscrutable to you. You cannot model it, and my decisions appear unpredictable. You might as well call them random, if you cannot predict them. I cannot really explain it either, except to say that I am more sure of it than any of my perceptions. Cogito ergo sum. (I think, therefore I am.)

Readers here might say I could be a troll or an AI bot, and hence not to be believed. Fine, decide for yourself. Do you really think that modern physics has explained causality so thoroughly as to rule out human consciousness? Or do you make decisions every day that science cannot predict or explain?

Update: Another comment:
They got a breakthrough on Nuclear Fusion. No one now can say quantum computing is the nuclear fusion of the 60s.
Not sure if this is intended to be sarcastic. Quantum computing is more like nuclear fusion than ever, with both fields making big splashes with supremacy claims of no practical significance.

Monday, December 12, 2022

Impossible or Fundamentally Impossible

Here is Scott Aaronson's main argument for quantum computing:
I notice that you never responded about a fault-tolerant QC running Shor’s algorithm. Do you believe that that’s fundamentally possible, or not? If not, what physical principle is going to come in and prevent it? Will you agree that the discovery of that principle would be a revolution in physics?
He draws a distinction between what is impossible, and what is fundamentally impossible. I am not sure the distinction makes any sense.

In other words, QC might be impossible, but it is not fundamentally impossible unless some law of physics forbids it. We have not found that law of physics. The Extended Church-Turing Thesis forbids it, but he says it is not fundamental and it "is still on thin ice."
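
For context on what a fault-tolerant QC running Shor's algorithm would mean: the only quantum step in Shor's algorithm is finding the period of a^x mod N; everything else is classical number theory. A minimal Python sketch, with brute-force search standing in for the quantum subroutine:

    from math import gcd

    def order(a, N):
        # Multiplicative order of a mod N, by brute force.
        # This is the step a quantum computer would do with period finding.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_factor(N, a):
        # Classical wrapper of Shor's algorithm: split N using the order of a.
        g = gcd(a, N)
        if g > 1:
            return g, N // g       # lucky: a already shares a factor with N
        r = order(a, N)
        if r % 2 == 1:
            return None            # odd order: try another a
        y = pow(a, r // 2, N)
        if y == N - 1:
            return None            # trivial square root: try another a
        return gcd(y - 1, N), gcd(y + 1, N)

    print(shor_factor(15, 7))      # prints (3, 5)

The brute-force order() takes exponential time in the number of digits of N; a fault-tolerant quantum computer is supposed to do that step in polynomial time, which is what would break RSA.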

A response:

From an engineering point of view, there are often unforeseen limitations emerging from complex interactions of different domains such as physics of materials, chemistry, thermodynamics, mechanics, economics, etc. Those different knowledge fields are themselves at a much higher conceptual level compared to the underlying basic physics they all share, so their own laws/heuristics only hold in specific domains with specific assumptions, and all those various models (often highly non linear) just don’t overlap.

There’s nothing in the basic laws of physics explicitly saying that you can’t build a stable stack of quarters from here all the way up to the edge of space.

But do you believe it can be done? Given an existing stack of quarters, it’s trivial to just add one more quarter to it, and then by recursion assume the stack can be arbitrarily high. But that’s not how system scalability works in practice: at some point, what works for 100 quarters won’t work for 1000 quarters, because new problems are introduced: e.g. the wind will screw things up, and if you build your stack inside a tube with a vacuum, you’re now facing another totally different engineering challenge (create a 100 mile-high tube that can contain a vacuum). And, even without air, you’d have to deal with the effects of tides, plate tectonics, strength limitations in alloys, etc.

There’s also no specific law of physics telling us whether building room temperature super-conductors is impossible.

Same about building a stealth bomber that can travel faster than mach 5 at sea level.

It also goes the other way: a hundred years ago, it would have seemed impossible (given the technology of the day, but given pretty much the same laws of physics) to build a gravitational wave detector that could measure changes in distance around 1/10,000th of the diameter of a proton, between two mirrors separated by 4 km.

So, for the vast majority of hard engineering problems (and building a QC *is* a hard engineering problem), the fact that there’s no clear black and white basic principle saying it’s impossible isn’t really helping much at all. It wouldn’t be the first time we set out to build something, and then it never happens because various requirements just can’t be met within the same system (often it’s quietly killed because money runs out and people move on to other things because some new engineering progress makes an entirely different problem more exciting to work on).

So maybe QC is like that gravitational wave detector. It seemed impossible for a long time, until some huge technological advances were made.

In the same thread, Aaronson ridicules the idea that Einstein might have thought that ER=EPR. Einstein did not even believe in wormholes or quantum mechanics. It is not clear today if anyone really believes this wormhole entanglement nonsense. Somehow Einstein did inspire a lot of bogus physics thinking.

Sabine Hossenfelder has a new video on Quantum Uncertainty Simply Explained. She correctly describes the uncertainty principle as an essential part of quantum mechanics, but also explains that all waves obey an uncertainty principle. The uncertainty principle is a way of saying that electrons have wave-like behavior.
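
The general wave fact is that any wave packet satisfies

    Δx Δk ≥ 1/2

for position spread Δx and wave number spread Δk; substituting the de Broglie relation p = ħk turns this into Heisenberg's

    Δx Δp ≥ ħ/2.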

Update: Quanta has an article by Robbert Dijkgraaf saying There Are No Laws of Physics. There’s Only the Landscape. That is because he is a string theorist who studies mathematical abstractions that have nothing to do with reality.

Thursday, December 8, 2022

Holographic Wormhole on a Microchip

From Nature, the leading European science journal:
A holographic wormhole in a quantum computer

Physicists have used a quantum computer to generate an entity known as an emergent wormhole. Quantum systems can be linked by entanglement even when separated by extremely long distances. The authors generated a highly entangled quantum state between the two halves of a quantum computer, creating an alternative description, known as a holographic dual, in the form of an emergent wormhole stretched between two exterior regions. They then simulated a message traversing this wormhole. Such exotic physics is part of efforts to reconcile quantum mechanics with the general theory of relativity.

Dr. Quantum Supremacy responds:
Tonight, David Nirenberg, Director of the IAS and a medieval historian, gave an after-dinner speech to our workshop, centered around how auspicious it was that the workshop was being held a mere week after the momentous announcement of a holographic wormhole on a microchip (!!) — a feat that experts were calling the first-ever laboratory investigation of quantum gravity, and a new frontier for experimental physics itself. Nirenberg asked whether, a century from now, people might look back on the wormhole achievement as today we look back on Eddington’s 1919 eclipse observations providing the evidence for general relativity.

I confess: this was the first time I felt visceral anger, rather than mere bemusement, over this wormhole affair. Before, I had implicitly assumed: no one was actually hoodwinked by this. No one really, literally believed that this little 9-qubit simulation opened up a wormhole, or helped prove the holographic nature of the real universe, or anything like that. I was wrong.

That 1919 eclipse was hyped with a NY Times headline: “Men of Science More or Less Agog Over Results of Eclipse Observations”.

Update: Here is the Quanta video.

Almost a century ago, Albert Einstein realized that the equations of general relativity could produce wormholes. But it would take a number of theoretical leaps and a “crazy” team of experimentalists to build one on Google's quantum computer. Read the full article at Quanta Magazine:
Quanta used to be a respectable magazine. As was Nature, which is now filled with woke nonsense.

Monday, December 5, 2022

What is Entanglement?

Entanglement is supposed to be the essence of quantum mechanics, but I wonder about it. Wikipedia defines:
Quantum entanglement is the physical phenomenon that occurs when a group of particles are generated, interact, or share spatial proximity in a way such that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.[1]

Measurements of physical properties such as position, momentum, spin, and polarization performed on entangled particles can, in some cases, be found to be perfectly correlated. For example, if a pair of entangled particles is generated such that their total spin is known to be zero, and one particle is found to have clockwise spin on a first axis, then the spin of the other particle, measured on the same axis, is found to be anticlockwise.
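
The spin example in the quote is the singlet state, usually written

    |ψ⟩ = (|↑↓⟩ - |↓↑⟩)/√2,

where measuring one spin along any axis fixes the outcome of the same-axis measurement on the other.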

The Wikipedia definition confusingly says entanglement is a "physical phenomenon", and then says it is a property of how quantum states are described. So it is not a physical property. I removed the word "physical".

My bigger issue is "quantum entanglement is at the heart of the disparity between classical and quantum physics". This is conventional wisdom, so it is reasonable for Wikipedia to say this, but is it true?

Classical systems exhibit entanglement, exactly as it is described here. Suppose you have a system of two balls, and you know their total momentum. Assume that momentum is conserved. Then the balls get separated, and you measure the momentum of one of them. You then know the momentum of the other ball, as it is perfectly correlated and opposite. The momenta must sum to the original total.

You can do the same with angular momentum.
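
Here is a toy Python sketch of that classical case (arbitrary units, my own illustration):

    import random

    def make_pair():
        # A pair is created with total momentum zero, as in the example above.
        p = random.uniform(-1.0, 1.0)
        return p, -p

    for _ in range(5):
        p1, p2 = make_pair()
        # Measuring ball 1 determines ball 2 exactly, however far apart they are.
        print(f"ball 1: {p1:+.3f}  inferred ball 2: {-p1:+.3f}  actual: {p2:+.3f}")

The correlation is perfect, and no signal passes between the balls.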

So how is this any different from the quantum entanglement? I do not see any difference.

Anticipating your objections, you might tell me that the quantum momentum and spin are not known until measured, or that Bell's theorem puts limits on certain correlations. Okay, I accept all that, but what does it have to do with the definition of entanglement? Nothing.
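
For reference, the limit in question is the CHSH form of Bell's theorem. For measurement settings a, a' on one side and b, b' on the other, define

    S = E(a,b) - E(a,b') + E(a',b) + E(a',b').

Any local hidden variable model gives |S| ≤ 2, while quantum mechanics allows up to 2√2. But that is a bound on correlations, not a definition of entanglement.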

Or you might say that entanglement is a nonlocal interaction. Spooky action and all that. But that is a big hoax. Quantum mechanics is a local theory.

A more sophisticated objection says that entanglement is a precious resource that can be used for cryptography, teleportation, and parallel computation. Quantum entangled particles have the necessary magic, and classically entangled ones do not.

This objection is harder to answer, as currently billions of dollars are being spent to try to determine whether any such magic exists. Most say yes, but they have yet to demonstrate anything useful from that magic.

Even if such magic exists, there ought to be a way to define entanglement that makes it clearly a quantum phenomenon, and not a classical one.

Quantum systems are different from classical systems. Bell proved that. The uncertainty principle is different, and so are certain correlations. But I don't see how entanglement is any different.

Lenny Susskind and others have been saying that wormholes and entanglement are the same thing, as I noted in my previous post below. The idea seems to be that if entanglement and wormholes are the same thing, and quantum computers use entanglement to do super-Turing computations, then there should be some wormholes hiding inside a quantum computer. Seems like a joke to me, but I did not read the details.

Update: Peter Woit writes:

The best way to understand the “physicists create wormholes in the lab” nonsense of the past few days is as a publicity stunt ...

I’m hoping that journalists and scientists will learn something from this fiasco and not get taken in again anytime soon. It would be very helpful if both Nature and Quanta did an internal investigation of how this happened and reported the results to the public. Who were the organizers of the stunt and how did they pull it off? ...

... his claims in the Quanta video that the result of the Google quantum computer calculation was on a par with the Higgs discovery. Does he really believe this (he’s completely delusional) or not (he’s intentionally dishonest)? ...

Peter Shor says: It seems to me that the string theorists and “it from qubit” community seem to have this unwritten rule that they don’t criticize other members of this community in public.

I used to think that Physics had higher standards than other sciences for truth and professionalism. Apparently not.

Another comment:

I thought you’d be interested to know that the “wormhole created in a quantum computer” story is now being covered in some far-right-wing media. I won’t name them here (they’re very far-right sites, not sure if you’d allow a link here), but they’re essentially saying “isn’t this manifestly stupid? See? Why should we believe scientists when they publish bullshit like this?” and essentially use the story to argue that scientists and science journalists are all a bunch of idiots, hence why should we trust them on vaccines/climate change etc.

This is another consequence of bad publicity stunts like this: it erodes trust in scientists.

Here is one such site. It is so disreputable that Google confiscated its domain name. Possibly the most censored site in the world. It just quotes a NY Times tweet, a Google tweet, a Reuters story, and a couple more tweets, and adds:
This is very obviously fake and it’s goofy that people think it’s real.
I agree with that. It is goofy that people think that this research is real.

Thursday, December 1, 2022

Entanglement Discovered in a Quantum Computer

Lenny Susskind and others have been saying that wormholes and entanglement are the same thing.

YouTube:

Almost a century ago, Albert Einstein realized that the equations of general relativity could produce wormholes. But it would take a number of theoretical leaps and a “crazy” team of experimentalists to build one on Google's quantum computer. Read the full article at Quanta Magazine:
The idea seems to be that if entanglement and wormholes are the same thing, and quantum computers use entanglement to do super-Turing computations, then there should be some wormholes hiding inside a quantum computer. Seems like a joke to me, but I did not read the details.

See Peter Woit for details. At least one physicist calls it a publicity stunt. The quantum computer researchers have burned a lot of money, and need something to show for it.

Update: A comment:

Even if the headline isn’t strictly accurate (a topic for another time, although I think you’re splitting hairs here), what’s the harm? It’s a cool-sounding result which gets people interested in theoretical physics, science more generally. As long as science journalists are driving interest and engagement, I think they’re doing a good job. If you want to discuss bad science journalism, surely a better use of your time would be all the anti-science fake news coming from the populist right in the U.S.
I suspect that this view is common. Over-hyped phony stories generate interest and funding. If you want to be a good Leftist, you should not call out Leftist lies. Instead you should devote that energy to attacking right-wingers!

Update: Scott Aaronson admits that the wormhole story is a big hoax, promoted by physicists who should know better. He also discusses a new paper saying that quantum supremacy is impossible. He says it is no surprise to experts in the field who have known since 2016 that scaling up quantum computers will not work. He is still a believer:

So, though it’s been under sustained attack from multiple directions these past few years, I’d say that the flag of quantum supremacy yet waves. The Extended Church-Turing Thesis is still on thin ice.
That is, he says that he has not been proved wrong yet. Okay, but he hasn't been proved right either.