Monday, March 18, 2024

Deutsch defends Many-Worlds Theory

New podcast: The Multiverse is REAL - David Deutsch.

He says the double-slit experiment conclusively proves the many-worlds theory. He accepts the parallel worlds for the same reason he accepts the existence of dinosaurs (millions of years ago). It is the only way to explain the evidence.

He admits that the photons are not really particles, and that waves of any kind show a similar diffraction pattern. But he says that for a photon to behave as a wave, it must exist in multiple copies.

He also admits that all this has been known for decades, and yet most physicists do not accept this argument for many-worlds.

I do not see any merit to this argument. The double-slit experiment only proves that light, and other beams like electron beams, have wave properties. That's all. It is not even evidence for quantum mechanics, as this wave explanation was accepted before quantum mechanics was invented.
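The wave explanation needs nothing quantum. As a check, here is a minimal sketch (my own illustration, with made-up parameter values) showing that classical superposition of two waves already produces the familiar fringe pattern:

```python
import numpy as np

# Classical two-slit interference: just add two waves and square.
wavelength = 500e-9   # 500 nm light (illustrative value)
d = 50e-6             # slit separation (illustrative value)

def intensity(theta):
    """Normalized classical two-slit intensity at viewing angle theta."""
    phase = 2 * np.pi * d * np.sin(theta) / wavelength
    return np.cos(phase / 2) ** 2

# Bright fringes where d*sin(theta) = m*wavelength, dark at half-integers.
bright = np.arcsin(1.0 * wavelength / d)   # m = 1
dark = np.arcsin(0.5 * wavelength / d)     # m = 1/2
print(intensity(bright))   # ~1.0
print(intensity(dark))     # ~0.0
```

Any wave theory of light gives this pattern; no parallel worlds are needed to derive it.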

When asked about alternative theories, he says that von Neumann had the crazy idea that if you observe an electron in one place, then the possibility of it being somewhere else ceases to exist.

I do not see the problem with that. I do not even think the issue has anything to do with quantum mechanics. Anytime you estimate the probability of an event, and then observe it, that means that the other possibilities did not happen. That is how probabilities work.
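As a toy illustration of that point (my own sketch, nothing quantum about it): assign probabilities to where a particle might be, observe it, and condition on the observation.

```python
import numpy as np

# A particle could be at one of four sites, with some prior distribution.
sites = ["A", "B", "C", "D"]
prior = np.array([0.1, 0.4, 0.3, 0.2])

# We observe it at site B. Conditioning on that observation:
observed = "B"
posterior = np.array([1.0 if s == observed else 0.0 for s in sites])

# The other possibilities did not go anywhere; they are simply ruled out.
print("prior:    ", dict(zip(sites, prior)))
print("posterior:", dict(zip(sites, posterior)))  # B gets 1.0, rest 0.0
```

Nobody says the prior weights on A, C, and D must live on as parallel worlds.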

He pushes quantum computing, but admits that he has not followed the latest technology.

I see the argument for many-worlds as nothing more than a rejection of probability theory. You could take any scientific theory that predicts probabilities, deny that the probabilities make any sense, and conclude that there are parallel worlds of unobserved possibilities.

That is all many-worlds theory is. I don't think that it even has anything to do with quantum mechanics. It is only expressed in terms of quantum mechanics because textbook QM emphasizes the probabilities. But other theories use probabilities the same way, and could have many-worlds interpretations.

Many-worlds theory is just the same as taking a science textbook, announcing some philosophical disagreement with probability theory, and redacting all the sections mentioning probability. It adds nothing. It just removes the theory's predictive power.

It is hard to see how any intelligent person takes many-worlds seriously. It offers nothing. Maybe they just do not understand probability theory, as I see no sign that they recognize they are just rejecting probability.

PBS TV has a news item:

How quantum computing could help us understand more about the universe

Scientists, researchers and some big companies are eager to jumpstart the next generation of computing, one that will be far more sophisticated and dependent on understanding the subatomic nature of the universe. But as science correspondent Miles O’Brien reports, it’s a huge challenge to take this new quantum leap forward.

A lot of hype. They admit that a fault-tolerant quantum computer might be a decade away. No one admits that it might be impossible.

Monday, March 11, 2024

Physics v. Magic

From xkcd comics.

His point here is that Physics is inherently causal. Things happen because there is some causal sequence of interactions from event A to event B, when A causes B.

For example, the Sun's gravitational pull on the Earth was once thought to be action-at-a-distance, but is now thought of as the Sun perturbing spacetime and a wave traveling to Earth. Or gravitons traveling to Earth.

However, Physics often reasons from results, without following a causal chain. Examples are thermodynamics, conservation laws, and Lagrangians.

There are also examples in quantum mechanics. A particle might tunnel through a wall, without any understanding of how it gets through the wall.
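For a concrete case, here is a sketch using the standard rectangular-barrier transmission formula, with illustrative numbers for an electron. The theory hands you a probability for finding the particle on the far side, and says nothing about any causal path through the wall.

```python
import numpy as np

# Tunneling through a rectangular barrier (standard textbook formula).
hbar = 1.054571817e-34   # J*s
m_e = 9.1093837015e-31   # electron mass, kg
eV = 1.602176634e-19     # joules per eV

E = 1.0 * eV    # electron energy (illustrative)
V0 = 2.0 * eV   # barrier height (illustrative)
L = 1.0e-9      # barrier width, 1 nm (illustrative)

# Decay constant inside the classically forbidden region.
kappa = np.sqrt(2 * m_e * (V0 - E)) / hbar

# Transmission probability for E < V0.
T = 1.0 / (1.0 + (V0**2 * np.sinh(kappa * L)**2) / (4 * E * (V0 - E)))
print(f"transmission probability ~ {T:.1e}")   # ~1e-4
```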

In spite of these examples, I still believe the universe is inherently local in cause and effect. Sometimes our reasoning can skip some steps, but only as a mathematical convenience. There is no action-at-a-distance, and no magic.

Monday, March 4, 2024

Poincare was Five Years ahead of Einstein

It is funny to see historians try to credit Einstein. Here is a 2013 essay I had not seen before:
On some points, such as the principle of relativity or the physical interpretation of the Lorentz transformations, Poincaré’s contributions preceded by at least 5 years those of Einstein’s published in 1905. On the other hand, many of their contributions were practically simultaneous. In 1905 Poincaré published an abridged version of his “Sur la dynamique de l’électron” [3] (which preceded the work of Einstein); the expanded version of the article appeared in 1906 [4].

What are the conceptual differences? According to Darrigol,

Einstein completely eliminated the ether, required that the expression of the laws of physics should be the same in any inertial frame, and introduced a “new kinematics” in which the space and time measured in different inertial systems were all on exactly the same footing. In contrast, Poincaré maintained the ether as a privileged frame of reference in which “true” space and time were defined, while he regarded the space and time measured in other frames as only “apparent.” …Einstein derived the expression of the Lorentz transformation from his two postulates (the relativity principle and the constancy of the velocity of light in a given inertial system), whereas Poincaré obtained these transformations as those that leave the Maxwell–Lorentz equations invariant [1].
These are conceptual differences that have no actual experimental consequences as far as electromagnetism and optics are concerned. As Lorentz commented, the difference is purely epistemological: it concerns the number of conventional and arbitrary elements that one wishes to introduce in the definitions of the basic physical concepts.

Are we then dealing with a case of simultaneous discovery?

No, it was not simultaneous. Poincare was five years ahead of Einstein. Poincare was years ahead of Einstein with the relativity principle, rejection of the aether, local time, synchronizing clocks, interpreting Michelson-Morley, Lorentz group, mass-energy equivalence, four-dimensional spacetime, and relativistic theories of gravity. Einstein's only claim to originality is to certain epistemological differences of no physical significance.
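To put a formula behind the "Lorentz group" and "four-dimensional spacetime" items, here is a small numerical check (my own sketch) that a Lorentz boost preserves the Minkowski interval, the invariant behind the Poincare-Minkowski geometry:

```python
import numpy as np

def boost(v):
    """Lorentz boost along x with velocity v, in units where c = 1."""
    gamma = 1.0 / np.sqrt(1.0 - v * v)
    return np.array([[gamma, -gamma * v],
                     [-gamma * v, gamma]])

event = np.array([3.0, 1.0])     # an event (t, x), arbitrary values
moved = boost(0.6) @ event       # the same event seen from a frame at 0.6c

interval = lambda e: e[0]**2 - e[1]**2   # Minkowski interval t^2 - x^2
print(interval(event), interval(moved))  # both ~8.0, frame-independent
```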

Credit Einstein with those obscure conceptual differences if you want, but there is no mathematical or physical value to any of them. Mostly they consist of mathematical misunderstandings by Einstein and other physicists. For example, there is nothing erroneous about choosing a privileged frame on a symmetric space. It does not break the symmetry. Those who criticize Poincare for occasionally choosing a privileged frame are just mathematically ignorant.

The essay concludes:

As Darrigol suggests, it seems wiser to concede that Lorentz, Poincaré and Einstein all contributed to the emergence of the theory of relativity, that Poincaré and Einstein offered two different versions of the theory, and that Einstein gave form to what today is considered the best one.
No, Einstein's is not considered best today. Nearly everyone prefers the spacetime formulation that Poincare advanced in 1905 and Minkowski perfected in 1907. Einstein was still rejecting it in 1911, did not speak positively about it until later, and never really accepted its geometrical significance.

A lot of historians concede that Lorentz and Poincare had all the mathematics of special relativity, and all the physical consequences, but insist that Einstein had a more modern viewpoint or superior understanding, leading to how we understand the theory today. But that is false. The Poincare-Minkowski geometrical interpretation has been preferred by nearly everyone but Einstein, since 1908.

Thursday, February 22, 2024

Chinese Deflate Quantum Hype Again

Sabine Hossenfelder is now doing short daily physics news videos, and her latest is on Bad News for Quantum Computing: Another Advantage Gone.

In short, quantum computing researchers have been claiming quantum supremacy for years. Some call it quantum advantage. However, there has never been any convincing demonstration that quantum computers have any speedup at all over conventional computers.

The latest is that IBM claimed last year to have done a quantum calculation on a "noisy" quantum computer. Some thought that they had outdone Google. But a Chinese group outdid them by doing the calculation faster and better on a classical computer.

The quantum enthusiasts will argue, as usual, that this does not disprove quantum computing, and maybe a more clever experiment would show an advantage. I am waiting.

Monday, February 12, 2024

The physicists' philosophy of physics

Princeton astrophysicist PJE Peebles writes:
The starting idea of the natural sciences is that the world operates by rules that can be discovered by observations on scales large or small, depending on what interests you. In fundamental physics, the subject of this essay, the idea is narrowed to four starting assumptions.

A: The world operates by rules and the logic of their application that can be discovered, in successive approximations.

B: A useful approximation to the rules and logic, a theory, yields reliably computed quantitative predictions that agree with reliable and repeatable measurements, within the uncertainties of the predictions and measurements.

C: Fundamental physical science is growing more complete by advances in the quantity, variety, and precision of empirical fits to predictions, and by occasional unifications that demote well-tested fundamental physical theories to useful approximations to still better theories.

D: Research in fundamental physical science is advancing toward a unique mind-independent reality.

These sound reasonable, but they leave no room for many-worlds theory, string theory, the simulation hypothesis, superdeterminism, or many of the ideas that are now fashionable.

The essay gives way too much attention to philosopher Thomas Kuhn.

It quotes Einstein:

The supreme task of the physicist is to arrive at those universal elementary laws from which the cosmos can be built up by pure deduction.
This sounds a little like Weinberg's mythical Final Theory, also discussed.

No, trying to build the cosmos from pure deduction is foolishness.

Thursday, February 8, 2024

Dissecting Einstein's Brain

The RadioLab podcast just rebroadcast this:
Albert Einstein asked that when he died, his body be cremated and his ashes be scattered in a secret location. He didn’t want his grave, or his body, becoming a shrine to his genius. When he passed away in the early morning hours of April 18, 1955, his family knew his wishes. There was only one problem: the pathologist who did the autopsy had different plans.

In the third episode of “G”, Radiolab’s miniseries on intelligence, we go on one of the strangest scavenger hunts for genius the world has ever seen. We follow Einstein’s stolen brain from that Princeton autopsy table, to a cider box in Wichita, Kansas, to labs all across the country. And eventually, beyond the brain itself entirely. All the while wondering, where exactly is the genius of a man who changed the way we view the world?

Later in the show, it discussed theories for the origin of Einstein's most brilliant idea -- special relativity. Besides his extra-smart brain, it mentioned his physicist wife and a philosopher. It even had Professor Galison explaining how train schedules caused people to rethink time.

Okay, but there was no mention of Lorentz and Poincare, or the fact that they had published the entire theory ahead of Einstein.

Galison is unusual because he does not recite crazy stories about Einstein's originality, like other Einstein scholars. He read Lorentz and Poincare and obviously understands that they did it all first, but he refuses to comment on the priority dispute.

Monday, February 5, 2024

What's the difference, said Heisenberg

From a math site:
In the 1960s Friedrichs met Heisenberg and used the occasion to express to him the deep gratitude of mathematicians for having created quantum mechanics, which gave birth to the beautiful theory of operators on Hilbert space. Heisenberg allowed that this was so; Friedrichs then added that the mathematicians have, in some measure, returned the favor. Heisenberg looked noncommittal, so Friedrichs pointed out that it was a mathematician, von Neumann, who clarified the difference between a self-adjoint operator and one that is merely symmetric. "What's the difference," said Heisenberg.

- story from Peter Lax, Functional Analysis (slightly edited for length)

That is the difference between a physicist and a mathematical physicist.

John von Neumann wrote a 1932 book on quantum mechanics, and turned it into a real theory.

To a physicist, an observable is a symmetric operator, because those are the ones that give real values, and only real values are observed. To von Neumann, an observable is a self-adjoint operator on a Hilbert space, where the extra domain conditions are exactly what is needed to prove the spectral theorem.
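For reference, here are the textbook definitions as a sketch, with D(A) denoting the domain of a densely defined operator A:

```latex
% Symmetric: A agrees with its adjoint wherever both are defined.
\text{symmetric:}\quad
  \langle A\psi, \phi\rangle = \langle \psi, A\phi\rangle
  \quad \forall\, \psi, \phi \in D(A),
  \quad\text{i.e. } A \subseteq A^{*}.

% Self-adjoint: the domains must coincide as well.
\text{self-adjoint:}\quad
  A = A^{*} \quad\text{and}\quad D(A) = D(A^{*}).
```

The spectral theorem holds for self-adjoint operators, not for merely symmetric ones, and a symmetric operator can fail to have any self-adjoint extension at all.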

I am not trying to say that Heisenberg was stupid. But it is striking that a world-famous physicist could get a Nobel Prize for using operators as observables, and still be oblivious to the formal mathematical definition found in textbooks. We cannot expect physicists to understand mathematical subtleties.

Thursday, February 1, 2024

The World is not Discrete

Some people like to say that Quantum Mechanics makes the world discrete. That is not true. But I always assumed that QM models could be approximated by lattice models.

Apparently this is not true. We know that the weak force is chiral, i.e., it violates mirror-reflection symmetry. Neutrinos are left-handed in the Standard Model.

From the Scott Aaronson blog:

“There is currently no fully satisfactory way of evading the Nielsen-Ninomiya theorem. This means that there is no way to put the Standard Model on a lattice. On a practical level, this is not a particularly pressing problem. It is the weak sector of the Standard Model which is chiral, and here perturbative methods work perfectly well. In contrast, the strong coupling sector of QCD is a vector-like theory and this is where most effort on the lattice has gone. However, on a philosophical level, the lack of lattice regularisation is rather disturbing. People will bang on endlessly about whether or not we live in “the matrix”, seemingly unaware that there are serious obstacles to writing down a discrete version of the known laws of physics, obstacles which, to date, no one has overcome.”
There is a whole industry of physicists doing lattice approximations to the SM, but the SM is chiral and the approximations are not, so there is no hope that the approximations converge to the SM.
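The obstruction shows up already in the simplest case. Here is a sketch of fermion doubling in one dimension (my own illustration of the standard argument): the naive lattice derivative turns the continuum dispersion E = k into sin(ka)/a, which has an extra zero at the edge of the Brillouin zone, a spurious partner mode of opposite chirality.

```python
import numpy as np

a = 1.0                                    # lattice spacing
k = np.linspace(-np.pi / a, np.pi / a, 9)  # momenta in the Brillouin zone

# Continuum fermion: E(k) = k, one zero at k = 0.
# Naive lattice fermion: E(k) = sin(k*a)/a, zeros at k = 0 AND k = +/- pi/a.
E_lattice = np.sin(k * a) / a
for ki, Ei in zip(k, E_lattice):
    print(f"k = {ki:+.3f}   E_lattice = {Ei:+.3f}")
# The extra zero is the unwanted "doubler" fermion; standard fixes
# (e.g. Wilson terms) remove it only by breaking the chiral symmetry.
```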

Aaronson is commenting on the silly idea that we live in a computer simulation. If we did, it would raise another silly idea that we could overwork the simulator by doing certain experiments.

Monday, January 29, 2024

Quantum Computer Revolution may be Further off

IEEE Spectrum reports:
The quantum computer revolution may be further off and more limited than many have been led to believe. That’s the message coming from a small but vocal set of prominent skeptics in and around the emerging quantum computing industry.

The problem isn’t just one of timescales. In May, Matthias Troyer, a technical fellow at Microsoft who leads the company’s quantum computing efforts, co-authored a paper in Communications of the ACM suggesting that the number of applications where quantum computers could provide a meaningful advantage was more limited than some might have you believe.

“We found out over the last 10 years that many things that people have proposed don’t work,” he says. “And then we found some very simple reasons for that.” ...

Even in the areas where quantum computers look most promising, the applications could be narrower than initially hoped. In recent years, papers from researchers at scientific software company Schrödinger and a multi-institutional team have suggested that only a limited number of problems in quantum chemistry are likely to benefit from quantum speedups. ...

“In the public, the quantum computer was portrayed as if it would enable something not currently achievable, which is inaccurate,” he says. “Primarily, it will accelerate existing processes rather than introducing a completely disruptive new application area. So we are evaluating a difference here.” ... “Most problems in quantum chemistry do not scale exponentially, and approximations are sufficient,” he says. “They are well behaved problems, you just need to make them faster with increased system size.”

Compare to the hype surrounding Artificial Intelligence (AI). It is also over-hyped by its enthusiasts, but it has also delivered a lot of very impressive demonstrable results. Quantum computing has delivered nothing, and may never deliver anything.

Someday we really will have personal robots and self-driving cars, but we may never have a useful quantum computer.

Google Research just released a video:

Quantum Computing - Hype vs. reality | Field Notes

As the race to build the world's first truly useful quantum computer intensifies, so too does the need for clear-eyed assessment. This Field Notes episode brings in the Google Quantum AI team to help answer a few fundamental questions to drive understanding of its impact now and in the future.

It says quantum computers could become useful by 2030, or maybe a few years later.

The group is called "Quantum AI", but the video said nothing about AI. Just combining buzzwords, I guess.

The most touted application was fusion simulations, in order to help bring fusion power plants to market. Others were drug discovery, and making the planet greener with chemistry for better batteries and fertilizer.

No mention of breaking everyone's cryptosystems. That is the only thing quantum computer enthusiasts are sure about.

I am amazed that Google keeps funding this pipe dream. It has canceled hundreds of really useful products. It has developed some really good AI, but is too chicken to market it the way OpenAI and Microsoft do. Elsewhere Google touts quantum machine learning, but I doubt this will ever be practical. The non-quantum methods are progressing rapidly, and there is no sign that quantum computers would be useful.

Self-driving cars are over-hyped, but I believe we are making progress and will get there. I do not think that we are getting any closer to a useful quantum computer.

Friday, January 26, 2024

The Evidence for CO2 Global Warming

Sabine Hossenfelder posts:
How do we know climate change is caused by humans?

In this video I summarize the main pieces of evidence that we have which show that climate change is caused by humans. Most important is that we know in which frequency range carbon dioxide absorbs light; we know that the carbon dioxide ratio in the atmosphere has been increasing; that the pH value of the oceans has been decreasing; that the ratio of carbon isotopes in the atmosphere has been changing; and that the stratosphere has been cooling, which was one of the key predictions of climate models from the 1960s.

She says this info is hard to find, but I found the same info as the first link from a search, a 2009 article:
How Do We Know that Humans Are the Major Cause of Global Warming?
YouTube also slaps an obnoxious "context" link on the video, with some info. You also get the same info from ChatGPT.

The evidence is that humans burning fossil fuels emit CO2, and the increases in atmospheric CO2 have caused warming. Probably most of the warming observed in recent decades.
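For scale, here is a back-of-the-envelope sketch using the standard logarithmic forcing fit (Myhre et al., 1998) and a rough, assumed sensitivity factor. These are illustrative numbers, not a climate model.

```python
import numpy as np

# Radiative forcing from CO2, standard logarithmic approximation.
C0 = 280.0   # pre-industrial CO2, ppm
C = 420.0    # recent CO2, ppm
forcing = 5.35 * np.log(C / C0)   # W/m^2 (Myhre et al. 1998 fit)

# Equilibrium warming, using a rough ~0.8 K per W/m^2 sensitivity
# (an assumed ballpark figure, not a settled constant).
warming = 0.8 * forcing
print(f"forcing ~ {forcing:.2f} W/m^2, warming ~ {warming:.1f} K")  # ~2.2, ~1.7
```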

My quibble is when they leap from this to saying that humans cause most of the climate change. The climate is changing a lot of different ways, in different places. I do not see anyone even trying to quantify climate change. Just CO2 and temperature.

Monday, January 22, 2024

Albert Explains Flaws in Many-Worlds

Newly-released video:
David Albert - What Does Quantum Theory Mean?

Quantum theory may be weird—superposition and entanglement of particles that in our normal world would make no sense—but quantum theory is truly how the microworld works. What does all this weirdness mean? How to go from microworld weirdness to macroworld normalcy? Will we ever make sense out of quantum mechanics?

Albert is a physicist-turned-philosopher, and he explains this pretty well.

He goes on to say that more and more physicists are adopting the many-worlds interpretation. He says it is counter-intuitive, but does not reject it for that reason. He rejects it because it does not explain the world.

In his opinion, it does not really solve the measurement problem, for two reasons.

(1) It tries to explain the definite outcomes as an illusion. Maybe this position could be justified some day.

(2) It cannot explain the probabilities we see, as many-worlds says all outcomes are realized.

He admits that physicists have done a lot of contortions to try to get around these issues, but they have failed.

"At the end of the day, it does not account for our experience."

I agree with him on these points. Perhaps mathematical physicists will develop a decoherence theory showing that the wave function branching resembles what we see. It hasn't happened yet, but it is possible.

But many-worlds will never explain the probabilities, because the whole point of many-worlds is to reject probabilities. The parallel worlds arise because probabilities are interpreted as world splittings, and all possibilities are realized in inaccessible alternate worlds.
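A toy example (my own sketch) makes the problem concrete. For a qubit in an unequal superposition, naive branch counting treats the two resulting worlds symmetrically, while experiments show Born-rule frequencies:

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit in superposition: amplitudes for outcomes 0 and 1.
amp0, amp1 = np.sqrt(0.9), np.sqrt(0.1)

# Naive many-worlds branch counting: one world per outcome, so each
# outcome is found in half the worlds.
branch_count_prediction = 0.5

# What we actually observe: frequencies follow the Born rule |amp|^2.
outcomes = rng.random(100_000) < amp1**2
print("branch counting predicts:", branch_count_prediction)
print("observed frequency of outcome 1:", outcomes.mean())  # ~0.1
```

Deriving that 0.1 from the branching structure alone is exactly what the contortions Albert mentions have failed to do.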

So why are more and more physicists adopting such a wrong theory? No answer given. Physicists are losing their grip on reality.

Thursday, January 18, 2024

Higgs Boson did not Revolutionize Physics

David Berlinski wrote in a 2012 essay:
The discovery was announced; the story reported; and then there was silence. Physicists endeavoured, of course, to maintain the impression that they had discovered something of inestimable value. They were game. Writing in The Daily Beast, Sean Carroll predicted that the Higgs Boson would “revolutionize physics,” and if this is what physicists always say, then at least they seem never weary of saying it.

Lawrence Krauss, writing in The Daily Beast as well, gave it his best. Many years ago, Leon Lederman had designated the Higgs Boson as the God particle. No one can today remember why. The God particle? “Nothing could be further from the truth,” Krauss remarked. In this, of course, he was entirely correct: Nothing could be further from the truth.

In the end, Krauss, like Carroll before him, could do no better than an appeal to the revolution. The discovery of the Higgs Boson “validates an unprecedented revolution in our understanding of fundamental physics …” Readers of The Daily Beast are always pleased to uphold the revolution, no matter how revolting. Yet, the Standard Model was completed in the early 1970s, ...

From what I have found, calling everything a revolution stems from calling Copernicus's heliocentric model the Copernican Revolution, because the Earth revolves around the Sun. It was a weak pun. And then that was considered so important that it became the Scientific Revolution.

Finding the Higgs Boson just confirmed what people thought 50 years earlier. Not a revolution.

Monday, January 15, 2024

What would it have looked like?

Lawrence Krauss likes to tell this anecdote about the nature of science:
“Tell me,” the great twentieth-century philosopher Ludwig Wittgenstein once asked a friend, “why do people always say it was natural for man to assume that the sun went around the Earth rather than that the Earth was rotating?” His friend replied, “Well, obviously because it just looks as though the Sun is going around the Earth.” Wittgenstein responded, “Well, what would it have looked like if it had looked as though the Earth was rotating?”
For example, he tells it in this interview, where he attributes it to a play, so it might be fiction. He tells it again here, plugging his latest book.

It is a good story. Just because your data fits your model, you cannot conclude that your model is right. There could be a completely different model that fits just as well.

I am not sure what point Krauss was making. He seems to be saying that the many-worlds theory would look just like the Copenhagen interpretation of quantum mechanics. This is not a great example, because in our world we observe more probable events more often. In many-worlds theory, there is no known reason for that. It is like saying the world is a simulation. It does not look like a simulation unless you also assume that the simulator replicates natural laws very accurately.

Wednesday, January 10, 2024

If a Proton is just Bits, it must be a lot

Seth Lloyd argues that matter is made of information:
Does information work at the deep levels of physics, including quantum theory, undergirding the fundamental forces and particles? But what is the essence of information—describing how the world works or being how the world works. There is a huge difference. Could information be the most basic building block of reality?

Seth Lloyd is a professor of mechanical engineering at the Massachusetts Institute of Technology. He refers to himself as a “quantum mechanic”.

Okay, but when challenged with the example of a proton, he says that it is fully described by 50-60 bits for its location in the universe, plus one bit for spin up or down.

What? The diameter of the observable universe is about 4 x 10^28 cm. That is a volume of roughly 3 x 10^85 cubic cm, so it takes about log2 of that, roughly 284 bits, to specify a location to the nearest cubic cm.

A cubic cm is a lot of space for a proton. We need at least 100 bits to specify a proton location to some small region. And the universe could be bigger than what is observable.
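Here is that arithmetic as a sketch, using the rough figures above (both order-of-magnitude guesses):

```python
import numpy as np

# Rough size of the observable universe (order of magnitude only).
diameter_cm = 4e28                          # ~4 x 10^28 cm across
volume_cm3 = (np.pi / 6) * diameter_cm**3   # ~3 x 10^85 cubic cm

# Bits needed to pick out one cubic-centimeter cell.
bits = np.log2(volume_cm3)
print(f"volume ~ {volume_cm3:.1e} cm^3, location takes ~ {bits:.0f} bits")
# ~284 bits, far more than the claimed 50-60, and a cubic centimeter
# is still an enormous region on the scale of a proton.
```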

But that is not my issue here. The proton also has a velocity, and many more bits are needed for that.

And spin is not just one bit. Spin could point in any direction, not just up or down.
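To illustrate, here is a back-of-the-envelope sketch (my own) of the bit count for a spin direction: specifying a direction on the sphere to within a small solid angle takes about log2(4*pi/dOmega) bits, which grows without bound as the precision improves.

```python
import numpy as np

# Bits to specify a direction on the unit sphere to a given angular
# resolution: count cells of solid angle dOmega within the full 4*pi.
for degrees in [10.0, 1.0, 0.1]:
    dOmega = np.radians(degrees) ** 2   # roughly a degrees-by-degrees cell
    bits = np.log2(4 * np.pi / dOmega)
    print(f"{degrees:5.1f} deg resolution: ~{bits:.0f} bits")
# ~9, ~15, ~22 bits: one bit only suffices if the measurement axis is
# fixed and known in advance.
```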

None of these proton parameters can be specified precisely, because of the Heisenberg uncertainty principle. A proton has a wave function, not a definite position and momentum at the same time. So how many bits are needed for a wave function?

But then the wave function is not even real, so I don't know if it makes sense to ask how many bits are needed for a wave function.

So if a proton is equivalent to some number of bits of information, I don't know how to calculate that number. But Lloyd is surely underestimating it.