Wednesday, July 28, 2021

Brian Greene Still Clings to String Theory

Physicist Brian Greene used to be the public face of string theory, and on a recent video Q&A, he was asked:
Is it true that physicists are losing their hope in string theory?
The answer is obviously yes, but he refuses to admit it.

Greene's answer is amusingly weak. He is unable to point to any theoretical or experimental progress in string theory. He just babbles.

He starts by getting a drink of water, emphatically saying the answer is no, and claiming that the string theory critics need psychotherapy!

He goes on to defend the multiverse, even though that was not really the question. He says string theory does not require a multiverse, but he argues for the multiverse as if it were essential to his belief in string theory.

He implies that only a narrow-minded person would believe in just one universe. He admits that it is desirable to have physical evidence for physical theories, but says that evidence for part of a theory (i.e., one universe) can be taken to enhance a belief in other parts (i.e., other universes).

He ends by saying that the state of the field is one of excitement.

Watch the above video from 42:00 to 46:00 if you think that I have distorted him.

If I knew nothing about string theory, I would say that this is a con man promoting goofy ideas that no one could believe. If I watched the rest of the video, I would see that he understands a lot of physics, and is good at explaining textbook physics, and I would be very confused.

Dr. Bee's latest video says that string theorists have given up on their original goals, leaving an overhyped research program. The multiverse is so silly it is not even science. She challenges a lot of modern theoretical physics as too speculative to be worthwhile.

Sunday, July 25, 2021

R.I.P. Steven Weinberg

Many fine obituaries of Steven Weinberg are appearing.

His Nobel Prize was for the weak interaction, but he shared it with two others who came up with the same equations at the same time. His paper had essentially no citations, until others developed gauge theory more fully. So while the theory is important to the standard model, I am not so sure his 1967 paper was important.

Weinberg wrote this in the preface to his 1972 general relativity book:

There was another, more personal reason for my writing this book. In learning general relativity, and then in teaching it to classes at Berkeley and MIT, I became dissatisfied with what seemed to be the usual approach to the subject. I found that in most textbooks geometric ideas were given a starring role, so that a student who asked why the gravitational field is represented by a metric tensor, or why freely falling particles move on geodesics, or why the field equations are generally covariant would come away with an impression that this had something to do with the fact that spacetime is a Riemannian manifold.

Of course, this was Einstein's point of view, and his preeminent genius necessarily shapes our understanding of the theory he created. However, I believe that the geometrical approach has driven a wedge between general relativity and the theory of elementary particles. As long as it could be hoped, as Einstein did hope, that matter would eventually be understood in geometrical terms, it made sense to give Riemannian geometry a primary role in describing the theory of gravitation. But now the passage of time has taught us not to expect that the strong, weak, and electromagnetic interactions can be understood in geometrical terms, and that too great an emphasis on geometry can only obscure the deep connections between gravitation and the rest of physics.

This opinion is so bizarre that it is hard to believe it came from such a smart man.

All general relativity scholars give geometric ideas a starring role. But that was not Einstein's view. Einstein explicitly rejected the geometry.

This rejection was puzzling for both Einstein and Weinberg, as parts of general relativity have only geometric explanations, and their books do give those explanations. They did not succeed in purging the geometry.

It was also bizarre for him to deny that geometry was important for the other forces. This was five years after his Nobel prizewinning paper, and that work was credited with giving a geometrical unification of the electromagnetic and weak forces. If he did not do that, what did he do? Was he repudiating his own work? Did he not understand that gauge theories are geometric?

I am not trying to criticize him here. I am just pointing out some puzzling aspects of his beliefs.

He was also a hardcore atheist and scientific reductionist. He opposed religion, and considered Islam much worse than Christianity. He opposed paradigm shifters and other modern philosophy. He was a typical academic leftist, but had not bought into the current racial wokeness fanaticism.

Okay, that's all fine with me, but he also rejected positivism, and in his later years, endorsed many-worlds quantum theory. Again, these opinions are bizarre, coming from him. Lubos Motl also found them strange:

Years ago, I was only gradually noticing weird comments about quantum mechanics that Weinberg sometimes made.... At any rate, I consider Weinberg to be a 100% anti-quantum zealot ... It's sad.
He was of Jewish descent, but against Judaism the religion. He was extremely pro-Israel, and an extreme idolizer of Einstein. He studied the history of science enough to know what Einstein did and did not do, but excessively praised him anyway.

Friday, July 23, 2021

O'Dowd can explain Magnetism, but not Many Worlds

Matt O'Dowd has another nice PBS TV Physics video, on How Magnetism Shapes The Universe. I learned a few things. Then, at 15:00, he responds to some queries about a previous episode on Many-Worlds theory.

The questions have no good answers. He says that he will have to make some more videos to explain. Good luck with that. No one can tell how many worlds there are, or how often they split, or how one can be more probable than any other, or any practical questions about how it all works.

Monday, July 19, 2021

Google Shows 1% of a Logical Qubit

The Google Quantum AI team has published a hot new result in Nature:
Realizing the potential of quantum computing requires sufficiently low logical error rates. Many applications call for error rates as low as 10^−15, but state-of-the-art quantum platforms typically have physical error rates near 10^−3. Quantum error correction promises to bridge this divide by distributing quantum logical information across many physical qubits in such a way that errors can be detected and corrected.
As summarized in AAAS Science, they tried to spread a logical qubit over 11 physical qubits. They were not able to correct errors, but they estimate that they could achieve one logical qubit by using 1000 physical qubits.

The Google author promises something much better "in the relatively near future — date TBD."

Getting one qubit is still a long way from doing anything useful.
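The idea in the abstract quoted above, trading many noisy physical qubits for one better logical qubit, can be illustrated with the simplest possible error-correcting code: a classical repetition code under independent bit-flips. This is only a toy sketch of the principle (Google's experiment uses a surface code, not a repetition code), but it shows how a physical error rate near 10^−3 can in principle be suppressed toward the 10^−15 application target:

```python
from math import comb

def logical_error_rate(p, n):
    """Probability that a majority of n redundant copies flip,
    given an independent physical flip probability p.
    This is the failure rate of a distance-n repetition code."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

p = 1e-3  # physical error rate cited in the abstract
for n in (1, 3, 5, 11):
    print(n, logical_error_rate(p, n))
# The logical rate falls roughly as p**((n+1)//2): eleven copies
# already push this toy code below the 10^-15 target.
```

Real quantum error correction is much harder than this, since errors must be detected without measuring (and collapsing) the encoded state, which is why the overhead estimate is 1000 physical qubits rather than 11.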

Tuesday, July 13, 2021

Parallel Universes do not explain anything

Karl-Erik Eriksson compares The Many-Worlds Interpretation and postmodernism:
Sokal's definition of postmodernism is the following: an intellectual current characterized by the more-or-less explicit rejection of the rationalist tradition of the Enlightenment, by theoretical discourses disconnected from any empirical test, and by cognitive and cultural relativism that regards science as nothing more than a "narration", a "myth" or a social construction among many others.

Let us make a description of MWI:

MWI is a world view that regards our experiences of reality as nothing more than one narrative about our world among innumerably many other narratives in a world of worlds. A narrative exists only in one world and cannot be communicated to another world.

This is probably close to what an MWI proponent could also accept.

Then what do postmodernists see in MWI? They see MWI as a world view structured in a way that is similar to postmodernism and therefore useful to support it. Even if postmodernists do not value science, they can value the prestige of science, as shown by Sokal [18]. Therefore, MWI can be felt as a scientific support for postmodernism.

He also cites Brian Greene's arguments for many-worlds:
Stage one — the evolution of wavefunctions according to Schrödinger's equation — is mathematically rigorous, totally unambiguous, and fully accepted by the physics community. Stage two — the collapse of a wavefunction upon measurement — is, to the contrary, something that during the last eight decades has, at best, kept physics mildly bemused, and at worst, posed problems, puzzles and potential paradoxes that have devoured careers. The difficulty [...] is that according to Schrödinger's equation wave functions do not collapse. Wavefunction collapse is an add-on. It was introduced after Schrödinger discovered his equation, in an attempt to account for what experimenters actually see. [[9], p. 201.]
No, the evolution of the wave function is not stage one. The wave function is not directly observable, so we can only infer an estimated wave function by other methods.

This might seem like a minor point, except that Greene is going to argue that the Schroedinger equation is all we need. That cannot be.

John Preskill, in his recent podcast, also said that maybe unitary evolution of the wavefunction is all there is. But that cannot be all there is, because we still need an explanation for why we see discrete measurements.

Note the attempt to denigrate wavefunction collapse as an invention "to account for what experimenters actually see"! Yes, physical theories try to account for what experimenters see. They see collapse. Any theory not explaining collapse is not doing the job.

The many-worlds fans say that the collapse is seen as the splitting of the universes. So they have to have collapse as part of the theory, but they say the splitting is a mystery, and they cannot tell us much more than that.

[Each] of the potential outcomes embodied in the wavefunction still vies for realization. And so we are still wondering how one outcome "wins" and where the many other possibilities "go" when that actually happens. When a coin is tossed, [...] you can, in principle, predict whether it will land heads or tails. On closer inspection, then, precisely one outcome is determined by the details you initially overlooked. The same cannot be said in quantum physics. [...]
There is not really any conceptual difference here.

The coin toss cannot be predicted with certainty. Maybe some overlooked details would enable a better prediction, but that could be true of quantum mechanics too, for all we know.

Much in the spirit of Bohr, some physicists believe that searching for such an explanation of how a single, definite outcome arises is misguided. These physicists argue that quantum mechanics, with its updating to include decoherence, is a sharply formulated theory whose predictions account for the behavior of laboratory measuring devices. And according to this view, that is the goal of science. To seek an explanation of what's really going on, to strive for an understanding of how a particular outcome came to be, to hunt for a level of reality beyond detector readings and computer printouts betrays an unreasonable intellectual greediness.

Many others, including me, have a different perspective. Explaining data is what science is about. But many physicists believe that science is also about embracing the theories data confirms and going further by using them to get maximal insight into the nature of reality. [[9], pp. 212-213.]

I am all for explaining data also, but just saying that anything can happen in parallel universes explains nothing.

Thursday, July 8, 2021

Electrons do Spin

There is a new PBS TV video on Electrons DO NOT Spin.

This guy is usually pretty reliable, but he is way off base here. Of course electrons spin.

His main argument is that if you conceptualize an electron as a particle, then it is hard to see how the charge distribution and angular velocity could result in the observed magnetic moment.

Okay, electrons are not classical particles. If you conceptualize an electron as a classical particle, you will have trouble with position, momentum, and everything else.

Spin is the intrinsic angular momentum, quantized.

Update: I found this explanation:

WHAT IS SPIN?

Spin is an inherent property possessed by the electron. However, it does not rotate. In quantum mechanics, we speak of an electron as having an intrinsic angular momentum called spin. The reason we use this term is that electrons possess an angular momentum & a magnetic moment just like a rotating charged body.

DO ELECTRONS SPIN SIMILAR TO PLANETS?

IT IS MISLEADING TO IMAGINE AN ELECTRON AS A SMALL SPINNING OBJECT DUE TO THE FOLLOWING:

An electron’s spin is quantized. It has only two possible orientations, spin up and down, unlike a tossed ball.
To regard an electron as spinning, it must rotate with a speed greater than the speed of light to have the correct angular momentum [Griffiths, 2005, problem 4.25].
Similarly, the electron’s charge would have to rotate faster than the speed of light to generate the correct magnetic moment [Rohrlich, 2007, p. 127].
Unlike a tossed ball, the spin of an electron never changes. It has only two possible orientations: spin up and down.

These arguments are just wrong.

The 1st and 4th are not true. An electron spin can be in any direction, not just up and down. If you put it in a suitable magnetic field, then it will be just up or down, but the same is true about a classical spinning charged ball.

I don't have those textbooks, but they presumably do a computation assuming an electron is a charged particle with extremely small size. But an electron is not a classical particle. While it looks like a point particle in some experiments, it also looks like a wave of concentrated fields. That wave/field is spinning.
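The faster-than-light claim is easy to check numerically. The sketch below treats the electron as a uniform classical sphere of the classical electron radius; both the sphere model and the choice of radius are illustrative assumptions, which is exactly the point — the absurd answer only shows that the classical-particle picture is the wrong one:

```python
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e  = 9.1093837015e-31  # electron mass, kg
r_e  = 2.8179403262e-15  # classical electron radius, m (an assumption)
c    = 2.99792458e8      # speed of light, m/s

I = 0.4 * m_e * r_e**2   # moment of inertia of a uniform sphere
omega = (hbar / 2) / I   # angular velocity needed for spin hbar/2
v = omega * r_e          # equatorial surface speed
print(v / c)             # roughly 170 times the speed of light
```

Shrink the assumed radius and the required speed only gets worse, which is the computation the textbook citations above are presumably making.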

Monday, July 5, 2021

Why Quantum Systems are Hard to Simulate

I just listened to Mindscape 153 | John Preskill on Quantum Computers and What They’re Good For:
Depending on who you listen to, quantum computers are either the biggest technological change coming down the road or just another overhyped bubble. Today we’re talking with a good person to listen to: John Preskill, one of the leaders in modern quantum information science. We talk about what a quantum computer is and promising technologies for actually building them. John emphasizes that quantum computers are tailor-made for simulating the behavior of quantum systems like molecules and materials; whether they will lead to breakthroughs in cryptography or optimization problems is less clear. Then we relate the idea of quantum information back to gravity and the emergence of spacetime. (If you want to build and run your own quantum algorithm, try the IBM Quantum Experience.)
Preskill coined the term "quantum supremacy", and supposedly that is the biggest accomplishment in the field, but oddly the term is never mentioned.

Sean M. Carroll is an advocate of many-worlds theory, and Preskill said he is comfortable with that belief.

Preskill referred to Feynman kickstarting the field by a lecture that said that quantum systems are hard to simulate. This supposedly implied that quantum computers are inevitable.

As I see it, there are three main views underlying this thinking.

Many-worlds. If computers are constantly splitting into multiple computers doing different computations, then possibly they can be put to work doing parallel computation, and reporting back a result in one world.

This used to be the eccentric view of David Deutsch, but now Carroll, Preskill, Aaronson, and many others are on board.

Negative probability. In this view, quantum entanglement is a mysterious resource that can have negative or imaginary probabilities, so maybe it can do things better than classical computers that are limited by [0,1] probabilities. This is the preferred explanation of Scott Aaronson.
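A minimal numpy sketch of what this negative-probability talk refers to: quantum amplitudes can be negative and cancel, while classical probabilities confined to [0,1] cannot. Applying a Hadamard gate twice returns |0⟩ exactly to |0⟩ by destructive interference; the analogous classical 50/50 randomizer applied twice just stays at 50/50.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # quantum gate: one negative entry
R = np.array([[0.5, 0.5], [0.5, 0.5]])        # classical 50/50 mixing

state = np.array([1.0, 0.0])                  # start in |0>
print(np.abs(H @ H @ state)**2)               # [1. 0.] -- the paths cancel
print(R @ R @ state)                          # [0.5 0.5] -- no cancellation
```

Whether this cancellation is a computational "resource" or just a feature of the formalism is, of course, the point in dispute.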

Pinball. The game of pinball is also hard to simulate because a ball can hit a bumper, be forced to one side or the other in a way that is hard to predict, and then the subsequent action of that ball is wildly different, depending on the side of the bumper. It is a simple example of chaos.

Quantum systems can be like pinball. Imagine a series of 100 double-slits. You fire an electron thru all the slits. At the first slit, the electron goes thru one slit, or the other, or some superposition. The electron has the same choice at the next slit, and it is influenced by what happened at the first slit. Simulating the 100 slits requires keeping track of 2^100 possibilities.
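The bookkeeping cost is easy to demonstrate: each additional two-way choice doubles the number of amplitudes an exact simulation must track. A small sketch (the slit-as-qubit framing here is my own simplification, ignoring how the slits influence each other):

```python
import numpy as np

def state_size(n_slits):
    """Amplitudes needed to track n independent two-way choices."""
    return 2 ** n_slits

# Build the state for 10 equal-superposition slits explicitly.
state = np.array([1.0])
for _ in range(10):
    state = np.kron(state, np.array([1.0, 1.0]) / np.sqrt(2))
print(state.size)          # 1024 = 2**10 amplitudes
print(state_size(100))     # 2**100 -- beyond any classical memory
```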

So pinball is hard to simulate, but no one would try to build a super-computer out of pinball machines.

My theory here is that your faith in quantum computers depends on which of the above three views you hold.

If quantum experiments are glimpses into parallel worlds, then it is reasonable to expect parallel computation to do useful work. If quantum experiments unlock the power of negative probabilities, then it is also plausible there is a quantum magic to be used in a computer. But if quantum experiments are just pinball machines, then nothing special should be expected.

You might say that we know quantum experiments are not classical, as Bell proved that. My answer is that they are not literally parallel universes or negative probabilities either. Maybe the human mind cannot even grasp what is really going on, so we have to use simple metaphors.

The many-worlds and negative probability views do not even make any mathematical sense. Many-worlds is so nutty that Carroll, Aaronson, and Preskill discredit much of what they have to say by expressing these opinions.

Now Aaronson says:

To confine myself to some general comments: since Google’s announcement in Fall 2019, I’ve consistently said that sampling-based quantum supremacy is not yet a done deal. I’ve said that quantum supremacy seems important enough to want independent replications, and demonstrations in other hardware platforms like ion traps and photonics, and better gate fidelity, and better classical hardness, and better verification protocols. Most of all, I’ve said that we needed a genuine dialogue between the “quantum supremacists” and the classical skeptics: the former doing experiments and releasing all their data, the latter trying to design efficient classical simulations for those experiments, and so on in an iterative process. Just like in applied cryptography, we’d only have real confidence in a quantum supremacy claim once it had survived at least a few years of attacks by skeptics. So I’m delighted that this is precisely what’s now happening.
Wow. He consistently said that?

He was the referee who approved Google's claim of quantum supremacy. He has collected awards for papers on that "sampling-based quantum supremacy". Maybe his referee report should have recommended that Google scale back its claims.

Aaronson has also argued that the quantum computing skeptics have been proven wrong. I guess not. It will still take a few years of consensus building.

I am one of those quantum computing skeptics. I thought that maybe someday an experiment would be done that proves me wrong. But apparently that is not how it works.

The experiments are inconclusive, but the experts will eventually settle into the pro or con sides. Only when that happens will I be seen to be right or wrong.

No, I don't accept that. Expert opinion is shifting towards many-worlds, but it is still a crackpot unscientific theory of no value. When Shor's algorithm on a quantum computer factors a number that could not previously be factored, then I will accept that I was wrong. Currently, Dr. Boson Sampling Supremacy himself admits that I have not been proven wrong.

Update: Here is a SciAm explanation:

If you think of a computer solving a problem as a mouse running through a maze, a classical computer finds its way through by trying every path until it reaches the end.

What if, instead of solving the maze through trial and error, you could consider all possible routes simultaneously?

Even now, experts are still trying to get quantum computers to work well enough to best classical supercomputers.