Wednesday, July 28, 2021

Brian Greene Still Clings to String Theory

Physicist Brian Greene used to be the public face of string theory, and on a recent video Q&A, he was asked:
Is it true that physicists are losing their hope in string theory?
The answer is obviously yes, but he refuses to admit it.

Greene's answer is amusingly weak. He is unable to point to any theoretical or experimental progress in string theory. He just babbles.

He starts by getting a drink of water, emphatically saying the answer is no, and claiming that the string theory critics need psychotherapy!

He goes on to defend the multiverse, even though that was not really the question. He says string theory does not require a multiverse, but he argues for the multiverse, as if it were essential to his belief in string theory.

He implies that only a narrow-minded person would believe in just one universe. He admits that it is desirable to have physical evidence for physical theories, but says that evidence for part of a theory (i.e., one universe) can be taken to enhance a belief in other parts (i.e., other universes).

He ends by saying that the state of the field is one of excitement.

Watch the above video from 42:00 to 46:00 if you think that I have distorted him.

If I knew nothing about string theory, I would say that this is a con man promoting goofy ideas that no one could believe. If I watched the rest of the video, I would see that he understands a lot of physics, and is good at explaining textbook physics, and I would be very confused.

Dr. Bee's latest video says that string theorists have given up on their original goals, leaving an overhyped research program. The multiverse is so silly it is not even science. She challenges a lot of modern theoretical physics as too speculative to be worthwhile.

Sunday, July 25, 2021

R.I.P. Steven Weinberg

Many fine obituaries of Steven Weinberg are appearing.

His Nobel Prize was for the weak interaction, but he shared it with two others who came up with the same equations at the same time. His paper had essentially no citations until others developed gauge theory more fully. So while the theory is important to the standard model, I am not so sure his 1967 paper was important.

Weinberg wrote this in the preface to his 1972 general relativity book:

There was another, more personal reason for my writing this book. In learning general relativity, and then in teaching it to classes at Berkeley and MIT, I became dissatisfied with what seemed to be the usual approach to the subject. I found that in most textbooks geometric ideas were given a starring role, so that a student who asked why the gravitational field is represented by a metric tensor, or why freely falling particles move on geodesics, or why the field equations are generally covariant would come away with an impression that this had something to do with the fact that spacetime is a Riemannian manifold.

Of course, this was Einstein's point of view, and his preeminent genius necessarily shapes our understanding of the theory he created. However, I believe that the geometrical approach has driven a wedge between general relativity and the theory of elementary particles. As long as it could be hoped, as Einstein did hope, that matter would eventually be understood in geometrical terms, it made sense to give Riemannian geometry a primary role in describing the theory of gravitation. But now the passage of time has taught us not to expect that the strong, weak, and electromagnetic interactions can be understood in geometrical terms, and that too great an emphasis on geometry can only obscure the deep connections between gravitation and the rest of physics.

This opinion is so bizarre that it is hard to believe it came from such a smart man.

All general relativity scholars give geometric ideas a starring role. But that was not Einstein's view. Einstein explicitly rejected the geometry.

This rejection was puzzling for both Einstein and Weinberg, as parts of general relativity have only geometric explanations, and their books do give those explanations. They did not succeed in purging the geometry.

It was also bizarre for him to deny that geometry was important for the other forces. This was five years after his Nobel prizewinning paper, and the work was credited with giving a geometrical unification of electromagnetism and weak forces. If he did not do that, what did he do? Was he repudiating his work? Did he not understand that gauge theories were geometric?

I am not trying to criticize him here. I am just pointing out some puzzling aspects of his beliefs.

He was also a hardcore atheist and scientific reductionist. He opposed religion, and considered Islam much worse than Christianity. He opposed paradigm shifters and other modern philosophy. He was a typical academic leftist, but had not bought into the current racial wokeness fanaticism.

Okay, that's all fine with me, but he also rejected positivism, and in his later years, endorsed many-worlds quantum theory. Again, these opinions are bizarre, coming from him. Lubos Motl also found them strange:

Years ago, I was only gradually noticing weird comments about quantum mechanics that Weinberg sometimes made.... At any rate, I consider Weinberg to be a 100% anti-quantum zealot ... It's sad.
He was of Jewish descent, but against Judaism the religion. He was extremely pro-Israel, and an extreme idolizer of Einstein. He studied the history of science enough to know what Einstein did and did not do, but excessively praised him anyway.

Friday, July 23, 2021

O'Dowd can explain Magnetism, but not Many Worlds

Matt O'Dowd has another nice PBS TV Physics video, on How Magnetism Shapes The Universe. I learned a few things. Then, at 15:00, he responds to some queries about a previous episode on Many-Worlds theory.

The questions have no good answers. He says that he will have to make some more videos to explain. Good luck with that. No one can tell how many worlds there are, or how often they split, or how one can be more probable than any other, or answer any practical questions about how it all works.

Monday, July 19, 2021

Google Shows 1% of a Logical Qubit

The Google Quantum AI team has published a hot new result in Nature:
Realizing the potential of quantum computing requires sufficiently low logical error rates. Many applications call for error rates as low as 10^-15, but state-of-the-art quantum platforms typically have physical error rates near 10^-3. Quantum error correction promises to bridge this divide by distributing quantum logical information across many physical qubits in such a way that errors can be detected and corrected.
As summarized in AAAS Science, they tried to spread a logical qubit over 11 physical qubits. They were not able to correct errors, but they estimate that they could achieve one logical qubit by using 1000 physical qubits.

The Google author promises something much better "in the relatively near future — date TBD."

Getting one qubit is still a long way from doing anything useful.
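The idea in that abstract has a simple classical caricature (my own illustration, not Google's actual scheme): spread one logical bit over many physical bits and decode by majority vote. The function and the error rates below are made up for illustration.

```python
import random

# Classical caricature of error correction: encode one logical bit
# as n identical physical bits, flip each independently with
# probability p, and decode by majority vote.
def logical_error_rate(n_physical, p, trials=100_000, seed=0):
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n_physical))
        if flips > n_physical // 2:  # majority flipped: decoding fails
            errors += 1
    return errors / trials

# With a 10% physical error rate, eleven physical bits already give
# a logical error rate orders of magnitude below the physical rate.
print(logical_error_rate(11, 0.1))
```

Real quantum error correction is far harder, since qubits cannot simply be copied and measured, but the scaling intuition is the same: more physical qubits per logical qubit drives the logical error rate down.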

Tuesday, July 13, 2021

Parallel Universes do not explain anything

Karl-Erik Eriksson compares The Many-Worlds Interpretation and postmodernism:
Sokal's definition of postmodernism is the following: an intellectual current characterized by the more-or-less explicit rejection of the rationalist tradition of the Enlightenment, by theoretical discourses disconnected from any empirical test, and by cognitive and cultural relativism that regards science as nothing more than a "narration", a "myth" or a social construction among many others.

Let us make a description of MWI:

MWI is a world view that regards our experiences of reality as nothing more than one narrative about our world among innumerably many other narratives in a world of worlds. A narrative exists only in one world and cannot be communicated to another world.

This is probably close to what an MWI proponent could accept as well.

Then what do postmodernists see in MWI? They see MWI as a world view structured in a way that is similar to postmodernism and therefore useful to support it. Even if postmodernists do not value science, they can value the prestige of science, as shown by Sokal [18]. Therefore, MWI can be felt as a scientific support for postmodernism.

He also cites Brian Greene's arguments for many-worlds:
Stage one — the evolution of wavefunctions according to Schrödinger's equation — is mathematically rigorous, totally unambiguous, and fully accepted by the physics community. Stage two — the collapse of a wavefunction upon measurement — is, to the contrary, something that during the last eight decades has, at best, kept physics mildly bemused, and at worst, posed problems, puzzles and potential paradoxes that have devoured careers. The difficulty [...] is that according to Schrödinger's equation wave functions do not collapse. Wavefunction collapse is an add-on. It was introduced after Schrödinger discovered his equation, in an attempt to account for what experimenters actually see. [[9], p. 201.]
No, the evolution of the wave function is not stage one. The wave function is not directly observable, so we can only infer an estimated wave function by other methods.

This might seem like a minor point, except that Greene is going to argue that the Schroedinger equation is all we need. That cannot be.

John Preskill, in his recent podcast, also said that maybe unitary evolution of the wavefunction is all there is. But that cannot be all there is, because we still need an explanation for why we see discrete measurements.

Note the attempt to denigrate wavefunction collapse as an invention "to account for what experimenters actually see"! Yes, physical theories try to account for what experimenters see. They see collapse. Any theory not explaining collapse is not doing the job.

The many-worlds fans say that the collapse is seen as the splitting of the universes. So they have to have collapse as part of the theory, but they say the splitting is a mystery and cannot tell us much more than that.

[Each] of the potential outcomes embodied in the wavefunction still vies for realization. And so we are still wondering how one outcome "wins" and where the many other possibilities "go" when that actually happens. When a coin is tossed, [...] you can, in principle, predict whether it will land heads or tails. On closer inspection, then, precisely one outcome is determined by the details you initially overlooked. The same cannot be said in quantum physics. [...]
There is not really any conceptual difference here.

The coin toss cannot be predicted with certainty. Maybe some overlooked details would enable a better prediction, but that could be true of quantum mechanics too, for all we know.

Much in the spirit of Bohr, some physicists believe that searching for such an explanation of how a single, definite outcome arises is misguided. These physicists argue that quantum mechanics, with its updating to include decoherence, is a sharply formulated theory whose predictions account for the behavior of laboratory measuring devices. And according to this view, that is the goal of science. To seek an explanation of what's really going on, to strive for an understanding of how a particular outcome came to be, to hunt for a level of reality beyond detector readings and computer printouts betrays an unreasonable intellectual greediness.

Many others, including me, have a different perspective. Explaining data is what science is about. But many physicists believe that science is also about embracing the theories data confirms and going further by using them to get maximal insight into the nature of reality. [[9], pp. 212-213.]

I am all for explaining data also, but just saying that anything can happen in parallel universes explains nothing.

Thursday, July 8, 2021

Electrons do Spin

There is a new PBS TV video on Electrons DO NOT Spin.

This guy is usually pretty reliable, but he is way off base here. Of course electrons spin.

His main argument is that if you conceptualize an electron as a particle, then it is hard to see how the charge distribution and angular velocity could result in the observed magnetic moment.

Okay, electrons are not classical particles. If you conceptualize an electron as a classical particle, you will have trouble with position, momentum, and everything else.

Spin is the intrinsic angular momentum, quantized.
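To put numbers on that (a standard textbook computation, sketched here in Python): for spin s = 1/2, the magnitude of the spin angular momentum is sqrt(s(s+1))*hbar, while any measured component along an axis is plus or minus hbar/2.

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
s = 0.5                 # electron spin quantum number

# magnitude of the spin angular momentum vector
S_mag = math.sqrt(s * (s + 1)) * hbar   # = (sqrt(3)/2) * hbar

# possible measured projections along any chosen axis
S_z = [m * hbar for m in (+0.5, -0.5)]

# The projection is always smaller than the magnitude, so the spin
# vector is never fully aligned with the measurement axis.
assert all(abs(sz) < S_mag for sz in S_z)
```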

Update: I found this explanation:

WHAT IS SPIN?

Spin is an inherent property possessed by the electron. However, it does not rotate. In quantum mechanics, we speak of an electron as having an intrinsic angular momentum called spin. The reason we use this term is that electrons possess an angular momentum & a magnetic moment just like a rotating charged body.

DO ELECTRONS SPIN SIMILAR TO PLANETS?

IT IS MISLEADING TO IMAGINE AN ELECTRON AS A SMALL SPINNING OBJECT DUE TO THE FOLLOWING:

An electron’s spin is quantified. It has only two possible orientations, spin up and down, unlike a tossed ball.
To regard an electron as spinning, it must rotate with a speed greater than light to have the correct angular momentum[Griffiths, 2005, problem 4.25].
Similarly, the electron’s charge would have to rotate faster than the speed of light to generate the correct magnetic moment[Rohrlich, 2007, Pg 127].
Unlike a tossed ball, the spin of an electron never changes. It has only two possible orientations: spin up and down.

These arguments are just wrong.

The 1st and 4th are not true. An electron spin can be in any direction, not just up and down. If you put it in a suitable magnetic field, then it will be just up or down, but the same is true about a classical spinning charged ball.

I don't have those textbooks, but they presumably do a computation assuming an electron is a charged particle with extremely small size. But an electron is not a classical particle. While it looks like a point particle in some experiments, it also looks like a wave of concentrated fields. That wave/field is spinning.
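For what it's worth, the textbook estimate presumably goes something like this (my reconstruction, not the exact computation in Griffiths or Rohrlich): treat the electron as a rigidly rotating uniform sphere of classical electron radius and ask how fast its equator must move to carry angular momentum hbar/2.

```python
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e  = 9.1093837015e-31  # electron mass, kg
r_e  = 2.8179403262e-15  # classical electron radius, m
c    = 2.99792458e8      # speed of light, m/s

# Uniform sphere: L = (2/5) * m * r * v_equator, so with L = hbar/2:
v = 5 * (hbar / 2) / (2 * m_e * r_e)

# The required equatorial speed comes out to roughly 170 times the
# speed of light -- hence the textbook conclusion that a classical
# spinning ball cannot be the right picture.
print(v / c)
```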

Monday, July 5, 2021

Why Quantum Systems are Hard to Simulate

I just listened to Mindscape 153 | John Preskill on Quantum Computers and What They’re Good For:
Depending on who you listen to, quantum computers are either the biggest technological change coming down the road or just another overhyped bubble. Today we’re talking with a good person to listen to: John Preskill, one of the leaders in modern quantum information science. We talk about what a quantum computer is and promising technologies for actually building them. John emphasizes that quantum computers are tailor-made for simulating the behavior of quantum systems like molecules and materials; whether they will lead to breakthroughs in cryptography or optimization problems is less clear. Then we relate the idea of quantum information back to gravity and the emergence of spacetime. (If you want to build and run your own quantum algorithm, try the IBM Quantum Experience.)
Preskill coined the term "quantum supremacy", and supposedly that is the biggest accomplishment in the field, but oddly the term is never mentioned.

Sean M. Carroll is an advocate of many-worlds theory, and Preskill said he is comfortable with that belief.

Preskill referred to Feynman kickstarting the field by a lecture that said that quantum systems are hard to simulate. This supposedly implied that quantum computers are inevitable.

As I see it, there are three main views underlying this thinking.

Many-worlds. If computers are constantly splitting into multiple computers doing different computations, then possibly they can be put to work doing parallel computation, and reporting back a result in one world.

This used to be the eccentric view of David Deutsch, but now Carroll, Preskill, Aaronson, and many others are on board.

Negative probability. In this view, quantum entanglement is a mysterious resource that can have negative or imaginary probabilities, so maybe it can do things better than classical computers that are limited by [0,1] probabilities. This is the preferred explanation of Scott Aaronson.

Pinball. The game of pinball is also hard to simulate because a ball can hit a bumper, be forced to one side or the other in a way that is hard to predict, and then the subsequent action of that ball is wildly different, depending on the side of the bumper. It is a simple example of chaos.

Quantum systems can be like pinball. Imagine a series of 100 double-slits. You fire an electron thru all the slits. At the first slit, the electron goes thru one slit, or the other, or some superposition. The electron has the same choice at the next slit, and it is influenced by what happened at the first slit. Simulating the 100 slits requires keeping track of 2^100 possibilities.
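The exponential bookkeeping is easy to see in code (a toy sketch, not a real physics simulation): each two-way branching doubles the number of amplitudes a classical simulation must track.

```python
import numpy as np

def n_amplitudes(n_slits):
    """Number of amplitudes needed after n two-way branchings."""
    state = np.array([1.0 + 0j])     # one definite initial state
    for _ in range(n_slits):
        # each slit splits every existing branch in two (toy model)
        state = np.concatenate([state, state]) / np.sqrt(2)
    return state.size

print(n_amplitudes(10))  # 1024 amplitudes after only 10 slits
# After 100 slits the count is 2**100, about 1.3e30 -- no computer
# can store that many amplitudes explicitly.
```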

So pinball is hard to simulate, but no one would try to build a super-computer out of pinball machines.

My theory here is that your faith in quantum computers depends on which of the above three views you hold.

If quantum experiments are glimpses into parallel worlds, then it is reasonable to expect parallel computation to do useful work. If quantum experiments unlock the power of negative probabilities, then it is also plausible there is a quantum magic to be used in a computer. But if quantum experiments are just pinball machines, then nothing special should be expected.

You might say that we know quantum experiments are not classical, as Bell proved that. My answer is that they are not literally parallel universes or negative probabilities either. Maybe the human mind cannot even grasp what is really going on, so we have to use simple metaphors.

The many-worlds and negative probability views do not even make any mathematical sense. Many-worlds is so nutty that Carroll, Aaronson, and Preskill discredit much of what they have to say by expressing these opinions.

Now Aaronson says:

To confine myself to some general comments: since Google’s announcement in Fall 2019, I’ve consistently said that sampling-based quantum supremacy is not yet a done deal. I’ve said that quantum supremacy seems important enough to want independent replications, and demonstrations in other hardware platforms like ion traps and photonics, and better gate fidelity, and better classical hardness, and better verification protocols. Most of all, I’ve said that we needed a genuine dialogue between the “quantum supremacists” and the classical skeptics: the former doing experiments and releasing all their data, the latter trying to design efficient classical simulations for those experiments, and so on in an iterative process. Just like in applied cryptography, we’d only have real confidence in a quantum supremacy claim once it had survived at least a few years of attacks by skeptics. So I’m delighted that this is precisely what’s now happening.
Wow. He consistently said that?

He was the referee who approved Google's claim of quantum supremacy. He has collected awards for papers on that "sampling-based quantum supremacy". Maybe his referee report should have recommended that Google scale back its claims.

Aaronson has also argued that the quantum computing skeptics have been proven wrong. I guess not. It will still take a few years of consensus building.

I am one of those quantum computing skeptics. I thought that maybe someday an experiment would be done that proves me wrong. But apparently that is not how it works.

The experiments are inconclusive, but the experts will eventually settle into the pro or con sides. Only when that happens will I be seen to be right or wrong.

No, I don't accept that. Expert opinion is shifting towards many-worlds, but it is still a crackpot unscientific theory of no value. When Shor's algorithm on a quantum computer factors a number that could not previously be factored, then I will accept that I was wrong. Currently, Dr. Boson Sampling Supremacy himself admits that I have not been proven wrong.

Update: Here is a SciAm explanation:

If you think of a computer solving a problem as a mouse running through a maze, a classical computer finds its way through by trying every path until it reaches the end.

What if, instead of solving the maze through trial and error, you could consider all possible routes simultaneously?

Even now, experts are still trying to get quantum computers to work well enough to best classical supercomputers.

Wednesday, June 30, 2021

The Aether was not a Fringe Theory

An anonymous Wikipedia comment:
I'm perplexed that the other commenters identify ether theory as a fringe theory. I suppose, by their definition, Newtonian mechanics is also a "fringe theory" simply because it is outdated, and yet nobody is out to debunk Newtonian mechanics (in fact we teach it in schools). "Debunking" ether theory and obscuring its pedagogical utility is clearly an emotional pursuit for these people. It's hard to understand where this religious mentality about ether theory comes from, except in the context of a century of propaganda, as you pointed out. The attitude is clearly political and inappropriate (yet all too common) on Wikipedia.
I agree. If you think that aether theory was somehow wrong or unscientific, then read J.C. Maxwell's 1878 article in the Encyclopedia Britannica. He did not know that the aether had to be invariant under Lorentz transformations, but his essay holds up pretty well.

Here is what Lorentz's famous 1895 relativity paper said. After dismissing some aether theories:

It is not my intention to enter into such speculations more closely, or to express assumptions about the nature of the aether. I only wish to keep myself as free as possible from preconceived opinions about that substance, and I won't, for example, attribute to it the properties of ordinary liquids and gases. If it is the case, that a representation of the phenomena would succeed best under the condition of absolute permeability, then one should admit of such an assumption for the time being, and leave it to the subsequent research, to give us a deeper understanding. That we cannot speak about an absolute rest of the aether, is self-evident; this expression would not even make sense. When I say for the sake of brevity, that the aether would be at rest, then this only means that one part of this medium does not move against the other one and that all perceptible motions are relative motions of the celestial bodies in relation to the aether.
Einstein said essentially the same thing in 1905:
The introduction of a “luminiferous ether” will prove to be superfluous inasmuch as the view here to be developed will not require an “absolutely stationary space” provided with special properties, nor assign a velocity-vector to a point of the empty space in which electromagnetic processes take place.
Poincare wrote in a popular 1902 book that the aether is a convenient hypothesis that will someday be thrown aside as useless.

The lesson from relativity was not that the aether was wrong, or unscientific, or nonexistent, or exemplary of 19th century thinking.

The lesson was that theories based on motion against the aether were disproved.

The aether is sometimes defined as whatever transmits electromagnetism. If it is Lorentz invariant, then it poses no problem for relativity.

The existence of a preferred frame poses no problem either. The cosmic microwave radiation can be used to define what a rest frame is, so there is nothing wrong with that concept.

There was a time, maybe a century ago, when one could make fun of ancient scientists for lacking the imagination to understand that a vacuum might be empty space. But a vacuum is not really empty in today's theories either.

Thursday, June 24, 2021

Poincare understood what he wrote

I was referred to this 2014 Howard Stein paper for the bizarre claim that Poincare did not understand what he was doing:
Poincaré is a pre-eminent figure: as one of the greatest of mathematicians; as a contributor of prime importance to the development of physical theory at a time when physics was undergoing a profound transformation; and as a philosopher. However, I think that Poincaré, with all this virtue, made a serious philosophical mistake. In Poincaré’s own work, this error seems to me to have kept him from several fundamental discoveries in physics.
As Stein admits, claiming that Poincare did not understand what he was saying is to hypothesize something that has never happened in the entire history of physics.

It quotes Poincare making the following points.

1. Poincare presents a relativity theory in which electromagnetic and gravitational forces both propagate at the speed of light. This remarkable coincidence has two possible explanations. Lorentz would say that it is because gravity and everything else has an electromagnetic origin.

2. Poincare proposes a more radical explanation, which he analogizes to Copernicus differing from Ptolemy. His idea is that relativity is about how we do spacetime measurements, and therefore applies to all forces.

Stein quotes Poincare saying all this, and is baffled by it, because it is about eight years ahead of Einstein.

I commented on Stein's paper back in 2014.

Here is another silly argument:

No one referred to Poincare's 1905/6 papers in subsequent years and Einstein was unaware of them until the 1950s. Minkowski did not reference Poincare's work. Poincare wrote a semi-popular article in 1908, and that was all. Without the controversy started by E. T. Whitaker and discussed in this WP article, Poincare's 1905/6/8 papers might have languished in obscurity.
No, Poincare's papers were more influential than Einstein's, at the time. Here is Minkowski's big paper on relativity, and as you can see, it references Poincare's big 1905/6 paper twice. These two papers were the most important relativity papers after Lorentz's, and formed the basis of all subsequent work. They were known to Einstein at the time, and to everyone else with an interest in relativity.

Poincare's paper was in French, but Einstein was fluent in French, as he had attended a French-speaking university. Relativity quickly became widely accepted because of these papers, not Einstein's. Both papers would have been tough reading for physicists of the day, but both had shorter summary versions that were widely distributed and read. See translations here: Poincare and Minkowski.

There is still a question that begs for explanation. It may be unique in the history of science. In 1905, Poincare was maybe the most respected scientist or mathematician in all of Europe. Certainly one of the most famous and respected. In 1904 he wrote a paper declaring an entirely new mechanics based on the speed of light. In 1905 he published a long paper on his relativity, comparing it to Copernican heliocentrism, and the work immediately became the basis of all XX century physics.

And yet there are scholars today who claim that no one ever noticed Poincare's papers, and that all the credit should belong to Einstein instead. None of them can tell us anything that Einstein did that was original, so they resort to arguments that Lorentz and Poincare did not know what they were doing!

This is like saying that Copernicus should not get any credit for heliocentrism because he did not understand that the Earth revolved around the Sun in his model.

Einstein went on to spend most of his post-1920 career arguing against quantum mechanics and pursuing silly unified field theories. Other physicists accomplished many great things, but there is a significant faction even today that says goofy things about quantum mechanics and proposes grand untestable theories.

Update: The Wikipedia discussion ended without adding the remarks about Poincare not knowing what he was doing, or that Minkowski failed to cite Poincare. The article remains heavily biased towards the views of Einstein scholars who favor crediting Einstein, but the facts presented are quite historically accurate, and you can decide for yourself.

Monday, June 21, 2021

New book overhypes Quantum Computers

News:
Wired published a long extract from Amit Katwala's book Quantum Computing: How It Works and How It Could Change the World — explaining how it's already being put to use to explore some of science's biggest secrets by simulating nature itself:

Some of the world's top scientists are engaged in a frantic race to find new battery technologies that can replace lithium-ion with something cleaner, cheaper and more plentiful. Quantum computers could be their secret weapon... Although we've known all the equations we need to simulate chemistry since the 1930s, we've never had the computing power available to do it...

Comments add:
QCs are _still_ much slower than much cheaper conventional computers if you use the best algorithms for each technology. And that will remain the case for a long time yet, and possibly forever. Hence there is absolutely _nothing_ that QCs are good for at this time, except separating fools from their money. Also note the excessive use of "could" in the article. In this context "could" = "maybe, maybe not, but certainly not anytime soon". ...

Good grief, the very title Quantum Computing: How It Works and How It Could Change the World tells you it's a work of speculative fiction. In the past it would have been Fusion Power: How It Works and How It Could Change the World or Faster-than-light Travel: How It Works and How It Could Change the World or Telekinesis: How It Works and How It Could Change the World or, if you're that way inclined, Yogic Flying: How It Works and How It Could Change the World. Show me one thing that quantum computing has actually achieved that isn't a specially-crafted artificial problem and that couldn't have been achieved with much less effort with a standard computer.

That's right. Quantum computers have not demonstrated the ability to do practical computations, and will not anytime soon, if ever.

Google has been similarly overpromising self-driving cars, but that technology has been demonstrated in controlled situations, even if it is not yet ready for the mass market.

Monday, June 14, 2021

New Book on Poincare and Relativity

There is a new book on Henri Poincare, and the author has posted a summary on Wikipedia:
Bruce Popp (2020) [1] argues that Poincaré ([Poi05] and [Poi06]) developed a correct relativistic theory of electrodynamics that achieved both substantial and incomplete progress to a theory of special relativity by a different route from Einstein. This route had its origins in work on radioactivity and electrons. His 1905 and 1906 papers are immediately based on his close reading of [Lor04] and the three divergences from Lorentz that Poincaré identified. For example, he understood Lorentz’s presentation of the transformations based on corresponding states was flawed. Poincaré provided the correct form for the transformations and the understanding that they were coordinate transformations. It is this corrected form that Poincaré named “Lorentz transformations” and that match the form and understanding given to them by Einstein [Ein05c]. Poincaré shows that thus corrected the transformations are a group corresponding to a rotation in four-dimensional space with three spatial and one time dimension and that the space-time interval is an invariant of this group. 
Popp emphasizes that while, as this summary suggests, Poincaré would have been justified in making a series of strong statements about his findings, very surprisingly he did not. In fact, Poincaré does not seem to have understood and synthesized what he showed in 1905. Worse, he contradicts himself in later writing adding to confusion about his work and positions, notably concerning the ether. Popp indicates that this is one reason why Poincaré’s alternate path to special relativity is not fully realized. Another is that Poincaré shows no appreciation of the implications for simultaneity and time; in brief there is nothing comparable to Einstein’s discussion of moving watch hands and trains arriving.

Reference: Popp, Bruce D. (2020). Henri Poincaré: Electrons to Special Relativity: Translation of Selected Papers and Discussion. Cham: Springer International Publishing. ISBN 978-3-030-48038-7.

This is all conventional wisdom. Here are his main arguments, with my responses.

Poincare did not brag about his work, as Einstein did. Poincare evidently thought that his papers spoke for themselves. He did not brag about his many other original works either; that was common for scientists of the time. Einstein was the exception, as he made great efforts to claim credit for the work of others.

If Poincare understood what he wrote, then he was years ahead of Einstein. The Einstein fans say that this proves Poincare did not understand what he wrote, but a mathematician of Poincare's caliber does not publish results he does not understand. Obviously Poincare understood what he wrote.

Poincare did not emphasize simultaneity in 1905. As you can read in the Wikipedia article on the subject, Poincare discovered relativistic time synchronization in 1898, and regarded it as a solved problem.

Poincare's contribution has been forgotten. There is some truth to this, but Poincare's work lives on in two papers by Minkowski, who died shortly afterwards, and in all subsequent work that treats relativity as a 4-dimensional theory.

It is amazing how scholars concoct these stories to credit Einstein over Poincare. Our current understanding of relativity is based much more on the work of Poincare than Einstein.

Poincare's works get mentioned in Jordan Ellenberg: Mathematics of High-Dimensional Shapes and Geometries | Lex Fridman Podcast #190. He is praised for his work on celestial mechanics, the stability of dynamical systems, and topology. His discovery of relativity is not even mentioned. Einstein's greatest accomplishment was just a poor plagiarization of one of Poincare's minor papers.

After some discussion, the Wikipedia article on Einstein recently removed:

[Einstein is] universally acknowledged to be one of the two greatest physicists of all time, the other being Isaac Newton.
Someone pointed out that polls by Physics World and the BBC showed physicists saying that Einstein was the greatest, with Newton in second place.

There continues to be crazy over-the-top idolization of Einstein. Normally a book about a great scholar will simply describe what he did, without gratuitous insults about him being inferior to some other great man.

That is just what the above book does. It recognizes what Poincare did, and then makes nonsensical disparaging remarks in order to say that Einstein was better. Maybe someday I will see an Einstein scholar write something like this:

Einstein's 1905 relativity paper was a nice exposition of Lorentz's theory. But it lacked references to earlier theoretical work by Lorentz, FitzGerald, Poincare, and others, and to crucial experimental work by Michelson-Morley and others. It failed to explain how his theory was any different from Lorentz's. Nobody saw any difference, and called it the Lorentz-Einstein theory. Einstein failed to grasp the spacetime geometry, the Lorentz group, the covariance of Maxwell's equations, or the implications for gravity. Einstein shows no appreciation of relativity as a 4-dimensional theory; in brief there is nothing comparable to Poincare's 1905 work, and nothing that led to further work.
Whittaker did say something similar in his 1954 book. Einstein was still alive, but could not refute it, even though his friend Max Born tried. So all serious scholars know that this Einstein credit for relativity is a hoax. Yet the Einstein worship has only accelerated since then.

Monday, June 7, 2021

The Current War on Science

Science and medicine are being politicized, and there are so many examples that it is tiresome to list them.

During the Trump administration, it was common to hear academics and the news media complain that he was anti-science. But they never had any examples of him acting against accepted research or refusing to fund mainstream science programs.

Anthony Fauci was interviewed on Science Friday. He has been embarrassed by emails, but those emails are not that much different from his public statements. He has said a long list of foolish and unscientific things.

Friday he talked about AIDS a lot. He tried to blame it on Pres. Ronald Reagan. He tried to say it was not a gay disease, as proved by Magic Johnson getting it. (Johnson was rumored to be participating in dangerous homosexual practices, even before the AIDS story.)

Fauci and other experts have told us for a year that the coronavirus could not have been a Wuhan lab leak, when that is still the most plausible explanation.

Here is an essay on What Happens When Doctors Can't Tell the Truth?

Here is a paper by a Black woman with a PhD from the Perimeter Institute, home to a lot of crackpot physics:

To provide an example of the role that white empiricism plays in physics, I discuss the current debate in string theory about postempiricism, motivated in part by a question: why are string theorists calling for an end to empiricism rather than an end to racial hegemony? I believe the answer is that knowledge production in physics is contingent on the ascribed identities of the physicists. ...

For these reasons, the area of quantum gravity, a physics subdiscipline considered by many to be the pinnacle of physics prestige, objectivity, universality, and culturelessness, is a natural starting point for a discussion about how social prestige asymmetries affect epistemic outcomes in physics. Ultimately, the discourse about the quantum gravity model of string theory provides an example of how white supremacist racial prestige asymmetry produces an antiempiricist epistemic practice among physicists, white empiricism. In string theory, we find an example wherein extremely speculative ideas that require abandoning the empiricist core of the scientific method and which are endorsed by white scientists are taken more seriously than the idea that Black women are competent observers of their own experiences.

Maybe the Perimeter Institute considers quantum gravity to be "the pinnacle of physics prestige, objectivity", but nothing good has ever come out of that subject.

Environmentalism has been hopelessly politicized for years. If they really cared about global warming, their top priorities would be building nuclear power plants and blocking Third World immigration into the First World.

Larry Krauss has a decent defense of objective science in Quillette. He is probably also a Trump-hating leftist, but I cite him to show that not all academics have bought into the current nonsense.

Authors are constantly chastised for their terminology. I see Scott Aaronson still uses "quantum supremacy", but probably only because he has tenure and his enemies have other grounds for attacking him.

Reason reports:

Last month, the Journal of Hospital Medicine published an article titled, "Tribalism: The Good, the Bad, and the Future." It proposed strategies for medical professionals to overcome some of the natural group clustering that occurs in any large workspace: launch interdepartmental projects, socialize outside of the office, etc.
The paper was retracted, and the authors had to put out this apology:
From this experience, we learned that the words "tribe" and "tribalism" have no consistent meaning, are associated with negative historical and cultural assumptions, and can promote misleading stereotypes.4 The term "tribe" became popular as a colonial construct to describe forms of social organization considered "uncivilized" or "primitive." In using the term "tribe" to describe members of medical communities, we ignored the complex and dynamic identities of Native American, African, and other Indigenous Peoples and the history of their oppression.
This is ridiculous, as tribe is a perfectly good word. The authors ended up substituting "silo" for "tribe", but that has a less suitable meaning.

Update: Just today, here is a SciAm article complaining that physicians often note racial info, as it is correlated with an assortment of medical problems:

Yet, a tool used daily by almost every physician, the history of present illness (HPI), may still perpetuate medical racism. ...

Physicians often determine racial and ethnic labels themselves rather than asking patients to self-identify. ...

Beyond the issue of physicians using inaccurate racial labels, research has proven what scholars like W.E.B. Du Bois and Derrick Bell stated for decades: race is a social construct. ...

By using this outdated practice, physicians may be reinforcing the incorrect idea that race differentiation holds scientific value instead of being a clumsy artifact of the profession. ...

But, if physicians are truly trying to discern if patients are carriers of genetic allelic variants ..., then genetic mapping should be used in high-risk patients. ...

To be clear, a “color-blind” approach is not ideal either.

It seems clear that these people will cry racism no matter what the physicians do.

Update: Another example of one-sided politicization:

Well, the latest scientific journal or magazine to go to hell in a handbasket is Scientific American, which under the editorial guidance of Laura Helmuth has published a putrid piece of pure pro-Palestinian propaganda. It’s an op-ed piece apparently written by a group of Palestinian BDS activists (one author wishes to be anonymous), purveying the usual distortions, omissions, and outright lies. If there were a counter piece refuting those lies (there is below, but not at Sci Am), it would be somewhat better, but not much. Instead, the op-ed is linked to a Google Document petition (surely not posted by Sci Am) that you can sign in solidarity with Palestine.

First of all, a science magazine has no business taking an ideological stand like this, particularly one replete with lies and distortions. What was Scientific American thinking? Do they fancy themselves to be Mother Jones?

And here is a recent Nature magazine editorial promoting leftist racial nonsense.

Update: From "Meet the Press Daily":

"So if you are trying to get at me as a public health official and a scientist, you're really attacking not only Dr. Anthony Fauci, you're attacking science. And anybody that looks at what is going on, clearly sees that, you have to be asleep not to see that. That is what going on," he added.

"Science and the truth are being attacked," Fauci concluded.

He is the highest-paid US government official, and he certainly needs to be accountable to criticism.

Thursday, June 3, 2021

Einstein book addendum

I wrote an Einstein book several years ago. One of the main arguments was that Einstein does not deserve credit for discovering relativity. The reasons are:

1. All of the important special relativity equations were published by others before Einstein wrote anything on the subject.

2. Einstein's 1905 theory was not seen at the time as being particularly novel or influential.

3. The main concept behind relativity is that spacetime has a non-euclidean geometry. This was published by others, and missed by Einstein.

Historians acknowledge (1), but credit Einstein for some non-mathematical subtlety such as accepting local time, saying the aether was superfluous, or giving a derivation that was not ad hoc. The trouble with these is that what Einstein actually said about local time and the aether was nearly identical to what Lorentz and Poincare said years earlier.

Item (2) is also acknowledged, but not so well known. There were papers written on competing theories, and they referred to the "Lorentz-Einstein theory", as if there were no distinction between the Lorentz and Einstein theories. Einstein tried, but was never able to give a good explanation of how his theory differed from Lorentz's. Lorentz said that Einstein merely postulated what he and others had deduced from previous theory and experiment. Poincare and Minkowski did explain how their versions of relativity differed from Lorentz's.

As for (3), it is well known that Minkowski published a non-euclidean geometry treatment of relativity, and that is what caught on with physicists and led to widespread acceptance. Einstein complained that Minkowski had turned the theory into something he could not recognize. Some assume that Minkowski built on Einstein's ideas, but Lorentz and Poincare were much greater influences, and it is not clear that Minkowski got anything from Einstein.

Even as late as 1910, when someone suggested that Einstein's non-Euclidean geometrical view could avoid a paradox of Lorentzian relativity, Einstein wrote a letter to the journal denying that he had any view different from Lorentz's. That would have been a great opportunity for Einstein to claim credit for a conceptual advance, but he denied having one.

In short, here is the paradox. If the Lorentz contraction is applied to a spinning bicycle wheel, the tire contracts while the spoke lengths remain the same. This seems to contradict the Euclidean fact that a circle's circumference is 2π times its radius. Adopting a non-euclidean geometry resolves the paradox.
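
The arithmetic of the wheel paradox is simple enough to spell out. In a sketch of my own (sample values; rim speed in units with c = 1), observers riding the rim lay contracted rulers around the wheel and so measure a circumference longer than 2πr, while the radius is unaffected:

```python
import math

r, v = 1.0, 0.6                       # radius and rim speed (c = 1), arbitrary values
gamma = 1.0 / math.sqrt(1.0 - v * v)  # Lorentz factor, about 1.25 here

lab_circumference = 2 * math.pi * r
# Rim rulers are Lorentz-contracted along the direction of motion,
# so rim observers fit more of them around the wheel and measure
# a longer circumference; the spokes, perpendicular to the motion,
# are not contracted.
rim_circumference = gamma * lab_circumference

# Circumference-to-radius ratio exceeds 2*pi: non-Euclidean geometry
assert rim_circumference / r > 2 * math.pi
```

The ratio exceeding 2π is exactly the failure of Euclidean geometry on the rotating disk.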

Something similar happened in the 1920s, when a general relativity book explained that non-euclidean geometry was the heart of the theory. Einstein published a favorable review of the book, but denied the geometric view.

See also: Einstein did not discover relativity, Einstein book update, and Second Einstein book update

The history of relativity gives the background for the distortions in Physics that came later in the book. Einstein found that he was widely idolized for his supposed genius ability to do non-empirical theorizing. By the late 1920s, he was repudiating his earlier more empirical approach. Dutch physicist Jeroen van Dongen has written a very good new paper on this XX century trend towards non-empirical Physics. He writes:

In the absence of the empirical, Einstein emphasized the merit of his personal epistemological conviction, along with its success as documented in his version of his biography: the epistemic benefit of doing unified field theory was bound up with the virtuous dispositions of his kind of theorist.
This is a polite way of saying that Einstein lied about his life story in order to promote himself and the virtues of his worthless unified field theory research.
For admiration of Einstein as empiricist icon, see e.g. Heisenberg (1989) ; Heisenberg here further recalls his surprise when Einstein explained to him in 1926 that he no longer held empiricist views. In 1927, Heisenberg signaled a difference of opinion regarding the role of `simplicity' and the empirical with Einstein (Heisenberg to Einstein, 10 June 1927, cited on p. 467 in Pais 1982); Einstein himself was well aware of his isolation and the negative judgment of his peers; see Pais (1982), p. 462. See Howard (1994) on the logical empiricists. ...

Dismissal could take a moral tone, for instance when Robert Oppenheimer deemed that Einstein had been "wasting his time." In fact, he had gone "completely cuckoo", Oppenheimer added in private, or, as he put it in public, Einstein had "lost contact with the profession of physics." Clearly, the Einstein of unified field theory was not a proper theorist.

That's right. Early respect for Einstein was based on empirical work. The Nobel Prize was for one of his more empirical papers. Then Einstein went non-empirical, and his work was cuckoo.

But a philosophical shift made non-empirical work more respectable than empirical work. The logical empiricists were driven out of academia. The Kuhnian paradigm-shifters cast non-empirical work as the true scientific revolutions that everyone admired.

Example of Einstein against empiricism:

In the same letter, Einstein expressed that he was no longer thinking about experiments on the wave and particle properties of light, and that one "will never arrive at a sensible theory in an inductive manner", even if "fundamental experiments" could still be of value - once again deprecating the quantum program's empirical slant.
The history is important because these Kuhnian revolutions never happen. The discoveries of relativity and quantum mechanics in the early XX century were driven by empirical findings.

The patron saints of non-empirical philosophy are Copernicus, Galileo, Einstein, and Kuhn.

The example of Copernicus is particularly apt for today’s discussion. Copernicus proposed his alternative to the fairly successful Ptolemean universe in 1543. Yet, this theoretical proposal was basically beyond any meaningful notion of empirical falsifiability. This situation persisted pretty much until Galileo pointed the newly invented telescope to the heavens and in 1610 observed the phases of Venus.
The phases of Venus were not decisive, and arguments for and against continued until Isaac Newton. Some of the arguments were not fully resolved for centuries.

Kuhn makes a big deal out of this because Copernicus described a "revolution" of the Earth around the Sun, and the theory eventually caught on even tho there was little empirical evidence for it at the time. So he portrayed scientists as a bunch of irrational fad-followers.

In the case of relativity, all of the important early papers referred directly to the Michelson-Morley experiment as the crucial experiment, as well as to other experiments. This was acknowledged by everyone at the time, including Einstein. The view only got revised later, in efforts to credit Einstein and devalue empiricism.

I have posted here many times that I think that the theories of relativity and quantum mechanics could have been anticipated by clever theorists. If you are looking for a locally causal field theory, the math leads directly to relativity and gauge theory. In a way, that is what Maxwell did with electromagnetism.

And once you accept that we needed a wave theory of matter, quantum mechanics is the obvious thing. Nobody knows any better way to even propose such a theory. So these theories could have been developed from pure theory.

Or so it seems in retrospect. It never happened that way.

String theorists would like to tell you that Einstein created relativity out of pure theory, and that this inspires string theorists to do the same today. Forget it. When Einstein shifted to purely theoretical analysis, his work was garbage.

Peter Woit mentions the above paper, and a comment notes that it ends by saying that non-empirical physics like string theory is a Kuhnian paradigm shift, and urging that we “keep funding it as generously as before.”

Tuesday, June 1, 2021

No, this is not Math's Fatal Flaw

A recent YouTube video explains:
This is Math's Fatal Flaw

Not everything that is true can be proven. This discovery transformed infinity, changed the course of a world war and led to the modern computer.

So this discovery was one of the greatest accomplishments of the XX century, and yet it is a "fatal flaw"?

The video is actually pretty good, but I object to all the explanations that say that Mathematics is somehow deficient because the consistency of the axioms cannot be proved from the axioms.

Nobody would ever want the consistency to be provable from the axioms anyway. Such a proof would mean nothing. Inconsistent systems allow such proofs, and nobody wants that.

It would be nice to have an algorithm to determine whether a given math statement is true or false. The above discovery shows that it is not possible. But again, this is not a fatal flaw. It is what makes Math interesting.
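
One way to see why no such algorithm can exist: any candidate decider that simply runs a program for some fixed number of steps is fooled by a program that halts just after the budget runs out. This toy sketch is my own illustration (the names `naive_halts` and `slow` are hypothetical, not from the video):

```python
def naive_halts(prog, arg, budget=1000):
    # Try to decide halting by running the program for at most
    # `budget` steps; guess "never halts" if the budget is exhausted.
    steps = 0
    for _ in prog(arg):
        steps += 1
        if steps >= budget:
            return False
    return True

def slow(n):
    # A generator that does halt, but only after n steps.
    for i in range(n):
        yield i

assert naive_halts(slow, 10) is True      # correct
assert naive_halts(slow, 5000) is False   # wrong: slow(5000) does halt
```

Raising the budget just moves the failure point; Turing's diagonal argument shows that no budget, and indeed no algorithm of any kind, decides halting for all programs.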

Saturday, May 29, 2021

Burying the Wuhan lab leak hypothesis

Here is another example of scientists politicizing an issue of large public interest.

A Nature/SciAm article reports:

Calls to investigate Chinese laboratories have reached a fever pitch in the United States, as Republican leaders allege that the coronavirus causing the pandemic was leaked from one, and as some scientists argue that this ‘lab leak’ hypothesis requires a thorough, independent inquiry. But for many researchers, the tone of the growing demands is unsettling. ...

Others worry that the rhetoric around an alleged lab leak has grown so toxic that it’s fuelling online bullying of scientists ...

The debate over the lab-leak hypothesis has been rumbling since last year. But it has grown louder in the past month ...

Even if the letter in Science was well intentioned, its authors should have thought more about how it would feed into the divisive political environment surrounding this issue, says Angela Rasmussen, a virologist at the University of Saskatchewan in Saskatoon, Canada. ...

Rasmussen says, “This debate has moved so far from the evidence that I don’t know if we can dial it back.” ...

The United States has since requested that the WHO conduct a "transparent, science-based" phase 2 origins study, and US President Joe Biden announced that he has asked the US intelligence community, in addition to its national labs, to "press China to participate" in an investigation.

Apparently there is a lot of evidence for the Wuhan lab leak hypothesis, but we may never know for sure.

Because Republicans have demanded an investigation, a lot of Democrat scientists have resisted, arguing that no credence should be given to Pres. Trump's lies.

Here is another example. For decades, computer scientists and others have been warning that our elections are insecure and must be fixed. A prominent such article was just published:

Elections must be constructed and conducted such that everyone (all of the winning and losing candidates, as well as those who have supported them) can, with extremely high confidence, rationally believe the results reflect the will of the voters. ... As computer scientists, we must bear responsibility for warning about election vulnerabilities and proposing solutions,
Of course election integrity has been politicized, and they have to disavow being Trump supporters. As a blogger notes:
The letter is well written, but ... They say in the letter:
However, notwithstanding these serious concerns, we have never claimed that technical vulnerabilities have actually been exploited to alter the outcome of any US election.
This seems to be a problem. How do you get people to look both ways when crossing the street, when you have also basically asserted: “No one has ever been hit by a car when crossing the street.”
Just as we don't know the source of the Wuhan virus, we don't know who would have won a properly conducted 2020 election. And we may never know.

Tucker Carlson Tonight on Fox News, the highest-rated cable TV news show, now regularly devotes segments to the politicization of science. Last night he discussed papers being retracted for political reasons, Biden administration appointments of incompetent political hacks to top-level scientific posts, physicians being forced to give harmful medical treatments, physicians fired over accusations of being transphobic, CDC employees not following official advice, and false and alarmist climate science.