Monday, June 14, 2021

New Book on Poincare and Relativity

There is a new book on Henri Poincare, and the author has posted a summary on Wikipedia:
Bruce Popp (2020) [1] argues that Poincaré ([Poi05] and [Poi06]) developed a correct relativistic theory of electrodynamics that achieved both substantial and incomplete progress to a theory of special relativity by a different route from Einstein. This route had its origins in work on radioactivity and electrons. His 1905 and 1906 papers are immediately based on his close reading of [Lor04] and the three divergences from Lorentz that Poincaré identified. For example, he understood Lorentz’s presentation of the transformations based on corresponding states was flawed. Poincaré provided the correct form for the transformations and the understanding that they were coordinate transformations. It is this corrected form that Poincaré named “Lorentz transformations” and that match the form and understanding given to them by Einstein [Ein05c]. Poincaré shows that thus corrected the transformations are a group corresponding to a rotation in four-dimensional space with three spatial and one time dimension and that the space-time interval is an invariant of this group. 
Popp emphasizes that while, as this summary suggests, Poincaré would have been justified in making a series of strong statements about his findings, very surprisingly he did not. In fact, Poincaré does not seem to have understood and synthesized what he showed in 1905. Worse, he contradicts himself in later writing adding to confusion about his work and positions, notably concerning the ether. Popp indicates that this is one reason why Poincaré’s alternate path to special relativity is not fully realized. Another is that Poincaré shows no appreciation of the implications for simultaneity and time; in brief there is nothing comparable to Einstein’s discussion of moving watch hands and trains arriving.

Reference: Popp, Bruce D. (2020). Henri Poincaré: Electrons to Special Relativity: Translation of Selected Papers and Discussion. Cham: Springer International Publishing. ISBN 978-3-030-48038-7.
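The claims in the summary above — that the corrected transformations form a group and that the space-time interval is an invariant of that group — are easy to check numerically. Here is a minimal sketch (my own illustration, not from the book), in units with c = 1:

```python
import numpy as np

def boost(v):
    """Lorentz boost along x with velocity v (units where c = 1),
    acting on coordinates (t, x, y, z)."""
    g = 1.0 / np.sqrt(1.0 - v * v)  # Lorentz factor gamma
    B = np.eye(4)
    B[0, 0] = B[1, 1] = g
    B[0, 1] = B[1, 0] = -g * v
    return B

def interval(e):
    """Spacetime interval t^2 - x^2 - y^2 - z^2 of an event."""
    t, x, y, z = e
    return t * t - x * x - y * y - z * z

event = np.array([2.0, 1.0, 0.5, -0.3])

# The interval is invariant under a boost ...
assert np.isclose(interval(event), interval(boost(0.6) @ event))

# ... and collinear boosts compose like a group: two boosts equal one
# boost at the relativistic velocity-addition value.
v1, v2 = 0.5, 0.3
assert np.allclose(boost(v1) @ boost(v2), boost((v1 + v2) / (1 + v1 * v2)))
```

The composition check is the group property Poincare identified; the invariant quadratic form is what makes the group a "rotation" of four-dimensional space-time.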

This is all conventional wisdom, and here are his main arguments.

Poincare did not brag about his work, as Einstein did. Poincare evidently thought that his papers spoke for themselves. He didn't brag about his many other original works either. That was common for scientists. Einstein was the exception, as he made great efforts to claim credit for the work of others.

If Poincare understood what he wrote, then he was years ahead of Einstein. The Einstein fans say that this proves Poincare did not understand what he wrote, but of course that never happens: great mathematicians do not fail to understand their own published work. Obviously Poincare understood what he wrote.

Poincare did not emphasize simultaneity in 1905. As you can read in the Wikipedia article on the subject, Poincare discovered relativistic time synchronization in 1898, and regarded it as a solved problem.

Poincare's contribution has been forgotten. There is some truth to this, as Poincare's work is mostly remembered through two papers by Minkowski, who died shortly afterwards, and through all subsequent work that treats relativity as a 4-dimensional theory.

It is amazing how scholars concoct these stories to credit Einstein over Poincare. Our current understanding of relativity is based much more on the work of Poincare than Einstein.

Poincare's works get mentioned in Jordan Ellenberg: Mathematics of High-Dimensional Shapes and Geometries | Lex Fridman Podcast #190. He is praised for his work on celestial mechanics, stability of dynamical systems, and topology. Discovering relativity is not even mentioned. Einstein's greatest accomplishment was just a poor plagiarization of one of Poincare's minor papers.

After some discussion, the Wikipedia article on Einstein recently removed:

[Einstein is] universally acknowledged to be one of the two greatest physicists of all time, the other being Isaac Newton.
Someone pointed out that polls by PhysicsWorld and the BBC showed physicists saying that Einstein was the greatest, with Newton in second place.

There continues to be crazy over-the-top idolization of Einstein. Normally a book about a great scholar will simply describe what he did, without gratuitous insults about him being inferior to some other great man.

That's what the above book does. It recognizes what Poincare did, and then makes nonsensical disparaging remarks in order to say that Einstein was better. Maybe someday I will see an Einstein scholar write something like this:

Einstein's 1905 relativity paper was a nice exposition of Lorentz's theory. But it lacked references to earlier theoretical work by Lorentz, FitzGerald, Poincare, and others, and to crucial experimental work by Michelson-Morley and others. It failed to explain how his theory was any different from Lorentz's. Nobody saw any difference, and called it the Lorentz-Einstein theory. Einstein failed to grasp the spacetime geometry, the Lorentz group, the covariance of Maxwell's equations, or the implications for gravity. Einstein shows no appreciation of relativity as a 4-dimensional theory; in brief there is nothing comparable to Poincare's 1905 work, and nothing that led to further work.
Whittaker did say something similar in his 1954 book. Einstein was still alive, and could not refute it, even though his friend Max Born tried. So all serious scholars know that this Einstein credit for relativity is a hoax. The Einstein worship has only accelerated since then.

Monday, June 7, 2021

The Current War on Science

Science and medicine are being politicized, and there are so many examples that it is tiresome to list them.

During the Trump administration, it was common to hear academic and news media complaints that he was anti-science. But they never had any examples of his acting against accepted research or failing to fund mainstream science programs.

Anthony Fauci was interviewed on Science Friday. He has been embarrassed by emails, but those emails are not that much different from his public statements. He has said a long list of foolish and unscientific things.

On Friday he talked about AIDS a lot. He tried to blame it on Pres. Ronald Reagan. He tried to say it was not a gay disease, as proved by Magic Johnson getting it. (Johnson was rumored to be participating in dangerous homosexual practices, even before the AIDS story.)

Fauci and other experts have told us for a year that the coronavirus could not have been a Wuhan lab leak, when that is still the most plausible explanation.

Here is an essay on What Happens When Doctors Can't Tell the Truth?

Here is a paper by a Black woman with a PhD from the Perimeter Institute, home to a lot of crackpot physics:

To provide an example of the role that white empiricism plays in physics, I discuss the current debate in string theory about postempiricism, motivated in part by a question: why are string theorists calling for an end to empiricism rather than an end to racial hegemony? I believe the answer is that knowledge production in physics is contingent on the ascribed identities of the physicists. ...

For these reasons, the area of quantum gravity, a physics subdiscipline considered by many to be the pinnacle of physics prestige, objectivity, universality, and culturelessness, is a natural starting point for a discussion about how social prestige asymmetries affect epistemic outcomes in physics. Ultimately, the discourse about the quantum gravity model of string theory provides an example of how white supremacist racial prestige asymmetry produces an antiempiricist epistemic practice among physicists, white empiricism. In string theory, we find an example wherein extremely speculative ideas that require abandoning the empiricist core of the scientific method and which are endorsed by white scientists are taken more seriously than the idea that Black women are competent observers of their own experiences.

Maybe the Perimeter Institute considers quantum gravity to be "the pinnacle of physics prestige, objectivity", but nothing good has ever come out of that subject.

Environmentalism has been hopelessly politicized for years. If they really cared about global warming, their top priorities would be building nuclear power plants and blocking Third World immigration into the First World.

Larry Krauss has a decent defense of objective science in Quillette. He is probably also a Trump-hating leftist, but I cite him to show that not all academics have bought into the current nonsense.

Authors are constantly chastised for their terminology. I see Scott Aaronson still uses "quantum supremacy", but probably only because he has tenure and his enemies have other grounds for attacking him.

Reason reports:

Last month, the Journal of Hospital Medicine published an article titled, "Tribalism: The Good, the Bad, and the Future." It proposed strategies for medical professionals to overcome some of the natural group clustering that occurs in any large workspace: launch interdepartmental projects, socialize outside of the office, etc.
The paper was recalled, and the authors had to put out this apology:
From this experience, we learned that the words "tribe" and "tribalism" have no consistent meaning, are associated with negative historical and cultural assumptions, and can promote misleading stereotypes. The term "tribe" became popular as a colonial construct to describe forms of social organization considered "uncivilized" or "primitive." In using the term "tribe" to describe members of medical communities, we ignored the complex and dynamic identities of Native American, African, and other Indigenous Peoples and the history of their oppression.
This is ridiculous, as tribe is a perfectly good word. The authors ended up substituting "silo" for "tribe", but that has a less suitable meaning.

Update: Just today, here is a SciAm article complaining that physicians often note racial info, as it is correlated with an assortment of medical problems:

Yet, a tool used daily by almost every physician, the history of present illness (HPI), may still perpetuate medical racism. ...

Physicians often determine racial and ethnic labels themselves rather than asking patients to self-identify. ...

Beyond the issue of physicians using inaccurate racial labels, research has proven what scholars like W.E.B. Du Bois and Derrick Bell stated for decades: race is a social construct. ...

By using this outdated practice, physicians may be reinforcing the incorrect idea that race differentiation holds scientific value instead of being a clumsy artifact of the profession. ...

But, if physicians are truly trying to discern if patients are carriers of genetic allelic variants ..., then genetic mapping should be used in high-risk patients. ...

To be clear, a “color-blind” approach is not ideal either.

It seems clear that these people will cry racism no matter what the physicians do.

Update: Another example of one-sided politicization:

Well, the latest scientific journal or magazine to go to hell in a handbasket is Scientific American, which under the editorial guidance of Laura Helmuth has published a putrid piece of pure pro-Palestinian propaganda. It’s an op-ed piece apparently written by a group of Palestinian BDS activists (one author wishes to be anonymous), purveying the usual distortions, omissions, and outright lies. If there were a counter piece refuting those lies (there is below, but not at Sci Am), it would be somewhat better, but not much. Instead, the op-ed is linked to a Google Document petition (surely not posted by Sci Am) that you can sign in solidarity with Palestine.

First of all, a science magazine has no business taking an ideological stand like this, particularly one replete with lies and distortions. What was Scientific American thinking? Do they fancy themselves to be Mother Jones?

And here is a recent Nature magazine editorial promoting leftist racial nonsense.

Update: From "Meet the Press Daily":

"So if you are trying to get at me as a public health official and a scientist, you're really attacking not only Dr. Anthony Fauci, you're attacking science. And anybody that looks at what is going on, clearly sees that, you have to be asleep not to see that. That is what going on," he added.

"Science and the truth are being attacked," Fauci concluded.

He is the highest paid US govt official, and he certainly needs to be accountable to criticism.

Thursday, June 3, 2021

Einstein book addendum

I wrote an Einstein book several years ago. One of the main arguments was that Einstein does not deserve credit for discovering relativity. The reasons are:

1. All of the important special relativity equations were published by others before Einstein wrote anything on the subject.

2. Einstein's 1905 theory was not seen at the time as being particularly novel or influential.

3. The main concept behind relativity is that spacetime has a non-euclidean geometry. This was published by others, and missed by Einstein.

Historians acknowledge (1), but credit Einstein for some non-mathematical subtlety such as accepting local time, saying the aether was superfluous, or giving a derivation that was not ad hoc. The trouble with these is that what Einstein actually said about local time and the aether was nearly identical to what Lorentz and Poincare said years earlier.

Item (2) is also acknowledged, but not so well known. There were papers written on competing theories, and they referred to the "Lorentz-Einstein theory", as if there were no distinction between the Lorentz and Einstein theories. Einstein tried, but was never able to give a good explanation as to how his theory differed from Lorentz's. Lorentz said that Einstein merely postulated what he and others had deduced from previous theory and experiment. Poincare and Minkowski did explain how their versions of relativity differed from Lorentz.

As for (3), it is well-known that Minkowski published a non-euclidean geometry treatment of relativity, and that is what caught on with physicists and led to widespread acceptance. Einstein complained that he turned the theory into something that he could not recognize. Some assume that Minkowski built on Einstein's ideas, but Lorentz and Poincare were much greater influences, and it is not clear that Minkowski got anything from Einstein.

Even as late as 1910, when someone suggested that Einstein's non-Euclidean geometrical view could avoid a paradox of Lorentzian relativity, Einstein wrote a letter to the journal denying that he had any such view different from Lorentz's. That would have been a great opportunity for Einstein to take credit for a conceptual advance, but he denied it.

In short, here is the paradox. If the Lorentz contraction is applied to a spinning bicycle wheel, the tire contracts while the spoke lengths remain the same. This seems to contradict the Euclidean geometry fact that a circle's circumference is 2π times its radius. Adopting a non-euclidean geometry resolves the paradox.
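A quick numerical sketch of the wheel paradox (my own illustration, in units with c = 1): rulers laid along the moving rim are contracted by the Lorentz factor, so observers riding the rim measure a circumference longer than 2πR, while the spoke rulers, moving transversely, are unaffected.

```python
import math

R = 1.0   # wheel radius in the lab frame
v = 0.8   # rim speed as a fraction of the speed of light
gamma = 1.0 / math.sqrt(1.0 - v * v)  # Lorentz factor, here 5/3

lab_circumference = 2 * math.pi * R   # lab frame: still 2*pi*R

# Rim rulers contract by gamma, so more of them fit around the rim:
# the circumference measured on the rim is gamma * 2*pi*R.
rim_circumference = gamma * lab_circumference

# The ratio C/(2*pi*R) exceeds 1, which is impossible in Euclidean
# geometry -- the geometry on the spinning disk is non-euclidean.
ratio = rim_circumference / (2 * math.pi * R)
assert ratio > 1.0
```

The spokes still measure R, so the rim observers find C/R > 2π, the signature of a negatively curved geometry.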

Something similar happened in the 1920s, when a general relativity book explained that non-euclidean geometry was the heart of the theory. Einstein published a favorable book review, but denied the geometry view.

See also: Einstein did not discover relativity, Einstein book update, and Second Einstein book update

The history of relativity gives the background for the distortions in Physics that came later in the book. Einstein found that he was widely idolized for his supposed genius ability to do non-empirical theorizing. By the late 1920s, he was repudiating his earlier more empirical approach. Dutch physicist Jeroen van Dongen has written a very good new paper on this XX century trend towards non-empirical Physics. He writes:

In the absence of the empirical, Einstein emphasized the merit of his personal epistemological conviction, along with its success as documented in his version of his biography: the epistemic benefit of doing unified field theory was bound up with the virtuous dispositions of his kind of theorist.
This is a polite way of saying that Einstein lied about his life story in order to promote himself and the virtues of his worthless unified field theory research.
For admiration of Einstein as empiricist icon, see e.g. Heisenberg (1989) ; Heisenberg here further recalls his surprise when Einstein explained to him in 1926 that he no longer held empiricist views. In 1927, Heisenberg signaled a difference of opinion regarding the role of `simplicity' and the empirical with Einstein (Heisenberg to Einstein, 10 June 1927, cited on p. 467 in Pais 1982); Einstein himself was well aware of his isolation and the negative judgment of his peers; see Pais (1982), p. 462. See Howard (1994) on the logical empiricists. ...

Dismissal could take a moral tone, for instance when Robert Oppenheimer deemed that Einstein had been "wasting his time." In fact, he had gone "completely cuckoo", Oppenheimer added in private, or, as he put it in public, Einstein had "lost contact with the profession of physics." Clearly, the Einstein of unified field theory was not a proper theorist.

That's right. Early respect for Einstein was based on empirical work. The Nobel Prize was for one of his more empirical papers. Then Einstein went non-empirical, and his work was cuckoo.

But a philosophical shift made the non-empirical work more respectable than the empirical. Those logical empiricists were driven out of academia. The Kuhn paradigm shifters cast non-empirical work as the true scientific revolutions that everyone admired.

Example of Einstein against empiricism:

In the same letter, Einstein expressed that he was no longer thinking about experiments on the wave and particle properties of light, and that one "will never arrive at a sensible theory in an inductive manner", even if "fundamental experiments" could still be of value - once again deprecating the quantum program's empirical slant.
The history is important because these Kuhnian revolutions never happen. The discoveries of relativity and quantum mechanics in the early XX century were driven by empirical findings.

The patron saints of non-empirical philosophy are Copernicus, Galileo, Einstein, and Kuhn.

The example of Copernicus is particularly apt for today’s discussion. Copernicus proposed his alternative to the fairly successful Ptolemean universe in 1543. Yet, this theoretical proposal was basically beyond any meaningful notion of empirical falsifiability. This situation persisted pretty much until Galileo pointed the newly invented telescope to the heavens and in 1610 observed the phases of Venus.
The phases of Venus were not decisive, and arguments for and against continued until Isaac Newton. Some of the arguments were not fully resolved for centuries.

Kuhn makes a big deal out of this because Copernicus described a "revolution" of the Earth around the Sun, and the theory eventually caught on even tho there was little empirical evidence for it at the time. So he portrayed scientists as a bunch of irrational fad-followers.

In the case of relativity, all of the important early papers referred directly to the Michelson-Morley experiment as the crucial experiment, as well as to other experiments. This was acknowledged by everyone at the time, including Einstein. The view only got revised later, in efforts to credit Einstein and devalue empiricism.

I have posted here many times that I think that the theories of relativity and quantum mechanics could have been anticipated by clever theorists. If you are looking for a locally causal field theory, the math leads directly to relativity and gauge theory. In a way, that is what Maxwell did with electromagnetism.

And once you accept that we needed a wave theory of matter, quantum mechanics is the obvious thing. Nobody knows any better way to even propose such a theory. So these theories could have been developed from pure theory.

Or so it seems in retrospect. It never happened that way.

String theorists would like to tell you that Einstein created relativity out of pure theory, and that inspired string theorists to do the same today. Forget it. When Einstein shifted to purely theoretical analysis, his work was garbage.

Peter Woit mentions the above paper, and a comment notes that it ends by saying that non-empirical physics like string theory is a Kuhnian paradigm shift, and urging that we “keep funding it as generously as before.”

Tuesday, June 1, 2021

No, this is not Math's Fatal Flaw

A recent YouTube video explains:
This is Math's Fatal Flaw

Not everything that is true can be proven. This discovery transformed infinity, changed the course of a world war and led to the modern computer.

So this discovery was one of the greatest accomplishments of the XX century, and yet it is a "fatal flaw"?

The video is actually pretty good, but I object to all the explanations that say that Mathematics is somehow deficient because the consistency of the axioms cannot be proved from the axioms.

Nobody would ever want the consistency to be provable from the axioms anyway. Such a proof would mean nothing. Inconsistent systems allow such proofs, and nobody wants that.

It would be nice to have an algorithm to determine whether a given math statement is true or false. The above discovery shows that it is not possible. But again, this is not a fatal flaw. It is what makes Math interesting.
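Gödel's second incompleteness theorem makes the point precise. A standard statement (my formulation, not from the video):

```latex
% For any consistent, effectively axiomatized theory T containing
% elementary arithmetic:
T \ \text{consistent} \;\Longrightarrow\; T \nvdash \mathrm{Con}(T).
% Conversely, an inconsistent theory proves everything, including its
% own consistency statement:
T \vdash \bot \;\Longrightarrow\; T \vdash \varphi \quad
  \text{for every sentence } \varphi.
```

So a theory proving its own consistency would tell you nothing: the inconsistent theories do exactly that.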

Saturday, May 29, 2021

Burying the Wuhan lab leak hypothesis

Here is another example of scientists politicizing an issue of large public interest.

A Nature/SciAm article reports:

Calls to investigate Chinese laboratories have reached a fever pitch in the United States, as Republican leaders allege that the coronavirus causing the pandemic was leaked from one, and as some scientists argue that this ‘lab leak’ hypothesis requires a thorough, independent inquiry. But for many researchers, the tone of the growing demands is unsettling. ...

Others worry that the rhetoric around an alleged lab leak has grown so toxic that it’s fuelling online bullying of scientists ...

The debate over the lab-leak hypothesis has been rumbling since last year. But it has grown louder in the past month ...

Even if the letter in Science was well intentioned, its authors should have thought more about how it would feed into the divisive political environment surrounding this issue, says Angela Rasmussen, a virologist at the University of Saskatchewan in Saskatoon, Canada. ...

Rasmussen says, “This debate has moved so far from the evidence that I don’t know if we can dial it back.” ...

The United States has since requested that the WHO conduct a "transparent, science-based" phase 2 origins study, and US President Joe Biden announced that he has asked the US intelligence community, in addition to its national labs, to "press China to participate" in an investigation.

Apparently there is a lot of evidence for the Wuhan lab leak hypothesis, but we may never know for sure.

Because Republicans have demanded an investigation, a lot of Democrat scientists have resisted, arguing that no credence should be given to Pres. Trump's lies.

Here is another example. For decades, computer scientists and others have been warning that our elections are insecure and must be fixed. A prominent such article was just published:

Elections must be constructed and conducted such that everyone (all of the winning and losing candidates, as well as those who have supported them) can, with extremely high confidence, rationally believe the results reflect the will of the voters. ... As computer scientists, we must bear responsibility for warning about election vulnerabilities and proposing solutions,
Of course election integrity has been politicized, and they have to disavow being Trump supporters. As a blogger notes:
The letter is well written, but ... They say in the letter:
However, notwithstanding these serious concerns, we have never claimed that technical vulnerabilities have actually been exploited to alter the outcome of any US election.
This seems to be a problem. How do you get people to look both ways when crossing the street, when you have also basically asserted: “No one has ever been hit by a car when crossing the street.”
Just as we don't know the source of the Wuhan virus, we don't know who would have won a properly conducted 2020 election. And we may never know.

Tucker Carlson Tonight on Fox News, the highest-rated cable TV news show, now regularly devotes segments to the politicization of science. Last night he discussed papers being retracted for political reasons, Biden administration appointments of incompetent political hacks to top-level scientific posts, physicians being forced to give harmful medical treatments, physicians fired for accusations of being transphobic, CDC employees not following official advice, and false and alarmist climate science.

Thursday, May 27, 2021

Watering down the Definition of Quantum Supremacy

Scott Aaronson admits:
If your point is just that quantum supremacy claims depend for their credibility on people trying hard to refute them and failing, then I vehemently agree! So you and I should both be glad that this is exactly what’s happening right now.

Regarding your question: no, I would not bet that Google’s Sycamore chip can’t be spoofed by a classical computer in 10 years — or right now, for that matter!

In other words, this Google quantum supremacy means nothing except that Google has made a machine that no one has bothered to simulate yet. And he has no confidence that it cannot be done.

Even if it cannot be done, it is still subject to the teapot supremacy argument.

I thought that quantum supremacy meant demonstrating decisively a computational ability superior to ordinary Turing machine computers. Aaronson seems to think that it just means doing something complicated that critics have not yet matched.

Here is how John Preskill originally used the term:

we hope to hasten the day when well controlled quantum systems can perform tasks surpassing what can be done in the classical world. One way to achieve such "quantum supremacy" would be to run an algorithm on a quantum computer which solves a problem with a super-polynomial speedup relative to classical computers ...

We therefore hope to hasten the onset of the era of quantum supremacy, when we will be able to perform tasks with controlled quantum systems going beyond what can be achieved with ordinary digital computers. ...

I have emphasized the goal of quantum supremacy (super-classical behavior of controllable quantum systems) as the driving force behind the quest for a quantum computer, and the idea of quantum error correction as the basis for our hope that scalable quantum computing will be achievable.

Aaronson is like Dr. Fauci admitting that the coronavirus may have come from the Wuhan lab. He is preparing for research that could be profoundly embarrassing to the entire field.

Update: Gil Kalai responds:

Hi Scott, regarding the analogy between Google’s Sycamore and IBM’s Deep Blue, here are three differences:
1. In the Sycamore case, the researchers largely invented a new game to play
2. In the Sycamore case, the researchers themselves also played the part of Kasparov
3. In the Sycamore case, a huge leap compared to previous efforts is claimed
Aaronson previously wrote:
As I’ve said in dozens of my talks, the application of QC that brings me the most joy is refuting Gil Kalai, Leonid Levin, and all the others who said that quantum speedups were impossible in our world.

Monday, May 24, 2021

What makes quantum mechanics mysterious?

Everybody says quantum mechanics is mysterious, but what is the mystery?

Entanglement. A lot of smart physicists have made a big deal out of this, but it is really not mysterious unless combined with some other effect. I recently posted about this.

Double-slit experiment. We get a similar interference pattern for any wave phenomenon. We would expect it for light, even with no QM.

Uncertainty principle. Not too surprising, once you assume a wave nature of matter.

Superposition. Like the half-dead half-alive cat. Just a useful fiction.

Many worlds. This is just a fantasy. You can have the same fantasy about classical theories, if you wish.

Identical particles. It is strange that all electrons are the same. It makes the world very simple, compared to the alternatives.

Discrete energy levels. This is truly one of the nice features of QM, but not usually described as mysterious.

Probabilities. Scott Aaronson says the essence is negative probabilities. I say there is no such thing. QM has regular probabilities, and so does every other theory.

Nonlocality. QM has no true nonlocality, in the sense of doing something in one place causing action at a distance elsewhere.

Canceling infinities. These are mostly artifacts of extreme assumptions, such as having mass and charge concentrated at a point, having infinite density.

Linearity. Some other theories are linear, such as Maxwell's equations.

Super-Turing computation. This would be interesting, if proved. Some say it has been. I don't believe it.

Lack of local hidden variables. Bell made a big deal out of this, but nobody expected local hidden variables anyway.

No counterfactual definiteness. This is closely related to the lack of hidden variables.

So what makes quantum mechanics different from classical mechanics? I say that it is the non-commuting observables and positivist outlook. The above stuff is just not that mysterious.
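The non-commutativity is easy to exhibit with two spin observables. A minimal numpy sketch (my own illustration):

```python
import numpy as np

# Pauli spin observables for a single qubit
X = np.array([[0, 1], [1, 0]])   # spin along x
Z = np.array([[1, 0], [0, -1]])  # spin along z

# Classical observables are functions on phase space and always
# commute; these quantum observables do not:
commutator = X @ Z - Z @ X
assert not np.allclose(commutator, 0)

# Consequence: the order of operations matters when applied to a state.
psi = np.array([1.0, 0.0])
assert not np.allclose(X @ (Z @ psi), Z @ (X @ psi))
```

That nonzero commutator, together with the positivist reading of what a measurement is, is the real departure from classical mechanics.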

Thursday, May 20, 2021

Google promises to make a Logical Qubit

In my skepticism about quantum computing, I have occasionally remarked that all the claims about research machines with dozens of qubits are exaggerated, because they still have not made a true qubit yet. Here is an explanation of that.

Google announces at its annual I/O conference:

Within the decade, Google aims to build a useful, error-corrected quantum computer. ...

To begin our journey, today we’re unveiling our new Quantum AI campus in Santa Barbara, California. ...

[wild futuristic hype snipped]

To reach this goal, we’re on a journey to build 1,000,000 physical qubits that work in concert inside a room-sized error-corrected quantum computer. That’s a big leap from today’s modestly-sized systems of fewer than 100 qubits.

To get there, we must build the world’s first “quantum transistor” — two error-corrected “logical qubits” performing quantum operations together — and then figure out how to tile hundreds to thousands of them to form the error-corrected quantum computer. That will take years.

To get there, we need to show we can encode one logical qubit — with 1,000 physical qubits.
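For intuition about what "encoding one logical qubit in many physical qubits" means, here is a toy classical analogue (my own sketch, not Google's surface code): a 3-bit repetition code that corrects any single bit flip by majority vote. Real quantum codes must also protect phase information, which is part of why a thousand physical qubits per logical qubit are projected.

```python
import random

def encode(bit):
    """Encode one logical bit as three physical bits."""
    return [bit, bit, bit]

def noisy(codeword, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    """Majority vote recovers the logical bit if at most one flipped."""
    return int(sum(codeword) >= 2)

# Any single physical error is corrected:
for i in range(3):
    w = encode(1)
    w[i] ^= 1
    assert decode(w) == 1

# With error rate p per physical bit, the logical error rate is about
# 3*p^2, much smaller than p for small p -- the point of encoding.
random.seed(0)
p = 0.05
trials = 20000
errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))
assert errors / trials < p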

Got that? Google claims that it has achieved quantum supremacy, but it is still a long way from building a system with even one logical qubit.

Monday, May 17, 2021

Rethinking entanglement of a single particle

Dr. Bee has caused me to rethink entanglement, and reader Ajit sends a paper, Entanglement Isn't Just for Spin:
Quantum entanglement occurs not just in discrete systems such as spins, but also in the spatial wave functions of systems with more than one degree of freedom.
It is sometimes said that Einstein discovered entanglement in 1935, and it was immediately recognized as the central defining feature of quantum mechanics. But as the above paper notes, the word was not in common use until about 1987, and did not find its way into textbooks until after that.

As the article explains, entanglement is not some peculiarity of tricky spin experiments. It is a property of all quantum systems.

Entanglement is explained as the thing that makes quantum mechanics nonlocal, and hence the essence of why the theory is non-classical and mysterious.

Paul Dirac once said:

Quantum-mechanically, an interference pattern occurs due to quantum interference of the wavefunction of a photon. The wavefunction of a single photon only interferes with itself. Different photons (for example from different atoms) do not interfere.
This is not an exact quote, but he said something similar.

This is a confusing statement, and I would not take it too literally. But in a similar spirit, I would say that a quantum particle can be entangled with itself.

Entanglement is often introduced by describing creation of a pair of particles with equal and opposite spins. But it is much more common. In any atom with several orbital electrons, those electrons are entangled. Nearby particles usually are. The case of the equal and opposite pair is interesting because that gives distant entanglement, but nearby entanglement occurs all the time.

Consider a stream of particles being fired into a double slit. Each particle is interfering with itself, and is entangled with itself. The interference results in the interference pattern on the screen.

The entanglement results in each particle hitting the screen exactly once. If you purely followed the probabilities, there are many places on the screen where the particle might hit. Those possibilities are entangled. If the particle is detected in one spot, it will not be detected in any other.

You cannot understand the experiment as localized probabilities in each spot of the screen.

Viewed this way, I am not sure the 2-particle entanglement story is any more mysterious than the 1-particle story. Maybe explanations of entanglement should just stick to the 1-particle story, as the essence of the matter.

Update: Reader Ajit suggests that I am confusing entanglement with superposition. Let me explain further. Consider the double-slit experiment with electrons being fired through a double slit at a screen, and the screen is divided into ten regions. Shortly before an electron hits the screen, there is an electron-possibility-thing that is about to hit each of the ten regions. Assuming locality, these electron-possibility-things cannot interact with each other. Each one causes an electron-screen detection event to be recorded, or disappears.

These electron-possibility-things must be entangled, because each group of ten results in exactly one event, and the other nine disappear. There is a correlation that is hard to explain locally, as seeing what happens to one electron-possibility-thing tells you something about what will happen to the others. You might object that the double-slit phenomenon is observed classically with waves, and we don't call it entanglement. I say that when a single electron is fired, that electron is entangled with itself. The observed interference pattern is the result.
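The ten-region picture can be simulated in a few lines (a minimal sketch; the slit separation, wavelength, and screen geometry below are made-up illustrative numbers): amplitudes from the two slits are added and squared to give a probability for each region, and each electron is then detected in exactly one region, never two.

```python
import math
import random

# Toy double-slit model: two point slits, amplitudes summed then squared.
# All numbers (slit separation, wavelength, screen size) are illustrative.
random.seed(0)

WAVELENGTH = 1.0
SLIT_SEPARATION = 4.0
SCREEN_DISTANCE = 100.0
REGIONS = 10                  # divide the screen into ten regions
SCREEN_HALF_WIDTH = 50.0

def intensity(x):
    """|psi1 + psi2|^2 for two equal-amplitude slits (far-field approximation)."""
    # The path difference gives a relative phase between the two slit amplitudes.
    phase = 2 * math.pi * SLIT_SEPARATION * x / (WAVELENGTH * SCREEN_DISTANCE)
    return (1 + math.cos(phase)) / 2   # normalized fringe intensity

# Probability assigned to each region (evaluated at region centers).
centers = [-SCREEN_HALF_WIDTH + (i + 0.5) * 2 * SCREEN_HALF_WIDTH / REGIONS
           for i in range(REGIONS)]
weights = [intensity(x) for x in centers]
total = sum(weights)
probs = [w / total for w in weights]

# Fire electrons one at a time: each is detected in exactly one region.
counts = [0] * REGIONS
for _ in range(10_000):
    region = random.choices(range(REGIONS), weights=probs)[0]
    counts[region] += 1       # one hit per electron, never two

print(counts)                 # fringe pattern builds up from single hits
print(sum(counts))            # 10000: exactly one detection per electron
```

The classical wave picture reproduces the fringe probabilities, but only the quantum rule enforces that each electron produces exactly one detection event out of the ten possibilities.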

Tuesday, May 11, 2021

President Joe Biden is Politicizing Science

Lawrence Krauss has a WSJ article attacking Pres. Biden for politicizing science.
The New Scientific Method: Identity Politics
The National Academy of Sciences fights bias by explicitly introducing more of it.
Lubos Motl praises the article.

In particular there is now an aggressive affirmative action program at the National Academy of Sciences, where less competent women and Blacks are being appointed in order to meet diversity quotas.

Biden's White House Coronavirus Response Coordinator is a Democrat political hack named Jeffrey Zients. Donald Trump had an immunology expert in that job.

The authorities are still not telling us the truth about the virus. See this article by a NY Times science reporter on evidence it came from the Wuhan Institute of Virology.

Update: From a CDC official in a press conference, as reported in the NY Times:

DR. WALENSKY: … There’s increasing data that suggests that most of transmission is happening indoors rather than outdoors; less than 10 percent of documented transmission, in many studies, have occurred outdoors.
The paper goes on to explain that the true number is more like 0.1%. Yes, that is less than 10%, but this appears to be a distortion intended to justify outdoor mask requirements.

Monday, May 10, 2021

Quantum wavefunction is not everything

Reader Ajit argues that I am not following textbook quantum mechanics properly. He has posted Postulates of quantum mechanics, as stated by various authors.

Checking other versions of the postulates, I find:

The state of a system is completely described by a wavefunction

Associated with any particle moving in a conservative field of force is a wave function which determines everything that can be known about the system.

I wonder why this would be stated as a postulate. It is not used by the theory anywhere, and it is not true.

Sometimes it is stated for a single particle, but it cannot be true if the particle is entangled with another. Sometimes it is stated for scalar wave functions, but that cannot be true if the particle has spin.

You can correct those problems by introducing spinor-valued wave functions of several variables, but then you are still ignoring quantum fields and all sorts of other complexities.

Now you might say: Okay, but if we use the whole Standard Model, or some bigger unified theory that takes into account all possible interactions, and then construct a wave function of the universe, then that would completely describe the state of the universe.

That would not be quantum mechanics. That would be some theorist's fantasy that has never been carried out.

Quantum mechanics is a theory that takes in some available info, and makes some predictions, but never achieves a complete description of the system. Nobody has any idea how any such complete description would ever be accomplished.

Take a simple example, the Schroedinger Cat. The wavefunction is a superposition of dead and alive states. Is it a complete description of the state of the system? No, of course not. The cat is either dead or alive. You can get a more complete description by opening the door and looking to see if the cat is dead. The wavefunction is most emphatically not giving a complete description.
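A minimal two-state sketch of that point (a toy model, not any real experiment): the wavefunction supplies only probabilities for "dead" and "alive", while every actual look into the box yields one definite outcome that the wavefunction alone did not specify.

```python
import random

random.seed(1)

# Toy two-state "cat" wavefunction: equal amplitudes for dead and alive.
amplitudes = {"alive": 2 ** -0.5, "dead": 2 ** -0.5}

# The Born rule turns amplitudes into probabilities -- about 0.5 each.
# This is all the wavefunction gives us.
probabilities = {state: amp ** 2 for state, amp in amplitudes.items()}
print(probabilities)

# Opening the box always reveals one definite state, never a superposition.
outcome = random.choices(list(probabilities),
                         weights=list(probabilities.values()))[0]
print(outcome in ("alive", "dead"))   # True: the cat is one or the other
```

Whatever the probabilities say, the observation adds information that the wavefunction did not contain, which is the sense in which the wavefunction is not a complete description.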

I don't know why anyone would say that the wavefunction is a complete description of the system. Other physics theories do not start off with a postulate declaring some sort of god-like omniscience. It doesn't make sense to even say something like that.

And yet this postulate is prominent on various lists of postulates for quantum mechanics. I will have to do some further research to find out who is responsible for this silly idea.

This week's Dr. Bee video is on Einstein's spooky action at a distance. She says that the spookiness is the measurement update (i.e., collapse of the wavefunction), not entanglement.

Believing that the wave function is a complete description necessarily causes these spooky concerns. Any observation affects distant parts of the wavefunction. If the wavefunction is a complete physical thing, then it is spooky.

Monday, May 3, 2021

Does Quantum AI have Free Will?

A new paper argues that a quantum computer could be conscious, and have free will.

Since I am skeptical that quantum computers will ever achieve quantum supremacy, you probably think that I dismiss this as nuts. Actually I don't.

Turing machines are deterministic, and do not have free will. But I believe humans do. The mechanism is not understood, and may involve quantum mechanics. So maybe a quantum computer can do that, even if it cannot factor large numbers.

The London Guardian has a good essay on the arguments about free will. It says:

Harris argues that if we fully grasped the case against free will, it would be difficult to hate other people: how can you hate someone you don’t blame for their actions? Yet love would survive largely unscathed, ...

I personally can’t claim to find the case against free will ultimately persuasive; it’s just at odds with too much else that seems obviously true about life.

I agree with that last sentence. A lot of intellectuals reject free will, but in the process they also reject a lot of things that seem obviously true.

I do not agree with the love/hate analysis. If I believe that someone has no free will, and is merely a preprogrammed robot to do evil things, then sure, that is a good reason to hate him. He would be a sub-human evil nuisance. A puppet of the devil. As for love, try telling your wife that you only love her because the chemicals in your body have made that illusion. Some psychologists say that, and I don't think it helps.

The article says that philosophers have gotten death threats over such issues.

Jerry Coyne endorses most of the essay, but argues:

Contracausal free will is the bedrock of Abrahamic religions, which of course have many adherents.
No. Islam doesn't accept free will. Moslems are always talking about God's will being carried out, as if no one can do anything about it. Jews have mixed views. Catholics believe strongly in free will, and so do many Protestants, but some, such as Calvinists, do not.

A previous Guardian essay by historian Yuval Noah Harari said:
Unfortunately, “free will” isn’t a scientific reality. It is a myth inherited from Christian theology. Theologians developed the idea of “free will” to explain why God is right to punish sinners for their bad choices and reward saints for their good choices. If our choices aren’t made freely, why should God punish or reward us for them? ...

You cannot decide what desires you have. You don’t decide to be introvert or extrovert, easy-going or anxious, gay or straight. Humans make choices – but they are never independent choices. ...

But now the belief in “free will” suddenly becomes dangerous. If governments and corporations succeed in hacking the human animal, the easiest people to manipulate will be those who believe in free will.

He is a gay Israeli Jewish atheist. Perhaps he is a slave to his programming, but others are not.

Theologians did not invent free will. The Gospels use phrases like "go and sin no more". This assumes that you can choose to sin, or not sin.

You can decide to be an introvert or extrovert. Change is not easy, but people do it.

No, the easiest to manipulate are those who think that they are already slaves.

Coyne argues:

Free will skepticism (sometimes called “hard determinism”). As you must know, this is the view to which I adhere. Though it’s often called “determinism”, with the implication that the laws of physics have already determined the entire future of the universe, including what you will do, that’s not my view. There is, if quantum mechanics be right, a fundamental form of indeterminism that is unpredictable, like when a given atom in a radioactive compound will decay. It’s unclear to what extent this fundamental unpredictability affects our actions or their predictability, but I’m sure it’s played some role in evolution (via mutation) or in the Big Bang (as Sean Carroll tells me). Thus I prefer to use the term “naturalism” rather than “determinism.” But, at any rate, fundamental quantum unpredictability cannot give us free will, for it has nothing to do with either “will” or “freedom”.
I call this argument: Only God has Free Will.

Coyne is an atheist, but he seems to believe in some sort of Spinoza God. Humans have no freedom or free will. We are just puppets being controlled. God is not a predictable robot, and can make choices for us and the world. God even guides evolution of biological species by directing mutations.

The phrase "fundamental quantum unpredictability" means that the human observer can only predict probabilities. It always leaves open the possibility that someone with more info could make a better prediction. If Coyne wants to believe that it is some sort of God making all our choices for us, I guess that possibility is allowed.

For example, a quantum mechanics textbook might say that a uranium atom has a certain probability of radioactive decay in the next hour. And maybe that is all that can be said with the info available. But nowhere will it say that it is impossible to make a better prediction, if the state of the atom could be more precisely determined. As a practical matter, it is hopeless to get the wavefunctions of all the quarks in a uranium nucleus, but the point remains that better info might give a better prediction.
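The textbook calculation is easy to make concrete (a sketch assuming the standard exponential-decay law; the half-life figure is uranium-238's, roughly 4.5 billion years):

```python
import math

# Exponential decay law: P(decay within time t) = 1 - exp(-lambda * t),
# where lambda = ln(2) / half_life.
HALF_LIFE_YEARS = 4.468e9            # uranium-238, approximately
HOURS_PER_YEAR = 365.25 * 24

decay_const_per_hour = math.log(2) / (HALF_LIFE_YEARS * HOURS_PER_YEAR)
p_one_hour = 1 - math.exp(-decay_const_per_hour)

print(p_one_hour)    # roughly 1.8e-14: a tiny but definite probability
```

That probability is all the textbook formula delivers; it says nothing about whether a finer-grained description of the nucleus could do better.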

In my opinion, attributing all the decisions in the world to a Spinoza God is contrary to common sense and experience, and does not really solve anything. It is like a turtle argument that atheists like to mock. In fact, I worry about the mental health of anyone who believes that, as it is similar to schizophrenics who say that they are obeying voices in their heads.

Thursday, April 29, 2021

Feynman quote on Leftist Groupthink

...There was a special dinner at some point, and the head of the theology place, a very nice, very Jewish man, gave a speech. It was a good speech, and he was a very good speaker, so while it sounds crazy now, when I’m telling about it, at that time his main idea sounded completely obvious and true. He talked about the big differences in the welfare of various countries, which cause jealousy, which leads to conflict, and now that we have atomic weapons, any war and we’re doomed, so therefore the right way out is to strive for peace by making sure there are no great differences from place to place, and since we have so much in the United States, we should give up nearly everything to the other countries until we’re all even. Everybody was listening to this, and we were all full of sacrificial feeling, and all thinking we ought to do this. But I came back to my senses on the way home. The next day one of the guys in our group said, “I think that speech last night was so good that we should all endorse it, and it should be the summary of our conference.” I started to say that the idea of distributing everything evenly is based on a theory that there’s only X amount of stuff in the world, that somehow we took it away from the poorer countries in the first place, and therefore we should give it back to them. But this theory doesn’t take into account the real reason for the differences between countries—that is, the development of new techniques for growing food, the development of machinery to grow food and to do other things, and the fact that all this machinery requires the concentration of capital. It isn’t the stuff, but the power to make the stuff, that is important. But I realize now that these people were not in science; they didn’t understand it. They didn’t understand technology; they didn’t understand their time. The conference made me so nervous that a girl I knew in New York had to calm me down. “Look,” she said, “you’re shaking!..."

Thursday, April 22, 2021

New book on Free Will Debate

Philosophers Dan Dennett and Gregg Caruso held a 2018 debate on free will, and now they have expanded it into a book. From a review, they agree more than they disagree:
Both are naturalists (JD p.171) who see no supernatural interference in the workings of the world. That leaves both men accepting general determinism in the universe (JD p.33), which simply means all events and behaviours have prior causes. Therefore, the libertarian version of free will is out. Any hope that humans can generate an uncaused action is deemed a “non-starter” by Gregg (JD p.41) and “panicky metaphysics” by Dan (JD p.53). Nonetheless, both agree that “determinism does not prevent you from making choices” (JD p.36), and some of those choices are hotly debated because of “the importance of morality” (JD p.104). Laws are written to define which choices are criminal offenses. But both acknowledge that “criminal behaviour is often the result of social determinants” (JD p.110) and “among human beings, many are extremely unlucky in their initial circumstances, to say nothing of the plights that befall them later in life” (JD p.111). Therefore “our current system of punishment is obscenely cruel and unjust” (JD p.113), and both share “concern for social justice and attention to the well-being of criminals” (JD p.131).
Their hair-splitting philosophical differences are not that interesting. What interests me is how they could both have such a screwed-up view of life, and still think that they are on opposite sides of a big issue.

Caruso says we have no free will. Dennett says that we think that we do, and it is useful to maintain the illusion, but it is not real.

Without free will, there is no consciousness. Our systems of law, ethics, morality, and politics depend on free will. Christianity is based on it. So is the scientific method. It is hard to imagine how modern civilization could even exist without free will.

These philosophers discard it all based on a belief that all events have prior causes.

When a uranium nucleus decays, is it determined by prior causes? Our best scientists cannot answer this question. But somehow these philosophers can get the answer by playing silly word games? No, it is all nonsense.

Monday, April 19, 2021

Trans Ideology and the New Ptolemaism

The social sciences often make cosmological analogies, and screw them up so badly that I cannot even tell what point they are making.

Here is a new scholarly paper on an academic dispute:

Trans Ideology and the New Ptolemaism in the Academy ...

Ptolemy constructed an inordinately complex model of the universe in order to make all of the empirical data conform to a central, organizing false assumption, namely, that the earth was at the center.

Foucault’s influence in the academy is at least as often lamented as celebrated, and I will not attempt in what follows a comprehensive critique of his work. Instead, I will focus on one tendency his example has encouraged, which, using Rockhill’s analogy, I will call the “new Ptolemaism.” This is a push for scholarship to be insistently insular and to be much less interested in the study of the world than in the study of the study of the world. This kind of work, which is by now very common in the social sciences and humanities, performs the same neat trick every time. It turns out, in every such analysis, that the framing of inquiry turns out to be more significant than the object of inquiry. ... ...

Consider present day calls to remake the academy. There should be more soft sociology of the hard sciences; there should be more women in male dominated disciplines; we should “indigenize” the university. There are two terms in each case; we should reverse the conventional hierarchy of those terms; and the results will be profoundly liberatory, because, Ptolemaically, the university rather than the world is the most important locus of struggle. ...

Gender critical feminists like me notice, of course, that one infinitely more often sees and hears the slogan “transwomen are women” than its counterpart “transmen are men.” To understand why this is the case, you’d have to pay attention to patterns of power in the world rather than to Ptolemaic valence-flipping. One of the signs on my office door that most infuriated feminist academic women colleagues on social media described the parallels between men’s rights activism and trans rights activism. Many feminist academic women clearly saw it as their moral and intellectual duty to decry this assertion.

The Foucault here has nothing to do with the Foucault pendulum, which helped prove Ptolemy wrong about the motion of the Earth. No, it is a French post-modernist and pedophile rapist.

The author might have some valid points about feminism and trans ideology, but the Ptolemy stuff is nonsense, and the Foucault stuff probably is also.

Ptolemy did not construct an inordinately complex model of the universe. It was no more complicated than any other model achieving similar accuracy. He did assume that the Earth was at the center, but the model is not really any different or more complicated because of that. He described the stars, Sun, planets, and Moon as seen from Earth, so he would have needed those calculations whether the Earth moved or not. The central Earth was not really a false assumption, but a way of defining an Earth-centered coordinate system, which is a completely legitimate way of recording observations.
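The Earth-centered-coordinates point can be illustrated with a toy deferent-plus-epicycle calculation (the radii and angular rates below are invented, chosen only so that retrograde motion shows up): positions as seen from Earth are perfectly computable without any claim about what "really" moves.

```python
import math

# Toy Ptolemaic model: the planet rides an epicycle whose center rides a
# deferent circle around the Earth. Radii and angular speeds are
# illustrative, not fitted to any real planet.
DEFERENT_RADIUS = 10.0
EPICYCLE_RADIUS = 3.0
DEFERENT_RATE = 1.0     # radians per unit time (center around the Earth)
EPICYCLE_RATE = 5.0     # radians per unit time (planet around the center)

def planet_position(t):
    """Earth-centered (x, y) position of the planet at time t."""
    cx = DEFERENT_RADIUS * math.cos(DEFERENT_RATE * t)
    cy = DEFERENT_RADIUS * math.sin(DEFERENT_RATE * t)
    px = cx + EPICYCLE_RADIUS * math.cos(EPICYCLE_RATE * t)
    py = cy + EPICYCLE_RADIUS * math.sin(EPICYCLE_RATE * t)
    return px, py

# Track the apparent angle of the planet as seen from Earth.
angles = []
for step in range(700):
    px, py = planet_position(step / 100)
    angles.append(math.atan2(py, px))

# The apparent angle sometimes runs backward (retrograde motion), which the
# epicycle reproduces directly in Earth-centered coordinates. The "a - b < 1"
# check just excludes the 2*pi wrap-around of atan2.
retrograde = any(b < a and a - b < 1.0 for a, b in zip(angles, angles[1:]))
print(retrograde)   # True under these toy parameters
```

Nothing in the calculation is "false"; it is an Earth-centered bookkeeping of exactly what an Earth-bound observer records.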

The motion of the Earth was one of the great scientific issues in the history of mankind, but it is nearly always misrepresented.

This Babylon Bee parody is a lot more entertaining on the subject. To understand it, it helps to have seen the November 3, 1961 episode of The Twilight Zone.