Monday, December 6, 2021

Job for Contemporarily Minoritized Individuals Underrepresented

Modern job announcement:
Fermilab launches the new Gates Fellowship

November 29, 2021 | edited by Lisa Roberts

The Theory Division at the Department of Energy’s Fermi National Accelerator Laboratory is pleased to announce its new Sylvester James Gates, Jr. Fellowship. Inspired by the achievements of Jim Gates, currently Ford Foundation professor and director of the Brown University Theoretical Physics Center, the Gates Fellowship at Fermilab prioritizes the inclusion of first-generation college graduates, and the representation of historically and contemporarily minoritized individuals underrepresented in theoretical physics.

The new Gates Fellowship takes its name from Sylvester James “Jim” Gates, Jr., who attended a segregated African-American high school in Orlando, Florida. While earning his Ph.D. at M.I.T., he began his pioneering research on supersymmetry and supergravity, which became the basis for the string theory revolution of theoretical physics in the 1980s.

I can only guess what races and ethnic groups are eligible.

That "string theory revolution" was a failure. I don't know much about Gates, but here is a 2011 panel discussion with him and others trying to find positive things to say about various "theory of everything" failures.

The latest Lubos Motl rant:

Contemporary West's far left "religions" are as dumb and devastating as radical Islam As recently as 5-10 years ago, I took it for granted that the fuzzy region referred to as the West had an advantage in comparison with the Muslim World that was way more important than the immediate wealth: the ability to think impartially, fairly, rationally, and calmly – a broader pattern of behavior that produces things like science, mathematics, and rigorous trials in the courtrooms as special branches. The Westerners looked so different from the Palestinians or black Africans or Indonesians... when it came to such things. And the rational, balanced judgement is ultimately the primary cause that gives rise to the potential to create wealth and happiness; it is more fundamental than the wealth and happiness themselves.

In recent years and especially months, I realized that it was necessary to revise this judgement. The West's mental superiority could have looked like a fact for decades or centuries but in the truly long-term perspective, it was probably just a mirage. The brainwashed leftists that are all around us seem to act and (fail to) think in a nearly isomorphic way to the most hardcore fundamentalist Islamists. Their relationships to the "authorities" like the far left TV stations are on par with the mindless Islamists' relationship to the mullahs. And the percentage of the lies and stupidities is about the same, too.

The amount of absolute insanity that is taking place – and that is clearly devouring tons of people around us – is so high that I increasingly insert whole days when I mostly isolate myself not only from the news on the Internet and in the "media" but also from all people who seem likely to be hopelessly brainwashed morons. I just really physically suffer when I am exposed to the human stupidity and its concentration in our environment is just unbelievable these days.

A few years ago I would have said that Lumo was losing his mind. But now I agree that the West and the major media have been taken over by brainwashed morons.

Monday, November 29, 2021

There is No Objective Probability

There are a lot of people who believe that the probabilities of classical mechanics are subjective, because the underlying processes are all deterministic, and the quantum probabilities are objective. The latter is sometimes called the propensity theory of probability.

On the other hand, Bayesians insist that probability is just an estimate of our beliefs.

A new paper tries to address the difference:

Forty some years ago David Lewis (1980) proposed a principle, dubbed the Principal Principle (PP), connecting rational credence and chance. A crude example that requires much refining is nevertheless helpful in conveying the intuitive idea. Imagine that you are observing a coin flipping experiment. Suppose that you learn -- for the nonce never mind how -- that the objective chance of Heads on the next flip is 1/2. The PP asserts that rationality demands that when you update your credence function on said information your degree of belief in Heads-on-the-next-flip should equal 1/2, and this is so regardless of other information you may have about the coin, such as that, of the 100 flips you have observed so far, 72 of the outcomes were Tails.
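To make the tension concrete, here is a small sketch (my own illustration, not from the paper) contrasting what a Bayesian would estimate from the frequency data with what the PP mandates once the objective chance is learned:

```python
# Hypothetical numbers from the example above: 100 flips, 72 tails.
heads, flips = 28, 100

# A Bayesian with a uniform Beta(1, 1) prior over the coin's bias
# would estimate P(heads) from the data alone:
posterior_mean_heads = (heads + 1) / (flips + 2)  # Beta posterior mean

# The Principal Principle says that once the agent learns the objective
# chance is 1/2, that information screens off the frequency data:
pp_credence = 0.5

print(round(posterior_mean_heads, 3))  # about 0.284
print(pp_credence)
```

The PP thus asks the agent to discard a data-driven estimate near 0.28 in favor of the announced chance of 0.5, which is exactly where the philosophical dispute begins.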

The large and ever expanding philosophical literature that has grown up around the PP exhibits a number of curious, disturbing, and sometimes jaw-dropping features. To begin, there is a failure to engage with the threshold issue of whether there is a legitimate subject matter to be investigated. Bruno de Finetti's (1990, p. x) bombastic pronouncement that "THERE IS NO PROBABILITY" was his way of asserting that there is no objective chance, only subjective or personal degrees of belief, and hence there is no need to try to build a bridge connecting credence to a mythical entity. Leaving doctrinaire subjectivism aside for the moment and assuming there is objective chance brings us to the next curious feature of the literature: the failure to engage with substantive theories of chance, despite the fact that various fundamental theories of modern physics -- in particular, quantum theory -- ostensibly speak of objective chance. Of course, as soon as one utters this complaint the de Finetti issue resurfaces since interpretive principles are needed to tease a theory of chance from a textbook on a theory of physics, and de Finetti's heirs -- the self-styled quantum Bayesians (QBians) -- maintain that the probability statements that the quantum theory provides are to be given a personalistic interpretation.

I am not sure that any of this makes any sense.

The only way I know to make rigorous sense out of probability is the Kolmogorov probability axioms.
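For reference, the axioms fit in three lines: probability is a measure P on a sigma-algebra of events, and nothing in them takes sides between subjective and objective interpretations.

```latex
% Kolmogorov's axioms: P is defined on a sigma-algebra \mathcal{F}
% of subsets of a sample space \Omega.
\begin{align*}
&\text{(1)}\quad P(A) \ge 0 \quad \text{for every event } A \in \mathcal{F},\\
&\text{(2)}\quad P(\Omega) = 1,\\
&\text{(3)}\quad P\Big(\bigcup_{i=1}^{\infty} A_i\Big)
   = \sum_{i=1}^{\infty} P(A_i)
   \quad \text{for pairwise disjoint } A_1, A_2, \ldots \in \mathcal{F}.
\end{align*}
```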

I don't believe there is any such thing as objective probability. It has never been an essential part of quantum mechanics.

Quantum mechanics is about observables. Probabilities are not observable. Believing in physical/objective/propensity probability goes against the spirit of the theory.

Monday, November 22, 2021

Carroll on Consciousness

Sean M. Carroll is very good at explaining textbook physics, but when he discusses his own beliefs, he has some wacky ideas. He believes in determinism and many-worlds theory.

(Yes, many-worlds is not deterministic, but that is not my point here.)

Here he debates panpsychism:

Theoretical physicist Sean Carroll joins us to discuss whether it make sense to think of consciousness as an emergent phenomenon, and whether contemporary physics points in this direction.
I see 3 possibilities.

1. There is no such thing as consciousness. Yes, we perceive all sorts of things, and act on those perceptions, but that's all.

2. Consciousness is real, and has a physical basis that may eventually be understood in terms of fundamental physics, chemistry, and biology.

3. Consciousness is in the mind or soul, and not the body, and is best understood in spiritual terms.

As a scientific reductionist, I lean towards (2), but the others are possible, especially since we don't even have a good definition of consciousness.

If (2) and scientific reductionism are true, and humans are composed of 10³⁰ or so quarks and electrons, then it seems plausible that each quark and electron has a little bit of consciousness.

Carroll's answer to this is that the behavior of electrons is completely determined by physical law, and so very strange changes to those laws would be needed to explain partially conscious electrons.

But our best laws of physics are not deterministic. Not in this universe, anyway. Those electrons could be partially conscious without any change to known laws.

Carroll has elaborated on his own blog:

The idea was not to explain how consciousness actually works — I don’t really have any good ideas about that. It was to emphasize a dilemma that faces anyone who is not a physicalist, someone who doesn’t accept the view of consciousness as a weakly-emergent way of talking about higher-level phenomena.

The dilemma flows from the following fact: the laws of physics underlying everyday life are completely known. They even have a name, the “Core Theory.” We don’t have a theory of everything, but what we do have is a theory that works really well in a certain restricted domain, and that domain is large enough to include everything that happens in our everyday lives, including inside ourselves. ...

That’s not to say we are certain the Core Theory is correct, even in its supposed domain of applicability.

He then launches into a discussion of zombies who appear to be just like conscious humans, but are not.

I don't see how this proves anything. He cannot define consciousness, and when he takes it away, people behave just the same. No. If consciousness means anything, it means that people would behave differently if they didn't have it.

Carroll calls his viewpoint physicalism, but it is really the opposite, as he refuses to accept a physical basis for consciousness.

I get why non-physicalists about consciousness are reluctant to propose explicit ways in which the dynamics of the Core Theory might be violated. Physics is really strong, very well-understood, and backed by enormous piles of experimental data. It’s hard to mess with that.
He is assuming that a theory of consciousness would violate the Core Theory, but I doubt it.

Dr. Bee explains why particles decay, and adds a consciousness argument:

the tau can decay in many different ways. Instead of decaying into an electron, a tau-neutrino and an electron anti-neutrino, it could for example decay into a muon, a tau-neutrino and a muon anti-neutrino. Or it could decay into a tau-neutrino and a pion. The pion is made up of two quarks. Or it could decay into a tau-neutrino and a rho. The rho is also made up of two quarks, but different ones than the pion. And there are many other possible decay channels for the tau. ...

The taus are exactly identical. We know this because if they weren’t, they’d themselves be produced in larger numbers in particle collisions than we observe. The idea that there are different versions of taus is therefore just incompatible with observation.

This, by the way, is also why elementary particles can’t be conscious. It’s because we know they do not have internal states. Elementary particles are called elementary because they are simple. The only way you can assign any additional property to them, call that property “consciousness” or whatever you like, is to make that property entirely featureless and unobservable. This is why panpsychism which assigns consciousness to everything, including elementary particles, is either bluntly wrong – that’s if the consciousness of elementary particles is actually observable, because, well, we don’t observe it – or entirely useless – because if that thing you call consciousness isn’t observable it doesn’t explain anything.

So you could have two identical tau particles, and one decays into an electron and 2 neutrinos, and the other decays into a muon and 2 neutrinos.
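This situation is easy to model: every tau enters the decay in the same state, and the theory supplies only channel probabilities. A sketch (the branching fractions below are approximate round numbers for the measured values, used here purely for illustration):

```python
import random

# Identical particles, different outcomes: quantum mechanics gives only
# the probability of each decay channel. Fractions are approximate.
CHANNELS = [
    ("e nu nu",  0.178),   # electron + 2 neutrinos
    ("mu nu nu", 0.174),   # muon + 2 neutrinos
    ("hadrons",  0.648),   # hadronic channels (pion, rho, etc.)
]

def decay(rng):
    """Sample one decay channel for a single tau."""
    r = rng.random()
    for channel, frac in CHANNELS:
        if r < frac:
            return channel
        r -= frac
    return CHANNELS[-1][0]  # guard against float round-off

rng = random.Random(0)
outcomes = [decay(rng) for _ in range(10)]
# Nothing in the model distinguishes the taus that ended up in
# different channels; the variation comes only from the sampling.
print(outcomes)
```

The model reproduces the statistics perfectly while saying nothing about *why* one tau went one way and its identical twin went another, which is exactly the gap the two possibilities below try to fill.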

And we have a Core Theory that explains the dynamics of everything that happens!

No, this is untenable. I see a couple of possibilities.

1. Those taus are not really identical. They have internal states that determine how they will decay.

2. The taus are identical, but they have some sort of conscious free will that allows them to choose how and when they decay.

Some physicists would say that the taus are identical and intrinsically random. But saying that is just a way of saying that we don't know whether it is possibility (1) or (2).

Dr. Bee gives an argument for the taus being identical. But we can never be sure that they are truly identical. Maybe they just appear identical in a particular quantum field theory, but that ignores a deeper reality.

I know it seems crazy to say that a tau particle has a little bit of conscious free will. But the alternatives are stranger.

Now where does human consciousness come from? Carroll says that it cannot come from anything in the Core Theory, because it is dynamically complete and there is no room for any panpsychism.

There is room. Quantum mechanics is not deterministic. It predicts probabilities because there are mysterious causal factors that it cannot account for.

Jerry Coyne says that Carroll decisively refutes panpsychism. I disagree. I say Carroll has the worse argument.

After writing this, I am surprised to see Lubos Motl take the side of panpsychism.

But even before QM, it was rather clear that panpsychism was needed in any scientific world view simply because there can't be any "metaphysically sharp" boundary between objects like humans that we consider conscious; and other objects. So some amount of the "consciousness substance" must be assigned to any object in Nature, otherwise we end up with a clearly scientifically ludicrous anthropocentric or anthropomorphic theory. ...

OK, Carroll hasn't noticed that the current "Core Theory" is actually quantum mechanical and therefore needs conscious observers to be applied. Much of his article is a circular reasoning ...

For the 9,877th time, he can only be a "physicalist" because he doesn't do science. If he were doing science, he would be abandoning theories that conflict with the observations. And because all classical i.e. P-world theories conflict with the observations, they are dead....

Carroll behaves exactly like you expect from a zombie: Sean Carroll is a simulation of a generic zombie

This isn't fair because Carroll's Core Theory is not classical mechanics. Carroll would say that he is very much a believer in quantum mechanics.

But Carroll doesn't really believe in textbook quantum mechanics. He believes in many-worlds theory, where there is no wave function collapse, no probabilities, no predicted events, no free will, and no correspondence with any scientific experiments.

Yes, I do think that many-worlds theory is fundamentally incompatible with science. It is worse than believing in astrology or witchcraft.

Nautilus has more on the pros and cons of panpsychism.

Wednesday, November 17, 2021

IBM now Claims Quantum Supremacy

Here is some skepticism about Google:
Now though, in a paper to be submitted to a scientific journal for peer review, scientists at the Institute of Theoretical Physics under the Chinese Academy of Sciences said their algorithm on classical computers completed the simulation for the Sycamore quantum circuits [possibly paywalled; alternative source of the same article] "in about 15 hours using 512 graphics processing units (GPUs)" at a higher fidelity than Sycamore's. Further, the team said "if our simulation of the quantum supremacy circuits can be implemented in an upcoming exaflop supercomputer with high efficiency, in principle, the overall simulation time can be reduced to a few dozens of seconds, which is faster than Google's hardware experiments".
I think this is why Scott Aaronson retracted his quantum supremacy blessing.

IBM had denied that Google reached quantum supremacy, and now makes its own supremacy claim:

IBM has created a quantum processor able to process information so complex the work can't be done or simulated on a traditional computer, CEO Arvind Krishna told "Axios on HBO" ahead of a planned announcement.

Why it matters: Quantum computing could help address problems that are too challenging for even today's most powerful supercomputers, such as figuring out how to make better batteries or sequester carbon emissions.

Driving the news: IBM says its new Eagle processor can handle 127 qubits, a measure of quantum computing power. In topping 100 qubits, IBM says it has reached a milestone that allows quantum to surpass the power of a traditional computer. "It is impossible to simulate it on something else, which implies it's more powerful than anything else," Krishna told "Axios on HBO...."

Krishna says the quantum computing push is one part of his approach to return the company to growth.

The comments about these news items are mostly skeptical, such as:
This is the third Quantum Computing BS story today. IBM literally "simulates" a quantum circuit and then claims it is superior to classical. Well, no shit sherlock. As I said before this is the equivalent of claiming a pebble tossed in water is super to a classical computer because it accurately and instantly shows the interference and refraction patterns. There you go, call it "pebble supremacy." Or a camera and flash photography setup is superior to a classical computer in rendering photorealistic ray-tracing. Technically yes, but in reality bullshit. ...

Indeed. The history of QC claims is an endless series of lies. ...

But the summary says it will help to sequester carbon. That must be true because IBM would never spew BS about something as important as sequestering carbon and the relevance of quantum computing to carbon sequestration is totally obvious to everyone. ...

Extraordinary Popular Delusions and the Madness of Crowds, Charles Mackay. ...

Is it me, or does this quantum computing stuff starting to sound like a bunch of hooey? What are they even going to calculate with it? It never seems to have any real world application. Why do I have this iMac on my desktop, when a quantum computer is billions of times better? Can they make a little one for me that's only a 1000 time better than my iMac? And how the hell is it gonna solve the battery problem. That's gonna take people making things and testing them? 127 qubits, my ass!

Aaronson weighs in on these new developments:
About IBM’s new 127-qubit superconducting chip: As I told New Scientist, I look forward to seeing the actual details! As far as I could see, the marketing materials that IBM released yesterday take a lot of words to say absolutely nothing about what, to experts, is the single most important piece of information: namely, what are the gate fidelities? How deep of a quantum circuit can they apply? How have they benchmarked the chip? Right now, all I have to go on is a stats page for the new chip, which reports its average CNOT error as 0.9388—in other words, close to 1, or terrible! ...

About the new simulation of Google’s 53-qubit Sycamore chip in 5 minutes on a Sunway supercomputer (see also here): This is an exciting step forward on the classical validation of quantum supremacy experiments, and—ironically, what currently amounts to almost the same thing—on the classical spoofing of those experiments. Congratulations to the team in China that achieved this! But there are two crucial things to understand. First, “5 minutes” refers to the time needed to calculate a single amplitude (or perhaps, several correlated amplitudes) using tensor network contraction. It doesn’t refer to the time needed to generate millions of independent noisy samples, which is what Google’s Sycamore chip does in 3 minutes.

I am inferring that there is still no consensus on whether quantum computers are possible. Maybe IBM will convince some people, and maybe not.

Update: Aaronson also makes some political comments in support of those trying to stop Woke Leftists from stifling all alternative views in academia. But he adds:

Just for the record, I also have never considered voting Republican. The way I put it recently is that, if the Republicans disavowed their authoritarian strongman and came around on climate change (neither of which they will), and if the Democrats continued their current descent into woke quasi-Maoism, my chance of voting Republican would surely increase to at least a snowball’s chance in hell, from its current individual snowflake’s chance in hell. 🙂
This shows that he is very much a part of the deranged Left. He thinks Trump was an authoritarian strongman, but Pres. Biden has been much more authoritarian. Differences in climate policy have been negligible.

Monday, November 15, 2021

Free Will can look Random to Others

An argument is commonly made that everything in the world is either deterministic or random, and this leaves no room for free will. I am surprised that otherwise intelligent men make this silly argument.

The argument essentially says that nothing good can come out of quantum randomness. But if that were true, then nothing good could come out of quantum computers.

While I have argued that quantum computers are useless, I do not make that silly argument. Of course quantum indeterminacy can be part of a useful process.

From a free will essay:

A second criticism of the indeterminacy argument is that it does not allow for the type of human choices that free will advocates need. The indeterminacy of electrons is a random thing, but genuinely free choices cannot be random: they are thoughtful and meaningful actions. If I am deciding between buying chocolate ice cream and vanilla and I randomly flip a coin to decide, that is an arbitrary action, not a free action. If in fact all of our actions were indeterminate in the way that electrons are, we would have nonstop spasms and convulsions, not meaningfully chosen actions. Rather than selecting either the chocolate ice cream or vanilla, I would start quivering like I am having a seizure. Thus, subatomic indeterminacy is no real help to the free will advocate. ...

Second, regarding the contention that indeterminacy will only produce random actions, this is not necessarily the case. Quantum computers do not result in arbitrary events, like memory chips catching on fire, or printers printing out gibberish. Rather, quantum phenomena are carefully introduced into precise spots within the computer’s hardware, the result being that it can perform tasks with enormously greater efficiency than any other existing computer. So too with quantum biology: the results are biological processes that perform highly complex tasks with great efficiency, such as global detection, vision, and photosynthesis. If evolution has in fact tied brain activity to quantum phenomena, it is reasonable to assume that it would similarly facilitate an important biological process with great efficiency. None of this proves the existence of free will through indeterminacy, but it at least offers a scientifically-respectable theory for how nature might have implanted within our brains the ability to have done otherwise.

Yes, I think it is possible that a reductionist microscopic analysis of free will would show some quantum indeterminacy.

The essence of free will is the ability to make a decision that others cannot predict. So your decision will look random to them. So saying that there is randomness in fundamental physics is an argument for free will, not against it.
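A loose analogy (my own, and only an analogy): a pseudorandom sequence is fully determined by hidden information, yet looks random to any observer who lacks it.

```python
import random

# Toy analogy for "free choices look random to others": the sequence is
# completely determined by a hidden seed, but an observer without the
# seed cannot predict it and sees what looks like coin flips.
hidden_seed = 12345  # known only to the "chooser"

def choices(n, seed):
    """Generate n yes/no choices from the hidden information."""
    rng = random.Random(seed)
    return [rng.random() < 0.5 for _ in range(n)]

# The chooser can reproduce the sequence exactly; the observer cannot.
assert choices(20, hidden_seed) == choices(20, hidden_seed)
print(choices(10, hidden_seed))
```

The point is only that "unpredictable to observers" and "random" are indistinguishable from the outside, so observed randomness cannot by itself rule out an underlying choice.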

Here is philosopher Massimo Pigliucci making an argument that free will is incoherent:

The next popular argument for a truly free will invokes quantum mechanics (the last refuge of those who prefer to keep things as mysterious as possible). Quantum events, it is argued, may have some effects that “bubble up” to the semi-macroscopic level of chemical interactions and electrical pulses in the brain. Since quantum mechanics is the only realm within which it does appear to make sense to talk about truly uncaused events, voilà!, we have (quantistic) free will. But even assuming that quantum events do “bubble up” in that way (it is far from a certain thing), what we gain under that scenario is random will, which seems to be an oxymoron (after all, “willing” something means to wish or direct events in a particular — most certainly not random — way). So that’s out as well.

It now begins to look like our prospects for a coherent sense of free will are dim indeed.

Pigliucci is wrong on many levels. We invoke quantum mechanics because it is our best physical theory, and hence makes the world less mysterious, not more mysterious.

Quantum mechanics is no more about "truly uncaused events" than any other theory. It makes predictions based on causality from past info and events. And an action from free will is not an uncaused event.

He says "random will" is an oxymoron, but a free choice does indeed appear random to someone who cannot predict that choice.

While this does not prove free will, it does refute certain arguments against free will.

Friday, November 12, 2021

Diversity Police come for CalTech

Nature magazine reports:
Human Betterment Foundation (HBF), one of the most prominent eugenics groups of its time, begun in Pasadena in the same decade that Caltech transformed from a sleepy small-town technical school into a science and engineering powerhouse. ...

In June 2020, shortly after the killing of George Floyd by police in Minneapolis, Minnesota, student groups including the Socialists of Caltech, a group to which Panangaden belongs, put the spotlight on Caltech’s most famous former president — Nobel-prizewinning physicist Robert Millikan — and his involvement with the Human Betterment Foundation as a trustee. ...

At Caltech, events progressed quickly last year. The institute assigned a committee to investigate its links to eugenics advocacy. And after a months-long process that sometimes pitted students against administrators, leaders decided in January to remove Millikan’s name and several others from prominence on campus. This week, they announced some of the names that will replace them.

It’s a meaningful move for Panangaden, who identifies as multiracial and disabled. “I find it important to rename the buildings just because I don’t want to have that constant reminder that the people who built this institution didn’t want me to be there, and didn’t even want me to exist.”

But she says it would be a hollow effort without further steps to address the institution’s diversity gaps.

I am not going to defend forced sterilization, but what does that have to do with George Floyd, affirmative action quotas, and a student identifying as socialist/multiracial/Indian/disabled/female?

Perhaps in another century, forced sterilization will be seen as morally similar to vaccination mandates or child support. So will today's scientists be eventually canceled if they advocated vax mandates?

Millikan was a great physicist, and a CalTech pioneer. He had no power to force any medical procedure on anyone. If he expressed some opinions about some proposed laws, why should anyone care now?

The worst accusation against Millikan is that he allowed his name on this pamphlet. The opinions expressed appear to be sincere policy suggestions to make the world better for everyone. It says:

There can be no question that a very large portion of feeblemindedness is due to inheritance. The same is true of mental disease
This is true, but people do not like to admit it. I think the critics of this pamphlet ought to explain exactly where it goes wrong. I have my own opinions, but I suspect that the leftist critics have other issues, and would rather not spell them out.

These weirdo students dig this stuff up as an excuse to make childish demands.

Thursday, November 11, 2021

Scientific American gets more Woke

Scientific American goes deeper into partisan politics and bogus science. Here is the latest:
The recent election of Glenn Youngkin as the next governor of Virginia based on his anti–critical race theory platform is the latest episode in a longstanding conservative disinformation campaign of falsehoods, half-truths and exaggerations designed to create, mobilize and exploit anxiety around white status to secure political power. The problem is, these lies work, and what it shows is that Democrats have a lot of work to do if they want to come up with a successful countermessage.

Conservatives have spent close to a century galvanizing white voters around the “dangerous” idea of racial equality.

Another headline is:
Many Neuroscience Conferences Still Have No Black Speakers
The illustration is of George Floyd.

The magazine used to be outstanding at getting to the heart of scientific issues. Not promoting racial tokenism.

Update: Jerry Coyne also comments on Scientific American again posting nonscientific political editorials.

I have no idea why Scientific American is publishing editorials that have absolutely nothing to do with science. Yes, they have gone woke, and yes, they’re circling the drain, and while they of course have the right to publish what they want, they’ve abandoned their mission to shill for the progressive Democrats.

The latest shrill editorial is a critique of CRT implying that those who oppose its teaching in schools in whatever form, and are in favor of anti-CRT bills, are white supremacists. If you don’t believe me, read the article below. ...

But why is Scientific American publishing this kind of debatable (and misleading) progressive propaganda? Why don’t they stick with science?  As a (former) scientist, I resent the intrusion of politics of any sort into scientific journals and magazines. If I want to read stuff like the above, well, there’s Vox and Teen Vogue, and HuffPost and numerous other venues.

I wonder how long Scientific American will last. . . . .

Monday, November 8, 2021

Relativity is Based on Geometry

The essence of relativity is a non-Euclidean geometry view of causality.

I have posted about the historical importance of geometry here and here, and stressed the importance to relativity here and here.

When physicists talk about non-Euclidean geometry in relativity, they usually mean gravitational masses curving space. If not that, they mean some clever formulas that relate hyperbolic space to velocity addition formulas and other aspects of special relativity. I mean something different, and more basic.

For an excellent historical summary, see The Non-Euclidean Style of Minkowskian Relativity, by Scott Walter. See also Wikipedia.

Euclidean geometry means 3-dimensional space (or Rⁿ) along with the metric given by the Pythagorean Theorem, and the symmetries given by rotations, translations, and reflections.

Other geometries are defined by some space with some metric-like structure, and some symmetry group.

Special relativity is spacetime with the metric dx² + dy² + dz² - c²dt², and the Lorentz group of symmetries. It is called the Poincare group if translations are included.

That's it. That's why the speed of light appears constant, why nothing can go faster, why we see a FitzGerald contraction, why times are dilated, why simultaneity is tricky, and everything else. They are all byproducts of living in a non-euclidean geometry.

I am not even referring to curved space, or hyperbolic space. I mean the minus sign in the metric that makes time so different from space. It gives flat spacetime a non-euclidean geometry.
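A tiny numeric sketch of the point (in units where c = 1, an assumption of the example): the quantity left unchanged by a Lorentz boost is the interval with the minus sign, not the Euclidean distance.

```python
import math

# Minkowski interval and a boost along x, in units where c = 1.
def interval2(t, x, y, z):
    """Squared spacetime interval: x^2 + y^2 + z^2 - t^2."""
    return x*x + y*y + z*z - t*t

def boost_x(t, x, y, z, v):
    """Lorentz boost with velocity v (|v| < 1) along the x axis."""
    gamma = 1.0 / math.sqrt(1.0 - v*v)
    return (gamma * (t - v*x), gamma * (x - v*t), y, z)

event = (5.0, 3.0, 1.0, 2.0)
boosted = boost_x(*event, v=0.6)
print(interval2(*event))    # -11.0
print(interval2(*boosted))  # same value, up to float rounding
```

The negative sign of the result marks the separation as timelike; the Euclidean "distance" of the boosted event changes, but this interval does not.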

Historically, H. Lorentz viewed relativity as an electromagnetism theory. He viewed everything as based on electromagnetism, so I am not sure he would have distinguished between a spacetime theory and an electromagnetism theory.

Now we view electromagnetism as just one of the four fundamental forces, but Lorentz was not far off. At the time, it was not known that electromagnetism underlies all of chemistry, so he was right to view it as much more pervasive than was commonly accepted.

Poincare was the first to declare relativity a spacetime theory, and to say it applies to electromagnetism or gravity or anything else. He was also the first to write the spacetime metric, and the Lorentz symmetry group. He did not explicitly say that it was a geometry, but that would have been obvious to mathematicians at the time.

I credit J.C. Maxwell with being the father of relativity. He was still alive in 1872 when the Erlangen Program for studying non-euclidean geometries was announced. He probably had no idea that it would provide the answer to what had been puzzling him most about electromagnetism.

Minkowski cited Poincare, and much more explicitly treated relativity as a non-euclidean geometry. He soon died, and his ideas caught on and were pursued by others.

Einstein understood in his famous 1905 paper that the inverse of a Lorentz transformation is another Lorentz transformation, and that the transformations can be applied to the kinematics of moving objects. Whether he got any of this from Poincare is hard to say. Einstein rejected the geometry view as the basis for relativity. He did partially adopt Minkowski's metric about 1913, but continued to reject the geometrization of relativity at least until 1925. Carlos Rovelli continues to reject it.
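That group property is easy to check numerically. A toy sketch of my own (units where c = 1, made-up velocities): the inverse of a boost by v is the boost by −v, and composing two boosts gives a single boost by the relativistic velocity sum.

```python
import math

def boost(t, x, v):
    """Lorentz boost by velocity v, in units where c = 1."""
    g = 1.0 / math.sqrt(1.0 - v * v)
    return g * (t - v * x), g * (x - v * t)

t, x = 2.0, 5.0

# Inverse: boosting by v and then by -v returns the original event.
t1, x1 = boost(*boost(t, x, 0.8), -0.8)
print(round(t1, 9), round(x1, 9))  # 2.0 5.0

# Closure: a boost by v1 followed by a boost by v2 equals one boost
# by the relativistic velocity sum (v1 + v2) / (1 + v1*v2).
v1, v2 = 0.5, 0.6
ta, xa = boost(*boost(t, x, v1), v2)
tb, xb = boost(t, x, (v1 + v2) / (1 + v1 * v2))
print(round(ta - tb, 9), round(xa - xb, 9))  # 0.0 0.0
```

So the boosts form a group, which is exactly what makes the Lorentz group a symmetry group in the Erlangen sense.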

The geometry is what enables relativity to explain causality. It is what made J.C. Maxwell's electromagnetism the first relativistic theory, and hence the first truly causal theory. Everything is caused by past events in the light cone. The non-euclidean geometry restricts the causality that way.

Science is all about reductionism, i.e., reducing observables to simpler components. Applied to space and time, it means locality. Objects depend only on nearby objects, and not distant ones. Events depend on recent events, and not those in the distant past. It is the Euclidean distance that defines which objects are nearby, and clocks define the recent past. But is an event recent if it occurred at a nearby place at a recent time?

Maybe yes, maybe no. That is where we need the non-euclidean geometry. The Minkowski metric is a fancy way of saying that the event was recent if light could have made the trip quickly.

If spacetime had a Euclidean geometry, then two events would be close if they are close in space and close in time. Causality does not work that way.

An event might seem to be spatially close, but if it is outside the light cone, then it cannot be seen, and it can have no causal effect. That is the consequence of the geometry.
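The light-cone rule is a one-line test. A minimal sketch (units where c = 1, my own toy coordinates): an event at time dt in our past, a distance dx away, can have affected us only if light could have covered that distance in that time.

```python
def in_past_light_cone(dt, dx):
    """Can an event dt before us, a spatial distance dx away, causally affect us? (c = 1)"""
    return dt > 0 and dt >= abs(dx)

print(in_past_light_cone(2.0, 1.0))  # True: light had time to make the trip
print(in_past_light_cone(0.1, 1.0))  # False: spatially close, but outside the cone
```

The second event is "nearby" by the Euclidean yardstick, yet causally invisible. That is the minus sign doing its work.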

Special relativity is usually taught based on the Michelson-Morley experiment on the motion of the Earth, and that is how it was historically discovered. It became widely accepted after Minkowski convinced everyone in 1908 that it was all geometry.

Today general relativity books emphasize the non-euclidean geometry of curved space, but the textbooks do not explain that the non-euclidean geometry of flat spacetime is at the very core of special and general relativity.

Tuesday, November 2, 2021

China claims Quantum Supremacy

News:
In July this year, a team in China demonstrated that it has the world’s most powerful quantum computer, finally leapfrogging Google, who claimed to have achieved quantum supremacy back in 2019. Back then, China was touting a super-advanced 66-qubit quantum supercomputer called “Zuchongzhi” as a contender against Google’s 54-qubit Sycamore processor. But while Google’s quantum computers have not progressed noticeably since then, China on the other hand never slowed down, coming up with more powerful quantum processors.

According to a recent study published in peer-reviewed journal Physical Review Letters and Science Bulletin, physicists in China claim they’ve constructed two quantum computers with performance speeds that far outrival competitors in the US or indeed anywhere in the world — debuting a superconducting machine along with a speedier unit that uses light photons to obtain unprecedented results.

I didn't read the papers, but they are still not computing anything. They just generate some random noise, and claim that it would be hard for a regular computer to simulate it.

It will be news if they actually compute something.

Wednesday, October 27, 2021

We are Late in an Interglacial Cycle

The US NOAA agency has posted this amazing chart showing what is known about glacial periods. This is mainstream science, and not a skeptic site. Think of it as a more complete version of the Hockey stick graph.

The yellow bars are interglacial periods. The larger white regions in between are glacial cycles, aka ice ages.

These cycles are driven by slow variations in the Earth's orbit.

These glacial–interglacial cycles have waxed and waned throughout the Quaternary Period (the past 2.6 million years). Since the middle Quaternary, glacial–interglacial cycles have had a frequency of about 100,000 years (Lisiecki and Raymo 2005). In the solar radiation time series, cycles of this length (known as “eccentricity”) are present but are weaker than cycles lasting about 23,000 years (which are called “precession of the equinoxes”).
Here are my amateur observations from the chart.

Long-term cosmological cycles are much more important than what humans have done so far.

We are overdue for another ice age.

Current CO2 levels are high, but the biggest increases were 10-15k years ago, long before humans could have influenced the climate.

As these slow variations appear to be exceptionally important for glaciation, climate, and life on Earth, the Earth's orbit must be finely-tuned for human life.

If the long-term trends hold up, we have more to fear from cooling than warming. However, if catastrophic greenhouse gas emissions cause runaway warming, then this chart tells us nothing about the future.

I am not a climate expert, and I do not know whether greenhouse gases are a problem. They probably are. I just want to understand the physics. It seems possible to me that the Industrial Revolution put out just enough CO2 to postpone an ice age.

Monday, October 18, 2021

Veritasium Video explains Many Worlds

Veritasium makes a lot of truly outstanding videos, and I recommend the channel for any readers here. But it made one titled "Parallel Worlds Probably Exist. Here's Why" about a year ago, and it leaves me scratching my head:
In the 1950's Hugh Everett proposed the Many Worlds interpretation of quantum mechanics. It is so logical in hindsight but with a bias towards the classical world, experiments and measurements to guide their thinking, it's understandable why the founders of quantum theory didn't come up with it. Rather than proposing different dynamics for measurement, Everett suggests that measurement is something that happens naturally in the course of quantum particles interacting with each other. The conclusion is inescapable. There is nothing special about measurement, it is just the observer becoming entangled with a wave function in a superposition. Since one observer can experience only their own branch, it appears as if the other possibilities have disappeared but in reality there is no reason why they could not still exist and just fail to interact with the other branches. This is caused by environmental decoherence.
It leads up to an interview of Sean M. Carroll, a many-worlds believer.

The concepts are explained pretty well, with good graphics.

Carroll says that many-worlds allows essentially anything to happen, such as him being USA President or an NBA champion, as long as it does not violate conservation of energy or some other such principle. He says that this is not so strange, because we should ignore low probability events. Also, if you believe in an infinite universe with eternal inflation or some such mechanism, then there could be infinitely many copies of yourself doing bizarre things in distant galaxies/universes anyway.

While the video exposes these silly arguments, it does not counter them.

Talking about infinite doppelgangers in infinite unobservable universes is no more scientific than discussing how many angels can dance on the head of a pin.

Carroll's use of probability was unchallenged, but the many-worlds theory has no way to say that any universe is more probable than any other. So while it is reasonable to ignore low probabilities, as Carroll says, there is nothing in the theory to say that those bizarro worlds have low probability.

Tuesday, October 12, 2021

Quantum Supremacy Claim Retracted

Scott Aaronson has just retracted his blessing for Google's quantum supremacy claims.

Quantum supremacy is the idea that a quantum computer could compute something much faster than a classical computer. Aaronson's big idea was that a quantum computer could just sample outputs from a complicated quantum random number generator, and that would pass as quantum supremacy if the classical computer could not simulate it efficiently.
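To see roughly how that passes for a benchmark, here is a toy sketch (my own made-up numbers, nothing like Google's actual circuits or scoring pipeline): a hypothetical 3-qubit device outputs bitstrings 0..7, and a verifier scores the samples against the circuit's ideal output probabilities using the linear cross-entropy measure.

```python
N = 3
# Made-up ideal output probabilities of some "complicated" 3-qubit circuit.
ideal = [0.40, 0.30, 0.15, 0.05, 0.04, 0.03, 0.02, 0.01]

def xeb(samples):
    """Linear cross-entropy benchmark: 2^N * (mean ideal prob of the samples) - 1."""
    return 2 ** N * sum(ideal[s] for s in samples) / len(samples) - 1

# Samples tracking the peaked ideal distribution score high; samples that look
# uniform (what cheap classical noise tends to produce) score near zero.
print(xeb([0, 0, 1, 0, 1, 2]))  # high, about 1.6
print(xeb(list(range(8))))      # about 0.0
```

The supremacy claim amounts to saying no classical computer can cheaply produce samples that score high on this test. The Chinese simulation result shows that, for Google's circuits, a classical computer can.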

That is what Google did two years ago, and Aaronson was the journal referee who approved the claim of quantum supremacy.

Now some Chinese researchers have shown that they can simulate Google's output on a classical computer. Aaronson says the Google team claims that they can keep changing the benchmark until they find one that the Chinese cannot simulate. He does not believe them.

Aaronson does not go as far as saying that Google's quantum supremacy is all a big hoax, but that's what I get out of his post. Read it yourself.

I am waiting for the quantum computers to compute something that is demonstrably difficult. That has not happened, and may never happen.

Monday, October 11, 2021

The Invention of Zero

When was the zero invented? It appears to have been around 0 AD, except that there was no such year. The year after 1 BC was 1 AD. It did not reach Europe until about 1200 AD.

I would have said that modern mathematicians agree that zero is a natural number, but I find that the world's smartest mathematician disagrees. He likes multiplicative number theory, where zero is avoided. I am pretty sure all the logicians would say that zero is a natural number.

Here is a new video on the subject: Is Zero More Than Nothing? Introducing the Zero Project

Closer To Truth 358K subscribers

Like the domestication of fire and the invention of the wheel, the concept of zero changed the course of human history. Yet zero’s origin remains shrouded in mystery.

Closer To Truth and Robert Lawrence Kuhn explore the mystery with The Zero Project, a group of international researchers searching for evidence of the invention of the numeral zero. Join the global expedition with expert guides from across diverse cultures and a range of specialty fields. In the quest for zero, visit far-flung places lost in time: North and South Africa, Central America, the Middle East, South-, Southeast, and Far East Asia.

Learn more about the project at www.zerorigindia.org

The invention of the numeral 0, and the number 0, seem like two different things. The numeral 0 was invented to allow base-10 positional representation, like 100, a huge advance over Roman numerals. The number 0 is the quantity 1 − 1.
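The numeral's job is just to hold an empty place. A minimal sketch in Python: writing a number positionally in base 10, digit by digit, forces you to have a symbol for places that hold nothing.

```python
def base10_numeral(n):
    """Write a positive integer in positional base-10 notation, digit by digit."""
    digits = []
    while n > 0:
        n, d = divmod(n, 10)
        digits.append(str(d))  # d is 0 whenever a place holds nothing
    return "".join(reversed(digits))

print(base10_numeral(100))   # 100 -- two zeros as placeholders
print(base10_numeral(2021))  # 2021
```

Roman numerals (C for 100) never need the placeholder, which is why arithmetic with them is so clumsy.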

I don't know how anyone can think clearly about anything, without the concept of zero.

The video says emptiness is important in Indian philosophy.

Tuesday, October 5, 2021

No Nobel Prize for Bell Test Experiments

Dr. Bee wrote:
I think a Nobel prize for the second quantum revolution is overdue. The people whose names are most associated with it are Anton Zeilinger, John Clauser, and Alain Aspect. They’ve been on the list for a Nobel Prize for quite some while and I hope that this year they’ll get lucky.
Many have been predicting Nobel prizes for these guys. And previously for David Bohm and John Bell, but they are now dead. See Bell test for a survey of this work.

They have been passed up again this year.

My guess is that the explanation is that they do not give prizes for merely confirming existing knowledge. These experiments had the potential of disproving quantum mechanics, and that is what was driving the work by Bell and others. But they just confirmed the 1927 theory.

They say these experiments prove how strange quantum mechanics is, because they show that it cannot be replaced by a local hidden variable theory. But again, that has been the consensus since about 1930. A Nobel prize for this would be like a prize for demonstrating energy conservation.

Some say that the Bell ideas provoked a lot of thinking about quantum information, "it from bit", and maybe even quantum computing.

But the maybe oddest thing to have come out of this is quantum teleportation. Quantum teleportation allows you to send quantum information with entangled states, even if you don’t yourself know the quantum information. ...

Quantum technologies have a lot of potential that we’re only now beginning to explore.

Dozens of Nobel prizes have been given for quantum theory. I just don't see quantum teleportation as important or interesting, either theoretically or practically.

Monday, October 4, 2021

A Theory is a Hypothetical Explanation

I occasionally see science popularizers make a big point about the meaning of the word "theory". For example, the American Museum of Natural History says:
In everyday use, the word "theory" often means an untested hunch, or a guess without supporting evidence.

But for scientists, a theory has nearly the opposite meaning. A theory is a well-substantiated explanation of an aspect of the natural world that can incorporate laws, hypotheses and facts. The theory of gravitation, for instance, explains why apples fall from trees and astronauts float in space. Similarly, the theory of evolution explains why so many plants and animals—some very similar and some very different—exist on Earth now and in the past, as revealed by the fossil record.

But you only hear this from those promoting biological evolution or climate change theories. Here is more typical scientific usage, from a recent Nature magazine podcast:
Theories is the right word for it, because no one is really sure. ... Recently there has been a new theory. ... two existing theories... [2:30]
That's right, scientists talk about competing theories all the time, and they certainly aren't all well-substantiated as they usually contradict each other.

There are also theories like String Theory, which have no substantiation, and do not even make any testable predictions.

Wikipedia is dominated by evolutionists and climate leftists who insist on defining:

In modern science, the term "theory" refers to scientific theories, a well-confirmed type of explanation of nature, made in a way consistent with scientific method, and fulfilling the criteria required by modern science. Such theories are described in such a way that scientific tests should be able to provide empirical support for it, or empirical contradiction ("falsify") of it. Scientific theories are the most reliable, rigorous, and comprehensive form of scientific knowledge,[1] in contrast to more common uses of the word "theory" that imply that something is unproven or speculative (which in formal terms is better characterized by the word hypothesis).[2]
No, the theories on that Nature podcast are not well-confirmed, and String Theory is not described in a way to make it testable.

In mathematics, a theory is a body of axioms, along with the theorems deducible from those axioms. They are usually computable and consistent, but may not match anything in the real world.