Saturday, December 14, 2019

Sick of Quantum Computing's Hype

Wired mag reports:
This spring, a mysterious figure by the name of Quantum Bullshit Detector strolled onto the Twitter scene. Posting anonymously, they began to comment on purported breakthroughs in quantum computing—claims that the technology will speed up artificial intelligence algorithms, manage financial risk at banks, and break all encryption. The account preferred to express its opinions with a single word: “Bullshit.” ...

In the subsequent months, the account has called bullshit on statements in academic journals such as Nature and journalism publications such as Scientific American, Quanta, and yes, an article written by me in WIRED. Google’s so-called quantum supremacy demonstration? Bullshit. Andrew Yang’s tweet about Google’s quantum supremacy demonstration? Bullshit. Quantum computing pioneer Seth Lloyd accepting money from Jeffrey Epstein? Bullshit. ...

The anonymous account is a response to growing anxiety in the quantum community, as investment accelerates and hype balloons inflate. Governments in the US, UK, EU, and China have each promised more than $1 billion of investment in quantum computing and related technologies. Each country is hoping to become the first to harness the technology’s potential to help design better batteries or to break an adversary’s encryption system, for example. But these ambitions will likely take decades of work, and some researchers worry whether they can deliver on inflated expectations—or worse, that the technology might accidentally make the world a worse place. “With more money comes more promises, and more pressure to fulfill those promises, which leads to more exaggerated claims,” says Bermejo-Vega.
The guy has to remain anonymous to avoid career consequences.

A reader sends this video (skip the first half, an interesting discussion of an unrelated topic) describing a guy who attacked M-theory as a failure, and it was career suicide. Besides attacking M-theory, he attacks the whole high-energy physics enterprise, as he says that the last useful new particle, the positron, was discovered back in 1932. (It has some medical uses.)

Abby Thompson is a tenured mathematician; otherwise, pointing out the corruption of diversity statements would have been career suicide. See this article from a few weeks ago, and these responses just published by the American Mathematical Society.

Thursday, December 12, 2019

Overtones of violence, neocolonialism and racism

Nature is one of the two top science journals in the world, and it just published this wacky letter:

We take issue with the use of ‘supremacy’ when referring to quantum computers that can out-calculate even the fastest supercomputers (F. Arute et al. Nature 574, 505–510; 2019). We consider it irresponsible to override the historical context of this descriptor, which risks sustaining divisions in race, gender and class. We call for the community to use ‘quantum advantage’ instead.

The community claims that quantum supremacy is a technical term with a specified meaning. However, any technical justification for this descriptor could get swamped as it enters the public arena after the intense media coverage of the past few months.

In our view, ‘supremacy’ has overtones of violence, neocolonialism and racism through its association with ‘white supremacy’. Inherently violent language has crept into other branches of science as well — in human and robotic spaceflight, for example, terms such as ‘conquest’, ‘colonization’ and ‘settlement’ evoke the terra nullius arguments of settler colonialism and must be contextualized against ongoing issues of neocolonialism.

Instead, quantum computing should be an open arena and an inspiration for a new generation of scientists.
I don't think this solves anything. Someone could still complain that white men are advantaged over other groups.

If quantum supremacy turns out to be a big fraud, and quantum supremacy is associated with neocolonialism, then maybe that will help discredit neocolonialism?

These sorts of ridiculous complaints have become commonplace now, and I have concluded that most of them are not sincere. They are not really offended by the phrase. They are just trying to exercise some political power.

Lubos Motl also criticizes the Nature letter, and others:
Well, I find it amazing that Nature that used to be a respectable journal is publishing similar lunacy from such despicable and intellectually empty activists these days. ...

Wow. Dr Preskill, aren't you ashamed of being this kind of a hardcore coward? How does it feel to be a pußy of sixteen (OK, 1000 in binary) zeroes? People who are far from being supreme?

I encourage readers from Caltech to spit at Prof Preskill, a spineless collaborationist with pure evil. Maybe he needs to start to drown in saliva to understand that pure evil shouldn't be supported in this way. ...

Let's hope that the NPCs will never open the U.S. Constitution because they would find 3 copies of the word "supremacy" there (two of them are in "national supremacy") and they would start to burn the book immediately.
He is overreacting a bit, but it is outrageous that a leading science journal publishes a social justice warrior demand that we stop using a perfectly good neutral word.

Monday, December 9, 2019

Applying covariance to white empiricism

A University of Chicago journal just published a wacky paper:
Making Black Women Scientists under White Empiricism: The Racialization of Epistemology in Physics

White empiricism is one of the mechanisms by which this asymmetry follows Black women physicists into their professional lives. Because white empiricism contravenes core tenets of modern physics (e.g., covariance and relativity), it negatively impacts scientific outcomes and harms the people who are othered. ...

Yet white empiricism undermines a significant theory of twentieth-century physics: General Relativity (Johnson 1983). Albert Einstein’s monumental contribution to our empirical understanding of gravity is rooted in the principle of covariance, which is the simple idea that there is no single objective frame of reference that is more objective than any other (Sachs 1993). All frames of reference, all observers, are equally competent and capable of observing the universal laws that underlie the workings of our physical universe. Yet the number of women in physics remains low, especially those of African descent ... Given that Black women must, according to Einstein’s principle of covariance, have an equal claim to objectivity regardless of their simultaneously experiencing intersecting axes of oppression, we can dispense with any suggestion that the low number of Black women in science indicates any lack of validity on their part as observers.
I am pretty sure this article is not intended to be a joke.

Covariance was not really Einstein's contribution. His papers do not show that he ever understood the arguments for covariance that Poincare made in 1905, and that Minkowski made in 1907. Einstein wrote a paper in 1914 arguing that covariance was impossible in relativity. It appears that Grossmann, Levi-Civita, and Hilbert convinced him of the merits of covariance.

Not everyone agrees that covariance, by itself, has physical significance. It is a mathematical concept, and it allows formulas in one frame to be converted to formulas in another frame. Poincare's "principle of relativity" is what says that inertial frames see the same physics.
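Formally, covariance is just the statement that tensor equations keep their form under a change of coordinates; a standard textbook version (not tied to the paper under discussion) is:

```latex
x'^\mu = \Lambda^\mu{}_\nu \, x^\nu, \qquad
T'^{\mu\nu} = \Lambda^\mu{}_\alpha \Lambda^\nu{}_\beta \, T^{\alpha\beta}
```

So if a tensor equation such as \(T^{\mu\nu} = 0\) holds in one frame, the same equation holds in every frame. That is the sense in which covariance converts formulas between frames.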

I try to stick to physics on this blog. Other fields are hopelessly corrupted with sloppy work and political ideology. Physics is supposed to have higher standards. I mention this paper because of its goofy relativity reasoning.

Update: Jerry Coyne criticizes this article, but takes it way too seriously.

Update: I had assumed that this woman is black, but her web site says:
I will not say yes to any invitations that clash with the Jewish High Holy Days (Rosh Hashanah, Shabbat Shuvah, and Yom Kippur) or the first two nights of Passover.
Maybe she married a Jew and converted.

Friday, December 6, 2019

Believe this vision of the world religiously

Physicist Chris Fuchs says in a Discover mag interview:
You’ve written critically about the Many Worlds (or Everettian) Interpretation of quantum mechanics. What are its main shortcomings?

Its main shortcoming is simply this: The interpretation is completely contentless. I am not exaggerating or trying to be rhetorical. It is not that the interpretation is too hard to believe or too nonintuitive or too outlandish for physicists to handle the truth (remember the movie A Few Good Men?). It is just that the interpretation actually does not say anything whatsoever about reality. I say this despite all the fluff of the science-writing press and a few otherwise reputable physicists, like Sean Carroll, who seem to believe this vision of the world religiously.

For me, the most important point is that the interpretation depends upon no particular or actual detail of the mathematics of quantum theory. No detail that is, except possibly on an erroneous analysis of the meaning of “quantum measurement” introduced by John von Neumann in the 1930s, which is based on a reading of quantum states as if they are states of reality. Some interpretations of quantum theory, such as the one known as QBism, reject that analysis. ...

The Many Worlds Interpretation just boils down to this: Whenever a coin is tossed (or any process occurs) the world splits. But who would know the difference if that were not true? What does this vision have to do with any of the details of physics? ...

You also object to the idea of multiple alternate worlds on a philosophical level, correct?

Depending in no way on the details of quantum theory, the Many Worlds Interpretation has always seemed to me as more of a comforting religion than anything else. It takes away human responsibility for anything happening in the world in the same way that a completely fatalistic, deterministic universe does, though it purportedly saves the appearance of quantum physics by having indeterministic chance in the branches.
I have been saying similar things here for years. I quit calling Many-Worlds an interpretation, because it is not even that. It doesn't even make any predictions. As he says, there is no content to it.

Tuesday, November 12, 2019

Eroding public trust in nutrition science

Harvard has responded to new research that red meat is harmless:
[N]utrition research is complex, and rarely do [its findings] reverse so abruptly. That's why it's so important to look beyond the headlines at the quality of the evidence behind the claims. Still, the publication of these new guidelines in such a prominent medical journal is unfortunate as it risks further harm to the credibility of nutrition science, eroding public trust in research as well as the recommendations they ultimately inform.
Funny how new research nearly always causes further harm to the credibility of nutrition science. Others say:
The misplaced low-fat craze of the 80's was the direct result of Harvard Professor Dr. Hegsted, who participated in the McGovern report that lead to dietary recommendation changes for Americans to eat more carbs in place of meat and fat, a recommendation that turned out to be based on "science" paid for by the sugar industry. Those recommendations caused an explosion of obesity, diabetes, heart disease, and cancer - all metabolic disorders caused by the insulin resistance that resulted from those recommended dietary changes.
My trust in nutrition science is nearly zero.

What do any of these people know about nutrition?

Physicians get a lot of respect for their medical opinions, and they probably deserve it most of the time. But most of them have never taken a course on nutrition, and don't know more than anyone else on the subject.

Everyone eats food, and so has opinions about food. Child-rearing is another subject where everyone has an opinion, but those opinions have almost no scientific value.

The nutrition research is so confusing that I don't know how to conclude that any food is healthier than any other food.

Sunday, November 10, 2019

Academic groupthink on paradigm shifts

Novelist Eugene Linden writes in a NY Times op-ed:
How Scientists Got Climate Change So Wrong ...

The word “upended” does not do justice to the revolution in climate science wrought by the discovery of sudden climate change. The realization that the global climate can swing between warm and cold periods in a matter of decades or even less came as a profound shock to scientists who thought those shifts took hundreds if not thousands of years. ...

In 2002, the National Academies acknowledged the reality of rapid climate change in a report, “Abrupt Climate Change: Inevitable Surprises,” which described the new consensus as a “paradigm shift.” This was a reversal of its 1975 report.
I wonder if he even realizes what these terms mean. A scientific revolution or paradigm shift was famously described by Thomas Kuhn as a change in thinking that is incommensurable with previous theories. That is, there is no data to say whether the new thinking is any better or worse than the old. Kuhn described scientists jumping to the new paradigm like a big fad, and not really based on any scientific analysis.

Of course it is all Donald Trump's fault:
computer modeling in 2016 indicated that its disintegration in concert with other melting could raise sea levels up to six feet by 2100, about twice the increase described as a possible worst-case scenario just three years earlier.
Computer models change that much in 3 years? That says more about the instability of the models than anything else.

If the Trump administration has its way, even the revised worst-case scenarios may turn out to be too rosy. ... But the Trump administration has made its posture toward climate change abundantly clear: Bring it on!
Trump is one of the most pro-science presidents we have ever had. Even tho he is widely hated in academia, we hardly ever hear any criticisms of how he has funded scientific work.

Trump has also over-funded quantum computing, and yet Scott Aaronson posts a rant against him. Everyone is entitled to his opinion, of course, but it seems clear to me that academia is dominated by a groupthink mentality that makes their opinions on climate or presidential politics useless.

Wednesday, November 6, 2019

Carroll plugs many-worlds in videos

Lex Fridman interviews Sean M. Carroll on his new quantum mechanics book.

Carroll says that there are three contenders for a QM interpretation: (1) many-worlds, (2) hidden-variables, and (3) spontaneous collapse.

None of these has a shred of empirical evidence. We know that hidden variable theories have to be non-local, and no one has ever observed such a nonlocality. Spontaneous collapse theories contradict quantum mechanics.

After some questions, he admitted another: (4) theories concerned with predicting experiments!

He derided (4) as "epistemic", and complained that those theories (like textbook Copenhagen quantum mechanics) are unsatisfactory because they just predict experiments, and fail to predict what is going on in parallel universes or ghostly unobserved particles.

He also complained that under (4), two different observers of a system might collect different data, and deduce different wave functions.

Yes, of course, that is the nature of science.

Carroll's problem is that he has a warped view of what science is all about. He badmouths theories that make testable predictions, and says that we should prefer a theory that somehow tells us about "reality", but doesn't actually make any testable predictions.

He is a disgrace to science.

Update: See also this Google Talk video, where Carroll makes similar points.

He compares his 3 leading interpretations of QM to the 3 leading Democrat contenders for the White House. Maybe that is a fair analogy, and the leading Democrat contenders are all unfit for office, for different reasons.

Thursday, October 31, 2019

Aaronson explains qubits in the NY Times

Scott Aaronson announces his New York Times op-ed on quantum supremacy. His own personal interest in this is greater than I thought, as he says the NY Times forced him to reveal:
Let’s start with applications. A protocol that I came up with a couple years ago uses a sampling process, just like in Google’s quantum supremacy experiment, to generate random bits. ... Google is now working toward demonstrating my protocol; it bought the non-exclusive intellectual property rights last year.
He was the outside reviewer of the Google paper published in Nature. So he had a big hand in the editorial decision to say that this was quantum supremacy. Aaronson claims the credit for Google confirming that quantum computers can be used for generating random numbers. And Google paid Aaronson for the privilege.

I am not accusing Aaronson of being crooked here. I am sure his motives are as pure as Ivory Soap. But he sure has a lot invested in affirming quantum supremacy based on random number generation. Maybe the Nature journal should have also required this disclosure.

He admits:
The computer revolution was enabled, in large part, by a single invention: the transistor. ... We don’t yet have the quantum computing version of the transistor — that would be quantum error correction.
So we don't have real qubits yet.

Aaronson has spent many years trying to convince everyone that there is a right way and a wrong way to explain qubits. Here is the wrong way:
For a moment — a few tens of millionths of a second — this makes the energy levels behave as quantum bits or “qubits,” entities that can be in so-called superpositions of the 0 and 1 states.

This is the part that’s famously hard to explain. Many writers fall back on boilerplate that makes physicists howl in agony: “imagine a qubit as just a bit that can be both 0 and 1 at the same time, exploring both possibilities simultaneously.”
So here is his better version:
Here’s a short version: In everyday life, the probability of an event can range only from 0 percent to 100 percent (there’s a reason you never hear about a negative 30 percent chance of rain). But the building blocks of the world, like electrons and photons, obey different, alien rules of probability, involving numbers — the amplitudes — that can be positive, negative, or even complex (involving the square root of -1). Furthermore, if an event — say, a photon hitting a certain spot on a screen — could happen one way with positive amplitude and another way with negative amplitude, the two possibilities can cancel, so that the total amplitude is zero and the event never happens at all. This is “quantum interference,” and is behind everything else you’ve ever heard about the weirdness of the quantum world.
Really? I may be dense, but I don't see that this is any better. He insists that the key is realizing that probabilities can be negative, or imaginary.

But this is just nonsense. There are no negative probabilities in quantum mechanics, or anywhere else.

We do have interference. Light does show interference patterns, as all waves do. There is nothing the slightest bit strange about waves showing interference. But Aaronson insists on saying that the interference comes from negative probabilities. I don't see how that is mathematically accurate, or helpful to understanding quantum mechanics.
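Aaronson's amplitude rule is easy to state in code. Here is a toy two-path example (the numbers are illustrative, not from his op-ed):

```python
import numpy as np

# Two paths to the same detector spot, each with amplitude of magnitude 1/sqrt(2).
# (Toy numbers for illustration only.)
a1 = 1 / np.sqrt(2)
a2 = -1 / np.sqrt(2)   # second path picks up a sign flip

# Quantum rule: add the amplitudes first, then square to get a probability.
p_quantum = abs(a1 + a2) ** 2          # the paths cancel

# Classical rule: add the probabilities of each path directly.
p_classical = abs(a1) ** 2 + abs(a2) ** 2   # no cancellation

print(p_quantum, p_classical)
```

Note that it is the amplitudes that go negative here, never the probabilities; adding amplitudes before squaring is what allows the cancellation.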

Wednesday, October 30, 2019

Perfect qubits would be amazing

The NY Times finally has an article on Google's quantum supremacy claim:
“Imagine you had 100 perfect qubits,” said Dario Gil, the head of IBM’s research lab in Yorktown Heights, N.Y., in a recent interview. “You would need to devote every atom of planet Earth to store bits to describe that state of that quantum computer. By the time you had 280 perfect qubits, you would need every atom in the universe to store all the zeros and ones.” ...

In contrast, many hundreds of qubits or more may be required to store just one of the huge numbers used in current cryptographic codes. And each of those qubits will need to be protected by many hundreds more, to protect against errors introduced by outside noise and interference.
Got that? 100 perfect qubits would give you more storage capacity than all the atoms on Earth.

But to store just one of the numbers used in crypto codes, you would need many 100s of qubits, as well as technological breakthrus to protect against errors.
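The arithmetic behind these claims is easy to check; here is a quick sketch (the atom count is a rough standard estimate, not a figure from the article):

```python
# Order-of-magnitude estimate for atoms in the observable universe.
ATOMS_IN_UNIVERSE = 10 ** 80

# An n-qubit state vector has 2**n complex amplitudes.
amps_100 = 2 ** 100   # about 1.3e30 amplitudes
amps_280 = 2 ** 280   # about 1.9e84 amplitudes

# 280 perfect qubits: more amplitudes than atoms in the observable universe,
# even before counting the many classical bits needed per amplitude.
print(amps_280 > ATOMS_IN_UNIVERSE)
```

The comparison for 100 qubits depends on how many bits of precision you charge for each amplitude, which is presumably how the atoms-on-Earth figure was reached.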

The catch here is the modifier "perfect". Nobody has made any perfect qubits, or any scalable qubits, or any qubits protected against errors from outside noise and interference.

All this talk of 53 qubits is a big scam. They don't even have 2 qubits.

Tuesday, October 29, 2019

Many-Worlds theory is not science

More and more physicists are endorsing the Many-Worlds theory, and I have criticized them many times on this blog. Lubos Motl has also defended Copenhagen and criticized MW, and he has finally gotten to the heart of the matter.

MW is not just a goofy interpretation. It turns a good scientific theory into something that is contrary to all of science. It eliminates the ability to make predictions.

Motl writes:
Even today, almost 90 years later, the anti-quantum zealots who are still around – depending on the degree of their stupidity – argue that quantum mechanics is either wrong or incomplete. The typical complaint that "quantum mechanics isn't complete" is formulated as follows:
But the Copenhagen Interpretation fails to tell us what is really going on before we look.
Well, in reality, quantum mechanics tells us everything that is happening before the observation: nothing that could be considered a fact is happening before (or in the absence of) an observation! It is an answer. You may dislike it but it's a lie to say that you weren't given an answer!

Needless to say, the statements are upside down. The Copenhagen Interpretation provides us with a definition which questions are physically meaningful; and with the method to determine the answers to these questions (which must be probabilistic and the axiomatic framework clearly and unambiguously says that no "unambiguous" predictions of the phenomena are possible in general).

Instead, it's the anti-quantum "interpretations" of quantum mechanics such as the Many Worlds Interpretation that are incomplete because
their axioms just don't allow you to determine what you should do if you want to calculate the probability of an outcome of an observation.

In particular, the Many Worlds Interpretation denies that there's any collapse following Born's rule (an axiom) but it is rather obvious that when you omit this only link between quantum mechanics and probabilities, the Many Worlds paradigm will become unable to actually predict these probabilities. You created a hole – (because the building block looked ideologically heretical to him) someone has removed something that was needed (in the Copenhagen paradigm) to complete the argumentation that normally ends with the probabilistic prediction.

This is an actually valid complaint because the primary purpose of science is to explain and predict the results of phenomena.
That's right. And if you support MW, you are abandoning the primary purpose of science. (I am avoiding the word "interpretation", because it is not really an interpretation. Calling it an interpretation is part of the hoax.)

Motl doesn't name names in this post, but an example is Sean M. Carroll. Motl probably has more distinguished physicists in mind, and doesn't want to embarrass them.

Belief in MW is so goofy as to discredit whatever other opinions its advocates might have. It is like believing in the Flat Earth, or that the Moon landings were faked.

Friday, October 25, 2019

Quantum measurement problem, explained

Dr. Bee explains the quantum measurement problem:
The problem with the quantum measurement is now that the update of the wave-function is incompatible with the Schrödinger equation. The Schrödinger equation, as I already said, is linear. That means if you have two different states of a system, both of which are allowed according to the Schrödinger equation, then the sum of the two states is also an allowed solution. The best known example of this is Schrödinger’s cat, which is a state that is a sum of both dead and alive. Such a sum is what physicists call a superposition.

We do, however, only observe cats that are either dead or alive. This is why we need the measurement postulate. Without it, quantum mechanics would not be compatible with observation. ...

Why is the measurement postulate problematic? The trouble with the measurement postulate is that the behavior of a large thing, like a detector, should follow from the behavior of the small things that it is made up of. But that is not the case. So that’s the issue. The measurement postulate is incompatible with reductionism. ...

I just explained why quantum mechanics is inconsistent. This is not a 'vague philosophical concern'.
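The linearity she describes can be checked numerically; here is a minimal sketch with an assumed two-level Hamiltonian (a toy example, not from her post):

```python
import numpy as np

# Toy two-level system with Hamiltonian H = Pauli-X (an assumed example).
# The time evolution U = exp(-iHt) has the closed form cos(t) I - i sin(t) X.
t = 0.7
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
U = np.cos(t) * I - 1j * np.sin(t) * X

psi1 = np.array([1, 0], dtype=complex)   # one allowed state
psi2 = np.array([0, 1], dtype=complex)   # another allowed state

# Linearity: evolving the sum equals the sum of the evolutions.
lhs = U @ (psi1 + psi2)
rhs = U @ psi1 + U @ psi2
print(np.allclose(lhs, rhs))
```

The measurement postulate's update, by contrast, is not linear in the state, which is the incompatibility she is pointing at.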
She also says QM is incomplete.

This so-called measurement problem is a 'vague philosophical concern' in the sense that it does not present any practical difficulties.

When you say a theory is inconsistent, that usually means that it allows computing two different outcomes for some proposed experiment. That never happens with QM.

To see that there is an inconsistency, you have to wade thru a discussion of not seeing cats that are alive and dead at the same time.

It is not clear that this problem is a problem.

If anything good comes out of quantum computing research, it could be a better reductionist understanding of quantum measurement. Quantum computers seek to string together qubits as much as possible without measuring them. Because the computation depends on this lack of measurement, maybe the experiments could tell us more precisely just what a measurement is.

But the quantum computing research has told us nothing of the kind. Good old QM/Copenhagen is the underlying theory for all these experiments, and we have no clue that the 1930 theory is not good enuf.

Wednesday, October 23, 2019

IBM explains why Google has a quantum flop

Wired reports:
IBM Says Google’s Quantum Leap Was a Quantum Flop ...

Monday, Big Blue’s quantum PhDs said Google’s claim of quantum supremacy was flawed. IBM said Google had essentially rigged the race by not tapping the full power of modern supercomputers. “This threshold has not been met,” IBM’s blog post says. Google declined to comment. ...

Whoever is proved right in the end, claims of quantum supremacy are largely academic for now. ... It's a milestone suggestive of the field’s long-term dream: That quantum computers will unlock new power and profits ...
Wired says "academic" because everyone quoted claims that quantum supremacy will soon be achieved.

But where's the proof?

Nobody believed the Wright brothers could fly until they actually got off the ground. Quantum supremacy was supposed to be a way of showing that quantum computers had gotten off the ground. If those claims are bogus, as IBM now claims to have proved, then no quantum computers have gotten off the ground. That "long-term dream" is pure speculation.

Update: Google is now bragging, as its paper appeared in Nature. I assumed that it was trying to get into either Science or Nature, but Nature says it does not object to the release of preprints. If so, Google could have addressed criticisms after the paper was leaked.

Quanta mag has an article on the Google IBM dispute:
Google stands by their 10,000 year estimate, though several computer experts interviewed for this article said IBM is probably right on that point. “IBM’s claim looks plausible to me,” emailed Scott Aaronson of the University of Texas, Austin. ...

Aaronson — borrowing an analogy from a friend — said the relationship between classical and quantum computers following Google’s announcement is a lot like the relationship in the 1990s between chess champion Garry Kasparov and IBM’s Deep Blue supercomputer. Kasparov could keep up for a bit, but it was clear he was soon going to be hopelessly outstripped by his algorithmic foe.

“Kasparov can make a valiant stand during a ‘transitional era’ that lasts for maybe a year or two,” Aaronson said. “But the fundamentals of the situation are that he’s toast.”
Following his analogy, IBM's Deep Blue did beat Kasparov, but not convincingly. Maybe the computer was lucky. It was really the subsequent advances by others that showed that computers were superior.

So Aaronson seems to be saying that this research does not prove quantum supremacy, but other research will soon prove it.

We shall see.

Meanwhile, let's be clear about what Google did. It made a random number generator out of near-absolute-zero electronic gates in entangled states. Then it made some measurements to get some random values. Then it said that the device could be simulated on a classical computer, but it would take more time. Maybe 10,000 years more, but maybe just a couple of hours more.

That's all. No big deal, and certainly not quantum supremacy.
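For concreteness, here is a toy version of that procedure with 3 qubits, assuming a Haar-random unitary in place of Google's gate sequence (a sketch, not a simulation of the actual 53-qubit experiment):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy random-circuit sampling with 3 qubits; brute force is easy here,
# unlike the 53-qubit case.
n = 3
dim = 2 ** n

# Draw a random unitary via QR decomposition of a random complex matrix,
# with the column phases fixed up.
M = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
Q, R = np.linalg.qr(M)
U = Q * (np.diag(R) / np.abs(np.diag(R)))

# Apply it to |000> and sample bitstrings from the output distribution.
state = U[:, 0]
probs = np.abs(state) ** 2
samples = rng.choice(dim, size=10, p=probs / probs.sum())
print([format(int(s), "03b") for s in samples])
```

The "supremacy" claim is only that, at 53 qubits, the brute-force step above becomes impractical for a classical machine; the output is still just random bitstrings.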

Update: Scott Aaronson weighs in, and admits that he was a Nature reviewer. He is happy because he is professionally invested in quantum supremacy being proved this way.

But he admits that Google's claim of 10k years is bogus, and that Google does not have any scalable qubits at all. Furthermore, the researchers cooked the circuits so that they would be hard to simulate classically, while being completely useless for actually doing a computation.

To actually compute something useful, Google would need scalable qubits with some fault-tolerance system, and Google is no closer to doing that.

It has long been known that there are quantum systems that are hard to simulate. The only new thing here is that Google says its system is programmable. I am not sure why that is a good thing, as it cannot be programmed to do anything useful.

Update: Aaronson argues:
But I could turn things around and ask you: do you seriously believe at this point that Nature is going to tell the experimenters, “building a QC with 53 qubits is totally fine — but 60? 70? no, that’s too many!”
The flaw in this argument is that they don't really have a QC with 53 qubits. They have a random number generator with 53 components that act enuf like qubits to generate random numbers.

Computing something useful is expected to require 10 million real (scalable) qubits. Yes, Nature may very well say you can have a quantum device to generate random numbers, but not get a quantum computational advantage.

Monday, October 21, 2019

Indian books on Superior and Inferior

I enjoy John Horgan's SciAm columns, especially when he expresses skepticism for fad scientific work. For example, he appears to be winning a bet that no Nobel Prize will be awarded for string theory.

He has his share of goofy ideas, such as his belief in abolishing war.

His latest column is a rant against scientific work on human races, as if all such work is inherently racist:
But no, he was condemning Watson’s critics, whom he saw as cowards attacking a courageous truth-teller. I wish I could say I was shocked by my host’s rant, but I have had many encounters like this over the decades. Just as scientists and other intellectuals often reveal in private that they believe in the paranormal, so many disclose that they believe in the innate inferiority of certain groups. ...

I once suggested that, given the harm done by research on alleged cognitive differences between races, it should be banned. I stand by that proposal. I also agree with Saini that online media firms should do more to curb the dissemination of racist pseudoscience. “This is not a free speech issue,”
Really? Scientists and intellectuals often reveal in private that they believe in the paranormal? I doubt that.

My guess is that he is just using "paranormal" as a word to cover beliefs he does not recognize.

I am no expert in race research, but there is a lot of it, and I cannot believe it is all bogus.
I read Superior: The Return of Race Science by British journalist Angela Saini (who is coming to my school Nov. 4, see Postscript). Superior is a thoroughly researched, brilliantly written and deeply disturbing book. It is an apt follow-up to Saini’s previous book, Inferior, which explores sexism in science (and which I wrote about here and here). Saini calls “intellectual racism” the “toxic little seed at the heart of academia. However dead you might think it is, it needs only a little water, and now it’s raining.”
British? She has Indian parents, and India is famous for its racial/caste divisions. And its sexism too, for that matter.

Her books are mostly politics, not science. The favorable reviews just show how science has been corrupted. Here is how she writes:
If anything, the public debate around race and science has sunk into the mud. To state even the undeniable fact that we are one human species today means falling afoul of a cabal of conspiracy theorists. The “race realists,” as they call themselves online, join the growing ranks of climate change deniers, anti-vaxxers and flat-earthers in insisting that science is under the yoke of some grand master plan designed to pull the wool over everyone’s eyes. In their case, a left-wing plot to promote racial equality when, as far as they’re concerned, racial equality is impossible for biological reasons.

How did we get here? How did society manage to create so much room for those who genuinely believe that entire nations have innately higher or lower cognitive capacities,
Maybe because some nations have achieved much more than other nations?
What has started with a gentle creep through the back door of our computers could end, if we’re not careful, with jackboots through the front door of our homes. Populism, ethnic nationalism and neo-Nazism are on the rise worldwide.
No, this is just leftist paranoia. Neo-Nazism does not even exist, as far as I know.

Saturday, October 19, 2019

Overhyped Google announcement imminent

News on the quantum physics grapevine, Frankfurt Institute theoretical physicist Sabine Hossenfelder tells me, is that Google will announce something special next week: Their paper on achieving quantum supremacy, the realization of a quantum computer that outdoes its conventional counterpart. ...

It’s nothing to get too excited about yet. “This” — NISQ — “is really a term invented to make investors believe that quantum computing will have practical applications in the next decades or so,” Hossenfelder says. “The trouble with NISQs is that while it is plausible that they soon will be practically feasible, no one knows how to calculate something useful with them.” Perhaps no one ever will. “I am presently quite worried that quantum computing will go the same way as nuclear fusion, that it will remain forever promising but never quite work.”
We know that the Sun gets its energy from nuclear fusion. We don't know that quantum speedups are even possible.

Thursday, October 17, 2019

Rovelli: Neither Presentism nor Eternalism

Physicist Carlo Rovelli writes in support of Neither Presentism nor Eternalism:
Shortly after the formulation of special relativity, Einstein's former math professor Minkowski found an elegant reformulation of the theory in terms of the four dimensional geometry that we call today Minkowski space. Einstein at first rejected the idea. (`A pointless mathematical complication'.) But he soon changed his mind and embraced it full heart, making it the starting point of general relativity, where Minkowski space is understood as the local approximation to a 4d, pseudo-Riemannian manifold, representing physical spacetime.

The mathematics of Minkowski and general relativity suggested an alternative to Presentism: the entire 4d spacetime is `equally real now', and becoming is illusory. This I call here Eternalism.
Others make this argument that relativity implies an eternalist philosophy of time. I disagree. You can describe spacetime with either Galilean or Lorentz transformations. If spacetime talk is eternalist, then it is eternalist either with or without relativity.
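To make the point concrete, here are the two boost laws side by side. In both cases the events form the same 4d set; only the coordinate change differs, so neither picture forces eternalism:

```latex
\begin{align}
\text{Galilean:} \quad & t' = t, & x' &= x - vt \\
\text{Lorentz:}  \quad & t' = \gamma\!\left(t - \frac{vx}{c^2}\right),
  & x' &= \gamma\,(x - vt), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
\end{align}
```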

Note that Rovelli is compelled to make his relativity story all about Einstein, even though Einstein had nothing to do with the issue at hand. Minkowski did not reformulate Einstein's theory, as it is not clear that Minkowski was ever influenced by anything Einstein wrote. Spacetime relativity was first published by Poincare, and Minkowski cited Poincare.

Rovelli ends up wanting some compromise between presentism and eternalism, as both views are really just philosophical extremes to emphasize particular ways of thinking about time. This might seem obvious, except that there are a lot of physicists who say that relativity requires eternalism.