## Tuesday, December 26, 2017

### The corruption of real science

I stumbled across this book from 5 years ago:

Not even trying: the corruption of real science

Bruce G Charlton

University of Buckingham Press: Buckingham, UK. 2012

Briefly, the argument of this book is that real science is dead, and the main reason is that professional researchers are not even trying to seek the truth and speak the truth; and the reason for this is that professional ‘scientists’ no longer believe in the truth - no longer believe that there is an eternal unchanging reality beyond human wishes and organization which they have a duty to seek and proclaim to the best of their (naturally limited) abilities. Hence the vast structures of personnel and resources that constitute modern ‘science’ are not real science but instead merely a professional research bureaucracy, thus fake or pseudo-science; regulated by peer review (that is, committee opinion) rather than the search-for and service-to reality. Among the consequences are that modern publications in the research literature must be assumed to be worthless or misleading and should always be ignored. In practice, this means that nearly all ‘science’ needs to be demolished (or allowed to collapse) and real science carefully rebuilt outside the professional research structure, from the ground up, by real scientists who regard truth-seeking as an imperative and truthfulness as an iron law.

This is over-stated, but there is some truth to what he says.

## Sunday, December 24, 2017

### A review of Voigt's transformations

Here's a new paper, A review of Voigt's transformations in the framework of special relativity:

In 1887 Woldemar Voigt published the paper "On Doppler's Principle," in which he demanded covariance to the homogeneous wave equation in inertial reference frames, assumed the invariance of the speed of light in these frames, and obtained a set of spacetime transformations different from the Lorentz transformations. Without explicitly mentioning so, Voigt applied the postulates of special relativity to the wave equation. Here, we review the original derivation of Voigt's transformations and comment on their conceptual and historical importance in the context of special relativity. We discuss the relation between the Voigt and Lorentz transformations and derive the former from the conformal covariance of the wave equation.

I have posted a lot on the history of special relativity, without saying much about Voigt. He deserves credit for being the first to derive a version of the Lorentz transformations.

Unfortunately, no one appreciated the significance of what he had done, including himself.

Voigt's paper did not have much influence on historical development of special relativity by others, but the same could be said of Einstein's 1905 paper. The historical chain of special relativity ideas went from Maxwell to Michelson-Morley to Lorentz to Poincare to Minkowski to textbooks. Voigt, FitzGerald, Larmor, and Einstein were minor players.
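For reference, with β = v/c and γ = 1/√(1−β²), the transformations Voigt derived in 1887 can be written as:

```latex
x' = x - vt, \qquad
y' = \frac{y}{\gamma}, \qquad
z' = \frac{z}{\gamma}, \qquad
t' = t - \frac{v x}{c^{2}}
```

This is just the Lorentz transformation multiplied by an overall factor of 1/γ. The homogeneous wave equation is covariant under both, since it fixes the transformation only up to a conformal scale factor, which is the point the paper's abstract makes.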

## Sunday, December 17, 2017

### Aaronson ducks QC hype for a year

I keep noting that we are now reaching the point where we find out whether quantum computing is all a big fraud. The supposed smart money has been saying that we would have quantum supremacy by the end of this year, 2017. If we still don't have it a year later, then journalists are going to start to wonder if we have all been scammed.

Wary about this crisis, Scott Aaronson is hiding out:

So then and there, I swore an oath to my family: that from now until January 1, 2019, I will be on vacation from talking to journalists. This is my New Years resolution, except that it starts slightly before New Years. Exceptions can be made when and if there’s a serious claim to have achieved quantum supremacy, or in other special cases. By and large, though, I’ll simply be pointing journalists to this post, as a public commitment device to help me keep my oath.

His point here is that explanations of quantum computing often say that a qubit is a 0 and 1 at the same time, like a Schrodinger cat is dead and alive simultaneously, and therefore searching multiple qubits entails searching exponentially many values in parallel.

I should add that I really like almost all of the journalists I talk to, I genuinely want to help them, and I appreciate the extreme difficulty that they’re up against: of writing a quantum computing article that avoids the Exponential Parallelism Fallacy and the “n qubits = 2^n bits” fallacy and passes the Minus Sign Test, yet also satisfies an editor for whom even the so-dumbed-down-you-rip-your-hair-out version was already too technical.
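Those two fallacies can be made concrete with a minimal statevector sketch in plain Python (no quantum library; the uniform superposition is just an illustrative choice): n qubits take 2^n amplitudes to describe, yet a measurement returns only n classical bits.

```python
import random

n = 3
dim = 2 ** n                      # 2^n = 8 amplitudes describe the state
amps = [1 / dim ** 0.5] * dim     # uniform superposition over all outcomes

# A measurement samples ONE n-bit outcome, with probability |amplitude|^2.
probs = [a * a for a in amps]
outcome = random.choices(range(dim), weights=probs)[0]
bits = format(outcome, f"0{n}b")  # only n classical bits come out
print(bits)
```

So the 2^n values are not searched "in parallel" in any readable sense; extracting an answer from the amplitudes is the whole difficulty.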

Scott insists on telling journalists that it is possible that quantum computers will occupy a complexity class that is faster than Turing machines but slower than full exponential search. He even gave a TED Talk entirely devoted to making this obscure technical point. I don't know why this silly point is so important. It is true that quantum computers get their hypothetical power from entangling 0s and 1s.

Scott is one of the few experts in this field who are honest enuf to admit that quantum supremacy has not been achieved. My guess is that he just doesn't want to be a professional naysayer for all his QC friends. He doesn't want to be the one quoted in the press saying that some over-hyped research is not what it pretends to be. Note that he is willing to talk to the press if someone really does achieve quantum supremacy.

His other gripe about the "Minus Sign Test" is just as ridiculous. He says that an explanation of quantum mechanics should explain that particles have wave-like properties, including destructive interference. He doesn't quite say it that way, because most explanations do mention destructive interference. His specific gripe is that he wants the destructive interference explained with an analogy to negative probabilities.

The trouble with his version of quantum mechanics is that there are not really any negative probabilities in quantum mechanics. The probabilities of quantum mechanics are exactly the same as classical probabilities. The minus signs are wave amplitudes, and destructive interference occurs when two amplitudes meet with opposite signs. This is a property of all waves, and not just quantum mechanical waves. I think that he is misleading journalists when he acts as if minus signs are the essence of quantum mechanics.
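The point about signs is easy to demonstrate with a toy wave calculation in plain Python (no physics library; the numbers are illustrative): intensities add classically, while amplitudes add before squaring, so opposite signs cancel.

```python
# Two waves arriving at one point, each carrying intensity 0.5.
i1 = i2 = 0.5

# Classically, intensities (or probabilities) simply add:
i_classical = i1 + i2               # 1.0

# For waves, amplitudes add BEFORE squaring, and the sign matters:
a1 = i1 ** 0.5                      # +sqrt(0.5)
a2_same = i2 ** 0.5                 # same sign
a2_opp = -(i2 ** 0.5)               # opposite sign

i_constructive = (a1 + a2_same) ** 2   # ~2.0: brighter than the sum
i_destructive = (a1 + a2_opp) ** 2     # 0.0: complete cancellation
print(i_classical, i_constructive, i_destructive)
```

Nothing here is specifically quantum; the same arithmetic describes interfering water or light waves.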

If any journalists get rejected by Scott and call me instead, my answer is simple. If a researcher claims to have a quantum computer, then ask for a peer-reviewed paper saying that quantum supremacy has been demonstrated. Otherwise, these guys are just making Turing machines.

Update: Stephen Hsu writes:

I received an email from a physicist colleague suggesting that we might be near a "tipping point" in quantum computation.

Yeah, that is what the quantum computer enthusiasts want you to believe. The big breakthrough is coming any day now. I don't believe it.

## Friday, December 15, 2017

### IBM signs up banks for QC

A reader alerts me to this Bloomberg story:

International Business Machines Corp. has signed up several prominent banks as well as industrial and technology companies to start experimenting with its quantum computers. ...

IBM is competing with Alphabet Inc.’s Google, Microsoft Corp., Intel Corp., Canadian company D-Wave Systems Inc. and California-based Rigetti Computing as well as a number of other small start-ups to commercialize the technology. Many of these companies plan to offer access to quantum computers through their cloud computing networks and see it as a future selling point.

For now, quantum computers still remain too small and the error rates in calculations are too high for the machines to be useful for most real-world applications. ...

IBM and the other companies in the race to commercialize the technology, however, have begun offering customers simulators that demonstrate what a quantum computer might be able to do without errors. This enables companies to begin thinking about how they will design applications for these machines.

Note that this is all still just hype, prototypes, simulators, and sales pitches.

A few months ago, we were promised quantum supremacy before the end of this year. There are only two weeks left.

## Monday, December 11, 2017

### Second Einstein book update

I published a book on Einstein and relativity, and this blog has had some updates:

Einstein book update: a 2013 outline of posts that explain aspects of relativity written after the book.

Einstein did not discover relativity: a 2017 outline of arguments against Einstein's priority.

Einstein agreed with the Lorentz theory: Einstein's 1905 relativity was a presentation of Lorentz's theory, as Einstein agreed with Lorentz on every major point. See also Calling the length contraction psychological, where Einstein published a 1911 paper rejecting a geometric interpretation of special relativity, and insisting that he still agreed with Lorentz's interpretation.

The geometrization of physics: Einstein is largely idolized for geometrizing physics, but he had nothing to do with it, and even opposed it when published by others. See also Geometry was backbone of special relativity.

History of general relativity: link to a good historical paper summarizing the steps leading to general relativity. A couple of the steps are credited to Einstein, but the biggest steps are due to others.

In particular, I found much more evidence against two common claims:

that Einstein had a superior theoretical understanding of relativity, and

that Einstein's work was important for the development and acceptance of relativity.

In fact, Poincare, Minkowski, and Hilbert had superior geometrical interpretations, and Einstein rejected them. Special relativity became accepted very rapidly thru the work of Lorentz, Poincare, and Minkowski, and Einstein's 1905 paper had very little influence on anyone.

Update: I should have also added that I have much more evidence against the common claim:

that Poincare subscribed to Lorentz's interpretation of relativity.

While I explain in the book that this is not true, see Poincare was the new Copernicus, where I show that Poincare was presenting a view radically different from Lorentz's. See also my comment below, where I elaborate on just what Poincare meant.

Poincare and later Minkowski explicitly claimed modern geometrical interpretations of relativity that sharply break from Lorentz, and Einstein did not.
