
Monday, April 27, 2015

Poincare applied theory to wireless telegraphy

A new paper on Poincare's forgotten conferences on wireless telegraphy:
At the beginning of the twentieth century, while Henri Poincaré (1854-1912) was already deeply involved in the development of wireless telegraphy, he was invited, in 1908, to give a series of lectures at the Ecole Supérieure des Postes et Télégraphes (today Sup’Télecom). In the last part of his presentation he established that the necessary condition for the existence of a stable regime of maintained oscillations in a device of radio engineering completely analogous to the triode (the singing arc) is the presence in the phase plane of a stable limit cycle.

The aim of this work is to prove that the correspondence highlighted by Andronov between the periodic solution of a non-linear second order differential equation and Poincaré’s concept of limit cycle had been carried out by Poincaré himself, twenty years earlier, in these forgotten lectures of 1908.
This work was intended for engineers, and was not included in his complete works.
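
In modern terms, the oscillator Poincaré analyzed is the kind that Andronov later treated with the Van der Pol equation x'' - mu(1 - x^2)x' + x = 0, which models the singing arc and the triode alike. Here is a minimal numerical sketch (my illustration, not anything from the paper): trajectories started inside and outside the closed orbit both converge onto the same stable limit cycle, of amplitude about 2.

```python
# Maintained oscillations as a stable limit cycle, using the Van der Pol
# equation x'' - mu*(1 - x^2)*x' + x = 0 as a stand-in for the
# singing-arc / triode dynamics (mu = 1).

def deriv(x, v, mu=1.0):
    """Phase-plane vector field: returns (x', v')."""
    return v, mu * (1.0 - x * x) * v - x

def peak_amplitude(x, v, steps=20000, dt=0.01, tail=5000):
    """Integrate with classical RK4; report max |x| over the final steps."""
    peak = 0.0
    for i in range(steps):
        k1 = deriv(x, v)
        k2 = deriv(x + 0.5 * dt * k1[0], v + 0.5 * dt * k1[1])
        k3 = deriv(x + 0.5 * dt * k2[0], v + 0.5 * dt * k2[1])
        k4 = deriv(x + dt * k3[0], v + dt * k3[1])
        x += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        v += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
        if i >= steps - tail:
            peak = max(peak, abs(x))
    return peak

print(peak_amplitude(0.1, 0.0))  # small start spirals out, settles near 2.0
print(peak_amplitude(4.0, 0.0))  # large start spirals in, settles near 2.0
```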

Physics books often discount Poincare as a mathematician who did not really understand physics, such as electromagnetism. However, the record shows that he understood it better than Einstein or anyone else in Europe:
During the last two decades of his life, Poincaré had been involved in much research on the propagation of electromagnetic waves. In 1890, he wrote to Hertz to report a miscalculation in his famous experiments. Three years later, he solved the telegraphists' equation [Poincaré, 1893]. The following year he published a book entitled: “Oscillations électriques” [Poincaré, 1894] and in 1899 another one: “La Théorie de Maxwell et les oscillations hertziennes” [Poincaré, 1899]. This book, also published in English and German in 1904 and reprinted in French in 1907, has been considered a reference. In Chapter XIII, page 79, Poincaré stated that the singing arc and the Hertz spark gap transmitter were also analogous, except that oscillations are maintained in the first and damped in the second. Thus, from the early twentieth century until his death, Poincaré continued his research on wireless telegraphy and on maintained waves and oscillations [Poincaré, 1901, 1902, 1903, 1904, 1907, 1908, 1909abcde, 1910abc, 1911, 1912].

Thursday, April 23, 2015

Bell doubted quantum mechanics and relativity

Physicist John Stewart Bell was famous for Bell's theorem, and considered by many a great genius who should have gotten a Nobel prize.

A physics article says:
But despite the ascendancy of the Copenhagen interpretation, the intuition that physical objects, no matter how small, can be in only one location at a time has been difficult for physicists to shake. Albert Einstein, who famously doubted that God plays dice with the universe, worked for a time on what he called a "ghost wave" theory of quantum mechanics, thought to be an elaboration of de Broglie's theory. In his 1976 Nobel Prize lecture, Murray Gell-Mann declared that Niels Bohr, the chief exponent of the Copenhagen interpretation, "brainwashed an entire generation of physicists into believing that the problem had been solved." John Bell, the Irish physicist whose famous theorem is often mistakenly taken to repudiate all "hidden-variable" accounts of quantum mechanics, was, in fact, himself a proponent of pilot-wave theory. "It is a great mystery to me that it was so soundly ignored," he said.
It got this comment:
The author of pilot-wave theory, Louis de Broglie, was an aetherist, and he wrote, for example:
"any particle, even isolated, has to be imagined as in continuous "energetic contact" with a hidden medium ... It certainly is of quite complex character. It could not serve as a universal reference medium, as this would be contrary to relativity theory."
It's not surprising that the guy promoting the aether, and even doubting relativity theory, got ignored without mercy for a century by mainstream physics.
People assume that Bell was a champion of quantum mechanics, but actually he and his early followers were trying to prove quantum mechanics wrong. His theorem gave a way to distinguish quantum mechanics from hidden variable theories. Subsequent experiments verified quantum mechanics, and ruled out hidden variable theories.

The de Broglie-Bohm pilot wave theory is considered an interpretation of quantum mechanics, but it is really only an interpretation of some non-relativistic special cases. It is supposedly more intuitive because it tells you where the electron really is, but it also requires you to believe in action-at-a-distance ghosts that accompany the electrons. So it is a lot more bizarre than quantum mechanics.

Bell did not believe in relativity as a spacetime theory either, as he is quoted in this 2009 paper:
If it is just long enough to span the required distance initially, then as the rockets speed up, it will become too short, because of its need to Fitzgerald contract, and must finally break. It must break when, at a sufficiently high velocity, the artificial prevention of the natural contraction imposes intolerable stress.
In the Minkowski spacetime view of relativity, the FitzGerald contraction is a geometric illusion, and does not cause any stresses. Bell's explanation is wrong, as explained in that paper.
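
For concreteness, the kinematics at issue (standard textbook relations, not taken from that paper): a rod of rest length $L_0$ moving at speed $v$ has coordinate length

$$ L = L_0 \sqrt{1 - v^2/c^2}. $$

In Bell's setup the rockets hold their launch-frame separation fixed, so a string spanning them must have its rest length stretched by the factor $1/\sqrt{1 - v^2/c^2}$. Both sides agree on that number; the dispute is over whether to describe the breaking as the contraction "causing stress" (Bell) or as a feature of the spacetime geometry of the two worldlines.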

Relativity and quantum mechanics are the pillars of twentieth-century physics. It is good to have skeptics who keep the mainstream physicists honest, I guess, but nobody gets a Nobel prize for that. If experiments had overthrown the conventional wisdom, then Bell would be hailed as a great genius, but that did not happen. In the end, all Bell had was an argument against hidden variables that was more persuasive than the existing arguments that had already convinced everyone.

I use the painting The Son of Man because it is the symbol of the John Stewart Bell Prize. Something about reality being obscured, I guess.

Here is a recent paper for a book honoring Bell.

Monday, April 13, 2015

Concise argument against quantum computing

I summarize my arguments against quantum computers (QC). By QC, I mean super-Turing scalable quantum computing. I have posted similar arguments before in Sluggishly expanding wave function, The universe is not a quantum computer, and Ivan Deutsch says no true qubit yet.

People have claimed to have done quantum computations, such as factoring 15, but there is no true qubit, and no quantum speedup over a conventional computer.
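
For reference, here is the arithmetic behind those factoring-15 demos, with the quantum part (period-finding, the heart of Shor's algorithm) replaced by brute-force enumeration; the base a = 7 is a common choice in the published experiments.

```python
from math import gcd

# Shor's algorithm for N = 15, classical skeleton. A quantum computer's
# only job is finding the period r of f(k) = a^k mod N; for N = 15 the
# period is tiny, so we find it by direct search.
N, a = 15, 7

r = 1
while pow(a, r, N) != 1:
    r += 1
# r = 4, since 7^4 = 2401 = 1 (mod 15)

assert r % 2 == 0  # the post-processing step needs an even period
p = gcd(pow(a, r // 2) - 1, N)  # gcd(48, 15) = 3
q = gcd(pow(a, r // 2) + 1, N)  # gcd(50, 15) = 5
print(N, "=", p, "*", q)        # 15 = 3 * 5
```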

Here are the 3 main reasons.

1. The same reason I am skeptical about supersymmetry (SUSY): the theory is interesting and the math is sound, but when our leading experts spend a billion dollars and ten years looking for it and see no sign of it, it is time to consider that it may not exist.

2. Disproving the Church-Turing Thesis as a characterization of computable functions would be a very surprising result. Quantum mechanics (QM) is also surprising, but it doesn't actually conflict with the Church-Turing Thesis unless QM is strictly valid to a very high precision, beyond where it has ever been tested.

3. QC is unexpected in positivist interpretations of QM. I see QM as a clever calculus for tracking electrons possibly being in 2 or more places at once. For example, the famous double-slit experiment can be interpreted as an electron going thru both slits at once, with the wave function giving probabilities for each slit.
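
As a toy version of that calculus (my own made-up geometry, not any particular experiment's numbers): the rule is to add the two per-slit amplitudes and then square; the cross term is the interference pattern, and it disappears if you instead add per-slit probabilities.

```python
import cmath

# Two-slit toy model: each slit contributes a complex amplitude at a
# screen position x. Quantum probabilities come from squaring the SUM
# of amplitudes, not from summing per-slit probabilities.
def amp(path_len, wavelength=1.0):
    return cmath.exp(2j * cmath.pi * path_len / wavelength) / cmath.sqrt(2)

for x in (0.0, 0.25, 0.5):
    a1 = amp(10.0 + x)                    # path through slit 1
    a2 = amp(10.0 - x)                    # path through slit 2
    both = abs(a1 + a2) ** 2              # interference: 2*cos(2*pi*x)**2
    either = abs(a1) ** 2 + abs(a2) ** 2  # "one slit or the other": 1.0
    print(x, round(both, 3), round(either, 3))
```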

To a positivist like myself, the electron is never really in two places at once, as that is never observed and the theory does not require it. That is just a convenient fiction for visualizing a quantum that has some particle-like and some wave-like properties, but no exact classical counterpart.

If I really believed that an electron could be in two places at once, then I might believe in QC. The whole idea behind QC is that the multi-positional property of electrons can somehow be used for computational advantage. Whereas regular computers are built out of on-off switches and gates, a QC would be built out of gates that act like little double-slits, dividing reality into pieces and simultaneously operating on the pieces.
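
A one-qubit sketch of that picture, using standard gate algebra (nothing specific to any hardware): the Hadamard gate is the "little double-slit". It splits |0> into two branches, and a second Hadamard recombines them so the branches interfere, constructively on |0> and destructively on |1>.

```python
import numpy as np

# The Hadamard gate splits |0> into an equal superposition; applying it
# twice recombines the branches, which interfere back to |0> exactly.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)
ket0 = np.array([1.0, 0.0])

one_gate = H @ ket0       # amplitudes [0.707, 0.707]
two_gates = H @ one_gate  # amplitudes [1, 0]: the |1> branches cancel
print(np.round(one_gate ** 2, 3))   # measurement probabilities: [0.5 0.5]
print(np.round(two_gates ** 2, 3))  # measurement probabilities: [1. 0.]
```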

QM teaches complementarity, meaning that an electron can be considered a particle and a wave, but if you try to do both at the same time, you get into trouble. That is just what QC does -- try to make an electron a particle and a wave at the same time.

Scott Aaronson would say that I have fallen for the fallacy that a QC is a parallel computer, or that I have not fully accepted QM, or that I have failed to give a precise mechanism that is going to prevent a QC from achieving super-Turing results. He says:
With quantum computing the tables are turned, with the skeptics forced to handwave about present-day practicalities, while the proponents wield the sharp steel of accepted physical law.
Maybe so. When the experts figure out whether QC is possible, we can revisit this page and see who was more right.

Meanwhile, Aaronson explains some of those sharp edges of accepted physical law:
I would not recommend jumping into a black hole as a way to ensure your privacy. ...

But a third problem is that even inside a black hole, your secrets might not be safe forever! Since the 1970s, it’s been thought that all information dropped into a black hole eventually comes out, in extremely-scrambled form, in the Hawking radiation that black holes produce as they slowly shrink and evaporate. What do I mean by “slowly”? Well, the evaporation would take about 10^70 years for a black hole the mass of the sun, or about 10^100 years for the black holes at the centers of galaxies. Furthermore, even after the black hole had evaporated, piecing together the infalling secrets from the Hawking radiation would probably make reconstructing what was on the burned paper from the smoke and ash seem trivial by comparison! But just like in the case of the burned paper, the information is still formally present (if current ideas about quantum gravity are correct), so one can’t rule out that it could be reconstructed by some civilization of the extremely remote future.
Sorry, but I do not accept conservation of information as a physical law. I say that if you burn a book, you destroy the information in it. He says that the information is retained in the ashes somehow, but there is no practical or conceivable way of getting the info out of the ashes. Saying that the ashes retain the info is not a scientific statement.

The same kind of people who believe in the many-worlds interpretation of quantum mechanics also believe in time-reversibility and in information conservation. I guess you could believe in these things separately, as I don't think Aaronson believes in many-worlds. But they are all just abstract prejudices that have no real theoretical or experimental grounding.

Conservation of information is nothing like conservation of energy or momentum. Those types of conservation laws are theoretically grounded in symmetry principles, and experimentally tested to high accuracy. Conservation of information is just based on a belief about how people should think of probability.
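
The grounding referred to here is Noether's theorem (a standard statement, not from Aaronson): each continuous symmetry of the dynamics yields a conserved quantity,

$$ \frac{\partial L}{\partial t} = 0 \;\Rightarrow\; E = \sum_i \dot q_i \frac{\partial L}{\partial \dot q_i} - L \ \text{conserved}, \qquad \frac{\partial L}{\partial x} = 0 \;\Rightarrow\; p = \frac{\partial L}{\partial \dot x} \ \text{conserved}. $$

Time-translation symmetry gives energy conservation and space-translation symmetry gives momentum conservation; there is no comparable Noether-style derivation on offer for "information".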

When Aaronson talks about the "sharp steel of accepted physical law", he means things like information conservation. And by that, he means that your privacy is not safe in a black hole because info might leak out over the next 10^70 years.

Making a quantum computer might be like extracting info from a black hole. Certain experts will always say that it is possible, but it is so far removed from anything observable that it is just idle speculation.

Friday, April 10, 2015

The problems with Everett's many worlds

Lubos Motl attacks Hugh Everett's thesis, and a new paper discusses The Problem of Confirmation in the Everett Interpretation:
I argue that the Oxford school Everett interpretation is internally incoherent, because we cannot claim that in an Everettian universe the kinds of reasoning we have used to arrive at our beliefs about quantum mechanics would lead us to form true beliefs. I show that in an Everettian context, the experimental evidence that we have available could not provide empirical confirmation for quantum mechanics, and moreover that we would not even be able to establish reference to the theoretical entities of quantum mechanics. I then consider a range of existing Everettian approaches to the probability problem and show that they do not succeed in overcoming this incoherence.
I criticized the author, Emily Adlam, for a paper on relativity history.

Everett is the father of the Many Worlds Interpretation of quantum mechanics. I agree that it is incoherent. Some people prefer to call it the Everett interpretation so that it sounds less wacky. It is amazing how many seemingly-educated people take it seriously. It is like a stupid science fiction plot, and there isn't much substance to the theory at all.

There are not really any theoretical or empirical reasons for preferring MWI. The arguments for it are more philosophical. Its adherents say that it is more objective, or more deterministic, or more localized, or something like that. I don't know how postulating the spontaneous creation of zillions of unobservable parallel universes can do any of those things, but that is what they say.

Wednesday, April 8, 2015

Galileo got ship relativity from Bruno

I have posted many times on the origin of special relativity, but this is a new one to me. A new paper credits Bruno in 1584:
The trial and condemnation of Giordano Bruno was mainly based on arguments of philosophical and theological nature, and therefore different from Galilei's. Such elements contribute to unfairly devalue the scientific contribution of Bruno and do not properly account in particular for his contribution to physics. This paper discusses the contribution of Bruno to the principle of relativity. According to common knowledge, the special principle of relativity was first enunciated in 1632 by Galileo Galilei in his Dialogo sopra i due massimi sistemi del mondo (Dialogue concerning the two chief world systems), using the metaphor today known as "Galileo's ship": in a boat moving at constant speed, the mechanical phenomena can be described by the same laws holding on Earth. We shall show that the same metaphor and some of the examples in Galilei's book were already contained in the dialogue La cena de le Ceneri (The Ash Wednesday Supper) published by Giordano Bruno in 1584. In fact, Giordano Bruno largely anticipated the arguments of Galilei on the relativity principle. It is likely that Galilei was aware of Bruno's work, and it is possible that the young Galilei discussed with Bruno, since they both stayed in Venezia for long periods in 1592.
I knew that Bruno was a Catholic monk who denied the divinity of Jesus and argued for many other heresies, and was executed. And I knew that he speculated about an infinity of worlds. But I did not know that he had any legitimate scientific contributions.

There were ancient Greeks who argued for the motion of the Earth, such as Aristarchus of Samos, but his works have been lost and we don't know his arguments. Since we do not feel the motion of the Earth, he surely must have argued for some sort of relativity principle. Aristotle argued that our failure to feel the motion suggests that the Earth is not moving. So I do not see how the principle can possibly be due to Bruno or anyone of that time period.

This paper does show that Galileo could have met Bruno and gotten important ideas from him, including the relativity of a moving ship.

Modern relativity got started when James Clerk Maxwell observed that his theory of electromagnetism appeared to be incompatible with the relativity principle. He coined the word "relativity", and suggested an experimental test.
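
A quick way to see the incompatibility Maxwell noticed (a standard textbook computation, my summary): the electromagnetic wave equation is not form-invariant under a Galilean boost. With $x' = x - vt$, $t' = t$, derivatives transform as $\partial_x = \partial_{x'}$ and $\partial_t = \partial_{t'} - v\,\partial_{x'}$, so

$$ \frac{1}{c^2}\frac{\partial^2 \phi}{\partial t^2} - \frac{\partial^2 \phi}{\partial x^2} = 0 \quad\longrightarrow\quad \frac{1}{c^2}\frac{\partial^2 \phi}{\partial t'^2} - \frac{2v}{c^2}\frac{\partial^2 \phi}{\partial t' \partial x'} - \Big(1 - \frac{v^2}{c^2}\Big)\frac{\partial^2 \phi}{\partial x'^2} = 0, $$

and plane waves travel at $c - v$ in one direction and $c + v$ in the other in the moving frame. That frame-dependent wave speed is exactly what an ether-drift experiment could look for.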

Monday, April 6, 2015

Aaronson's first paragraph

Since I often mention MIT complexity theorist Scott Aaronson, I am giving a plug for his book.

You can tell a lot from the first paragraph of the first chapter of a book. That is where the author carefully attempts to grab your attention and give you a flavor of the book. Some authors will rewrite it 20 times until they get it right.

Here is how Quantum Computing since Democritus starts:
So why Democritus? First of all, who was Democritus? He was this Ancient Greek dude. He was born around 450 BC in this podunk Greek town called Abdera, where people from Athens said that even the air causes stupidity. He was a disciple of Leucippus, according to my source, which is Wikipedia. He's called a “pre-Socratic,” even though actually he was a contemporary of Socrates. That gives you a sense of how important he's considered: “Yeah, the pre-Socratics – maybe stick ’em in somewhere in the first week of class.” Incidentally, there's a story that Democritus journeyed to Athens to meet Socrates, but then was too shy to introduce himself.
The book is 370 pages of this.

He is writing a new book, drawn from his blog posts on similar topics. He has acknowledged that his own book has led people to quantum computing over-hype, and proclaims: "Single most important application of QC (in my opinion): Disproving the people who said QC was impossible!" He laments that "we as a community have failed to make the honest case for quantum computing". Much of the content of his book is on his web site.

Wednesday, April 1, 2015

Lemaitre's Big Bang

There is a new paper on Lemaitre's Big Bang:
I give an epistemological analysis of the developments of relativistic cosmology from 1917 to 1966, based on the seminal articles by Einstein, de Sitter, Friedmann, Lemaitre, Hubble, Gamow and other historical figures of the field. It appears that most of the ingredients of the present-day standard cosmological model, including the acceleration of the expansion due to a repulsive dark energy, the interpretation of the cosmological constant as vacuum energy or the possible non-trivial topology of space, had been anticipated by Georges Lemaitre, although his articles remain mostly unquoted.
I have posted before that Lemaitre should be credited with the Big Bang more than any other single person. This paper confirms that, and describes later contributions of others as well.