Monday, April 13, 2026

Special Relativity was Announced in 1904

Relativity historians give 1905 for the theory's origin, but it was really 1904.

Hector Giacomini writes in a new paper, also here:

Henri Poincaré’s Saint Louis lecture, delivered on 24 September 1904 at the International Congress of Arts and Science, occupies a distinctive place in the pre-history of twentieth-century theoretical physics. In this text, Poincaré formulated the principle of relativity in explicit and general terms, not as a narrow empirical rule limited to electrodynamics, but as one of the major guiding principles of mathematical physics. The lecture also offered a principle-based conception of theory centered on invariance, least action, and general theoretical coherence.
Poincare and other great scholars made the trip all the way to St. Louis, Missouri, USA, where the World's Fair was being held. For a month, St. Louis was the center of the world.

The text of the lecture was widely published and distributed in 1904, and an English translation was published in a popular philosophical journal in Jan. 1905. The above paper documents the wide distribution.

Poincare's lecture did not have any formulas, but he clearly had the essence of special relativity.

He wrote two kinds of papers: technical papers with formulas intended for mathematicians, and papers without formulas intended for a wider audience. He was probably the most widely read intellectual in Europe. Perhaps some did not appreciate him because he did not attempt to explain the formulas to non-mathematicians.

In explicit and programmatic terms, Poincaré formulated the principle of relativity as follows: the laws of physical phenomena must be the same for an observer at rest and for an observer carried along in uniform translational motion. Consequently, no experiment should allow one to determine whether one is in such uniform motion or not. In the lecture this formulation appears within the canonical list of fundamental principles and is treated on the same conceptual level as energy conservation and the principle of least action.

The relativity principle is not introduced merely as an empirical summary of ether-drift experiments. Rather, Poincaré presents it as a structural requirement increasingly supported by the persistent failure of attempts to detect motion relative to the ether. He discusses in particular the negative results of Michelson-type experiments and emphasizes the remarkable stability of electromagnetic theory under uniform motion. The continued empirical confirmation of null results is interpreted as evidence that the invariance of physical laws under uniform translation may reflect a deep structural property of nature. ...

In this respect, Poincaré’s principle of relativity appears as the explicit crystallization of themes already articulated in his writings [11, 12, 13], and made accessible to German-speaking readers through the 1904 translation of La science et l’hypothèse [14]. Poincaré further analyzes the theoretical devices introduced to preserve this invariance. He discusses Lorentz’s notion of “local time”, obtained by synchronizing clocks through light signals ...

Importantly, Poincaré suggests that the situation may ultimately require a new mechanics in which no velocity could exceed that of light and in which inertia would increase with speed. Such remarks indicate that the principle of relativity is not treated as a peripheral correction within classical mechanics, but as a constraint capable of reshaping its conceptual structure.

Lorentz’s 1904 Theory

An important component of the lecture is Poincaré’s discussion of Hendrik Antoon Lorentz’s recent work. In May 1904 Lorentz had published a major paper [6]. In that work Lorentz presented a refined mathematical formulation of the transformations required to preserve the form of Maxwell’s equations in a moving frame.

Poincaré’s lecture demonstrates that by September 1904 he was fully aware of the structure and implications of Lorentz’s construction. He describes the introduction of local time, the contraction hypothesis — according to which bodies moving through the ether undergo a physical contraction in the direction of motion — and the modification of forces and masses required to reconcile theory with experiment.
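
For reference, here are the formulas, in modern notation of my own and not quoted from the lecture or the paper. For a frame moving with velocity v along the x-axis, the Lorentz transformation is

\[ x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \]

with y and z unchanged. Maxwell's equations keep the same form under this substitution, which is the invariance Lorentz's 1904 paper was constructed to preserve.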

The paper makes no mention of Einstein, as he did not write anything on relativity until Summer 1905.

Poincare wrote his great relativity paper in 1905, but his 1904 lecture has the essence: the relativity principle, Lorentz transformations, length contraction, Michelson-Morley, clock synchronization, local time, and a new mechanics where nothing goes faster than light.

Previously I argued that the essence of relativity was 4D spacetime, the Lorentz group, non-Euclidean geometry, covariant equations, and extending beyond electromagnetism. Poincare had all these in 1905, and Einstein did not understand them until years later.
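
To spell out the first two of those in modern terms (my gloss, not a quote): 4D spacetime means treating events as points (t, x, y, z), and the Lorentz group is the group of linear transformations leaving the quadratic form

\[ s^2 = c^2 t^2 - x^2 - y^2 - z^2 \]

invariant. Covariant equations are equations written so that they keep their form under that group.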

Einstein sometimes denied that he knew about Lorentz's 1904 paper and Poincare's 1905 paper, although it is documented that he had access to both before submitting his own 1905 paper. I do not know if he was ever asked about Poincare's St. Louis lecture. It is hard to believe he could have missed it, as it was read by anyone with an interest in Mathematical Physics.

Einstein did not reference Lorentz or Poincare in his famous 1905 relativity paper. Even if he really did not know about these papers, he surely knew about them when he wrote survey papers on relativity a couple of years later.

You could also argue that relativity started in 1895, with Lorentz's paper. He had the approximate Lorentz transformations, Michelson-Morley to second order, local time, length contraction, and the relation to Maxwell's equations. Lorentz got the 1902 Nobel Prize for his electromagnetic theory. He did not yet have the exact higher-order theory he found in 1904, the symmetries as a group, or the connection of local time to clock synchronization.
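
For comparison (again in my notation), Lorentz's 1895 "local time" was only the first-order approximation

\[ t' = t - \frac{vx}{c^2}, \]

with no \(\gamma\) factor and no time dilation; the exact transformation quoted above is what he supplied in 1904.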

Wednesday, April 8, 2026

Ready to Warn Us about Broken Cryptography

Prof. Scott Aaronson really wants us to believe in quantum computing, and the press regularly asks him to comment on the latest developments, many of which are bogus. So now he posts:
Then one evening, you hear a howl in the distance, and sure enough, on a hill overlooking the town is the clear silhouette of a large wolf. So you point to it — and all the same people laugh and accuse you of “crying wolf.”

Now you know how it’s been for me with cryptographically relevant quantum computing.

No, the wolf is not there yet. Some are predicting 2030. I say there is no chance of that.

A comment says:

I see you’re following in the footsteps of Eliezer Yudkowsky, getting so frustrated at people not understanding what you’re saying that you resort to explaining the basic principles of rationality in hope that this will help.

Meanwhile the NY Post reports:
The CIA used a futuristic new tool called “Ghost Murmur” to find and rescue the second American airman who was shot down in southern Iran, The Post has learned.

The secret technology uses long-range quantum magnetometry to find the electromagnetic fingerprint of a human heartbeat and pairs the data with artificial intelligence software to isolate the signature from background noise, two sources close to the breakthrough said.

It was the tool’s first use in the field by the spy agency — and was alluded to Monday afternoon by President Trump and CIA Director John Ratcliffe at a White House briefing.

I do not know anything about it.

Update: Sabine Hossenfelder posted Quantum Computers Just Got Much More Dangerous. She cites Google predicting Q-day for 2029, when quantum computers break popular cryptosystems.

Google has never realized its quantum predictions. I am glad to see it predicting 2029. That is only 3 years. We shall soon see. I say no chance.

Monday, April 6, 2026

Particles do not Pass Both Slits

The double-slit experiment is often explained as particles going through both slits. Supposedly this quantum mechanics interpretation was made rigorous by R.P. Feynman's path integral formulation, where particles take all possible paths.

This is not really correct, as explained in a new video: Debunking Veritasium: The “All Possible Paths” Myth & What Feynman Really Showed

Curt Jaimungal rigorously debunks the viral myth popularized by Veritasium (that quantum particles literally take “all possible paths” has been proved) and clarifies the true mathematical purpose of Feynman's formalism. Learn why this concept is a computational tool in configuration space rather than a physical map of reality.
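
For what it is worth, the formula at issue is Feynman's expression for the propagator, which in standard textbook form is

\[ K(x_b, t_b; x_a, t_a) = \int \mathcal{D}[x(t)]\; e^{\,i S[x]/\hbar}, \]

a weighted sum of complex amplitudes over paths in configuration space. It is a rule for computing the amplitude to get from one point to another, not a claim that the particle physically traverses every path.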
People doing quantum computing are always talking about an electron being in two places at once, like the Schroedinger Cat that is alive and dead at the same time. The many-worlds fans especially like to talk this way. But these things do not happen in standard textbook quantum mechanics.

In textbook/Copenhagen QM, a particle does not have a defined position until it is measured. It does not get observed in two places.

Update: A reader points out that the video is almost a year old.

Friday, April 3, 2026

Google to Crack Bitcoin

Yahoo reports:
Google recently issued two warnings in a span of a few days.

First, quantum computers will be able to crack cryptography encrypting cryptocurrencies like Bitcoin (BTC) by 2029. In fact, hackers might try stealing encrypted financial details right now and wait until 2029 for quantum computers to become powerful enough to decrypt those details.

Google recommended transition to post-quantum cryptography (PQC) to address the threat.

Second, a quantum system could crack a real-time Bitcoin transaction in about nine minutes. Here is how it could happen.

When a Bitcoin transaction is executed, the public key is revealed for a brief period. A quantum computer powerful enough can use the public key to find out the private key and steal the crypto assets.

It takes approximately 10 minutes for a Bitcoin transaction to confirm; the probability of success is only slightly less than 41%, the paper estimated.

The paper also revealed that it could take fewer than 500,000 qubits — far less than millions of qubits cited earlier — to crack Bitcoin's cryptography. It's a 20-fold reduction in the number of qubits needed to crack the encryption.

If there were such a quantum computer, it would have to crack someone's key in that 10-minute window to steal money. Quantum computers would have to be millions of times more efficient than they are now.
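
To make the window concrete, here is a minimal Python sketch of why the public key is exposed only when a coin is spent. It is my own illustration, not anything from Google's paper; real Bitcoin uses secp256k1 ECDSA signatures and a HASH160 (SHA-256 then RIPEMD-160) address commitment, and plain SHA-256 stands in for that here just to keep the sketch runnable.

import hashlib

def toy_address(pubkey: bytes) -> str:
    # An address commits only to a hash of the public key, so the key
    # itself stays hidden until the owner spends from that address.
    # (Simplified: real addresses use SHA-256 followed by RIPEMD-160
    # plus checksum encoding; plain SHA-256 is used here.)
    return hashlib.sha256(pubkey).hexdigest()

# Receiving: only toy_address(pubkey) appears on the blockchain.
# Spending: the transaction itself reveals pubkey and a signature.
# A hypothetical quantum attacker would have to recover the private key
# from pubkey (Shor's algorithm on the elliptic-curve discrete log) and
# broadcast a competing transaction inside the ~10-minute confirmation
# window described above.
pubkey = bytes.fromhex("02" + "11" * 32)  # placeholder 33-byte compressed key
print("address commitment:", toy_address(pubkey))

The point is that an unspent address only shows a hash, so the attack has to run inside the short window in which the key is visible.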

If the quantum computers get close, the Bitcoin community could change their protocols to resist the attack. It might be difficult to get everyone to agree to a new protocol. But as long as they did agree, the attack would be easily defended.

Dr. Quantum Supremacy has his take on the new announcements. I am skeptical, as usual.

In particular, the Caltech group estimates that a mere 25,000 physical qubits might suffice for this, where a year ago the best estimates were in the millions.

Here is a new PBS TV video on The Truth About Quantum Computers.

4:47 Microsoft claimed not only had they observed Majoranas, they also figured out how to control them. And they said they'd be able to use them to build reliable qubits that would be able to hold up in ways that other qubits can't. This breakthrough would provide a much faster pathway to quantum computing at a much larger scale than anyone else has been able to achieve. Microsoft was faced with an avalanche of skepticism. And as of filming, the data hasn't firmly established everything they claimed.

But some are optimistic that Microsoft can improve its chip and provide the breakthrough the industry has been waiting for. If they do, the whole world will change fast as we gain the ability to solve all kinds of problems we can't currently fully explore. For example, we might be able to create computer simulations of our world, down to the molecular level. That would open the door for incredible breakthroughs in chemistry and medicine. Or we could develop new battery technology, which could be key for mitigating climate change.

So this "truth" is all speculation.

Thursday, April 2, 2026

A River is now a Science Journal Co-author

Centuries ago, scientists might thank God, or cite Christianity for their belief in an orderly world. No science journal would tolerate that today, right?

Actually the leading science journal, Nature, has published an article praising a river god.

Biology professor Jerry Coyne reports:

Conservationist Anne Poelina has a deep connection to the fresh water that runs through the dry red-rock landscape of the Kimberley region in Western Australia. Poelina identifies as a Nyikina Warrwa woman, and her people are the Traditional Custodians of the Martuwarra Fitzroy River. ...

Poelina explains that “in terms of property rights, the river owns me. So, I have a duty of care and the fiduciary duty to protect this river’s right to life.” ...

In 2020, she started including the Martuwarra River of Life as the first author on her publications. ...

When asked why the river should be listed as first author, she responded: “Because it’s the authority. It’s where I get my authority.”

This is not religious tolerance. This is a Wokeness statement.

Tuesday, March 31, 2026

China has taken over Science

T. Greer argues China is the future of science:
The scientific publisher Nature makes a similar catalog on a slightly more granular level, looking at specific fields of science. According to Nature’s most recent rankings, 18 of the top 25 most productive research institutes in the physical sciences, 19 of the top 20 in geosciences, and a full 25 out of 25 in chemistry are Chinese. Only in the biosciences do American scientists still have a lead—but even on that list three of the top ten are Chinese.

The kicker is, none of that was true even just a decade ago.

Sunday, March 29, 2026

The QWERTY Keyboard was not Designed to be Slow

I need to watch YouTube with an AI LLM to fact-check it.

Physicist Brian Keating posted a cosmology interview, and started making a false analogy about keyboards. My AI advises:

While Brian Keating is an astrophysicist and not a historian of technology, he is repeating a very common popular myth. The claim that the QWERTY layout was designed specifically to "slow down" typists is technically incorrect, though it is based on a grain of truth regarding mechanical limitations.

According to historical research into the origins of the typewriter:

  • The "Jamming" Problem: In early mechanical typewriters, if a typist hit two keys that were physically close to each other in rapid succession, the "typebars" (the hammers that hit the paper) would collide and jam.
  • The Real Purpose of QWERTY: The layout was designed to separate frequently used letter pairs (like "TH" or "ST") so their physical hammers were on opposite sides of the machine. This allowed the hammers to fall back into place before the next one rose, actually allowing the user to type faster without the machine seizing up.
  • The "Slow Down" Misconception: The myth likely arose because the layout is not "ergonomically optimal" for modern electronic keyboards. Since we no longer have mechanical hammers to worry about, the QWERTY layout seems unnecessarily difficult compared to alternatives like the Dvorak Simplified Keyboard.

In the video, Keating uses this as an analogy for "technological lock-in", arguing that even if a system is no longer the most efficient, it becomes so widespread that it is nearly impossible to replace—much like he fears is happening with current AI architectures.

The truth is that the QWERTY keyboard is not significantly slower than the Dvorak. It just reduces jamming on a mechanical typewriter.

I am wary when some expert tries to make a point using some ridiculous myth that was debunked decades ago. He could have just googled it, and seen that he was wrong.

In the video, Keating shares another classic bit of "technological lock-in" lore, claiming that modern railroad gauges (the distance between the tracks) can be traced back to the width of a Roman chariot, which was designed to fit two horses' butts.

While it’s a great story for a podcast, historians and engineers generally categorize this as a "spurious correlation"—it's mostly a myth, though it has a tiny thread of logic.

The Fact vs. The Fiction

• The Claim: Roman chariots created ruts in European roads. Wagons were built to fit those ruts so they wouldn't break their wheels. Early tramways used those same wagon dimensions, and English trains followed suit, eventually leading to the Standard Gauge (4 feet 8.5 inches) used in the U.S. and much of Europe.

• The Reality: While the standard gauge is roughly the width of an ancient wagon, the specific measurement of 4' 8.5" was actually popularized by George Stephenson, the "Father of Railways." He chose it because it was the width of the existing pre-steam coal tramways in North England where he worked.

• The "Horse's Butt" Part: There is no evidence that Romans measured horses' rear ends to build chariots. Chariots were built to be wide enough to hold two people and stable enough not to tip over. The fact that two horses fit in front of that width is a matter of practical spacing, not a design blueprint that dictated the Space Shuttle's dimensions (another common layer of this myth).

Why Keating Uses It

Keating’s broader point isn't necessarily about Roman history; he's using it to explain Path Dependence. He argues that once a standard is set—whether it's QWERTY keyboards, railroad widths, or GPU-based AI architectures—it becomes "locked in" because the cost of changing the entire infrastructure is too high, even if a better way exists.

He is probably also wrong in his predictions about AI architectures. He also compares AI to a cockroach, and I think his point is that our AIs could suffer a technological lock-in at a sub-cockroach level.

I assume that Keating is more accurate when he talks about cosmology experiments. But he says this:
0:30 If you said there's one galaxy, you're stupid. If you said there's one planet, you're stupid. If you said there's one. So why say there's one universe?
I am sticking to one universe. Maybe I am stupid.
