tag:blogger.com,1999:blog-81485735514175786812024-03-18T10:15:26.921-07:00Dark BuzzNatura non facit saltus.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.comBlogger1719125tag:blogger.com,1999:blog-8148573551417578681.post-22106123112997953942024-03-18T06:00:00.000-07:002024-03-18T06:00:00.134-07:00Deutsch defends Many-Worlds TheoryNew podcast: <a href="https://www.youtube.com/watch?v=bux0SjaUCY0">The Multiverse is REAL - David Deutsch</a>.
<p>
He says the double-slit experiment conclusively proves the many-worlds theory. He accepts the parallel worlds
for the same reason he accepts the existence of dinosaurs (millions of years ago). It is the only way to
explain the evidence.
</p><p>
He admits that the photons are not really particles, and that any waves show a similar diffraction pattern.
But he says that for the photon to behave as a wave, it must exist in multiple copies.
</p><p>
He also admits that all this has been known for decades, and yet most physicists do not accept this argument
for many-worlds.
</p><p>
I do not see any merit to this argument. The double-slit experiment only proves that light, and other beams like electron
beams, have wave properties. That's all. It is not even evidence for quantum mechanics, as this wave explanation was
accepted before quantum mechanics was invented.
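The wave explanation is ordinary classical interference. As a rough sketch (the wavelength, slit separation, and screen distance are round numbers I am supplying for illustration, not figures from the podcast), the two-slit fringe pattern follows from just adding two wave amplitudes:

```python
import math

# Illustrative double-slit intensity from classical wave superposition.
# All parameters are made-up round numbers.
WAVELENGTH = 500e-9    # 500 nm light, in meters
SLIT_SEP = 50e-6       # slit separation d
SCREEN_DIST = 1.0      # distance L to the screen

def intensity(x):
    """Relative intensity at screen position x (small-angle approximation)."""
    # Path difference between the two slits is d*sin(theta) ~ d*x/L.
    delta = SLIT_SEP * x / SCREEN_DIST
    phase = 2 * math.pi * delta / WAVELENGTH
    # Adding two equal-amplitude waves gives I ~ cos^2(phase/2).
    return math.cos(phase / 2) ** 2

# Bright fringes appear wherever the path difference is a whole wavelength:
fringe_spacing = WAVELENGTH * SCREEN_DIST / SLIT_SEP  # 10 mm here
print(intensity(0.0))                 # central maximum: 1.0
print(intensity(fringe_spacing))      # next maximum: ~1.0
print(intensity(fringe_spacing / 2))  # dark fringe: ~0.0
```

The same cos&sup2; pattern appears for water waves or electron beams; nothing in it requires parallel worlds, only wave superposition.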
</p><p>
When asked about alternative theories, he says that von Neumann had the crazy idea that if you observe an electron
in one place, then the possibility of it being somewhere else ceases to exist.
</p><p>
I do not see the problem with that. I do not even think the issue has anything to do with quantum mechanics.
Anytime you estimate the probability of an event, and then observe it, that means that the other possibilities did not happen. That is how probabilities work.</p><p>
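The point about von Neumann's "collapse" can be phrased as ordinary conditioning. A minimal sketch (the coin example is mine, chosen only because it is the simplest probabilistic system):

```python
import random

# Ordinary probability: before observation, several outcomes have nonzero
# probability; after observation, conditioning puts all the probability on
# what actually happened. Nothing quantum about it.
prior = {"heads": 0.5, "tails": 0.5}

def condition_on(observed, dist):
    """Condition the distribution on the event {outcome == observed}."""
    return {k: (1.0 if k == observed else 0.0) for k in dist}

outcome = random.choice(list(prior))
posterior = condition_on(outcome, prior)
assert posterior[outcome] == 1.0
assert sum(posterior.values()) == 1.0  # the other possibility "ceased to exist"
```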
He pushes quantum computing, but admits that he has not followed the latest technology.
</p><p>I see the argument for many-worlds as nothing more than a rejection of probability theory. You could take any scientific theory that predicts probabilities, deny that the probabilities make any sense, and conclude that there are parallel worlds of unobserved possibilities.</p><p>That is all many-worlds theory is. I don't think that it even has anything to do with quantum mechanics. It is only expressed in terms of quantum mechanics, because textbook QM emphasizes the probabilities. But other theories use probabilities the same way, and could have many-worlds interpretations.</p><p>Many-worlds theory is just the same as taking a science textbook, announcing some philosophical disagreement with probability theory, and redacting all the sections mentioning probability. It adds nothing. It just removes the theory's predictive power.</p><p>It is hard to see how any intelligent man takes many-worlds seriously. It offers nothing. Maybe they just don't understand probability theory, as I do not see anywhere that they recognize that they are just rejecting probability.</p><p>
<a href="https://www.youtube.com/watch?v=uzVu8fwJU4o">PBS TV has a news item</a>:
</p><blockquote>How quantum computing could help us understand more about the universe
<p>
Scientists, researchers and some big companies are eager to jumpstart the next generation of computing, one that will be far more sophisticated and dependent on understanding the subatomic nature of the universe. But as science correspondent Miles O’Brien reports, it’s a huge challenge to take this new quantum leap forward.</p></blockquote>
A lot of hype. They admit that a fault-tolerant quantum computer might be a decade away.
No one admits that it might be impossible.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com1tag:blogger.com,1999:blog-8148573551417578681.post-7331504109851407312024-03-11T06:00:00.000-07:002024-03-11T06:00:00.267-07:00Physics v. Magic<div class="separator" style="clear: both;"><a href="https://imgs.xkcd.com/comics/physics_vs_magic.png" style="display: block; padding: 1em 0; text-align: center; "><img alt="" border="0" width="400" data-original-height="294" data-original-width="740" src="https://imgs.xkcd.com/comics/physics_vs_magic.png"/></a></div>
<p>
From <a href="https://xkcd.com/2904/">xkcd</a> comics.
<p>
His point here is that Physics is inherently causal. Things happen because there is some causal sequence of interactions
from event A to event B, when A causes B.
<p>
For example, the Sun's gravitational pull on the Earth was once thought to be action-at-a-distance, but is now
thought of as the Sun perturbing spacetime and a wave traveling to Earth. Or gravitons traveling to Earth.
<p>
However Physics often reasons from results, without following a causal chain. Examples are thermodynamics,
conservation laws, and Lagrangians.
<p>
There are also examples in quantum mechanics. A particle might tunnel through a wall, without any understanding
of how it gets through the wall.
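Even without a causal story of "how it gets through," the tunneling probability itself is calculable. A hedged sketch using the standard thick-barrier estimate T &asymp; e<sup>-2&kappa;L</sup> with &kappa; = &radic;(2m(V&minus;E))/&hbar; (the barrier height, energy, and width below are illustrative values I chose, not from the post):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def tunnel_probability(energy_ev, barrier_ev, width_m):
    """Approximate probability of tunneling through a square barrier.

    Uses the thick-barrier estimate T ~ exp(-2*kappa*L).
    """
    if energy_ev >= barrier_ev:
        return 1.0  # classically allowed; no tunneling needed
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 1 eV electron hitting a 2 eV barrier:
print(tunnel_probability(1.0, 2.0, 1e-10))  # ~1 angstrom wide: appreciable
print(tunnel_probability(1.0, 2.0, 1e-9))   # ~1 nm wide: strongly suppressed
```

The formula predicts the result of the tunneling without tracing any causal chain through the barrier, which is exactly the point.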
<p>
In spite of these examples, I still believe the universe is inherently local in cause and effect. Sometimes our
reasoning can skip some steps, but only as a mathematical convenience. There is no action-at-a-distance, and no magic.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com2tag:blogger.com,1999:blog-8148573551417578681.post-782415490865512552024-03-04T06:00:00.000-08:002024-03-04T06:00:00.138-08:00Poincare was Five Years ahead of EinsteinIt is funny to see historians try to credit Einstein. Here is a
<a href="https://link.springer.com/article/10.1007/s40329-013-0013-1">2013 essay</a> I had not seen before: <blockquote>On some points, such as the principle of relativity or the physical interpretation of the Lorentz transformations, Poincaré’s contributions preceded by at least 5 years those of Einstein’s published in 1905. On the other hand, many of their contributions were practically simultaneous. In 1905 Poincaré published an abridged version of his “Sur la dynamique de l’électron” [3] (which preceded the work of Einstein); the expanded version of the article appeared in 1906 [4].
<p>
What are the conceptual differences? According to Darrigol,
</p><blockquote>Einstein completely eliminated the ether, required that the expression of the laws of physics should be the same in any inertial frame, and introduced a “new kinematics” in which the space and time measured in different inertial systems were all on exactly the same footing. In contrast, Poincaré maintained the ether as a privileged frame of reference in which “true” space and time were defined, while he regarded the space and time measured in other frames as only “apparent.” …Einstein derived the expression of the Lorentz transformation from his two postulates (the relativity principle and the constancy of the velocity of light in a given inertial system), whereas Poincaré obtained these transformations as those that leave the Maxwell–Lorentz equations invariant [1].</blockquote>
These are conceptual differences that have no actual experimental consequences as far as electromagnetism and optics are concerned. As Lorentz commented, the difference is purely epistemological: it concerns the number of conventional and arbitrary elements that one wishes to introduce in the definitions of the basic physical concepts.
<p>
Are we then dealing with a case of simultaneous discovery? </p><p></p></blockquote>
No, it was not simultaneous. Poincare was five years ahead of Einstein.
Poincare was years ahead of Einstein with the relativity principle, rejection of the aether, local time, synchronizing clocks, interpreting Michelson-Morley, Lorentz group, mass-energy equivalence, four-dimensional spacetime, and relativistic theories of gravity.
Einstein's only claim to originality is to certain epistemological differences of no physical significance.
<p>
Credit Einstein with those obscure conceptual differences if you want, but there is no mathematical or physical value to any of them. Mostly they consist of mathematical misunderstandings by Einstein and other physicists. For example, there is nothing erroneous about choosing a privileged frame on a symmetric space. It does not break the symmetry. Those who criticize Poincare for occasionally choosing a privileged frame are just mathematically ignorant.
<p>
The essay concludes:
</p><blockquote>As Darrigol suggests, it seems wiser to concede that Lorentz, Poincaré and Einstein all contributed to the emergence of the theory of relativity, that Poincaré and Einstein offered two different versions of the theory, and that Einstein gave form to what today is considered the best one. </blockquote>
No, Einstein's is not considered best today.
Nearly everyone prefers the spacetime formulation that Poincare advanced in 1905 and Minkowski perfected in 1907. Einstein was <a href="https://blog.darkbuzz.com/2023/09/ehrenfest-paradox-and-psychological.html">still rejecting it in 1911</a>, and did not even speak positively about it until after that, and never really accepted the geometrical significance.<p>
A lot of historians concede that Lorentz and Poincare had all the mathematics of special relativity, and all the
physical consequences, but insist that Einstein had a more modern viewpoint or superior understanding,
leading to how we understand the theory today. But that is false. The Poincare-Minkowski geometrical interpretation
has been preferred by nearly everyone but Einstein, since 1908.</p>Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com3tag:blogger.com,1999:blog-8148573551417578681.post-34456425971125279652024-02-24T11:20:00.000-08:002024-02-24T11:20:50.185-08:00Google AI is Rewriting History<div class="separator" style="clear: both;"><a href="https://www.unz.com/wp-content/uploads/2024/02/Screenshot-2024-02-23-at-10.35.30%E2%80%AFPM.png" style="display: block; padding: 1em 0; text-align: center; "><img alt="" border="0" height="600" data-original-height="800" data-original-width="727" src="https://www.unz.com/wp-content/uploads/2024/02/Screenshot-2024-02-23-at-10.35.30%E2%80%AFPM.png"/></a></div>Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com5tag:blogger.com,1999:blog-8148573551417578681.post-3861376271371429502024-02-22T09:59:00.000-08:002024-02-22T09:59:25.726-08:00Chinese Deflate Quantum Hype AgainSabine Hossenfelder is now doing short daily physics news videos, and her latest is on <a href="https://www.youtube.com/watch?v=KSV0RMlJpEg">Bad News for Quantum Computing: Another Advantage Gone</a>.
<p>
In short, quantum computing researchers have been claiming quantum supremacy for years. Some call it quantum advantage.
However, there has never been any convincing demonstration that quantum computers have any speedup at all over
conventional computers.
<p>
The latest is that IBM claimed last year to do a quantum calculation on a "noisy" quantum computer.
Some thought that they had outdone Google. But a Chinese group outdid them by doing the calculation faster and better
on a classical computer.
<p>
The quantum enthusiasts will argue, as usual, that this does not disprove quantum computing, and maybe a more
clever experiment would show an advantage. I am waiting.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com4tag:blogger.com,1999:blog-8148573551417578681.post-31914725514368522032024-02-12T06:00:00.001-08:002024-02-12T06:00:00.136-08:00The physicists philosophy of physicsPrinceton astrophysicist PJE Peebles <a href="https://arxiv.org/abs/2401.16506">writes</a>: <blockquote>The starting idea of the natural sciences is that the world operates by rules that can
be discovered by observations on scales large or small, depending on what interests
you. In fundamental physics, the subject of this essay, the idea is narrowed to four
starting assumptions.
<p>
A: The world operates by rules and the logic of their application that can be
discovered, in successive approximations.
<p>
B: A useful approximation to the rules and logic, a theory, yields reliably
computed quantitative predictions that agree with reliable and repeatable mea-
surements, within the uncertainties of the predictions and measurements.
<p>
C: Fundamental physical science is growing more complete by advances in
the quantity, variety, and precision of empirical fits to predictions, and by occa-
sional unifications that demote well-tested fundamental physical theories to useful
approximations to still better theories.
<p>
D: Research in fundamental physical science is advancing toward a unique
mind-independent reality.
</blockquote>
These sound reasonable, but they leave no room for many-worlds theory, string theory, simulation hypothesis,
superdeterminism, or many of the ideas that are now fashionable.
<p>
The essay gives way too much attention to philosopher Thomas Kuhn.
<p>
It quotes Einstein:
<blockquote>The supreme task of the physicist is to arrive at those universal elemen-
tary laws from which the cosmos can be built up by pure deduction.</blockquote>
This sounds a little like Weinberg's mythical Final Theory, also discussed.
<p>
No, trying to build the cosmos from pure deduction is foolishness.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com7tag:blogger.com,1999:blog-8148573551417578681.post-69670811488701424652024-02-08T06:00:00.000-08:002024-02-08T20:00:19.891-08:00Dissecting Einstein's BrainThe RadioLab podcast just <a href="https://radiolab.org/podcast/g-relative-genius">rebroadcast this</a>: <blockquote>Albert Einstein asked that when he died, his body be cremated and his ashes be scattered in a secret location. He didn’t want his grave, or his body, becoming a shrine to his genius. When he passed away in the early morning hours of April, 18, 1955, his family knew his wishes. There was only one problem: the pathologist who did the autopsy had different plans.
<p>
In the third episode of “G”, Radiolab’s miniseries on intelligence, we go on one of the strangest scavenger hunts for genius the world has ever seen. We follow Einstein’s stolen brain from that Princeton autopsy table, to a cider box in Wichita, Kansas, to labs all across the country. And eventually, beyond the brain itself entirely. All the while wondering, where exactly is the genius of a man who changed the way we view the world? </blockquote>
Later in the show, it discussed theories for the origin of Einstein's most brilliant idea -- special relativity. Besides his extra-smart brain, it mentioned his physicist wife and a philosopher. It even had professor Galison explaining how train schedules caused people to rethink time.
<p>
Okay, but there was no mention of Lorentz and Poincare, or the fact that they had published the entire theory ahead of Einstein.
<p>
Galison is unusual because he does not recite crazy stories about Einstein's originality, like other Einstein scholars.
He read Lorentz and Poincare and obviously understands that they did it all first, but he <a href="https://en.wikipedia.org/wiki/Relativity_priority_dispute#Peter_Galison_%282002%29">refuses to
comment on the priority dispute</a>.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com2tag:blogger.com,1999:blog-8148573551417578681.post-7959196667611938752024-02-05T06:00:00.000-08:002024-02-05T06:00:00.130-08:00What's the difference, said HeisenbergFrom a <a href="https://math.stackexchange.com/questions/38387/distinguishing-between-symmetric-hermitian-and-self-adjoint-operators">math site</a>: <blockquote>
In the 1960s Friedrichs met Heisenberg and used the occasion to express to him the deep gratitude of mathematicians for having created quantum mechanics, which gave birth to the beautiful theory of operators on Hilbert space. Heisenberg allowed that this was so; Friedrichs then added that the mathematicians have, in some measure, returned the favor. Heisenberg looked noncommittal, so Friedrichs pointed out that it was a mathematician, von Neumann, who clarified the difference between a self-adjoint operator and one that is merely symmetric. "What's the difference," said Heisenberg.
<p>
- story from Peter Lax, Functional Analysis (slightly edited for length) </blockquote>
That is the difference between a physicist and a mathematical physicist.
<p>
<a href="https://en.wikipedia.org/wiki/John_von_Neumann">John von Neumann</a> wrote a <a href="https://en.wikipedia.org/wiki/Mathematical_Foundations_of_Quantum_Mechanics">1932 book on quantum mechanics</a>, and turned it into a real theory.
<p>
To a physicist, an observable is a symmetric operator, because those are the ones that give real values, and only real values are observed.
To von Neumann, an observable is a <a href="https://en.wikipedia.org/wiki/Self-adjoint_operator">self-adjoint operator</a> on a Hilbert space, where some additional technical requirements are needed in order to prove the <a href="https://en.wikipedia.org/wiki/Spectral_theorem">spectral theorem</a>.
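In finite dimensions the distinction Heisenberg shrugged off disappears: a symmetric (Hermitian) matrix is automatically self-adjoint, and the spectral theorem is elementary. A small sketch of the easy case (the 2x2 matrix entries are my own example; the real subtlety lives in unbounded operators on infinite-dimensional Hilbert space, such as momentum on an interval, which no finite matrix can capture):

```python
import math

# A 2x2 "observable": Hermitian matrix [[a, b], [conj(b), d]] with real a, d.
a, d = 2.0, 3.0
b = 1.0 - 1.0j

# Eigenvalues of a 2x2 matrix: lambda = (tr +- sqrt(tr^2 - 4 det)) / 2.
tr = a + d
det = a * d - b * b.conjugate()  # det is real for a Hermitian matrix
disc = tr * tr - 4 * det.real
# Hermiticity forces disc = (a - d)^2 + 4|b|^2 >= 0, so both eigenvalues
# are real, as observed values must be:
assert disc >= 0
lo = (tr - math.sqrt(disc)) / 2
hi = (tr + math.sqrt(disc)) / 2
print(lo, hi)  # 1.0 4.0
```

Von Neumann's extra technical conditions matter precisely where this finite-dimensional picture breaks down.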
<p>
I am not trying to say that Heisenberg was stupid. But it is striking that a world-famous physicist could get a Nobel Prize for using operators as observables, and still be oblivious to the formal mathematical definition found in textbooks. We cannot expect physicists to understand mathematical subtleties. Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com2tag:blogger.com,1999:blog-8148573551417578681.post-9735872141960624062024-02-01T06:00:00.000-08:002024-02-01T06:00:00.246-08:00The World is not DiscreteSome people like to say that Quantum Mechanics makes the world discrete. That is not true. But I always assumed that QM models could be approximated by lattice models.
<p>
Apparently this is not true. We know that the weak force is chiral, ie, it violates mirror reflection symmetry. Neutrinos are left-handed in the Standard Model.
<p>
From the
<a href="https://scottaaronson.blog/?p=7705#comment-1966495">Scott Aaronson blog</a>: <blockquote>“There is currently no fully satisfactory way of evading the Nielsen-Ninomiya theorem. This means that there is no way to put the Standard Model on a lattice. On a practical level, this is not a particularly pressing problem. It is the weak sector of the Standard Model which is chiral, and here perturbative methods work perfectly well. In contrast, the strong coupling sector of QCD is a vector-like theory and this is where most effort on the lattice has gone. However, on a philosophical level, the lack of lattice regularisation is rather disturbing. People will bang on endlessly about whether or not we live “the matrix’”, seemingly unaware that there are serious obstacles to writing down a discrete version of the known laws of physics, obstacles which, to date, no one has overcome.”</blockquote>
There is a whole industry of physicists doing lattice approximations to the SM, but the SM is chiral and the approximations are not, so there is no hope that the approximations converge to the SM.
<p>
Aaronson is commenting on the silly idea that we live in a computer simulation. If we did, it would raise another silly idea
that we could overwork the simulator by doing certain experiments.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com3tag:blogger.com,1999:blog-8148573551417578681.post-75844119615694467432024-01-29T06:00:00.000-08:002024-01-29T06:00:00.132-08:00Quantum Computer Revolution may be Further off<a href="https://spectrum.ieee.org/quantum-computing-skeptics">IEEE Spectrum reports</a>: <blockquote>
The quantum computer revolution may be further off and more limited than many have been led to believe. That’s the message coming from a small but vocal set of prominent skeptics in and around the emerging quantum computing industry.
<p>
The problem isn’t just one of timescales. In May, Matthias Troyer, a technical fellow at Microsoft who leads the company’s quantum computing efforts, co-authored a paper in Communications of the ACM suggesting that the number of applications where quantum computers could provide a meaningful advantage was more limited than some might have you believe.
<p>
“We found out over the last 10 years that many things that people have proposed don’t work,” he says. “And then we found some very simple reasons for that.”
...
<p>Even in the areas where quantum computers look most promising, the applications could be narrower than initially hoped. In recent years, papers from researchers at scientific software company Schrödinger and a multi-institutional team have suggested that only a limited number of problems in quantum chemistry are likely to benefit from quantum speedups. ...
<p>
“In the public, the quantum computer was portrayed as if it would enable something not currently achievable, which is inaccurate,” he says. “Primarily, it will accelerate existing processes rather than introducing a completely disruptive new application area. So we are evaluating a difference here.”
...
“Most problems in quantum chemistry do not scale exponentially, and approximations are sufficient,” he says. “They are well behaved problems, you just need to make them faster with increased system size.”
</blockquote>
Compare to the hype surrounding Artificial Intelligence (AI).
It is also over-hyped by its enthusiasts, but it has also delivered a lot of very impressive demonstrable results.
Quantum computing has delivered nothing, and may never deliver anything.
<p>
Someday we really will have personal robots and self-driving cars, but we may never have a useful quantum computer.
<p>
Google Research <a href="https://youtu.be/yZ8WRLC2RT8">just released a video</a>:
<blockquote>Quantum Computing - Hype vs. reality | Field Notes
<p>
Google Research<br>
35.7K subscribers
<p>
25,188 views Jan 22, 2024 #GoogleAI #GoogleResearch<br>
As the race to build the world's first truly useful quantum computer intensifies, so too does the need for clear-eyed assessment. This Field Notes episode brings in the Google Quantum AI team to help answer a few fundamental questions to drive understanding of its impact now and in the future.</blockquote>
It says quantum computers could become useful by 2030, or maybe a few years later.
<p>
The group is called "Quantum AI", but the video said nothing about AI. Just combining buzzwords,
I guess.
<p>
The most touted application was fusion simulations, in order to help bring fusion power
plants to market. Others were discovering drugs, and making the planet greener with
chemistry for better batteries and fertilizer.
<p>
No mention of breaking everyone's cryptosystems. That is the only thing quantum computer
enthusiasts are sure about.
<p>
I am amazed that Google keeps funding this pipe dream. It has <a href="https://killedbygoogle.com/">canceled hundreds
of really useful products</a>.
It has developed some really good AI, but is too chicken to market it the way OpenAI and Microsoft do.
Elsewhere <a href="https://blog.research.google/2020/03/announcing-tensorflow-quantum-open.html">Google touts quantum machine learning</a>, but I doubt this will ever be practical.
The non-quantum methods are progressing rapidly, and there is no sign
that quantum computers would be useful.
<p>
Self-driving cars are over-hyped, but I believe we are making progress and will get
there. I do not think that we are getting any closer to quantum computing.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com1tag:blogger.com,1999:blog-8148573551417578681.post-34259956966857685362024-01-26T06:00:00.000-08:002024-01-26T06:00:00.134-08:00The Evidence for CO2 Global WarmingSabine Hossenfelder <a href="https://www.youtube.com/watch?v=J1KGnCj_cfM">posts</a>: <blockquote>How do we know climate change is caused by humans?
<p>
In this video I summarize the main pieces of evidence that we have which show that climate change is caused by humans. This is most important that we know in which frequency range carbon dioxide absorbs light, we know that the carbon dioxide ratio in the atmosphere has been increasing, we know that the Ph-value of the oceans has been decreasing, the ratio of carbon isotopes in the atmosphere has been changing, and the stratosphere has been cooling, which was one of the key predictions of climate models from the 1960s.
</p></blockquote>
She says this info is hard to find, but I found the same info as the first link from a search, a <a href="https://www.ucsusa.org/resources/are-humans-major-cause-global-warming">2009 article</a>:
<blockquote>How Do We Know that Humans Are the Major Cause of Global Warming?</blockquote>
YouTube also slaps an obnoxious "context" link on the video, with some info. You also get the same info from ChatGPT.
<p>
The evidence is that humans burning fossil fuels emit CO<sub>2</sub>, and the increases in atmospheric CO<sub>2</sub> have
caused warming. Probably most of the warming observed in recent decades.
</p><p>
My quibble is when they leap from this to saying that humans cause most of the climate change. The climate is changing a lot
of different ways, in different places. I do not see anyone even trying to quantify climate change. Just CO<sub>2</sub> and temperature.</p><p></p>Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com2tag:blogger.com,1999:blog-8148573551417578681.post-33717645260084529322024-01-22T06:00:00.001-08:002024-01-22T06:00:00.156-08:00Albert Explains Flaws in Many-Worlds<a href="https://youtu.be/vkVn2oQD_u0?si=2INxxTvUh0pG5CyF">Newly-released video</a>: <blockquote>
David Albert - What Does Quantum Theory Mean?
<p>
Quantum theory may be weird—superposition and entanglement of particles that in our normal world would make no sense—but quantum theory is truly how the microworld works. What does all this weirdness mean? How to go from microworld weirdness to macroworld normalcy? Will we ever make sense out of quantum mechanics?</p></blockquote>
Albert is a physicist-turned-philosopher, and he explains this pretty well.
<p>
He goes on to say that more and more physicists are adopting the many-worlds interpretation.
He says it is counter-intuitive, but does not reject it for that reason. He rejects
it because it does not explain the world.
</p><p>
In his opinion, it does not really solve the measurement problem, for two reasons.
</p><p>
(1) it tries to explain the definite outcomes as an illusion. Maybe this position could be justified some day.
</p><p>
(2) it cannot explain the probabilities we see, as many-worlds says all outcomes are determined.
</p><p>
He admits that physicists have done a lot of contortions to try to get around these issues, but they have failed.
</p><p>
"At the end of the day, it does not account for our experience."
</p><p>
I agree with him on these points. Perhaps mathematical physicists will develop a decoherence theory showing that the wave function branching resembles what we see. It hasn't happened yet, but it is possible.
<p>
But many-worlds will never explain the probabilities, because the whole point of many-worlds is
to reject probabilities. The parallel worlds arise because probabilities are interpreted as world splittings,
and all possibilities are realized in inaccessible alternate worlds.
<p>
So why are more and more physicists adopting such a wrong theory? No answer given. Physicists are losing
their grip on reality.</p>Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com4tag:blogger.com,1999:blog-8148573551417578681.post-64664517457636614142024-01-18T17:25:00.000-08:002024-01-18T17:25:00.128-08:00Higgs Boson did not Revolutionize PhysicsDavid Berlinski <a href="https://evolutionnews.org/2012/11/surely_its_disc/">wrote in a 2012 essay</a>: <blockquote>The discovery was announced; the story reported; and then there was silence. Physicists endeavoured, of course, to maintain the impression that they had discovered something of inestimable value. They were game. Writing in The Daily Beast, Sean Carroll predicted that the Higgs Boson would “revolutionize physics,” and if this is what physicists always say, then at least they seem never weary of saying it.
<p>
Lawrence Krauss, writing in The Daily Beast as well, gave it his best. Many years ago, Leon Lederman had designated the Higgs Boson as the God particle. No one can today remember why. The God particle? “Nothing could be further from the truth,” Krauss remarked. In this, of course, he was entirely correct: Nothing could be further from the truth.
<p>
In the end, Krauss, like Carroll before him, could do no better than an appeal to the revolution. The discovery of the Higgs Boson “validates an unprecedented revolution in our understanding of fundamental physics …” Readers of The Daily Beast are always pleased to uphold the revolution, no matter how revolting. Yet, the Standard Model was completed in the early 1970s, </blockquote>
From what I have found, calling everything a <i>revolution</i> stems from calling the Copernicus heliocentric model the Copernican Revolution, because the Earth revolved around the Sun. It was a weak pun. And then that was so important, it became the Scientific Revolution.
<p>
Finding the Higgs Boson just confirmed what people thought 50 years earlier. Not a revolution.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com1tag:blogger.com,1999:blog-8148573551417578681.post-66316550711822917302024-01-15T06:00:00.000-08:002024-01-15T06:00:00.130-08:00What would it have looked like?Lawrence Krauss likes to tell <a href="https://www.econlib.org/archives/2015/02/the_wittgenstei.html">this anecdote</a> about the nature of science: <blockquote>“Tell me,” the great twentieth-century philosopher Ludwig Wittgenstein once asked a friend, “why do people always say it was natural for man to assume that the sun went around the Earth rather than that the Earth was rotating?” His friend replied, “Well, obviously because it just looks as though the Sun is going around the Earth.” Wittgenstein responded, “Well, what would it have looked like if it had looked as though the Earth was rotating?”</blockquote>
For example, he <a href="https://youtu.be/XicDuZai9go?t=490">tells it in this interview</a>,
where he attributes it to a play, so it might be fiction. He <a href="https://youtu.be/jbzfBE0FwkI?si=xDMxvYwNq-520KGw&t=2603">tells it again here</a>, plugging his latest book.
<p>
It is a good story. Just because your data fits your model, you cannot conclude that your model is right. There could be a completely different model that fits just as well.
<p>
I am not sure what point Krauss was making. He seems to be saying that the many-worlds theory
would look just like the Copenhagen interpretation of quantum mechanics. This is not a great example, because in our world we see more probable events as more likely. In many-worlds theory,
there is no known reason for that happening. It is like saying the world is a simulation.
It does not look like a simulation unless you also assume that the simulator has
replicated natural laws very accurately.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com2tag:blogger.com,1999:blog-8148573551417578681.post-76752873802788574492024-01-10T06:00:00.000-08:002024-01-10T06:00:00.133-08:00If a Proton is just Bits, it must be a lot<a href="https://en.wikipedia.org/wiki/Seth_Lloyd">Seth Lloyd</a> <a href="https://youtu.be/tv1pTKeHjmI?t=188">argues that matter is made of information</a>: <blockquote>Does information work at the deep levels of physics, including quantum theory, undergirding the fundamental forces and particles? But what is the essence of information—describing how the world works or being how the world works. There is a huge difference. Could information be the most basic building block of reality?
<p>
Seth Lloyd is a professor of mechanical engineering at the Massachusetts Institute of Technology. He refers to himself as a “quantum mechanic”.</blockquote>
Okay, but when challenged to account for a proton, he says that a proton is fully described by 50-60 bits for its location
in the universe, plus 1 bit for spin up or down.
<p>
What? The radius of the observable universe is about 4x10<sup>28</sup> cm, giving a volume of about 3x10<sup>86</sup> cm<sup>3</sup>. Specifying a location to the nearest cubic cm takes log<sub>2</sub> of that number of cells, nearly 290 bits.
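As a rough order-of-magnitude check, here is the arithmetic in a few lines of Python. The radius figure and the cubic-centimetre cell size are my own back-of-the-envelope assumptions, not Lloyd's numbers:

```python
import math

# Back-of-the-envelope estimate; the radius is approximate.
radius_cm = 4e28                                   # observable universe radius, ~4x10^28 cm
volume_cm3 = (4 / 3) * math.pi * radius_cm ** 3    # ~3x10^86 cubic cm

# Bits needed to single out one cubic-centimetre cell:
bits = math.log2(volume_cm3)
print(round(bits))   # roughly 287 bits -- far more than 50-60
```

Even at this coarse one-cubic-centimetre resolution, the count is already several times Lloyd's figure.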
<p>
A cubic cm is a lot of space for a proton; pinning it down to some smaller region takes even more bits. And the universe could be bigger than what is observable.
<p>
But that is not my issue here. The proton also has a velocity, which needs many more bits.
<p>
And spin is not just one bit. Spin could point in any direction, not just up or down.
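To make that concrete, here is a small sketch (my own illustration, not Lloyd's) of why a spin-1/2 state carries two continuous parameters rather than one bit: up to an overall phase, the state is a point (theta, phi) on the Bloch sphere, and any direction between "up" and "down" is allowed.

```python
import cmath
import math

def spin_state(theta, phi):
    """Amplitudes (a, b) for a|up> + b|down>, parameterized by Bloch-sphere angles."""
    return (math.cos(theta / 2), cmath.exp(1j * phi) * math.sin(theta / 2))

def expect_sz(state):
    """Expected spin along z, in units of hbar/2: |a|^2 - |b|^2 = cos(theta)."""
    a, b = state
    return abs(a) ** 2 - abs(b) ** 2

# A state tilted 60 degrees from "up": neither 1 bit nor the other.
s = spin_state(math.pi / 3, 0.7)
print(expect_sz(s))   # cos(pi/3), i.e. 0.5 (up to floating-point error)
```

The two real angles can take a continuum of values, which is the point: "1 bit for spin up or down" only describes the outcome of one chosen measurement, not the state.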
<p>
None of these proton parameters can be specified precisely, because of Heisenberg uncertainty.
A proton has a wave function; it does not have a definite position and momentum at the same time.
So how many bits are needed for a wave function?
<p>
But then the wave function is not even real, so I don't know if it makes sense to ask how
many bits are needed for a wave function.
<p>
So if a proton is equivalent to some number of bits of information, I don't know how to
calculate that number. Lloyd is underestimating it.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com3tag:blogger.com,1999:blog-8148573551417578681.post-53609390593209418122024-01-08T06:00:00.013-08:002024-01-08T06:00:00.152-08:00Remembering Voigt in the Relativity Priority DisputeFrom a <a href="http://philsci-archive.pitt.edu/16962/">2019 paper</a> on the origin of special relativity:
<blockquote> Voigt transformations in retrospect: missed
opportunities? ...
<p>Nearly two decades before the vigorous development of special relativity has started, in 1887 Woldemar Voigt published an article on the
Doppler effect in which some fundamental principles underlying the relativity theory were anticipated. Namely, he was the first who used Einstein’s second postulate (universal speed of light) and the restricted form
of the first postulate (invariance of the wave equation when changing the
inertial reference system) to show that the Doppler shift of frequency was
incompatible with Newtonian absolute time and required a relative time
identical with the Lorentz’s local time introduced later.</blockquote>
Voigt's paper was not appreciated. The paper moves on to "the pointless Einstein-Poincaré priority dispute."
<blockquote>In
particular we are interested in to find out why the role played by Poincaré
was not properly acknowledged at that time by his contemporaries. Our
hypothesis is that this happened because Poincaré’s approach required
a higher level of mathematical education than the majority of physicists
had at that time. Minkowski belonged to a few who were in a position to
duly appreciate Poincaré’s contribution. </blockquote>
Poincare and Minkowski died in the next several years, so that partially explains why they
did not take part in a priority dispute with Einstein. The paper acknowledges that Einstein lied
about his sources all his life.
<p>
Many credit Einstein for discovering clock synchronization and the relativity of simultaneity in 1905,
but that is clearly false:
<blockquote>Already in 1898, “Poincaré had presented
exactly the same light signaling and clock synchronization thought experiment that would later be found in Einstein’s 1905 relativity paper”
[19], although Poincaré’s presentation is without any mention of the relativity principle and Lorentz’s local time.
Two years later in his lecture “Lorentz’s theory and the principle
of reaction” Poincaré used his light signaling and clock synchronization
thought experiment to explain the physical meaning of the Lorentz’s
local time [19]. ...
<p>
In 1902 letter to the Nobel committee to nominate Lorentz
for the Nobel prize in Physics, which he indeed was awarded, Poincaré
praises very highly Lorentz’s “most ingenious invention” of “local time”
and writes: “Two phenomena happening in two different places can appear simultaneous even though they are not: everything happens as if
the clock in one of these places were late with respect to that of the
other, and as if no conceivable experiment could show evidence of this
discordance” [19].
</blockquote>
Minkowski was the much bigger influence on acceptance of relativity:
<blockquote> Minkowski’s September 21st, 1908, lecture “Space and Time” was a
crucial event in the history of relativity. ...
<p>The influence of the Cologne lecture was enormous. Its published
version “sparked an explosion of publications in relativity theory, with
the number of papers on relativity tripling between 1908 (32 papers)
and 1910 (95 papers)” [31]. The response to the Minkowski’s lecture
was overwhelmingly positive on the part of mathematicians, and more
mixed on the part of physicists — only in the 1950s their attitude began
to converge toward Minkowski’s space-time view [31]. ...
<p>
However, in our opinion, to make the decision to exclude Poincaré’s name
from the Cologne lecture Minkowski needed some serious reason to psychologically justify such an unfair omission. </blockquote>
The geometry of special relativity was only appreciated by mathematicians:
<blockquote>A surprising fact about Minkowski’s “Raum und Zeit” lecture is
that it never mentions Klein’s Erlangen program of defining a geometry by its symmetry group [27]. A link between Minkowski’s presentation of special relativity and Erlangen program was immediately recognized by Felix Klein himself [116] who remarked: “What the modern
physicists call the theory of relativity is the theory of invariants of the
4-dimensional space-time region x, y, z, t (the Minkowski ’world’) under
a certain group of collineations, namely, the ’Lorentz group’ ”. Untimely death of Minkowski presumably hindered the appreciation of this
important fact by physicists. </blockquote>
It concludes:
<blockquote>In parallel to the advance in modern physics, in the middle of the twentieth century it became increasingly evident that Poincaré’s contribution to relativity was unjustly downplayed. As a result, some attempts
to restore the justice followed. ...
<p>
Most succinctly this difference was expressed by Lorentz
himself: “the chief difference being that Einstein simply postulates what
we have deduced, with some difficulty, and not altogether satisfactorily,
from the fundamental equations of the electromagnetic field” [133].
<p>
Poincaré’s objective was much more ambitious
than Einstein’s as he wanted to derive special relativity as an emergent
phenomenon. It is quite possible therefore that Poincaré simply considered Einstein’s contribution as being too trivial in light of this bigger
goal. “To Poincaré, Einstein’s theory must have been seen as a poor attempt to explain a small part of the phenomena embraced by the Lorentz
theory” [135].
<p>
There is still another aspect which makes Einstein-Poincaré priority dispute pointless. Modern understanding of relativity is significantly
different from the one that was cultivated at the beginning of the twentieth century. Two examples are the notions of æther and relativity of
simultaneity which are often used in the priority dispute. ...
<p>
Usually this stubbornness of Poincaré with respect to the æther is
considered as his weak point, as an evidence that he didn’t really understand relativity. It is historically true that the abolishment of the æther
by Einstein played a crucial role and revolutionized physics. However,
frankly speaking, in retrospect, when this revolution came to its logical
end in modern physics, we can equally well consider Poincaré’s attitude
as prophetical.
<p>
As modern physics has progressed in the twentieth century, it became increasingly evident that the vacuum, the basic state of quantum
field theory, is anything but empty space. In fact, at present an æther,
“renamed and thinly disguised, dominates the accepted laws of physics”
[141]. It is clear that only “intellectual inertia” [142] prevents us from using historically venerable word “æther” instead of “vacuum state” when
referring to the states with such complex physical properties as vacuum
states of modern quantum theories.
<p>
Poincaré proponents in the priority dispute argue that Einstein synchronization, which Einstein himself considered as the crucial element
of special relativity, has in fact originated from Poincaré’s work.
<p>
In light of this immense and still continuing progress of modern physics, attempts to retrospectively induce an artificial Poincaré-Einstein priority dispute and rewrite the history seem minute. We will
be happy if this arid and futile dispute will come to its end. There is
nothing scientific in it and its presence only emphasizes hideous traits
of human nature. </blockquote>
Did the authors of this paper think that they were going to write the last word on the subject?
<p>
This paper is convincing that Lorentz and Poincare had all of relativity theory before Einstein, that Einstein lied about his sources to get more credit for himself, and that in retrospect the Poincare-Minkowski view was superior to Einstein's.
<p>
However the authors think that it is unfair to judge Einstein in retrospect, as no one could have known which ideas would be more important later.
It took 50 years, the authors say, for the Physics community to come around to the Poincare-Minkowski geometric view.
The paper puts a lot of weight on the opinion of Max Born, who was a friend of Einstein, and who was greatly influenced by Einstein's 1905 paper.
But there is not much substance to Born's opinion. Many people are greatly influenced by a textbook, but that does not mean that the textbook is original. While Born was a relativity expert, it is not clear that he understood Poincare's papers.
<p>
I might agree that the priority dispute is tiresome and settled, except that the Physics community continues to idolize Einstein as the greatest genius ever, for how he discovered relativity. For example, see <a href="https://www.discovermagazine.com/the-sciences/the-10-greatest-scientists-of-all-time">this recent Discover magazine list</a> of the ten greatest scientists of all time, where Einstein is number one, mainly for relativity work. (Three of the other nine are women, but that is another story.)Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com0tag:blogger.com,1999:blog-8148573551417578681.post-78520461023066814872024-01-05T06:00:00.000-08:002024-01-05T06:00:00.133-08:00Nature has Fake Franklin ControversyNature magazine declared its <a href="https://www.nature.com/articles/d41586-023-04046-7">favorite science stories of the year</a>, and one of the top ones was <a href="https://www.nature.com/articles/d41586-023-01313-5">What Rosalind Franklin truly contributed to the discovery of DNA’s structure</a>: <blockquote>Rosalind Franklin was not a ‘wronged heroine’, she was an equal contributor to the discovery.</blockquote>
No, she was not an equal contributor. She had little contact with Crick and Watson.
<p>
I am all for crediting her for what she did. She did valuable work on DNA that got used
by Crick and Watson. But this Nature article added nothing new. The story is well-known,
and can be found on Wikipedia.
<p>
This story is politicized, because people hate Watson for saying that people have
genetic differences, and love to put Franklin on lists of great XX century scientists,
because she was a woman.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com0tag:blogger.com,1999:blog-8148573551417578681.post-11141704225859076982024-01-03T06:00:00.000-08:002024-01-03T06:00:00.242-08:00Free Will is Indetermined BehaviorPhilosopher Ned Block <a href="https://youtu.be/JNaEvSeFGNs?si=oewLHdIF8bpu8WF9&t=80">argues</a>: <blockquote> Determinism ... Indeterminism, though, is just as bad because if you do something by chance,
that doesn't mean it is done by you freely. This is a point made many years ago. It looks like
both determinism and indeterminism are incompatible with free will, which shows there is something wrong with the concept. </blockquote>
Yes, the argument has been made many times, and it is nonsense.
<p>
Saying that someone's choices are indetermined is essentially the same as saying that he is free to
make a choice.
<p>
If you can predict my choice, then it is apparently determined by past events. But if I make a
free choice, then you cannot predict it, and it seems like random chance.
<p>
His argument is like saying electrons do not exist. An electron is a charged particle, so its
charge must be positive or negative. If the charge is positive, then it is not an electron.
A negative charge is just as bad, because then it would be a negative charge carrier,
instead of an electron.
<p>
The argument does not say anything.
<p>
It is amazing how many philosophers and other scholars swallow this nonsense argument.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com0tag:blogger.com,1999:blog-8148573551417578681.post-60014340594496169352024-01-01T06:00:00.000-08:002024-01-01T07:30:05.417-08:00History of General Relativity DevelopmentGalina Weinstein <a href="https://arxiv.org/abs/2311.04612">writes in a new paper</a>: <blockquote>This analysis explores Einstein's evolving ideas and decisions regarding the mathematical framework of his theory of gravity during the critical period 1912-1916. My findings in this paper highlight that Einstein's brilliance did not exist in isolation but thrived within a vibrant scientific discourse. His work was significantly enriched through contributions and discussions with friends and colleagues, notably Michele Besso and Marcel Grossmann, illustrating the collaborative essence of scientific advancement.</blockquote>
Yes, I think that is correct. Einstein got most of the crucial ideas from others.
<blockquote>Einstein’s theory of general relativity is widely regarded as one of the most significant breakthroughs in the history of physics. It challenged established notions and
expanded the boundaries of our understanding, unveiling a new vision of spacetime
and gravity.
<p>
Many intriguing questions surround Einstein’s groundbreaking achievements. Was
the theory of general relativity solely the creation of Einstein, the solitary figure who
would seclude himself in an office with his violin, pipe, and a stack of papers? Or
was it the culmination of Einstein’s multifaceted collaborations and interactions with
other scientists? </blockquote> She has written a book to give a long answer.
<p>
As I see it, special relativity was the more significant breakthrough. After that,
it was clear that what was needed was a Lorentz-invariant gravity theory that locally looked
like Minkowski space and approximated Newtonian gravity. The main obstacle was
mastering the machinery of Riemannian geometry.
<p>
The available covariant tensors were the Riemann tensor, Ricci tensor, metric tensor, and scalar curvature.
<p>
The field equations seem complex, but in the empty space between stars and planets they just say that the Ricci tensor is zero. It is not clear who first had that idea.
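To state that in symbols (a standard modern formulation, not something taken from Weinstein's paper): the field equations set a curvature tensor proportional to the stress-energy tensor, and in vacuum they reduce to the vanishing of the Ricci tensor.

```latex
% Einstein field equations (cosmological term omitted):
R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}
% In empty space T_{\mu\nu} = 0; taking the trace then gives R = 0,
% so the equations reduce to:
R_{\mu\nu} = 0
```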
<p>
There is a Wikipedia page on <a href="https://en.wikipedia.org/wiki/General_relativity_priority_dispute">General relativity priority dispute</a>.
<p>
<a href="https://www.azquotes.com/quote/823611">Einstein once said</a>:
<blockquote>Thanks to my fortunate idea of introducing the relativity principle into physics, you (and others) now enormously overrate my scientific abilities, to the point where this makes me quite uncomfortable.</blockquote>
He did not introduce the relativity principle. It was not his idea.
He got it from Poincare. But yes, Einstein was greatly overrated because he was falsely
credited with special relativity.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com0tag:blogger.com,1999:blog-8148573551417578681.post-12603316318539502402023-12-26T06:00:00.000-08:002023-12-26T06:00:00.132-08:00Q-Day Predicted for 2025<a href="https://www.reuters.com/investigates/special-report/us-china-tech-quantum/">Reuters reports</a>:<blockquote>QD5’s executive vice president, Tilo Kunz, told officials from the Defense Information Systems Agency that possibly as soon as 2025, the world would arrive at what has been dubbed <b>“Q-day,” the day when quantum computers make current encryption methods useless</b>. Machines vastly more powerful than today’s fastest supercomputers would be capable of cracking the codes that protect virtually all modern communication, he told the agency, which is tasked with safeguarding the U.S. military’s communications.
<p>
In the meantime, Kunz told the panel, a global effort to plunder data is underway so that intercepted messages can be decoded after Q-day in what he described as “<b>harvest now, decrypt later</b>” attacks, according to a recording of the session the agency later made public.
<p>
Militaries would see their <b>long-term plans and intelligence gathering exposed to enemies</b>. Businesses could have their intellectual property swiped. People’s health records would be laid bare. ...
<p>
Kunz is among a growing chorus sounding this alarm. Many cyber experts believe <b>all the major powers are collecting ahead of Q-day</b>. The United States and China, the world’s leading military powers, are accusing each other of data harvesting on a grand scale.</blockquote>
Okay, I am marking the calendar. 2025 is only a year away.
<p>
I think that there is no chance of a Q-Day in my lifetime, but it is nice to have these predictions.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com7tag:blogger.com,1999:blog-8148573551417578681.post-38735713366975552572023-12-24T06:00:00.000-08:002023-12-24T06:00:00.129-08:00Guth on ObserversHere is a <a href="https://youtu.be/y86FgGYV6cM">short interview</a>: <blockquote>
Alan Guth - What are Observers?
<p>
Why is an observer a critical part of quantum physics? What does it mean to be an observer? Does the act of observation affect what exists and what happens in the external world? Why is observation in the quantum world still a mystery?</blockquote>
He accepts many-worlds theory, and claims most of his colleagues do. He says it is simpler because
you just accept the Schroedinger equation, and you eliminate the need for observers or for making
predictions.
<p>
So a theory is simpler if you do not worry about observations.
<p>
This may sound bad, he says, but it ties in nicely with the eternal inflation cosmology theory.
That has infinitely many universes being spawned for other reasons, and the infinities
make probabilities hard to understand.
<p>
<a href="https://en.wikipedia.org/wiki/Alan_Guth">Guth</a> is a big-shot MIT Physics professor.
It baffles me how smart guys can recite this nonsense.
<p>
Sure, you can simplify a theory by removing the part that allows predictions to be compared
with observations. But then what good is the theory?
<p>
There are no infinities in nature.
<p>
You can say that collapse of the wave function is not needed, if only we better understood
how wave functions evolve into disparate pieces. Then we could focus on the piece
that applies to observations in our world. But that is just another way of saying the
function collapsed, with the other pieces being unreachable. Many worlds theory does not explain anything.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com1tag:blogger.com,1999:blog-8148573551417578681.post-44376874486396387782023-12-21T06:00:00.000-08:002023-12-21T06:00:00.132-08:00Relativity was not Influenced by Philosophy<a href="https://www.theguardian.com/commentisfree/2019/feb/24/einstein-got-it-philosophy-and-science-do-go-hand-in-hand">2019 article</a>: <blockquote>Last week it was revealed that Edinburgh University’s David Purdie had discovered a letter from Albert Einstein in which the great scientist notes the importance of 18th-century Scottish philosopher David Hume in developing his theory of special relativity.
<p>
Without having read Hume’s A Treatise of Human Nature, Einstein wrote: “I cannot say that the solution would have come.”
<p>
Historians have, in fact, long known about Einstein’s debt to Hume, and indeed about that letter. They’ve known, too, about the influence on Einstein of many other philosophers, from Ernst Mach to Arthur Schopenhauer. Part of what many find intriguing about the story is the idea that scientific theories should be shaped by philosophical ideas. It has become common for scientists to dismiss philosophy as irrelevant to their work.</blockquote>
The flaw in this argument is that Einstein had almost nothing to do with the discovery of special relativity. He wrote a 1905 paper that is credited heavily today, but at the time it was just an exposition of Lorentz's theory, and soon superseded by papers by Poincare and Minkowski. Relativity became popular from Minkowski, not Einstein.
<p>
I have argued that a belief in causality could have led natural philosophers to the basic ideas of relativity, but it did not.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com0tag:blogger.com,1999:blog-8148573551417578681.post-15808507503780651042023-12-18T06:00:00.000-08:002023-12-18T06:00:00.128-08:00Do Black Holes have Singularities?<a href="https://arxiv.org/abs/2312.00841">New paper</a>: <blockquote>There is no proof that black holes contain singularities when they are generated by real physical bodies. Roger Penrose claimed sixty years ago that trapped surfaces inevitably lead to light rays of finite affine length (FALL's). Penrose and Stephen Hawking then asserted that these must end in actual singularities. When they could not prove this they decreed it to be self evident. It is shown that there are counterexamples through every point in the Kerr metric. These are asymptotic to at least one event horizon and do not end in singularities. </blockquote>
It is by the same Kerr who found the general relativity solution for rotating black holes.
<p>
I will have to read this. It is hard to believe that everyone exaggerated the Penrose-Hawking
singularity theorems.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com3tag:blogger.com,1999:blog-8148573551417578681.post-67881811159494379642023-12-14T06:00:00.000-08:002023-12-14T06:00:00.131-08:00Why we use the Lebesgue IntegralNon-mathematicians are often baffled at why mathematicians seek generalizations and abstractions, and often think that the abstractions can have no practical purpose.
<p>
Here is an example. The Riemann integral appears to suffice for any function of practical interest, and yet mathematicians insist on defining a Lebesgue integral to handle a wider variety of cases.
<p>
Andrew D. Lewis <a href="https://arxiv.org/abs/2309.08908">writes</a>: <blockquote>Should we fly in the Lebesgue-designed airplane? -- The correct defence of the Lebesgue integral
<p>
It is well-known that the Lebesgue integral generalises the Riemann integral. However, as is also well-known but less frequently well-explained, this generalisation alone is not the reason why the Lebesgue integral is important and needs to be a part of the arsenal of any mathematician, pure or applied. ...
<p>
The title of this paper is a reference to the well-known quote of the applied mathematician
and engineer Richard W. Hamming (1915–1998):
<blockquote>Does anyone believe that the difference between the Lebesgue and Riemann
integrals can have physical significance, and that whether say, an airplane
would or would not fly could depend on this difference? If such were claimed,
I should not care to fly in that plane.</blockquote></blockquote>
The paper goes on to explain this very well. In particular, it shows why the Riemann integral is not good enough.
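A standard illustration of the gap (my example, not taken from the paper): let f_n be the indicator function of finitely many rationals in [0,1]. Each f_n is Riemann integrable with integral 0, but the pointwise limit over all rationals is the Dirichlet function, which has no Riemann integral at all. A sketch in Python with exact rational arithmetic:

```python
from fractions import Fraction

# A finite chunk of the limit set: rationals in [0,1] with denominator < 20.
rationals = {Fraction(p, q) for q in range(1, 20) for p in range(q + 1)}

def f_n(x):
    """Indicator of finitely many rationals: Riemann integrable, integral 0."""
    return 1 if x in rationals else 0

def riemann_midpoint(f, num=1024):
    """Midpoint Riemann sum on [0,1].  With num a power of two, every midpoint
    (2k+1)/2048 is already in lowest terms, so none lands on our rationals."""
    return sum(f(Fraction(2 * k + 1, 2 * num)) for k in range(num)) / Fraction(num)

# Each f_n integrates to 0 ...
print(riemann_midpoint(f_n))   # 0

# ... but for the limit (indicator of ALL rationals) every upper Riemann sum
# is 1 and every lower sum is 0, so the Riemann integral does not exist.
# The Lebesgue integral of the limit is 0, agreeing with the limit of the
# integrals -- the completeness property the paper defends.
```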
In short, the Lebesgue integral makes the L<sup>p</sup>(R) spaces complete normed vector spaces: if a sequence of functions is Cauchy in the norm, then it really does converge to a function in the space. This allows the use of limits in Fourier analysis, differential equations, and other areas of analysis.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com0tag:blogger.com,1999:blog-8148573551417578681.post-678855233589470822023-12-11T06:00:00.001-08:002023-12-11T06:00:00.235-08:00Deriving Lorentz Metric from Electromagnetism<a href="http://philsci-archive.pitt.edu/22538/">New paper</a>: <blockquote>Chen, Lu and Read, James (2023) Is the metric signature really electromagnetic in origin?</blockquote>
The paper is interesting, but the 4-metric signature +++- is mainly a consequence of causality, be it electromagnetic or anything else.
<p>
Causality requires that events only affect nearby events. If spacetime were Euclidean, with metric signature ++++, then an event could be close to an event outside its light cone. Affecting that nearby event would mean going faster than light. Action at a distance.
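To illustrate the role of the signature, here is a toy sketch (my own, in units where c = 1, not from the paper): with the +++- metric, two events can influence each other only when their separation is timelike or lightlike.

```python
def interval_squared(e1, e2):
    """Minkowski interval with signature (+,+,+,-); events are (x, y, z, t), c = 1."""
    dx, dy, dz, dt = (b - a for a, b in zip(e1, e2))
    return dx * dx + dy * dy + dz * dz - dt * dt

def causally_connectable(e1, e2):
    # A signal at or below light speed can link the events iff s^2 <= 0.
    return interval_squared(e1, e2) <= 0

# One light-second away, two seconds later: inside the light cone.
print(causally_connectable((0, 0, 0, 0), (1, 0, 0, 2)))   # True
# Two light-seconds away, one second later: would need a faster-than-light signal.
print(causally_connectable((0, 0, 0, 0), (2, 0, 0, 1)))   # False
```

With a ++++ signature there is no minus sign, s<sup>2</sup> is never negative, and this causal distinction disappears, which is the point of the post.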
<p>
Electromagnetic effects do not go faster than light. You need the non-Euclidean geometry of a +++- signature metric. Once you accept all that, Maxwell's electromagnetism is one of the simplest possible field theories compatible with the geometry.Rogerhttp://www.blogger.com/profile/03474078324293158376noreply@blogger.com2