Monday, June 29, 2015

Definition of a function

Many mathematical concepts have long histories, but did not actually get a modern definition until the XX century. I have argued that even the number line was invented in the late 19th or early 20th century. See also The real numbers - a survey of constructions:
The novice, through the standard elementary mathematics indoctrination, may fail to appreciate that, compared to the natural, integer, and rational numbers, there is nothing simple about defining the real numbers. The gap, both conceptual and technical, that one must cross when passing from the former to the latter is substantial and perhaps best witnessed by history. The existence of line segments whose length can not be measured by any rational number is well-known to have been discovered many centuries ago (though the precise details are unknown). The simple problem of rigorously introducing mathematical entities that do suffice to measure the length of any line segment proved very challenging. Even relatively modern attempts due to such prominent figures as Bolzano, Hamilton, and Weierstrass were only partially rigorous and it was only with the work of Cantor and Dedekind in the early part of the 1870’s that the reals finally came into existence.
Jeremy Avigad writes:
Today, we think of a “function” as a correspondence between any two mathematical domains: we can talk about functions from the real numbers to the real numbers, functions that take integers to integers, functions that map each point of the Euclidean plane to an element of a certain group, and, indeed, functions between any two sets. As we began to explore the topic, Morris and I learned that most of the historical literature on the function concept focuses on functions from the real or complex numbers to the real or complex numbers. ...

Even the notion of a “number theoretic function,” like the factorial function or the Euler function, is nowhere to be found in the literature; authors from Euler to Gauss referred to such entities as “symbols,” “characters,” or “notations.” Morris and I tracked down what may well be the first use of the term “number theoretic function” in a paper by Eisenstein from 1850, which begins with a lengthy explanation as to why it is appropriate to call the Euler phi function a “function.” We struggled to parse the old-fashioned German, which translates roughly as follows:
Once, with the concept of a function, one moved away from the necessity of having an analytic construction and began to take its essence to be a tabular collection of values associated to the values of one or several variables, it became possible to take the concept to include functions which, due to conditions of an arithmetic nature, have a determinate sense only when the variables occurring in them have integral values, or only for certain value-combinations arising from the natural number series. For intermediate values, such functions remain indeterminate and arbitrary, or without any meaning.
When the gist of the passage sank in, we laughed out loud.
It is funny because it is so clumsy. It should be obvious that a function can have any domain, and have any definition on that domain.

It is easy to forget how subtle these concepts are, as their meanings have been settled for a century and explained in elementary textbooks. But they were not obvious to some pretty smart 19th century mathematicians. Even today, most physicists and philosophers have never seen rigorous definitions of these concepts.

Now a function can be defined as a suitable set of ordered pairs, once set theory machinery is defined. The domain of the function can be any set, and so can the range. These things seem obvious to mathematicians today, but it took a long time to get these concepts right. And concepts like infinitesimals are still widely misunderstood.
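The set-theoretic definition is short enough to check mechanically. Here is a minimal sketch (the function name and the toy check are mine) of a function as a set of ordered pairs, with the Euler phi function given as a bare table of values, much as Eisenstein described:

```python
# A function from a domain A is a set of ordered pairs (x, y) such that
# every x in A appears as the first coordinate of exactly one pair.
# The domain can be any set at all; no formula is required.

def is_function(pairs, domain):
    """Check that `pairs` defines a function on `domain`."""
    firsts = [x for (x, _) in pairs]
    return sorted(firsts) == sorted(domain) and len(set(firsts)) == len(firsts)

# Euler's phi, presented as a bare table of values on {1, ..., 6}.
phi = {(1, 1), (2, 1), (3, 2), (4, 2), (5, 4), (6, 2)}

print(is_function(phi, [1, 2, 3, 4, 5, 6]))  # True
print(is_function({(1, 1), (1, 2)}, [1]))    # False: 1 maps to two values
```

Nothing here cares whether the values come from an analytic expression or a table, which is exactly the point the 19th century struggled with.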

Saturday, June 27, 2015

Quantum computers will not help interpretations

MIT quantum complexity theorist Scott Aaronson is plugging a PBS essay on his favorite quantum speculations:
a principle that we might call “Occam’s Razor with Computational Aftershave.” Namely: In choosing a picture of physical reality, we should be loath to posit computational effort on Nature’s part that vastly exceeds what could ever in principle be observed. ...

Could future discoveries in quantum computing theory settle once and for all, to every competent physicist’s satisfaction, “which interpretation is the true one”? To me, it seems much more likely that future insights will continue to do what the previous ones did: broaden our language, strip away irrelevancies, clarify the central issues, while still leaving plenty to argue about for people who like arguing.
Lubos Motl defends Copenhagenism:
Quantum mechanics may be understood as a kind of a "black box", like a computer that spits the right result (probability of one observation or another). And we may learn how to perform the calculations that exactly reproduce how the black box works. This is a description that Feynman used to say, too. Some people aren't satisfied with that – they want to see something "inside" the black box. But there is nothing inside. The black box – a set of rules that produce probabilistic predictions for measurements out of past measurements – is the most fundamental description of Nature that may exist. Everything else is scaffolding that people add but they shouldn't because it's not a part of Nature.

Quantum computers won't change anything about the desire of the laymen to see something inside the black box. On the contrary, a quantum computer will be an even blacker box! You press a button, it does something that is completely incomprehensible to a layman, and announces a correct result very quickly, after a short time that experts say to be impossible to achieve with classical computers.
A lot of quantum interpretations fall into the trap of trying to say what is in that black box, without any way to say that it is right or wrong. Aaronson does this when he talks about how much computation Nature has to do in that box. He implicitly assumes that Nature is using the same mathematical representations that we are when we make macro observations.

I do not accept that assumption, as I have posted in essays here.

Eric Dennis comments to Aaronson:
I doubt anyone is really suspicious of the possibility of long coherence times for small systems. We’re suspicious of massive parallelism.
I question those long coherence times. Some of the experiments are like rolling a coin on the floor so that it eventually falls over to heads or tails, with equal probability. Or balancing a rock on the head of a pin so that it eventually falls, with all directions equally likely. Can this be done for a long time? Maybe with the coin, but not with the rock. Can the uncertainty during the long time be used to extract a super-Turing computation? No way.

Update: A company claims:
D-Wave Systems has broken the quantum computing 1000 qubit barrier, developing a processor about double the size of D-Wave’s previous generation, and far exceeding the number of qubits ever developed by D-Wave or any other quantum effort, the announcement said.

It will allow “significantly more complex computational problems to be solved than was possible on any previous quantum computer.”

At 1000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which dwarfs the 2^512 possibilities available to the 512-qubit D-Wave Two. “In fact, the new search space contains far more possibilities than there are particles in the observable universe.”
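The arithmetic in the press release, at least, is easy to check, since Python has exact big integers; a quick sketch, taking the usual rough estimate of 10^80 particles in the observable universe:

```python
# 2**1000 is computed exactly; compare it to the rough estimate
# of 10**80 particles in the observable universe.
search_space = 2**1000

print(len(str(search_space)))   # 302: a 302-digit number
print(search_space > 10**80)    # True: vastly more than the particle count
```

Of course, a big state space is not the same as the ability to search it, which is Aaronson's point.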
This annoys Aaronson as much as it annoys me, as he says that he has proved that a quantum computer cannot really search all those possibilities simultaneously.

D-Wave probably has a legitimate technological achievement, but they do not have any real qubits, and there are no previous quantum computers.

Thursday, June 25, 2015

Essays for and against MUH

I mentioned the FQXi essay contest winners, and here are a couple more.

French Canadian physicist Marc Séguin won one of the two second prizes with this:
My God, It’s Full of Clones: Living in a Mathematical Universe

Imagine there’s only math — physics is nothing more than mathematics, we are self-aware mathematical substructures, and our physical universe is nothing more than a mathematical structure “seen from the inside”. If that’s the case, I will argue that it implies the existence of the Maxiverse, the largest imaginable multiverse, where every possible conscious observation is guaranteed to happen. ...

In the end, I believe in the Maxiverse because it is the ultimate playground for the curious mind. Living forever… across wildly divergent realities… who could ask, literally, for anything more than the Maxiverse? And if I’m right, somewhere within its infinitely complex simplicity, one of my F-clones is having a drink with one of your F-clones, and we’re having a big laugh about it all. Cheers!
He endorses Max Tegmark's Mathematical Universe Hypothesis, and more. While Tegmark contradicts himself about whether he believes in infinity, Séguin believes that he exists in an infinite number of copies.

This is not even good science fiction.

Lee Smolin won a third prize, and after a lot of dopey comments, ends with:
In closing, I would like to mention two properties enjoyed by the physical universe which are not isomorphic to any property of a mathematical object.

1. In the real universe it is always some present moment, which is one of a succession of moments. Properties of mathematical objects, once evoked, are true independent of time.

2. The universe exists apart from being evoked by the human imagination, while mathematical objects do not exist before and apart from being evoked by human imagination.
The first property is silly. You can say that math objects are independent of time, just as you can say they are independent of space, temperature, energy, or any other physical property. Unless of course the math is interpreted as modeling those things, as it usually is in mathematical physics.

The second is just anti-Platonism. Many or most mathematicians believe that math objects like the real numbers do exist independently of humans.

Monday, June 22, 2015

Myth of the Dark Ages


An essay by Tim O'Neill calls this chart "The Most Wrong Thing On the Internet Ever". He elaborates here and here, as does OFloinn.

Similarly wrong is the conflict thesis:
The conflict thesis is the proposition that there is an intrinsic intellectual conflict between religion and science and that the relationship between religion and science inevitably leads to public hostility. Although the thesis in modern form remains generally popular, the original form of the thesis is no longer widely supported among historians.
This blog presents the view that math, science, and technology have been under continuous development for millennia. For the most part, Christianity has contributed to the development of science.

By contrast, the Marxist view is that history is driven by conflict between grand social forces, with sudden breakthrus and revolutions between these forces being key to explaining everything. So they will say that the Roman Popes caused the Dark Ages by suppressing knowledge of the motion of the Earth and by burning witches, until Galileo stood up to them and started the scientific revolution by leading the Protestant Reformation. Or some such nonsense.

If interested in the broader history, see Catholic Church and science and List of Roman Catholic cleric-scientists.

Leftist scientists are praising the Pope for Laudato si. That book endorses consumer sacrifices and boycotts in order to promote Third World population growth, and it is all justified by some supposed consensus on global warming. The science of CO2 may be fine, but the rest is dubious. The theology is outside the scope of this blog, but I would like to see more analysis of whether the science is correct.

(Now he denounces "businessmen who call themselves Christian and they manufacture weapons.")

The Dark Ages are called dark because of the dearth of historical records, compared to the Roman Empire and other periods. It is like the dark side of the Moon, which got that name because we did not see it and knew almost nothing about it. No one is saying that sunlight failed to shine on the Dark Ages or the dark side of the Moon. But the sunlight from the dark side does not get to us. Likewise dark matter is dark because no starlight is scattered off it to us. And dark energy is dark. Read this blog for the Dark Buzz.

Saturday, June 20, 2015

Quantum computers attract commercial interest

The British Economist magazine is enthusiastic about quantum computing:
After decades languishing in the laboratory, quantum computers are attracting commercial interest ...

By exploiting certain quantum effects they can create bits, known as qubits, that do not have a definite value, thus overcoming classical computing’s limits.

Around the world, small bands of such engineers have been working on this approach for decades. Using two particular quantum phenomena, called superposition and entanglement, they have created qubits and linked them together to make prototype machines that exist in many states simultaneously. Such quantum computers do not require an increase in speed for their power to increase. In principle, this could allow them to become far more powerful than any classical machine — and it now looks as if principle will soon be turned into practice. Big firms, such as Google, Hewlett-Packard, IBM and Microsoft, are looking at how quantum computers might be commercialised. The world of quantum computation is almost here.

Ready or not, then, quantum computing is coming. It will start, as classical computing did, with clunky machines run in specialist facilities by teams of trained technicians. Ingenuity being what it is, though, it will surely spread beyond such experts’ grip. Quantum desktops, let alone tablets, are, no doubt, a long way away. But, in a neat circle of cause and effect, if quantum computing really can help create a room-temperature superconductor, such machines may yet come into existence.
No, this is crazy. No one has overcome any classical computing limits, no quantum computers are being commercialized, and there will not be any room-temperature superconductor.

There are many other technologies that are being commercialized after decades of languishing in the lab. Self-driving cars. Image identification. Voice recognition. Natural language processing. Robots.

In each of those areas, steady progress is being made. There are prototypes that qualify as a proof of concept. There may not be agreement about how far the technology will go, but it is obvious that commercial applications are coming.

Quantum computing does not qualify. There are lots of experiments that qualify as interesting tests of quantum mechanics. But there is no prototype that exceeds any classical computing limits, even on a small slow scale.

Most of you are going to say, "Why should I believe some stupid blogger saying it is impossible, when lots of smart people say this technology is coming, and they are backed by a lot of big money?"

There is no need to believe me. Just tell me how long you are willing to wait. What will you say if there is still no prototype in 2 years? 5 years? 10 years? 20 years?

This is the biggest research scam I've seen. String theory and the multiverse are scams, but at least those folks do not pretend to have commercial applications. There have been lots of over-hyped technologies before, such as fuel cells and hydrogen economy, but those are at least technological possibilities. There is never going to be a quantum computer that out-performs a Turing machine.

Thursday, June 18, 2015

More on infinitesimals

I criticized Sylvia Wenmackers, and she posted a rebuttal in the comments.

She explained what she meant by the hyperreals being incomplete. She is right that the hyperreals do not have the least upper bound property if you include non-internal sets. That is, the infinitesimals are bounded but do not have a least upper bound. But the bounded internal sets have least upper bounds.

The distinction is a little subtle. Arguments involving hyperreals mostly use internal sets, because then the properties of the reals can be used.
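The argument behind that subtlety fits in a couple of lines. Writing $I$ for the set of infinitesimals in the hyperreals, $I$ is bounded above by $1$, but no positive hyperreal $b$ can be its least upper bound:

```latex
\[
  b \in I \implies 2b \in I \text{ and } 2b > b
  \quad\text{(so $b$ is not an upper bound),}
\]
\[
  b \notin I \implies \tfrac{b}{2} \text{ is an upper bound of } I \text{ and } \tfrac{b}{2} < b
  \quad\text{(so $b$ is not least).}
\]
```

Since $I$ is not an internal set, this does not contradict the least upper bound property for internal sets.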

On another point, she refers me to this Philip Ehrlich article on the history of non-Archimedean fields. He says:
In his paper Recent Work On The Principles of Mathematics, which appeared in 1901, Bertrand Russell reported that the three central problems of traditional mathematical philosophy – the nature of the infinite, the nature of the infinitesimal, and the nature of the continuum – had all been “completely solved” [1901, p. 89]. Indeed, as Russell went on to add: “The solutions, for those acquainted with mathematics, are so clear as to leave no longer the slightest doubt or difficulty” [1901, p. 89]. According to Russell, the structure of the infinite and the continuum were completely revealed by Cantor and Dedekind, and the concept of an infinitesimal had been found to be incoherent and was “banish[ed] from mathematics” through the work of Weierstrass and others [1901, pp. 88, 90].
I think that it is correct that the continuum ( = number line = real numbers) was figured out in the late 19th century by Weierstrass, Dedekind, and others, and widely understood in the early XXc, as I explained here. See Construction of the real numbers for details on the leading methods. The standard real numbers do not include infinitesimals.

But "banished" is not the right word. It is more accurate to say that infinitesimal arguments were made rigorous with limits.

The Stanford Encyclopedia of Philosophy entry on Continuity and Infinitesimals starts:
The usual meaning of the word continuous is “unbroken” or “uninterrupted”: thus a continuous entity — a continuum — has no “gaps.” We commonly suppose that space and time are continuous, and certain philosophers have maintained that all natural processes occur continuously: witness, for example, Leibniz's famous apothegm natura non facit saltus — “nature makes no jump.” In mathematics the word is used in the same general sense, but has had to be furnished with increasingly precise definitions. So, for instance, in the later 18th century continuity of a function was taken to mean that infinitesimal changes in the value of the argument induced infinitesimal changes in the value of the function. With the abandonment of infinitesimals in the 19th century this definition came to be replaced by one employing the more precise concept of limit.

Traditionally, an infinitesimal quantity is one which, while not necessarily coinciding with zero, is in some sense smaller than any finite quantity. For engineers, an infinitesimal is a quantity so small that its square and all higher powers can be neglected. In the theory of limits the term “infinitesimal” is sometimes applied to any sequence whose limit is zero. An infinitesimal magnitude may be regarded as what remains after a continuum has been subjected to an exhaustive analysis, in other words, as a continuum “viewed in the small.” It is in this sense that continuous curves have sometimes been held to be “composed” of infinitesimal straight lines.

Infinitesimals have a long and colourful history. They make an early appearance in the mathematics of the Greek atomist philosopher Democritus (c. 450 B.C.E.), only to be banished by the mathematician Eudoxus (c. 350 B.C.E.) in what was to become official “Euclidean” mathematics. Taking the somewhat obscure form of “indivisibles,” they reappear in the mathematics of the late middle ages and later played an important role in the development of the calculus. Their doubtful logical status led in the nineteenth century to their abandonment and replacement by the limit concept. In recent years, however, the concept of infinitesimal has been refounded on a rigorous basis.
This mentions and explains what I have been calling my motto or slogan, only it calls it an "apothegm", whatever that is. My dictionary says "A short pithy instructive saying". The "g" is silent, and it is pronounced APP-u-thum. Okay, I'll accept that, and may even adopt the word.

Consider the above statement that a continuous curve is composed of infinitesimal straight lines. Taken literally, it seems like nonsense. It took mathematicians 3 centuries to make it rigorous, and you can find the result in mathematical analysis textbooks.
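One way to see what the rigorous version says: approximate the curve by tiny straight chords and take the limit as the chords shrink. A small numerical sketch (the function names are mine):

```python
import math

def polygonal_length(curve, a, b, n):
    """Sum the lengths of n straight chords inscribed in the curve."""
    pts = [curve(a + (b - a) * i / n) for i in range(n + 1)]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

# Quarter of the unit circle; the true arc length is pi/2 = 1.5707963...
circle = lambda t: (math.cos(t), math.sin(t))
for n in (4, 64, 1024):
    print(n, polygonal_length(circle, 0, math.pi / 2, n))
# The chord sums increase toward pi/2 as n grows; the limit is the length.
```

The "infinitesimal straight lines" survive as the limit of these chord sums, which is exactly the arc-length integral of the analysis textbooks.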

The common textbook explanations use infinitesimal methods and limits, but not hyperreals.

My problem with Wenmackers is that she treats infinitesimals as sloppy reasoning until the hyperreals came along, and relegates the conventional epsilon-delta arguments to a footnote, as if they might distract casual readers.

This is just wrong. The mainstream methods for rigorous infinitesimal methods use epsilons, deltas, limits, tangents, and derivatives. The hyperreals have their place as a fringe alternative view, but they are not central or necessary for rigor.

In quantum mechanics, the momentum operator is an infinitesimal translation symmetry. This does not mean that either sloppy reasoning or hyperreals are used. It means that infinitesimal methods were used to linearize the symmetry group at a point. This is essential to how quantum mechanics has been understood for almost 90 years.

I also mentioned that special relativity is the infinitesimal version of general relativity. So yes, infinitesimal analysis is essential to XXc physics. How else do you understand relativity and quantum mechanics?

She writes:
What I mean by "loose talk involving infinitesimals" "frowned upon by mathematicians" is that physicists often talk about infinitesimals in a way that is close in spirit to Leibniz's work (and hence to non-standard analysis), which is not compatible with the definition of the classical limit as mathematicians use it in standard analysis.
No, I disagree with this. Limits and standard analysis were invented to make work by Leibniz and others rigorous, and they are still accepted as the best way.

She is essentially saying that the hyperreals are a better way to make Leibniz's work rigorous. If that is so, then why do all the textbooks do it a different way?

There are a few hyperreal enthusiasts among mathematicians who believe the hyperreals are superior, but they are a very small minority. I doubt that there are any colleges that teach analysis that way.
She also writes:
I think many of your other remarks also boil down to the same point: I use the term 'infinitesimal' in a more restricted sense than the way you seem to interpret it.
This is like saying:
When I refer to atoms, I am not talking about the atoms that are commonly described in college chemistry textbooks. I mean the hyper-atoms that were recently conceived as being closer in spirit to the way that the ancient Greek Democritus talked about atoms.
Russell did not just try to banish infinitesimals. He also tried to banish causality from physics, and convinced modern philosophers that there is no such thing.

This is a problem with modern philosophers. They can say the most ridiculous things just because they are accepted by other philosophers and historians. If they want to know how Leibniz's work was made rigorous, they could knock on a mathematician's door and ask, "Did anyone make Leibniz's work rigorous?" He would say, "Sure, just take our Calculus I and Analysis I courses." If they asked, "What about the hyperreals?", he would say, "Yes, you can do it that way also, but we do not teach a class on it."

If I am wrong, please explain in the comments.

Monday, June 15, 2015

Contest winner misunderstands infinitesimals

FQXi has announced its annual essay contest winners, and the top prize went to a Belgian philosopher of physics, Sylvia Wenmackers, who wrote:
Essay Abstract
Our mathematical models may appear unreasonably effective to us, but only if we forget to take into account who we are: we are the children of this Cosmos. We were born here and we know our way around the block, even if we do not always appreciate just how wonderful an achievement that is.
My essay did not win any prizes. I suspect that the prizes would have been a lot different if the evaluation had been blinded (ie, if author names were removed during evaluation).

She has her own blog, and a grad student working on quantum teleportation and time travel.

The most substantive comments in her essay are about infinitesimals:
The natural sciences aim to formulate their theories in a mathematically precise way, so it seems fitting to call them the ‘exact sciences’. However, the natural sciences also allow – and often require – deviations from full mathematical rigor. Many practices that are acceptable to physicists – such as order of magnitude calculations, estimations of errors, and loose talk involving infinitesimals – are frowned upon by mathematicians. Moreover, all our empirical methods have a limited range and sensitivity, so all experiments give rise to measurement errors. Viewed as such, one may deny that any empirical science can be fully exact.
No, this is not right. Physicists take non-rigorous shortcuts, and mathematicians frown on loose talk. But mathematicians have rigorous theories for estimating errors, infinitesimals, and all other math in use. Non-rigorous work may be convenient, and full rigor may be impractical in some cases, but it is a mistake to say that science requires non-rigorous math. Mathematicians strive to make all math rigorous.
In mathematics, infinitesimals played an important role during the development of the calculus, especially in the work of Leibniz [11], but also in that of Newton (where they figure as ‘evanescent increments’) [12]. The development of the infinitesimal calculus was motivated by physics: geometric problems in the context of optics, as well as dynamical problems involving rates of change. Berkeley [13] ridiculed infinitesimals as “ghosts of departed quantities”. It has taken a long time to find a consistent definition of this concept that holds up the current standards of mathematical rigor, but meanwhile this has been achieved [14]. The contemporary definition of infinitesimals considers them in the context of an incomplete, ordered field of ‘hyperreal’ numbers, which is non-Archimedean: unlike the field of real numbers, it does contain non-zero, yet infinitely small numbers (infinitesimals). The alternative calculus based on hyperreal numbers, called ‘non-standard analysis’ (NSA), is conceptually closer to Leibniz’s original work (as compared to standard analysis).

While infinitesimals have long been banned from mathematics, they remained in fashion within the sciences, in particular in physics: not only in informal discourse, but also in didactics, explanations, and qualitative reasoning. It has been suggested that NSA can provide a post hoc justification for how infinitesimals are used in physics [15]. Indeed, NSA seems a very appealing framework for theoretical physics: it respects how physicists are already thinking of derivatives, differential equations, series expansions, and the like, and it is fully rigorous.11
I have previously argued that Berkeley was not ridiculing infinitesimals with that quote. The ghosts are the limits, not the infinitesimals.

I don't know why she says the hyperreals are incomplete. They have the same completeness properties as the real numbers. That is, Cauchy sequences converge, bounded sets have least upper bounds, and odd order polynomials have roots.

The impression given here is that differential calculus and mathematical physics were non-rigorous until hyperreals and NSA justified infinitesimals. That is not true, and most mathematicians and physicists today do not even pay any attention to hyperreals or NSA.

The mainstream treatment of infinitesimals is to treat them as a shorthand for certain arguments involving limits, using a rigorous definition of limit. The main ideas were worked out by Cauchy, Weierstrass, and others in the 19th century, and probably perfected in the XXc. There is no need for hyperreals.

Infinitesimals were never banned from mathematics. They are completely legitimate if backed up by limits or hyperreals. Maybe physicists never learn that, but mathematicians do.

I might say: "Special relativity is the infinitesimal version of general relativity." What that means is that if you take a tangent geometric structure to the curved spacetime of general relativity, you get the (flat) geometry of special relativity. The tangent may be defined using limits, derivatives, or hyperreals. It is a rigorous statement, and these sorts of statements were never banned.

You do not see statements like that in physics books. They are more likely to say that special relativity is an approximation to general relativity, as they might say that a tangent line is an approximation to a curve. Mathematicians would rather take the limit, and make an exact statement.

Consider f'(x)dx which can be integrated to get f(x). You can view dx as a hyperreal infinitesimal, and the integral as an infinite sum. But the more conventional view is that infinitesimals are not numbers, but a method for getting tangents and tensors. Then f'(x)dx is not a simple function, but something that acts on tangent vectors and can be integrated. I am skipping over subtle details, but it is a rigorous infinitesimal method and described in elementary math textbooks.

Also dy/dx is symbolically the division of infinitesimals, but rigorously defined as a limit.
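That limit definition is concrete enough to compute. For f(x) = x^2 at x = 3, the difference quotient works out to exactly 6 + h, so it converges to the derivative 6 as h shrinks (a sketch of the standard definition, not tied to any particular textbook):

```python
# dy/dx as a limit: the difference quotient approaches the derivative
# as the increment h shrinks; no infinitesimal number is ever needed.
f = lambda x: x**2
for h in (0.1, 0.001, 0.00001):
    print(h, (f(3 + h) - f(3)) / h)   # each value is 6 + h, up to rounding
```

The limit as h goes to 0, not any fixed small h, is what rigorously defines dy/dx.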

So the above paper badly misunderstands infinitesimals by treating them as only made rigorous by the hyperreals. She also mentions considering Planck's constant h, or the reciprocal of the speed of light 1/c, to be like infinitesimals.

A recent book claims that Galileo used infinitesimals and the Jesuits banned such use. I don't know about that, but that predated Newton, Leibniz, and calculus. And I am sure that some use of infinitesimals was sloppy. All pre-XXc work was sloppy by modern standards. But the usage by mathematicians can be made rigorous. By the early XXc, it was all rigorous (in the math books).

Thursday, June 11, 2015

Dog consciousness causes wave function collapse


Lubos Motl defends the Copenhagen interpretation of quantum mechanics, and now he defends the Von Neumann–Wigner interpretation. Von Neumann supposedly believed that human consciousness caused collapse of the wave function, and Wigner said he thought that a dog had sufficient consciousness to cause collapse.

It sounds ridiculous when you phrase it that way, but it is not so silly. The electron may have an independent objective existence, but our best explanation uses wave functions that cleverly encode how it was observed in the past and how it might be observed in the future.

The Moon exists whether we look at it or not, but the exact position and other physical characteristics are either directly observed or inferred from models and previous observations. The observations confirm the predictions and narrow the error bars.

And yes, a dog can look at the Moon.

Peter Woit cites an ex-string theorist ranting about the field. One notable point is that hardly anyone is really doing string theory any more. They are toying around with mathematical structures and models inspired by string theory, but they are not trying to study electrons as tiny strings or anything you might recognize from popular accounts of the field.

Scott Aaronson has joined Noam Chomsky and other MIT eggheads in denouncing investment in oil companies. Some of the comments explain how this is just feel-good leftist political posturing that will accomplish nothing worthwhile. You would think that all these smart MIT professors could recommend some constructive changes for our society.

Sometimes I think that the environmentalist movement is dominated by anti-environmentalists who invent stupid causes to distract people away from bigger threats.

Monday, June 8, 2015

A Crisis at the Edge of Physics

Physics professors Adam Frank and Marcelo Gleiser write in the NY Times:
DO physicists need empirical evidence to confirm their theories?

You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple. ...

But the standard model, despite the glory of its vindication, is also a dead end. It offers no path forward to unite its vision of nature’s tiny building blocks with the other great edifice of 20th-century physics: Einstein’s cosmic-scale description of gravity. Without a unification of these two theories — a so-called theory of quantum gravity — we have no idea why our universe is made up of just these particles, forces and properties. (We also can’t know how to truly understand the Big Bang, the cosmic event that marked the beginning of time.)
No, quantum gravity is a stupid pipe dream that would tell us nothing about the universe.

SUSY is at least testable and potentially explanatory, but it may soon be dead:
Today, the favored theory for the next step beyond the standard model is called supersymmetry (which is also the basis for string theory). Supersymmetry predicts the existence of a “partner” particle for every particle that we currently know. It doubles the number of elementary particles of matter in nature. The theory is elegant mathematically, and the particles whose existence it predicts might also explain the universe’s unaccounted-for “dark matter.” As a result, many researchers were confident that supersymmetry would be experimentally validated soon after the Large Hadron Collider became operational.

That’s not how things worked out, however. To date, no supersymmetric particles have been found. If the Large Hadron Collider cannot detect these particles, many physicists will declare supersymmetry — and, by extension, string theory — just another beautiful idea in physics that didn’t pan out.

But many won’t. Some may choose instead to simply retune their models to predict supersymmetric particles at masses beyond the reach of the Large Hadron Collider’s power of detection — and that of any foreseeable substitute.
Now physics is being overrun by non-empirical pursuits:
Consider, likewise, the cutting-edge theory in physics that suggests that our universe is just one universe in a profusion of separate universes that make up the so-called multiverse.
A current essay on Scientia Salon argues that string theory and many-worlds are legitimate science and not pseudoscience, even tho there is unlikely ever to be any empirical evidence for either.
Recall the epicycles, the imaginary circles that Ptolemy used and formalized around A.D. 150 to describe the motions of planets. Although Ptolemy had no evidence for their existence, epicycles successfully explained what the ancients could see in the night sky, so they were accepted as real. But they were eventually shown to be a fiction, more than 1,500 years later. Are superstrings and the multiverse, painstakingly theorized by hundreds of brilliant scientists, anything more than modern-day epicycles?
This explanation of epicycles makes no sense. It says they were "imaginary" and "had no evidence", but also that the ancients saw them in the night sky.

Ptolemy approximated the night sky view of planets as main circles plus epicycles. They are real in a sense similar to saying that the phases of the Moon are real. We see them in the sky. That is all Ptolemy meant.

Saying that Ptolemy had no evidence for epicycles is like saying that he had no evidence for phases of the Moon.
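Ptolemy's construction is just a sum of uniform circular motions, much like a two-term Fourier series. Here is a minimal sketch of a deferent-plus-epicycle model; the radii and angular speeds are illustrative values, not Ptolemy's actual parameters for any planet:

```python
import cmath

def epicycle_position(t, R=5.0, r=1.0, w_deferent=1.0, w_epicycle=8.0):
    """Apparent position of a planet as a point on a small circle
    (the epicycle) whose center rides on a large circle (the deferent),
    represented as complex numbers in the plane of the sky."""
    deferent = R * cmath.exp(1j * w_deferent * t)
    epicycle = r * cmath.exp(1j * w_epicycle * t)
    return deferent + epicycle

# At t = 0 the two circles line up, so the planet sits at distance R + r.
z = epicycle_position(0.0)
print(abs(z))  # 6.0
```

Varying the two speeds against each other is what reproduces the observed retrograde loops; in that sense the epicycles "fit the data" exactly the way any curve-fitting model does.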

You could say that electrons are modern-day epicycles in that we have very good models for what we see in electron experiments, but not necessarily a deep understanding of what an electron really is.

Superstrings are not like that at all. There is nothing relating theoretical superstrings to anything that is observed, nor is any such relationship likely in the future.

In that Scientia Salon essay, Massimo Pigliucci argues that the philosophers' consensus is that there is no clearcut definition of science, but he writes books attacking pseudoscience anyway. A responder gives this definition:
Sorry, but as a physicist, I’m pretty much a Popperian. I have a very narrow definition of what constitutes “science:” it is a methodology consisting of a systematic, iterative use of observation (all sense-data) and reason. Observations are used as hypotheses, reason generates models (mathematical in the advanced form) which are tested by prediction against additional observation. And Falsification is, indeed, the criterion which must be met by a model (theory) if it is to be judged part of science.

And I’m not too sure about psychology: why is it obvious that a methodology created by the human intellect should obviously be applicable to analyzing the human mind? There’s a Doug Hofstadter strange-loop problem, potentially. And then there’s “political science,” “management science,” and other comic labels.

So astrology, creation science, and natural theology are not sciences by the Popper test. Also, I would say that string theory is only a science if it generates a prediction that is at least IN PRINCIPLE falsifiable. It is near the margin, because, while there are in-principle observations that might test its predictions, these are not even close to practical. So it is certainly questionable as science. Many-worlds quantum mechanics is not science – UNLESS someone comes up with a way to test it. As I said, I have a narrow definition of the word.
That is a decent definition.

Ethan Siegel tries to answer Does Quantum Gravity Need String Theory? He starts by quoting:
“I just think too many nice things have happened in string theory for it to be all wrong. Humans do not understand it very well, but I just don’t believe there is a big cosmic conspiracy that created this incredible thing that has nothing to do with the real world.” -Edward Witten
Lubos Motl explains:
Witten wants to say that even without definite empirical proofs, the mathematical properties of string theory make us certain that it is an incredibly tight, rich, and unique mathematical structure that seems to contain the ideas compatible with physics as well as many new structures and relationships that came as surprises and taught us to think about physical and mathematical concepts in new ways.
This is just the faith of a true believer. There are lots of nice mathematical structures, from p-adic numbers to category theory, but they do not describe the structure of electrons. A lot of nice mathematics finds unexpected applications, so some of these things might be applied to physics, but it is nutty to think that electrons are based on 6-dimensional Calabi-Yau manifolds.

Siegel explains that quantum gravity is fully understood for all observable possibilities:
“So,” you reason, “we’ll simply do our quantum field theory calculations in the background of curved space!” This is known as semi-classical gravity, and it’s this type of calculation that allows us to calculate things like Hawking radiation. But even that is only at the event horizon of the black hole itself, not at the location where gravity is truly at its strongest. As Sabine Hossenfelder elegantly explained, there are multiple physical instances where we need a quantum theory of gravity, all having to do with gravitational physics on the smallest of scales: at tiny distances.

What happens, for example, at the central locations of black holes? You might think, “oh, there’s a singularity,” but a singularity isn’t quite so much a point of infinite density, but is more likely an instance where the mathematics of General Relativity returns nonsensical answers for things like potentials and forces.
Black holes are not observable inside the event horizon. We can speculate about what is inside, but according to relativity, there is no way of ever knowing. It is as impossible as going faster than the speed of light.

The quantum gravity researchers are concerned with the center point of the black hole, where there is supposedly a singularity with infinite density. Or maybe the matter does not collapse all the way, because of very high energy interactions that are not understood. Or maybe God is hiding in there. Believe whatever you want, because physically it is a meaningless question.

Thursday, June 4, 2015

Zeilinger has popular but wrong interpretation of Bell

A 2006 physics paper argued:
At a time when the forces of obfuscation in America are engaged in a campaign against the theory of evolution on behalf of Intelligent Design, it is perhaps worth asking Zeilinger how the idea that there is no difference between information and reality can be compatible with the emergence of information processing systems such as we are from a lifeless reality. And it is perhaps also worth asking the editors of Nature how, at a time when, rightly, papers on Intelligent Design are consistently rejected by peer-reviewed journals, an essay like Zeilinger’s is not.
They attack this Nature essay:
The discovery that individual events are irreducibly random is probably one of the most significant findings of the twentieth century. ...

John Bell showed that the quantum predictions for entanglement are in conflict with local realism. ...

Maybe this suggests that reality and information are two sides of the same coin, that they are in a deep sense indistinguishable.
Anton Zeilinger is widely respected, and often mentioned as a candidate for a Nobel Prize if one were ever to be awarded for quantum foundational work related to Bell's Theorem.

We don't know that anything is irreducibly random, or that there is any such meaningful concept. We do know that certain quantum phenomena seem random, and cannot be explained as just the random sampling of local hidden variables.

But random just means difficult to predict. We say that coin tosses are random because it is impractical to predict the outcome, but maybe not impossible.

A free neutron has a mean lifetime of about 15 minutes (a half-life of about 10 minutes), so its decay might seem irreducibly random. But we now know that a neutron is composed of 3 quarks and many gluons, so the decay may be reducible to the mechanics of those particles, and those may or may not be deterministic.
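The exponential decay law connects these two numbers. A minimal sketch, using the measured mean lifetime of roughly 880 seconds:

```python
import math

tau = 880.0  # measured free-neutron mean lifetime, in seconds (~14.7 min)

# For exponential decay, the half-life is the mean lifetime times ln 2.
half_life = tau * math.log(2)

def survives(t, tau=tau):
    """Probability that a neutron has not yet decayed after time t."""
    return math.exp(-t / tau)

print(half_life / 60)       # ~10.2 minutes
print(survives(half_life))  # ≈ 0.5, by definition of half-life
```

Whether each individual decay is "irreducibly" random, or only practically unpredictable, is exactly the question the exponential law leaves open.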

Meanwhile, Slashdot asks Are We Entering a "Golden Age of Quantum Computing Research"?

Monday, June 1, 2015

Science is about truth, not falsity

Leftist-atheist-evolutionist Jerry Coyne plugs his latest book:
In fact, the conflict between science and religion — at least the Abrahamic faiths dominant in the U.S. — is deep, endemic, and unlikely to be resolved. For this conflict is one between faith and fact — a battle in the long-fought war between rationality and superstition. ...

But while science and religion both claim to discern what’s true, only science has a system for weeding out what’s false. In the end, that is the irreconcilable conflict between them. Science is not just a profession or a body of facts, but, more important, a set of cognitive and practical tools designed to understand brute reality while overcoming the human desire to believe what we like or what we find emotionally satisfying. The tools are many, including observation of nature, peer review and replication of results, and above all, the hegemony of doubt and criticality. The best characterization of science I know came from physicist Richard Feynman: “The first principle is that you must not fool yourself — and you are the easiest person to fool. So you have to be very careful about that.”
I do not find this to be a very good description of science. Lots of religious believers are also concerned about fooling themselves.

I also do not agree with this view that science is all about proving things false. It is about establishing empirical truths.

Consider a basic scientific finding that can be expressed in the positive: "Energy is conserved", or the negative: "There is no perpetual motion machine."

The positive is preferable, by far. It comes with theories of how energy can be transformed from one form to another, and experiments measuring the conservation to high precision. We are extremely confident of these results because they are so often replicated, and applied in a useful way.

Saying that there is no perpetual motion machine is less certain, and less verifiable. Some say that the expansion of the universe is creating dark energy. Maybe that could be used for perpetual motion. I doubt it, but we cannot be sure.
But even if science and religion are incompatible, what’s the harm? Most of the damage comes from something inherent in many faiths: proselytizing. If you have a faith-based code of conduct attached to beliefs in absolute truths and eternal rewards and punishments, you’re tempted to impose those truths on others. ...

There is also “horizontal” proselytizing: pressing faith-based beliefs on others via politics. This has led to religion-based opposition to things like global warming, ...
Are there religious folks trying to impose some sort of biblical truth about global warming on others? News to me.

He argues that modern DNA evidence is inconsistent with the Adam and Eve story. Yes, I guess that is true, but it is not necessarily inconsistent with the religious lessons people draw from the story.

Coyne's main concern seems to be that religion is an obstacle to his leftist political agenda. He does not similarly attack people with a religious belief in environmentalism.

Update: SciAm's John Horgan does not have any problem with leftism, atheism, or evolutionism, but trashes Coyne's book as going too far.
Mr. Coyne’s critique of free will, far from being based on scientific “fact,” betrays how his hostility toward religion distorts his judgment. Evidence against free will, he says, “kicks the props out from under much theology, including the doctrine of salvation.” Mr. Coyne thinks that if religious people believe in free will, it must be an illusion.

Mr. Coyne’s loathing of creationism, similarly, leads him to exaggerate what science can tell us about our cosmic origins. Mr. Coyne asserts that “we are starting to see how the universe could arise from ‘nothing,’ and that our own universe might be only one of many universes that differ in their physical laws.” Actually, cosmologists are more baffled than ever at why there is something rather than nothing… And multiverse theories are about as testable as religious beliefs. ...

Actually, Faith vs. Fact serves as a splendid specimen of scientism. Mr. Coyne disparages not only religion but also other human ways of engaging with reality. The arts, he argues, “cannot ascertain truth or knowledge,” and the humanities do so only to the extent that they emulate the sciences. This sort of arrogance and certitude is the essence of scientism.
Sometimes I post stuff like this, and people tell me that creationists are so much worse, and that anyone fighting the creationists is doing a good thing.

This blog is about science, and I hold scientists to scientific standards. However bad someone's theology might be, that is all the more reason for scientists to be scientific, if they want to show that science is superior.

Thursday, May 28, 2015

EPR was not a sleeping beauty

Lubos Motl writes:
Nature wrote an article with the list of top 15 "sleeping beauty" papers that were initially almost ignored but many decades later, they exploded and began to attract lots of followups.

Almost all of them are about the physics of surfaces and closely related issues in solid state physics. One exception, ranking as the #14 sleeping beauty, is the Einstein-Podolsky-Rosen 1935 paper ...

It's funny because the 100% unjustified and self-evidently incorrect assertion "no reasonable definition of reality could be expected to permit this" is the central point that decides about the validity or, in this case, invalidity of the whole paper. This is the point of the paper saying "here a miracle occurs". Quantum mechanics changes our notions of reality in such a way that exactly the "forbidden" insight is true and fundamental in the whole theory: the reality always depends on the observables we can make, and realities of noncommuting observables are always mutually exclusive.
Motl is right about this.

People commonly praise this 1935 paper as if it pointed out some profound flaw in quantum mechanics, or as if it opened the way for quantum information/cryptography/computing. It did not.

All it did was to draw attention to an aspect of quantum mechanics, and declare it unreasonable.

It was the belief of Bohm, Bell, Clauser, and others that Einstein had the germ of an idea that might be turned into an experimental disproof of quantum mechanics. If they had turned out to be right, then this would have been a very important development. But it was not right, and the quantum mechanics of 1930 has held up.

Monday, May 25, 2015

John Nash dies in taxi crash

Sad news:
John Forbes Nash Jr., a mathematical genius whose struggle with schizophrenia was chronicled in the 2001 movie "A Beautiful Mind," has died along with his wife in a car crash on the New Jersey Turnpike. He was 86.

Nash and Alicia Nash, 82, of Princeton Township, were killed in a taxi crash Saturday, state police said. A colleague who had received an award with Nash in Norway earlier in the week said they had just flown home and the couple had taken a cab home from the airport.
He was best known for the movie. It got a lot of complaints about accuracy, largely because there was material in the book that was omitted from the movie. But the accuracy of the book is also questionable, as it was an unauthorized biography and included a lot of hearsay. The movie was too long, and adding anything else probably would have required other omissions.

Some of the movie's inaccuracies were for dramatic effect, such as visual hallucinations. One outrageous lie was saying that drugs cured his schizophrenia; Nash said that he only took drugs when forced, and they never did him any good.

The title was stupid. What is beautiful about a schizophrenic mind? His math was beautiful.

This was another example of how Hollywood likes to portray mathematicians as mad geniuses. Other examples include Good Will Hunting, Pi, and Proof.

I was a student at Princeton when Nash was wandering the halls writing strange political numerology on the blackboards. Nobody bothered him, and he did not bother anyone else.

The movie is about his proof of an equilibrium in game theory and economics. In mathematics, he is mainly known for proving that any Riemannian (metric) manifold can be embedded in Euclidean space. So an abstract metric, such as what defines gravity in general relativity, can be realized as the metric inherited from a higher dimensional space.

(Technically, the GR metrics are not really metrics in the Riemannian sense, because squared lengths can be positive or negative, and Nash did not address that situation, but his ideas can be adapted.)
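Concretely, an embedding is isometric when the Euclidean metric pulled back along it reproduces the abstract metric. A sketch in standard notation (the dimension bound is the one I recall from Nash's 1956 paper):

```latex
% An embedding u : (M^n, g) -> R^N is isometric when the Euclidean
% inner products of the coordinate derivatives reproduce the metric:
g_{ij}(x) \;=\; \sum_{a=1}^{N}
    \frac{\partial u^a}{\partial x^i}\,\frac{\partial u^a}{\partial x^j}
% Nash showed this underdetermined PDE system always has solutions for
% N large enough; for a compact n-manifold, N = n(3n+11)/2 suffices.
```

So the curvature of spacetime in GR can, in this sense, be pictured as the ordinary bending of a surface sitting inside a flat higher-dimensional space.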

Update: The NY Times obituary says:
Dr. Nash’s theory of noncooperative games, published in 1950 and known as Nash equilibrium, provided a conceptually simple but powerful mathematical tool for analyzing a wide range of competitive situations, from corporate rivalries to legislative decision-making. Dr. Nash’s approach is now pervasive in economics and throughout the social sciences and applied in other fields as well, including evolutionary biology.

Harold W. Kuhn, an emeritus professor of mathematics at Princeton and a longtime friend and colleague of Dr. Nash’s who died in 2014, once said, “I think honestly that there have been really not that many great ideas in the 20th century in economics, and maybe, among the top 10, his equilibrium would be among them.” A University of Chicago economist, Roger Myerson, went further, comparing the impact of the Nash equilibrium on economics “to that of the discovery of the DNA double helix in the biological sciences.”

Dr. Nash also made contributions to pure mathematics that many mathematicians view as more significant than his Nobel-winning work on game theory. In one he solved an intractable problem in differential geometry derived from the work of the 19th century mathematician G. F. B. Riemann.
This may seem obvious now, but previous work on game theory by von Neumann and others focused on cooperative and zero-sum games. For real world economic applications, you have to assume that no one is cooperating with anyone else.
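The equilibrium concept itself is easy to check by brute force: a strategy profile is a Nash equilibrium if neither player can gain by deviating unilaterally. A minimal sketch using illustrative prisoner's-dilemma payoffs (not from any particular source):

```python
# Payoffs for a two-player game: PAYOFFS[(row, col)] = (row's, column's).
# Strategy 0 = cooperate, 1 = defect; the numbers are illustrative.
PAYOFFS = {
    (0, 0): (3, 3),  # both cooperate
    (0, 1): (0, 5),  # row cooperates, column defects
    (1, 0): (5, 0),  # row defects, column cooperates
    (1, 1): (1, 1),  # both defect
}

def is_nash(row, col):
    """True if neither player can improve by a unilateral deviation."""
    row_ok = all(PAYOFFS[(r, col)][0] <= PAYOFFS[(row, col)][0] for r in (0, 1))
    col_ok = all(PAYOFFS[(row, c)][1] <= PAYOFFS[(row, col)][1] for c in (0, 1))
    return row_ok and col_ok

equilibria = [(r, c) for r in (0, 1) for c in (0, 1) if is_nash(r, c)]
print(equilibria)  # [(1, 1)] -- mutual defection is the only equilibrium
```

Nash's theorem is the hard part: every finite game has at least one equilibrium, once mixed (randomized) strategies are allowed.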

He just collected the Abel Prize in Norway, and the most dangerous part of the trip was the New Jersey taxicab ride from the airport.

Saturday, May 23, 2015

Einstein and Hitler, the hero and the villain

Phys.org reports:
What do Einstein, Mother Teresa, Gandhi, Martin Luther King Jr., Newton, Jesus, Mandela, Edison, Lincoln and the Buddha all have in common? They all make up the top 10 heroes in world history. As regards the villains, the first 10 positions are occupied by Adolf Hitler, Osama bin Laden, Saddam Hussein, George W. Bush, Stalin, Mao, Lenin, Genghis Khan, Saladin, the emperor Qin and Napoleon.

This classification into heroes and villains is the result of a study carried out jointly across the world by various universities, including the UPV/EHU-University of the Basque Country. 6,902 university students voluntarily participated in this international research; their average age was 23 and they were drawn from 37 countries, such as Argentina, Australia, Pakistan, South Korea, USA, India, Tunisia, Italy, Japan. The work was based on the evaluation that these young adults have made of 40 figures and significant events in world history.
Here is the full paper.

You may wonder why I badmouth Einstein so much, when he made some legitimate scientific contributions. This is why. He is wildly overrated, and people have learned the wrong lessons from him.

Some of those other characters are overrated also. If Saddam Hussein were really such a great villain, then G.W. Bush would be a great hero for destroying him. And I am surprised that so many people outside the USA have even heard of Martin Luther King.

Thursday, May 21, 2015

Lady Gaga of French mathematicians

The New Yorker magazine has a profile of a famous mathematician:
Villani has been called the Lady Gaga of French mathematicians. ...

Given the chance, not many of Villani’s colleagues would choose fame over mathematics. “A mathematician would usually be very reluctant to say half-lies,” Mouhot said, or to omit or overstate something. Villani has taken flak for involving himself in politics ...

Many mathematicians are glad that Villani is willing to participate in public life, Mouhot said, so that they don’t have to.
Yes, you rarely see publicity-seeking mathematicians who go around overstating things to get attention. In contrast, there are lots of physicists who do this all the time, such as Stephen Hawking, Sean M. Carroll, Brian Greene, and Lawrence Krauss. An extreme example is Michio Kaku.

Peter Woit used to be a physicist, but after switching to mathematics, he is now repulsed by overblown and unjustifiable claims.

Einstein's colleagues used to tell him that he was embarrassing himself with all the publicity seeking. So did Carl Sagan's.

The mathematicians who solved the biggest problems of the last 25 years, Fermat's Last Theorem and the Poincare Conjecture, are recluses who refuse to do any interviews.

I side with the mathematicians. A theme of this blog is that leading physicists have really embarrassed themselves by promoting wacky theories.
He was high on his soapbox now. “Languages were invented all around the world; technology was invented many times. Mathematics was developed once and collectively — your culture cannot be complete if you don’t have at least a glimpse of what is mathematical reasoning.”
This is a good point. You sometimes hear people say that math is a language, but that misses the point of what math is all about.

I occasionally see claims that the Chinese proved the Pythagorean Theorem, or other claims that math was separately invented. These are nearly all false, as far as I can see. The axiomatic method, as in Euclid's Elements, was only developed by the ancient Greeks, as far as I know.

Yes, there are examples of some independent discovery, such as Newton and Leibniz finding calculus. But even in that case, they had more access to each other's ideas than they wanted to admit.