My essay did not win any prizes. I suspect that the prizes would have been a lot different if the evaluation had been blinded (ie, if author names were removed during evaluation).

Essay Abstract: Our mathematical models may appear unreasonably effective to us, but only if we forget to take into account who we are: we are the children of this Cosmos. We were born here and we know our way around the block, even if we do not always appreciate just how wonderful an achievement that is.
She has her own blog, and a grad student working on quantum teleportation and time travel.
The most substantive comments in her essay are about infinitesimals:
The natural sciences aim to formulate their theories in a mathematically precise way, so it seems fitting to call them the ‘exact sciences’. However, the natural sciences also allow – and often require – deviations from full mathematical rigor. Many practices that are acceptable to physicists – such as order of magnitude calculations, estimations of errors, and loose talk involving infinitesimals – are frowned upon by mathematicians. Moreover, all our empirical methods have a limited range and sensitivity, so all experiments give rise to measurement errors. Viewed as such, one may deny that any empirical science can be fully exact.

No, this is not right. Physicists take non-rigorous shortcuts, and mathematicians frown on loose talk. But mathematicians have rigorous theories for estimating errors, infinitesimals, and all other math in use. Non-rigorous work may be convenient, and full rigor may be impractical in some cases, but it is a mistake to say that science requires non-rigorous math. Mathematicians strive to make all math rigorous.
In mathematics, infinitesimals played an important role during the development of the calculus, especially in the work of Leibniz, but also in that of Newton (where they figure as ‘evanescent increments’). The development of the infinitesimal calculus was motivated by physics: geometric problems in the context of optics, as well as dynamical problems involving rates of change. Berkeley ridiculed infinitesimals as “ghosts of departed quantities”. It has taken a long time to find a consistent definition of this concept that holds up to the current standards of mathematical rigor, but meanwhile this has been achieved. The contemporary definition of infinitesimals considers them in the context of an incomplete, ordered field of ‘hyperreal’ numbers, which is non-Archimedean: unlike the field of real numbers, it does contain non-zero, yet infinitely small numbers (infinitesimals). The alternative calculus based on hyperreal numbers, called ‘non-standard analysis’ (NSA), is conceptually closer to Leibniz’s original work (as compared to standard analysis).

I have previously argued that Berkeley was not ridiculing infinitesimals with that quote. The ghosts are the limits, not the infinitesimals.
While infinitesimals have long been banned from mathematics, they remained in fashion within the sciences, in particular in physics: not only in informal discourse, but also in didactics, explanations, and qualitative reasoning. It has been suggested that NSA can provide a post hoc justification for how infinitesimals are used in physics. Indeed, NSA seems a very appealing framework for theoretical physics: it respects how physicists are already thinking of derivatives, differential equations, series expansions, and the like, and it is fully rigorous.
I don't know why she says the hyperreals are incomplete. By the transfer principle, they have the same completeness properties as the real numbers. That is, (internal) Cauchy sequences converge, (internal) bounded sets have least upper bounds, and odd-degree polynomials have roots.
The impression given here is that differential calculus and mathematical physics were non-rigorous until hyperreals and NSA justified infinitesimals. That is not true, and most mathematicians and physicists today do not even pay any attention to hyperreals or NSA.
The mainstream treatment of infinitesimals is to treat them as a shorthand for certain arguments involving limits, using a rigorous definition of limit. The main ideas were worked out by Cauchy, Weierstrass, and others in the 19th century, and probably perfected in the XXc. There is no need for hyperreals.
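To spell out what that shorthand stands for, here is the standard textbook limit definition (my illustration, not a quote from the essay):

```latex
% The derivative as a limit, with the epsilon-delta meaning spelled out:
% f'(x) = L means: for every \epsilon > 0 there is a \delta > 0 such that
% 0 < |h| < \delta implies |(f(x+h) - f(x))/h - L| < \epsilon.
f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```

No infinitely small numbers appear anywhere; the "infinitesimal" h is an ordinary real number that is merely taken as small as needed.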
Infinitesimals were never banned from mathematics. They are completely legitimate if backed up by limits or hyperreals. Maybe physicists never learn that, but mathematicians do.
I might say: "Special relativity is the infinitesimal version of general relativity." What that means is that if you take a tangent geometric structure to the curved spacetime of general relativity, you get the (flat) geometry of special relativity. The tangent may be defined using limits, derivatives, or hyperreals. It is a rigorous statement, and these sorts of statements were never banned.
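One way to make the tangent statement precise (my gloss, using the standard expansion in Riemann normal coordinates, not something from the essay): at any point of the curved spacetime, coordinates can be chosen so that the metric agrees with the flat Minkowski metric to first order, with curvature entering only at second order:

```latex
% Riemann normal coordinates centered at a point p: the curved metric
% g_{\mu\nu} equals the flat Minkowski metric \eta_{\mu\nu} up to terms
% quadratic in the coordinates, controlled by the curvature tensor.
g_{\mu\nu}(x) \;=\; \eta_{\mu\nu} \;-\; \tfrac{1}{3}\, R_{\mu\alpha\nu\beta}\, x^{\alpha} x^{\beta} \;+\; O(|x|^{3})
```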
You do not see statements like that in physics books. They are more likely to say that special relativity is an approximation to general relativity, as they might say that a tangent line is an approximation to a curve. Mathematicians would rather take the limit, and make an exact statement.
Consider f'(x)dx which can be integrated to get f(x). You can view dx as a hyperreal infinitesimal, and the integral as an infinite sum. But the more conventional view is that infinitesimals are not numbers, but a method for getting tangents and tensors. Then f'(x)dx is not a simple function, but something that acts on tangent vectors and can be integrated. I am skipping over subtle details, but it is a rigorous infinitesimal method and described in elementary math textbooks.
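As a numerical sketch (my illustration, not from the post): treating dx as a small but finite step, the Riemann sum of f'(x)dx recovers f(b) - f(a) in the limit, which is the rigorous content behind the infinitesimal shorthand.

```python
import math

# Approximate the integral of f'(x) dx over [0, 1] by a Riemann sum
# with a small but finite dx.  As dx shrinks, the sum approaches
# f(1) - f(0), the limit that the infinitesimal notation stands for.
# Here f(x) = sin(x), so f'(x) = cos(x).

def f(x):
    return math.sin(x)

def fprime(x):
    return math.cos(x)

def riemann_sum(a, b, n):
    dx = (b - a) / n          # finite stand-in for the "infinitesimal" dx
    return sum(fprime(a + i * dx) * dx for i in range(n))

exact = f(1.0) - f(0.0)
approx = riemann_sum(0.0, 1.0, 100000)
print(abs(approx - exact) < 1e-4)   # the sum converges to the exact value
```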
Also dy/dx is symbolically the division of infinitesimals, but rigorously defined as a limit.
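A short numerical illustration of that point (mine, not the post's): the difference quotient for y = x² at x = 3 approaches the derivative 6 as h shrinks, with no division of actual infinitesimals anywhere.

```python
# dy/dx defined as a limit of difference quotients.  For y = x**2 the
# quotient at x = 3 equals 6 + h exactly (up to rounding), so it tends
# to the derivative 6 as h -> 0.
def difference_quotient(y, x, h):
    return (y(x + h) - y(x)) / h

def y(x):
    return x * x

quotients = [difference_quotient(y, 3.0, h) for h in (1e-1, 1e-3, 1e-5)]
print(quotients)   # values approach 6 as h shrinks
```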
So the above paper badly misunderstands infinitesimals by treating them as only made rigorous by hyperreals. She also mentions considering Planck's constant h, or the reciprocal of the speed of light 1/c, to be like infinitesimals.
A recent book claims that Galileo used infinitesimals and the Jesuits banned such use. I don't know about that, but that predated Newton, Leibniz, and calculus. And I am sure that some use of infinitesimals was sloppy. All pre-XXc work was sloppy by modern standards. But the usage by mathematicians can be made rigorous. By the early XXc, it was all rigorous (in the math books).