What Frenkel and Ross did not tell us is that the “math” that led to the discovery of the Higgs boson is not their kind of (pure-and-rigorous) math, but the much more effective, and efficient, nonrigorous mathematics practiced by theoretical physicists called quantum field theory. This highly successful (and precise!) mathematical theory would not be considered mathematics by most members of the American Mathematical Society, since it is completely nonrigorous.

I would not phrase it that way. I agree that the nonrigorous math is not really math, and that this is a dividing issue between mathematicians and physicists. Most physicists are sloppy in their math and do not really appreciate mathematical rigor.
But there is a lot of legitimate math behind quantum field theory, even if most physicists ignore it.
He appears to have a legitimate gripe about how the math community devalues computational math.
Proof is the only way to determine mathematical truth, and experiment (or observation) is the only way to determine scientific truth. When you have neither, then you have abominations like string theory.
He doubles down:
Traditional “rigorous proof” is yet another religious dogma, which did some good for a long time (as did the belief in God). Of course, it is not surprising that people can get deeply offended when someone denies the existence of their “God”.

The Annals of Mathematics does not prohibit computer-assisted proofs. It published this paper of mine. It was a rigorous proof, not a heuristic or experiment.
But the God of (alleged!) rigorous proof is dead (well, not yet, but it should be!), and we should allow diversity. Rigorous proofs should still be tolerated, but they should lose their dominance, and the Annals of Mathematics should mostly accept articles with mathematics that has only semi-rigorous or non-rigorous proofs (of course, aided by our much more powerful and superior silicon brethren), because this way the horizon of mathematical knowledge (and mathematical insight!), broadly defined, would grow exponentially wider.
Yes, rigorous proof has been the standard for a long time, ever since Euclid's Elements around 300 BC.
His web site has many opinions, starting with this:
First Published (in Hebrew): ... The history of science supplied us with many examples of true "geniuses" that were kicked out of high school because of poor achievements, and one of the most prominent examples is the greatest scientist of our time, the physicist Albert Einstein. But Einstein was the greatest scientist of all times, and hence he managed somehow to find his way in life (but it wasn't easy, even for him).

No, Einstein was not kicked out of school or anything like that. He had very good grades and progressed through a fairly rigid system to get a doctoral degree in physics.
Einstein did not succeed in doing any rigorous math. I am not sure he ever proved anything. Relativity is very mathematical, and he needed the mathematicians Poincaré and Minkowski to work out the special theory, and Grossmann and Hilbert for the general theory.
For an example of math that seems sloppy but can actually be made rigorous, see
this NY Times account of how the natural numbers sum to -1/12. Or read Terry Tao's rigorous explanation or others.
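A one-line sketch of what the rigorous version looks like (following the standard zeta-regularization account, as in Tao's explanation): the divergent series is not summed in the ordinary sense; instead the value is assigned via analytic continuation of the Riemann zeta function.

```latex
\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}} \qquad (\operatorname{Re} s > 1),
```

and $\zeta$ extends analytically to $s = -1$, where $\zeta(-1) = -\tfrac{1}{12}$. So the slogan "$1 + 2 + 3 + \cdots = -\tfrac{1}{12}$" is shorthand for a statement about the continued function, not about the partial sums, which of course diverge.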
He also has some examples of rejection from egghead journals: http://www.math.rutgers.edu/~zeilberg/Opinion77.html
Doron is quite right about the subject, and also right to be an ultrafinitist. Please note that he engages in a great deal of sarcasm. Someone like Galois would be a better example than Einstein.
"Real" Analysis is a Degenerate Case of Discrete Analysis
I find the argument that real numbers are sensible in the 21st century to be complete quackery. If you get rid of them, you keep solid math in another form and throw out all of the nonsense paradoxes. The point about proof is that the world has irreducible complexity. Is uncertainty on the order of 10^-100 good enough? What happens when proofs are so long that you can pile them as high as the Empire State Building? Only small islands of mathematics are really accessible to human proof. The sooner we face that fact, the better off we will be. Zeilberger is against the continuum, the potential and actual infinite, and the reckless use of universal quantification. I am too!
Norman Wildberger makes a similar case in his foundations series on YouTube:
I have understood that mathematicians are not the best logicians and seem to overuse the law of the excluded middle. There is no clear definition of Dedekind cuts or Cauchy sequences:
Dedekind cuts and computational difficulties with real numbers
Real numbers as Cauchy sequences don't work!
On Cantor's important proofs (W. Mueckenheim)
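To make the computational complaint concrete, here is a minimal sketch of the Cauchy-sequence view of a real number: a function that, given n, returns a rational within 1/n of the target. The function name `sqrt2` and the bisection approach are my own illustration, not taken from the linked pages.

```python
from fractions import Fraction

def sqrt2(n):
    """Rational approximation of sqrt(2) to within 1/n, via bisection.

    A computable real can be modeled as a function n -> Fraction with
    |f(n) - x| <= 1/n: a Cauchy sequence with an explicit modulus.
    """
    lo, hi = Fraction(1), Fraction(2)   # invariant: lo^2 <= 2 < hi^2
    while hi - lo > Fraction(1, n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo

# The catch the critics point at: two such sequences can only be compared
# to finite precision; deciding exact equality of the reals they name is
# not computable in general.
approx = sqrt2(10**6)
assert abs(approx * approx - 2) < Fraction(3, 10**6)
```

The exact rational arithmetic never rounds, so the bisection invariant is maintained literally; the trouble is only that every query gives finitely much information about the limit.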
Chaitin on Real Numbers:
(Note: he makes a slight error in presenting the halting-problem route as the stronger result; it is actually a weaker version of Gödel's first incompleteness theorem, because it requires assuming soundness rather than mere consistency.)
Logic of Actual Infinity and G. Cantor's Diagonal Proof of the Uncountability of the Continuum
Solomon Feferman debunks the Cantor cranks:
"Cosmologists do not know if the universe is physically infinite in either space or time, or what it means if it is or isn’t."
Terry Tao does boring analysis. Another failed savant in an outdated field. How many angels are on the head of that pin again?
Although the map is not the territory, I see how the gap between math and physics grows needlessly. People who defend this gap as sensible are just making an argument from narcissism. Get some logic or go home.
Traditional rigour is, in many cases, a kind of limiting behaviour with regard to heuristics. Consider a statement of the form 'there exist no positive integers a, b, c such that a^3 + b^3 = c^3'. Now, there being no such integers with c < 10^10 will be enough for some (a very 'practical' heuristic). For others, c < 10^10^10 will be plenty. What the 'traditional rigour' proof of FLT tells us is that we can increase our bound on c to get better and better 'heuristics' for as long as we like. As such, traditional rigour is the lazy mathematician's way of doing away with the need for practical heuristics and the need to worry about their accuracy and reliability. But that 'traditional rigour' comes at a cost in some cases: a 'good enough approximation' (e.g. pi to 100 decimal places in physics) is way easier to sort out than an 'arbitrarily good approximation' (as happens when we have the computational p***ing contest to calculate pi to trillions of decimal places).
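The bounded-heuristic version of the commenter's example can be sketched in a few lines. This is a toy search of my own (the function name `cube_counterexamples` is illustrative), with a deliberately small bound so it runs instantly; the point is that a bounded search only "verifies" the statement up to the chosen bound, which is exactly the heuristic being described.

```python
def cube_counterexamples(bound):
    """Search for positive integers a <= b < c <= bound with a^3 + b^3 == c^3.

    Fermat's Last Theorem for n = 3 (proved by Euler) says this always
    returns an empty list; a bounded search is only the 'practical
    heuristic' version of that claim.
    """
    cubes = {c**3: c for c in range(1, bound + 1)}  # cube -> cube root
    hits = []
    for a in range(1, bound + 1):
        for b in range(a, bound + 1):
            if a**3 + b**3 in cubes:
                hits.append((a, b, cubes[a**3 + b**3]))
    return hits

# Empty up to the bound -- consistent with, but far weaker than, the theorem.
assert cube_counterexamples(200) == []
```

Raising the bound buys a "better" heuristic at growing computational cost; the rigorous proof is what lets you stop paying that cost altogether.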
One thing I do like about Zeilberger and others is that they at least take a serious interest in what mathematics is about, and what mathematicians do (as do quite a number of conventional mathematicians), as opposed to many who just 'play with symbols on paper for a living'.