Wednesday, August 6, 2014

A political book, not a math book

I have criticized physicist Sean M. Carroll many times, as he gets outsized publicity for his silly ideas, and now his wife reviews several math books in the NY Times:
HOW NOT TO BE WRONG
The Power of Mathematical Thinking
By Jordan Ellenberg
Penguin Press, $27.95.

Every math teacher cringes at the inevitable question from students: “When am I ever going to use this?” Ellenberg, a math professor at the University of Wisconsin, admits that even though we’ll never need to compute long lists of integrals in our daily lives, we still need math. It’s “a science of not being wrong about things,” he writes, and it gives us tools to enhance our reasoning, which is prone to false assumptions and cognitive biases.
She does not mention the political bias, exposed by these Amazon reviews:
Mr Ellenberg seems like he knows his math, but he's so entrenched in liberal academia that it probably never dawned on him that others might have different views. I only got to chapter six. When he started in on yet another example of statistical abuse, this time defending President Obama from his detractors, I just couldn't stick it any more.

I don't demand that an author agree with me politically. I'm perfectly aware (as Mr Ellenberg seems not to be) that thoughtful people can disagree. There was just no reason for every second or third example chosen by Mr Ellenberg to illustrate his ideas to be political, and absolutely no reason at all for all of them to lean one way. Both parties commit egregious violations of mathematical principles every day; why go out of your way to alienate half of your audience? I'm surprised that the editor at Penguin let this pass. ...

The first indication I had that this isn't a book about how to analyze things objectively through mathematics was when I read the inside flap and it raised the question about who really won Florida in the 2000 election. I skipped ahead to that part of the book, hoping to see a sterile mathematical analysis. But there wasn't one.

There was a lot of totally subjective talk about why Scalia and the other justices ruled the way they did in Bush vs. Gore, with thinly veiled criticism of Scalia. This included the suggestion that Scalia doesn't care about finding the truth in a murder case, based on the author's superficial, sound-bite reference to a case that Scalia once ruled on.

Then he said that the overall count would have been more accurate if the court had allowed a recount in the counties that Gore asked for. This is totally false. The overall count would not have been more accurate if the recount - which could result in more votes being tallied - was only in counties deliberately chosen because they were likely to have more votes for one particular candidate. If you chose a few counties at random for a more thorough recount it would theoretically make the overall count more accurate, but not when you add the bias of choosing counties favorable to one candidate. A math expert should know this.

And, astonishingly, he disregarded the post-election recount sponsored by the New York Times and other media outlets. How can one do a mathematical analysis of the election without using this recount?

Well, he didn't do a mathematical analysis. He just concluded with conjecture about why he thinks the justices ruled the way they did - something that he has no way of knowing, and that isn't a mathematical analysis - with the implication that the result of the election would have been different had they not ruled that way. And, again, he gave NO math to support this.

From what I could see, the rest of the book wasn't any better. It's a political book, not a math book.
Ellenberg does not rebut these statements in the Amazon comments.
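The reviewer's point about selective recounts is a standard sampling-bias argument, and it is easy to check numerically. The sketch below uses made-up county totals (not the actual 2000 Florida returns) and a toy error model in which the machine count drops 5% of each candidate's ballots; a recount recovers the true tally. Recounting every county recovers the true statewide margin, but recounting only counties that lean toward one candidate shifts the margin toward that candidate, even though the true race is a tie.

```python
# Hypothetical numbers: a sketch of the selection-bias argument,
# not an analysis of any real election.

def machine_count(votes):
    """Toy error model: the machine count loses 5% of each candidate's ballots."""
    return votes * 95 // 100  # integer arithmetic, no rounding surprises

# (true_A, true_B) per county; the first two counties lean heavily toward A
counties = [(90_000, 10_000), (80_000, 20_000),
            (10_000, 90_000), (20_000, 80_000)]

def margin(selected):
    """Statewide A-minus-B margin when only the `selected` counties are recounted."""
    total_a = total_b = 0
    for i, (a, b) in enumerate(counties):
        if i in selected:
            total_a += a          # recount recovers the true tally
            total_b += b
        else:
            total_a += machine_count(a)
            total_b += machine_count(b)
    return total_a - total_b

true_margin = sum(a for a, _ in counties) - sum(b for _, b in counties)
full = margin({0, 1, 2, 3})       # recounting everywhere recovers the truth
partial = margin({0, 1})          # recounting only the A-leaning counties

print(true_margin, full, partial)
```

With these numbers the true race is a tie, the full recount reproduces the tie, and the selective recount manufactures a 7,000-vote lead for A. Only recounting a sample of counties chosen without regard to who they favor avoids this bias.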

Many math and physics professors are living in a liberal bubble where they have no practice in critical political thinking skills.

Another Jennifer Ouellette review says:
We take it as given today that a continuous straight line is made up of an infinite number of distinct tiny parts — a concept that in the 17th century became the foundation of calculus. But it wasn’t always the case. As Alexander, a U.C.L.A. historian, reminds us, there was a time when “infinitesimals” were considered downright heretical. In 1632 the Society of Jesus forbade their use, and they ignited much contentious debate within London’s Royal Society. The debate still raged in the 1730s, when the Anglican bishop George Berkeley mockingly dismissed infinitesimals as “ghosts of departed quantities.”

The argument had little to do with how we look at a simple line, and everything to do with the major cultural shifts at the time, as the rise of scientific thinking challenged longstanding precepts of faith, and aristocratic privilege was beset by a wave of liberal egalitarianism.
No, the bishop did not consider infinitesimals heretical. And the infinitesimals were not the ghosts; his "ghosts" were the derivatives, the ghostly limits of a sequence of ratios.

A line has been defined as an infinite set of points ever since Euclid, two millennia ago. The concept did not challenge medieval precepts of faith or anything like that. Berkeley's book is quoted here:
The infidel mathematician is believed to have been either Edmond Halley or Isaac Newton. He argued that although the calculus led to true results, its foundations were no more secure than those that underpin religion. He stated that the calculus involved a logical fallacy and described derivatives thus:

"And what are these fluxions? The velocities of evanescent increments? And what are these same evanescent increments? They are neither finite quantities, nor quantities infinitely small, nor yet nothing. May we not call them ghosts of departed quantities?"

In modern language, this could be read as:

"What are these 'instantaneous' rates of change? The ratios of vanishing increments? And what are these 'vanishing' increments? They are neither finite quantities nor 'infinitesimal' quantities, nor yet nothing. May we not call them the ghosts of departed quantities?"

His interesting theory as to why the calculus actually worked was that it was the result of two compensating errors.

As a consequence of the controversy surrounding Berkeley's publication, the foundations of calculus were rewritten in a much more formal and rigorous manner using limits.
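Berkeley's objection is easy to see in the standard fluxion computation for y = x². The increment o is treated as nonzero when you divide by it, and then as zero when you discard it:

```latex
\[
\frac{(x+o)^2 - x^2}{o} = \frac{2xo + o^2}{o} = 2x + o
\]
Berkeley's complaint: $o$ must be nonzero in order to divide by it, yet it is
then set to zero to conclude that the fluxion is $2x$. The later limit
formulation never sets $o = 0$ at all:
\[
\frac{d}{dx}\,x^2 \;=\; \lim_{o \to 0} \frac{(x+o)^2 - x^2}{o}
\;=\; \lim_{o \to 0} (2x + o) \;=\; 2x.
\]
```

The limit is the value that $2x + o$ approaches as $o$ shrinks, which is a well-defined quantity with no appeal to a vanished increment.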
A Scottish university site says:
Berkeley's criticisms were well founded and important in that they focused the attention of mathematicians on a logical clarification of the calculus. He developed an ingenious theory to explain the correct results obtained, claiming that it was the result of two compensating errors. Ren writes in [30]:-
By reviewing Berkeley's lifetime and the content of the "Analysts", we conclude that his critique was correct and that it impelled the improvement of the foundations of calculus objectively. It is helpful for the normal development of mathematics to accept various forms of critique positively.
Many of the other references which we give also discuss Berkeley's attack on the calculus; see [5], [11], [19], [21], [26], [30], and [33]. De Moivre, Taylor, Maclaurin, Lagrange, Jacob Bernoulli and Johann Bernoulli all made attempts to bring the rigorous arguments of the Greeks into the calculus.
Yes, achieving rigor in calculus was a good thing, and not an ignorant prejudice of medieval faith and aristocratic privilege, as Ouellette would have you believe. I don't know whether she shares her husband's atheist ideology, but she is taking cheap shots at religious scholars who did legitimate work.

It is easy to laugh at people who did not have a modern understanding centuries ago. Putting calculus on firm logical foundations took a long time and a lot of smart mathematicians.

1 comment:

  1. Calculus is not on a solid basis. The epsilon-delta definition is a simple non-sequitur.

    It's a sad state of affairs that mathematics is self-contradictory. Notice, I didn't say anything about Platonism or ontology because it's a red herring:
    http://projecteuclid.org/euclid.rml/1203431978

    All of these silly ideas lead to issues you see Solomon Feferman talk about. In his book In the Light of Logic he points out how the math of actual science gives us evidence that the mathematicians really are unjustified.
