Written: Nov. 25, 2015
I just finished reading the gripping and very insightful recent book, "A Brief History of Numbers," written by Leo Corry.
The blurb on the back cover says that "this book should be of interest to anyone interested in the history of mathematics". True, the story of how humankind's conceptions of numbers evolved should be of great interest to anyone who is even remotely interested in history. But, as I argue below, it should be of interest to everyone, since everybody should be interested in the future.
Regarding the past, Corry's great book would tell you, among many other things:
But don't be so smug, and patronizing, toward past mathematicians! In one hundred years, people will be just as amused at our own mental blocks, and at our "not seeing" what to future mathematicians would seem obvious. For one, we still use a semi-rhetorical style in communicating our mathematics. In 100 years, a mathematician will glance, say, at Andrew Wiles' FLT proof and smile at how wordy it is, the same way that we smile at Cardano's solution of the cubic. The 2116 version of Wiles' proof would be more like Gonthier's fully automated proof of the Appel-Haken Four Color Theorem, and probably even more automated. Also, in one hundred years, it would be realized that using the Langlands Program was a very inefficient route, and there would be a "one-line" computer proof, fully elementary (but with a rather long line, that only a computer would be able to verify).
I also realized how contingent mathematics is. I really liked the following sentence from the penultimate page (p. 293) of Corry's book:
"The increasing predominance of the use of electronic computers in mathematics is a phenomenon that is bound to have long-ranging consequences also on the foundational conceptions of what are numbers and what they should be".
In particular, we would realize that all the efforts of Cauchy, Weierstrass, Dedekind, and Cantor (to name just a few) to give a "solid" and "rigorous" foundation to calculus and "real" numbers were a big waste of time. But these guys probably enjoyed it, so we should not feel sorry for them. We do have to feel sorry, though, for our poor undergraduate (and graduate) students who have to suffer through these unnecessary "foundations". We can go back to the Pythagoreans, and refuse to accept the existence of the square root of 2, qua number (more precisely, as a philosopher would say, refuse to grant it ontological status). The only numbers that really exist are rational numbers, and instead of talking about "real" numbers, e.g. sqrt(2) or Pi, we can talk about intervals of rational numbers [a,b], of as-small-as-we-wish size, where they are supposed to lie. In fact, that's exactly how people treat "real" numbers to get rigorous proofs (e.g. Tom Hales's second-generation formal proof of the Kepler conjecture), and it is called "interval arithmetic".
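To make this concrete, here is a minimal sketch (in Python, using exact rational arithmetic) of the Pythagorean viewpoint: "sqrt(2)" never appears as a number, only as a rational interval [a,b], with a^2 < 2 < b^2, that can be made as small as we wish by bisection. The function name and tolerance parameter are my own illustration, not anything from Corry's book or from Hales's formal proof.

```python
from fractions import Fraction

def sqrt2_interval(tolerance):
    """Return rationals (lo, hi) with lo^2 < 2 < hi^2 and hi - lo < tolerance.

    Only rational numbers are ever manipulated; the "real number" sqrt(2)
    is replaced by an as-small-as-we-wish rational interval containing it.
    """
    lo, hi = Fraction(1), Fraction(2)      # 1^2 < 2 < 2^2
    while hi - lo >= tolerance:
        mid = (lo + hi) / 2                # still a rational number
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo, hi

lo, hi = sqrt2_interval(Fraction(1, 10**6))
assert lo * lo < 2 < hi * hi               # sqrt(2) is trapped inside [lo, hi]
assert hi - lo < Fraction(1, 10**6)        # ...and the trap is tiny
```

Genuine interval arithmetic, as used in formal proofs, also propagates such intervals through sums, products, and function evaluations, but the ontological point is already visible here: every quantity on the screen is a bona fide rational number.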
Also, to solve differential equations numerically on a digital computer, one solves discrete, finite difference equations, and all one needs is the far simpler "discrete analysis", where the Fundamental Theorem of Calculus is a triviality:
S(n) = a(1) + ... + a(n), iff ( S(n) - S(n-1) = a(n) and S(0) = 0 ).
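The discrete Fundamental Theorem above can be checked mechanically in a few lines. The sketch below (my own illustration, with invented names) builds the partial sums S(n) from a finite sequence a(1), ..., a(n) and verifies that differencing undoes summation, exactly as the displayed identity asserts.

```python
def partial_sums(a):
    """Given a[1..n] (a[0] is an unused placeholder, to keep 1-indexing),
    return S with S[0] = 0 and S[n] = a[1] + ... + a[n]."""
    S = [0]
    for term in a[1:]:
        S.append(S[-1] + term)
    return S

a = [None, 3, 1, 4, 1, 5]          # a(1), ..., a(5); a[0] unused
S = partial_sums(a)                # S = [0, 3, 4, 8, 9, 14]

# The discrete FTC: S(0) = 0 and S(n) - S(n-1) = a(n) for all n >= 1.
assert S[0] == 0
assert all(S[n] - S[n-1] == a[n] for n in range(1, len(a)))
```

This is the whole theorem: summation and differencing are inverse operations, with no limits, epsilons, or completeness axioms anywhere in sight.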
So the actual history of mathematics, which led to our current mathematics, is just one, and most probably not the best, of many possible mathematics, out of many virtual histories that could have arisen if humans had had different hang-ups, or if the leading mathematicians had had different personalities and predilections. All the notions of Lebesgue measure, Cantor cardinals, and the baroque theory of "existence" and "uniqueness" theorems for partial differential equations, with its hair-splitting, angels-on-a-pinhead flavor, using Sobolev spaces and what not, could have been replaced by a much more enjoyable, constructive, discrete mathematics.
To sum up, even if you don't care at all about the past, you should at least care about the future, and while Corry's book will not tell you anything specific about the future of mathematics, one thing is obvious: we will be the laughing-stocks of twenty-third-century mathematicians.