Dear Professor Zeilberger,

Hello, I recently found your opinions page and have spent several hours of a rainy Sunday reading through many of your opinions (they are certainly not boring!). Often, though, you take such extreme positions that I wonder where the zeal comes from and sometimes, whether you are really serious, as when you say that mathematicians are an analogue of dentists (I'll bite: what are the other two terms of the analogy?).

When you say (or rather when Garcia says, and you applaudingly quote) "Cauchy ruined mathematics! Let's throw out all that epsilon-delta nonsense," I think you are really going too far.

(First, to prove that I am a mathematician, I will make the nitpicky criticism that epsilon-delta is due to Weierstrass, and was an improvement over the vaguer, more verbal notion of "arbitrarily small" that Cauchy developed.)

But really, how can you say that Cauchy (and/or Weierstrass) ruined mathematics? That's just ridiculous, and even disrespectful: both of these men have given incredible gifts to mathematics, even to the areas of mathematics that you seem to vastly prefer (Cauchy was one of the founders of finite group theory; Weierstrass revolutionized complex analysis by showing that the whole theory could, at least in principle, be written in the language of power series; surely this was one of the most important events leading up to the theory of formal power series.)

Epsilon-delta is simply not nonsense; it is a key tool that we justifiably teach starting at the undergraduate level (and among the gadgets that one has to learn, its ratio of usefulness to difficulty of apprehension is in fact rather high). I suppose you and Garcia use epsilon-delta as a sort of synecdoche for "analysis" or even "continuous mathematics", and in other opinions you decry these topics (and even infinite sets). I frankly don't understand where you're coming from when you say that we are slaves to analysis, nor do I think that words like "slave" -- or "rape" as I believe you use elsewhere -- make for appropriate analogies. In general I think people are starting to do way too much of this; the "Soup Nazi" episode of Seinfeld perhaps played a role in popularizing this trend. The word Nazi should be reserved for members of the Third Reich (the "Holocaust Nazis"). What's next, the "Auschwitz of parking garages"?
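As a concrete (if unnecessary) sanity check -- a Python sketch of my own, with illustrative names, not anything from your writings -- here is the textbook epsilon-delta verification that x^2 -> 9 as x -> 3, using the standard choice delta = min(1, epsilon/7):

```python
def f(x):
    return x * x

def delta_for(eps):
    # textbook choice for f(x) = x^2 at a = 3:
    # if |x - 3| < 1 then |x + 3| < 7, so |x^2 - 9| < 7 * |x - 3|
    return min(1.0, eps / 7.0)

def check(eps, samples=1000):
    a, L = 3.0, 9.0
    d = delta_for(eps)
    # test points x with 0 < |x - a| < delta, on both sides of a
    xs = [a + s * d * k / (samples + 1)
          for k in range(1, samples + 1) for s in (-1, 1)]
    return all(abs(f(x) - L) < eps for x in xs)

for eps in (1.0, 0.1, 0.001):
    assert check(eps)
print("delta = min(1, eps/7) works for every sampled x")
```

The point of the exercise is exactly the one we ask of undergraduates: the delta is produced explicitly from the epsilon, with no appeal to anything infinite.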

Getting back on topic, I think (as Gian-Carlo Rota did) that it is unwise for mathematicians in one discipline to put other disciplines down. If your point is that many problems of asymptotic or exact enumeration -- in which complex analysis is currently an important technique -- can probably (and in at least one interesting case, provably) be solved by other, more combinatorial and/or finitistic, methods, then sure, no argument there. (Are there really people who do argue with this?) Lots of theorems can be proven in different ways, and it is interesting and important to explore alternate proofs, which often develop important techniques which will be used to prove new theorems down the road. But come on, are you really saying that a would-be enumerative combinatorialist should not learn the one semester's worth of complex analysis that can be applied so fruitfully (to be sure, along with other methods)? You call that heavy artillery?!?

Why would you call analysis boring? Is it because, up until relatively recently, it was fashionable to make similar insults about finite mathematics? You're not going to convince a young mathematician to do combinatorics instead of something with a more continuous flavor just by putting the latter down. (Rather, talk positively about the interesting theorems in finite mathematics, of which there are of course many, and plenty of them proved by you.)

But the idea that analysis has nothing to offer the rest of mathematics just baffles me. I am not an analyst: I am an arithmetic geometer, and there is certainly a combinatorial flavor in this subject. (Moreover I have an amateurish interest in some aspects of discrete mathematics and have published one -- not so impressive -- paper in that area: see http://www.cs.uwaterloo.ca/journals/JIS/VOL8/Clark/clark80.pdf.)

But certainly complex analysis plays a key role in arithmetic geometry: a remarkable meta-theorem in geometry (a version of the "Lefschetz principle") is that if a theorem about varieties is true in all characteristics there should be (and usually is) a purely algebraic-geometric proof (i.e., a purely algebraic proof, dressed up in some fancy language), whereas if a theorem is true only in characteristic zero, then the most natural proof will involve complex analysis in some fundamental way. Do I believe that these proofs could in principle be recovered by some other methods (maybe even more finitistic methods)? Yes, and there are some famous examples of this, including a characteristic p proof of the Hodge Theorem by Deligne and Illusie. Is it interesting to look for these other proofs? Again yes, it's fascinating and important. Should we "hide" complex analysis and look only for proofs involving more algebraic and combinatorial methods? Of course not: why limit the tools we have at our disposal?

Just because all of mathematics "can be reduced to" a certain area or a certain method doesn't mean we should only think about things in these terms: there are many such reductions. I agree with you that all mathematical theorems must be ultimately (in some sense) finitistic, just because both the statement and the proof of a theorem must be finitely long (and, as you correctly point out, human-generated theorems and proofs must be not just finite but short enough for humans to apprehend, which is pretty short). However, I think that the infinite plays just as important a role: the hallmark of most theorems is that by doing a finite amount of work we deduce an infinite number of consequences.

I am not a fan of theorems which are (only) about large cardinals, or problems for which the answer turns out to be undecidable in our axiom system (and, like most practicing mathematicians outside of logic, set theory, or set-theoretic topology, I would be shocked if a problem I was working on turned out to be undecidable). If you had said that set theory is boring -- well, I wouldn't post that claim on my webpage, but I wouldn't argue with you either. It is a remarkable subject in that three weeks of it is indispensable in most areas of modern mathematics, what we learn in the fourth week might come up once every few years or so (e.g., the least uncountable ordinal as a counterexample in point-set topology), and more than that does not seem to be of much use, except to the set-theorists themselves.

But infinite sets are not a fiction (any more than sets with 10^10^10^10^10 elements are a fiction) nor a hell, and they form a very natural context for many arguments. Try to prove that there is no function from the unit interval to itself which has a removable discontinuity at every point. Now try to prove it without saying the word "uncountable": it's much harder (in fact, I do not know of such a proof, although I'm willing to bet one exists). Moreover, the distinction between countable and uncountable sets is the key to measure theory; the theory of measures of finite total mass "is just" probability (and conversely); and probability is super-important in finite mathematics. Don't you want to be able to say that a sequence of measures with finite support converges to a measure with uncountable support? Isn't this the most efficient way to describe many equidistribution results in finite mathematics?
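On that last point, here is a toy illustration (a Python sketch of my own devising, not anything from your writings): the uniform probability measures on the finite sets {k/n : 0 <= k < n} converge weakly to Lebesgue measure on [0,1], in the sense that discrete averages of a test function converge to its integral.

```python
def discrete_mean(f, n):
    # integral of f against the uniform probability measure
    # supported on the n points k/n, k = 0, ..., n-1
    return sum(f(k / n) for k in range(n)) / n

f = lambda x: x * x          # its integral over [0, 1] is 1/3
for n in (10, 100, 10000):
    print(n, discrete_mean(f, n))

# the finitely supported measures converge to Lebesgue measure,
# which has uncountable support
assert abs(discrete_mean(f, 10000) - 1 / 3) < 1e-3
```

The statement "these finite averages converge to 1/3 for every continuous f" is precisely the weak convergence of finitely supported measures to a measure with uncountable support -- and I know of no cleaner way to say it.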

Best regards, Pete L. Clark

Dear Doron,

Hmm, yes. Late last night I read more of your articles, and by this morning I am not surprised to hear that you are serious, since you seriously do not seem to believe in real numbers.

Summoning restraint, I will not try to talk you out of your ultrafinitist views, because surely others have tried with much the same words I would use. (I didn't quite succeed; see below!) Rather, it seems like the burden of exposition is on you: in papers such as "'Real' Analysis Is a Degenerate Case of Discrete Analysis" you sketch out a sort of manifesto, but surely you will agree that it is far from mathematics: there are too few proofs (or even arguments of any kind) of the statements you assert.

For instance, consider your claims (i) and (ii) on the bottom of page 2. Do you take these claims to be self-evident? I don't, and to be honest the only claim whose meaning I really understand is that the physical universe is finite. I find this statement to be meaningful and even somewhat plausible -- unfortunately I feel the same way about its negation, and my understanding is that the physics community is far from getting much evidence one way or the other (as far as I know, most physicists believe that we will see more or fewer subatomic particles depending upon how much money we can allocate to building bigger and bigger machines to smash particles together). It is hard to imagine a (finite!) sequence of experiments that would really decide, to my satisfaction, between the finiteness and infinitude of the physical universe. Rather, it seems that the best that one could hope for is a theory which is fundamentally based on one or the other, and which gives increasingly many new and always correct experimental predictions. We don't have such a theory as yet.

I don't know what "the mathematical universe" is, and (ii) is really puzzling to me: I guess you mean that h = 1/q is the reciprocal of some whole number and you are modelling the line by some large finite group Z/nZ, with n = pq? (But then why not choose one of the other factors of n?) Why do you think this? What evidence do you have (of any kind)? Why do you think that working with a huge, unknowable finite number is any more rigorous than allowing all natural numbers? (E.g., I would like to add one to your huge, unknowable prime, and your idea that I can't deserves further explication. If p is the cardinality of some set, then if you tell me what set it is, I'll give you back another set whose cardinality is p+1. If you're not using sets but something else, fair enough, but tell me what you are using.) How should actual mathematical practice be modified so as to pay suitable obeisance to your large, unknowable p? Do your own papers adhere to this practice? (Have you, for instance, never proved something by induction?)
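To make the "p+1" challenge concrete, here is the standard successor construction in a few lines of Python (my own sketch; frozenset stands in for a hereditarily finite set):

```python
def successor(S):
    # given a set S, return S union {S}; by the axiom of regularity S is
    # not an element of itself, so the cardinality goes up by exactly one
    S = frozenset(S)
    return S | {S}

S = frozenset({0, 1, 2})     # a set of cardinality 3
T = successor(S)
print(len(S), len(T))        # 3 4
assert len(T) == len(S) + 1
```

This is all I mean by "give me the set and I'll give you back one of cardinality p+1": the recipe is uniform and involves nothing beyond the set you hand me.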

There is an implied analogy on the top of page 3: Newtonian physics is to traditional real analysis as staying at subrelativistic speeds is to...what?? What has traditional real analysis gotten wrong? Wrong not in the sense of using methods of proof that make you uncomfortable, but in the sense of actually giving wrong answers in the sorts of concrete situations we know and love?

"It is probably possible to deconstruct the whole of traditional mathematics along finitism, but I doubt whether it is worth the effort."

This makes me think you are not really serious. If you believe you can set up a precise interface between my technical jargon and your technical jargon, then you are saying that our two jargons must both be correct. If you say that it is not worth the effort, then you are saying that you believe in advance that my jargon is correct.

Your definition of the classical derivative is almost exactly that of the B calculus student. If you mean something more here -- which I think you might, otherwise why would I be writing? -- you'll have to be more explicit. One criticism is that the definition you wrote down on the bottom of page 4 does not agree with the Maple code you gave on the top of page 5 -- you told Maple to algebraically simplify the expression before you evaluated at h = 0. Of course this is not in general possible (and we even ask our best calculus students to get beyond this). The next example you give is not the best choice: a rare case where, at least at the elementary level, defining the problem away is the best solution. But what if you chose sin x instead? Is it satisfactory to say that the derivative of sin x is (lim_{h -> 0} (sin h)/h) cos x?
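To illustrate the gap (my own sketch, in Python rather than Maple, with helper names of my choosing): substituting h = 0 into the difference quotient is a bare 0/0, while the genuine limit lim_{h -> 0} (sin h)/h = 1 only emerges by letting h shrink.

```python
import math

def diff_quotient(f, x, h):
    # the raw difference quotient (f(x + h) - f(x)) / h
    return (f(x + h) - f(x)) / h

# substituting h = 0 directly is a 0/0 division error, not a value
try:
    diff_quotient(math.sin, 0.0, 0.0)
except ZeroDivisionError:
    print("h = 0 gives 0/0, as expected")

# the genuine limit: (sin h)/h -> 1 as h -> 0
for h in (0.1, 0.001, 1e-6):
    print(h, diff_quotient(math.sin, 0.0, h))

assert abs(diff_quotient(math.sin, 0.0, 1e-6) - 1.0) < 1e-9
```

No amount of algebraic simplification removes the h from (sin(x+h) - sin x)/h; that is exactly why the limit concept is doing real work here.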

More fundamentally, I wonder what the exponential and sine functions mean in your philosophy. I myself would prefer to define the sine function by a certain (everywhere convergent!) power series and then prove that it has whatever properties you would like a sine function to have. Presumably Maple also will compute the sine function -- approximately, but to any asked-for degree of accuracy -- using this power series expansion. But isn't the data in a formal power series expansion (even with rational coefficients) exactly the same as the data for a real number? Can you believe in sin(1) without believing in real numbers? How?
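For instance (a sketch of my own, in Python rather than Maple): summing ten terms of the power series, in exact rational arithmetic, already pins down sin(1) to more digits than a machine float carries.

```python
import math
from fractions import Fraction

def sin_partial(x, n_terms):
    # partial sum of sum_{k >= 0} (-1)^k x^(2k+1) / (2k+1)!,
    # computed in exact rational arithmetic
    x = Fraction(x)
    return sum((-1) ** k * x ** (2 * k + 1) // 1 + Fraction(0)
               if False else
               (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

approx = float(sin_partial(1, 10))
print(approx, math.sin(1))
assert abs(approx - math.sin(1)) < 1e-12
```

Each partial sum is an honest rational number; my question stands as to whether the infinite list of them is anything other than the real number sin(1).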

Timothy Gowers has an article in which he takes the tack that although many of your views seem so strange as to be worth dismissing, he has thought more carefully and understood and accepted your points in at least some cases. If a Fields Medalist has struggled to make sense of some of your views, it would seem that the lessons to be drawn are (i) there is probably some aspect of your opinions which is worthy of consideration by the mainstream mathematical community -- perhaps there are even important new ideas here that we could greatly benefit from, and (ii) it is pretty hard for most of us to extract this meaning from what you're saying. So could you perhaps explain more and polemicize less?

Best regards, Pete

P.S.: I forgot to say that I _agree with you_ that discrete analysis exists, and is more general and more difficult than ordinary continuous analysis: of course the function D(f,h) = (f(x+h)-f(x))/h -- or the sequence of functions D(f,1/n) -- is more complicated than f'(x) (or the upper and lower derivatives). This is not a completely new idea -- and indeed, Rota in his Indiscrete Thoughts points out that we seem to pretend that continuous probability is harder than discrete probability when just the opposite is true. And there is fruitful mathematics here, as e.g. Beck and his collaborators have been pushing the view that the volume of a polytope is just the leading term of the Ehrhart polynomial. (But the increased difficulty of the discrete case is one of the merits of the continuous case -- in a known discrete real-world situation, if you can get away with approximating it by a continuous one, that's often a very good way to go.)
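A tiny illustration of the extra data in the discrete derivative (my own Python sketch, with names of my choosing): for f(x) = x^2, the operator D produces a whole one-parameter family of functions 2x + h, which the classical derivative collapses to the single function 2x.

```python
def D(f, h):
    # the discrete derivative operator: D(f, h)(x) = (f(x + h) - f(x)) / h
    return lambda x: (f(x + h) - f(x)) / h

f = lambda x: x * x           # classical derivative: f'(x) = 2x
for h in (1.0, 0.5, 0.01):
    # D(f, h)(x) = 2x + h exactly: a different function for every h,
    # all collapsing to the single function 2x as h -> 0
    print(h, D(f, h)(3.0))

assert D(f, 0.5)(3.0) == 6.5
```

The discrete object carries strictly more information, which is exactly why it is the harder one to study.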

Back to Opinion 74 of Doron Zeilberger