Regarding my critique: I called it a mild critique since you may be right (even very likely), and although we don't know whether the concept of infinity makes any sense in the real world, we keep it because it is so DAMN PRACTICAL. In fact engineers, being the very embodiment of the pragmatic view, LOVE INFINITY, although most of them think of our world as a finite construct.
I remember well how, when I was learning electrodynamics, the professor suddenly started using "an infinitely stretched plane", with the argument that although the plate is finite, relative to a small electron it can be seen as infinitely big. Of course one could instead use numerics to produce a value, since a closed formula for the finite plate is out of reach, but the infinite idealization hands us a simple closed answer directly.
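To make this concrete, here is a minimal sketch comparing the exact on-axis field of a uniformly charged disk (a standard electrostatics result) with the infinite-plane formula E = σ/(2ε₀); the charge density and observation distance are made-up example values:

```python
# On-axis electric field of a uniformly charged disk of radius R, compared
# with the idealized infinite-plane result E = sigma / (2 * eps0).
EPS0 = 8.8541878128e-12  # vacuum permittivity (F/m)

def disk_field(sigma, R, z):
    """Exact on-axis field of a disk: sigma/(2*eps0) * (1 - z/sqrt(z^2 + R^2))."""
    return sigma / (2 * EPS0) * (1 - z / (z**2 + R**2) ** 0.5)

def plane_field(sigma):
    """Infinite plane: sigma/(2*eps0), independent of distance."""
    return sigma / (2 * EPS0)

sigma, z = 1e-6, 1e-3              # example values: 1 uC/m^2, point 1 mm above the plate
for R in (0.01, 0.1, 1.0, 10.0):   # plate radius in metres
    print(f"R = {R:6.2f} m: disk {disk_field(sigma, R, z):.6e}  "
          f"plane {plane_field(sigma):.6e}")
```

Already for a plate a few centimetres across, the finite answer agrees with the infinite idealization to many digits: the "infinitely stretched plane" is just the limit the electron cannot distinguish from reality.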
The reason we love infinity is that it allows us to "blur" or "smear" things. Think of JPEG or MP3: a picture or a sound is a discrete thing, but by turning it into a continuous function (an infinite thing) we can apply Fourier theory to filter out "unimportant" information and store it at small size. The use of the normal distribution in place of the binomial distribution is similar.
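Here is a minimal sketch of that last point (the de Moivre–Laplace idea): the discrete binomial PMF is "smeared" into the continuous normal density, which is far easier to work with; n and p are arbitrary example values:

```python
# Normal approximation to the binomial: a discrete distribution "blurred"
# into a continuous one.
import math

def binom_pmf(n, p, k):
    """Exact binomial probability P(X = k)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2), the continuous stand-in."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

n, p = 1000, 0.3                    # example values
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
for k in (270, 290, 300, 310, 330):
    print(f"k = {k}: exact {binom_pmf(n, p, k):.6e}  normal {normal_pdf(k, mu, sigma):.6e}")
```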
Also, replacing the derivative by a finite difference quotient will not suffice, since the problems stem from numerics: when the function values carry errors ε (due to rounding or measurement), we have in the worst case:
(f(x+h) − f(x))/h ≈ ((f(x+h) + ε) − (f(x) − ε))/h = (f(x+h) − f(x))/h + 2ε/h,
where 2ε/h gets very, very big as h shrinks. Anyone working in numerics or optimization runs into that problem from time to time. The only solution is to use the classical derivative; although it may be problematic from a philosophical point of view, it serves as a numerically stable alternative. In fact a whole branch of HPC/optimization emerged from this: automatic differentiation, where much effort is put into (automatically) rewriting algorithms to provide exact derivatives by applying the chain rule many, many times. It's not that they want to, they NEED to, because finite differences are simply too unstable and too slow to solve big real-world problems (EDF, for example, uses this to compute the optimal routing of a stream).
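A minimal sketch of both effects: the forward-difference error for sin first shrinks with h and then blows up exactly as the 2ε/h term above predicts, while a toy dual-number class (my own illustration of the chain-rule machinery behind automatic differentiation, not any particular AD library) stays exact to machine precision:

```python
import math

# 1) Finite differences: truncation error falls like h, but the roundoff
#    term 2*eps/h grows as h -> 0, so accuracy peaks and then degrades.
x = 1.0
exact = math.cos(x)  # d/dx sin(x)
for h in (1e-2, 1e-5, 1e-8, 1e-11, 1e-14):
    fd = (math.sin(x + h) - math.sin(x)) / h
    print(f"h = {h:.0e}: error = {abs(fd - exact):.3e}")

# 2) Toy forward-mode automatic differentiation via dual numbers:
#    carry (value, derivative) through every operation.
class Dual:
    def __init__(self, val, der):
        self.val, self.der = val, der
    def __mul__(self, other):          # product rule
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def dsin(d):                           # chain rule for sin
    return Dual(math.sin(d.val), d.der * math.cos(d.val))

d = Dual(x, 1.0)                       # seed derivative dx/dx = 1
g = d * dsin(d)                        # g(x) = x * sin(x)
exact_g = math.sin(x) + x * math.cos(x)
print(f"AD derivative error: {abs(g.der - exact_g):.3e}")
```

Running this shows the finite-difference error bottoming out around h ≈ 1e-8 and then growing again, while the dual-number derivative is exact up to one or two ulps.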
Optimization is actually the best example: although we could always solve discrete problems directly, we still use "blurring" and look at a continuous auxiliary problem, where we can find the optimal solution quickly, and then go back into the discrete world.
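A minimal sketch of that relax-and-round pattern on a toy 0/1 knapsack, using scipy.optimize.linprog for the continuous auxiliary problem; the data are made up, and real integer-programming solvers do far more careful rounding (branch and bound, cutting planes) on top of the relaxation:

```python
# Relax-and-round: solve a continuous auxiliary problem, then return to the
# discrete world. Toy 0/1 knapsack with made-up data.
import numpy as np
from scipy.optimize import linprog

values = np.array([10.0, 13.0, 7.0, 8.0])   # item values (example data)
weights = np.array([4.0, 6.0, 3.0, 5.0])    # item weights (example data)
capacity = 10.0

# Continuous relaxation: allow fractional x in [0, 1] instead of x in {0, 1}.
res = linprog(c=-values,                     # linprog minimizes, so negate
              A_ub=weights.reshape(1, -1), b_ub=[capacity],
              bounds=[(0, 1)] * len(values))

x_frac = res.x
x_int = np.floor(x_frac + 1e-9)              # naive rounding back to {0, 1}
print("fractional optimum:", np.round(x_frac, 3))
print("rounded solution:  ", x_int, "weight:", weights @ x_int)
```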
The last discipline worth mentioning is asymptotic analysis, the very embodiment of this ansatz: recursive relations can be expressed by formal power series, which in turn can be seen as actually convergent series, hence resulting in an analytic function. Those can then be tackled with function theory, which does not necessarily provide closed forms, but handy asymptotic approximations that are often sufficient (or even better suited in some cases). I also had a case during my master's thesis where I used asymptotic analysis for highly oscillatory cosine functions (appearing when dealing with waves). The asymptotic approximation was more stable than any attempt using power series or the like.
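The classical textbook instance of this (my example, not the thesis case above) is Stirling's asymptotic formula for n!, which arises from exactly this kind of analysis and whose relative error shrinks like 1/(12n):

```python
# Asymptotic analysis in one line: n! ~ sqrt(2*pi*n) * (n/e)^n.
import math

def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20, 50, 100):
    exact = math.factorial(n)
    print(f"n = {n:3d}: relative error = {abs(stirling(n) - exact) / exact:.3e}")
```

No closed elementary form is being replaced here; the asymptotic formula simply is the handy answer, and it gets better the larger n grows.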
So even if there is nothing infinite, we currently still need it as a mere tool to provide us with good approximations. And ironically, we don't need infinite sets to have something infinity-like: think of the Riemann sphere, where the plane of complex numbers is projected onto the 3D sphere by stereographic projection. In this setting infinity is simply the north pole of the sphere, and suddenly we are provided with at least an "infinity-like thing" for which we can use simple arithmetic and a "movement" towards it.
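A minimal sketch of that projection onto the unit sphere: as |z| grows, the image point approaches the north pole (0, 0, 1), so "moving towards infinity" becomes ordinary movement on the sphere:

```python
# Stereographic projection of the complex plane onto the unit sphere.
# z = x + iy maps to (2x, 2y, |z|^2 - 1) / (|z|^2 + 1); the north pole
# (0, 0, 1) is the image of the "point at infinity".

def to_sphere(z: complex):
    d = abs(z) ** 2 + 1
    return (2 * z.real / d, 2 * z.imag / d, (abs(z) ** 2 - 1) / d)

for z in (0j, 1 + 1j, 10 + 0j, 1e3 + 1e3j, 1e6 + 0j):
    x, y, s = to_sphere(z)
    print(f"|z| = {abs(z):10.2e} -> ({x:+.6f}, {y:+.6f}, {s:+.6f})")
```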
So even if one day computers are fast enough to deal with the problems mentioned in an instant, and even if you are right in your point of view, a lot of people will still use this non-existent concept for solving their problems, as a useful heuristic. So studying infinity as an idea can still make sense.
Back to Opinion 160 of Doron Zeilberger