An Ultra-Finitistic Foundation of Probability

By Doron Zeilberger

Delivered Nov. 19, 2018

Abstract: Probability theory started on the right foot with Cardano, Fermat and Pascal when it restricted itself to finite sample spaces, and was reduced to counting finite sets. Then it got ruined by attempts to come to grips with that fictional (and completely superfluous) 'notion' they called 'infinity'.

A lot of probability theory can be done while keeping everything finite, and whatever cannot be done that way is not worth doing. We live in a finite world, and any talk of 'infinite' sample spaces is not even wrong; it is utterly meaningless. The only change needed is this: when talking about an 'infinite' sequence of sample spaces, say of n coin tosses, {H,T}^n, for 'any' n (tacitly implying that you have an 'infinite' supply of such n), replace that phrase by 'symbolic n'.

This new approach is inspired by the philosophy and ideology behind symbolic computation. Symbolic computation can also redo, ab initio, without any human help, large parts of classical probability theory.
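As a minimal sketch of this 'symbolic n' idea (not Zeilberger's own code, which is done in Maple), one can keep n as a symbol and let a computer algebra system derive exact facts about n coin tosses that hold for every concrete n at once, with no appeal to an infinite sample space:

```python
# Sketch: treat the number of fair coin tosses n as a SYMBOL and derive
# the mean and variance of the number of heads for all n at once.
import sympy as sp

n, t = sp.symbols('n t', positive=True)

# Probability generating function of the number of heads in n fair tosses:
# each toss contributes (1/2 + t/2), so the pgf is ((1 + t)/2)**n.
pgf = ((1 + t) / 2) ** n

# Mean: E[X] = pgf'(1)
mean = sp.simplify(sp.diff(pgf, t).subs(t, 1))

# Variance: Var(X) = pgf''(1) + pgf'(1) - pgf'(1)**2
second = sp.diff(pgf, t, 2).subs(t, 1)
variance = sp.simplify(second + mean - mean**2)

print(mean)      # n/2
print(variance)  # n/4
```

The answers n/2 and n/4 are exact statements about the finite sample space {H,T}^n for a symbolic n; no limit, and no infinity, is ever invoked.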

[This lecture was delivered on Nov. 19, 2018 at the Rutgers University Foundation of Probability Seminar, organized by Harry Crane, Barry Loewer, and Glenn Shafer.]

After reading the abstract, Glenn Shafer kindly prepared these fascinating historical notes.

Acknowledgement: The video was filmed and uploaded to YouTube by Harry Crane.

Personal Journal of Shalosh B. Ekhad and Doron Zeilberger
