%nothing.tex:
%%a Plain TeX file by Doron Zeilberger (5 pages)
%begin macros
\def\L{{\cal L}}
\baselineskip=14pt
\parskip=10pt
\def\halmos{\hbox{\vrule height0.15cm width0.01cm\vbox{\hrule height
0.01cm width0.2cm \vskip0.15cm \hrule height 0.01cm width0.2cm}\vrule
height0.15cm width 0.01cm}}
\font\eightrm=cmr8 \font\sixrm=cmr6
\font\eighttt=cmtt8
\magnification=\magstephalf
\def\P{{\cal P}}
\def\A{{\cal A}}
\def\Q{{\cal Q}}
\def\1{{\overline{1}}}
\def\2{{\overline{2}}}
\parindent=0pt
\overfullrule=0in
\def\Tilde{\char126\relax}
\def\frac#1#2{{#1 \over #2}}
%\headline={\rm \ifodd\pageno \RightHead \else \LeftHead \fi}
%\def\RightHead{\centerline{
%Title
%}}
%\def\LeftHead{ \centerline{ By Doron Zeilberger}}
%end macros
\bf
\centerline
{
}
\rm
\bigskip
\centerline{ {\it
By Doron ZEILBERGER}\footnote{$^1$}
{\eightrm \raggedright
Department of Mathematics, Rutgers University (New Brunswick),
Hill Center-Busch Campus, 110 Frelinghuysen Rd., Piscataway,
NJ 08854-8019, USA.
%\break
{\eighttt zeilberg at math dot rutgers dot edu} ,
\hfill \break
{\eighttt http://www.math.rutgers.edu/\~{}zeilberg} .
First version: June 15, 2007.
Transcript of a talk, entitled `` '' (without the quotes)
given at the DIMACS REU program, June 14, 2007.
Exclusively published in the Personal Journal of S.B. Ekhad
and D. Zeilberger
{\eighttt http://www.math.rutgers.edu/\Tilde zeilberg/pj.html} .
Supported in part by the NSF.
This version: Feb. 18, 2009 (thanks to Yiftach Levy).
}
}
[Silence for 30 seconds]. I always dreamed about giving a talk
about {\it nothing}. First, I thought
of just standing here, saying nothing, for the whole fifty minutes,
but with people's attention spans today (they have trouble
focusing on even {\it one} thing at a time),
I find it hard to believe that people could handle
doing {\it zero} things for fifty minutes. Also, even I would probably get
bored not saying anything for so long, so let me say {\it something}
about {\it nothing}.
Every math talk is supposed to contain at least
{\it one} proof. Myself, I hate proofs, so let me get it out of the
way. But not all proofs are {\it that} bad. Here is a rather
nice one featuring today's protagonist: {\bf Nothing} .
{\bf Theorem:} (anon.) A ham sandwich is better than good sex.
{\bf Proof:} (anon.) The following two assertions are obvious.
1. A ham sandwich is better than {\it Nothing} .
[Indeed, I am both Jewish and Vegetarian, so I would try very hard
not to eat it, but if I were stuck on a desert island, and had
nothing to eat for five days, I admit that I would eat it,
and it is better than nothing.]
2. {\it Nothing} is better than Good Sex .
The theorem follows by the {\it transitivity} of the {\it is better} relation.
Having gotten the perfunctory proof out of the way, let me
talk a little more about {\it nothing}.
The analog of Nothing in number theory is {\it zero}, which is
the {\bf most} important number of all numbers. To my surprise,
when I googled {\tt zero} a few hours ago, it only got
about $243 \cdot 10^6$ hits, while {\tt one} got
about $2.06 \cdot 10^9$ hits, roughly an order of magnitude more,
and {\tt zero} only ranked somewhere between {\tt eight} ($280 \cdot 10^6$)
and {\tt nine} ($197 \cdot 10^6$). So the moral is
that you can't tell the importance of something just by the number
of google hits, a fact that I should remember when I pride
myself that my own number of google hits is higher than
those of most Fields medalists.
And indeed the invention of {\it zero} was one of the greatest
accomplishments of human-kind, comparable to the discovery of
{\it fire}.
If {\it zero} is the most important {\it number} then
the {\it empty set}, $\emptyset$, is the {\bf most important} set.
My favorite computer algebra system, {\tt Maple$^{TM}$},
can, of course, handle {\it both} zero and $\emptyset$, the
latter being denoted by $\{\,\, \}$, both of which represent
{\it nothing} in some sense. But it has an even more nothing object.
Try to do the following in Maple:
{\tt solve($\{ 0=1\}$,$\{x\}$); } \quad ,
and see what you get. You get {\bf absolutely nothing}, not
even the empty set. If you then ask it
{\tt evalb(\%=NULL);} \quad ,
you would get {\tt true}, so Maple's name for the epitome of
{\it nothingness} is {\tt NULL}.
Speaking of {\it Nothing}, once there was a factory that invited
an industrial engineer to see whether things can be made more efficient
and streamlined, and whether it would be possible to eliminate some
jobs due to duplications. So the engineer sees a worker who seems
to be idle, and asks him: ``what are you doing all day?'',
``Nothing'', he replied. The engineer says ``thank you''
and goes on. A while later, the engineer sees another worker
who seems to be idle, and asks him the same question, and
gets the same answer: ``Nothing''. The engineer then
recommends to the manager that he fire that second worker. ``But that's
unfair,'' exclaimed the worker, ``that first guy is also doing
nothing, and you didn't fire him!'' The engineer replied:
``That's {\it exactly} my point: we don't need {\bf two} different people doing
the {\bf same} thing!''
So {\it Nothing} is really something, but only {\bf one} thing.
Let me also remind you that zero, like all of mathematics, is
fictional and an {\it idealization}. It is impossible to reach
{\it absolute zero} temperature or to get {\it perfect vacuum}.
Luckily, mathematics is a fairyland where ideal
and fictional objects are possible.
Furthermore, let me take issue with {\it Three Dog Night} who claimed that
{\it one is the loneliest number}. It is not at all lonely.
I never get lonely with my own company, but would feel depressed
without it, so {\it zero} is the loneliest number,
and {\it one} is (good!) company, and {\it two} (not {\it three}!)
is already a crowd.
{\bf The Empty Set}
In the 19th century, Leopold Kronecker famously said that
``{\it God created the integers, all the rest is man-made}''.
A bit later, about a hundred years ago, mathematicians and logicians
thought that even integers are too complex to be really fundamental,
and they tried to reduce {\it everything}, in particular integers,
to {\it sets}. So the great computer pioneer, John von Neumann,
when he was still rather young, came up with a brilliant way
to {\it define} integers in terms of {\it sets}, and all he
really needed was a starting point: {\it the empty set}.
So according to von Neumann:
$$
0:=\emptyset \quad .
$$
Now we have {\it one} object at our disposal, so let us form the
set consisting of what we have so far:
$$
1:=\{ \emptyset \} \quad .
$$
Now, on the second day, let's gather what we have so far
and make it into a set
$$
2:=\{ \emptyset, \{ \emptyset \} \} \quad,
$$
and on the third day, let's define
$$
3:=\{ \emptyset, \{ \emptyset \},
\{ \emptyset, \{ \emptyset \} \} \} \quad,
$$
and, in general
$$
n := (n-1) \cup \{ n-1 \} \quad ,
$$
ad infinitum.
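Von Neumann's recursion is easy to try on a computer. Here is a minimal Python sketch (my own illustration, not part of the talk), modeling each integer as a nested frozenset:

```python
# A minimal sketch of von Neumann's construction of the integers,
# modeling each number as a Python frozenset.

def von_neumann(n):
    """Return the von Neumann integer n: 0 = {} and n = (n-1) U {n-1}."""
    s = frozenset()           # 0 := the empty set
    for _ in range(n):
        s = s | {s}           # n := (n-1) U {n-1}
    return s

# Each integer n has exactly n elements, and m < n exactly when
# von_neumann(m) is a *member* of von_neumann(n).
print(len(von_neumann(3)))               # 3
print(von_neumann(2) in von_neumann(3))  # True
```

The choice of frozenset (rather than set) is what makes sets-of-sets possible here: only hashable, immutable sets can be members of other sets in Python.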
But von Neumann's construction can only handle {\it integers},
starting with the {\bf empty set}.
What about other kinds of numbers?
About thirty-five years ago, John Horton Conway realized
something {\it revolutionary}. All numbers are games!
Now there are many games that are not numbers, so {\it game}
is a more fundamental object than {\it number}. According
to his own account, he felt a bit guilty that during the
research slump that hit him after discovering the Conway groups,
he hardly did any mathematics, but spent most of the day playing
games at the common room of his Cambridge college.
Only later did he realize that {\it playing games} {\bf is}
research, and furthermore, more interesting and significant
than the vast majority of his colleagues' research, since it
led him to the great idea that {\bf numbers are games}.
What is a game? The kind of games that Conway handles are
{\it partizan } games, with two players {\bf Left} (L) and
{\bf Right} (R), with perfect information, perfect players,
and no chance.
A game in Conway's sense is really a {\it game position}, so Chess
is not really {\it one} game but billions of different games, each
corresponding to a (legal) position on the board. A game can be completely
characterized by Left's options, which is a certain
{\it set of games}, and by Right's options, which is another
set of games. So we have the {\bf recursive definition}:
$$
Game:=[ SomeSetOfSimplerGames, AnotherSetOfSimplerGames] \quad.
$$
Who wins and who loses such a game? The convention is that
if it is {\it your} turn to move, and
{\it there is nothing you can do} then {\it you} {\bf lost}.
The simplest conceivable game is where both Left's and Right's
sets of options happen to be the empty set. He calls it {\bf the zero game}.
$$
0:=[ \emptyset, \emptyset ] \quad.
$$
Who wins {\bf the zero game}?
If $L$ goes first, he loses, since there is nothing he can do.
Likewise, if $R$ goes first, he loses, since there is nothing he can do.
So the {\bf second player wins}, regardless of race, sex, sexual
orientation, or in our case, political views.
Conway defines a {\bf zero game} to be a game that has this
property that the {\bf second player wins}. There is only
one ``{\bf the zero game}'', but there are lots and lots of
``{\bf a zero game}'', namely all those for which the
second player wins regardless of who starts.
So much for the only ``game'' that was created at day $0$.
What games are born at day $1$?
Now we have two sets at our disposal, $\emptyset$ and $\{0\}$.
This gives rise to $2 \times 2=4$ pairs of sets, but
since $0=[\emptyset, \emptyset]$ was already created yesterday,
we only have three new ones
$$
[\{ 0 \}, \emptyset] \quad , \quad
[ \emptyset , \{ 0 \} ] \quad , \quad
{\rm and} \quad [ \{ 0 \} , \{ 0 \} ]
$$
that Conway christened $1$, $-1$ and $*$, respectively.
$*$ is the simplest game that is {\bf not} a number.
Who wins $1$? If $L$ goes first, he may (in fact must!)
go to game $0$, and it is $R$'s turn to go, and of course,
poor $R$ has nothing to do, so $R$ loses and $L$ wins.
If $R$ goes first, then poor $R$ can't do anything, and
again he loses. So the game $1$ is always won by $L$.
$1$ is the simplest number that is {\bf positive}, and
Conway defines a {\bf positive game} to be one for which
$L$ wins regardless of who starts.
Who wins $-1$? If $R$ goes first, he may (in fact must!)
go to game $0$, and it is $L$'s turn to go, and of course,
poor $L$ has nothing to do, so $L$ loses and $R$ wins.
If $L$ goes first, then poor $L$ can't do anything, and
again he loses. So the game $-1$ is always won by $R$.
$-1$ is the simplest number that is {\bf negative}, and
Conway defines a {\bf negative game} to be one for which
$R$ wins regardless of who starts.
Finally, who wins $*:=[\{ 0\}, \{ 0\}]$?
If $L$ goes first then $R$ must play the $0$ game, and loses it, of course.
Likewise,
if $R$ goes first then $L$ must play the $0$ game, and loses it, of course.
So for $*$, the {\bf first player wins} regardless of who starts it.
Such games are called {\bf fuzzy}, since they are not numbers, and
any game that has the property that the {\bf first} player
always wins is called a {\bf fuzzy game}.
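The four outcome classes can be computed mechanically from the recursive definition. Below is a small Python sketch; the tuple encoding and function names are my own hypothetical choices: a game is a pair (left options, right options), each a tuple of simpler games, and a player who cannot move loses.

```python
# A sketch of Conway's outcome classes. Encoding (mine): a game is a
# pair (left_options, right_options), each a tuple of simpler games.

ZERO    = ((), ())               # 0  := [ {}, {} ]
ONE     = ((ZERO,), ())          # 1  := [ {0}, {} ]
NEG_ONE = ((), (ZERO,))          # -1 := [ {}, {0} ]
STAR    = ((ZERO,), (ZERO,))     # *  := [ {0}, {0} ]

def wins_moving_first(game, player):
    """True if `player` ('L' or 'R'), moving first, wins `game`.
    You win iff some move leaves the opponent, now to move, lost."""
    options = game[0] if player == 'L' else game[1]
    other = 'R' if player == 'L' else 'L'
    return any(not wins_moving_first(opt, other) for opt in options)

def outcome(game):
    l_first = wins_moving_first(game, 'L')
    r_first = wins_moving_first(game, 'R')
    if l_first and r_first:
        return 'fuzzy'       # first player always wins
    if l_first:
        return 'positive'    # Left wins regardless of who starts
    if r_first:
        return 'negative'    # Right wins regardless of who starts
    return 'zero'            # second player wins

print(outcome(ZERO), outcome(ONE), outcome(NEG_ONE), outcome(STAR))
# zero positive negative fuzzy
```

The recursion terminates because every option of a game was born on an earlier day, exactly as in the text.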
Now we can go to day 2, day 3, and so on,
and define more and more complicated
games, and for each of these we can determine (recursively!)
whether they happen to be
zero, fuzzy, positive, or negative. But we haven't yet defined
{\bf number}, nor reconciled the usual meaning of ``positive''
and ``negative'' with Conway's meanings.
A {\it number} is defined very similarly to a {\it game}.
$$
Number:=[ SomeSetOfSimplerNumbers, AnotherSetOfSimplerNumbers] \quad,
$$
but with the {\it additional condition} that the members of the
right set are all ``bigger'' than all the members of the left set.
More formally, a {\it number}
is a pair of sets $[X_L,X_R]$ with the condition that
you {\bf cannot} find $x_R \in X_R$ and $x_L \in X_L$
such that $x_R \leq x_L$.
But what does it mean for a number to be $\leq$ another number?
The relation $\leq$ is also defined recursively
$$
x (=[X_L,X_R]) \leq y (=[Y_L,Y_R]) \quad ,
$$
iff
it can {\bf never} happen that $x_L \in X_L$ and $y \leq x_L$ and
it can {\bf never} happen that $y_R \in Y_R$ and $y_R \leq x$ .
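This recursive definition of $\leq$ translates directly into code. A minimal, self-contained sketch (the pair-of-tuples encoding is my own hypothetical choice, not Conway's notation):

```python
# A sketch of Conway's recursive <= on numbers. Encoding (mine): a
# number is a pair (left_options, right_options) of tuples of numbers.

ZERO    = ((), ())
ONE     = ((ZERO,), ())
NEG_ONE = ((), (ZERO,))

def leq(x, y):
    """x <= y iff no left option xL of x satisfies y <= xL,
    and no right option yR of y satisfies yR <= x."""
    x_left, _ = x
    _, y_right = y
    return (all(not leq(y, xl) for xl in x_left) and
            all(not leq(yr, x) for yr in y_right))

print(leq(NEG_ONE, ZERO), leq(ZERO, ONE), leq(ONE, ZERO))
# True True False
```

Note the mutual recursion bottoms out at the empty set of options, where both `all(...)` conditions hold vacuously.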
Finally, one can naturally define the {\bf minus} of a game (or number)
$$
-[X_L,X_R]:= [-X_R, -X_L] \quad,
$$
($-X$ is the set of $-x$ for $x \in X$),
and the {\bf sum} is defined (again recursively):
$$
H+G=[\, (H_L+G) \cup (G_L+H) \, , \, (H+G_R) \cup (H_R+G) \, ] \quad .
$$
(Given a set of games $\A$ and a game $G$, $\A+G$ is the set
of games $A+G$, for all $A \in \A$).
This makes sense, since if we have two games on the table,
$H$ and $G$ and it is Left's turn, he may decide to work on
the $H$ game, leaving $G$ intact, or on the $G$ game, leaving
$H$ intact, and analogously for Right's options.
There is a charming little book called ``Surreal Numbers'' by guru Don Knuth,
where two lovers, Alice and Bill, stuck on a tropical island,
develop the theory {\it from scratch}, using just
a few lines written on a stone they found.
Of course, Alice and Bill are fictional, but
{\it you} are at least as smart as they are. Anyway, in order
to have the fictional Alice and Bill develop the theory of
surreal numbers, their
creator, Don Knuth, had to do it first, and he did it just
from the few {\it definitions} that Conway told him over lunch.
I strongly recommend that you read Knuth's charming little
booklet, but I challenge you, before you read it,
to try and develop the theory, on your own, as much as possible,
and then compare notes with the book. Since you are not
stranded on an island,
you have one advantage over Alice and Bill: you can program everything.
So I also hope that some of you will program the definitions
and test the theorems, since the
best way to learn something new is to program it.
As you develop the theory, you will have to prove lots
of things {\it inductively}, since all the definitions
are recursive. But again and again you will encounter
situations where you have to check something about the {\bf empty set}.
So let me conclude with the {\bf take-home} message of this
talk, that is useful here and elsewhere:
{\it ANY} PROPERTY, {\bf WHATSOEVER}, IS TRUE FOR
{\it EVERY} MEMBER OF THE {\bf EMPTY SET}.
\end