\input epsf
\nopagenumbers
\magnification=\magstep1

\noindent {\bf Problem statement} 
Questions about asymptotic growth ``near'' $\infty$ occur naturally
when computer scientists analyze algorithms. One seemingly simple
problem is sorting. How many comparisons are required to sort a list
of $n$ numbers? {\sl Sorting and
Searching}, volume 3 of {\sl The Art of Computer Programming}, by
D.~Knuth, gives the following average running times for several
sorting algorithms as a function of $n$:

\medskip

\halign to 4in {\qquad\qquad # \hfil &\qquad # \hfil \cr
{\hfil \bf Name} & {\hfil \bf Average running time} \cr
\noalign{\smallskip}
Comparison & $4n^2 + 10n$ \cr
Merge exchange & $3.7 n (\ln n)^2$ \cr
Heapsort & $23.08 n \ln n + 0.2 n$\cr}
\medskip

\noindent Which sorting method would you rather use if, in your
application, $10 \le n \le 20$ (e.g., sorting a bridge hand)? Which
would you rather use if $100 \le n \le 150$
(e.g., sorting grades in a lecture course)? Which would you rather use
if $n \approx 10^6$ (e.g., sorting license plate numbers in New
Jersey)? What happens to these functions as $n \to \infty$?
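\medskip

\noindent A quick numerical check can guide the answers: evaluate each formula at a representative $n$ from each range and compare. The sketch below does this in Python; the function names are ours, the constants come straight from the table above, $\ln$ is the natural logarithm, and the units are abstract time steps, so only relative comparisons are meaningful.

```python
# Evaluate the three average running times from the table at
# representative list sizes.  Formulas are copied verbatim from the
# problem statement; only their relative sizes matter.
import math

def comparison(n):
    # Comparison sort: 4n^2 + 10n
    return 4 * n**2 + 10 * n

def merge_exchange(n):
    # Merge exchange: 3.7 n (ln n)^2
    return 3.7 * n * math.log(n) ** 2

def heapsort(n):
    # Heapsort: 23.08 n ln n + 0.2 n
    return 23.08 * n * math.log(n) + 0.2 * n

# One representative n per range in the question: a bridge hand,
# a lecture course, and New Jersey license plates.
for n in (15, 125, 10**6):
    print(f"n = {n:>8}: comparison = {comparison(n):.3g}, "
          f"merge exchange = {merge_exchange(n):.3g}, "
          f"heapsort = {heapsort(n):.3g}")
```

Printing the three values side by side for each $n$ makes the crossover behavior visible: the quadratic term of the comparison sort eventually dominates, while the $n\ln n$ and $n(\ln n)^2$ methods trade places depending on the constants.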


\vfil\eject\end

