------------Email correspondence between Doron Zeilberger and Svante Janson, June 1995---------

>From svante@tdb.uu.se Wed Jun 7 06:38:55 1995
Date: Wed, 7 Jun 1995 12:39:08 +0200 (MET DST)
From: Svante Janson
To: zeilberg@euclid.math.temple.edu
Subject: Re: your mail

Dear Doron,

Thanks for your paper. This is what I asked about (although more exactly).
A very nice proof!

Svante

Dear Svante,

Don Knuth wrote me a letter and he mentioned that you have asked him about
`the asymptotic covariance' of inv and maj. Is the answer in the paper below
the answer to your question, or does the phrase `asymptotic covariance' mean
something other than `the asymptotics of a_n:=E[(maj(\pi)-E(maj))(inv(\pi)-E(inv))],
as n->\infty'? If I understood correctly, then a_n has an exact answer itself,
and one does not need to do asymptotics.

Best wishes
Doron

%----------begin paper
%brute.tex: A 1-page plain TeX file by D. Zeilberger
%begin macros
\def\inv{\mathop{\rm inv} \nolimits}
\def\maj{\mathop{\rm maj} \nolimits}
\def\des{\mathop{\rm des} \nolimits}
\baselineskip=14pt
\parskip=10pt
\def\halmos{\hbox{\vrule height0.15cm width0.01cm\vbox{\hrule height 0.01cm
width0.2cm \vskip0.15cm \hrule height 0.01cm width0.2cm}\vrule height0.15cm
width 0.01cm}}
\font\eightrm=cmr8
\font\sixrm=cmr6
\font\eighttt=cmtt8
\magnification=\magstephalf
\def\lheading#1{\bigbreak \noindent {\bf#1} }
\parindent=0pt
\overfullrule=0in
%end macros
\bf
\centerline {The Joy of Brute Force:}
\centerline {The Covariance of the Number of Inversions and the Major Index}
\rm
\bigskip
\centerline{ {\it Doron ZEILBERGER}\footnote{$^1$}
{\eightrm \raggedright Department of Mathematics, Temple University,
Philadelphia, PA 19122, USA.
%\break
{\eighttt zeilberg@math.temple.edu .}
Supported in part by the NSF. This note is exclusively published in
{\eighttt http://www.math.temple.edu/$\tilde{\,\,}$zeilberg}. } }

As usual, for a permutation $\pi$ of length $n$, let $\inv \pi$ be the number of
pairs $(i,j)$ such that $1 \leq i < j \leq n$ and $\pi[i] > \pi[j]$, and let
$\maj \pi$ be the sum of the $i$ such that $1 \leq i < n$ and $\pi[i] > \pi[i+1]$.

Svante Janson asked Don Knuth, who asked me, about the covariance of $\inv$ and
$\maj$. The answer is ${{n} \choose {2}}/4$.

To prove it, I asked Shalosh to compute the average of the quantity
$(\inv \pi - E(\inv))(\maj \pi - E(\maj))$ over all permutations of a given
length $n$, and it gave me, for $n=1,2,3,4,5$, the values $0,1/4,3/4,3/2,5/2$,
respectively. Since we know a priori\footnote{$^2$}
{\eightrm This is the old trick for computing moments of combinatorial
`statistics', described nicely in Graham, Knuth, and Patashnik's `Concrete
Math', section 8.2: change the order of summation. It applies equally well to
the covariance. Rather than actually carrying out the gory details, we observe
that the result is always a polynomial in $n$ whose degree is trivial to bound.}
that this is a polynomial of degree $\leq 4$ in $n$, and five values determine
such a polynomial, this must be it! \halmos

{\bf Exercises:}

{\bf 1.} Find explicit expressions for the average of
$(\inv \pi - E(\inv))^r (\maj \pi - E(\maj))^s$, for $r,s \in \{1,2,3\}$,
by complete brute force.
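% For readers who want to redo the brute-force enumeration above (or Exercise 1)
% themselves, here is a minimal sketch, written in Python purely for illustration
% (the note's computation was done by Shalosh; the function names below are ad hoc,
% not from any package): list all n! permutations, compute inv and maj as defined
% in the text, and average the product of the deviations from the means, in exact
% rational arithmetic so that the values 0, 1/4, 3/4, 3/2, 5/2 come out exactly.
%
%   from itertools import permutations
%   from fractions import Fraction
%
%   def inv(p):
%       # number of pairs (i, j) with i < j and p[i] > p[j]
%       return sum(1 for i in range(len(p))
%                    for j in range(i + 1, len(p)) if p[i] > p[j])
%
%   def maj(p):
%       # sum of the (1-based) positions i with p[i] > p[i+1]
%       return sum(i + 1 for i in range(len(p) - 1) if p[i] > p[i + 1])
%
%   def covariance(n):
%       perms = list(permutations(range(1, n + 1)))
%       N = Fraction(len(perms))
%       e_inv = sum(Fraction(inv(p)) for p in perms) / N
%       e_maj = sum(Fraction(maj(p)) for p in perms) / N
%       return sum((inv(p) - e_inv) * (maj(p) - e_maj) for p in perms) / N
%
%   for n in range(1, 6):
%       print(n, covariance(n))   # 0, 1/4, 3/4, 3/2, 5/2, i.e. C(n,2)/4
%
% The same loop, with exponents r and s inserted, gives the mixed central
% moments asked for in the exercises.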
{\bf 2.} Find an explicit expression for the sequence $a_n:=$ the average of
$(\inv \pi - E(\inv))(\maj \pi - E(\maj))(\des \pi - E(\des))$, over all
permutations of length $n$, where $\des \pi$ is the number of $i$ such that
$1 \leq i < n$ and $\pi[i] > \pi[i+1]$.

{\bf Moral:} We mathematicians were brainwashed to be clever, and to
{\it think before computing}. The conventional wisdom is that clever thinking
can eliminate the need for computation. This was true in the old days, since
thinking is more pleasant than rote computing. But thinking is still painful,
and now that computers can do the job, we should train ourselves, and our
students, to go the other way: to replace thinking by computations that we can
delegate to our computers. Of course, in order to justify this, we need some
{\it meta-thinking}, as was done in footnote 2. The next step would be to train
our computers to do the meta-thinking, and free us to do the meta-meta-thinking,
until, in the limit, computers and humans would be indistinguishable (except
that humans would err at a much higher rate, and would be much slower).
\bye

>From zeilberg Wed Jun 7 08:46:46 1995
Date: Wed, 7 Jun 1995 08:46:14 -0400
From: zeilberg (Doron Zeilberger)
To: svante@tdb.uu.se
Subject: Re: your mail

Dear Svante,

Thanks for the kind words. I believe that the trick of reversing the order of
summation to compute moments is not as well known as it should be.

By the way, are you the same S. Janson as in Alm and Janson? I liked your paper
very much. Did you see my short note on SAWs (to be published in J. Stat. Anal.
and Planning)? In case you didn't, I am enclosing it under separate cover.

Best wishes
Doron
---------------------------