Opinion 64: Make New (Kinds of) Science But Keep the Old, One is Silver, the Other Gold [and their Fusion is Platinum]

By Doron Zeilberger

Written: April 25, 2005.

Stephen Wolfram's wonderful "A New Kind of Science" states that equation-centric "traditional science" is dead, long live computer-simulation-and-experimentation-centric NKS (New Kind of Science). Even though, with Feynman-style childish exuberance and the nonchalance of a true genius, Wolfram "slightly" overstates his case, there is lots of merit in his manifesto. It is true that similar ideas have been pronounced before, e.g. by the great visionary Gregory Chaitin, and Wolfram did not invent computer simulation, but the way he packaged it, using Cellular Automata as a very apt case study, makes a very convincing case for computer-experiment-centric science and math, since, as Chaitin had already pointed out, the Principle of Computational Irreducibility presents a red-brick wall between the humanly reachable and the really hard stuff.

Ironically, Wolfram's NKS itself is a beautiful example of Old Science. No computer, at present, could have come up with the wonderful insights of Stephen Wolfram, who reasoned on the meta-level. What humans should do is leave the object-level to computers and graduate to the meta-level: think about science and math, and discover new paradigms.

Also, don't get too excited about numerical simulations. What humans, from Archimedes to Wiles, were mostly doing was symbol-crunching, albeit in a very limited way. If we teach the computer how to symbol-crunch, we can use the Wolfram methodology to discover, empirically, new "traditional-style" equations, formulas and laws, but of much greater complexity, that will carry us much further than number-crunching. For example, if you are looking for nice numerical facts, it would take a long time to discover the beautiful identity (100056+1)*(100056-1)=100056^2-1, but a high-school-algebra-discovery symbol-manipulation program will very soon discover (a+1)*(a-1)=a^2-1, which immediately implies the former.
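
To make this concrete, here is a minimal sketch (in Python, purely for illustration; the function name fit_polynomial_ansatz is made up here, and no particular program is implied above) of how such an identity could be "discovered" by fitting a polynomial ansatz to a handful of exact numerical evaluations and then double-checking the guess on fresh data points:

  from fractions import Fraction

  def fit_polynomial_ansatz(f, degree):
      # Guess coefficients c[0..degree] with f(a) = c[0] + c[1]*a + ... + c[degree]*a^degree,
      # by solving an exact linear system built from degree+1 sample points.
      pts = list(range(1, degree + 2))
      rows = [[Fraction(a) ** k for k in range(degree + 1)] + [Fraction(f(a))]
              for a in pts]
      n = degree + 1
      # Gauss-Jordan elimination over the rationals (exact, no rounding).
      for col in range(n):
          piv = next(r for r in range(col, n) if rows[r][col] != 0)
          rows[col], rows[piv] = rows[piv], rows[col]
          rows[col] = [x / rows[col][col] for x in rows[col]]
          for r in range(n):
              if r != col and rows[r][col] != 0:
                  factor = rows[r][col]
                  rows[r] = [x - factor * y for x, y in zip(rows[r], rows[col])]
      coeffs = [rows[k][-1] for k in range(n)]
      # The guess only counts if it also predicts data it was NOT fitted on.
      assert all(sum(c * a ** k for k, c in enumerate(coeffs)) == f(a)
                 for a in range(degree + 2, degree + 12))
      return coeffs

  # "Discover" (a+1)*(a-1) = a^2 - 1 from numerical evaluations alone.
  print(fit_polynomial_ansatz(lambda a: (a + 1) * (a - 1), 2))
  # prints [Fraction(-1, 1), Fraction(0, 1), Fraction(1, 1)], i.e. -1 + 0*a + 1*a^2

The three evaluations used for the fit are isolated numerical facts; the fitted quadratic ansatz, once re-checked on further values, is (empirically) a statement about every a.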

Of course, the very best humans also engaged in idea-crunching, and it will be a while before computers can do that. But, in a sense, idea-crunching is but an elevated form of symbol-crunching.

Also, the dichotomy between induction and deduction is not as sharp as it seems. When we discover a theorem or a proof, it is but a factoid that, in principle, and in many cases already in practice, could have been found by a computer using inductive methods. I talked elsewhere about semi-rigorous math, where statements have rigor-measure x, with x between 0 and 1, where x=0 is (physical) induction and x=1 is deductive old-time certainty.

But even Wolfram, when he talks about mathematics, adheres to "traditional", what Feynman calls Greek-style (deductive), math, as opposed to Feynman's favorite, Babylonian-style (algorithmic) math. The future of math lies not in its traditional logic-centric approach, but rather in what I call the "Ansatz Ansatz", which consists in finding targeted and focused sub-areas of math whose expected answers are known or conjectured to have a certain form ("ansatz"), and then using computers to do "data-fitting" and find, empirically, the right formula, which can often be proved trivially, a posteriori, sometimes rigorously, failing that, semi-rigorously, and at the very least, non-rigorously.
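
To illustrate the "data-fitting" step with another small sketch (again mine, and again only illustrative), here is the simplest kind of ansatz-fitting: guessing a linear recurrence of order two from a sequence, with the Fibonacci numbers as stand-in data. The coefficients are fitted from the first four terms by an exact 2-by-2 solve, and the guess only counts if it also predicts all the later terms it was not fitted on:

  from fractions import Fraction

  def guess_order2_recurrence(seq):
      # Fit the ansatz  seq[n] = p*seq[n-1] + q*seq[n-2]  from the first four terms
      # (an exact 2-by-2 solve by Cramer's rule); a richer ansatz would need a
      # general exact linear solver, but the methodology is the same.
      a0, a1, a2, a3 = (Fraction(x) for x in seq[:4])
      det = a1 * a1 - a0 * a2
      if det == 0:
          return None                      # this particular ansatz does not apply
      p = (a2 * a1 - a0 * a3) / det
      q = (a1 * a3 - a2 * a2) / det
      # Keep the guess only if it predicts every later term we have.
      if all(seq[n] == p * seq[n - 1] + q * seq[n - 2] for n in range(4, len(seq))):
          return p, q                      # empirically, "the right formula"
      return None

  # Stand-in data: Fibonacci numbers; the fitted formula is F(n) = F(n-1) + F(n-2).
  fib = [1, 1]
  for _ in range(30):
      fib.append(fib[-1] + fib[-2])
  print(guess_order2_recurrence(fib))      # prints (Fraction(1, 1), Fraction(1, 1))

The exact rational arithmetic is deliberate: an ansatz fitted and verified over the rationals is a candidate formula, not a numerical artifact of rounding.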

To take my favorite example, I am sure that there exists a "reduction formula" that expresses F(n,a,b,c):=(a^n+b^n-c^n)^2 in terms of F evaluated at smaller values of (n,a,b,c), and that immediately implies that F is strictly positive for n larger than 2 and a,b,c positive integers. One only needs to find the right ansatz, and a sufficiently large computer to find the right parameters.
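
I do not claim to know the right ansatz; the toy two-term ansatz in the sketch below is purely my stand-in, and (as the run shows) it fails, but it illustrates the methodology of sampling data points and sweeping over candidate parameters:

  from itertools import product

  def F(n, a, b, c):
      return (a**n + b**n - c**n) ** 2

  # A purely hypothetical two-term ansatz, just to show the shape of the search:
  #     F(n,a,b,c) =? x1*F(n-1,a,b,c) + x2*F(n-2,a,b,c)
  # Sample data points with n larger than 2 and small positive a, b, c.
  samples = [(n, a, b, c) for n in range(3, 6)
             for a in range(1, 4) for b in range(1, 4) for c in range(1, 4)]

  hits = [(x1, x2)
          for x1, x2 in product(range(-5, 6), repeat=2)   # small integer coefficients
          if all(F(n, a, b, c) == x1 * F(n - 1, a, b, c) + x2 * F(n - 2, a, b, c)
                 for (n, a, b, c) in samples)]

  # This naive ansatz finds nothing (hits is empty); the real work is in choosing
  # a rich enough, positivity-revealing family of terms to combine.
  print(hits)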

So, as Hegel said, with all due respect to the thesis and the antithesis, the SYNTHESIS is what is going to be the Post-New-Kind-of-Science.

