
From Newton to Cellular Automata

2009

Abstract

I outline a possible logical path from the formulation of classical mechanics to "abstract" systems like cellular automata. The goal of this article is to illustrate why physicists often study extremely simplified models instead of just numerically integrating the fundamental equations of physics. This exposition is necessarily partial and based on my expertise and my interests. A similar version of this text appeared under the title Interaction Based Computing in Physics in the Encyclopedia of Complexity and System Science, Springer, New York, 2009, p. 4902.

2 Physics and computers

Some sixty years ago, shortly after the end of the Second World War, computers became available to scientists. Computers had been used during the last years of the war for performing computations about the atomic bomb [1, 2]. Up to then, the only available tool, apart from experiments, was paper and pencil. Starting with Newton and Leibniz, humans discovered that continuous mathematics (i.e., differential and integral calculus) allowed many consequences of a given hypothesis to be derived just by the manipulation of symbols. It seemed natural to express all quantities (e.g., time, space, mass) as continuous variables. Notice, however, that the idea of a continuous number is not at all "natural": one has to learn how to deal with it, while (small) integer numbers can be used and manipulated (added, subtracted) by illiterate humans and also by many animals. A point worth stressing is that any computation refers to a model of certain aspects of reality, considered most important, while others are assumed to be unimportant.

Unfortunately, most human-accessible explorations in physics are limited to almost-linear systems, or to systems whose effective number of variables is quite small. On the other hand, most naturally occurring phenomena can be "successfully" modeled only using nonlinear elements. Therefore, most of pre-computer physics is essentially linear physics, although astronomers (like other scientists) used to integrate numerically, by hand, the nonlinear equations of gravitation in order to compute the trajectories of planets. This computation, however, was so cumbersome that no "playing" with trajectories was possible.

While analog computers have been used for integrating differential equations, the much more flexible digital computers are deterministic discrete systems. A (serial) computer works as a very fast automaton that manipulates data following a program. In order to use computers as fast calculators, scientists ported and adapted existing numerical algorithms, and developed new ones. This implied the development of techniques able to approximate the computations of continuous mathematics with the finite arithmetic of computers. However, numbers in computers are not exactly the same as human numbers; in particular, they have finite (and varying) precision. This intrinsic precision limit has deep consequences for the simulation of nonlinear systems, in particular of chaotic ones. Indeed, chaos was "numerically discovered" by Lorenz [3] after the observation that a simple approximation, a number that was retyped with fewer decimals, caused a macroscopic change in the trajectory under study.
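To make the effect of finite precision concrete, here is a minimal sketch, not taken from the original article, of the kind of numerical experiment Lorenz stumbled upon: the Lorenz equations are integrated twice, once from an initial condition kept in full precision and once from the same state with one coordinate rounded to fewer decimals. The parameter values, the crude Euler integrator and the rounding step are illustrative assumptions; any reasonable choice shows the same divergence of trajectories.

```python
# Sketch: sensitivity of the Lorenz system to a truncated initial condition.
# Parameters (sigma, rho, beta), step size and rounding are illustrative choices.

def lorenz(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations."""
    return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

def integrate(x, y, z, dt=0.01, steps=5000):
    """Plain Euler integration; crude, but enough to show the divergence."""
    traj = []
    for _ in range(steps):
        dx, dy, dz = lorenz(x, y, z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        traj.append((x, y, z))
    return traj

x0, y0, z0 = 1.0, 1.0, 1.000001           # "full precision" initial condition
a = integrate(x0, y0, z0)
b = integrate(x0, y0, round(z0, 3))        # same state retyped with fewer decimals

for step in (0, 1000, 2000, 3000, 4000):
    dist = sum((p - q) ** 2 for p, q in zip(a[step], b[step])) ** 0.5
    print(f"step {step:5d}: distance between trajectories = {dist:.6f}")
```

After a few thousand steps the two trajectories are macroscopically different, which is essentially what Lorenz observed when he retyped his numbers with fewer decimals.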
With all their limits, computers can be fruitfully used just to speed up computations that could eventually be performed by humans. However, since the increase in speed is of several orders of magnitude, it becomes possible to include more and more details in the basic model of the phenomenon under investigation, well beyond what would be possible with an army of "human computers". The idea of exploiting the brute power of fast computers has originated a fruitful line of investigation in numerical physics, especially in the fields of chemistry, biological molecules and the structure of matter. The power of computers has allowed, for instance, the inclusion of quantum mechanical effects in the computation of the structure of biomolecules [4], and although these techniques may be labeled as "brute force", the algorithms developed are actually quite sophisticated.

However, a completely different usage of computers is possible: instead of exploiting them to perform computations on models that have already proved to approximate reality, one can use computers as an "experimental" apparatus to investigate the patterns of theoretical models, generally nonlinear. This is what Lorenz did after having found the first example of chaos in computational physics: he started simplifying his equations in order to single out the minimal ingredients of what would later be called the butterfly effect. Much earlier than Lorenz, Fermi, Pasta and Ulam (and the programmer Tsingou [5]) used one of the very first available computers to investigate the basis of statistical mechanics: how energy distributes among the oscillation modes of a chain of nonlinear oscillators [6] (a minimal sketch of such a chain is given at the end of this section). Also in this case the model is simplified as much as possible, in order to put into evidence the fundamental ingredients of the observed pattern, and also to use all the available power of computers to increase the precision, the duration and the size of the simulation.

This simplification is even more fruitful in the study of systems with many degrees of freedom, which we may denote generically as extended systems. We humans are not prepared to manipulate more than a few symbols at once. So, unless there is a way of grouping together many parts (using averages, as for instance when considering the pressure of a gas as an average over extremely many particle collisions), we have difficulty understanding such systems. They may nevertheless be studied by performing "experiments" on computers. Again, the idea is that of simplifying the original model as much as possible, in order to isolate the fundamental ingredients of the observed behavior. It is therefore natural to explore systems whose physics is different from the usual one. These artificial worlds are preferably formulated in discrete terms, more suitable to be implemented in computers (see Sect. 4).
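As an illustration of such a deliberately simplified extended system, the following sketch, not part of the original text, implements a Fermi-Pasta-Ulam-Tsingou-like chain: unit masses coupled by springs with a weak quadratic nonlinearity, started with all the energy in the lowest oscillation mode. The chain size, the nonlinearity alpha, the time step and the velocity Verlet integrator are illustrative assumptions.

```python
import math

# Sketch of a Fermi-Pasta-Ulam-Tsingou-like chain: N unit masses coupled by
# springs with a weak quadratic nonlinearity ("alpha model").  The values of
# N, alpha, dt and the number of steps are illustrative assumptions.
N, alpha, dt, steps = 32, 0.25, 0.05, 20000

def forces(u):
    """Force on each interior mass; u[0] and u[N+1] are fixed walls."""
    f = [0.0] * (N + 2)
    for i in range(1, N + 1):
        dl, dr = u[i] - u[i - 1], u[i + 1] - u[i]
        f[i] = (dr - dl) + alpha * (dr * dr - dl * dl)
    return f

def mode_energy(u, v, k):
    """Harmonic energy in normal mode k of the linearized chain."""
    norm = math.sqrt(2.0 / (N + 1))
    a  = norm * sum(u[i] * math.sin(math.pi * k * i / (N + 1)) for i in range(1, N + 1))
    ad = norm * sum(v[i] * math.sin(math.pi * k * i / (N + 1)) for i in range(1, N + 1))
    omega = 2.0 * math.sin(math.pi * k / (2 * (N + 1)))
    return 0.5 * (ad * ad + omega * omega * a * a)

# start with all the energy in the lowest mode, zero velocities
u = [math.sin(math.pi * i / (N + 1)) for i in range(N + 2)]
u[0] = u[N + 1] = 0.0
v = [0.0] * (N + 2)

f = forces(u)
for step in range(steps + 1):
    if step % 5000 == 0:
        print(step, [round(mode_energy(u, v, k), 4) for k in range(1, 5)])
    # velocity Verlet integration step
    v = [v[i] + 0.5 * dt * f[i] for i in range(N + 2)]
    u = [u[i] + dt * v[i] for i in range(N + 2)]
    u[0] = u[N + 1] = 0.0
    f = forces(u)
    v = [v[i] + 0.5 * dt * f[i] for i in range(N + 2)]
```

Monitoring the energy of the first few normal modes in this way shows how slowly, and how incompletely, the energy spreads among them, which is the puzzle that motivated the original study.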