1.1. Introduction
Randomness is everywhere, for better or for worse: vagaries of the weather, day-to-day fluctuations in the stock market, random motions of molecules and random genetic mutations are just a few examples. Random numbers have been used for more than 4,000 years, but they have never been in such high demand as they are today. What is the origin of randomness in nature, and how does it relate to the only access we have to phenomena, that is, measurement? How does randomness in nature relate to randomness in sequences of numbers? The theoretical and mathematical analysis of randomness is far from obvious. Moreover, as we will show, it depends on (and is relative to) the particular theory being used, the intended theoretical framework for the phenomena under investigation.
1.1.1. Brief historical overview
Democritus (460–370 BCE) attributed the causes of things to necessity and chance alike, justifying, for example, the fact that the atoms’ disorderly motion can produce an orderly cosmos. However, the first philosopher to think systematically about randomness was most likely Epicurus (341–270 BCE), who argued that “randomness is objective, it is the proper nature of events”.
For centuries, though, randomness was mathematically analyzed only in games and gambling. Luca Pacioli (in Summa de arithmetica, geometria, proportioni et proportionalita, 1494) studied how stakes had to be divided among gamblers, particularly in the difficult case when the game stops before the end. It is worth noting that Pacioli, a top Renaissance mathematician, also invented modern double-entry bookkeeping: human activities, from gambling to financial investments, were considered the locus of chance. As a matter of fact, early Renaissance Florence was the birthplace of banks, paper currency and (risky) financial investments and loans.
Cardano (in De Ludo Aleae (The Game of Dice), 1525) developed Pacioli’s analysis further. His book was only published in 1663, so Fermat and Pascal independently, and more rigorously, rediscovered the “laws of chance” for interrupted games in a famous exchange of letters in 1654. Pascal clarified, against common sense, that games of chance are history-independent: dice do not remember previous draws. Probabilities were generally considered a tool for coping with the lack of knowledge in human activities: in contrast to God, we can neither predict the future nor master the consequences of our (risky) actions. For the thinkers of the scientific revolution, randomness is not in nature, which is a perfect “Cartesian Mechanism”: science is meant to discover the gears of its wonderful and exact mechanics. At most, as suggested by Spinoza, two independent, well-determined trajectories may meet (a walking man and a falling tile) and produce a random event. This may be considered a weak form of “epistemic” randomness: the union of the two systems, if known, would yield a well-determined and predictable encounter.
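To make the Fermat–Pascal analysis of interrupted games concrete, here is a standard reconstruction of their argument (an illustration, not a quotation from the 1654 letters). Suppose the first player to win three rounds takes the whole stake, each round is a fair coin flip, and the game is interrupted at the score 2–1. Player A needs one more win and player B needs two, so at most two further rounds would settle the game. Of the four equally likely continuations AA, AB, BA and BB, player A wins the stake in the first three and player B only in the last:

$$P(A \text{ wins}) = \frac{3}{4}, \qquad P(B \text{ wins}) = \frac{1}{4},$$

so the stake should be divided 3:1. Counting fictitious completions of rounds that would never actually be played was precisely Fermat’s decisive move.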
Galileo, who also studied randomness, though only for dice (Sopra le scoperte dei dadi, 1612), was the first to relate measurement and probability (1632). For him, errors in physical measurement are unavoidable, yet small errors are the most probable. Moreover, errors distribute symmetrically around the mean value, whose reliability increases with the number of measurements.
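Galileo’s observation is easy to restate in modern terms. The following is a minimal simulation sketch, not Galileo’s own formulation; in particular, the Gaussian error model and the chosen noise level are assumptions added for illustration.

```python
import random

TRUE_VALUE = 10.0  # the quantity being measured (hypothetical)

def measure(noise: float = 0.5) -> float:
    # One measurement with a symmetric error around the true value.
    # Gaussian noise is an assumption: Galileo only argued that small
    # errors are more probable and distribute symmetrically.
    return random.gauss(TRUE_VALUE, noise)

for n in (10, 100, 10_000):
    mean = sum(measure() for _ in range(n)) / n
    print(f"n = {n:6d}  sample mean = {mean:.4f}  error = {abs(mean - TRUE_VALUE):.4f}")
```

Running it shows the sample mean drifting toward the true value as n grows, which is exactly the sense in which the reliability of the mean “increases with the number of measurements”.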
Almost two centuries later, Laplace brought Pascal’s early insights to the modern rigor of probability theory (1998). He stressed the role of limited knowledge of phenomena in making predictions by equations: only a daemon with complete knowledge of all the forces in the Universe could “embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes”. The connection between incomplete knowledge of natural phenomena and randomness is made, yet no analysis of the possible “reasons” for randomness is proposed: probability theory gives a formal calculus of randomness, with no commitment on the nature of randomness. Defining randomness proved to be a hugely difficult problem which has only received acceptable answers in the last 100 years or so.
1.1.2. Preliminary remarks
Randomness is a tricky concept which comes in many flavors (Downey and Hirschfeldt 2010). Informally, randomness means unpredictability, a lack of patterns or correlations. Why is randomness so difficult to understand and model? An intuitive answer comes from the myriad of misconceptions and logical fallacies related to randomness, like the gambler’s fallacy. In spite of the work of mathematicians since the Renaissance, the belief persists that after a coin has landed on tails 10 consecutive times, the coin is more likely to land on heads on the next flip. Similarly, common sense holds that there are “due” numbers in the lottery (since all numbers eventually appear, those that have not come up yet are “due”, and thus more likely to come up soon). Each proposed definition of randomness seems doomed to be falsified by some more or less clever counter-example.
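A small simulation makes the gambler’s fallacy tangible (a sketch added for illustration; the flip count and seed are arbitrary): among a long run of simulated fair coin flips, the flip that follows ten consecutive tails still lands heads about half the time.

```python
import random

random.seed(0)
N = 1_000_000
flips = [random.randint(0, 1) for _ in range(N)]  # 1 = heads, 0 = tails

# Collect the flip immediately following every run of ten tails.
after_ten_tails = [
    flips[i]
    for i in range(10, N)
    if not any(flips[i - 10:i])  # the previous ten flips were all tails
]

print(f"runs of ten tails found: {len(after_ten_tails)}")
print(f"P(heads | ten tails) ~ {sum(after_ten_tails) / len(after_ten_tails):.3f}")
```

The estimated conditional probability hovers around 0.5: the coin has no memory.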
Even intuitively, the quality of randomness varies: tossing a coin may seem to produce a sequence of zeroes and ones which is less random than that produced by Brownian motion. This is one of the reasons why users of randomness, such as casinos, lotteries, polling firms, election authorities and clinical trials, are hard-pressed to “prove” that their choices are “really” random. A new challenge is emerging, namely, to “prove randomness”.
For physical systems, the randomness of a process needs to be differentiated from that of its outcome. Random (stochastic) processes have been extensively studied in probability theory, ergodic theory and information theory; process (or genesis) randomness refers to such processes. On the one hand, a “random” sequence need not be the output of a random process (e.g. a mathematically defined “random” sequence); on the other hand, a random process (e.g. a quantum random generator) is expected, but not guaranteed, to produce a “random” output. Outcome (or product) randomness provides a prima facie reason for the randomness of the process generating that outcome (Eagle 2005, p. 762), but, as argued in Frigg (2004, p. 431), process and outcome randomness are not extensionally equivalent. Process randomness has no mathematical formalization and can only be accessed/validated via theory or outcome randomness.
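The first half of this distinction is easy to demonstrate (a sketch added for illustration, not a formalization): a fully deterministic process, such as a seeded pseudo-random generator, yields an outcome that passes a naive statistical test, even though re-running the process reproduces the identical sequence.

```python
import random

# Deterministic process: a seeded pseudo-random generator. The same seed
# always reproduces the same sequence, so the *process* is not random.
rng = random.Random(42)
bits = [rng.randint(0, 1) for _ in range(100_000)]

# Naive *outcome* test: the frequency of ones is close to 1/2, so the
# output looks statistically random despite its deterministic origin.
print(f"frequency of ones: {sum(bits) / len(bits):.4f}")
print("fully reproducible:",
      bits == [random.Random(42).randint(0, 1) for _ in range(100_000)])
```

Of course, a single frequency test is far from a certificate of outcome randomness; it merely shows how a deterministic genesis can produce random-looking products.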
Measurement is a constant underlying issue: we can only associate a number with a “natural” process by measurement. Most of the time we actually associate an interval (an approximation), or an integer or a rational number, as a form of counting or drawing.
Let us finally emphasize that, in spite of the existing theoretical differences in the understanding of randomness, our approach unifies the various forms of randomness in a relativized perspective:
Randomness is unpredictability with respect to the intended theory and measurement.
We will proceed from this epistemological standpoint, which will allow us to discuss and compare randomness in different theoretical contexts.
1.2. Randomness in classical dynamics
A major contribution to the contemporary understanding of randomness was given by Poincaré. With his “negative result” (his words) on the Three Body Problem (1892, a relatively simple deterministic dynamics, see below), he proved that minor fluctuations or perturbations below the best possible measurement may manifest as measurable, yet unpredictable, consequences: “we then have a random phenomenon” (Poincaré 1902). This started the analysis of deterministic chaos, as his description of the phase-space trajectory derived from a nonlinear system is the first description