This form of randomness mimics the human perception of randomness well, but its quality is rather low because computability destroys many symptoms of randomness, e.g. unpredictability. One of the reasons is that pseudo-random generators “silently fail over time, introducing biases that corrupt randomness” (Anthes 2011, p. 15).
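To see concretely why computability destroys unpredictability, consider a minimal sketch (our own illustration, not from the text; the constants are the classic Numerical Recipes parameters): a linear congruential generator whose entire future becomes computable the moment a single output is observed.

```python
# A linear congruential generator (LCG): a classic pseudo-random generator.
# Because the recurrence is computable, anyone who observes one output can
# reproduce every later output exactly: no unpredictability survives.

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Yield n outputs of the recurrence x -> (a*x + c) mod m."""
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        yield x

stream = list(lcg(seed=42, n=5))

# "Prediction attack": restart the recurrence from the first observed
# output and recover all subsequent values.
recovered = list(lcg(seed=stream[0], n=4))
assert recovered == stream[1:]  # perfect prediction of the whole future
```

Real generators hide more internal state than this toy example, but the point stands: a computable process is, in principle, fully predictable by any observer who knows (or reconstructs) its rule and state.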
Although, today, no computer or software manufacturer claims that their products can generate truly random numbers, such mathematically unfounded claims have re-appeared for randomness produced with physical experiments. They appear in papers published in prestigious journals, like Deutsch’s famous paper (Deutsch 1985), which describes two quantum random generators (3.1) and (3.2) that produce “true randomness”, or the 2010 Nature editorial titled True Randomness Demonstrated (Pironio et al. 2010). Companies market “true random bits” which are produced by a “True Random Number Generator Exploiting Quantum Physics” (ID Quantique) or a “True Random Number Generator” (MAGIQ). “True randomness” does not necessarily come from the quantum: for example, “RANDOM.ORG offers true random numbers to anyone on the Internet” (Random.Org) using atmospheric noise.
Evaluating the quality of quantum randomness can now be done in a more precise framework. In particular, we can answer the question: is a sequence produced by indefinitely repeating the measurement of a value indefinite observable computable? The answer is negative, in a strong sense (Calude and Svozil 2008; Abbott et al. 2012):
THEOREM 1.6.– Indefinitely repeating the measurement of a value indefinite observable, under the conditions of Theorem 1.1, produces a bi-immune sequence (a strong form of incomputable sequence, for which any algorithm can compute only finitely many exact bits).
Incomputability appears maximally in two forms:
– individualized: no single bit can be predicted with certainty (Theorem 1.4), i.e. an algorithmic computation of a single bit, even if correct, cannot be formally certified;
– asymptotic (Theorem 1.6): only finitely many bits can be correctly predicted via an algorithmic computation.
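To make the notion in Theorem 1.6 concrete, here is one standard characterization of bi-immunity from algorithmic information theory (our reminder, in notation not used above; see e.g. Calude (2002)):

```latex
% One standard characterization: a binary sequence x = x_1 x_2 x_3 ...
% is bi-immune when no algorithm computes infinitely many of its bits
% exactly at known positions.
\[
x \ \text{is bi-immune} \iff
\neg\, \exists\, \varphi \ \text{partial computable with } |\mathrm{dom}(\varphi)| = \infty
\ \text{such that}\ \forall i \in \mathrm{dom}(\varphi):\ \varphi(i) = x_i .
\]
```

In this form, the asymptotic statement above reads directly: any algorithm can compute only finitely many exact bits of the sequence.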
It is an open question whether a sequence as in Theorem 1.6 is Martin-Löf random or Schnorr random (Calude 2002; Downey and Hirschfeldt 2010).
1.7. Conclusion and opening: toward a proper biological randomness
The relevance of randomness in mathematics and in the natural sciences further vindicates Poincaré’s views against Hilbert’s. The former stressed (since 1890) his interest in negative results, such as his Three-Body Theorem, and further claimed in Science et Méthode (1908) that “… unsolvable problems have become the most interesting and raised further problems to which we could not think before”.
This is in sharp contrast with Hilbert’s credo – motivated by his 1900 conjecture on the formal provability (decidability) of the consistency of arithmetic – presented in his address to the 1930 Königsberg Conference (Hilbert 1930):
For the mathematician, there is no unsolvable problem. In contrast to the foolish Ignorabimus, our credo is: We must know, We shall know.
As a matter of fact, the investigation of the theoretical ignorabimus, e.g. the analysis of randomness, opened the way to a new type of knowledge, which does not need to give yes or no answers, but raises new questions and proposes new perspectives and, possibly, answers based on the role of randomness: for example, a demonstrable mathematical unpredictability, from which the geometry of Poincaré’s dynamical systems and the subsequent notion of an attractor are derived. Hilbert was certainly aware of Poincaré’s unpredictability, but (or thus?) he limited his conjectures of completeness and decidability to formal systems, i.e. to purely mathematical statements. Poincaré’s result instead, as recalled above, makes sense at the interface of mathematics and the physical world; for an analysis of the methodological links with Gödel’s theorem, which is purely mathematical, and with Einstein’s result on the alleged incompleteness of quantum mechanics, see Longo (2018).
The results mentioned in section 1.6.1 use Birkhoff randomness in dynamical systems to turn physico-mathematical unpredictability into a purely mathematical form: they give relevant relations between the two frames by embedding the formal theory of some physical dynamics into a computational frame, in order to analyze unpredictability with respect to that theory.
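For reference, the notion invoked here can be stated as follows (a standard formulation with our notation; the section itself does not spell it out). A point x is Birkhoff-typical, or “Birkhoff random”, for a dynamics (X, T, μ) when the time average of every suitable observable f along its orbit equals the space average:

```latex
% Birkhoff average of an observable f along the orbit of x under the map T.
% The point x is Birkhoff-typical for (X, T, \mu) when, for every suitable
% (e.g. continuous) observable f, the time average equals the space average:
\[
\lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} f\!\left(T^{k}(x)\right)
  \;=\; \int_{X} f \, d\mu .
\]
```

Unpredictability with respect to a given theory can then be studied as a property of such points inside a computational frame.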
As for biology, the analysis of randomness at all levels of biological organization, from molecular activities to organismal interactions, clearly plays an increasing role in contemporary work. Still, we are far from a sound unified frame. A first reason is the lack of unity even in the fragments of advanced physical analysis in molecular biology (typically, the superposition of classical and quantum randomness in a cell), to which one should add the hydrodynamical effect and the so-called “coherence” of water in cells pioneered by del Giudice (2007).
Second, biology has to face another fundamental problem. If one assumes a Darwinian perspective and considers phenotypes and organisms as proper biological observables, then the evolutionary dynamics imply a change in the very space of (parameters and) observables. That is, a phylogenetic analysis cannot be based on a priori physical knowledge, the so-called condition of possibility for physico-mathematical theories: space-time and the pertinent observables, i.e. the phase space. In other words, a phylogenetic path cannot be analyzed in a pre-given phase space, as in all physical theories, including quantum mechanics, where self-adjoint operators on a Hilbert space describe observables. Evolution, by the complex blend of organisms and their ecosystem, co-constitutes its own phase space, and this in a (highly) unpredictable way. Thus, random events in biology do not just “modify” the numerical values of an observable in a pre-given phase space, as in physics: they modify the very biological observables, the phenotypes, as is argued more closely in Longo and Montévil (2014b). If we are right, this poses a major challenge: in the analysis of evolutionary dynamics, randomness may not be measurable by probabilities (Longo et al. 2012a; Longo and Montévil 2014a). This departs from many centuries of discussion on chance expressed only by probabilities.
If the reader were observing the Burgess fauna, some 520 million years ago (Gould 1989), they would not be able to attribute probabilities to the chances of survival of Anomalocaris or Hallucigenia, or of one of the little chordates, nor to the probability of their becoming a squid, a bivalve or a kangaroo. To the challenges of synchronic measurement, proper to physical state-determined systems, such as the ones we examined above, one has to add the even harder approximation of diachronic measurement, in view of the relevance of history in the determination of the biological state of affairs (Longo 2017).
Note that in section 1.4 we could have made a synthetic prediction: phenotypic complexity increases along evolution by a random, but asymmetric, diffusion. This conclusion would have been based on a global evaluation, a sum of all the numbers we associated with biological forms (fractal dimensions, networks’ numbers, tissue folding … all summed up). In no way, however, can one “project” this global value onto specific phenotypes. There is no way to know whether the increasing complexity could be due to the transformation of the lungs of early tetrapods into swim bladders and gills (branchia), or of their “twin-jointed jaw” into the mammalian