The Contributory Revolution. Pierre Giorgini
However, this scientific development will, almost as a side benefit, generate observation tools that will widen the scope of phenomena subject to human scrutiny. The telescope will reveal the invisible realm of the infinitely large (astrophysics), and microscopy, together with the tools of electromagnetic experimentation and measurement, the invisible realm of the infinitely small. Galileo will assert, in a completely counter-intuitive way, that the Sun does not revolve around the Earth but rather the opposite: the passage from geocentrism to heliocentrism. The interplay of cross-fertilization between mathematics and physics will then literally explode, giving rise to new formalisms that are increasingly counter-intuitive. They will no longer conform to principles derived from our relationship to the visible and temporal world as perceived directly by the human senses. They will nevertheless be effective, because they will allow the formal construction to legitimize itself through the principles of proof by measurement and indirect observation. We will see, moreover, that in certain cases these devices can no longer be thought of as neutral and transparent, merely transcribing a faithful part of the observed reality through observation and measurement without intervening in it (this is no longer the case in quantum physics, for example).
But this legitimacy through evidence or experience will inevitably remain partial. This is, moreover, a first elementary principle of incompleteness. We will return to it because, however numerous its instances may be, experience will remain discrete (discontinuous and countable) in this world of the invisible and will never completely cover the continuous field theoretically covered by the formalism itself. Experience will tell us that a principle is verified here and there, while never being able to prove that it holds everywhere and always.
There are many examples of theories leading to counter-intuitive principles of proof. In mathematics, this is the case of the field of complex numbers, which escapes the order relation and the notion of successor; the matrix formalism, which furthermore escapes the commutativity of the product; non-commutative geometry; projective geometry, which axiomatizes the fact that two parallel lines intersect at infinity; Hilbert's N-dimensional spaces; and so on. Their equivalents in physics can also be mentioned: general relativity, which upsets our intuitive reference points in terms of space–time; quantum physics, with non-locality, for example, or Heisenberg's uncertainty principle (the more precisely we know the position of a particle, the less precisely we can determine its momentum, and vice versa); or string theory, which requires a 10-dimensional geometric formalism.
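To make two of these counter-intuitive features concrete, here is a minimal Python sketch (the specific values and matrices are arbitrary illustrative choices): the complex numbers admit no order comparison, and the matrix product does not commute.

```python
import numpy as np

# 1. The complex numbers escape the order relation: Python itself
#    refuses to compare two complex numbers.
try:
    (1 + 2j) < (2 + 1j)
except TypeError as err:
    print("no order on complex numbers:", err)

# 2. The matrix product escapes commutativity: in general A @ B != B @ A.
A = np.array([[0, 1],
              [0, 0]])
B = np.array([[0, 0],
              [1, 0]])
print("A @ B =\n", A @ B)
print("B @ A =\n", B @ A)  # a different matrix: the product does not commute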
Here again, we will see the emergence of new principles of uncertainty linked to the fact that, in reality, it is impossible to measure each of the components of a complex number or a matrix, but only their product, their square, or a linear combination of them taking values in the space of real numbers, which, as their name indicates, are the only ones accessible to measurement. The same is true of 10-dimensional spaces, which can only be observed in reduced form, by projection into a three- or four-dimensional space. This introduces an intrinsic distance between the "deterministic truth" of a theory and the uncertainty of the principles of proof. Knowing the product of two variables is not enough to determine the value of each variable. Similarly, the appearance of probabilistic variables makes it impossible to reproduce experiments in which strictly identical initial conditions give strictly identical results. And the converse also holds: two absolutely identical experimental results can come from two completely different situations in formal terms.
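A trivial numerical illustration of this last point (the particular values below are arbitrary examples): many different complex numbers share the same squared modulus, so a measurement yielding only |z|² cannot determine the components of z.

```python
# All four complex numbers below share the same squared modulus, so a
# measurement that yields only |z|^2 cannot recover the components of z.
for z in (3 + 4j, 5 + 0j, 0 + 5j, 4 - 3j):
    print(z, "-> |z|^2 =", abs(z) ** 2)  # prints 25.0 each time
```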
Let us take a well-known example from quantum physics. The wave functions governed by Schrödinger's equation can be broken down into a product of two functions: a form function (the trajectory for the wave, the probability of presence for the particle or confined wave) and an energy function governed by the universal law of energy conservation. Formally, this theory is perfectly deterministic: the equation and its solution have no random component. It is the interpretation, and especially the process of quantum measurement, that introduces the uncontrollable, through what is historically called the wave packet reduction rule (the projection postulate): a random projection onto one of the components. Without measurement, the evolution of the quantum system would be just as deterministic for a physicist as that of a classical system.
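A minimal numerical sketch can make this separation concrete, assuming an arbitrary two-level system and Hamiltonian (illustrative choices, not taken from the book): the evolution step is strictly deterministic and reproducible, while only the measurement step injects chance.

```python
import numpy as np

rng = np.random.default_rng()

# Toy two-level quantum system (hbar = 1). H is an arbitrary Hermitian
# Hamiltonian chosen purely for illustration.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def evolve(psi, t):
    """Deterministic Schrodinger evolution: psi(t) = exp(-i H t) psi(0)."""
    energies, vectors = np.linalg.eigh(H)
    U = vectors @ np.diag(np.exp(-1j * energies * t)) @ vectors.conj().T
    return U @ psi

def measure(psi):
    """The only random step: Born-rule projection onto one component."""
    probabilities = np.abs(psi) ** 2
    outcome = rng.choice(len(psi), p=probabilities / probabilities.sum())
    collapsed = np.zeros_like(psi)
    collapsed[outcome] = 1.0  # "wave packet reduction" onto the observed component
    return outcome, collapsed

psi0 = np.array([1.0 + 0j, 0.0 + 0j])  # start in the first basis state
psi_t = evolve(psi0, t=1.0)            # same inputs always give the same psi_t
print("deterministic state:", np.round(psi_t, 3))
print("random measurement outcomes:", [measure(psi_t)[0] for _ in range(10)])
```

Run twice, the evolved state is identical both times; only the list of measurement outcomes differs from run to run.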
Conversely, a formal theory can only be constructed within a formal locality (a physical or temporal space defined by an axiomatic system). Another source of incompleteness then emerges: the influence of what is not local, or of what lies at the limits of this physical or temporal formal locality: hidden variables, for example, or side effects, or sensitivity to initial conditions.
All in all, we can see that what underpins the epistemology of this mathematical–physical approach is exo-distributive. The principle of formal construction proceeds from the fabrication of a formal intelligence external to the object, one which even "creates the object" by constituting it conceptually. The principle of proof is separate, and real matter is thought by the scientist to be subject to formal laws that determine it, even in time, according to a principle of causality. Even if incompleteness and uncertainty are intrinsic (we will return to this), the formalism is distributed within the real so as to constitute it as an object that "no longer" escapes human understanding. In fact, formal continuity creates real continuity.
Thus, let us imagine a lottery drum, itself made up of compartments each containing ten balls, one set for each digit position (units, tens, hundreds, etc.). Although the result of the draw is perfectly random, each of the underlying processes (the impacts between the balls, between the balls and the walls, the centrifugal acceleration) is perfectly deterministic, and yet their combination creates a perfectly random process. This possibility was highlighted by Henri Poincaré (1890) for the laws of gravitation in his work on the three-body problem. The temporality of the machine, and its conformity to the laws distributed within it that account for its functioning, determine it entirely, even if incompleteness and indeterminacy prevail through the combinatorial process (impacts, speeds, etc.). We speak of deterministic chaos. Indeed, in theory, if the initial conditions at the launch of the draw were exactly the same (weight and shape of the balls, atmosphere, temperature, pressure, etc.), the result would be identical each time. But the multiplicity of impacts, the divergent nature of the rebounds off the walls, and the sensitivity of the trajectories to the exact nature of each impact mean that a tiny variation in any one of the many conditions, whether initial or arising during the draw, makes the uncertainty grow exponentially with time.
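A minimal sketch of this exponential growth of uncertainty, using the logistic map as a standard textbook stand-in for the lottery drum (an illustrative substitution, not the author's example): two trajectories follow the same perfectly deterministic rule, differing only in the ninth decimal place of their initial condition, and decorrelate within a few dozen steps.

```python
# Deterministic chaos in miniature: the logistic map x -> r * x * (1 - x)
# in its chaotic regime (r = 4). Every step is perfectly deterministic,
# yet a difference in the ninth decimal place of the initial condition
# grows roughly exponentially until the trajectories are unrelated.
r = 4.0
x, y = 0.400000000, 0.400000001

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
```

Run twice with the same initial values, the printout is identical; it is only the tiny initial discrepancy, not any randomness in the rule, that drives the divergence.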
So what difference does it make if the objects are not inert and the machine is a biological, living organism? The first and most important difference is that the machine has no contingent intention other than to provide the lottery number. The designer's intention is entirely and exclusively translated into a technique that has no autonomy with respect to that intention. There is no intrinsic conservation requirement apart from the principle of energy conservation, no intrinsic "intelligence or decision-making capacity" distributed in the compartments or in the balls themselves. For the machine, essence precedes existence. The design of the machine and its operation have the sole aim of putting it at the service of its designer's objective: to draw lottery numbers at random.
For the living world, on the other hand, each complex constituent element carries a teleology that perpetually reinvents itself, driven by a "contingent purpose" that runs through it like a force field: the forces of disorder, of entropic alteration, against the forces of anti-entropic conservation, an intrinsic intelligence, a distributed decision-making capacity, continuously reinventing its ecorithms (Valiant 2013) in a co-contributory manner. Moreover, the chaos intrinsic to the complexity and variability of the initial and "living" conditions is subject to ongoing anti-chaotic corrections, maintained by adaptive ecorithms.
It is important to pause here for a moment on this probably inappropriate term, "purpose". I use it deliberately, as it gives me the opportunity to clarify a fundamental point to which I will return at the end of the book. The word "purpose" is tricky because it most often implies the notion of intentionality. As with the machine, it implies the existence of a designer (a great architect) with a precise intention; the word "teleology" would therefore be more precise, for it denotes an internal purpose within a living organism, whereas "purpose" brings to mind the whole universe and a designer