Computer science as a science of the calculable appears here. All these integer functions, everything that mathematicians can define on the integers in the form of various equations, are equivalent to programs of abstract machines. For any function taking one sequence of integers to another to make mathematical sense, to be coherent, there must exist some abstract machine, an “abstract computer”, whose instructions allow it to be calculated. The existence of a mathematical function on the integers has meaning if the computable gives it one, and vice versa. This very powerful theoretical claim is the famous thesis of Alonzo Church, dating from 1936. It amounts to saying that for a function on the integers to have mathematical meaning, to be coherent, it suffices to define the program of a theoretical machine that can calculate it; if no such program exists, the function does not exist and is not logically admissible. Today, the fields of application of computer science are considerable, making it possible to represent practically all structurable knowledge in every field and to drive vast numbers of electronic devices in real time.
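To make the idea concrete, here is a minimal sketch, in Python, of such an abstract machine: a tiny counter machine whose two instructions suffice to compute functions on the integers. The instruction set and the addition program are illustrative assumptions, not a construction taken from the text.

```python
# A minimal "abstract computer": a counter machine with two instructions,
# INC (increment a register) and DEC (decrement, or jump if zero).
# This is an illustrative sketch, not the book's construction.

def run(program, registers):
    """Execute a list of instructions on a dict of integer registers."""
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "INC":            # increment a register
            registers[args[0]] += 1
            pc += 1
        elif op == "DEC":          # if the register is zero, jump; else decrement
            r, target = args
            if registers[r] == 0:
                pc = target
            else:
                registers[r] -= 1
                pc += 1
        elif op == "HALT":
            break
    return registers

# A program computing addition: move the content of register "b" into "a".
add = [
    ("DEC", "b", 3),   # if b == 0, jump to HALT; else b -= 1
    ("INC", "a"),      # a += 1
    ("DEC", "z", 0),   # z is always 0: an unconditional jump back to the start
    ("HALT",),
]

print(run(add, {"a": 3, "b": 4, "z": 0})["a"])  # -> 7
```

Church’s thesis then asserts that every effectively calculable function on the integers can be realized by a program of some such machine.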
In the usual approach, computer science deals with the processing of information through multiple calculations, including those built from a great number of functions, using as a basic element what are called state machines. A state machine is an abstract machine that passes through strictly determined states, choosing them one after another from a set defined in advance, and in each of which precise elementary instructions are executed. The automaton is used by starting from an initial state and reaching a final state, which is the expected result of the successive calculations. This notion of state automaton is fully exploited for problems that decompose into many sub-problems, all very well defined, the whole forming a perfectly structured set in which what is to be computed at each step is well specified and determines the continuation of what is to be computed. Such problems belong to the class of so-called well-decomposable problems. At the electronic level, computer science deals with binary information encoding elementary instructions, which form programs composed of sequences of calculations carried out by those instructions. The length of a program and its number of instructions can be considerable, and several programs can easily run at the same time, exchanging information at the right moments to carry out well-synchronized calculations. But, classically, any program remains a sequence of calculations which, step by step, passes through a finite number of predefined states until it reaches its final state.
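As an illustration of this notion, here is a minimal sketch of a state automaton in Python; the machine, its alphabet and its accepting state are assumptions chosen for the example. It starts from an initial state, executes one well-defined transition per input, and its final state is the result of the computation.

```python
# A minimal state automaton: strictly determined states, one precise
# transition per input symbol. This example machine tracks the parity
# of the number of 1s it has read (an illustrative assumption).

transitions = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def run_automaton(inputs, initial="even", accepting={"even"}):
    state = initial                           # the strictly determined initial state
    for symbol in inputs:
        state = transitions[(state, symbol)]  # one well-defined step per input
    return state in accepting                 # the final state is the result

print(run_automaton("1101"))  # False: three 1s, so the parity is odd
```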
This locally mechanistic vision of the computing process has evolved considerably. Today, we know how to make very many programs, still based on state machines, communicate while running in parallel and, above all, modify their own machinery during operation by communicating in order to synchronize, even though the basis of each program remains the state machine. We have therefore shifted from the regular automation of programs to the notion of autonomy. We know how to build programs made of many sub-programs that have their own behavior, that can communicate, synchronize and modify themselves, and that can above all generate new programs, breaking the order given by the initially conceived state machines. These systems are called adaptive multiprocess systems, and they are the ones that run on current computer clusters. Indeed, this is the case for any networked operating system managing the simultaneously active resources and applications of an ordinary desktop computer. The notion of multiprocessing is important, and it has been a basic notion for the conception of artificial consciousness (Cardon 2018), because it places programs at the level of autonomous software entities that are active, carry out precise local actions and, above all, communicate intensively with one another to form dynamic structures that are constantly changing.
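The following minimal sketch, an assumption rather than the book’s architecture, shows two concurrent programs that communicate to synchronize: each one’s behavior at every step depends on the value just received from its partner.

```python
# Two concurrent workers that communicate to synchronize. Each sends a
# message, then blocks until it receives its partner's answer, so the
# value it works on at each step depends on the exchange (a sketch).

import threading
import queue

# One channel per worker, so each can wait for the other's message.
channels = {"A": queue.Queue(), "B": queue.Queue()}

def worker(me, other, start):
    value = start
    for _ in range(3):
        channels[other].put(value + 1)  # send an updated value to the partner
        value = channels[me].get()      # block until the partner answers
        print(f"{me} received {value}")

a = threading.Thread(target=worker, args=("A", "B", 0))
b = threading.Thread(target=worker, args=("B", "A", 100))
a.start(); b.start()
a.join(); b.join()
```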
We will use the following two different notions of the term process:
– The notion of a functional process, which is seen as a vast movement of components exchanging information and energy and producing the state of a certain system, as with brains producing representations. We can thus speak of the process of emergence of a form of thought about something, focused by a trigger that generates intention.
– The much more precise notion of a computer process, which is a small program wrapped in utilities and handled by a computer system that runs large numbers of them simultaneously. We will then speak of swarms of processes to designate very numerous computer processes running concurrently, this notion of a swarm of processes being close to the other notion of functional process.
Generally speaking, there are two categories of programs in computer science:
– The category of programs whose purpose is to calculate a given function, precise and well defined in advance, by strictly carrying out the calculations of all the necessary steps, which amounts to executing a structured set of state machines.
– The category of autonomous programs composed of multiple swarms of processes that run in parallel, capture internal and external information, meet at certain moments to exchange information, modify themselves and generate new processes, so as to finally produce a global result that is best adapted to a situation that has evolved since its initial phase.
The second category is illustrated, for example, by the state of all Internet users’ programs over a given period, as these programs constantly consult and modify highly interactive websites. In this case there are no permanent elements, and the problem cannot be based on an a priori decomposition into permanent elements.
The difference between the two categories of problems lies in a fundamental point. There are programs in both cases, but in the second case they must constantly be modified, rewritten and made to evolve, whereas in the first case the software is used as it was conceived, with initial capacities that are well defined and do not vary during use. The second case should evoke a certain form of autonomous, very abstract, artificial behavior.
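As a hedged illustration of this second category, the sketch below shows a small swarm of concurrent tasks that inspect the data they capture and generate new tasks when the situation requires it, so that the set of running programs is not fixed in advance; the splitting rule is an assumption chosen for the example.

```python
# A swarm of concurrent tasks in which each task may generate new tasks,
# so the running set of "programs" is not decided in advance (a sketch;
# the splitting rule for large work items is an illustrative assumption).

import asyncio

async def task(item, results):
    if item > 10:                       # the situation demands adaptation:
        half = item // 2                # generate two new tasks and retire
        await asyncio.gather(task(half, results), task(item - half, results))
    else:
        results.append(item * item)     # small enough: compute directly

async def main():
    results = []
    await asyncio.gather(*(task(n, results) for n in [4, 25, 7]))
    print(results)                      # e.g. [16, 36, 36, 36, 49, 49]

asyncio.run(main())
```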
1.3. Formation of the Universe in physical sciences
In mathematics there exists the domain of real numbers, which is a complete Archimedean field, and whose use makes it possible to define sophisticated and very powerful equations: differential equations and partial differential equations. This field of real numbers, together with differential equations, has allowed physicists to define very fine theories of the evolution of matter in the Universe. To define a differential equation, we first define the variables that characterize the observed system, and then the functions and relations between them that should make it possible to predict the evolution of the values of those variables, taking into account the values of certain constants. All this is expressed as a differential equation whose solution yields the character of the evolution of the system from an initial state. This will be used in a very important way.
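As a worked illustration of this recipe, under assumed names: the variable is a quantity x(t), the constant is a rate k, and the differential equation dx/dt = kx predicts the evolution of x from the initial state x(0). Euler’s method computes this evolution step by step.

```python
# Numerically integrating dx/dt = k * x from an initial state x(0) = x0
# with Euler's method (a sketch; the variable x, constant k and step dt
# are assumed names chosen for the example).

import math

k, x0 = 0.5, 1.0         # constant of the system and initial state
dt, steps = 0.01, 1000   # integration step and horizon (t goes from 0 to 10)

x = x0
for _ in range(steps):
    x += k * x * dt      # x(t + dt) ≈ x(t) + (dx/dt) * dt

# Compare with the exact solution x(t) = x0 * e^(k t): the two agree closely.
print(x, x0 * math.exp(k * dt * steps))
```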
The physical sciences have been working for a very long time on a generation model for the Universe. The main model of the physical theory describing the creation of the Universe is the Big Bang model, first posed by Alexandre Friedmann in 1922 and supported by Edwin Hubble’s observations in 1929, then very widely developed thereafter. This model posits a primordial nucleosynthesis: a very singular set of quantum elements at a considerable temperature, which inhibits the propagation of photons that continuously interact with the quantum particles. After about a hundred seconds, the photons lose energy, and protons and neutrons become able to bind durably, generating the first complex nuclei of the elements. From this initial set, the Universe developed by a considerable dilation while its temperature remained very high. The initial temperature of the