Neuro-inspired Information Processing. Alain Cappy

led to a veritable revolution: the development of the first microprocessor. Since then, digital information processing technologies have witnessed tremendous progress, in terms of both their technical performance and their impact on society.

      This remarkable evolution was only made possible by the existence of a universal model of information processing machines, the Turing machine, and a technology capable of physically implementing these machines, that of semiconductor devices. More specifically, the “binary coding/Von Neumann architecture/CMOS technology” triplet has been the dominant model of information processing systems since the early 1970s.

      Yet two limits have been reached at present: that of miniaturization, with devices now measuring only a few nanometers, and that of dissipated power, with a barrier of the order of 100 watts when the processor is working intensively.
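      To give an order of magnitude of this power barrier, the dynamic power of a CMOS circuit can be estimated with the classic switching-power formula; the numerical values below are typical assumed figures chosen for illustration, not data taken from a specific processor:

\[
P_{\mathrm{dyn}} \;\approx\; \alpha \, N \, C \, V_{DD}^{2} \, f
\;\approx\; 0.1 \times 10^{9} \times 1\,\mathrm{fF} \times (1\,\mathrm{V})^{2} \times 1\,\mathrm{GHz}
\;\approx\; 100\,\mathrm{W}
\]

      where α is the switching activity factor, N the number of logic gates, C the average capacitance switched per gate, V_DD the supply voltage and f the clock frequency. This back-of-the-envelope estimate simply illustrates why the 100-watt barrier is reached so quickly at gigahertz clock rates.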

      The subject matter addressed here covers a wide variety of fields, associating neuroscience, information technology, semiconductor physics and circuit design, as well as mathematics and information theory.

      To enable the reader to progress uninterrupted through this book, the basic concepts are regularly recalled, or references to the list of documents provided are given. Wherever possible, mathematical models of the phenomena studied are proposed, in order to enable an analysis that, while simplified, offers a quantitative picture of the influence of the various parameters. This reliance on analytical formulations is, we believe, a prerequisite for a sound understanding of the physics of the phenomena involved.

      This book is organized into four essentially independent chapters:

       – Chapter 1 introduces the basic concepts of electronic information processing, in particular coding, memory storage, machine architecture and CMOS technology, which constitutes the hardware support for such processing. As one of the objectives of this book is to expand on the link between information processing and energy consumption, various ways of improving the performance of current systems are presented – particularly neuro-inspired processing, its central topic. A fairly general comparison of the operating principles and performance of a modern microprocessor and of the brain is also given in this chapter.

       – Chapter 2 is dedicated to the known principles of the functioning of the brain, and in particular those of the cerebral cortex, also known as “gray matter”. In this part, the approach is top-down: the cortex is first looked at from a global, functional perspective before we study its organization into basic processing units, the cortical columns. An emblematic example, vision and the visual cortex, is also described to illustrate these different functional aspects.

       – Chapter 3 offers a detailed exploration of neurons and synapses, which are the building blocks of information processing in the cortex. Based on an in-depth analysis of the physical principles governing the properties of biological membranes, different mathematical models of neurons are described, ranging from the most complex to the simplest phenomenological models (a minimal example of the latter is sketched after this list). Based on these models, the response of neurons and synapses to various stimuli is also described. This chapter also explores the principles of propagation of action potentials, or spikes, along the axon, and examines how certain learning rules can be introduced into synapse models.

       – Finally, Chapter 4 covers artificial neural and synaptic networks. The two major approaches to creating these networks, using software or hardware, are presented, together with their respective performance and the state of the art for each. In this chapter, we show the benefits of the hardware approach for designing and building artificial neural and synaptic networks with ultra-low power and energy consumption, and examples of artificial neural networks ranging from the very simple to the highly complex are described.
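
      To give a concrete flavour of the simplest class of phenomenological neuron models mentioned for Chapter 3, the sketch below simulates a leaky integrate-and-fire neuron driven by a constant current step. It is only an illustrative example: the function name, parameter values and units are assumptions made for this sketch and are not taken from the book.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only: parameter values are assumed typical figures.
import numpy as np

def simulate_lif(i_input, dt=1e-4, tau=20e-3, r_m=10e6,
                 v_rest=-70e-3, v_thresh=-50e-3, v_reset=-70e-3):
    """Integrate tau*dV/dt = -(V - v_rest) + r_m*I, spiking at threshold."""
    v = np.full(len(i_input), v_rest)   # membrane potential trace (V)
    spike_times = []
    for t in range(1, len(i_input)):
        dv = (-(v[t - 1] - v_rest) + r_m * i_input[t - 1]) * dt / tau
        v[t] = v[t - 1] + dv
        if v[t] >= v_thresh:            # threshold crossing -> emit a spike
            spike_times.append(t * dt)
            v[t] = v_reset              # reset the membrane after the spike
    return v, spike_times

# Constant 2.5 nA current step for 0.5 s (dt = 0.1 ms): strong enough to
# drive the neuron above threshold, so it fires regularly.
current = np.full(5000, 2.5e-9)
v_trace, spikes = simulate_lif(current)
print(f"{len(spikes)} spikes, first at {spikes[0] * 1e3:.1f} ms")
```

      The ingredients of this sketch, a leaky membrane equation, a firing threshold and a reset, are the typical building blocks of this class of phenomenological models.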

      1. Memory that can be both written to and read from.

      2. Read-only memory.

      3. Zetta = 10²¹, and one byte is made up of 8 bits.

      4. Gordon Moore co-founded Intel in 1968.

      5. Represented by the number of logic gates per circuit.

      6. Also referred to as “bio-inspired”.
