Cryptography, Information Theory, and Error-Correction. Aiden A. Bruen


and secrecy systems furnish an interesting application of communication theory.

      Indeed, this is precisely the point of view which inspired the authors of this book! We believe it is unrealistic to separate the study of cryptography from the study of communication theory embodying error‐correction and information theory.

      To illustrate this, Shannon points out that, just as in error‐correction the receiver tries to decode a message sent over a noisy channel, so also in cryptography a receiver (this time Eve, the eavesdropper) tries to decode the message over a noisy channel, the noise in this case being the scrambling by the key, which obfuscates the plaintext into the ciphertext.

      In this paper, Shannon discusses at length his two famous principles of confusion and diffusion described in detail in Chapter 4. The Vernam cipher offers perfect security. We discuss perfect security in detail in Part II of the book where it is shown that, under appropriate conditions, perfect security corresponds precisely to a Latin square. Shannon's paper makes it quite clear that he was aware of this phenomenon though he did not explicitly state it.
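The Vernam cipher mentioned above encrypts by combining the plaintext with an equally long random key, and decryption repeats the same operation. A minimal sketch of this idea follows; the function name is ours, and a genuine one-time pad additionally requires that the key be truly random and never reused:

```python
import secrets

def vernam_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte."""
    assert len(key) == len(plaintext), "key must be as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the same XOR operation applied a second time.
message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # fresh random key, used once
cipher = vernam_encrypt(message, key)
assert vernam_encrypt(cipher, key) == message
```

Because every ciphertext is equally likely for every possible plaintext under a uniformly random key, an eavesdropper who sees only the ciphertext learns nothing; this is the sense in which the cipher is perfectly secure.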

      In the paper, Shannon clearly differentiates between computational and unconditional security. Whether or not he “missed” public key cryptography is far from clear. However, in [Mas02] Massey points out that Hellman, of Diffie–Hellman fame, has credited the following words from Shannon's paper as the inspiration for their discovery:

      The problem of good cipher design is essentially one of finding difficult problems …. We may construct our cipher in such a way that breaking it is equivalent to … the solution of some problem known to be laborious.

       Shannon theory: information compression and communication.

      Shannon's revolutionary paper [Sha49b] on information theory electrified the scientific world and has dominated the area of communication theory for over 50 years. No other work of the twentieth century has had greater impact on science and engineering.

      First of all, Shannon unified what had been a diverse set of communications – voice, data, telegraphy, and television. He quantified and explained exactly what information means. The unit of information is the Shannon bit. As Golomb et al. [GBC+02] so elegantly put it, this is the “amount of information gained (or entropy removed) upon learning the answer to a question whose two possible answers were equally likely, a priori.”

      In the above, we can think of entropy as “uncertainty,” analogous to entropy in physics (which is the key idea in the second law of thermodynamics). An example would be tossing a fair coin and learning which side turned up – heads or tails. If the coin were biased, so that the probability of a head was p (and the probability of a tail was q = 1 − p) with p ≠ 1/2, the information gained on learning the outcome of the toss would be less than one. The exact amount of information gained would be

      H(p) = −p log₂ p − q log₂ q Shannon bits.

      Note that when p = 1/2 and q = 1/2, this works out to be 1. However, if, for example, p = 2/3, we gain only approximately 0.918 Shannon bits of information on learning the outcome of the coin toss.
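The two values just quoted can be checked numerically. Here is a minimal sketch in Python (the function name binary_entropy is ours):

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Information, in Shannon bits, gained on learning the outcome
    of a coin toss whose head probability is p."""
    q = 1 - p
    # By convention, a zero-probability outcome contributes 0 bits.
    return sum(-x * log2(x) for x in (p, q) if x > 0)

print(binary_entropy(1 / 2))  # fair coin: 1.0 Shannon bit
print(binary_entropy(2 / 3))  # biased coin: about 0.918 bits
```

The maximum, one full Shannon bit, occurs exactly when the two outcomes are equally likely; any bias makes the outcome partly predictable and so reduces the information gained.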

      But this was only the beginning. Shannon then went on to prove his fundamental result on communication, based on entropy and the mathematical ideas delineated above. He showed that any given communications channel has a maximum capacity for reliably transmitting information, and he showed how to calculate this capacity. One can approach this maximum by certain coding techniques – random coding and now turbo coding – but one can never quite reach it. To put it succinctly: capacity is the bound to error‐free coding. Thus, for the last 50 years, the study of error correction has boiled down to attempts to devise encoding techniques that come close to the Shannon capacity. We will have much to say about this bound in Parts II and III of this book.
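As a concrete illustration of such a capacity calculation (our example, not the book's): for a binary symmetric channel, which flips each transmitted bit with probability p, Shannon's capacity is 1 minus the binary entropy of p, in bits per channel use:

```python
from math import log2

def capacity_bsc(p: float) -> float:
    """Shannon capacity, in bits per use, of a binary symmetric channel
    that flips each transmitted bit with probability p."""
    h = sum(-x * log2(x) for x in (p, 1 - p) if x > 0)  # binary entropy
    return 1 - h

print(capacity_bsc(0.0))   # noiseless channel: capacity 1.0
print(capacity_bsc(0.11))  # about 0.5 bits per channel use
```

A completely random channel (p = 1/2) has capacity zero: no coding scheme, however clever, can push reliable information through it.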

      Shannon's work, theoretical and practical, still dominates the field and the landscape. To quote Cover in [Cov02]:

      This ability to create new fields and develop their form and depth surely places Shannon in the top handful of creative minds of the century.

      Few can disagree with this assessment. Indeed, in Part III of this book, we describe protocols in cryptography and error‐correction based squarely on C.E. Shannon's work in information theory.

      The Data Encryption Standard, or DES, was originally approved in 1977 as a block cipher algorithm that provides good cryptographic protection. Computational power, however, has increased dramatically since 1977, and DES is no longer considered secure. Since May 2005, it has been recommended that DES no longer be used [Cen19].

      The Advanced Encryption Standard, or AES, the replacement for DES, is detailed
