Digital Communications 1. Safwan El Assad

about the symbol actually transmitted, called xi (one-to-one correspondence), therefore:

      [2.40] p(xi/yj) = 1 for i = j, and p(xi/yj) = 0 for i ≠ j

      Consequently:

      [2.41] H(X/Y) = 0

      and:

      [2.42] H(Y/X) = 0

      – Channel with maximum power noise: in this case, the variable at the input is independent of that of the output, i.e.:

      [2.43] p(xi/yj) = p(xi), for all i, j

      We then have:

      [2.44] p(xi, yj) = p(xi) p(yj)

      [2.45] H(X/Y) = H(X)

      [2.46] H(Y/X) = H(Y)

      Note.– In information security, if xi is the plaintext and yj is the corresponding ciphertext, then p(xi/yj) = p(xi) is the condition of perfect secrecy of a cryptosystem.
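
The perfect-secrecy condition can be checked numerically on the simplest example, a one-bit one-time pad. The sketch below assumes an illustrative plaintext prior (the 3/4, 1/4 values are not from the text) and a uniform, independent key:

```python
from itertools import product
from fractions import Fraction

# Illustrative plaintext prior (assumed values); key uniform and independent
p_x = {0: Fraction(3, 4), 1: Fraction(1, 4)}
p_k = {0: Fraction(1, 2), 1: Fraction(1, 2)}

# Joint distribution p(x, y) for the ciphertext y = x XOR k
p_xy = {}
for x, k in product(p_x, p_k):
    y = x ^ k
    p_xy[(x, y)] = p_xy.get((x, y), Fraction(0)) + p_x[x] * p_k[k]

# Marginal p(y)
p_y = {y: sum(p for (x2, y2), p in p_xy.items() if y2 == y) for y in (0, 1)}

# Perfect secrecy: the posterior p(x/y) equals the prior p(x) for every pair
secrecy = all(p_xy[(x, y)] / p_y[y] == p_x[x] for x in (0, 1) for y in (0, 1))
```

With a uniform key, observing the ciphertext changes nothing about the plaintext distribution, exactly the condition stated in the note.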

      The mutual information obtained on the symbol xi when the symbol yj is received is given by:

      [2.47] I(xi, yj) = log2 [ p(xi/yj) / p(xi) ]

      The average value of the mutual information, or the amount of information I(X, Y) transmitted through the channel is:

      [2.48] I(X, Y) = Σi Σj p(xi, yj) I(xi, yj)

      or:

      [2.49] I(X, Y) = Σi Σj p(xi, yj) log2 [ p(xi/yj) / p(xi) ]

      Hence:

      [2.50] I(X, Y) = H(X) - H(X/Y)

      [2.51] I(X, Y) = H(Y) - H(Y/X)

      [2.52] I(X, Y) = H(X) + H(Y) - H(X, Y)

      [2.53] I(X, Y) = H(X, Y) - H(X/Y) - H(Y/X)
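
As a sanity check, identities [2.49] and [2.52] can be verified numerically. The sketch below assumes an illustrative 2×2 joint distribution p(xi, yj) (the numbers are not from the text):

```python
import math

# Illustrative joint distribution p(xi, yj) (assumed values)
p_xy = [[0.4, 0.1],
        [0.1, 0.4]]

def entropy(probs):
    # Entropy in bits; terms with p = 0 contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = [sum(row) for row in p_xy]            # marginal p(xi)
p_y = [sum(col) for col in zip(*p_xy)]      # marginal p(yj)

# [2.49]: I(X, Y) = sum_i sum_j p(xi, yj) log2( p(xi/yj) / p(xi) ),
# with p(xi/yj) = p(xi, yj) / p(yj)
I = sum(p_xy[i][j] * math.log2((p_xy[i][j] / p_y[j]) / p_x[i])
        for i in range(2) for j in range(2) if p_xy[i][j] > 0)

# [2.52]: I(X, Y) = H(X) + H(Y) - H(X, Y) must give the same value
I_alt = entropy(p_x) + entropy(p_y) - entropy([p for row in p_xy for p in row])
```

Both expressions evaluate to the same positive quantity, as expected for a channel where input and output are correlated.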

      Special cases.

       – Noiseless channel: the X and Y symbols are linked, so: I(X, Y) = H(X) = H(Y)

       – Channel with maximum power noise: the X and Y symbols are independent, therefore: I(X, Y) = 0

      Claude Shannon introduced the concept of channel capacity, to measure the efficiency with which information is transmitted, and to find its upper limit.

      The capacity C of a channel (in information bit/symbol) is the maximum value of the mutual information I(X, Y) over the set of input symbol probabilities {p(xi)}:

      [2.54] C = max over {p(xi)} of I(X, Y)   (information bit/symbol)

      The maximization of I(X, Y) is performed under the constraints that:

      p(xi) ≥ 0 for all i, and Σi p(xi) = 1

      The maximum value of I(X, Y) occurs for some well-defined values of these probabilities, which thus define a so-called secondary source.
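
This maximization can be illustrated numerically for the binary symmetric channel treated at the end of this section. The sketch below scans the input probability P(x1) = q over a grid (an assumed error probability p = 0.1, and a grid search rather than an analytical optimization); the maximum of I(X, Y) is reached for the uniform input:

```python
import math

def H2(p):
    # Binary entropy function, in bits
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(q, p):
    # I(X, Y) = H(Y) - H(Y/X) per [2.51], for a BSC with error probability p
    # and input distribution P(x1) = q; H(Y/X) = H2(p) whatever the input symbol
    p_y1 = q * (1 - p) + (1 - q) * p
    return H2(p_y1) - H2(p)

p = 0.1
# Scan q under the constraints 0 <= q <= 1 (with P(x2) = 1 - q)
best_q = max((i / 1000 for i in range(1001)), key=lambda q: mutual_info(q, p))
C = mutual_info(best_q, p)
```

The search lands on q = 0.5, the secondary source of the BSC, and the resulting capacity agrees with the closed form C = 1 - H2(p).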

      The capacity of the channel can also be related to the unit of time (bitrate Ct of the channel); in this case, one has:

      [2.55] Ct = C / T   (information bit/second), where T is the average duration of a symbol

      The channel redundancy Rc and the relative channel redundancy ρc are defined by:

      [2.56] Rc = C - I(X, Y)

      [2.57] ρc = Rc / C = 1 - I(X, Y) / C

      The efficiency of the use of the channel, ηc, is defined by:

      [2.58] ηc = I(X, Y) / C
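
A small numeric sketch of these three definitions, assuming illustrative values of C and I(X, Y) (not taken from the text):

```python
# Assumed values: capacity of the channel and mutual information actually obtained
C = 1.0       # channel capacity, information bit/symbol
I_xy = 0.8    # mutual information transmitted, bit/symbol

Rc = C - I_xy            # channel redundancy [2.56]
rho_c = 1 - I_xy / C     # relative channel redundancy [2.57]
eta_c = I_xy / C         # efficiency of the use of the channel [2.58]
```

Note that ρc + ηc = 1: the relative redundancy and the efficiency are complementary.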

      2.7.1. Shannon’s theorem: capacity of a communication system

      Shannon also formulated the capacity of a communication system by the following relation:

      [2.59] C = B log2(1 + Ps / PN) = B log2(1 + Ps / (N0 B))   (bit/s)

      where:

       – B: is the channel bandwidth, in hertz;

       – Ps: is the signal power, in watts;

       – N0: is the power spectral density of the (supposed) Gaussian and white noise in its frequency band B, in watts/hertz;

       – PN = N0 B: is the noise power, in watts.
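
A direct numeric sketch of relation [2.59], assuming illustrative values (a telephone-grade bandwidth and arbitrary signal and noise levels, none taken from the text):

```python
import math

# Assumed values for the sketch
B = 3100.0      # channel bandwidth, Hz
Ps = 1e-6       # signal power, W
N0 = 1e-10      # noise power spectral density, W/Hz
PN = N0 * B     # noise power in the band B, W

# [2.59]: capacity of the communication system, in bit/s
C = B * math.log2(1 + Ps / PN)
```

Doubling B both widens the band and raises PN, so the capacity grows less than linearly with bandwidth at fixed Ps.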

      EXAMPLE.– Binary symmetric channel (BSC).

      Any binary channel will be characterized by the noise matrix:

[P(Y/X)] = [ p(y1/x1)  p(y2/x1) ]
           [ p(y1/x2)  p(y2/x2) ]

      If the binary channel is symmetric, then one has:

      p(y1/x2) = p(y2/x1) = p

      p(y1/x1) = p(y2/x2) = 1 - p
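
The BSC noise matrix can be written out directly; the sketch below assumes an error probability p = 0.1 and checks that each row is a valid conditional distribution:

```python
# BSC noise matrix for an assumed error probability p = 0.1
p = 0.1
noise_matrix = [
    [1 - p, p],      # p(y1/x1), p(y2/x1)
    [p, 1 - p],      # p(y1/x2), p(y2/x2)
]

# Each row is the conditional distribution of Y given one input symbol,
# so every row must sum to 1
rows_ok = all(abs(sum(row) - 1.0) < 1e-12 for row in noise_matrix)
```

The symmetry of the matrix (equal off-diagonal terms) is exactly the condition p(y1/x2) = p(y2/x1) = p stated above.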
