EEG Signal Processing and Machine Learning. Saeid Sanei


Skewness is a measure of the degree of asymmetry, or more precisely, the lack of symmetry, in a distribution. A distribution is symmetric if it looks the same to the left and right of its mean point. The skewness is defined for a real signal x(n) as:

      (4.1) \mathrm{Skew}(x(n)) = \frac{E[(x(n)-\mu)^3]}{\sigma^3}

      where μ and σ are, respectively, the mean and standard deviation of x(n), and E denotes statistical expectation. If the bulk of the distribution lies to the right of the mean (i.e. the tail extends to the left) the skewness is negative, and vice versa. For a symmetric distribution such as the Gaussian, the skewness is zero.
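As a quick illustration, the skewness defined above can be estimated from samples by standardizing the third moment. The following NumPy sketch (the function name `skewness` is ours, not from the text) uses the population standard deviation:

```python
import numpy as np

def skewness(x):
    """Sample skewness: E[(x - mu)^3] / sigma^3 (third standardized moment)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = x.std()  # population standard deviation
    return np.mean((x - mu) ** 3) / sigma ** 3
```

For a symmetric sample the estimate is close to zero, while a right-tailed sample (e.g. exponentially distributed data) gives a positive value.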

      Kurtosis is a measure of how peaked or flat a distribution is relative to a normal distribution. That is, datasets with high kurtosis tend to have a distinct peak near the mean, decline rather rapidly, and have heavy tails. Datasets with low kurtosis tend to have a flat top near the mean rather than a sharp peak; a uniform distribution is the extreme case of low kurtosis. The kurtosis for a signal x(n) is defined as:

      (4.2) \mathrm{Kurt}(x(n)) = \frac{E[(x(n)-\mu)^4]}{\sigma^4} - 3

      which is zero for Gaussian distributed signals. Often the signals are considered ergodic, so the statistical (ensemble) averages can be assumed identical to time averages and estimated from a single realization.
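The excess kurtosis of Eq. (4.2) can be estimated the same way as the skewness, subtracting 3 so that a Gaussian sample scores near zero (the function name `excess_kurtosis` is ours):

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; approximately zero for Gaussian data."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return np.mean((x - mu) ** 4) / sigma ** 4 - 3.0
```

A uniform sample illustrates the low-kurtosis extreme mentioned above: its excess kurtosis is −1.2.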

      The negentropy of a signal x(n) [11] is defined as:

      (4.4) J(x(n)) = H(x_{\mathrm{Gauss}}(n)) - H(x(n))

      where x_Gauss(n) is a Gaussian random signal with the same covariance as x(n) and H(·) is the differential entropy [12] defined as:

      H(x(n)) = -\int p(x(n)) \log p(x(n)) \, dx(n)

      where p(x(n)) is the probability distribution of the signal. Since the Gaussian has the largest differential entropy among all distributions with the same covariance, negentropy is always nonnegative.
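Estimating p(x(n)) directly is difficult in practice. A common moment-based approximation of the negentropy defined above (due to Hyvärinen; it is not given in the text) uses the skewness and excess kurtosis of the standardized signal, J(x) ≈ E[x³]²/12 + kurt(x)²/48. A minimal sketch, with our own function name:

```python
import numpy as np

def negentropy_approx(x):
    """Moment-based negentropy approximation for a standardized signal:
    J(x) ~= E[x^3]^2 / 12 + (E[x^4] - 3)^2 / 48."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()  # zero mean, unit variance
    skew_term = np.mean(x ** 3) ** 2 / 12.0
    kurt_term = (np.mean(x ** 4) - 3.0) ** 2 / 48.0
    return skew_term + kurt_term
```

A Gaussian sample yields a value near zero, while a heavy-tailed (e.g. Laplacian) sample yields a clearly positive value, consistent with negentropy measuring distance from Gaussianity.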

      Entropy by itself is an important measure of EEG behaviour, particularly when brain synchronization changes, such as when brain waves become gradually more synchronized as the brain approaches seizure onset. It is also a valuable indicator of other neurological disorders seen in psychiatric diseases.

      The Kullback–Leibler (KL) distance between two distributions p1 and p2 is defined as:

      D_{\mathrm{KL}}(p_1 \| p_2) = \int p_1(x) \log \frac{p_1(x)}{p_2(x)} \, dx

      which is nonnegative and equals zero only when p1 = p2.
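For distributions estimated as normalized histograms, the KL distance above reduces to a sum over bins. A sketch (function name ours; it assumes p2 is nonzero wherever p1 is):

```python
import numpy as np

def kl_divergence(p1, p2):
    """Discrete KL distance: sum over bins of p1 * log(p1 / p2).
    Bins where p1 = 0 contribute nothing; p2 must be > 0 where p1 > 0."""
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    mask = p1 > 0
    return float(np.sum(p1[mask] * np.log(p1[mask] / p2[mask])))
```

Note that the KL distance is not symmetric: in general D_KL(p1‖p2) ≠ D_KL(p2‖p1), so it is a dissimilarity measure rather than a true metric.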

      A number of different dissimilarity measures may be defined based on the fundamentals of signal processing. One criterion is based on the autocorrelations for segment m defined as:

      (4.7) r_x(m, \tau) = E[x(n)\, x(n+\tau)]

      The autocorrelation function of the mth length N frame for an assumed time interval n, n + 1, …, n + (N − 1), can be approximated as:

      (4.8) \hat r_x(m, \tau) = \frac{1}{N} \sum_{k=n}^{n+N-1-\tau} x(k)\, x(k+\tau)

      Then the criterion is set to:

      (4.9) d(m) = \sum_{\tau} \left[\hat r_x(m+1, \tau) - \hat r_x(m, \tau)\right]^2

      which may be normalized by the zero-lag (energy) terms of the two frames:

      (4.10) d(m) = \frac{\sum_{\tau} \left[\hat r_x(m+1, \tau) - \hat r_x(m, \tau)\right]^2}{\hat r_x(m, 0)\, \hat r_x(m+1, 0)}

      where m refers to the segment (frame) index; a large d(m) indicates a change in signal statistics between consecutive frames.
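The autocorrelation-based dissimilarity described above can be sketched as follows. The squared-difference form of the criterion, the frame length, and the maximum lag are our assumptions for illustration; the function names are ours:

```python
import numpy as np

def acf(frame, max_lag):
    """Biased autocorrelation estimate of one frame, as in Eq. (4.8):
    (1/N) * sum_k x(k) x(k + tau), for tau = 0 .. max_lag."""
    frame = np.asarray(frame, dtype=float)
    N = len(frame)
    return np.array([np.dot(frame[: N - tau], frame[tau:]) / N
                     for tau in range(max_lag + 1)])

def acf_dissimilarity(frame_a, frame_b, max_lag=20):
    """Sum of squared differences between the two frames' autocorrelations."""
    return float(np.sum((acf(frame_a, max_lag) - acf(frame_b, max_lag)) ** 2))
```

Identical frames give a dissimilarity of exactly zero, while frames with different second-order statistics (e.g. a sinusoid versus white noise) give a positive value, which is what makes the criterion useful for detecting segment boundaries.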
