Vibroacoustic Simulation. Alexander Peiffer
or in terms of the mean square value
$$ J = E\left[\left(Kf(t) - g(t)\right)^2\right] $$
Expanding the square, this can be rewritten as
$$ J = K^2 E[f^2] - 2K\,E[fg] + E[g^2] $$
This is a minimization problem: we search for the slope $K$ that minimizes the mean square deviation $J$. The function depends quadratically on $K$ and can be rewritten as
$$ J = AK^2 + 2BK + C \qquad (1.143) $$
where $A = E[f^2]$, $B = -E[fg]$, and $C = E[g^2]$, all three being real expected values of real random processes. Equation (1.143) is a parabola in $K$, and the minimum follows from setting the first derivative with respect to $K$ to zero:
$$ \frac{dJ}{dK} = 2AK + 2B = 0 $$
Therefore, the point that minimizes $J$ is $K_0 = -B/A$. To ensure that $K_0$ is a minimum we need $d^2J/dK^2 = 2A > 0$, i.e. $A$ must be positive. This is easily proven, since the expected value of the squared function $E[f^2]$ is necessarily positive. Substituting $K_0$ into Equation (1.143) gives the following relationships for $K_0$ and $J_0$:
$$ K_0 = \frac{E[fg]}{E[f^2]}, \qquad J_0 = C - \frac{B^2}{A} = E[g^2] - \frac{E[fg]^2}{E[f^2]} $$
Using the definition of the variances, $J_0$ can be written for zero-mean processes in non-dimensional form:
$$ \frac{J_0}{\sigma_g^2} = 1 - \frac{E[fg]^2}{\sigma_f^2\,\sigma_g^2} = 1 - \rho_{fg}^2 $$
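These relationships are easy to check numerically. The following sketch (Python with NumPy, not from the book; the true slope of 2 and the noise level of 0.5 are arbitrary choices for illustration) estimates $K_0$ and $J_0$ from sampled data and confirms that any other slope yields a larger mean square deviation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two zero-mean random processes with a known linear relation g = 2 f + noise.
f = rng.standard_normal(n)
g = 2.0 * f + 0.5 * rng.standard_normal(n)

# Optimal slope K0 = E[fg]/E[f^2] and residual J0 = E[g^2] - E[fg]^2/E[f^2].
K0 = np.mean(f * g) / np.mean(f**2)
J0 = np.mean(g**2) - np.mean(f * g) ** 2 / np.mean(f**2)

# Any other slope K gives a larger mean square deviation J(K) = E[(Kf - g)^2].
J = lambda K: np.mean((K * f - g) ** 2)
print(K0)                                  # close to the true slope 2.0
print(J0 <= J(K0 + 0.1), J0 <= J(K0 - 0.1))
```

Since $g = 2f + $ noise with noise variance $0.25$, the estimate $K_0$ approaches 2 and $J_0$ approaches 0.25, the variance of the part of $g$ that $f$ cannot explain.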
The quantity $\rho_{fg} = E[fg]/(\sigma_f\sigma_g)$ is the normalized correlation coefficient between $f$ and $g$. If both processes are perfectly correlated, $\rho_{fg} = 1$; if they are fully uncorrelated, $\rho_{fg} = 0$. In terms of the linear relationship (1.141), all points lie exactly on the line for full correlation and are arbitrarily distributed for no correlation (Figure 1.22).
Figure 1.22 Example for correlation of random processes. No correlation (left) and different correlation values (right). Source: Alexander Peiffer.
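The limiting cases can be reproduced in a short numerical experiment (a Python/NumPy sketch, not from the book; the factor 3 and the unit noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
f = rng.standard_normal(n)

def rho(f, g):
    # Normalized correlation coefficient rho_fg = E[fg]/(sigma_f sigma_g);
    # the processes here are zero-mean, so sigma^2 is estimated as E[x^2].
    return np.mean(f * g) / np.sqrt(np.mean(f**2) * np.mean(g**2))

g_full = 3.0 * f                     # perfect linear relation: points on a line
g_none = rng.standard_normal(n)      # independent process: no correlation
g_part = f + rng.standard_normal(n)  # partially correlated process

print(rho(f, g_full))   # 1: full correlation
print(rho(f, g_none))   # close to 0: no correlation
print(rho(f, g_part))   # about 1/sqrt(2): partial correlation
```

The partially correlated case gives $\rho_{fg} = 1/\sqrt{2} \approx 0.71$, since $E[fg] = \sigma_f^2$ while $\sigma_g^2 = 2\sigma_f^2$.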
1.5.3 Correlation Functions for Random Time Signals
In the above considerations we have taken the values of an ensemble of random processes or signals at one time $t_1$. We can also define a correlation coefficient for values taken from two processes at different times $t_1$ and $t_2$:
$$ \rho_{fg}(t_1, t_2) = \frac{E[f(t_1)\,g(t_2)]}{\sigma_f \sigma_g} $$
The numerator is called the cross correlation function:
$$ R_{fg}(t_1, t_2) = E[f(t_1)\,g(t_2)] $$
If the two processes are stationary, the value of the cross correlation function depends only on the distance between the two times, i.e. $t_2 = t_1 + \tau$. The quantity $\tau$ is called the lag or separation between the two time samples, and we can write:
$$ R_{fg}(\tau) = E[f(t)\,g(t+\tau)] $$
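One practical consequence is that a time shift between two stationary signals appears as a peak in their cross correlation. As a sketch (Python/NumPy, not from the book; it assumes a simple periodic estimator based on np.roll, and the lag of 50 samples is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
lag = 50  # samples; g is a delayed copy of f

f = rng.standard_normal(n)
g = np.roll(f, lag)  # g(t) = f(t - lag), with periodic wrap-around for simplicity

def R_fg(f, g, tau):
    # Estimate R_fg(tau) = E[f(t) g(t+tau)] for stationary zero-mean signals,
    # using a periodic shift as a crude but simple estimator.
    return np.mean(f * np.roll(g, -tau))

taus = np.arange(0, 100)
R = np.array([R_fg(f, g, t) for t in taus])
print(taus[np.argmax(R)])  # the peak sits at the true lag of 50
```

At $\tau = 50$ the shifted copy lines up with the original and the estimate approaches $E[f^2] = 1$; at all other lags it stays near zero.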
It also makes sense to correlate the function $f(t)$ with itself at a later moment $f(t+\tau)$. This is called the autocorrelation function, defined by:
$$ R_{ff}(\tau) = E[f(t)\,f(t+\tau)] $$
This function will later enable us to describe the spectrum of random signals. At $\tau = 0$ its value is the variance of $f(t)$ as given by Equation (1.137):
$$ R_{ff}(0) = E[f^2] = \sigma_f^2 $$
The autocorrelation is symmetric in time, as is proven by the substitution $t' = t - \tau$:
$$ R_{ff}(-\tau) = E[f(t)\,f(t-\tau)] = E[f(t'+\tau)\,f(t')] = R_{ff}(\tau) $$
In addition, some useful properties can be derived for the cross correlation function:
$$ R_{fg}(\tau) = R_{gf}(-\tau), \qquad \left|R_{fg}(\tau)\right| \le \sqrt{R_{ff}(0)\,R_{gg}(0)} $$
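Both the symmetry of the autocorrelation and the cross correlation properties can be verified numerically. A sketch (Python/NumPy, not from the book) using a periodic np.roll-based estimator, with all signal parameters chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
f = rng.standard_normal(n)
g = np.roll(f, 10) + 0.3 * rng.standard_normal(n)

def R(x, y, tau):
    # Periodic estimate of R_xy(tau) = E[x(t) y(t+tau)] for zero-mean signals.
    return np.mean(x * np.roll(y, -tau))

# Autocorrelation is even: R_ff(-tau) = R_ff(tau).
print(abs(R(f, f, -7) - R(f, f, 7)) < 1e-6)

# Cross correlation: R_fg(tau) = R_gf(-tau).
print(abs(R(f, g, 12) - R(g, f, -12)) < 1e-6)

# Bound |R_fg(tau)| <= sqrt(R_ff(0) R_gg(0)), a Cauchy-Schwarz inequality.
print(abs(R(f, g, 10)) <= np.sqrt(R(f, f, 0) * R(g, g, 0)))
```

With the periodic estimator the first two identities hold exactly (both sides sum the same products), so the checks pass to within floating-point round-off; the bound is the finite-sample Cauchy–Schwarz inequality.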