Autocorrelation and Autocovariance
The Central Role of Autocorrelation
For a WSS process, the autocorrelation function $R_X(\tau)$ encodes how the process at time $t$ relates to itself at time $t + \tau$. This single function determines the average power ($R_X(0)$), the rate of fluctuation (how fast $R_X(\tau)$ decays), the bandwidth (via the power spectral density, which is the Fourier transform of $R_X$; see Chapter 14), and the performance of optimal linear filters. Understanding the properties of $R_X$ is the gateway to spectral analysis and filter design.
Definition: Autocorrelation Function
Autocorrelation Function
For a second-order process $X(t)$, the autocorrelation function is
$$R_X(t_1, t_2) = E[X(t_1)\,X^*(t_2)].$$
If $X(t)$ is WSS, then $R_X(t_1, t_2) = R_X(\tau)$, where $\tau = t_1 - t_2$.
For a discrete-time WSS process: $R_X[m] = E[X[n+m]\,X^*[n]]$ (independent of $n$).
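As a quick numerical sketch (the sequence and helper name here are our own), a time average over one long sample path approximates $R_X[m]$ for an ergodic WSS process. For an i.i.d. $\pm 1$ sequence, the autocorrelation is $1$ at lag $0$ and $0$ at every other lag:

```python
import random

def autocorr(x, m):
    """Sample estimate of R_X[m] = E[X[n+m] X[n]], averaging over n."""
    n_terms = len(x) - m
    return sum(x[n + m] * x[n] for n in range(n_terms)) / n_terms

random.seed(0)
# White +/-1 sequence: R_X[0] = 1 and R_X[m] = 0 for m != 0.
x = [random.choice([-1.0, 1.0]) for _ in range(100_000)]
print(autocorr(x, 0))   # exactly 1.0, since every sample squares to 1
print(autocorr(x, 1))   # near 0 (estimation noise of order 1/sqrt(N))
```

The time average stands in for the ensemble expectation in the definition; that substitution is justified only for ergodic processes.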
Definition: Autocovariance Function
Autocovariance Function
The autocovariance function of a second-order process $X(t)$ is
$$C_X(t_1, t_2) = E\big[(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))^*\big].$$
For a WSS process with mean $\mu_X$:
$$C_X(\tau) = R_X(\tau) - |\mu_X|^2.$$
The autocovariance measures the fluctuation around the mean. If $X(t)$ has zero mean, then $C_X(\tau) = R_X(\tau)$.
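A small simulation (illustrative values only; the process and variable names are our own) shows the relation $C_X(\tau) = R_X(\tau) - |\mu_X|^2$ at lag zero, where it reduces to $\operatorname{Var}(X) = E[X^2] - \mu_X^2$:

```python
import random

random.seed(1)
mu = 2.0
# X[n] = mu + W[n], with W[n] i.i.d. uniform on [-1, 1] (variance 1/3).
x = [mu + random.uniform(-1.0, 1.0) for _ in range(200_000)]

N = len(x)
mean = sum(x) / N
r0 = sum(v * v for v in x) / N    # autocorrelation at lag 0: E[X^2]
c0 = r0 - mean * mean             # autocovariance at lag 0: Var(X)
print(r0)  # about mu^2 + 1/3 = 4.333...
print(c0)  # about 1/3
```

The gap between `r0` and `c0` is exactly the squared mean, which is why the two functions can be used interchangeably only for zero-mean processes.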
Theorem: Properties of the WSS Autocorrelation
Let $X(t)$ be a WSS process with autocorrelation $R_X(\tau)$. Then:
1. Average power: $R_X(0) = E[|X(t)|^2] \ge 0$.
2. Hermitian symmetry: $R_X(-\tau) = R_X^*(\tau)$. For real-valued processes: $R_X(-\tau) = R_X(\tau)$ (even function).
3. Maximum at the origin: $|R_X(\tau)| \le R_X(0)$ for all $\tau$.
4. Non-negative definiteness: For any $n$, any times $t_1, \dots, t_n$, and any coefficients $a_1, \dots, a_n \in \mathbb{C}$:
$$\sum_{i=1}^{n} \sum_{j=1}^{n} a_i\, a_j^*\, R_X(t_i - t_j) \ge 0.$$
Property 3 says that $X(t)$ is most correlated with itself at lag zero. This is intuitive: the best predictor of a signal is the signal itself, and correlation decreases as you look further into the past or future. Property 4 ensures that the variance of any linear combination $\sum_i a_i X(t_i)$ is non-negative.
Property 1: Average power
$R_X(0) = E[X(t)\,X^*(t)] = E[|X(t)|^2] \ge 0$.
Property 2: Hermitian symmetry
$R_X(-\tau) = E[X(t-\tau)\,X^*(t)]$. Replacing $t$ by $t+\tau$ (valid since, for a WSS process, this expectation does not depend on $t$): $R_X(-\tau) = E[X(t)\,X^*(t+\tau)]$. Taking the conjugate: $R_X^*(-\tau) = E[X(t+\tau)\,X^*(t)] = R_X(\tau)$, i.e. $R_X(-\tau) = R_X^*(\tau)$.
Property 3: Maximum at origin
By the Cauchy-Schwarz inequality:
$$|R_X(\tau)|^2 = \big|E[X(t+\tau)\,X^*(t)]\big|^2 \le E[|X(t+\tau)|^2]\,E[|X(t)|^2] = R_X(0)^2,$$
so $|R_X(\tau)| \le R_X(0)$.
Property 4: Non-negative definiteness
Let $Y = \sum_{i=1}^{n} a_i X(t_i)$. Then:
$$0 \le E[|Y|^2] = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i\, a_j^*\, E[X(t_i)\,X^*(t_j)] = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i\, a_j^*\, R_X(t_i - t_j).$$
Definition: Non-Negative Definite Function (Definition 50)
Non-Negative Definite Function (Definition 50)
A Hermitian symmetric function $R(t_1, t_2)$ (i.e., $R(t_2, t_1) = R^*(t_1, t_2)$) is positive semi-definite (or non-negative definite) if for all $n$, all $t_1, \dots, t_n$, and all $a_1, \dots, a_n \in \mathbb{C}$:
$$\sum_{i=1}^{n} \sum_{j=1}^{n} a_i\, a_j^*\, R(t_i, t_j) \ge 0.$$
For a WSS process, where $R$ depends only on $\tau = t_1 - t_2$, we say $R(\tau)$ is positive semi-definite if $\sum_i \sum_j a_i a_j^* R(t_i - t_j) \ge 0$ for all such choices.
The autocorrelation of any WSS process is positive semi-definite. Conversely, every positive semi-definite function is the autocorrelation of some WSS process. This is the Herglotz (or, in continuous time, Bochner) theorem, which we will see in Chapter 14 in connection with the power spectral density.
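The quadratic-form condition can be spot-checked numerically. A minimal sketch (our own choice of sample points and coefficients) evaluates it for the exponential function $e^{-|\tau|}$, a classic positive semi-definite example:

```python
import math
import random

def R(tau):
    # Exponential autocorrelation: a standard positive semi-definite kernel.
    return math.exp(-abs(tau))

random.seed(2)
n = 8
t = [random.uniform(0.0, 10.0) for _ in range(n)]   # arbitrary sample times
a = [random.uniform(-1.0, 1.0) for _ in range(n)]   # arbitrary coefficients

# Quadratic form sum_i sum_j a_i a_j R(t_i - t_j); Property 4 says it is >= 0.
q = sum(a[i] * a[j] * R(t[i] - t[j]) for i in range(n) for j in range(n))
print(q >= 0)  # True, for any choice of t and a
```

One random draw is of course not a proof; it is the Herglotz/Bochner theorem that guarantees the form is non-negative for every choice.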
Example: Exponential Autocorrelation
A real-valued WSS process has autocorrelation $R_X(\tau) = \sigma^2 e^{-\alpha|\tau|}$ with $\alpha > 0$. Verify the four properties of the autocorrelation and find the average power and autocovariance (assuming $\mu_X = 0$).
Average power
$R_X(0) = \sigma^2 e^{0} = \sigma^2$. The average power is $E[X^2(t)] = \sigma^2$.
Even symmetry
$R_X(-\tau) = \sigma^2 e^{-\alpha|-\tau|} = \sigma^2 e^{-\alpha|\tau|} = R_X(\tau)$. ✓
Maximum at origin
$|R_X(\tau)| = \sigma^2 e^{-\alpha|\tau|} \le \sigma^2 = R_X(0)$ for all $\tau$. ✓
Non-negative definiteness
This can be verified by showing that the Fourier transform (the PSD) is non-negative:
$$S_X(\omega) = \int_{-\infty}^{\infty} \sigma^2 e^{-\alpha|\tau|}\, e^{-j\omega\tau}\, d\tau = \frac{2\alpha\sigma^2}{\alpha^2 + \omega^2}.$$
Since $S_X(\omega) \ge 0$ for all $\omega$, the function is positive semi-definite (Bochner's theorem). ✓
Autocovariance
Since $\mu_X = 0$: $C_X(\tau) = R_X(\tau) = \sigma^2 e^{-\alpha|\tau|}$.
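The checks above can be scripted. A minimal sketch, using illustrative values $\sigma^2 = 2$ and $\alpha = 0.5$ (our own choices):

```python
import math

sigma2, alpha = 2.0, 0.5   # illustrative parameters

def R(tau):
    # Exponential autocorrelation R_X(tau) = sigma^2 exp(-alpha |tau|).
    return sigma2 * math.exp(-alpha * abs(tau))

def S(omega):
    # Its Fourier transform: the Lorentzian PSD 2 alpha sigma^2 / (alpha^2 + w^2).
    return 2.0 * alpha * sigma2 / (alpha**2 + omega**2)

grid = [k * 0.1 for k in range(-50, 51)]
print(R(0))                                   # average power sigma^2 = 2.0
print(all(R(-t) == R(t) for t in grid))       # even symmetry: True
print(all(R(t) <= R(0) for t in grid))        # maximum at the origin: True
print(all(S(w) > 0 for w in grid))            # strictly positive PSD: True
```

The positivity of the closed-form PSD at every frequency is what Bochner's theorem converts into positive semi-definiteness of $R_X$.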
Autocorrelation Properties
Explore the properties of the autocorrelation function for different WSS processes. Observe the even symmetry, the maximum at the origin, and how the decay rate relates to the "memory" of the process.
Theorem: Toeplitz Covariance Matrix of WSS Processes
Let $X[n]$ be a WSS process. For any block of $N$ consecutive samples $\mathbf{X} = (X[k], X[k+1], \dots, X[k+N-1])^T$, the covariance matrix is
$$\mathbf{C} = \begin{pmatrix} C_X[0] & C_X[-1] & \cdots & C_X[-(N-1)] \\ C_X[1] & C_X[0] & \cdots & C_X[-(N-2)] \\ \vdots & \vdots & \ddots & \vdots \\ C_X[N-1] & C_X[N-2] & \cdots & C_X[0] \end{pmatrix},$$
i.e. $[\mathbf{C}]_{ij} = C_X[i-j]$, which is Toeplitz (constant along diagonals) and independent of the starting index $k$.
The Toeplitz structure is the matrix manifestation of WSS: the covariance between $X[n]$ and $X[m]$ depends only on the distance $n - m$, not on the absolute position. This structure is exploited in fast algorithms (Levinson-Durbin) and in the asymptotic analysis of the eigenvalue distribution (Szegő's theorem).
Direct computation
$[\mathbf{C}]_{ij} = E\big[(X[k+i] - \mu_X)(X[k+j] - \mu_X)^*\big] = C_X[i-j]$. This depends only on the difference $i - j$, so $\mathbf{C}$ is Toeplitz. The starting index $k$ cancels.
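The structure is easy to verify in code. A minimal sketch with a hypothetical triangular autocovariance (our own example values):

```python
def C(m):
    # Hypothetical WSS autocovariance: C[0]=3, C[+-1]=2, C[+-2]=1, else 0.
    return max(0, 3 - abs(m))

N = 5
# Covariance matrix of N consecutive samples: entry (i, j) is C[i - j].
cov = [[C(i - j) for j in range(N)] for i in range(N)]

# Toeplitz check: every diagonal is constant, i.e. cov[i][j] == cov[i+1][j+1].
is_toeplitz = all(cov[i][j] == cov[i + 1][j + 1]
                  for i in range(N - 1) for j in range(N - 1))
print(is_toeplitz)  # True
```

Note that the starting index $k$ never appears: any block of $N$ consecutive samples yields this same matrix, exactly as the theorem states.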
Example: Autocorrelation of a Moving Average Process
Let $W[n]$ be a zero-mean i.i.d. sequence with variance $\sigma^2$. Define the moving average (MA) process $X[n] = W[n] + W[n-1] + W[n-2]$. Find the autocorrelation $R_X[m]$ and verify that $X[n]$ is WSS.
Mean
$E[X[n]] = E[W[n]] + E[W[n-1]] + E[W[n-2]] = 0$ (constant).
Autocorrelation
$$R_X[m] = E[X[n+m]\,X[n]] = \sum_{i=0}^{2} \sum_{j=0}^{2} E[W[n+m-i]\,W[n-j]].$$
Since the $W$'s are i.i.d. with zero mean, $E[W[a]\,W[b]] = \sigma^2$ if $a = b$ and $0$ otherwise, so only index pairs with $i - j = m$ contribute.
Evaluate for each lag
- $m = 0$: pairs with $i = j$, giving three pairs $(0,0)$, $(1,1)$, $(2,2)$. $R_X[0] = 3\sigma^2$.
- $m = 1$: pairs with $i = j + 1$, giving two pairs $(1,0)$, $(2,1)$. $R_X[1] = 2\sigma^2$.
- $m = 2$: one pair $(2,0)$. $R_X[2] = \sigma^2$.
- $|m| \ge 3$: no pairs. $R_X[m] = 0$. By even symmetry, $R_X[-m] = R_X[m]$.
Since $R_X[m]$ depends only on the lag $m$ and the mean is constant, $X[n]$ is WSS.
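A Monte Carlo check of these values (our own simulation setup, taking Gaussian $W[n]$ with $\sigma^2 = 1$; any zero-mean i.i.d. distribution works):

```python
import random

random.seed(3)
N = 200_000
w = [random.gauss(0.0, 1.0) for _ in range(N)]   # i.i.d., zero mean, variance 1
# MA process X[n] = W[n] + W[n-1] + W[n-2]
x = [w[n] + w[n - 1] + w[n - 2] for n in range(2, N)]

def R_hat(m):
    """Time-average estimate of R_X[m] from one sample path."""
    n_terms = len(x) - m
    return sum(x[n + m] * x[n] for n in range(n_terms)) / n_terms

# Close to 3, 2, 1, 0 (times sigma^2), matching the lag-by-lag count above.
print(R_hat(0), R_hat(1), R_hat(2), R_hat(3))
```

The estimates fluctuate around the exact values with error of order $1/\sqrt{N}$, which is why a long sample path is used.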
Common Mistake: Confusing Autocorrelation with Autocovariance
Mistake:
Using $R_X(\tau)$ and $C_X(\tau)$ interchangeably, especially for processes with nonzero mean.
Correction:
$R_X(\tau) = C_X(\tau) + |\mu_X|^2$ includes the mean product, while $C_X(\tau)$ measures only the fluctuations. For zero-mean processes they coincide, but for $\mu_X \ne 0$ they differ. Many spectral analysis formulas use $R_X$, while estimation formulas use $C_X$.
Historical Note: Norbert Wiener and the Autocorrelation Function
1930s--1940s: Norbert Wiener introduced the systematic use of the autocorrelation function in his 1930 paper "Generalized Harmonic Analysis" and his classified 1942 report on fire-control prediction (later published as Extrapolation, Interpolation, and Smoothing of Stationary Time Series, 1949). Wiener recognized that for stationary processes, the autocorrelation function and its Fourier transform (the power spectral density) provide a complete framework for linear prediction and filtering. His work, independently paralleled by Kolmogorov (1941), laid the foundation for all of modern statistical signal processing.
Historical Note: Khintchine and the Positive-Definiteness Connection
1930s: Aleksandr Khintchine (1934) proved that the autocorrelation function of a stationary process is positive semi-definite, and conversely that every continuous positive semi-definite function is the autocorrelation of some stationary process. This result, combined with Bochner's theorem on positive-definite functions and Fourier transforms, established the rigorous link between autocorrelation and power spectral density known as the Wiener-Khintchine theorem (Chapter 14).
Quick Check
For a real-valued zero-mean WSS process with autocorrelation $R_X(\tau)$, given the value $R_X(\tau_0)$ at some lag $\tau_0 > 0$, what is $R_X(-\tau_0)$?
For a real-valued WSS process, $R_X(-\tau_0) = R_X(\tau_0)$ (even symmetry).
Estimating the Autocorrelation from Data
In practice, we estimate $R_X[m]$ from a single observed sample path $x[0], x[1], \dots, x[N-1]$. The standard estimator is
$$\hat{R}_X[m] = \frac{1}{N} \sum_{n=0}^{N-1-m} x[n+m]\,x^*[n], \qquad 0 \le m \le N-1.$$
This is the biased estimator (dividing by $N$ rather than $N - m$). The biased version is preferred because it guarantees a non-negative PSD estimate, while the unbiased version (dividing by $N - m$) can produce negative PSD values.
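A minimal sketch of both estimators for a real-valued sequence (function names are our own; for complex data, conjugate the second factor):

```python
def biased_autocorr(x, max_lag):
    """Biased estimator: divide by N at every lag (guarantees a valid PSD)."""
    N = len(x)
    return [sum(x[n + m] * x[n] for n in range(N - m)) / N
            for m in range(max_lag + 1)]

def unbiased_autocorr(x, max_lag):
    """Unbiased estimator: divide by N - m (can yield an invalid PSD)."""
    N = len(x)
    return [sum(x[n + m] * x[n] for n in range(N - m)) / (N - m)
            for m in range(max_lag + 1)]

x = [1.0, -1.0, 1.0, -1.0]
print(biased_autocorr(x, 2))    # [1.0, -0.75, 0.5]
print(unbiased_autocorr(x, 2))  # [1.0, -1.0, 1.0]
```

Note how the biased version shrinks the high-lag estimates toward zero; that tapering is precisely what keeps the resulting PSD estimate non-negative.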
Autocorrelation Function
$R_X(\tau) = E[X(t+\tau)\,X^*(t)]$ for a WSS process. Measures the linear dependence between the process at two time instants separated by lag $\tau$.
Related: Autocovariance Function, Wide-Sense Stationary (WSS)
Autocovariance Function
$C_X(\tau) = E\big[(X(t+\tau) - \mu_X)(X(t) - \mu_X)^*\big] = R_X(\tau) - |\mu_X|^2$. The autocorrelation of the zero-mean part of the process.
Related: Autocorrelation Function
Key Takeaway
The autocorrelation function of a WSS process is even (Hermitian symmetric), maximized at the origin (where it equals the average power), and non-negative definite. These properties are not just mathematical curiosities: they ensure that the power spectral density (its Fourier transform) is real and non-negative, which is the basis for all spectral analysis in Chapter 14.