Stationarity
Why Stationarity Matters
The full statistical description of a process requires the infinite family of finite-dimensional distributions (fdds), an unwieldy object. Stationarity is the assumption that the statistical properties of the process do not change with time. Under this assumption, a single first-order distribution suffices (it is the same at all times), a single second-order distribution characterized by the lag suffices, and so on. This enormous simplification is what makes statistical signal processing tractable. In communications, stationarity holds (at least approximately) for thermal noise, for a fading channel observed within its coherence time, and for the output of any time-invariant system driven by a stationary input.
Definition: Strict-Sense Stationarity (Definition 49)
A stochastic process $X(t)$ is strict-sense stationary (SSS) if its finite-dimensional distributions are invariant under time shifts: for every $n$, every choice of times $t_1, \dots, t_n$, and every shift $\tau$ such that each $t_i + \tau$ lies in the index set,

$$F_{X(t_1 + \tau), \dots, X(t_n + \tau)}(x_1, \dots, x_n) = F_{X(t_1), \dots, X(t_n)}(x_1, \dots, x_n).$$
Strict stationarity is a very strong condition: it requires every aspect of the joint distribution to be time-invariant. In practice, we rarely verify SSS directly because it involves all orders of the fdds. Instead, we work with the weaker but far more practical notion of wide-sense stationarity.
Consequences of SSS
If $X(t)$ is SSS, then:

- First-order distribution is constant: $F_{X(t)}(x) = F_{X(0)}(x)$ for all $t$. In particular, the mean $E[X(t)]$ and variance $\operatorname{Var}(X(t))$ are constants.
- Second-order distribution depends only on lag: $F_{X(t_1), X(t_2)}$ depends only on $t_2 - t_1$.
- All $n$th-order statistics depend only on time differences: the $n$-point statistics are invariant to a common time shift.
SSS $\Rightarrow$ WSS for second-order processes (but not conversely, in general).
Definition: Wide-Sense Stationarity (Definition 52)
A second-order process $X(t)$ (one with $E[X(t)^2] < \infty$) is wide-sense stationary (WSS) if:

- $m_X(t) = E[X(t)]$ is constant (independent of $t$), and
- $R_X(t_1, t_2) = E[X(t_1)X(t_2)]$ depends only on the time difference $\tau = t_1 - t_2$.

We write $R_X(\tau) = E[X(t + \tau)X(t)]$.

For a discrete-time process, $X[n]$ is WSS if $E[X[n]] = m_X$ (constant) and $E[X[n+k]X[n]] = R_X[k]$ depends only on the lag $k$.
WSS Is the Engineer's Stationarity
Wide-sense stationarity involves only two conditions, a constant mean and a lag-dependent autocorrelation, both of which can be estimated from data. This is why WSS is the default assumption in signal processing and communications: it is strong enough to enable powerful tools (power spectral density, Wiener filtering, matched filtering) but weak enough to be approximately satisfied in many practical scenarios.
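Because both WSS conditions involve only first- and second-order moments, they can be checked empirically from an ensemble of realizations. The sketch below (the AR(1) model, seed, and sample sizes are illustrative assumptions, not from the text) estimates the ensemble mean at each time and the lag-$k$ autocorrelation at two different reference times; for a WSS process the two autocorrelation estimates should agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical WSS process: a stationary AR(1), x[n] = a*x[n-1] + w[n].
a, N, trials = 0.9, 500, 2000
x = np.zeros((trials, N))
for n in range(1, N):
    x[:, n] = a * x[:, n - 1] + rng.standard_normal(trials)

# Discard a burn-in so the ensemble has (approximately) reached
# its stationary distribution.
x = x[:, 100:]

# Ensemble mean at each time: should be roughly constant (~0 here).
mean_t = x.mean(axis=0)

# Autocorrelation E[x[n+k] x[n]] at a fixed lag k, estimated at two
# different reference times: for a WSS process these should agree.
k = 5
r_early = np.mean(x[:, 50 + k] * x[:, 50])
r_late = np.mean(x[:, 300 + k] * x[:, 300])
print(mean_t.max(), r_early, r_late)
```

With a single realization one would instead invoke ergodicity and replace these ensemble averages by time averages; the ensemble version used here needs no such assumption.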
The gap between WSS and SSS is bridged for Gaussian processes, as Lemma 43 below shows.
Theorem: SSS Implies WSS (for Second-Order Processes)
If $X(t)$ is strict-sense stationary and $E[X(t)^2] < \infty$, then $X(t)$ is wide-sense stationary.
SSS makes all fdds time-shift invariant. In particular, the first-order moment (from the first-order distribution) and the second-order product moment (from the second-order distribution) inherit this invariance, which is exactly the WSS condition.
Constant mean
The first-order distribution is independent of $t$ (SSS with $n = 1$). Therefore $E[X(t)] = E[X(0)] = m_X$ for all $t$.
Lag-dependent autocorrelation
The joint distribution of $(X(t_1), X(t_2))$ depends only on $t_1 - t_2$ (SSS with $n = 2$). Therefore $R_X(t_1, t_2) = E[X(t_1)X(t_2)]$ depends only on $\tau = t_1 - t_2$.
Definition: Gaussian Process (Definition 51)
A process $X(t)$ is Gaussian if for every $n$ and every $t_1, \dots, t_n$, the random vector $(X(t_1), \dots, X(t_n))$ is jointly Gaussian.
A Gaussian process is completely determined by its mean function $m_X(t)$ and its autocovariance function $C_X(t_1, t_2)$.
Theorem: WSS Gaussian Process Is SSS (Lemma 43)
If $X(t)$ is a Gaussian process and wide-sense stationary, then $X(t)$ is strict-sense stationary.
A Gaussian distribution is completely determined by its mean vector and covariance matrix. If the mean is constant and the covariance depends only on time differences (WSS), then the entire joint distribution is shift-invariant, which is SSS.
Characterize the finite-dimensional distributions
For any $n$ and any times $t_1, \dots, t_n$, the vector $(X(t_1), \dots, X(t_n))$ is Gaussian with mean vector $(m_X, \dots, m_X)$ (constant by WSS) and covariance matrix $[C_X(t_i - t_j)]_{i,j}$ (entries depend only on time differences by WSS).
Show shift-invariance
Under a time shift $\tau$, the shifted vector $(X(t_1 + \tau), \dots, X(t_n + \tau))$ has the same mean vector $(m_X, \dots, m_X)$ and the same covariance matrix, since $C_X((t_i + \tau) - (t_j + \tau)) = C_X(t_i - t_j)$. Because a Gaussian distribution is uniquely determined by its first two moments, the distribution of the shifted vector is identical. This holds for all $n$ and all time indices, so the process is SSS.
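The key step can be checked numerically: for a zero-mean WSS Gaussian process, the fdd of any set of times is determined by the covariance matrix built from $C_X(\tau)$, and shifting all the times leaves that matrix unchanged. The exponentially decaying autocovariance below is a hypothetical choice for illustration.

```python
import numpy as np

# Hypothetical WSS autocovariance (exponential decay), an assumed choice.
def C(tau):
    return 0.8 ** abs(tau)

# Covariance matrix of (X(t1), ..., X(tn)) for a zero-mean WSS
# Gaussian process: entries depend only on time differences.
def cov_matrix(times):
    return np.array([[C(ti - tj) for tj in times] for ti in times])

times = [0, 2, 5]
shift = 7
K = cov_matrix(times)
K_shifted = cov_matrix([t + shift for t in times])

# Identical covariance (and identical constant mean) means identical
# Gaussian fdds: the shift leaves the joint distribution unchanged.
print(np.allclose(K, K_shifted))  # True
```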
Example: WSS but Not SSS
Let $A$ be a random variable with $E[A] = 0$ and $E[A^2] = 1$ (e.g., $A$ takes the values $+1$ and $-1$ each with probability $1/2$), but $A$ is not Gaussian. Define the process $X(t) = A$ for all $t$. Is $X(t)$ WSS? Is it SSS?
Check WSS
$E[X(t)] = E[A] = 0$ (constant). $R_X(t_1, t_2) = E[A^2] = 1$ for all $t_1, t_2$, which depends only on $t_1 - t_2$ (trivially, since it is constant). So $X(t)$ is WSS.
Check SSS
The first-order distribution of $X(t)$ is the distribution of $A$, which is the same for all $t$, consistent with SSS. The second-order joint distribution of $(X(t_1), X(t_2))$ is the distribution of $(A, A)$, which is concentrated on the line $x_1 = x_2$, also the same for all $t_1, t_2$.
In fact, all fdds are those of the constant sequence $(A, A, \dots, A)$, which does not depend on the time indices. So this process is indeed SSS.
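A quick simulation (taking the $\pm 1$ fair-coin distribution for $A$ as the assumed choice) confirms both findings: the first-order distribution is the same at every time, and every pair $(X(t_1), X(t_2))$ lands on the diagonal $x_1 = x_2$.

```python
import numpy as np

rng = np.random.default_rng(1)

# X(t) = A for all t, with A = +1 or -1 equally likely.
trials, N = 50_000, 10
A = rng.choice([-1.0, 1.0], size=trials)
X = np.tile(A[:, None], (1, N))  # every sample path is constant in t

# First-order distribution is the same at every t: P(X(t) = +1) ~ 1/2.
p_plus = (X == 1.0).mean(axis=0)

# Every pair (X(t1), X(t2)) lies on the line x1 = x2.
on_diagonal = bool(np.all(X[:, 2] == X[:, 7]))
print(p_plus[0], on_diagonal)
```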
A genuine WSS-but-not-SSS example
For a truly WSS-but-not-SSS example, we need the higher-order statistics to vary with time. Consider $X[n] = (-1)^n Z[n]$, where $Z[n]$ is an i.i.d. sequence with $E[Z] = 0$, $E[Z^2] = \sigma^2$, and $E[Z^3] \neq 0$. Then $E[X[n]] = 0$ and $E[X[n+k]X[n]] = (-1)^{2n+k} E[Z[n+k]Z[n]] = \sigma^2 \delta[k]$ (depends only on lag), so $X[n]$ is WSS. But the third moment $E[X[n]^3] = (-1)^n E[Z^3]$ alternates in sign, so the third-order distribution changes with $n$: not SSS.
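This can be verified by simulation. The sketch below uses a centered exponential for $Z[n]$ (an assumed concrete choice with $E[Z] = 0$, $E[Z^2] = 1$, $E[Z^3] = 2$) and estimates the first and third moments of $X[n] = (-1)^n Z[n]$ at each time index: the mean stays near zero while the third moment flips sign with $n$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Z[n]: i.i.d., zero mean, unit variance, skewed. A centered Exp(1)
# variable (an assumed choice) has E[Z] = 0, E[Z^2] = 1, E[Z^3] = 2.
trials, N = 200_000, 8
Z = rng.exponential(1.0, size=(trials, N)) - 1.0

# X[n] = (-1)^n Z[n]
signs = (-1.0) ** np.arange(N)
X = signs * Z

m1 = X.mean(axis=0)         # first moment: ~0 at every n (WSS-compatible)
m3 = (X ** 3).mean(axis=0)  # third moment: alternates in sign with n
print(m1.round(3), m3.round(3))
```

The second-order statistics match those of white noise at every time, yet the sign-alternating third moment rules out strict stationarity.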
WSS vs. Non-WSS Processes
[Figure: sample paths of a WSS process (constant mean, lag-dependent autocorrelation) alongside a non-WSS process with time-varying statistics.]
Strict-Sense vs. Wide-Sense Stationarity
| Property | SSS | WSS |
|---|---|---|
| Condition | All fdds shift-invariant | Constant mean + lag-dependent autocorrelation |
| Involves | All moments and distributions | Only 1st and 2nd moments |
| Testable from data? | Generally no | Yes: estimate mean and autocorrelation |
| SSS $\Rightarrow$ WSS? | Yes (if 2nd moment exists) | n/a |
| WSS $\Rightarrow$ SSS? | No (in general) | Yes, if Gaussian |
| Practical use | Theoretical ideal | Default assumption in signal processing |
Quick Check
Let $Y(t) = A + X(t)$, where $A$ is a constant random bias and $X(t)$ is a zero-mean WSS process independent of $A$. Is $Y(t)$ WSS?
Yes, because the sum of WSS processes is WSS
Yes: the mean is constant and the autocorrelation depends only on $\tau$
No, because $A$ is a random constant that biases each realization differently
$E[Y(t)] = E[A] + E[X(t)] = E[A]$, a constant. $R_Y(t_1, t_2) = E[(A + X(t_1))(A + X(t_2))] = E[A^2] + R_X(t_1 - t_2)$, since the cross terms vanish ($X$ is zero-mean and independent of $A$), which depends only on $t_1 - t_2$. Both conditions hold, so $Y(t)$ is WSS.
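A Monte Carlo check of this answer (the $\pm 1$ bias and the white Gaussian choice for $X$ are illustrative assumptions): the ensemble mean of $Y$ is flat in time, and the lag-$k$ autocorrelation estimate is the same at two different reference times, matching $E[A^2] + R_X[k]$.

```python
import numpy as np

rng = np.random.default_rng(3)

trials, N = 100_000, 50
# A: random bias, one draw per realization, held constant over time
# (a +-1 fair coin is the assumed distribution).
A = rng.choice([-1.0, 1.0], size=(trials, 1))
# X[n]: zero-mean white WSS noise, independent of A.
X = rng.standard_normal((trials, N))
Y = A + X

# Mean is constant in time: E[Y(t)] = E[A] (= 0 here).
mean_t = Y.mean(axis=0)

# Lag-k autocorrelation at two reference times: both should estimate
# E[A^2] + R_X[k], independent of the reference time.
k = 3
r1 = np.mean(Y[:, 10 + k] * Y[:, 10])
r2 = np.mean(Y[:, 30 + k] * Y[:, 30])
print(mean_t[:3].round(3), r1, r2)
```

Here $E[A^2] = 1$ and $R_X[3] = 0$ for white noise, so both estimates should be close to 1.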
Common Mistake: Assuming WSS Implies SSS
Mistake:
Concluding that a WSS process has time-invariant distributions of all orders.
Correction:
WSS guarantees only that the first two moments are time-invariant. Higher-order statistics may still vary with time. The only general case where WSS implies SSS is for Gaussian processes (Lemma 43), because their distributions are fully determined by the first two moments.
Common Mistake: Forgetting That WSS Implies Constant Variance
Mistake:
Computing $\operatorname{Var}(X(t))$ for a WSS process and getting a time-dependent answer.
Correction:
For a WSS process, $\operatorname{Var}(X(t)) = R_X(0) - m_X^2$ is the same for all $t$. If your calculation gives a time-dependent variance, check whether the process is truly WSS.
WSS and the Coherence Time
In mobile wireless communications, the channel is modeled as WSS only over a time interval called the coherence time $T_c$. Beyond $T_c$, the channel statistics change due to mobility and environmental changes. A typical design rule is to place pilot symbols at intervals shorter than $T_c$ so that channel estimation can exploit the WSS assumption. The wide-sense stationary uncorrelated scattering (WSSUS) model, introduced by Bello (1963), formalizes this for doubly-selective (time-frequency) channels.
Wide-Sense Stationary (WSS)
A process with constant mean and autocorrelation that depends only on the time difference. The standard assumption in linear signal processing and communications.
Related: Strict-Sense Stationary (SSS)
Strict-Sense Stationary (SSS)
A process whose finite-dimensional distributions are invariant under time shifts. Stronger than WSS; equivalent to WSS for Gaussian processes.
Related: Wide-Sense Stationary (WSS)
Key Takeaway
Wide-sense stationarity (constant mean and lag-dependent autocorrelation) is the practical form of stationarity used throughout signal processing and communications. For Gaussian processes, WSS and SSS are equivalent, which is why the Gaussian assumption is so powerful: second-order statistics tell the whole story.