Chapter 9 Summary: The Discrete-Time Wiener Filter

Key Points

1. The MMSE linear estimator of $X_n$ from $\{Y_m\}$ is characterized by the orthogonality principle: the error $E_n$ is orthogonal to every observation used in the estimate. This translates to the Wiener-Hopf normal equations $\sum_k h[k]\, r_{yy}[\ell - k] = r_{xy}[\ell]$.
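
On a finite support $\{0, \dots, L-1\}$ the normal equations are a Toeplitz linear system, which can be solved directly. A minimal sketch with illustrative correlation values (not taken from the chapter):

```python
import numpy as np

# Finite-support Wiener-Hopf: R h = p with R[i, j] = r_yy[i - j], p[l] = r_xy[l].
# The correlation values below are illustrative only.
L = 3
r_yy = np.array([2.0, 1.0, 0.5])   # r_yy[0], r_yy[1], r_yy[2] (even in the lag)
r_xy = np.array([1.0, 0.6, 0.2])   # r_xy[0], r_xy[1], r_xy[2]
R = np.array([[r_yy[abs(i - j)] for j in range(L)] for i in range(L)])
h = np.linalg.solve(R, r_xy)
# Orthogonality check: R h - p = 0, i.e. the estimation error is
# uncorrelated with every observation used in the estimate.
residual = R @ h - r_xy
print(h, residual)
```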

2. When the filter support is $\mathbb{Z}$ (non-causal), the Wiener-Hopf equation is a convolution and the solution is a one-liner: $\check{h}_{\text{nc}}(f) = P_{xy}(f)/P_y(f)$. The non-causal MMSE is $\int \bigl(P_x - |P_{xy}|^2/P_y\bigr)\,df$.
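
A minimal numerical sketch, assuming an AR(1) signal observed in independent white noise so that $P_{xy} = P_x$ and $P_y = P_x + \sigma_v^2$ (the parameter values are illustrative):

```python
import numpy as np

# Non-causal Wiener filter and MMSE for an AR(1) signal in white noise.
a, sig_w2, sig_v2 = 0.8, 1.0, 1.0
f = np.linspace(-0.5, 0.5, 4096, endpoint=False)
Px = sig_w2 / np.abs(1 - a * np.exp(-2j * np.pi * f)) ** 2  # signal PSD
Py = Px + sig_v2                                            # observation PSD
H_nc = Px / Py                                              # non-causal Wiener filter
mmse_nc = np.mean(Px - Px**2 / Py)    # mean over a uniform grid = integral over one period
print(mmse_nc)                        # ~ 0.476 for these parameters
```

The filter is a real gain between 0 and 1 at every frequency: it attenuates bands where the noise dominates the signal.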

3. The Paley-Wiener condition $\int \log P_y(f)\,df > -\infty$ is the necessary and sufficient condition for the PSD to admit a factorization $P_y(f) = P_y^+(f)\, P_y^-(f)$ with a minimum-phase causal factor $P_y^+(f)$.
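
One standard numerical route to this factorization is the cepstral method: split the Fourier series of $\log P_y$ into its causal and anti-causal halves and exponentiate the causal half. A sketch, assuming an AR(1) PSD whose exact minimum-phase factor $\sigma_w/(1 - a e^{-j2\pi f})$ is known for comparison:

```python
import numpy as np

# Cepstral spectral factorization sketch for an assumed AR(1) PSD.
N, a, sig_w = 1024, 0.6, 1.0
f = np.arange(N) / N
Py = sig_w**2 / np.abs(1 - a * np.exp(-2j * np.pi * f)) ** 2
c = np.fft.ifft(np.log(Py)).real                    # real cepstrum of the PSD
c_plus = np.concatenate(([c[0] / 2], c[1:N // 2], np.zeros(N - N // 2)))
Py_plus = np.exp(np.fft.fft(c_plus))                # minimum-phase causal factor
exact = sig_w / (1 - a * np.exp(-2j * np.pi * f))   # known factor, for comparison
print(np.max(np.abs(Py_plus - exact)))              # tiny
```

Splitting the zero-quefrency term $c[0]$ evenly between the two halves is what puts the constant gain $\sigma_w$ inside $P_y^+$.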

4. The innovations process $J_n$ is obtained by whitening: $Y_n$ is passed through the filter with frequency response $1/P_y^+(f)$. The innovations are white, have unit variance, and span the same causal information as $Y_n$.
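
A concrete instance of whitening, assuming an AR(1) observation process: its canonical factor is $P_y^+(f) = \sigma_w/(1 - a e^{-j2\pi f})$, so the whitening filter $1/P_y^+$ is the two-tap filter $(1 - a z^{-1})/\sigma_w$.

```python
import numpy as np

# Whiten a simulated AR(1) process Y_n = a Y_{n-1} + W_n (illustrative values).
rng = np.random.default_rng(0)
a, sig_w = 0.8, 1.0
w = sig_w * rng.standard_normal(100_000)
y = np.zeros_like(w)
for n in range(1, len(w)):               # simulate the AR(1) recursion
    y[n] = a * y[n - 1] + w[n]
j = (y[1:] - a * y[:-1]) / sig_w         # innovations: whitened observations
print(np.var(j))                         # ~ 1 (unit variance)
print(np.corrcoef(j[:-1], j[1:])[0, 1])  # ~ 0 (white)
```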

5. The causal Wiener filter is $\check{h}_c(f) = (1/P_y^+(f)) \cdot \bigl[P_{xy}(f)/P_y^-(f)\bigr]_+$, where $[\cdot]_+$ is the causal projection operator. It is derived by whitening, projecting, and recoloring.
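
The projection $[\cdot]_+$ has a simple numerical reading: inverse-transform, keep the taps at non-negative time, transform back. A sketch on an illustrative two-sided spectrum (not a quantity from the chapter):

```python
import numpy as np

# Causal projection on an FFT grid: zero the taps that correspond to
# negative time (the upper half of the FFT index range).
N = 512
f = np.arange(N) / N
G = 1 / (1 - 0.5 * np.exp(-2j * np.pi * f)) + 1 / (1 - 0.5 * np.exp(2j * np.pi * f))
g = np.fft.ifft(G)           # two-sided impulse response (taps 0.5^|n|, doubled at n=0)
g_plus = g.copy()
g_plus[N // 2:] = 0          # discard the anti-causal (negative-time) taps
G_plus = np.fft.fft(g_plus)  # [G]_+ : the causal part of G
```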

6. The causal MMSE always satisfies $\sigma_c^2 \geq \sigma_{\text{nc}}^2$; the gap is the price of real-time processing. Equality holds iff $P_{xy}/P_y^-$ has no anti-causal Fourier components.

7. Kolmogorov-Szegő formula: the one-step prediction MMSE of a WSS process is the geometric mean of its PSD, $\sigma_p^2 = \exp \int \log P_y(f)\,df$. The formula makes predictability quantitative.
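
A numeric check of the formula, assuming an AR(1) process $Y_n = a Y_{n-1} + W_n$: its PSD is $\sigma_w^2/|1 - a e^{-j2\pi f}|^2$, and the one-step prediction MMSE should come out to the driving variance $\sigma_w^2$.

```python
import numpy as np

# Kolmogorov-Szego check: geometric mean of an AR(1) PSD = driving variance.
a, sig_w2 = 0.8, 2.0
f = np.arange(8192) / 8192 - 0.5
Py = sig_w2 / np.abs(1 - a * np.exp(-2j * np.pi * f)) ** 2
sigma_p2 = np.exp(np.mean(np.log(Py)))   # geometric mean of the PSD
print(sigma_p2)                          # -> 2.0, the driving variance sig_w2
```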

8. For an AR(1) signal in white noise, every object has a closed form: $P_y^+$ is a first-order rational function, the causal Wiener filter is a first-order recursion, and the MMSE can be computed analytically.
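
A sketch of this closed form, assuming $X_n = a X_{n-1} + W_n$ and $Y_n = X_n + V_n$ with independent noises: the causal Wiener filter is the first-order recursion $\hat{X}_n = a(1-K)\hat{X}_{n-1} + K Y_n$, with $K$ obtained here from the steady-state Riccati fixed point (the Kalman connection in the next point).

```python
import numpy as np

# Steady-state gain and causal MMSE for AR(1) + white noise (illustrative values).
a, sig_w2, sig_v2 = 0.8, 1.0, 1.0
P = 1.0
for _ in range(200):                # iterate the Riccati map to its fixed point
    M = a**2 * P + sig_w2           # one-step prediction error variance
    K = M / (M + sig_v2)            # steady-state gain
    P = (1.0 - K) * M               # filtered error variance = causal MMSE
print(K, P)                         # causal MMSE sigma_c^2 = P
```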

9. The steady-state Kalman filter for a time-invariant state-space model equals the causal Wiener filter derived from the model's implied PSDs. Wiener gives the frequency-domain view; Kalman gives the time-domain recursive view.

10. LMS and RLS adaptive filters converge to the Wiener solution when statistics are stationary; they are the practical tools when statistics are unknown or slowly changing.
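
A minimal LMS sketch on a stationary problem: with a white input and an assumed 2-tap target $d_n = 0.5\,y_n - 0.3\,y_{n-1} + \text{noise}$, the Wiener solution is exactly $(0.5, -0.3)$, and LMS should settle near it.

```python
import numpy as np

# LMS converging to the Wiener solution (illustrative 2-tap identification).
rng = np.random.default_rng(1)
h_true = np.array([0.5, -0.3])
y = rng.standard_normal(50_000)
d = np.convolve(y, h_true)[: len(y)] + 0.01 * rng.standard_normal(len(y))
h, mu = np.zeros(2), 0.01
for n in range(1, len(y)):
    u = np.array([y[n], y[n - 1]])  # regressor: current and previous input
    e = d[n] - h @ u                # a priori estimation error
    h += mu * e * u                 # stochastic-gradient (LMS) update
print(h)                            # close to h_true, the Wiener solution
```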

Looking Ahead

Chapter 10 generalizes the causal Wiener filter to time-varying state-space models, arriving at the Kalman filter. The machinery of innovations, whitening, and recursive updates that we built here reappears in disguise: the Kalman filter is essentially a recursive Cholesky factorization of the observation covariance, and the steady-state Kalman gain is the causal Wiener gain. Chapter 11 applies the Wiener framework to equalization: the MMSE equalizer is a Wiener filter whose target is the transmitted symbol sequence and whose observation is the channel output. From there, the adaptive algorithms LMS and RLS follow naturally as stochastic-gradient and recursive-least-squares implementations of the Wiener solution when statistics must be estimated on the fly.