Chapter Summary
Key Points
1. There are four modes of convergence for random variables: almost sure, in probability, in $L^p$, and in distribution. The implications are: a.s. $\Rightarrow$ in probability $\Rightarrow$ in distribution, and $L^p$ $\Rightarrow$ in probability. No other general implications hold.
2. The Weak Law of Large Numbers (WLLN) states that $\bar{X}_n \xrightarrow{P} \mu$ for i.i.d. sequences with finite variance. The proof is a direct application of Chebyshev's inequality: $P(|\bar{X}_n - \mu| \geq \epsilon) \leq \sigma^2/(n\epsilon^2) \to 0$.
3. The Strong Law of Large Numbers (SLLN) upgrades this to almost sure convergence: $\bar{X}_n \to \mu$ a.s. under only a finite mean. The Borel-Cantelli proof under finite fourth moments shows $E[(\bar{X}_n - \mu)^4] = O(n^{-2})$, which is summable, so deviations beyond any $\epsilon$ occur only finitely often.
4. The Central Limit Theorem (CLT) characterizes the fluctuations around $\mu$: $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} \mathcal{N}(0, \sigma^2)$. The proof via characteristic functions exploits the Taylor expansion $\varphi(t) = 1 - \sigma^2 t^2/2 + o(t^2)$ and the limit $(1 + x_n/n)^n \to e^x$ for $x_n \to x$.
5. The Berry-Esseen theorem quantifies the CLT convergence rate: $\sup_x |F_n(x) - \Phi(x)| \leq C\rho/(\sigma^3\sqrt{n})$, an $O(n^{-1/2})$ rate depending on the third absolute moment $\rho = E|X_1 - \mu|^3$ of the underlying distribution.
6. The multivariate CLT, the delta method, and Slutsky's theorem extend the CLT to vector averages and smooth functions thereof; together they form the foundation of asymptotic statistics and performance analysis in communications.
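The Chebyshev argument behind the WLLN in point 2 can be checked numerically. The following sketch uses only the standard library; the Uniform(0,1) summands, the choice of $\epsilon$, and the sample sizes are illustrative assumptions, not taken from the text.

```python
# Monte Carlo check of the WLLN via Chebyshev's bound.
import random

random.seed(0)
mu, var = 0.5, 1.0 / 12.0      # mean and variance of Uniform(0,1)
eps, trials = 0.05, 2000

results = {}
for n in (100, 400, 1600):
    # Count how often the sample mean deviates from mu by at least eps.
    deviations = sum(
        abs(sum(random.random() for _ in range(n)) / n - mu) >= eps
        for _ in range(trials)
    )
    empirical = deviations / trials
    chebyshev = var / (n * eps * eps)   # Chebyshev bound on that probability
    results[n] = (empirical, chebyshev)
    print(n, empirical, chebyshev)
```

The empirical deviation frequency stays below the Chebyshev bound at every $n$ and shrinks as $n$ grows, which is exactly what the proof sketch asserts.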
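The CLT of point 4 can be illustrated by simulating the standardized sample mean and comparing it with standard normal probabilities. A minimal sketch, assuming Exponential(1) summands and arbitrary sample sizes (both illustrative choices):

```python
# Monte Carlo illustration of the CLT.
import math
import random

random.seed(1)
n, trials = 400, 4000
mu, sigma = 1.0, 1.0            # Exponential(1): mean 1, std 1

z = []
for _ in range(trials):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    z.append(math.sqrt(n) * (xbar - mu) / sigma)

# Compare empirical frequencies with the N(0,1) values they approach.
frac_neg = sum(v <= 0.0 for v in z) / trials       # -> Phi(0) = 0.5
frac_1sd = sum(abs(v) <= 1.0 for v in z) / trials  # -> about 0.683
print(round(frac_neg, 3), round(frac_1sd, 3))
```

Even though the exponential distribution is heavily skewed, the standardized mean already matches the Gaussian probabilities closely at this $n$.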
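The $O(n^{-1/2})$ rate of point 5 can also be observed empirically: the Kolmogorov distance between the standardized-sum CDF and $\Phi$ should shrink by roughly half when $n$ grows fourfold. The Bernoulli(1/2) summands and the sizes below are illustrative assumptions.

```python
# Empirical look at the Berry-Esseen convergence rate.
import math
import random

random.seed(2)

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kolmogorov_distance(n, trials=8000):
    # Standardized Bernoulli(1/2) sums: mean n/2, std sqrt(n)/2.
    zs = sorted(
        (sum(random.random() < 0.5 for _ in range(n)) - 0.5 * n)
        / (0.5 * math.sqrt(n))
        for _ in range(trials)
    )
    # sup_x |F_n(x) - Phi(x)|, approximated over the sample points.
    return max(
        max(abs((i + 1) / trials - phi(z)), abs(i / trials - phi(z)))
        for i, z in enumerate(zs)
    )

d25, d100 = kolmogorov_distance(25), kolmogorov_distance(100)
print(round(d25, 3), round(d100, 3))
```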
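Finally, the delta method of point 6 says that $\sqrt{n}\,(g(\bar{X}_n) - g(\mu))$ is approximately $\mathcal{N}(0, g'(\mu)^2 \sigma^2)$ for smooth $g$. A sketch under illustrative assumptions ($g(x) = x^2$ with Uniform(0,1) samples, neither taken from the text):

```python
# Monte Carlo sketch of the delta method.
import math
import random

random.seed(3)
n, trials = 500, 3000
mu = 0.5
sigma = math.sqrt(1.0 / 12.0)   # Uniform(0,1) standard deviation
gprime = 2.0 * mu               # g(x) = x^2, so g'(mu) = 2*mu

vals = []
for _ in range(trials):
    xbar = sum(random.random() for _ in range(n)) / n
    vals.append(math.sqrt(n) * (xbar ** 2 - mu ** 2))

# The sample standard deviation should be close to |g'(mu)| * sigma.
sd = (sum(v * v for v in vals) / trials) ** 0.5
print(round(sd, 3), round(abs(gprime) * sigma, 3))
```

The simulated spread matches the first-order prediction $|g'(\mu)|\sigma$, which is the approximation used throughout asymptotic statistics.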
Looking Ahead
Chapter 12 develops conditional expectation as a random variable, the key abstraction that bridges the convergence theory of this chapter to the filtering and prediction problems of stochastic processes. The SLLN and CLT will reappear throughout the book: in ergodic theorems for stationary processes (Part IV), in the derivation of spectral estimators (Part V), and in the information-theoretic applications of Book ITA.