Chapter 7 Summary
Key Points
1. The joint CDF $F_{X,Y}(x,y) = P(X \le x,\, Y \le y)$ is the fundamental object that encodes all joint distributional information. It determines the marginals, but the marginals do not determine it.
2. Joint PMF/PDF: for discrete RVs, $p_{X,Y}(x,y) = P(X = x,\, Y = y)$; for continuous RVs, $f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}$. Marginals are obtained by summing or integrating out the other variable, e.g. $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy$.
3. Conditional distributions: $f_{Y\mid X}(y \mid x) = \frac{f_{X,Y}(x,y)}{f_X(x)}$. The conditional expectation satisfies the tower property $E[E[Y\mid X]] = E[Y]$ and the law of total variance $\operatorname{Var}(Y) = E[\operatorname{Var}(Y\mid X)] + \operatorname{Var}(E[Y\mid X])$.
4. Independence: $X$ and $Y$ are independent iff $f_{X,Y}(x,y) = f_X(x)\,f_Y(y)$ for all $x, y$. Independence implies uncorrelatedness, but uncorrelated does not imply independent (except for jointly Gaussian RVs).
5. Jacobian method: for an invertible transformation $(U,V) = g(X,Y)$, $f_{U,V}(u,v) = f_{X,Y}(x,y)\left|\det\frac{\partial(x,y)}{\partial(u,v)}\right|$, evaluated at $(x,y) = g^{-1}(u,v)$.
6. Convolution: the PDF of $Z = X + Y$ for independent $X$ and $Y$ is $f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\,f_Y(z-x)\,dx$. Gaussians are closed under convolution.
7. Order statistics: for $n$ i.i.d. RVs with CDF $F$, $F_{\max}(x) = [F(x)]^n$ for the maximum and $F_{\min}(x) = 1 - [1 - F(x)]^n$ for the minimum. The minimum of $n$ i.i.d. exponentials with rate $\lambda$ is exponential with rate $n\lambda$.
8. Covariance and correlation: $\operatorname{Cov}(X,Y) = E[XY] - E[X]\,E[Y]$; $\rho_{XY} = \operatorname{Cov}(X,Y)/(\sigma_X \sigma_Y)$. The variance of a sum decomposes as $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y)$. (Points 6 through 8 are checked numerically in the sketch after this list.)
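The following is a minimal Monte Carlo sketch, assuming NumPy, that checks three of the facts above; the sample size and all parameter values are illustrative choices, not part of the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # Monte Carlo sample size (illustrative)

# Point 7: the minimum of n i.i.d. Exponential(lam) RVs is Exponential(n*lam),
# so its sample mean should be close to 1/(n*lam).
n, lam = 5, 2.0
mins = rng.exponential(scale=1.0 / lam, size=(N, n)).min(axis=1)
print(mins.mean(), 1.0 / (n * lam))          # both approximately 0.1

# Point 6: the sum of independent Gaussians is Gaussian; means and variances add.
x = rng.normal(1.0, 2.0, size=N)             # N(1, 4)
y = rng.normal(-3.0, 1.5, size=N)            # N(-3, 2.25)
z = x + y
print(z.mean(), z.var())                     # approximately -2 and 6.25

# Point 8: Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) for correlated X, Y.
u = rng.normal(size=N)
v = 0.6 * u + 0.8 * rng.normal(size=N)       # Var(v) = 1, Cov(u, v) = 0.6
lhs = (u + v).var()
rhs = u.var() + v.var() + 2.0 * np.cov(u, v)[0, 1]
print(lhs, rhs)                              # both approximately 3.2
```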
Looking Ahead
Chapter 8 extends these ideas to random vectors and the multivariate Gaussian distribution, where the covariance matrix governs everything: marginals, conditionals, and independence are all read off from the matrix. The conditional Gaussian formula (the Schur complement) is the foundation of LMMSE estimation and Kalman filtering.
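As a preview, here is a minimal sketch, assuming NumPy, of the conditional Gaussian formula mentioned above for a two-block partition; the block values and the observed value are illustrative assumptions.

```python
import numpy as np

# Mean and covariance of a jointly Gaussian pair (X1, X2), split into blocks.
m1, m2 = np.array([0.0]), np.array([0.0])
S11 = np.array([[2.0]])
S12 = np.array([[1.2]])
S22 = np.array([[1.5]])

x2 = np.array([0.8])  # an observed value of X2 (illustrative)

# Conditional mean is linear in (x2 - m2); conditional covariance is the
# Schur complement S11 - S12 S22^{-1} S21, independent of the observation.
cond_mean = m1 + S12 @ np.linalg.solve(S22, x2 - m2)
cond_cov = S11 - S12 @ np.linalg.solve(S22, S12.T)
print(cond_mean)  # [0.64]
print(cond_cov)   # [[1.04]], i.e. 2 - 1.2**2 / 1.5
```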
Bayesian Channel Estimation via Conditional Distributions
Conditional distributions are the mathematical backbone of Bayesian channel estimation. Koller, Fesl, and Caire developed a scalable Bayesian MIMO channel estimator that computes the posterior efficiently by exploiting the structure of the prior covariance matrix. The tower property ensures that the MMSE estimator minimizes the average estimation error. The techniques developed in this chapter (conditional distributions, Bayes' rule for continuous RVs, and the conditional expectation as the MMSE estimator) are the theoretical foundation upon which this work rests.
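As a generic illustration only, and not the estimator from the cited work, the sketch below applies the conditional-expectation-as-MMSE idea to a toy linear-Gaussian model y = h + n; the prior covariance, noise variance, and dimension are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
dim, sigma2 = 4, 0.1  # channel dimension and noise variance (assumed values)

# An assumed prior covariance with exponentially decaying correlation.
idx = np.arange(dim)
C_h = 0.9 ** np.abs(idx[:, None] - idx[None, :])

h = rng.multivariate_normal(np.zeros(dim), C_h)      # "true" channel draw
y = h + np.sqrt(sigma2) * rng.standard_normal(dim)   # noisy observation y = h + n

# For jointly Gaussian h and y, the MMSE estimate E[h | y] has the closed
# form C_h (C_h + sigma2 I)^{-1} y.
h_mmse = C_h @ np.linalg.solve(C_h + sigma2 * np.eye(dim), y)
print(np.linalg.norm(h - h_mmse))   # typically smaller than ...
print(np.linalg.norm(h - y))        # ... the raw-observation error
```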