Chapter Summary
Key Points
1. Probability spaces and axioms. A probability space $(\Omega, \mathcal{F}, P)$ provides the rigorous foundation for every stochastic model in communications: the sample space $\Omega$ enumerates all possible outcomes (e.g., all realizable channel states), the $\sigma$-algebra $\mathcal{F}$ defines the measurable events, and the probability measure $P$ assigns consistent probabilities satisfying Kolmogorov's axioms: non-negativity, normalization $P(\Omega) = 1$, and countable additivity. Conditional probability and Bayes' theorem underpin maximum-likelihood and MAP detection, where the receiver inverts the channel to estimate transmitted symbols (see Sketch 1 after this list). The law of total probability decomposes complex system analyses, such as computing outage probability over a random fading channel, into tractable conditional calculations.
2. Random variables, distributions, and moments. A random variable $X$ maps outcomes to numbers, and its CDF $F_X(x) = P(X \le x)$ fully characterizes its statistical behavior. For continuous random variables the PDF $f_X(x) = dF_X(x)/dx$ enables computation of the expectation $\mathbb{E}[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$ and variance $\sigma_X^2 = \mathbb{E}[X^2] - (\mathbb{E}[X])^2$, which quantify signal power and noise spread in communication links. The Gaussian distribution models thermal noise, the exponential distribution governs inter-arrival times and instantaneous SNR under Rayleigh fading, and the chi-squared distribution arises when summing squared Gaussian components; each plays a distinct role in receiver analysis and system design (see Sketch 2 after this list).
3. Functions of random variables and fading distributions. Transforming random variables via $Y = g(X)$, whether through nonlinear device characteristics or envelope detection, requires the Jacobian formula $f_Y(y) = f_X(g^{-1}(y))\,\bigl| \tfrac{d}{dy} g^{-1}(y) \bigr|$ for monotonic mappings. The Rayleigh distribution, arising as the envelope of a circularly symmetric complex Gaussian $\mathcal{CN}(0, \sigma^2)$, models non-line-of-sight (NLOS) fading; the Ricean distribution adds a deterministic line-of-sight component, with the $K$-factor quantifying the ratio of specular to scattered power; and the Nakagami-$m$ distribution provides a flexible shape parameter that spans from severe fading ($m = 1/2$) through Rayleigh ($m = 1$) to near-deterministic channels ($m \to \infty$). These fading models directly determine outage probability and average bit-error-rate performance in wireless links (see Sketch 3 after this list).
4. Moment-generating and characteristic functions. The moment-generating function $M_X(s) = \mathbb{E}[e^{sX}]$ and the characteristic function $\phi_X(\omega) = \mathbb{E}[e^{j\omega X}]$ encode the entire distribution in a single transform, with moments extracted as derivatives at the origin: $\mathbb{E}[X^n] = M_X^{(n)}(0)$. For independent random variables, the MGF of a sum factors as a product, $M_{X+Y}(s) = M_X(s)\,M_Y(s)$, which greatly simplifies the analysis of diversity combining schemes (MRC, EGC) where the total SNR is a sum of independent branch SNRs (see Sketch 4 after this list). The characteristic function, guaranteed to exist for all distributions, connects to the PDF through the inverse Fourier transform and is the natural tool for proving the central limit theorem and analyzing OFDM sub-carrier statistics.
5. Random vectors, covariance matrices, and complex Gaussians. A random vector $\mathbf{x}$ has its second-order statistics captured by the covariance matrix $\mathbf{C}_{\mathbf{x}} = \mathbb{E}[(\mathbf{x} - \boldsymbol{\mu})(\mathbf{x} - \boldsymbol{\mu})^H]$, which is positive semidefinite Hermitian, precisely the matrix class whose eigendecomposition was studied in Chapter 1. The multivariate Gaussian and its complex counterpart $\mathcal{CN}(\boldsymbol{\mu}, \mathbf{C})$ are uniquely determined by their first and second moments, with circular symmetry requiring the pseudo-covariance $\mathbb{E}[(\mathbf{x} - \boldsymbol{\mu})(\mathbf{x} - \boldsymbol{\mu})^T] = \mathbf{0}$. In MIMO systems the channel vector $\mathbf{h} \sim \mathcal{CN}(\mathbf{0}, \mathbf{R})$ encodes spatial correlation, and the eigenstructure of $\mathbf{R}$ directly determines beamforming gain, spatial multiplexing capability, and the capacity of correlated MIMO channels (see Sketch 5 after this list).
6. Convergence concepts and concentration inequalities. The law of large numbers (LLN) guarantees that sample averages converge to ensemble means, justifying ergodic capacity as a meaningful metric when a codeword spans many independent fading realizations. The central limit theorem (CLT) establishes that normalized sums of i.i.d. random variables tend to Gaussian, which explains why aggregate interference in dense networks and OFDM time-domain samples are well modeled as Gaussian. The Chernoff bound $P(X \ge a) \le \min_{s > 0} e^{-sa} M_X(s)$ provides exponentially tight tail probabilities that are essential for bounding error rates in coded systems, while Chebyshev's inequality offers distribution-free guarantees on concentration around the mean (see Sketch 6 after this list).
7. Stochastic processes: stationarity and spectral analysis. A stochastic process $X(t)$ assigns a random variable to each time instant, with wide-sense stationarity (WSS) requiring a constant mean and an autocorrelation function $R_X(\tau)$ that depends only on the time lag $\tau$. The Wiener–Khinchin theorem establishes that the power spectral density (PSD) is the Fourier transform of the autocorrelation, $S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j2\pi f \tau}\, d\tau$, providing the bridge between time-domain correlation and frequency-domain power distribution (see Sketch 7 after this list). For wireless channels, the Doppler spectrum and its inverse, the time-domain correlation, quantify how rapidly the channel varies, directly governing pilot spacing, coherence time, and the validity of quasi-static fading assumptions.
8. Gaussian processes and white noise. A Gaussian process is fully specified by its mean function and autocorrelation, since all finite-dimensional distributions are jointly Gaussian, a property inherited from the multivariate Gaussian theory of random vectors. White Gaussian noise, with flat PSD $S(f) = N_0/2$ and delta autocorrelation $R(\tau) = (N_0/2)\,\delta(\tau)$, is the canonical noise model in communications: passing it through a filter of bandwidth $B$ produces colored noise with power $N_0 B$ (see Sketch 8 after this list). The additive white Gaussian noise (AWGN) channel serves as the fundamental reference model against which all practical systems are benchmarked, and its sufficient statistics (matched-filter outputs) are themselves Gaussian, enabling clean derivations of optimal detection and capacity.
9. Markov chains and Poisson processes. A discrete-time Markov chain satisfies the memoryless property $P(X_{n+1} = j \mid X_n = i, X_{n-1}, \ldots, X_0) = P(X_{n+1} = j \mid X_n = i)$, making the transition matrix $\mathbf{P}$ with entries $p_{ij} = P(X_{n+1} = j \mid X_n = i)$ the complete descriptor of the chain's dynamics. Finite-state Markov channels model the bursty error behavior of fading links: the Gilbert–Elliott model partitions the channel into "good" and "bad" states with distinct error rates, capturing temporal error correlation that i.i.d. models miss. The Poisson process, with independent and stationary increments and exponentially distributed inter-arrival times of mean $1/\lambda$, models random access attempts, call arrivals, and interference events in cellular networks, providing the probabilistic backbone of teletraffic engineering and stochastic geometry (see Sketch 9 after this list).
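The sketches below are minimal NumPy illustrations of the nine key points; all parameters (priors, noise levels, fading powers, transition probabilities) are hypothetical choices for demonstration, not values from the chapter. Sketch 1 applies Bayes' theorem as a MAP detector for a BPSK symbol in Gaussian noise with an assumed unequal prior.

```python
# Sketch 1 (Key Point 1): MAP detection of a binary symbol via Bayes' theorem.
# Hypothetical parameters: prior p1, noise std sigma, BPSK levels +/-1.
import numpy as np

def map_detect(y, p1=0.3, sigma=1.0):
    """Return the MAP estimate of s in {-1, +1} from y = s + n, n ~ N(0, sigma^2)."""
    lik_m1 = np.exp(-(y + 1.0) ** 2 / (2 * sigma ** 2))   # likelihood f(y | s = -1)
    lik_p1 = np.exp(-(y - 1.0) ** 2 / (2 * sigma ** 2))   # likelihood f(y | s = +1)
    # Unnormalized posteriors p(s | y) proportional to f(y | s) P(s)  (Bayes' theorem)
    post_m1 = lik_m1 * (1 - p1)
    post_p1 = lik_p1 * p1
    return np.where(post_p1 >= post_m1, 1, -1)

rng = np.random.default_rng(0)
s = rng.choice([-1, 1], size=100_000, p=[0.7, 0.3])       # skewed prior
y = s + rng.normal(0.0, 1.0, size=s.size)
print("MAP symbol error rate:", np.mean(map_detect(y) != s))
```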
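Sketch 2 checks the means and variances of the three workhorse distributions by Monte Carlo, building the chi-squared variate explicitly as a sum of squared Gaussians.

```python
# Sketch 2 (Key Point 2): empirical moments of the Gaussian, exponential,
# and chi-squared distributions (all parameter values are arbitrary).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
gauss = rng.normal(0.0, 2.0, n)            # thermal-noise model, variance 4
expo = rng.exponential(1.0 / 0.5, n)       # inter-arrivals, rate 0.5 -> mean 2
chi2 = (rng.standard_normal((n, 4)) ** 2).sum(axis=1)  # sum of 4 squared Gaussians
for name, x, mean, var in [("Gaussian", gauss, 0.0, 4.0),
                           ("Exponential", expo, 2.0, 4.0),
                           ("Chi-squared(4)", chi2, 4.0, 8.0)]:
    print(f"{name:15s} mean {x.mean():+.3f} (theory {mean}), "
          f"var {x.var():.3f} (theory {var})")
```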
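Sketch 3 constructs the three fading envelopes from their Gaussian building blocks: a scattered complex Gaussian for Rayleigh, an added LOS term with an assumed $K = 5$ for Ricean, and a gamma-distributed squared envelope for Nakagami-$m$. The envelope threshold used for the outage comparison is a hypothetical choice.

```python
# Sketch 3 (Key Point 3): fading envelopes from their Gaussian constructions.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
sigma2 = 1.0                                   # scattered power (assumed)
# Rayleigh: envelope of a zero-mean circularly symmetric complex Gaussian
z = rng.normal(0, np.sqrt(sigma2 / 2), n) + 1j * rng.normal(0, np.sqrt(sigma2 / 2), n)
rayleigh = np.abs(z)
# Ricean: add a deterministic LOS component with K = |los|^2 / E[|z|^2]
K = 5.0
ricean = np.abs(np.sqrt(K * sigma2) + z)
# Nakagami-m: envelope whose square is Gamma(m, omega/m), so E[R^2] = omega
m, omega = 2.0, 1.0
nakagami = np.sqrt(rng.gamma(shape=m, scale=omega / m, size=n))
thr = 0.3                                      # hypothetical envelope threshold
for name, r in [("Rayleigh", rayleigh), ("Ricean K=5", ricean), ("Nakagami m=2", nakagami)]:
    print(f"{name:13s} P(envelope < {thr}) = {np.mean(r < thr):.4f}")
```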
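Sketch 4 verifies the product form of the MGF on an assumed 4-branch MRC combiner with i.i.d. exponential branch SNRs, and recovers the mean combined SNR predicted by the first derivative of the MGF at the origin.

```python
# Sketch 4 (Key Point 4): MGF of a sum = product of MGFs (MRC diversity).
import numpy as np

rng = np.random.default_rng(3)
L, gbar, n = 4, 2.0, 1_000_000                 # branches, mean branch SNR (assumed)
branch_snrs = rng.exponential(gbar, size=(n, L))   # Rayleigh fading -> exponential SNR
mrc_snr = branch_snrs.sum(axis=1)                  # MRC output SNR = sum of branch SNRs
s = -0.4                                       # point where the exponential MGF exists
empirical = np.mean(np.exp(s * mrc_snr))
analytic = (1 - s * gbar) ** (-L)              # product of L exponential MGFs 1/(1 - s*gbar)
print(f"E[exp(s*SNR_MRC)] empirical {empirical:.4f} vs product form {analytic:.4f}")
# First moment as the derivative of the MGF at the origin: E[SNR_MRC] = L * gbar
print("mean MRC SNR:", mrc_snr.mean(), " theory:", L * gbar)
```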
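Sketch 5 colors white circularly symmetric samples with a Cholesky factor of an assumed exponential-correlation matrix $\mathbf{R}$, then checks the sample covariance, the vanishing pseudo-covariance, and the beamforming gain given by the top eigenvalue of $\mathbf{R}$.

```python
# Sketch 5 (Key Point 5): correlated circularly symmetric channel vectors.
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical 4-antenna spatial covariance: exponential correlation, rho = 0.7
rho, N, n = 0.7, 4, 10_000
R = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
# White CN(0, I) rows, colored by the Cholesky factor of R (R = Lc @ Lc.T, real here)
w = (rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))) / np.sqrt(2)
Lc = np.linalg.cholesky(R)
h = w @ Lc.T                                # each row satisfies E[h h^H] = R
C = h.T @ h.conj() / n                      # sample covariance E[h h^H]
P = h.T @ h / n                             # sample pseudo-covariance E[h h^T]
print("||C - R||           =", np.linalg.norm(C - R))
print("||pseudo-covariance|| =", np.linalg.norm(P), "(should be near 0)")
# Eigen-beamforming along the dominant eigenvector of R gives gain lambda_max,
# versus trace(R)/N = 1 per antenna for an uncorrelated array.
print("beamforming gain (top eigenvalue):", np.linalg.eigvalsh(R)[-1])
```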
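Sketch 6 demonstrates the LLN and CLT on averages of unit-mean exponentials and compares an empirical tail probability against the Chebyshev bound and the optimized Chernoff bound; the sample sizes and threshold are arbitrary.

```python
# Sketch 6 (Key Point 6): LLN, CLT, Chebyshev, and Chernoff on a sample average.
import numpy as np

rng = np.random.default_rng(5)
n, trials = 100, 100_000
x = rng.exponential(1.0, size=(trials, n))     # i.i.d. unit-mean, unit-variance samples
avg = x.mean(axis=1)
print("LLN: std of the sample average =", avg.std(), "(theory 1/sqrt(n) = 0.1)")
# CLT: the normalized average is approximately N(0, 1)
z = (avg - 1.0) * np.sqrt(n)
print("CLT: P(Z > 1.645) =", np.mean(z > 1.645), "(Gaussian value ~0.05)")
# Tail P(avg > a): Chebyshev vs the Chernoff bound for a sum of exponentials
a = 1.3
cheby = 1.0 / (n * (a - 1.0) ** 2)             # var(avg) = 1/n
s = (a - 1.0) / a                              # optimizer of e^{-s*a*n} * M(s)^n, M(s) = 1/(1-s)
chernoff = np.exp(n * (-np.log(1.0 - s) - s * a))
print(f"P(avg > {a}): empirical {np.mean(avg > a):.2e}, "
      f"Chebyshev {cheby:.2e}, Chernoff {chernoff:.2e}")
```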
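Sketch 7 tests the Wiener–Khinchin relation on a first-order autoregressive process, whose autocorrelation and PSD are both known in closed form, by comparing an averaged periodogram with the Fourier transform of $R_X$. The AR(1) model and all constants are illustrative choices.

```python
# Sketch 7 (Key Point 7): Wiener-Khinchin check on an AR(1) process.
import numpy as np

rng = np.random.default_rng(6)
a, n, runs = 0.9, 4096, 200
# AR(1): x[k] = a*x[k-1] + w[k] is WSS with R_x[m] = a^|m| / (1 - a^2)
psd_est = np.zeros(n)
for _ in range(runs):
    w = rng.standard_normal(n + 500)
    x = np.zeros(n + 500)
    for k in range(1, n + 500):
        x[k] = a * x[k - 1] + w[k]
    x = x[500:]                                   # discard the start-up transient
    psd_est += np.abs(np.fft.fft(x)) ** 2 / n     # periodogram
psd_est /= runs
# Wiener-Khinchin: PSD = Fourier transform of the autocorrelation (circular lags,
# valid here because a^|m| decays to ~0 well before the wraparound point)
m = np.arange(n)
psd_wk = np.fft.fft(a ** np.minimum(m, n - m) / (1 - a ** 2)).real
f = 0.15                                          # one normalized frequency to check
k = int(f * n)
print("periodogram:", psd_est[k], " FT of R_x:", psd_wk[k],
      " closed form:", 1 / (1 + a ** 2 - 2 * a * np.cos(2 * np.pi * f)))
```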
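Sketch 8 passes discrete-time white Gaussian noise of two-sided PSD $N_0/2$ through a brick-wall lowpass of an assumed bandwidth $B$ and confirms the output power $N_0 B$; the PSD level, sample rate, and bandwidth are arbitrary.

```python
# Sketch 8 (Key Point 8): white noise through an ideal lowpass has power N0*B.
import numpy as np

rng = np.random.default_rng(7)
N0, fs, B, n = 2.0, 1000.0, 50.0, 2 ** 18     # assumed PSD level, sample rate, bandwidth
# Discrete white noise with two-sided PSD N0/2 over [-fs/2, fs/2] needs variance (N0/2)*fs
w = rng.normal(0.0, np.sqrt(N0 / 2 * fs), n)
# Brick-wall lowpass of one-sided bandwidth B, applied in the frequency domain
f = np.fft.fftfreq(n, d=1.0 / fs)
W = np.fft.fft(w)
W[np.abs(f) > B] = 0.0
y = np.fft.ifft(W).real
print("filtered-noise power:", y.var(), " theory N0*B =", N0 * B)
```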
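Sketch 9 simulates a Gilbert–Elliott channel with an assumed transition matrix and per-state error rates, checking the stationary occupancy and overall error rate, then builds a Poisson process from exponential inter-arrival times and verifies that increments over disjoint windows have mean and variance close to the rate.

```python
# Sketch 9 (Key Point 9): a Gilbert-Elliott channel and Poisson arrivals.
import numpy as np

rng = np.random.default_rng(8)
# Two-state chain: 0 = good, 1 = bad, with an assumed transition matrix P
P = np.array([[0.99, 0.01],
              [0.10, 0.90]])
err = np.array([1e-3, 0.2])                    # per-state bit error rates (assumed)
n = 1_000_000
state = np.zeros(n, dtype=int)
u = rng.random(n)
for k in range(1, n):
    state[k] = int(u[k] < P[state[k - 1], 1])  # move to "bad" with prob. P[s, 1]
errors = rng.random(n) < err[state]
pi_bad = P[0, 1] / (P[0, 1] + P[1, 0])         # stationary probability of "bad"
print("time in bad state:", state.mean(), " theory:", pi_bad)
print("overall BER:", errors.mean(), " theory:", (1 - pi_bad) * err[0] + pi_bad * err[1])
# Poisson process: exponential inter-arrivals of mean 1/lam (lam arrivals per second)
lam, T = 5.0, 10_000.0
arrivals = np.cumsum(rng.exponential(1.0 / lam, int(lam * T * 1.3)))
counts = np.searchsorted(arrivals, np.arange(T))   # arrivals before each second
per_sec = np.diff(counts)                          # increments over disjoint 1 s windows
print("Poisson increments: mean", per_sec.mean(), " var", per_sec.var(), "(both ~ lam)")
```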
Looking Ahead
Chapter 3 introduces information theory and coding, translating the probability foundations from this chapter into fundamental performance limits for communication systems. Shannon entropy quantifies the irreducible uncertainty of a source, mutual information measures the information conveyed through a noisy channel, and together they define channel capacity: the ultimate data rate achievable with vanishing error probability. The AWGN channel capacity $C = \log_2(1 + \mathrm{SNR})$ bits per channel use and its MIMO generalization are direct consequences of the Gaussian and complex Gaussian distributions studied here, while the channel coding theorem guarantees that codes exist which approach these limits, motivating the design of practical codes (turbo, LDPC, polar) that will be examined alongside their information-theoretic underpinnings.