Random Vectors and Their Statistics
From Scalars to Vectors
In Chapters 5–7, we studied individual random variables and pairs of random variables. But in most engineering applications, we observe not a single measurement but a collection of measurements simultaneously. A MIMO receiver observes the outputs of several antennas at once; an estimator processes a vector of noisy samples; a stochastic process sampled at $n$ time instants yields a random vector.
The natural mathematical object is the random vector $\mathbf{X} = (X_1, X_2, \dots, X_n)^T$, and the natural summary statistics are the mean vector and the covariance matrix. This section establishes the vocabulary and the key structural result: the covariance matrix is always positive semi-definite.
Definition: Random Vector
A random vector $\mathbf{X}$ is an ordered collection of $n$ random variables defined on the same probability space $(\Omega, \mathcal{F}, P)$:
$$\mathbf{X} = (X_1, X_2, \dots, X_n)^T.$$
The joint PDF $f_{\mathbf{X}}(\mathbf{x})$ (when it exists) is such that $P(\mathbf{X} \in A) = \int_A f_{\mathbf{X}}(\mathbf{x})\, d\mathbf{x}$ for any (measurable) set $A \subseteq \mathbb{R}^n$.
Random vector
An ordered collection of random variables on the same probability space. Completely characterized by the family of finite-dimensional distributions.
Related: Covariance matrix
Definition: Mean Vector and Covariance Matrix
Let $\mathbf{X} = (X_1, \dots, X_n)^T$ be a random vector with finite second moments. The mean vector is
$$\boldsymbol{\mu}_X = E[\mathbf{X}] = \big(E[X_1], E[X_2], \dots, E[X_n]\big)^T.$$
The covariance matrix is the $n \times n$ matrix
$$\mathbf{C}_X = E\big[(\mathbf{X} - \boldsymbol{\mu}_X)(\mathbf{X} - \boldsymbol{\mu}_X)^T\big],$$
whose $(i,j)$-entry is $c_{ij} = \operatorname{Cov}(X_i, X_j) = E[(X_i - \mu_i)(X_j - \mu_j)]$. The correlation matrix is $\mathbf{R}_X = E[\mathbf{X}\mathbf{X}^T]$.
The diagonal entries of $\mathbf{C}_X$ are the variances $c_{ii} = \operatorname{Var}(X_i)$, and the off-diagonal entries are the covariances $c_{ij} = \operatorname{Cov}(X_i, X_j)$, $i \neq j$.
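The definitions above can be checked numerically. The following sketch (assuming NumPy; the transform `A`, the mean, and the sample size are illustrative choices, not from the text) draws samples of a linear transform of i.i.d. standard normals, whose true covariance matrix is $\mathbf{A}\mathbf{A}^T$, and estimates the mean vector and covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 3-dimensional random vector: X = A Z + mu with Z ~ N(0, I),
# so the true mean is mu and the true covariance is A A^T.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.5, 0.5, 1.0]])
mu = np.array([1.0, -2.0, 0.5])
Z = rng.standard_normal((100_000, 3))
X = Z @ A.T + mu                      # each row of X is one sample of the vector

# Sample mean vector and sample covariance matrix.
mu_hat = X.mean(axis=0)
Xc = X - mu_hat
C_hat = Xc.T @ Xc / (X.shape[0] - 1)  # same formula np.cov uses (ddof=1)

print(np.allclose(C_hat, np.cov(X, rowvar=False)))  # True
print(np.allclose(C_hat, A @ A.T, atol=0.1))        # True (within sampling error)
```

With $10^5$ samples the estimate $\hat{\mathbf{C}}_X$ agrees with $\mathbf{A}\mathbf{A}^T$ to within a few hundredths per entry.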
Covariance matrix
The $n \times n$ matrix summarizing all pairwise covariances of a random vector. Always symmetric and positive semi-definite.
Related: Random vector, Positive semi-definite (PSD)
Theorem: Covariance Matrices Are Positive Semi-Definite
For any random vector $\mathbf{X}$ with finite second moments, the covariance matrix $\mathbf{C}_X$ is symmetric and positive semi-definite:
$$\mathbf{a}^T \mathbf{C}_X \mathbf{a} \ge 0 \quad \text{for all } \mathbf{a} \in \mathbb{R}^n.$$
Moreover, $\mathbf{C}_X$ is strictly positive definite if and only if no non-trivial linear combination $\mathbf{a}^T \mathbf{X}$ (with $\mathbf{a} \neq \mathbf{0}$) is a constant (almost surely).
The quadratic form $\mathbf{a}^T \mathbf{C}_X \mathbf{a}$ equals $\operatorname{Var}(\mathbf{a}^T \mathbf{X})$, and variance is always non-negative.
Express as a variance
Let $Y = \mathbf{a}^T \mathbf{X}$ for an arbitrary $\mathbf{a} \in \mathbb{R}^n$. Then
$$\mathbf{a}^T \mathbf{C}_X \mathbf{a} = \mathbf{a}^T E\big[(\mathbf{X} - \boldsymbol{\mu}_X)(\mathbf{X} - \boldsymbol{\mu}_X)^T\big] \mathbf{a} = E\Big[\big(\mathbf{a}^T (\mathbf{X} - \boldsymbol{\mu}_X)\big)^2\Big] = \operatorname{Var}(Y).$$
Non-negativity
Since $\operatorname{Var}(Y) \ge 0$ for any random variable $Y$, we conclude $\mathbf{a}^T \mathbf{C}_X \mathbf{a} \ge 0$ for all $\mathbf{a} \in \mathbb{R}^n$.
Strict positivity condition
Equality holds iff $\operatorname{Var}(\mathbf{a}^T \mathbf{X}) = 0$, i.e., $\mathbf{a}^T \mathbf{X}$ is a constant a.s. If no non-trivial linear combination is constant, then $\mathbf{a}^T \mathbf{C}_X \mathbf{a} > 0$ for all $\mathbf{a} \neq \mathbf{0}$, so $\mathbf{C}_X$ is strictly positive definite.
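The variance argument in the proof is easy to verify numerically. A minimal sketch (assuming NumPy; the uniform data and the dimensions are illustrative) checks that a sample covariance matrix has non-negative eigenvalues and that the quadratic form $\mathbf{a}^T \mathbf{C} \mathbf{a}$ reproduces the sample variance of the projection $\mathbf{a}^T \mathbf{X}$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Any sample covariance matrix is PSD: draw arbitrary (here uniform,
# deliberately non-Gaussian) data and inspect the eigenvalues.
X = rng.uniform(-1.0, 1.0, size=(5_000, 4))
C = np.cov(X, rowvar=False)

eigvals = np.linalg.eigvalsh(C)       # eigvalsh: eigenvalues of a symmetric matrix
print(np.all(eigvals >= -1e-12))      # True: non-negative up to round-off

# The quadratic form a^T C a equals the sample variance of a^T X.
a = rng.standard_normal(4)
quad = a @ C @ a
var_proj = np.var(X @ a, ddof=1)
print(np.isclose(quad, var_proj))     # True: Var(a^T X) = a^T C a
```

The identity holds exactly (up to floating-point round-off) because the sample covariance and the sample variance use the same normalization.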
Positive semi-definite (PSD)
A symmetric matrix $\mathbf{A}$ is PSD (written $\mathbf{A} \succeq 0$) if $\mathbf{a}^T \mathbf{A} \mathbf{a} \ge 0$ for all $\mathbf{a} \in \mathbb{R}^n$. Equivalently, all eigenvalues of $\mathbf{A}$ are non-negative.
Related: Covariance matrix
Common Mistake: Covariance Matrix vs. Correlation Matrix
Mistake:
Confusing the covariance matrix $\mathbf{C}_X = E[(\mathbf{X} - \boldsymbol{\mu}_X)(\mathbf{X} - \boldsymbol{\mu}_X)^T]$ with the correlation matrix $\mathbf{R}_X = E[\mathbf{X}\mathbf{X}^T]$.
Correction:
They differ by a rank-one term: $\mathbf{R}_X = \mathbf{C}_X + \boldsymbol{\mu}_X \boldsymbol{\mu}_X^T$. They coincide only when $\boldsymbol{\mu}_X = \mathbf{0}$. In signal processing, $\mathbf{R}_X$ includes the "DC component" $\boldsymbol{\mu}_X \boldsymbol{\mu}_X^T$ while $\mathbf{C}_X$ does not.
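The rank-one relationship can be seen directly on sample data. In the sketch below (NumPy assumed; the mean and sample size are illustrative), the $1/N$-normalized sample versions satisfy the identity exactly, because the centered data sum to zero:

```python
import numpy as np

rng = np.random.default_rng(2)

# A random vector with a non-zero mean, so R_X and C_X differ.
mu = np.array([3.0, -1.0])
X = rng.standard_normal((50_000, 2)) + mu

mu_hat = X.mean(axis=0)
Xc = X - mu_hat
N = X.shape[0]

R = X.T @ X / N                # sample correlation matrix, estimates E[X X^T]
C = Xc.T @ Xc / N              # sample covariance (1/N normalization)

# Rank-one identity R = C + mu mu^T; exact for these sample versions.
print(np.allclose(R, C + np.outer(mu_hat, mu_hat)))  # True
```

With $\boldsymbol{\mu} = (3, -1)^T$ the DC term $\boldsymbol{\mu}\boldsymbol{\mu}^T$ dominates $\mathbf{R}_X$, which is exactly the confusion the warning above is about.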
Example: Covariance Matrix of a Bivariate Distribution
Let $\mathbf{X} = (X_1, X_2)^T$ with $\operatorname{Var}(X_1) = \sigma_1^2 > 0$, $\operatorname{Var}(X_2) = \sigma_2^2 > 0$, and correlation coefficient $\rho = \operatorname{Cov}(X_1, X_2)/(\sigma_1 \sigma_2)$, where $|\rho| < 1$. Write the covariance matrix and verify that it is PSD.
Write the matrix
$$\mathbf{C}_X = \begin{pmatrix} \sigma_1^2 & \rho \sigma_1 \sigma_2 \\ \rho \sigma_1 \sigma_2 & \sigma_2^2 \end{pmatrix}.$$
Check PSD via eigenvalues
The eigenvalues satisfy $\det(\mathbf{C}_X - \lambda \mathbf{I}) = \lambda^2 - (\sigma_1^2 + \sigma_2^2)\lambda + \sigma_1^2 \sigma_2^2 (1 - \rho^2) = 0$, so $\lambda_1 + \lambda_2 = \sigma_1^2 + \sigma_2^2 > 0$ and $\lambda_1 \lambda_2 = \sigma_1^2 \sigma_2^2 (1 - \rho^2) > 0$. Both eigenvalues are positive, so $\mathbf{C}_X \succ 0$.
Alternative check
$\det \mathbf{C}_X = \sigma_1^2 \sigma_2^2 (1 - \rho^2) > 0$ and the diagonal entries are positive, confirming positive definiteness.
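The same check can be run for concrete parameter values. The values below ($\sigma_1^2 = 4$, $\sigma_2^2 = 1$, $\rho = 0.5$) are illustrative choices, not taken from the text:

```python
import numpy as np

# Illustrative parameters: sigma1 = 2, sigma2 = 1, rho = 0.5.
s1, s2, rho = 2.0, 1.0, 0.5
C = np.array([[s1**2,         rho * s1 * s2],
              [rho * s1 * s2, s2**2        ]])

eigvals = np.linalg.eigvalsh(C)
print(np.all(eigvals > 0))            # True: C is positive definite

# Determinant check agrees with the closed form sigma1^2 sigma2^2 (1 - rho^2).
print(np.isclose(np.linalg.det(C), s1**2 * s2**2 * (1 - rho**2)))  # True
```

Here $\mathbf{C}_X = \begin{pmatrix} 4 & 1 \\ 1 & 1 \end{pmatrix}$, with determinant $3$ and both eigenvalues positive, as the symbolic argument predicts.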
Cross-Covariance Matrix
For two random vectors $\mathbf{X} \in \mathbb{R}^n$ and $\mathbf{Y} \in \mathbb{R}^m$, the cross-covariance matrix is the $n \times m$ matrix
$$\mathbf{C}_{XY} = E\big[(\mathbf{X} - \boldsymbol{\mu}_X)(\mathbf{Y} - \boldsymbol{\mu}_Y)^T\big].$$
Notice that $\mathbf{C}_{YX} = \mathbf{C}_{XY}^T$; unlike $\mathbf{C}_X$, the cross-covariance matrix is generally not symmetric (nor even square when $n \neq m$). The cross-covariance measures the linear dependence between $\mathbf{X}$ and $\mathbf{Y}$ and plays a central role in LMMSE estimation (Book FSI, Chapter 3).
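A short numerical sketch (NumPy assumed; the dimensions, the mixing matrix `B`, and the noise level are illustrative) makes the shape and transpose properties concrete:

```python
import numpy as np

rng = np.random.default_rng(3)

# X is 3-dimensional; Y is 2-dimensional and linearly related to X.
N = 20_000
X = rng.standard_normal((N, 3))
B = np.array([[1.0, 0.5,  0.0],
              [0.0, 1.0, -1.0]])
Y = X @ B.T + 0.1 * rng.standard_normal((N, 2))

Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
C_XY = Xc.T @ Yc / (N - 1)            # 3 x 2 sample cross-covariance
C_YX = Yc.T @ Xc / (N - 1)            # 2 x 3 sample cross-covariance

print(C_XY.shape)                     # (3, 2): rectangular, not square
print(np.allclose(C_YX, C_XY.T))      # True: C_YX = C_XY^T
```

The transpose identity is exact by construction, since $(\mathbf{X}_c^T \mathbf{Y}_c)^T = \mathbf{Y}_c^T \mathbf{X}_c$.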
Why This Matters: Covariance Matrices in MIMO Channel Modeling
In a MIMO system with $N_t$ transmit and $N_r$ receive antennas, the received signal vector $\mathbf{y} \in \mathbb{C}^{N_r}$ has a covariance matrix that encodes the spatial correlation structure of the channel and noise. The transmit covariance $\mathbf{Q} = E[\mathbf{x}\mathbf{x}^H]$ is the design variable in capacity-achieving precoding (water-filling over the eigenmodes of the channel). The receive spatial correlation determines how much diversity the channel offers. All of massive MIMO analysis rests on the covariance-matrix structure developed in this chapter.