Uncorrelated Implies Independent
The Gaussian Miracle
In general, uncorrelation ($\operatorname{Cov}(X, Y) = 0$) is strictly weaker than independence. We saw counterexamples in Chapter 7: two random variables can have zero covariance yet remain strongly dependent. The multivariate Gaussian is the grand exception. For Gaussian vectors, uncorrelation and independence are equivalent. This is not just a curiosity: it is the structural property that makes Gaussian models so powerful, because it means that decorrelation (a second-order, linear operation) achieves full statistical independence.
Theorem: Uncorrelated Gaussian Components Are Independent
Let $X = (X_1, \dots, X_n)^\top \sim \mathcal{N}(\mu, \Sigma)$ with $\Sigma = \operatorname{diag}(\sigma_1^2, \dots, \sigma_n^2)$ (diagonal covariance). Then $X_1, \dots, X_n$ are mutually independent, each with $X_i \sim \mathcal{N}(\mu_i, \sigma_i^2)$.
More generally, if $\Sigma$ is block-diagonal with blocks corresponding to sub-vectors $X^{(1)}, \dots, X^{(k)}$, then these sub-vectors are mutually independent.
When $\Sigma$ is diagonal, the quadratic form in the exponent separates: $(x - \mu)^\top \Sigma^{-1} (x - \mu) = \sum_{i=1}^n (x_i - \mu_i)^2 / \sigma_i^2$. The joint PDF factors into a product of marginal PDFs, which is precisely the definition of independence.
Factor the PDF
When $\Sigma$ is diagonal, $\det \Sigma = \prod_{i=1}^n \sigma_i^2$ and $\Sigma^{-1} = \operatorname{diag}(\sigma_1^{-2}, \dots, \sigma_n^{-2})$, so the exponent separates:

$$f_X(x) = \frac{1}{(2\pi)^{n/2} \prod_{i=1}^n \sigma_i} \exp\!\left( -\sum_{i=1}^n \frac{(x_i - \mu_i)^2}{2\sigma_i^2} \right) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi}\,\sigma_i} \exp\!\left( -\frac{(x_i - \mu_i)^2}{2\sigma_i^2} \right).$$
Factored PDF implies independence
Since $f_X(x) = \prod_{i=1}^n f_{X_i}(x_i)$, the components are mutually independent by definition.
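The factorization is easy to confirm numerically. Below is a minimal sketch using NumPy and SciPy, with made-up parameters (not from the text) for a three-dimensional diagonal-covariance Gaussian; the joint density from `scipy.stats.multivariate_normal` matches the product of the univariate marginals.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Illustrative parameters: a 3-D Gaussian with diagonal covariance.
mu = np.array([1.0, -2.0, 0.5])
sigma = np.array([0.5, 1.0, 2.0])      # per-component standard deviations
cov = np.diag(sigma**2)                # diagonal covariance matrix

x = np.array([0.3, -1.5, 2.0])         # arbitrary evaluation point

joint = multivariate_normal(mean=mu, cov=cov).pdf(x)
marginal_product = np.prod(norm.pdf(x, loc=mu, scale=sigma))

# The two agree up to floating-point rounding.
assert np.isclose(joint, marginal_product)
print(joint, marginal_product)
```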
The Converse Fails for Non-Gaussian Distributions
Consider $X \sim \mathcal{N}(0, 1)$ and $Y = X^2$. Then $\operatorname{Cov}(X, Y) = E[X^3] - E[X]\,E[X^2] = 0$ (by symmetry), so $X$ and $Y$ are uncorrelated. But $Y$ is a deterministic function of $X$: they are maximally dependent! Note that the pair $(X, Y)$ is not jointly Gaussian, even though $X$ itself is. The Gaussian is special precisely because its distribution is fully determined by second-order statistics.
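A quick simulation (a sketch with arbitrary seed and sample size, not part of the original example) makes the gap concrete: the sample covariance of $X$ and $X^2$ vanishes even though one variable completely determines the other.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2                                # deterministic function of x

# Sample covariance is ~0 because E[X^3] = 0 for the symmetric standard normal.
print(np.cov(x, y)[0, 1])               # close to 0: uncorrelated

# But y is fully determined by x: conditioning on x leaves no randomness in y.
print(np.all(y == x**2))                # True: maximal dependence
```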
Example: Decorrelation via Eigenrotation
Let $X = (X_1, X_2)^\top \sim \mathcal{N}(0, \Sigma)$ with $\Sigma = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$. Find an orthogonal transformation $Q$ such that the components $Y_1$ and $Y_2$ of $Y = Q^\top X$ are independent.
Eigendecompose $\Sigma$
Eigenvalues: $\lambda_1 = 3$, $\lambda_2 = 1$. Eigenvectors: $v_1 = \frac{1}{\sqrt{2}}(1, 1)^\top$, $v_2 = \frac{1}{\sqrt{2}}(1, -1)^\top$.
Apply the rotation
Set $Q = (v_1 \;\; v_2) = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ and $Y = Q^\top X$. Then $Y \sim \mathcal{N}(0, \Lambda)$ with $\Lambda = Q^\top \Sigma Q = \operatorname{diag}(3, 1)$.
Conclude independence
Since $\Lambda$ is diagonal and $Y$ is Gaussian, $Y_1$ and $Y_2$ are independent.
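A short NumPy sketch (assuming the covariance matrix above) verifies the rotation empirically: after eigendecomposition, the rotated samples have approximately diagonal empirical covariance.

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

# Draw correlated Gaussian samples.
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=500_000)

# Eigendecomposition of the symmetric covariance; columns of Q are eigenvectors.
eigvals, Q = np.linalg.eigh(Sigma)       # eigh returns eigenvalues in ascending order
Y = X @ Q                                # applies Y = Q^T X to each sample row

print(eigvals)                           # [1. 3.]
print(np.cov(Y, rowvar=False))           # ~diag(1, 3); off-diagonals near 0
```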
Decorrelation = Independence for Gaussians
Key Takeaway
For jointly Gaussian vectors, uncorrelated $\iff$ independent. This is a uniquely Gaussian property. It means that PCA, whitening, and any other linear decorrelation technique automatically achieve full statistical independence, but only under the Gaussian assumption.
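As an illustration, here is a minimal whitening sketch (the `whiten` helper is hypothetical, written for this example rather than taken from a library): it maps the data to identity covariance, which under the Gaussian assumption means fully independent components.

```python
import numpy as np

def whiten(X):
    """PCA whitening: center, rotate onto eigenvectors, rescale each axis."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, Q = np.linalg.eigh(cov)
    return (Xc @ Q) / np.sqrt(eigvals)   # identity covariance after this step

rng = np.random.default_rng(1)
Sigma = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
X = rng.multivariate_normal([0.0, 0.0], Sigma, size=200_000)

Z = whiten(X)
print(np.cov(Z, rowvar=False))           # ~identity; Gaussian input => Z1, Z2 independent
```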
Uncorrelated vs. Independent
| Property | General distributions | Gaussian |
|---|---|---|
| Independent $\Rightarrow$ uncorrelated | Yes (always) | Yes (always) |
| Uncorrelated $\Rightarrow$ independent | No (counterexample: $X \sim \mathcal{N}(0,1)$, $Y = X^2$) | Yes (unique to the Gaussian) |
| Decorrelation technique | Removes linear dependence only | Removes all dependence |
| Sufficient statistics | Mean + covariance are not sufficient | Mean + covariance are sufficient |
| Practical implication | Must check higher-order moments (see sketch below) | Second-order analysis is complete |
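To see what "check higher-order moments" means in practice, here is a small sketch (illustrative seed and sample size, not from the text): two uncorrelated but dependent variables whose dependence only becomes visible in fourth-order moments.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)
y = x**2 - 1.0                          # mean zero, uncorrelated with x (E[x^3] = 0)

print(np.cov(x, y)[0, 1])               # ~0: second-order analysis sees nothing
print(np.cov(x**2, y**2)[0, 1])         # ~8: dependence appears at fourth order
```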
Common Mistake: Marginally Gaussian Does Not Imply Jointly Gaussian
Mistake:
Assuming that if $X$ and $Y$ are each marginally Gaussian, then $(X, Y)$ is jointly Gaussian.
Correction:
Marginal Gaussianity is necessary but not sufficient for joint Gaussianity. Counterexample: let $X \sim \mathcal{N}(0, 1)$, $S = \pm 1$ uniformly (each with probability $1/2$), independent of $X$, and $Y = SX$. Then $Y$ is also $\mathcal{N}(0, 1)$, but $(X, Y)$ is not jointly Gaussian (given $X = x$, the conditional $Y$ takes only the values $\pm x$, not a Gaussian distribution). Joint Gaussianity requires that every linear combination $aX + bY$ is Gaussian.
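The construction is two lines of code. A simulation sketch (arbitrary seed and sample size) shows both marginals behaving like standard normals while $X + Y$ has an atom at zero, something no Gaussian can have.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)     # random sign, independent of x
y = s * x                               # marginally N(0, 1) by symmetry

print(x.std(), y.std())                 # both ~1.0: marginals look standard normal

# The linear combination X + Y equals 2X when S = +1 and exactly 0 when S = -1.
z = x + y
print(np.mean(z == 0.0))                # ~0.5: an atom at zero, impossible for a Gaussian
```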
Historical Note: Darmois, Skitovich, and Gaussian Uniqueness
The Darmois–Skitovich theorem (1953) provides a deep converse: if $X_1, \dots, X_n$ are independent random variables and the linear forms $L_1 = \sum_{i=1}^n a_i X_i$ and $L_2 = \sum_{i=1}^n b_i X_i$ are also independent, then every $X_i$ for which both $a_i \neq 0$ and $b_i \neq 0$ must be Gaussian. In other words, the Gaussian is the only distribution for which independence can be preserved under arbitrary linear combinations. This is a characterization theorem that places the Gaussian family in a unique position among all probability distributions.
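A numerical illustration (a sketch, not a proof): for equal-variance independent inputs, $X_1 + X_2$ and $X_1 - X_2$ always have zero covariance, but a fourth-order check reveals dependence unless the inputs are Gaussian.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

def fourth_order_dependence(x1, x2):
    """Covariance of the squares of X1+X2 and X1-X2.
    Their plain covariance is 0 by construction when Var(X1) = Var(X2)."""
    u, v = x1 + x2, x1 - x2
    return np.cov(u**2, v**2)[0, 1]

# Gaussian inputs: sum and difference are jointly Gaussian and uncorrelated,
# hence independent, so the fourth-order check is ~0.
print(fourth_order_dependence(rng.standard_normal(n), rng.standard_normal(n)))

# Unit-variance uniform inputs: uncorrelated but NOT independent (~ -2.4 here).
a = np.sqrt(3.0)                        # uniform on [-a, a] has variance 1
print(fourth_order_dependence(rng.uniform(-a, a, n), rng.uniform(-a, a, n)))
```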